Exploring the marketing, revolutionary features, and potential of Apple's latest innovation

Apple's Vision Pro has hit the US market, and early birds who secured their preorders are already immersing themselves in the spatial computing experience. Reviews for the Vision Pro have been flooding in, showcasing a mix of positive and critical feedback, alongside firsthand accounts from consumers who have invested in the device.
While Apple has kept mum on specific sales figures, estimates suggest that approximately 200,000 Vision Pro units were snapped up during the preorder window spanning from January 19th to February 2nd.
Although this figure may seem minimal compared with the millions of iPhones typically sold every year on launch day, it's important to contextualize these numbers. As Gene Munster put it (as quoted by MacDaily News):
"The concept of bringing the digital and real worlds together is here. As for timing, no need to buckle up for the next paradigm shift, because it will take five years to take off". Gene Munster
To put things into perspective, keep in mind that the first iPhone sold approximately 270,000 units during its very first week in 2007, outpacing the Vision Pro. However, given the Vision Pro's substantial price tag ($3,499), its opening-week revenues significantly exceeded those of the iPhone.
New dystopian era or viral user-generated marketing?

credits: AP
As expected, 200,000 devices were more than enough to make headlines worldwide. Many web creators have begun sharing their feelings about the mixed reality headset with which the tech giant aims to revolutionize personal computing, and the initial images of the device in public settings are sparking a mix of surprise and concern.
Much of the hype generated plays into Apple's marketing strategy and shines the spotlight on Cupertino's new product, but it hardly amounts to the arrival of a dystopian new era.
The fact that several people have been filmed using the Vision Pro in subways, restaurants, courtside at NBA games, or even while driving, seems more aligned with the fervor of online content creators than with the concept of spatial computing itself.
There is no doubt that contrast, distortion, and reduced field of vision can negatively affect users' ability to correctly perceive reality, increasing the risk of accidents and raising legal and ethical questions on the adequacy of current regulations.
Nonetheless, it appears that many of those viral videos used the gadget as a “stunt” to create content while others were just staging moves with their hands in “Transparency mode” or simply trying the commands for the first time in public, to emphasize the new interaction between the real and virtual worlds.
How it works
To fully leverage and enjoy Apple's new technology, it helps to understand the characteristics of the device and use it accordingly. So, let's delve deeper into the features of the Apple Vision Pro, starting with how it works, to better understand the possible future developments of this incredible tool.
You can navigate the Vision Pro simply by using your eyes, hands, and voice. When interacting with objects and apps, you'll need to visually target them. The Vision Pro's sensors track your eyes with stunning precision, and the same eye cameras also enable iris-based authentication through a feature Apple calls Optic ID.
Controlling the device through gestures may require some initial adaptation since your hands have to be visible to the Vision Pro’s external cameras, but once mastered, navigation becomes effortless. You can scroll horizontally and vertically by pinching your fingers together and dragging in the desired direction. Zooming in on photos, images, or web pages is achieved by simultaneously pinching your fingers together with both hands and spreading them apart.
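The two-hand zoom described above is, at its core, simple arithmetic: content scales by the ratio between how far apart your pinched hands are now and how far apart they were when the gesture began. Here is a minimal sketch of that idea; the type and property names are illustrative, not part of Apple's API.

```swift
import Foundation

// Hypothetical model of the two-hand zoom gesture: the scale factor
// applied to a photo or window is the ratio of the current distance
// between the pinched hands to the distance when the gesture started.
struct ZoomGesture {
    let initialHandDistance: Double  // metres between hands at gesture start

    func scale(currentHandDistance: Double) -> Double {
        // Guard against a degenerate start position.
        guard initialHandDistance > 0 else { return 1.0 }
        return currentHandDistance / initialHandDistance
    }
}

let zoom = ZoomGesture(initialHandDistance: 0.2)
// Spreading the hands twice as far apart doubles the content size.
print(zoom.scale(currentHandDistance: 0.4))
```

The same ratio shrinks content below 1.0 when the hands move closer together, which is why the gesture feels symmetric in both directions.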
Safety and security
credits: The Guardian
In its user guide, Apple warns that the Vision Pro is designed to be used exclusively in controlled and secure areas and adds that the gadget may not detect all obstacles depending on the environmental and lighting conditions in which it is used.
It's a good idea to set up a safe space clear of any obstacles that you could bump into, trip over, or hit with your hands, and to follow all the safety information provided in the guide.
“Use the Vision Pro in controlled indoor and outdoor spaces, and always remain aware of your surroundings and body posture during use. Apple Vision Pro should never be used on or near roads, streets, or any other area where moving objects present a collision risk”. Apple Inc.
According to numerous reports, Apple has gone to the extent of programming the VisionOS to notify users and restrict functionalities if it detects unsafe movements, displaying a message like "Moving at Unsafe Speed".
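The reported behavior amounts to a simple speed gate: if the headset's motion tracking implies the wearer is moving faster than some safe pace, features are restricted and a warning is shown. The sketch below illustrates that logic only; the threshold value and function name are assumptions, not Apple's actual implementation.

```swift
import Foundation

// Illustrative speed gate, loosely modeled on the reported visionOS
// behavior. The threshold is an assumption: roughly a brisk walking pace.
let unsafeSpeedThreshold = 2.0  // metres per second

// Returns the warning the headset would reportedly display, or "OK".
func statusMessage(forSpeed speed: Double) -> String {
    speed > unsafeSpeedThreshold ? "Moving at Unsafe Speed" : "OK"
}

print(statusMessage(forSpeed: 8.3))  // e.g. an electric skateboard
print(statusMessage(forSpeed: 1.2))  // e.g. strolling across a room
</imports-placeholder>
```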
YouTuber Casey Neistat showcased some daring behavior by navigating busy Manhattan streets on an electric skateboard while wearing the Vision Pro. The visionOS code suggests that such actions might lead the headset to deactivate all features and fall back to Transparency mode to prioritize user safety.
Although Apple has introduced a Travel Mode intended for airplane use, allowing limited functionality during motion, the company advises users of potential issues while moving. Some users claim to have used the Vision Pro in moving vehicles; nonetheless, the code implies that users must remain stationary for features to function effectively.
What’s new in the Vision Pro, really?

credits: Apple
So, what’s new in this device? The easy answer is the idea of Spatial Computing, and the first important innovative concept is of course SPACE. The visionOS operating system transfers the old desktop into the world around you and offers a limitless canvas where people can view virtual content such as windows (2D content) and volumes (3D content) simultaneously.
The other keyword is IMMERSION. In a visionOS app, people can fluidly transition between different levels of immersion. In one mode, the Vision Pro places windows and virtual objects in the environment you are in and fixes them in place so that when your head moves, the windows do not: they appear to stay put in your current environment, as in an Augmented Reality device.
This overlay of virtual images onto real ones captured by Vision Pro is very realistic. The delay between the time it takes for the headset to capture the surrounding environment, process the image with virtual objects, and display everything is almost imperceptible, making it seem like you are observing something through a transparent visor.
Turn the dial, and that can change! You can be transported somewhere else as the live "Passthrough" video of your current environment is replaced by a fully immersive, digitally created 3D environment. When people want to see more or less of their surroundings, they turn the small dial on the device to control the amount of Passthrough. A function called "Breakthrough" keeps users connected to the real environment around them even when they’re fully immersed in apps.
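Conceptually, the dial sweeps through a spectrum that visionOS exposes to developers as discrete immersion styles: mixed (windows anchored in your real room), progressive (partial Passthrough), and full (an entirely virtual environment). The sketch below models how a dial position in [0, 1] might map onto those levels; the threshold values are assumptions for illustration, not Apple's.

```swift
import Foundation

// Illustrative model of the immersion dial. visionOS offers mixed,
// progressive, and full immersion styles; the numeric thresholds
// below are assumptions chosen only to make the mapping concrete.
enum Immersion: String {
    case mixed        // virtual windows anchored in the real room
    case progressive  // partial Passthrough, partial virtual environment
    case full         // Passthrough fully replaced by a 3D environment
}

func immersion(forDial dial: Double) -> Immersion {
    switch dial {
    case ..<0.1:  return .mixed
    case ..<0.95: return .progressive
    default:      return .full
    }
}

print(immersion(forDial: 0.5).rawValue)  // a middle dial position
```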
The real challenge: breaking the glass between the real and the virtual

credits: Apple
The ability to continuously adjust the level of immersion while using Apple Vision Pro is truly remarkable. This feature represents perhaps the most significant advancement that Apple's new Spatial Computing brings to the table compared to other XR headsets currently available on the market.
The Passthrough technology, which enables users to perceive their surrounding environment, is impressive to say the least. Although you're not directly observing your surroundings, the Vision Pro effectively simulates the experience as if you were.
Apple's technical approach contrasts with Microsoft's HoloLens: while the Vision Pro (and the Meta Quest 3) enhances a video capture of the environment, the HoloLens overlays virtual elements directly onto your own field of view.
To minimize latency, the Vision Pro's external cameras provide 90 fps of image data – but the higher the number of frames per second, the more the cameras struggle in low-light conditions. That’s why Passthrough won't perform as well in a basement as it does in a well-lit living room. Still, it sets a completely new standard, as confirmed by the (still very skeptical) WIRED Magazine:
“Apple’s engineers deserve all the praise they can get—achieving this stuff even with DSLRs strapped to the front of Vision Pro would be tough enough”. WIRED
While wearing the device, people can also get up and walk around, but the vast majority of apps will offer stationary experiences that require minimal movement. Unless movement is a core part of an experience, people should be able to use the Vision Pro without moving at all. Sometimes people do move to a new seat in their room or turn to face a different direction; when they settle, they can press the dial to recenter.
The very fact that spatial computing blends 2D screen-centric and 3D immersive experiences with reality makes the Vision Pro more enjoyable in calm, fairly quiet spaces than in crowded, busy, chaotic public locations. This is not just an elementary safety rule but a matter of better overall experience: human attention is limited, and people get overwhelmed by excessive background noise.
The Role of Creators
Experts suggest that the initial sales figures of the Vision Pro will ultimately have minimal impact on the long-term success of Spatial Computing. They contend that the key to success lies in whether developers will be willing to join Apple and able to create "killer apps" tailored specifically for the headset in the coming years, unlocking entirely new experiences for users.
For now, the Vision Pro offers the familiar apps for OS users: Safari, Photos, Music, TV, Cloud, Messages and so on. A brand‑new App Store is tasked with delivering groundbreaking apps built for visionOS, as well as compatible iPad and iPhone apps, to tempt consumers into purchasing the new hardware.
Apple Vision Pro offers an infinite spatial canvas to explore, experiment and play, giving the freedom to completely rethink the 3D experience.
People can interact with an app while staying connected to their surroundings or immerse themselves completely. And experiences can also be fluid: start in a window, bring in 3D content, transition to a fully immersive scene, and come right back.
PIGIAMA KASAMA joins Unity’s visionOS platform

credits: Igor Ormilaev - Unsplash
PIGIAMA KASAMA is going to be among the first to create games, lifestyle experiences, and industry apps for the Apple Vision Pro and usher in the new era of spatial computing by leveraging Unity’s visionOS support: a collection of development technologies targeted at Apple’s new platform and its unique challenges.
Unity's integration with visionOS combines the comprehensive capabilities of Unity's Editor and runtime engine with the rendering functionalities provided by RealityKit, the new framework built from the ground up by Apple specifically for Spatial Computing development.
Unity's fundamental features, encompassing scripting, physics, animation blending, AI, scene management, and more, are fully supported on visionOS. This ensures that Unity can operate on visionOS just like on any other supported platform, enabling existing Unity games or applications to migrate seamlessly.
Tailored expertise for your unique project needs
To build immersive games and apps for visionOS, Unity offers PolySpatial. Developing for the visionOS platform using PolySpatial in Unity introduces new capabilities to bolster XR content creation that can operate across disparate devices, while maintaining a seamless and efficient development process. Notably, Unity PolySpatial for visionOS responds to real-world elements by default, like any other Unity application.
Whether you're keen on testing the new power of visionOS experiences or diving into new Spatial Computing projects to create immersive and innovative content for entertainment, lifestyle, and industrial use, PIGIAMA KASAMA has you covered.
Explore endless opportunities tailored to your needs and interests and begin a new journey towards bringing your vision to life on visionOS!