Apple yesterday released the first-ever beta of visionOS and the SDK that will allow developers to create apps for the Apple Vision Pro headset. visionOS can only be explored through Xcode right now, but we thought we'd take a hands-on look to see what we can glean about the headset experience from the operating system.
Testing out visionOS is as simple as getting the latest Xcode 15 beta and the visionOS 1.0 simulator, but to be honest, there's not a whole lot to see that Apple didn't already tell us about.
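If you want to poke at the simulator with an app of your own, a minimal SwiftUI scene is all it takes, since visionOS apps start from the same App/WindowGroup structure as iOS apps. Here's a rough sketch (the app name and window contents are our own placeholders, not Apple sample code):

```swift
import SwiftUI

// A minimal, hypothetical visionOS app: a single SwiftUI scene
// presented as a floating 2D window in the Shared Space.
@main
struct HelloVisionApp: App {
    var body: some Scene {
        WindowGroup {
            VStack(spacing: 16) {
                Text("Hello, visionOS")
                    .font(.largeTitle)
                Text("Rendered as a window in your space")
            }
            .padding(40)
            // Gives the window the standard translucent "glass"
            // material that visionOS windows use.
            .glassBackgroundEffect()
        }
    }
}
```

Build this against the visionOS simulator destination in Xcode 15 and it appears as a floating pane in the simulated room, just like the system apps do.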
You can only see the operating system on your Mac's screen, so you won't get a sense of what the headset will actually feel like, nor the same level of immersion. That said, you can see what visionOS will look like, including the Home View and app windows, plus how 2D iPad and iPhone apps will appear.
Webpages can be loaded into a visionOS version of Safari so website developers can see what their webpages will look like and what needs to be tweaked. Everything looks a lot like iOS, but if iOS were in your living room or kitchen.
There's a Control Center with customizable options for things like light and dark mode, and there's a Guest Mode, which is how you'll be able to let curious people try out the headset without access to your sensitive data. Spotlight is available for searches, and you can set up a range of "Environments" that block out the world around you.
From the Xcode experience and the visionOS code, we know there are more than a dozen Environments to choose from, including Joshua Tree, Yosemite, Mount Hood, and even the moon. There's a Visual Search feature that will be able to identify items around you, copy printed text from the real world, translate languages in real time, and more, and Apple has designed a Travel Mode that can be activated when you're on an airplane.
Travel Mode ensures that you're stationary while you're wearing the Vision Pro, and it blocks out distractions around you. Certain sensors are turned off, perhaps for the privacy of other passengers or because close proximity to a number of other people can cause the sensors to malfunction.
Apple will provide Vision Pro testing labs to developers in several locations worldwide starting next month, plus the company is going to open up applications for a hardware-based Vision Pro developer kit that will allow developers to test their apps right on the Vision Pro itself.
Make sure to watch our full video to get a closer look at the early stages of visionOS.
Top Rated Comments
I can envision many professional use cases for the Apple Vision Pro and the concept of spatial computing.
A wedding photography studio could offer a new premium service with still photos, videos, and spatial photos and videos. A real estate company could use Vision Pro to provide immersive property tours with staging based on a potential buyer's preferred home decor style or corporate branding. A corporate events production company could use it to showcase different stage and lighting designs. Vision Pro has great potential in various industries.
I wonder, where have all the dreamers and visionaries disappeared to?
However, price aside, and assuming cheaper versions will eventually come out, I wouldn't downplay the "giant screen" aspect of this. Maybe at its core it's "just a big screen," but having multiple very large screens in front of you is what most office workers have been striving for, to the limited degree possible, for the past couple of decades. If people didn't want large monitors, they wouldn't be selling.

But it's bigger than that: it's the canvas of a whole room, windows of any size, and the fact that they "live" in your space, which pretty much removes any limits on how our content can be displayed. This isn't just a "large screen"; it's no longer having screens at all, just content, placed where you want it, at the size you want it.

Take one example: maybe you're a banker, and instead of buying a physical stock ticker, you get an AR app that you can customize exactly the way you want and place exactly where you want in your office, at exactly the size you want. It can then interact with other apps, open browsers, send email, run automations, and so on, things that are simply not possible with physical products today. Could you do that with multiple monitors placed around your room? Sure, but "clunky" wouldn't even begin to describe it. It truly is a brand-new "spatial computing" mindset (I hate that they have to brand EVERYTHING) with a ridiculous amount of potential!
In terms of AR/XR, yes and no. Judging from reviews of the product, the fact that EVERYTHING you see is on a digital screen makes it sort of the best-case AR right now. We get the super-high resolution of the screens and the ability to render crisp text, ridiculously good placement of objects in the environment, and no loss of feeling "present" in your space. Multiple people said they moved around the demo room almost forgetting they had a headset on. If you've used any other headset, you know the lag prevents that kind of carelessness unless you want to bang into things. This is going to be the best way to experience AR until resolution and tracking technology can work with transparent materials.
From a VR perspective, yeah, there's not a lot there. The 3D videos and some of the "cool demos" were more a preview of what may come in the future. But honestly, they're smart to focus on AR, with "VR light" as the starting point. I hadn't really been looking forward to this product, but after the keynote and hearing from people who have tried it, I see the path they're on, and I'm more than excited. I do believe their approach will mainstream AR, the use cases will explode, and in five years we won't even remember when this product seemed to have no purpose.