Apple Vision Pro: Day One
It’s Friday, February 2, 2024. Today is the day. You’ve been eyeing the Vision Pro since Tim Cook stepped onstage with the product at last year’s WWDC. Longer than that, really, if you factor in the years of rumors, leaks and renderings. The price wasn’t anywhere near what you had hoped, but it’s a first-gen product. Manufacturing isn’t at full consumer scale, and you’ve got to account for the millions poured into seven or eight years of R&D.
After a few months of waffling, you hovered your cursor over the “Buy” button, held your breath, closed your eyes and committed to the tune of $3,500. Congratulations, you’re an early adopter.
The box arrives. It’s huge. It’s also quintessentially Apple: premium, designed with intention. Tear the tabs on either side and slide off the top. The visor is inside, anchored to a small platform that’s more display case than shipping container. Dig deeper, and you’ll find another strap and a second “light seal” insert.
Me, I’m currently partial to the Dual Loop Band. It doesn’t look as cool as the Solo Knit Band, but the top strap does a much better job distributing weight (the Vision Pro is not a light headset). As for the light seal inserts, I advise glasses wearers to go with the larger of the pair, which creates a bit more distance between your eyes and the lenses.
Last, of course, is the now-infamous battery pack. Plug it into the port on the left side and give it a twist. A small white light pulses before turning solid. The boot-up has begun.
After eight months, what’s another 60 seconds between friends? There’s a bit of a setup process. Understandably so. The Vision Pro has to orient its sensors and get to know your space and lighting. If you had Zeiss optical inserts made for your prescription, now is the time to snap them in (they attach magnetically). If you’re a glasses wearer, don’t freak out about how the image looks until you’ve enrolled your lenses by holding up the included piece of paper with a QR-like code on it. Pairing the device to your iPhone works in much the same fashion.
You’ll be asked to pull the headset off for a bit, to take a scan of your face. But first, a short introductory video.
The face scan process utilizes the camera on the front of the visor to construct a shoulders-up 3D avatar. The process is extremely similar to enrolling in Face ID on your iPhone. Look forward. Turn your head to the side. Then the other. Look up and rotate down. Look down and rotate up. Find some good lighting. Maybe a ring light if you have one. If you wear glasses, make a point not to squint. I apparently did, and now my Persona looks like it spent the last week celebrating the passage of Ohio’s Issue 2 ballot measure.
The Personas that have thus far been made public have been a mixed bag. All of the influencers nailed theirs. Is it the lighting? Good genes? Maybe it’s Maybelline. I hope yours goes well, and don’t worry, you can try again if you didn’t stick the landing the first time. Mine? This is actually the better of the two I’ve set up so far. I still look like a talking thumb with a huffing addiction, and the moment really brings out the lingering Bell’s palsy in my right eye. Or maybe more of a fuzzy Max Headroom? I’ll try again tomorrow, and until then be mindful of the fact that the feature is effectively still in beta.
This is the version of you that will be speaking to people through FaceTime and other teleconferencing apps. It’s meant to work around the fact that 1) you have a visor on your face and 2) there (probably) isn’t an external camera pointed at you. It definitely takes some getting used to.
Oh, you’ll also have to take it again if you want to change your hair or shirt. I was hoping for something a bit more adaptable à la Memoji, but that’s not in the current feature set. It will, however, respond to different facial expressions like smiling, raising your eyebrows and even sticking out your tongue (handy for Zoom work calls). The scan is also used to generate the image of your eyes that the EyeSight feature displays on the front of the visor, alerting others in the room when you’re looking in their direction.
Put the headset back on and hold your hands up so the hand-tracking feature knows what to look out for. Next, three circles of dots will appear, each brighter than the last. Here you’ll have to look at each dot while pinching your thumb and index finger together. This helps calibrate eye tracking.
Input has long been a big question mark in the world of extended reality. You can pair Bluetooth game controllers, keyboards and trackpads with the headset, but in Apple’s vision of the future, the lion’s share of interaction relies on your eyes and hands. Look at an object to highlight it, then pinch your fingers to select it. Pinches also come into play when zooming (pinch with both hands before pulling them apart) and scrolling (pinch and swipe).
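(A brief aside for the developers in the audience: apps don’t have to do anything special to support all of this. A stock SwiftUI control on visionOS gets gaze highlighting and pinch “taps” for free. Here’s a minimal sketch; the view and counter names are just for illustration.)

```swift
import SwiftUI

// Minimal visionOS sketch: a standard SwiftUI Button picks up the
// look-and-pinch interaction automatically. Looking at the button
// highlights it; pinching thumb and index finger fires its action.
struct PinchCounterView: View {
    @State private var pinchCount = 0  // hypothetical state, for illustration

    var body: some View {
        VStack(spacing: 24) {
            Text("Pinches so far: \(pinchCount)")
                .font(.title)

            // No custom gesture code needed -- visionOS routes
            // gaze + pinch to the same tap event a finger tap would send.
            Button("Pinch to select") {
                pinchCount += 1
            }
        }
        .padding()
    }
}
```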
The Digital Crown is your friend. It’s basically a bigger version of the one on your Apple Watch. Pressing it brings up the app launcher, similar to Launchpad on macOS. A sidebar alongside the apps also showcases different Environments and People/Contacts. Long-pressing the crown recenters visionOS wherever you’re looking.
My biggest tip to you, the owner of a shiny new Vision Pro, is to give yourself time to adjust. This is going to be the last thing you want to hear. Listen, I get it. You spent a car down payment on a device you’ve been waiting more than half a year to try. But coming face to face with a new version of reality can do weird things to your brain if you don’t take breaks. People have reported headaches from the weight. Personally, I’m prone to motion sickness and am feeling a bit off at the end of my first full day with the device.
Watch an episode of a TV show. Play a quick game (you can play the iPadOS versions of Fruit Ninja and Angry Birds with an Apple Arcade subscription). If this is, indeed, the dawn of a new era for computing, you’ve got plenty of time to acclimate.
See you tomorrow.