When the Apple Vision Pro was first announced, a select few in the press got a chance to test it out. While many gasped at the price, a prevailing theme was the sense of magic the device conveyed through its visual interface. To choose an item, you merely glance at it. To click on it, you simply bring your thumb to your forefinger.
Also: I watched my favorite TV show on Apple Vision Pro and it was glorious, strange, and tiring
Upon initial shipment in February, reviews followed a similar theme. The price was extreme, but the user interface was magical. Everyone, it seemed, loved how the eye tracking made the device respond to their very glance.
Everyone, apparently, except me.
When I first got the Vision Pro, I found it barely usable. It sometimes took a supreme effort just to get the proper icon to highlight without the selection jumping to the next icon. And without physically lifting and tilting the headset downward, it was impossible to select the play and pause controls in any of the media apps.
I had no such problem with the Meta Quest 3, which uses handheld controllers to point at and select items on the screen. That approach worked just fine.
Also: Meta permanently slashes the Quest 2's price again, dropping it to an all-time low
But with the Apple Vision Pro, every single time I had to select something with my eyes, it was like I had to fight the device. For me, the eye tracking simply didn't work. The device felt more like a poke in the eye.
So far, I've made nearly 200 YouTube videos across a number of channels. Commenters, ever the paragons of kindness and decency, have many times pointed out certain physical characteristics I exhibit on screen.
For example, I would never have known that it was time to cut out flour and sugar, had my commenters refrained from sharing with me that not only is my expertise well-rounded, but so is my body.
I also would never have been aware that my eyes appear nearly closed or squinting on camera. While some have suggested this disqualifies me from posting videos to the public, and others seem to think I should use it as a reason to disqualify myself from life itself, their observations -- that my eyelids behave differently than most people's -- do hold some truth. I'm used to it. They've behaved like this all my life, folks.
Also: Installing a VPN on Apple Vision Pro: How to do it and why you should
My eyes don't open as wide as most people's. While I've never received a formal diagnosis, Google tells me this droopy-eyelid condition is known as ptosis. As far as I can tell, it doesn't interfere with how I see the world. But it does appear to confound the eye-tracking capabilities of the Vision Pro.
Fortunately, the Apple Vision Pro has accessibility features that have helped me largely resolve the problem. Here's how that works.
Also: How to take perfect Vision Pro screenshots and recordings
Although the Vision Pro defaults to eye tracking, it also supports head, finger, and wrist tracking. I found head tracking unreliable, again missing selections in the bottom third of the screen. Finger tracking was too jittery, bouncing the selection around too much. But wrist tracking has proven fairly consistent and solid, and it's what I've used ever since I configured the device this way.
When you enable wrist tracking, you also get a round cursor on the screen. This helps identify what you're clicking on.
You access this feature by opening Settings, scrolling down to Accessibility, and choosing the Interaction section.
Next, select Pointer Control.
Next, turn on Pointer Control (shown at 1). Then choose the Control mechanism (shown at 2). In this screenshot, mine is already set to Wrist, because that's what I use. Yours will probably say Eyes initially. You can also change the contrast of the pointer, set its border color, and choose your pointer size.
As you can see, you can choose from the controls I described above. I have Wrist selected (shown at 1). Notice that you can also choose which hand to use (shown at 2); I use my right wrist. Of particular importance is the Movement Sensitivity slider (shown at 3). I found that turning down the sensitivity did wonders for precision control.
There's one more option, which I used for a while and then turned off: Show Depth Ray. You can see the ray used as a pointer in the image below. It gives you a pointer much like what the Meta Quest provides, but I turned it off because I found it distracting on the Vision Pro.
I think there might be a bug in this version of visionOS. Every so often, the round cursor disappears. It seems to happen most often after using a Compatibility Mode app, but I've seen it happen after leaving some of the Apple Vision Pro apps as well.
When the cursor disappears and you're in wrist pointer mode, selecting items becomes very difficult. Fortunately, there's a fix: the Accessibility Shortcut. Here's how to enable it.
Also: Itching to try Vision Pro's Travel Mode? Here's what to expect before you go
Go back to Settings and Accessibility again, and scroll down until you see Accessibility Shortcut.
Click on it. You'll be given a set of options.
This screen allows you to choose what accessibility feature is turned on or off when you triple-click on the Digital Crown at the top of the headset.
It's interesting that a device so eye-centric offers a way to bypass eye tracking entirely. Have you used any of the Vision Pro accessibility features? Have you had any difficulty selecting items on the Vision Pro? What other challenges, if any, have you faced? Let us know in the comments below.
You can follow my day-to-day project updates on social media. Be sure to subscribe to my weekly update newsletter, and follow me on Twitter/X at @DavidGewirtz, on Facebook at Facebook.com/DavidGewirtz, on Instagram at Instagram.com/DavidGewirtz, and on YouTube at YouTube.com/DavidGewirtzTV.