Since it was unveiled at WWDC early last month, one of the more mysterious aspects of Apple’s Vision Pro headset has been accessibility. There are many questions to be answered, and Apple has been predictably mum about them, even off the record. Moreover, this week’s announcement of the Vision Pro developer kit raises even more questions, particularly about the accessibility features in visionOS and how they apply to apps from third-party developers. Questions abound.
With so many unknowns, I’m proud of how I’m doing my journalistic due diligence by taking literal notes on accessibility-oriented details that trickle out from my peers and friends at Apple rumor sites. One such instance came earlier this week, when Filipe Espósito at 9to5Mac reported on the release of visionOS beta 2. One of the so-called “tidbits” included in Espósito’s story is mention of the software’s Hand Pointer feature, which Espósito described as functionality that “lets users know exactly where they’re clicking in the interface.” In addition, Espósito’s note about Optic ID, which he says “scans the iris of the eyes to authenticate the user,” makes my mind race about how authentication works for someone whose eyes are not visible or who has another ocular condition.

Personally, I’m reminded of the initial trouble I had using the then-new Face ID on the iPhone X in 2017 due to the strabismus in my right eye. To this day, I need to disable the Require Attention toggle in Settings because Face ID is too difficult to use with it enabled, despite Apple’s pretty strident warning that turning it off weakens Face ID’s security. It may not be an ideal scenario, but it’s a trade-off I have to make in the name of usability. Given these circumstances, it’s more than fair to wonder what affordances Apple has surely built into Optic ID on the headset for people like me.
Another noteworthy bit of visionOS sleuthing came recently when I noticed, in a piece from iMore’s Daryl Baxter about the visionOS software development kit, or SDK, a screenshot in which the Accessibility section of Settings is prominently displayed. Owing to visionOS’s iOS underpinnings, a cursory glance at the image shows the usual suspects in terms of Apple accessibility: VoiceOver, Switch Control, motion options, and more. One area exclusive to Vision Pro is what Apple is apparently calling Crown Button. Like Apple Watch and AirPods Max, Vision Pro has a Digital Crown, which users turn to adjust things like their level of immersion. As with Hand Pointer and Optic ID, it’s interesting to ponder what the Crown Button option actually does.
Broadly, it’s worth reiterating the salient point that wearing a computer on your face is an entirely different beast from wearing one on your wrist (Apple Watch) or in your ears (AirPods). Even if you wear it only in short bursts, the sheer fact that Vision Pro, or Meta’s Quest or any other headset, sits literally on one’s face brings with it new considerations around ergonomics and usability for people with disabilities.
All told, these bits of information that dribble out from my peers in the Apple journalism world are exciting, even though we still don’t know very much about the device. In the run-up to WWDC, I kept telling anyone who would listen that Vision Pro’s accessibility story would be something to pay attention to; disabled people use technology too, and how the headset works for them is an angle worthy of people’s analytical energy. One thing can be said with full certainty: I was absolutely right that Vision Pro news in the coming months wouldn’t be boring.