Accessibility features on the Apple Vision Pro
As one of the first companies in Switzerland with an Apple Vision Pro, we have often had the honor of demonstrating the glasses to various people. This was also the case last week, when we received a surprise visit from someone who had read that you can try out the Vision Pro at Bitforge. He has a physical impairment that restricts the mobility of his arms and fingers, making it difficult for him to use touch screens or a mouse. The Apple Vision Pro, on which navigation is largely done with the eyes, is a real revolution for him!

Source: Apple
After this meeting, we immediately dove into the accessibility settings, and it took us a while just to get an overview of all the functions. “Are Sound Actions and Voice Control the same thing? Don’t these settings also exist on the iPhone?” The fact is that the accessibility features on the Vision Pro are very similar to those on iOS. However, there are also features, such as Zoom, that work completely differently. In this blog post, we want to look at the most interesting and important accessibility features.
Accessibility features in detail
Voice Control
Anyone familiar with the accessibility options on the iPhone already knows everything there is to know about Voice Control. You can navigate through the menus using phrases such as “Open App Store” and “Tap Continue” – this also works on the glasses. That already covers a large part of the navigation, but it does not yet exploit the unique strengths of the Vision Pro. We’ll get to those with the next option.
Sound Actions
One of the most innovative features of the Vision Pro is eye tracking. The first moments with the glasses feel like magic. You simply look at the desired button and press your thumb and index finger together. No aiming with controllers or hands, just looking is enough. For some people with disabilities, the first part may be easy, but the second part – pressing your index finger and thumb together – can be tedious or even impossible.
Sound Actions close this gap. There are a handful of ready-made “sounds” (e.g. a tongue click or “Shhh”) to which commands can be assigned. Unlike Voice Control, these are not phrases but very simple, short sounds. The obvious use is as an alternative way to “press” the button you are looking at with eye tracking. This means you can operate most of the menus using only your eyes and these sounds. As the next feature shows, you can even get by with your eyes alone.

Source: Appleinsider.com
Dwell Control
Dwell Control is perfect for eye tracking. To click, you no longer have to press your index finger and thumb together or perform any other action, but simply look at the button for a brief moment. With a few tricks, you can even scroll and move windows around. However, this is clearly the interaction with the steepest learning curve.
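For developers, the good news is that all of these input methods – pinch, Sound Actions, Dwell Control – ultimately trigger a standard tap on whatever the user is targeting. An app built from regular controls therefore supports them with no extra work; all it should add is a clear label so Voice Control users can name the element. A minimal visionOS sketch (the view and label text are our own illustration, not from Apple’s samples):

```swift
import SwiftUI

// A standard SwiftUI button works with eye tracking + pinch,
// Sound Actions, and Dwell Control out of the box, because all
// of them synthesize an ordinary tap on the focused control.
struct PlayerControls: View {
    @State private var isPlaying = false

    var body: some View {
        Button(isPlaying ? "Pause" : "Play") {
            isPlaying.toggle()
        }
        // The accessibility label is also what a Voice Control
        // user can say, e.g. "Tap Play".
        .accessibilityLabel(isPlaying ? "Pause" : "Play")
    }
}
```

The design takeaway: avoid custom gesture recognizers for core actions where a plain `Button` will do, since the system can only route these alternative inputs to standard controls.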
Pointer Control
With Pointer Control, you can use your head position, your wrist, or your index finger as a pointer instead of your eyes.

Source: Youtube
The Vision Pro covers individual needs
That was just a brief overview of the most important accessibility features. But there are, of course, countless options and combinations. Typing and scrolling with a PlayStation controller? No problem! Switching sound on and off with a shake of the head? That works too! It gets really exciting when you consider that the Vision Pro can be connected to a Mac – the computer can then be operated entirely without a keyboard and mouse. We recommend this YouTube video as a supplement to this blog post.
Vision Pro: The digital aid for accessibility of the future?
If you look at the glasses from an accessibility perspective, it becomes clear that much more is possible. Just one idea: live subtitles for deaf people in a real conversation, ideally as a speech bubble above the person speaking. Of course, you don’t see many people wearing the glasses on the street yet. But the potential is huge if you think of the glasses as an “accessibility filter” for the real world – exactly what “augmented reality” actually promises: an improved reality.
Contribution by: Raphael Anderegg, developer specializing in AR applications