Finally a screen reader…in a VR headset
I’ve been interested in the use of virtual reality (VR) and associated technologies for longer than some of my work colleagues have been alive. The one aspect I’ve continually focused on with these technologies is their accessibility.
When I purchased my first commercial headset (a Quest 1) I was disappointed. It felt like the early days of computing, where accessibility was an afterthought. A number of app developers took it upon themselves to make their games more inclusive, but one thing continued to stand out: the operating system itself really lacked any accessibility features.
A few years later I acquiesced and purchased a Quest 3, hoping to see some changes. Did I? Well, to some extent, but not to the level I felt it could reach with the right focus.
VR is very much an immersive visual medium. But when you peel away the graphics there is an immersive auditory experience that everyone should be able to access. If the primary interface, the operating system, isn’t accessible to people with vision impairment or blindness, how can they ever get to experience it?
Admittedly, my latest headset hadn’t felt any love for a while; it was gathering dust faster than any accessibility issues were being addressed.
That is, until I reviewed a new immersive “productivity” suite bundled into the v81 release of Meta’s Horizon OS. The concept behind this new option is to create a link between Windows 11 and the Meta Quest via Mixed Reality Link. (More information can be found here: https://blogs.windows.com/windowsexperience/2025/10/30/immersive-productivity-with-windows-and-meta-quest-now-generally-available/)
Suffice it to say, I can see the potential.
Suddenly I can access my PC’s screen in widescreen (even ultra-widescreen if you want to go to extremes) and focus in on on-screen elements without physically having to move closer to the monitor. Interesting. Few dedicated assistive technologies for people with a vision impairment have been able to achieve the same, especially with such an expansive active field of view.
Which, of course, brought me back to my original concern about the accessibility of the operating system itself. Whilst being able to cast my PC screen into my Quest 3 will be beneficial (it’s just given my Forza Horizon apps a new lease on life), you still need to navigate the Quest’s visual menu system.
The Meta Quest has had some rudimentary accessibility features for some time including:
- Adjusting your height within a virtual environment.
- Adjusting the audio balance.
- Managing colour corrections.
- Adjusting display contrast.
- Turning on mono audio.
It seems that, whilst my Quest 3 was not feeling the love, Meta have implemented a reasonably comprehensive screen reader.
A screen reader…
…in a VR headset.
Like any screen reader, this one is going to take some time to get used to. But my initial impressions are very favourable. OK, the default English (UK) voice was unlike any text-to-speech voice I have heard before (I won’t attempt to describe it), but I envisage a more expansive selection will arrive over time.
Strangely, although the Quest 3 generally doesn’t need the controllers to navigate the operating system, I found I had to pick them up again to access the screen reader. Apple’s Vision Pro manages to integrate hand gestures with its built-in screen reader: I wonder how long it will be before Meta follows suit?
Whilst having a screen reader (finally) built into the headset is welcome, it won’t mean a lot unless software developers support it. I suspect I may need to invest in some new titles to see whether the screen reader can interact with games in a meaningful way.
Interesting times ahead!