About once every month I receive an email or Twitter message asking – usually outstandingly nicely – whether my Unity Accessibility Plugin will make it possible for blind developers to create games with Unity. It’s happened often enough that I think it warrants a short post to clarify what my plugin does, and what it does not do.
So similar and yet so different.
It’s a screen reader
In a nutshell, the plugin is a screen reader, specifically tailored to work with apps and games created with Unity. Neither VoiceOver nor TalkBack can recognize the UI elements that Unity renders, so without the plugin, every app created with Unity is automatically inaccessible. The important part is that the plugin makes the apps created with Unity accessible, not Unity itself.
It does not make Unity itself accessible
The Unity Editor – at least on Windows – is not very accessible to screen reader software. NVDA will read the menus and the names of the individual panels, but nothing else. JAWS apparently fares little better. For actual development work, this is useless. This particular plugin doesn’t change that, unfortunately.
Experimental accessibility for the Editor
A few weeks ago, over Christmas, I was playing around with making the Editor itself accessible. Inspired by a blind developer who wanted to use Unity to make a Go-Fish game I started to create a plugin that adds accessibility functionality to Unity. But this project is so early in its infancy that I feel almost uncomfortable writing about it at all.
Currently, this plugin adds keyboard shortcuts to read out the errors in the console, plays sound effects when entering or leaving play mode, and notifies the developer if compilation fails. It lets the developer tab through the game objects in the scene hierarchy, reading out their names and how many children they have, along with hints on how to add new children. It also makes the project view a little more accessible, reading out the names of files and folders.
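To give a sense of what this kind of editor extension looks like, here is a minimal sketch of the play-mode sound cue, written against Unity's editor scripting API (assuming a Unity version with `EditorApplication.playModeStateChanged`, i.e. 2017.2 or later). This is an illustrative sketch, not code from the actual plugin; `EditorApplication.Beep()` stands in for the distinct sound effects the real plugin would play.

```csharp
using UnityEditor;

// Editor-only script: runs as soon as the editor loads or scripts recompile.
[InitializeOnLoad]
public static class PlayModeAudioCue
{
    // Static constructor is invoked by [InitializeOnLoad].
    static PlayModeAudioCue()
    {
        EditorApplication.playModeStateChanged += OnPlayModeChanged;
    }

    static void OnPlayModeChanged(PlayModeStateChange state)
    {
        // A real implementation would play two distinguishable sounds here
        // so a blind developer can tell the two transitions apart.
        if (state == PlayModeStateChange.EnteredPlayMode ||
            state == PlayModeStateChange.ExitedPlayMode)
        {
            EditorApplication.Beep();
        }
    }
}
```

Dropped into any folder named `Editor` inside the project, a script like this runs automatically – no setup required from the developer, which matters a lot when the setup UI itself isn't accessible.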
But I haven’t found any solution for managing the components on a game object. Unity’s Inspector window supports keyboard navigation, but I can’t find a way to query what is currently highlighted and focused, so that I could tell NVDA (or any other screen reader) to read it out. Managing components and their values, usually in the form of prefabs, is probably the most important core feature of Unity. That makes this a major roadblock at the moment.
Interested in joining the project?
In a finished version I would love this plugin to allow blind developers to do as much as possible with the Editor, including the creation and management of prefabs. It should also be possible to create builds for the various target platforms. And it should include all kinds of shortcuts to make the most common tasks quick to do. I would also want it to include a stack of documentation and tutorials on how to use Unity and create games with it without sight. Then throw in a demo project or two. All neatly wrapped in a free, easy-to-download-and-install package.
I would love to see this work. But I am enough of a realist to know that I don’t have a lot of time left over to put into this – at least not while I’m still working on the other accessibility plugin. If I tried to split my time between the two, neither one would ever get finished.
For that reason, this is an open invitation to other developers willing to help with this. I’d be happy to put what I have up on GitHub if anyone was interested in joining in.
Just contact me: email@example.com