Unity Accessibility Plugin – Update 2 – First Version Up and Running

Recently I began writing a plugin for Unity that would make the user interface of apps (using uGUI) accessible for blind and sight-impaired users. If you missed the first two parts of this dev blog, read them here and here.

Unity Logo wearing a blindfold, hands reaching out carefully to feel around.

Label, Value, Type and Hint

For the most part, I followed the iOS VoiceOver design for my plugin, because it makes the most sense to me. Each UI element has a label, a value, a type, and a hint. All of these will be read out when a UI element is selected, unless the user makes another input. (Note: There are a few optional additional properties, such as traits, language and frame, but those are kind of out of scope for this plugin.)

This allows the user to quickly scroll through the UI elements and get the most relevant information (the label) first. If he needs more information, he can stop and listen to the rest.
Here are two examples of how that works:

Example 1:
The selected UI element is a toggle for the in-game tutorials. The system would read out “Tutorials”, then after a brief moment add the value “Enabled”. After another moment it would read “Switch” to explain the object type. After a longer pause it would then add “Double tap to change.”

Example 2:
The selected UI element is the PLAY button. It has no value, so this part is skipped. The system reads out “Play”, waits for a brief moment, and continues with “Button”. After a longer pause, it reads “Double tap to select.”
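To make this sequencing concrete, here is a minimal sketch of how a read-out like that could be driven in Unity. All class and member names below (AccessibleElement, AccessibilityReader) are placeholders of my own, not the plugin’s final API, and the pause lengths are made up:

```csharp
using System.Collections;
using UnityEngine;

// Illustrative data holder – one of these would sit on every accessible UI element.
public class AccessibleElement : MonoBehaviour
{
    public string label;   // e.g. "Tutorials"
    public string value;   // e.g. "Enabled" – empty for elements without a value
    public string type;    // e.g. "Switch", "Button"
    public string hint;    // e.g. "Double tap to change"
}

public class AccessibilityReader : MonoBehaviour
{
    const float shortPause = 0.5f;  // placeholder timings
    const float longPause  = 1.5f;

    Coroutine current;

    // Called whenever a new element is selected. Any ongoing read-out is
    // cut short, so quick scrolling only ever plays the labels.
    public void OnElementSelected(AccessibleElement element)
    {
        if (current != null)
            StopCoroutine(current);
        current = StartCoroutine(ReadOut(element));
    }

    IEnumerator ReadOut(AccessibleElement e)
    {
        Speak(e.label);
        yield return new WaitForSeconds(shortPause);

        // No value? Skip straight to the type, like the PLAY button example.
        if (!string.IsNullOrEmpty(e.value))
        {
            Speak(e.value);
            yield return new WaitForSeconds(shortPause);
        }

        Speak(e.type);
        yield return new WaitForSeconds(longPause);
        Speak(e.hint);
    }

    void Speak(string text)
    {
        // Hand off to the platform's text-to-speech engine here.
        Debug.Log("TTS: " + text);
    }
}
```

Interrupting the coroutine on new input is what makes it possible to flick quickly past elements and hear only the labels.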

First Running Version

A first version of the plugin with some very basic functionality is implemented. The UI can be marked up with accessibility components and the plugin – if enabled – will automatically navigate through them, read them out and allow the user to interact. The plugin currently supports Text Labels, Buttons, Toggles and Sliders.

Screenshot from Unity showing a Menu screen with a number of buttons and sliders.

The plugin in action.

Everything is currently made to work with the new Unity UI (uGUI). I have a task on my list to look into NGUI as well, but it is pretty far down the list. Once the core is working, I am hoping it won’t be too hard to adapt it to the different UI systems Unity has to offer. My feature wishlist in JIRA has been growing steadily since I started.

The plugin also includes yet another feature from VoiceOver: iOS puts a frame around the currently selected UI element. I know that this is completely useless for someone who is blind. But as a sighted developer, I find it incredibly helpful. So I added that in as well.
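In uGUI, that frame costs almost nothing: keep a single outline image around and snap it to the selected element’s RectTransform. A rough sketch, assuming a screen-space canvas and an outline sprite set up elsewhere:

```csharp
using UnityEngine;

// Keeps a single outline image (on 'frame') hovering over whatever
// element is currently selected. Sketch only – ignores scaling and
// assumes the frame lives on the same screen-space canvas as the UI.
public class SelectionFrame : MonoBehaviour
{
    public RectTransform frame; // carries the outline Image

    public void HighlightElement(RectTransform target)
    {
        frame.gameObject.SetActive(target != null);
        if (target == null)
            return;

        // Collapse the anchors so sizeDelta equals the actual size,
        // then copy the target's pivot, position and dimensions.
        frame.anchorMin = frame.anchorMax = new Vector2(0.5f, 0.5f);
        frame.pivot = target.pivot;
        frame.position = target.position;
        frame.sizeDelta = target.rect.size;
    }
}
```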

Windows

Because I develop on a Windows PC, and because I am lazy and don’t want to build and run on my Android phone for every little test, I built a quick library to access SAPI, the Text-to-Speech system that comes with every Windows installation. This allows me to test directly in the Unity editor and still have my screen read out to me. As a side effect, since I already implemented the keyboard controls, the plugin now supports Windows as a platform.
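The C# side of such a bridge is tiny. Here is a sketch of roughly what it looks like – the native DLL name and its exported functions (“UAP_WindowsTTS”, TTS_Speak, TTS_Stop) are illustrative, not the actual library:

```csharp
#if UNITY_EDITOR_WIN || UNITY_STANDALONE_WIN
using System.Runtime.InteropServices;

// Thin C# side of the SAPI bridge. The DLL name and its exports
// ("UAP_WindowsTTS", TTS_Speak, TTS_Stop) are illustrative only.
public static class WindowsTTS
{
    [DllImport("UAP_WindowsTTS", CharSet = CharSet.Unicode)]
    static extern void TTS_Speak(string text);

    [DllImport("UAP_WindowsTTS")]
    static extern void TTS_Stop();

    public static void Speak(string text)
    {
        TTS_Stop();       // interrupt current speech, like a screen reader would
        TTS_Speak(text);
    }

    public static void Stop()
    {
        TTS_Stop();
    }
}
#endif
```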

Further Reading

I found a wonderful article by Matt Gemmell about common misconceptions sighted people have about making their apps accessible. It’s well written, easy to read, and definitely worth your time if the topic interests you even a little bit or if you are considering making your own app accessible. Here is the link.

Continue Reading with the next part here.

Unity Accessibility Plugin – Update 1 – TalkBack, VoiceOver and Unity

As I mentioned in my last post, I’ve started developing a plugin for Unity that will make the UI (uGUI) accessible for blind and sight-impaired users. I’ve made some progress and done some more research, and it is definitely time for an update.

Google TalkBack and iOS VoiceOver

I really like Google as a company. As in, I would love to work there and I have Google stock in my investment portfolio, even though I usually stay away from technology companies. Unfortunately, in terms of accessibility on Android, they have some catching up to do.

A woman wearing a blindfold is trying to use an iPad.

Not as impossible as it looks.

The integrated screen reader that comes with Android is called TalkBack. It kind of works, but not as smoothly as it could. I put my phone in accessibility mode with a blacked-out screen for a few days. Then I had to give up – most apps are not accessible enough to be usable, at least not for a novice. I constantly got stuck somewhere. I am talking about Chrome, News, Music and Email, nothing fancy. More importantly, those are the in-house apps from Google, so I expected a smooth ride. After some research online (yes, googling, pun intended) I found that others share this experience. Blind users usually install additional, more accessible software, such as third-party email clients and browsers. But from what I learned, the majority of users seem to choose an Apple device instead.

A hand is holding up an iPhone and pointing the Home button towards the viewer.

You can see the advantage of having an actual button. Get it? You can see it?

The iOS counterpart to TalkBack is called VoiceOver. It does more or less the same thing in terms of screen reading, but the navigation gestures seemed a little more intuitive and left me feeling more in control. I still got stuck a few times using my iPad this way, but not as often as on Android. Another plus is the physical Home button that all iOS devices have: a quick, impossible-to-miss safety net when you get stuck. The advantage of a palpable physical button when you can’t see the screen needs no explanation.

From a developer’s standpoint

I suppose both systems do their job – but as a Unity developer, TalkBack is a lot more difficult to deal with. Let me explain why.

Almost all of VoiceOver’s special gestures use multi-touch – the two-finger double tap or the two-finger swipe up, for example. This makes a lot of sense: plenty of games and other apps want to use single-finger swipes themselves, and a screen reader shouldn’t interfere with all of those apps.

TalkBack, however, uses only single-finger gestures. I suppose not all Android devices support multi-touch, so they ended up using the lowest common denominator. As a result, when TalkBack is enabled, it blocks all single-finger input. I am not sure how native apps deal with this, but for Unity it is detrimental: TalkBack doesn’t work with Unity itself, yet it still swallows all the input. I cannot try to catch a swipe gesture if I never receive the finger-down event in the first place.

VoiceOver does much the same thing with multi-finger input – but with a major difference: it has a setting that turns this input blocking off and gives full control to the app, and Unity enables that setting by default. So while the Unity app is active and in the foreground, VoiceOver sits quietly in the background. Unlike its iOS counterpart, TalkBack has no such setting, as far as I could find.

This isn’t just bad news for my plugin. The Unity game I had already prototyped when I began this quest uses single-finger gestures as the main gameplay component. This is beyond frustrating and I haven’t found a way around it yet. My best solution for the time being is to detect when Talkback is running, and ask users to suspend it while they’re inside this app.

And just to make matters worse, TalkBack lets users configure their own gestures for certain actions. I have not found a way to query these custom gestures, so I can only listen for the ones in the default configuration.
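At least the detection half of that workaround is straightforward: Android’s AccessibilityManager can be queried through Unity’s JNI bridge. A sketch of the check (what prompt to show the user is up to the app):

```csharp
using UnityEngine;

public static class TalkBackDetection
{
    // True if an accessibility service with touch exploration
    // (i.e. TalkBack or an equivalent) is currently active.
    public static bool IsScreenReaderActive()
    {
#if UNITY_ANDROID && !UNITY_EDITOR
        using (var unityPlayer = new AndroidJavaClass("com.unity3d.player.UnityPlayer"))
        using (var activity = unityPlayer.GetStatic<AndroidJavaObject>("currentActivity"))
        using (var manager = activity.Call<AndroidJavaObject>("getSystemService", "accessibility"))
        {
            return manager.Call<bool>("isEnabled")
                && manager.Call<bool>("isTouchExplorationEnabled");
        }
#else
        return false;
#endif
    }
}
```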

Continue Reading with the next part here.

Unity Plugin for UI Accessibility

Updates on the game have been scarce of late, I know. I’ve been looking into making my mobile games accessible and as a result felt forced to take on a new side project.

Accessible Mobile Games

Not everybody realizes that blind people are avid smartphone and tablet users too, since both Android and iOS come with integrated screen readers. These help with navigating around the screen, reading out buttons and text, and preventing accidental clicks. There are also a few special gestures that can be drawn on the screen to trigger the Back or Home button.

Blind Man Walking with a stick

The problem is that I am using Unity for my games, and Unity doesn’t produce apps that work with these screen readers. This is perfectly alright in itself – Unity draws its entire interface into its own view rather than using the platform’s native UI widgets, so from the operating system’s point of view there is simply nothing on screen to read.

No Unity Solution – yet

I searched up and down the Internet for a solution, but all I could find were other people facing the same problem. They either gave up on making their apps accessible, or had to re-implement their menus and either record additional audio or use text-to-speech engines. If you want to read a great story from one such developer, let me point you to this Gamasutra article.

Woman standing excitedly in front of the Unity Asset Store.

My usual asset store experience.

My hope was to find a plugin to do some of the heavy lifting for me, but for the first time the mighty Asset Store, my trusted friend, who usually has a solution for every problem known to man, came up empty. There are a number of text-to-speech engines, but nothing that automatically handles UI navigation, reads out the elements, and makes sure buttons can be triggered, sliders moved and toggles flipped – let alone listens for any magic gestures.

So I decided to write such a plugin myself.
I have no clue whether the world at large has any need for this (I guess not, or someone would have done it already), but I do, so let’s get started.

Feature Wishlist

In short, what I would like is a way to mark up my UI (uGUI) with accessibility tags. The plugin then has to automatically let users navigate the menus using gestures, interact with the UI, and have a text-to-speech system read out what’s on screen.

There will obviously be some manual setup involved, but a good plugin will make things work with minimal effort. Everything should work on Android and iOS, including Text-to-Speech.
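To give an idea of what I mean by markup, the kind of component I picture attaching to a button might look like this – every name here is made up, since none of it exists yet:

```csharp
using UnityEngine;
using UnityEngine.UI;

// Hypothetical markup component: attach to a uGUI Button to make it
// visible to the plugin. The name and fields are placeholders – none
// of this exists yet.
[RequireComponent(typeof(Button))]
public class AccessibleButton : MonoBehaviour
{
    [Tooltip("Text to read out; falls back to the button's label if empty.")]
    public string nameOverride;

    [Tooltip("Reading order on the screen; lower values are read first.")]
    public int traversalOrder;
}
```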

First Steps – UI Navigation

Unfortunately, there seems to be no gold standard of UI navigation for the blind. Both Android and iOS have their own way of navigating the screen. Sure, both use swipes, but that is about where the similarities end. On Android, you swipe down to jump to the next element on the screen. On iOS devices you swipe right; swiping down jumps to the next UI container instead. On Android you can jump to the top of the screen by swiping up and then immediately down. On iOS you can read from the top of the screen with a two-finger upward swipe. The list goes on forever.

I can only assume that the respective users would like to keep using the gestures they are familiar with – so a good plugin will have to support the appropriate navigation for each platform. Heck, while I am at it, I might as well throw in keyboard support and make everything work on PC as well.
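One way to keep that manageable is to translate every input source into the same small set of navigation actions and let the platform decide which raw gesture maps to which action. A hypothetical sketch, using the swipe directions described above:

```csharp
using UnityEngine;

public enum NavAction { None, NextElement, PreviousElement, Activate }

// Translates raw input into navigation actions, per platform convention.
// Swipe detection itself (thresholds, timing) is assumed to happen
// elsewhere and hand in a normalized direction vector.
public static class GestureMap
{
    public static NavAction FromSwipe(Vector2 direction)
    {
#if UNITY_IOS
        // VoiceOver convention: horizontal swipes step through elements.
        if (direction.x >  0.5f) return NavAction.NextElement;
        if (direction.x < -0.5f) return NavAction.PreviousElement;
#elif UNITY_ANDROID
        // TalkBack convention: vertical swipes step through elements.
        if (direction.y < -0.5f) return NavAction.NextElement;
        if (direction.y >  0.5f) return NavAction.PreviousElement;
#endif
        return NavAction.None;
    }

    public static NavAction FromKeyboard()
    {
        // PC fallback: arrow keys navigate, Return activates.
        if (Input.GetKeyDown(KeyCode.DownArrow)) return NavAction.NextElement;
        if (Input.GetKeyDown(KeyCode.UpArrow))   return NavAction.PreviousElement;
        if (Input.GetKeyDown(KeyCode.Return))    return NavAction.Activate;
        return NavAction.None;
    }
}
```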

Continue Reading with Part 2 here.