As I mentioned in my last post, I’ve started developing a plugin for Unity that will make the UI (uGUI) accessible for blind and sight-impaired users. I’ve made some progress and done some more research, and it is definitely time for an update.
Google TalkBack and iOS VoiceOver
I really like Google as a company. As in, I would love to work there and I have Google stock in my investment portfolio, even though I usually stay away from technology companies. Unfortunately, in terms of accessibility on Android, they have some catching up to do.
The integrated screen reader that comes with Android is called TalkBack. It kind of works, but not as smoothly as it could. I put my phone in accessibility mode with a blacked-out screen for a few days. Then I had to give up – most of the apps are not accessible enough to be usable, at least not for a novice. I constantly got stuck somewhere. I am talking about Chrome, News, Music and Email – nothing fancy. More importantly, those are Google’s in-house apps, so I expected a smooth ride. After some research online (yes, googling, pun intended) I found that others shared my experience. Blind users usually install additional software, such as different email clients and browsers, that is more accessible. But at least from what I learned, the majority of users seem to choose an Apple device instead.
The iOS counterpart to TalkBack is called VoiceOver. It does more or less the same thing in terms of screen reading, but I found the navigation gestures a little more intuitive, and I felt more in control. I still got stuck a few times using my iPad this way, but not as often as on Android. Another plus is the physical Home button that all iOS devices have: a quick, impossible-to-miss safety net when you get stuck. The advantage of a palpable physical button when you can’t see the screen needs no explanation.
From a developer’s standpoint
I suppose both systems do their job – but as a Unity developer, TalkBack is a lot more difficult to deal with. Let me explain why.
Almost all of VoiceOver’s special gestures use multi-touch – the two-finger double tap or the two-finger swipe up, for example. This makes a lot of sense: plenty of games and other apps want to use single-finger swipes themselves, so a screen reader shouldn’t interfere with them.
TalkBack, however, uses only single-finger gestures. I suppose not all Android devices support multi-touch, so they ended up with the lowest common denominator. As a result, when TalkBack is enabled, it blocks all single-finger input. I am not sure how native apps deal with this, but for Unity it is detrimental: TalkBack doesn’t work with Unity itself, yet it still blocks all the input. I cannot try to catch a swipe gesture if I never receive the finger-down event in the first place.
VoiceOver does much the same thing with multi-finger input – but with one major difference: it has a setting that turns this input blocking off, giving full control to the app, and Unity enables that setting by default. So while the Unity app is active and in the foreground, VoiceOver sits quietly in the background. Unlike its iOS counterpart, TalkBack has no such setting, as far as I could find.
This isn’t just bad news for my plugin. The Unity game I had already prototyped when I began this quest uses single-finger gestures as its main gameplay component. This is beyond frustrating, and I haven’t found a way around it yet. My best solution for the time being is to detect when TalkBack is running and ask users to suspend it while they’re inside the app.
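In case anyone wants to try the same workaround: Android’s AccessibilityManager exposes whether touch exploration (the mode TalkBack uses to intercept single-finger input) is currently active, and Unity can reach it through the JNI bridge. Here is a minimal sketch – the class name `TalkBackDetector` is my own, and strictly speaking this detects any touch-exploration service, not TalkBack specifically:

```csharp
using UnityEngine;

public static class TalkBackDetector
{
    // Returns true when a touch-exploration screen reader (such as TalkBack)
    // is intercepting single-finger input. Android only; false elsewhere.
    public static bool IsTouchExplorationEnabled()
    {
#if UNITY_ANDROID && !UNITY_EDITOR
        using (var unityPlayer = new AndroidJavaClass("com.unity3d.player.UnityPlayer"))
        using (var activity = unityPlayer.GetStatic<AndroidJavaObject>("currentActivity"))
        // "accessibility" is the value of Context.ACCESSIBILITY_SERVICE
        using (var manager = activity.Call<AndroidJavaObject>("getSystemService", "accessibility"))
        {
            return manager != null && manager.Call<bool>("isTouchExplorationEnabled");
        }
#else
        return false;
#endif
    }
}
```

At startup (or on focus gain), the game can call `TalkBackDetector.IsTouchExplorationEnabled()` and, if it returns true, show a spoken prompt asking the player to suspend TalkBack before continuing.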
And just to make matters worse, TalkBack lets users configure their own gestures for certain actions. I have not found a way to query these custom gestures, so I can only listen for the ones available in the default settings.