Unity Accessibility Plugin – Update 18 – WebGL Support

One of the features requested rather often for the Unity Accessibility Plugin is WebGL support. I’m happy to announce that it is almost ready to ship. In this post I’ll explain a bit about the problems WebGL presents – and their solutions.

TL;DR
Check out the UAP WebGL Demo

Woman sitting in front of her PC which displays the WebGL logo. She's looking exhausted, but happy.

Yes, the Unity Accessibility Plugin will have WebGL support!

WebGL and screen readers

My latest update to the UAP (v1.0.4) includes experimental support for WebGL. Experimental is my way of saying that the screen reader portion of the plugin works – i.e. UI navigation and interaction – but there is no speech synthesis. To use the plugin with WebGL, developers would have to write their own text-to-speech solution or purchase a third-party plugin and connect it to the UAP.

This is because a Unity WebGL app cannot simply use the speech output of the user's system. Browsers rely on the user having a screen reader installed, and that works for text-based content, but not for 2D and 3D content rendered by Unity via WebGL. Unity cannot access a user's screen reader through the browser.

I’m currently working on the v1.0.5 release of the plugin to solve this issue.

Not all Text-to-Speech is created equal

There are of course high-quality speech synthesis plugins available on the Asset Store, which can be made to work with the UAP with just a few lines of code. But I would like the plugin to be as self-contained as possible and to work out of the box as much as I can manage. So I want something I can include with the plugin.

I could possibly get some low quality speech synthesis up and running based on some of the free and open source tech out there. But there’s no point in including a crappy text-to-speech engine with the plugin that produces barely intelligible audio.

Do you want your app to sound like this?

Keep reading – it gets better!

Google Cloud Text-to-Speech

Google offers a cloud based Text-to-Speech system. Cloud-based means that the Unity game would send the text it wants spoken to the Google server and Google sends it back a sound file with the generated speech.
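
To give an idea of what that exchange looks like, here is a minimal sketch of a Unity coroutine calling the public Google Cloud Text-to-Speech REST endpoint with UnityWebRequest. The class name and JSON handling are illustrative and not the actual UAP implementation, and decoding the returned audio into an AudioClip is omitted:

using System;
using System.Collections;
using UnityEngine;
using UnityEngine.Networking;

// Sketch only: send text to the Google Cloud Text-to-Speech REST API and
// receive the generated speech as base64 encoded audio data.
public class GoogleTTSExample : MonoBehaviour
{
    public string apiKey = "YOUR_API_KEY"; // placeholder - created in the Google Cloud console

    public IEnumerator Speak(string text)
    {
        string url = "https://texttospeech.googleapis.com/v1/text:synthesize?key=" + apiKey;
        // Ask for a US English voice and uncompressed audio.
        // (No escaping of quotes inside 'text' here, for brevity.)
        string json = "{\"input\":{\"text\":\"" + text + "\"}," +
                      "\"voice\":{\"languageCode\":\"en-US\"}," +
                      "\"audioConfig\":{\"audioEncoding\":\"LINEAR16\"}}";

        using (UnityWebRequest request = new UnityWebRequest(url, "POST"))
        {
            request.uploadHandler = new UploadHandlerRaw(System.Text.Encoding.UTF8.GetBytes(json));
            request.downloadHandler = new DownloadHandlerBuffer();
            request.SetRequestHeader("Content-Type", "application/json");
            yield return request.SendWebRequest();

            if (request.isNetworkError || request.isHttpError)
            {
                Debug.LogWarning("TTS request failed: " + request.error);
                yield break;
            }

            // The response is JSON containing the audio as a base64 string.
            TTSResponse response = JsonUtility.FromJson<TTSResponse>(request.downloadHandler.text);
            byte[] audioBytes = Convert.FromBase64String(response.audioContent);
            // Decoding audioBytes into an AudioClip and playing it is omitted here.
        }
    }

    [Serializable]
    private class TTSResponse { public string audioContent; }
}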

I’m looking for a solution specifically for WebGL, meaning players will already be online. So an internet-based solution is a viable option.

And the quality is great. Here's the same sentence from the previous example, generated by Google:

Go ahead and listen to the first one again, to compare…

Latency

I was a little worried about the speed of the system. Being cloud based, the process is a little slower than a screen reader, which runs directly on the end user's system. Not only does the speech have to be synthesized, but it also has to be sent as an audio file over the internet to the end user.

But after testing the system both in the Unity Editor and in an actual WebGL app, I found that it's not bad at all.
If you want to know how good or bad the latency is, here’s a demo:
UAP WebGL Demo

Some assembly required – but not much

The upcoming v1.0.5 release of the UAP plugin will include fully integrated support for the Google Text-to-Speech service. All a developer will have to do is to set their API key in the plugin settings and it will work. I will include a guide with lots of pictures on how to create the API key.

UAP plugin settings showing an input field asking to insert the Google API key.

The API key shown in this picture is bogus, by the way 🙂

Because the service is part of the Google Cloud, it isn't entirely free and you need a Google account for it. You do get a free trial with $300 in credit to use over 12 months though, which is pretty awesome.

Here’s the link, if you want to test it out:
https://cloud.google.com/text-to-speech/
No, I’m not getting paid by Google, I just think a direct link is safe and convenient.

Other TTS Plugins

No solution is right for everyone. Developers might not want to use the Google Cloud API, or they might already own a different Text-to-Speech plugin.

With the new release, I reworked the way the UAP handles Text-to-Speech to make it very simple to connect to any other TTS plugin the developer might want to use. Not that it was complicated before, but it's just gotten a lot easier. I also added step-by-step instructions to the documentation.
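
To illustrate how simple the hookup is meant to be, here is a rough sketch of the kind of adapter a developer could write. The class and method names are made up for this example and are not the actual UAP interface – the point is just that the plugin only ever needs a "speak this string" and a "stop speaking" entry point to forward to:

using UnityEngine;

// Illustrative adapter only - the real interface names in the UAP may differ.
public class MyCustomTTSAdapter : MonoBehaviour
{
    // Called whenever a UI element's text should be read aloud.
    public void SpeakText(string text)
    {
        // Forward to whatever TTS plugin you own, e.g. ThirdPartyTTS.Speak(text);
        Debug.Log("Would speak: " + text);
    }

    // Called when speech should be interrupted, e.g. when the user moves on to the next element.
    public void StopSpeaking()
    {
        // ThirdPartyTTS.Stop();
    }
}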

Screen Reader Detection

I’ve written before about how important it is to detect whether your users are or (more importantly) are not using a screen reader.
Read the post here: Playing Hide-And-Seek with TalkBack

But it is near impossible to auto-detect whether a screen reader is installed or running on a user’s system when all you have is a WebGL app.

The reason is simple. The app could run in a myriad of different browsers and operating systems, and each system has a range of different screen readers that could or could not be installed. There is simply no way to get that kind of system information, especially not from within a browser.

In other words, on WebGL the UAP cannot detect whether it should turn itself on or off automatically. The only way to work around this is to pass a parameter to the game, for example via the URL. This is unfortunately still up to the developer.
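
Here is a hedged sketch of what that could look like. Unity exposes the page URL through Application.absoluteURL on WebGL, so a developer can check for a query parameter and turn accessibility on manually. The parameter name and the EnableAccessibility() call below are placeholders, not a fixed UAP API:

using UnityEngine;

// Sketch: read a query parameter from the page URL on WebGL builds and
// enable accessibility support manually if it is present.
public class WebGLAccessibilityBootstrap : MonoBehaviour
{
    void Start()
    {
#if UNITY_WEBGL && !UNITY_EDITOR
        // Example page URL: https://example.com/mygame/index.html?accessibility=1
        string url = Application.absoluteURL;
        if (!string.IsNullOrEmpty(url) && url.Contains("accessibility=1"))
        {
            EnableAccessibility();
        }
#endif
    }

    void EnableAccessibility()
    {
        // Placeholder - call into the accessibility plugin here to turn it on.
        Debug.Log("Accessibility mode requested via URL parameter.");
    }
}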

Unity Accessibility Plugin – Update 17 – Hide and Seek with TalkBack

One thing that continues to keep me on my toes is screen reader detection. And when I say screen reader, of course what I really mean is Android TalkBack.

Screen Reader Detection – and why it’s Important for your Sighted Users

My plugin needs to know whether a screen reader is running, so it can spring into action and turn itself on.
And this needs to work flawlessly. 

We recently released a Crafting Kingdom update that turned on accessibility mode by accident on some devices – and we got swamped with feedback like this:
"My phone started muttering to me and none of the buttons work!"
(Note: I edited out the many colorful expletives that embellished these complaints.)

Man looking in horror at his cellphone, which is telling him to double tap a button to activate it.

“Help! What’s going on? It speaks!”

See, if you’re sighted, using a screen reader is not something that comes naturally. In fact, it is almost counter-intuitive. I’ve watched several friends and relatives stab at their screen helplessly, and after about 10 seconds they had enough. Our actual users will probably have less patience.

In a best-case scenario, they'll simply uninstall the game. They might also leave a bad review. If they spent money, they'll likely want a refund.
In a nutshell, better to err on the side of caution and only activate the plugin when a screen reader has been detected for sure.

I worked through the weekend to find a fix, because the only other option was to disable accessibility completely.

Detecting TalkBack on Android – the Pain Continues

Every major cellphone brand has their own version of Android. Samsung even has their own version of TalkBack, Samsung Voice Assistant. Code that works on one device is not guaranteed to work on another. And every update to Android can potentially cause more trouble. This happened with the Android Oreo release.

As it turns out, that update was incompatible with my TalkBack detection code. I had three different mechanisms implemented, just to be on the safe side. After the update, one of them would crash, and the other two would always report that TalkBack was turned off.

Woman asks TalkBack whether it is running in the background. TalkBack hesitates and then replies 'Maybe'

Is it me, or is TalkBack dodging me…?

I rewrote my code and tried what I thought was a clever trick: I checked the device's settings to see if accessibility was enabled. It worked great on my clean, stock Android Nexus test devices. But when I tested it on an LG phone, the settings would always claim TalkBack was running, even when it was off. I was pulling my hair out at this point.

Not all devices were affected, but some of the popular ones were – most Samsung Galaxy S devices for example. Samsung has a market share of around 35% in the US. Not a user base you want to ignore. In fact, my company just bought yet another Samsung phone as a test device because of this.

Here’s the Solution – and the Source Code

In the end, I solved the problem by asking the phone for all accessibility services that are both currently enabled and provide spoken feedback. If there are any at all, it's fairly safe to say that a screen reader is on – or that the user is familiar enough with them not to freak out if the phone starts talking.

That worked on all test devices that I could find lying around in the office. It also worked on ten different devices in the Firebase test lab. That’s a Google service that runs your app on a plethora of different devices for you and provides error reports and video recordings. (Hint: If you haven’t yet taken a look at the Pre-Launch report in your Google Play developer console, don’t ignore it any longer.)

I want to spare other developers the pain, because Googling the issue will lead to a lot of out-of-date code that won't produce the correct result.
So here’s a code snippet for ya (works as of March 4th, 2017):

import android.accessibilityservice.AccessibilityServiceInfo;
import android.content.Context;
import android.view.accessibility.AccessibilityManager;

import java.util.List;

// Returns true if at least one accessibility service that provides spoken
// feedback (e.g. TalkBack or Samsung Voice Assistant) is currently enabled.
private static boolean isScreenReaderActive(Context context)
{
  AccessibilityManager am = (AccessibilityManager) context.getSystemService(Context.ACCESSIBILITY_SERVICE);
  List<AccessibilityServiceInfo> enabledScreenReaderServices =
      am.getEnabledAccessibilityServiceList(AccessibilityServiceInfo.FEEDBACK_SPOKEN);

  return !enabledScreenReaderServices.isEmpty();
}

Disclaimer

Whenever I complain about Android TalkBack I feel the urge to put a disclaimer at the end of my blog, to make sure I'm not mistaken for an Apple fangirl who's bashing Android on principle alone.
I’m not.

If you don't believe me, just come by the office on one of those days when I have to create new certificates and mobile provisioning profiles for our games on the Apple App Store. See how much love I have for Apple then.

Unity Accessibility Plugin – Update 16 – Asset Store Sales (Am I Rich Yet?)

The Unity Accessibility Plugin has been on the Asset Store for 10 months now – time to take a look at the numbers.

Sales and Updates

The plugin is performing well for what it is: A small niche product.
On average, about 4 people pick it up each month. Not all of these are paid, since I occasionally give out a free key, and the plugin was on sale at a reduced price for a while.

Dark haired woman holding up two bags full of money with a satisfied smile

Do you really think this is what’s happening?

If you do the math you’ll see that no one is getting rich from it.
Of course that’s not stopping the occasional side remark:
"Why isn't this free!?" (to which I quietly sigh and then move on)

I always had very realistic expectations about sales (remember my post: Is It Worth Doing?). In fact, I had only anticipated about 1 or 2 per month. So by my standards, the plugin is actually outperforming! And the sales I get allow me to spend some time each month on email support and on updating the plugin. Since release, I’ve updated the plugin three times and I’m about to release a fourth update.

This is exactly what I was hoping for and I’m really happy the plugin is doing this well!

Mac Support (and JAWS)

One of the biggest features I've added was support for voice output on a Mac. I am not a Mac user and feel uncomfortable using one, so I kept putting it off. It wasn't until a friend needed it that I finally took a deep breath and jumped in. The code for it turned out to be surprisingly trivial. My friend also helped me test it, which made it a lot easier.

I know that JAWS support is still sorely missing from the plugin, and it’s in the works. But that is a whole other can of worms, for another day.
Here’s the link to my Road Map on Trello if you’re interested in what else is coming:
UAP Road Map


Unity Accessibility Plugin – Update 15 – Version 1.0 Released

Finally, a little more than a year after its inception, the Unity Accessibility Plugin is now live in the Unity Asset Store!

Three dark lower-case letters u, a and p. The a and p are shaped to resemble eyes or glasses.

The plugin now also has an official logo!

Check it out over here: UAP on the Asset Store

The last few weeks I mainly worked on finishing the documentation, cleaning up the code and creating a tutorial video.
I want to give a huge thanks to my beta testers who helped out a lot and found a few important bugs!

Initial Release: Android and iOS

UAP 1.0 officially supports iOS and Android. With that version out now, I want to focus on supporting Windows and Mac next. The released version actually already contains Windows support, but because it’s lacking in a few places I decided not to advertise that.

Development Roadmap

There are a number of things that didn’t make it into this initial release apart from Mac and Windows support. I have been using a Trello board for the past 5 months to track my work on the plugin, and to build a wishlist of nice-to-have features I want to put in eventually.
It’s a public board, so if you want to take a look at where things are going, here it is:
UAP Roadmap on Trello

 

Unity Accessibility Plugin – Update 14 – NGUI support

After putting it off for months, I finally sat down to attack this task I had secretly been dreading – adding NGUI support to the accessibility plugin.

What is NGUI and why is it important?

Up until now the accessibility plugin for Unity only supported Unity's own UI system (uGUI). The system is included in your standard Unity installation and it's quite capable. But that doesn't mean uGUI is the most used UI solution in Unity. It's still fairly new, and some kinks need to be worked out.

The NGUI logo, an upright hexagon with the lower left side missing, and two horns sticking out the top.

NGUI stands for Next-Gen UI

This is where NGUI comes in. NGUI existed long before Unity introduced their new uGUI system, and it’s one of the most used, most powerful and best supported UI plugins out there. (And I strongly believe that last bit ultimately led to the first two).
Here’s the link to the plugin on the Asset Store: NGUI: Next-Gen UI

The user base for this plugin is quite large and I obviously don't want to exclude them. Since people stick to what they know, I don't expect all those NGUI users to switch over to the new uGUI. Instead, I have to come to them. So, consequently, the plugin needs to support NGUI.

Work In Progress

I was dreading this task because of all the code changes I would have to make to accommodate a different UI system. For example, NGUI uses a completely different coordinate system for its UI, and things that were already working fine, like Explore-By-Touch, would basically have to be reimplemented. All calculations have to be adjusted to be able to tell which NGUI element is under your finger.

It turned out I had been worrying over this for nothing. After a weekend's work, I already have basic NGUI support running. Navigating the UI with swipes or keyboard works, labels are read and buttons can be pressed. Other UI elements, like edit boxes and sliders, are still on the TODO list, but Touch Explore is already working.

It was nowhere near as difficult as I expected. Incidentally, uGUI was written in part by the same developer who created the NGUI system – because Unity hired him after NGUI's success. (Smart move on their part!) As a result, there are a lot of similarities between the two systems that make supporting both rather easy. The biggest differences are in the calculation of screen coordinates and some juggling with the different UI roots and UI cameras that NGUI is built upon.

Compile Errors

But then there is the issue of conditional compilation.

To read information from NGUI's UI elements, I need to access NGUI's script components. But those classes won't exist in any project that doesn't use NGUI, so if I don't separate my code, everybody else gets compile errors. One option to avoid this would be to use reflection for everything NGUI related, but that is just such a mess. Not to mention it doesn't make for easy-to-fix, maintainable code in case some of the NGUI interfaces change in the future – which is very likely to happen, since it is so well maintained.

The only other solution is to create a preprocessor define that conditionally compiles those NGUI-only bits of code in or out. The downside is that I would like my plugin to work out of the box and not require the user to mess around with the Player Settings to manually add define symbols.
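
For illustration, the guarded code ends up looking roughly like this. The define name ACCESSIBILITY_NGUI is an assumption for this example; the plugin may use a different symbol:

#if ACCESSIBILITY_NGUI
// Only compiled when NGUI support has been enabled, so projects without
// NGUI never see a reference to NGUI classes such as UILabel.
public static class NGUILabelReader
{
    public static string GetLabelText(UILabel label)
    {
        return label != null ? label.text : "";
    }
}
#endif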

My compromise is to use a preprocessor define, but to make setting it up as easy as possible for the user.
The plugin tries to detect whether NGUI is present in the project. If it is, it will ask the user to click a button to enable NGUI support and then set the preprocessor flag automatically. I also made it hard to miss: every component, the About window and the Accessibility Plugin menu will offer the option to do this, and the documentation explains how to do it manually as well, with pictures!
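
Setting the flag from code is straightforward in an editor script. This is roughly the mechanism behind such a button, again with an assumed define name:

#if UNITY_EDITOR
using UnityEditor;

// Editor-only sketch: append a scripting define symbol for the current build
// target group - roughly what an "Enable NGUI Support" button would do.
public static class EnableNGUISupportExample
{
    const string Define = "ACCESSIBILITY_NGUI"; // assumed name for this example

    [MenuItem("Accessibility/Enable NGUI Support (example)")]
    public static void AddDefine()
    {
        BuildTargetGroup group = EditorUserBuildSettings.selectedBuildTargetGroup;
        string defines = PlayerSettings.GetScriptingDefineSymbolsForGroup(group);
        if (!defines.Contains(Define))
        {
            PlayerSettings.SetScriptingDefineSymbolsForGroup(group, defines + ";" + Define);
        }
    }
}
#endif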

Too Long, Didn’t Read

Summing up: The initial release of the plugin will come with NGUI support, right from the start.

Unity Accessibility Plugin – Update 13 – How fast can you make an existing app accessible?

With my plugin now in beta testing stage, I was wondering how long it actually takes to make an existing app accessible using the plugin.

So I did a little experiment.

How long does it take?

For this test, I used a game from our company that had just come out of beta testing and was about to be released. Aside from some final bug fixes, the game was done and just waiting for the localization to finish. Not a single thought had been given to accessibility during development.

Result:
It took me a little less than four days to make it totally accessible.

A forest backdrop with the words 'Crafting Kingdom' written in front. A hammer and a pickaxe hover on either side of the text.

If you want to skip the rest of this blog and just check out the game, you can download it here:
Crafting Kingdom on the iOS App Store
Crafting Kingdom on the Google Play Store

Some Perspective

Maybe you have a game in mind that you’d like to make accessible and wonder how much work it would be.
Allow me to give you some additional context, so that you can put my result into perspective and draw your own conclusions.

The game I used is neither small nor simple, and has a ton of UI. It has a scrolling world map, over a dozen different locations with custom UI screens and dialogs, and almost every one of them has a custom help screen. There are tutorial windows and statistics screens. In comparison, the Match 3 game I made had only a main menu screen, the game screen and a pause menu.

A medieval world map. Several locations are highlighted, such as a castle and a city.

This map can scroll in two directions

And while this game has a lot of text, it also makes heavy use of icons instead of text labels. It uses 32 different resource icons, a coin image to indicate a price, an X on cancel buttons, a check mark on confirm buttons, an image of a left arrow to indicate a back button… The list goes on. 

Many games use icons in lieu of text. Using images is prettier, uses less screen space, makes the app playable by people who can't read, and saves money on localization. It is a win-win-win-win for the developer. Really, the only drawback is that it makes accessibility a little harder.

One of the game screen showing a production pipeline. Graphics show that a ladder can be crafted from two pieces of timber and two pieces of leather.

A screen full of information, but barely any written text.

But not that much harder. Adding accessibility text to an image is easy enough; it just means it isn't a one-click deal to add an accessibility component to the UI element and be done. You have to type a name into the appropriate field of the accessibility component. For the product icons I had to modify the code to write the correct text into the accessibility component dynamically.
However, this particular game is localized in 12 languages, and I didn't want to break with that. So I had to generate localization keys in our database as well, which slowed me down a little.
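
For anyone curious what "writing the text dynamically" amounts to, here is a small sketch. The accessibility component hookup and the localization lookup are placeholders for the plugin's actual component and for the game's own localization system:

using UnityEngine;

// Illustrative only: fill in the spoken name of an icon-only UI element at
// runtime, using the same localization keys the rest of the game uses.
public class ResourceIconAccessibility : MonoBehaviour
{
    public string localizationKey = "RESOURCE_TIMBER"; // example key

    void Start()
    {
        // Look up the translated resource name (placeholder for the game's own localization system).
        string spokenName = GetLocalizedText(localizationKey);

        // Hand the text to the accessibility component on this icon, e.g.:
        // GetComponent<AccessibleUIElement>().nameText = spokenName;  (placeholder names)
        Debug.Log("Accessible name for this icon: " + spokenName);
    }

    string GetLocalizedText(string key)
    {
        // Placeholder - in the real game this queries the localization database.
        return key;
    }
}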

The optional Extra Mile

I also didn’t just slap on the accessibility components and call it a day. That would have worked, but I knew that with a little effort the playability could be much improved.
For example, when a help text was useless for a blind player in its original form – e.g. “Click on the white house” – I created an alternate version of the text and had that read out instead.

I added additional sound effects for extra feedback, which are played only when accessibility mode is active. When a treasure chest appears somewhere on the map, a jingle sound will play. When the player cannot afford to buy something, they will be informed verbally that they don't have enough money. Similarly, the game will speak up and tell the player when the warehouse is full. Sighted players are notified of this with an animated, bright red text at the top of the screen.
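
The pattern for these extra audio cues is simple; here is a sketch, with a placeholder check standing in for whatever query the plugin exposes to ask whether accessibility mode is currently active:

using UnityEngine;

// Sketch: extra audio feedback that only plays while accessibility mode is on.
public class TreasureChestAudioCue : MonoBehaviour
{
    public AudioSource audioSource;
    public AudioClip chestAppearedJingle;

    // Called by the game when a treasure chest appears on the map.
    public void OnChestAppeared()
    {
        if (IsAccessibilityActive() && audioSource != null && chestAppearedJingle != null)
        {
            // Sighted players see the chest pop up; blind players get a jingle instead.
            audioSource.PlayOneShot(chestAppearedJingle);
        }
    }

    bool IsAccessibilityActive()
    {
        // Placeholder - ask the accessibility plugin whether it is currently enabled.
        return false;
    }
}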

Game screen showing the farm. It has four work stations, which are busy producing grain, meat and wool.

When a product is finished, a sound effect will be played.

Because this game has so many buttons and labels on the screen all the time, I grouped the UI elements by context, to make navigation a little quicker. The plugin supports up and down swipes (instead of the normal left and right swipes) to skip directly between groups. This means blind players can get to where they want faster, instead of having to cycle through every single UI element on screen. In a few dialogs I also set up a custom manual traversal order for the individual elements, because the most important information wasn’t always at the top. For example, the game always displays the name of the location that you are in at the bottom of the screen.

In one screen I actually decided to change the interface a little. I added a new layer of UI which offered buttons where sighted players get a slider. In that particular context it was the better choice: an accessible slider wouldn't have given blind players the same fine control it gives sighted ones.

Results

With all that extra work, it's fair to ask whether I could have been done sooner had I just done the bare minimum. On the other hand, I know my plugin better than anyone else. The speed advantage I gained from that probably more than made up for the difference.

You know your games best, so you can judge whether my results are applicable to you. In any case, they are a good indicator of the general scope. It doesn’t take months or weeks to make an app accessible. The work is more in the realm of days.

Just to be on the safe side – I am definitely not promising that you can make your game completely accessible within four days using my plugin. So don’t sue me.
(Write to me for help instead!)

How fast you will actually be will probably strongly depend on my ability to write good documentation. That’s what I am currently working on.

Afterthought: Not everybody does accessibility from the start, and that’s OK

As a general rule of thumb, I’d say if accessibility is added in early in development, it will be a smoother ride for everyone.

If you do it from the beginning, you quickly get into the habit of attaching an accessibility component to every UI element right when you create it. And you implement any additional feedback for blind players at the same time that you implement the features themselves. At no time during development will you be more familiar with the feature's code. Thus, you are less likely to forget any special cases and can make sure everything runs smoothly in accessibility mode as well.

But in reality accessibility is often added later – possibly only after the game is finished. I really want to avoid the phrase 'accessibility as an afterthought' here, which I stumble across a lot when browsing. It just sounds so negative. That's unfair to any small game developer who wants to put in the work to make their app accessible. They are trying to do something good and don't deserve to be berated for not having thought about it sooner.

Also consider this – everybody who adds in accessibility late in their current game might add it in early in their next one.

Unity Accessibility Plugin – Update 12 – Release Date and Anniversary

Roughly a year ago I started work on the Unity Accessibility Plugin. That doesn’t mean I have been working on it for one year – far from it. I honestly have no way of telling how many weekend and evening hours went into it. (It was a lot, though.)

Which is why I am a little excited to see that the plugin is finally so close to release!

What’s the holdup?

If you’ve followed this blog a little, you might remember that I already released a game with the plugin.

It might seem counter-intuitive, but having released a game with it doesn’t mean the plugin itself is ready for release. It’s one thing for me to use my own plugin, and another to have the plugin usable by others. It needs rigorous testing, documentation, a nice UI, demo projects and a support forum. And then everything needs to be bubble wrapped and uploaded to the Unity Asset Store.

When?

I hate disappointing people. So I won’t give an official release date. I couldn’t anyway, even if I tried.

Remember the old saying that the last 5% of a software project takes 95% of the time? And I still need to do actual (I mean paid) work, too. It is impossible to predict anything with any reasonable certainty.

WHEN???

Ok, so here's the gist.

The documentation just isn't done. There's not actually all that much left to do on the code before the initial release. Oh yes, I have a long list of things on my wish list (from localization to Braille support) – but all of those are merely nice-to-have features that can wait until after release. The documentation cannot.

Documentation, including at least one tutorial video, needs to be there from the start and it needs to be good.

I’m also in the process of finding a few developers that will beta test the plugin (and documentation) with their own projects. I don’t know how long that will take, either.

Sooon…ish?

So that's where things are at, and that's why I can't give a release date prediction. If you're a developer, you probably understand this anyway, but I just felt like I should explain why I am so evasive when asked for a date.