Unity Accessibility Plugin – Update 4 – Popup Windows

Thanks to the power of antibiotics, I'm back on my feet and back working on the Unity Accessibility Plugin. This week, I implemented support for popup dialogs.

If you want to start reading about the plugin from the beginning, you can find the first post here.

Popup Menus and Overlays

Not all user interfaces in an app are full screen menus with one clear list of items. There are also partial screen overlays and popup menus. Just think of a level end screen that grays out the game screen in the background and overlays itself in the front. In an app for sighted users this is easy to achieve: simply put a full screen panel behind the popup and voilà – the buttons and elements below it cannot be pressed.

A notification window asking the user for confirmation. There are other buttons grayed out in the back behind the confirmation window.

An example of a popup confirmation window.

Automatic vs. Manual Setup

However – the accessibility plugin can’t know that a button cannot be pressed because something is lying on top of it. Yes, I know, I could do raycasts and check whether something is blocking it. But that wouldn’t be a very clean solution. I would have to run those checks every frame on every UI component – and even then it might be tricky in special cases (which every project always has). It would also contradict the ideally very small footprint the plugin should have.

When a popup menu or dialog window opens, the expected behavior is very clear: input focus should shift to that window, and that window alone.

The simplest solution I found was to simply add a little checkbox to my UI Container component: [ ] IsPopup.
This tells the plugin that the other UI elements on the screen are no longer accessible for as long as this popup is open. The plugin pushes all those elements onto a stack and restricts focus to the newly opened dialog, then restores those elements once the popup closes. All of that happens automatically – the checkbox is the only manual action needed from the developer. That seems like a fair trade-off between performance, clean functionality and required manual setup.
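The stack behavior described above can be sketched in a few lines. All names here are assumptions for illustration, not the plugin's actual API:

```csharp
// Minimal sketch of the popup focus stack (assumed names, not the real plugin code).
using System.Collections.Generic;

public class UIContainer
{
    public bool IsPopup;   // the IsPopup checkbox from the inspector
}

public static class FocusManager
{
    // Each entry holds the containers that were focusable before a popup opened.
    static readonly Stack<List<UIContainer>> suspended = new Stack<List<UIContainer>>();
    static List<UIContainer> active = new List<UIContainer>();

    public static void OnContainerOpened(UIContainer container)
    {
        if (container.IsPopup)
        {
            // Park everything currently on screen and restrict focus to the popup.
            suspended.Push(active);
            active = new List<UIContainer>();
        }
        active.Add(container);
    }

    public static void OnContainerClosed(UIContainer container)
    {
        active.Remove(container);
        if (container.IsPopup && suspended.Count > 0)
            active = suspended.Pop();   // restore the previous elements automatically
    }
}
```

Because closed popups pop the stack in reverse order, nested popups (a confirmation on top of a dialog) fall out of this design for free.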


Continue Reading with the next part here

Sick Again

Almost one year ago to the day, everyone here in the office and at home was sick – except for Sascha. I blogged about it back then (see here). Back then I swore to get a flu shot like him this time.

Woman sitting in bed, looking ill and unhappy. She holds an ice bag to her head.

Well, I did, even at the same place as him. And now I am sick – again. And guess what? Sascha is healthy as a horse – again. How does he do it?

I know a flu shot doesn’t protect you from the common cold, but when you’re feeling as horrible as I am right now, you’re willing to try out everything.

Muscle Memory and our Office Alarm System

Do you sometimes notice yourself doing things without thinking about it, and without actively making the decision to do them?

One example that gets me a lot is locking the door after I leave the house. And then five minutes later I wonder whether I locked the door, can’t remember, frantically turn around and run back to check – only to find the door locked – of course. Because I always lock it, and it’s become so practiced a routine that I don’t even spend enough conscious effort on it to remember it. At least that’s how I explain it to myself.

A woman is walking to work and her steps become insecure as she wonders whether she locked the door, turned off the stove or left the lights on at home.

Is it Muscle Memory?

Another thing I noticed is that when I step in front of my piano, I can still play some of the tunes I learned when I was 16, even though I haven’t played since. My hands just play and hit all the right keys – until the moment I look down at them. That’s when I mess up. But it’s still kinda cool.

Where this really gets me in trouble is our office alarm system. I am so used to turning it on when I leave at night that I don’t even think about punching in the code anymore. But on the rare occasion that I am the first one in in the morning, my fingers won’t cooperate. They punch the keys to arm the system, not to disarm it. It wouldn’t be so bad – the system allows me a second chance – but I don’t even notice that it is wrong! A few seconds later, when the countdown is over, a blaring siren goes off. I know, because this happens to me about every third time. To really rub it in, our security system sends out alert text messages and emails to every cellphone on its notification roster, making sure my slip-up doesn’t go unnoticed by anyone…

I’ve switched now to using the app instead. It has just one big round button that will either arm or disarm the system, depending on what state it is in. Almost foolproof.


Unity Accessibility Plugin – Update 3 – You’re Using It Wrong!

Time for an update on the progress of the Unity Plugin to make the Unity UI (uGUI) accessible for blind and sight-impaired users. This is part four in this dev blog. You can read the first three parts here, here and here.

You’re using it wrong!

I am a big fan of writing software that requires only a minimal amount of manual setup. Do you know Doctor House’s motto “Everybody lies”? Well, as coders we live by a similar motto: “Everybody uses our software wrong.”

An alien is telling a human to stand back and not touch any of the controls in his space ship.

This might be a horrible generalization, but it’s always best to assume that your users don’t read the documentation, don’t follow tutorials and don’t ask for help unless they have to. Instead, they wildly click on all kinds of buttons and see what blows up. Therefore, good software works out of the box and doesn’t require the average user to set anything up.

I couldn’t agree more, which is why I am trying to make this plugin require no more than a minimal amount of setup. Just add the accessibility component to all relevant UI elements and you’re close to being done. I am actually considering a semi-automatic markup feature that goes through the scene and adds the component automatically to every UI element it finds.


Continue Reading with the next part here

Unity Accessibility Plugin – Update 2 – First Version Up and Running

Recently I began writing a plugin for Unity that would make the user interface of apps (using uGUI) accessible for blind and sight-impaired users. If you missed the first two parts of this dev blog, read them here and here.

Unity Logo wearing a blindfold, hands reaching out carefully to feel around.

Label, Value, Type and Hint

For the most part, I followed the iOS VoiceOver design for my plugin, because it makes the most sense to me. Each UI element has a label, a value, a type, and a hint. All of these will be read out when a UI element is selected, unless the user makes another input. (Note: There are a few optional additional properties, such as traits, language and frame, but those are kind of out of scope for this plugin.)

This allows users to scroll quickly through the UI elements, getting the most relevant information (the label) first. If they need more information, they can stop and listen to the rest.
Here are two examples of how that works:

Example 1:
The selected UI element is a toggle for the ingame tutorials. The system would read out “Tutorials”, then after a brief moment it would add the value “Enabled”. After another moment it would read “Switch” to explain the object type. After a longer pause it would then add “Double tap to change.”

Example 2:
The selected UI element is the PLAY button. It has no value, so this part is skipped. The system reads out “Play” then waits for a brief moment and continues “Button”. After a longer pause, it reads “Double Tap to Select”.
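The read-out order from the two examples above maps naturally onto a Unity coroutine. This is a sketch with assumed names and placeholder pause lengths, not the plugin's actual code:

```csharp
// Sketch of the label/value/type/hint read-out sequence (assumed names).
using System.Collections;
using UnityEngine;

public class AccessibleElement : MonoBehaviour
{
    public string label;   // e.g. "Tutorials" or "Play"
    public string value;   // e.g. "Enabled" – may be empty, in which case it is skipped
    public string type;    // e.g. "Switch" or "Button"
    public string hint;    // e.g. "Double tap to change."

    // Any new user input would stop this coroutine and cut the speech short.
    public IEnumerator ReadOut()
    {
        Speak(label);
        if (!string.IsNullOrEmpty(value))
        {
            yield return new WaitForSeconds(0.5f);   // brief pause
            Speak(value);
        }
        yield return new WaitForSeconds(0.5f);       // brief pause
        Speak(type);
        yield return new WaitForSeconds(1.5f);       // longer pause
        Speak(hint);
    }

    void Speak(string text) { /* hand the string to the platform's TTS */ }
}
```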

First Running Version

A first version of the plugin with some very basic functionality is implemented. The UI can be marked up with accessibility components and the plugin – if enabled – will automatically navigate through them, read them out and allow the user to interact. The plugin currently supports Text Labels, Buttons, Toggles and Sliders.

Screenshot from Unity showing a Menu screen with a number of buttons and sliders.

The plugin in action.

Everything is currently made to work with the new Unity UI (uGUI). I have a task on my list to look into NGUI as well, but it is pretty far down the list. Once it is working, I am hoping it won’t be too hard to adapt it for the different UI systems Unity has to offer. My feature wishlist in JIRA has been growing steadily since I started.

The plugin also includes yet another feature from VoiceOver: iOS puts a frame around the currently selected UI element. I know that this is completely useless for someone who is blind. But as a sighted developer, I find it incredibly helpful. So I added that in as well.


Because I develop on a Windows PC, and because I am lazy and don’t want to build and run on my Android phone for every little test, I built a quick library to access SAPI, the Text-to-Speech system that comes with every Windows installation. This allows me to test directly in the Unity editor and still have my screen read out to me. As a side effect, since I already implemented the keyboard controls, the plugin now supports Windows as a platform.
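I haven't shown the library's internals here, but for anyone curious what driving SAPI from managed code looks like, .NET ships a wrapper in the System.Speech assembly. This standalone console example is not the plugin's actual code, just the same idea in its simplest form:

```csharp
// Minimal SAPI text-to-speech example via .NET's System.Speech wrapper.
// Requires a reference to the System.Speech assembly (Windows only).
using System.Speech.Synthesis;

class Program
{
    static void Main()
    {
        using (var synth = new SpeechSynthesizer())
        {
            synth.SetOutputToDefaultAudioDevice();
            // Read out a UI element the same way the plugin would.
            synth.Speak("Play. Button. Double tap to select.");
        }
    }
}
```

Inside the Unity editor a native wrapper is needed instead, since Unity's Mono runtime doesn't ship System.Speech – which is presumably why a custom library was necessary in the first place.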

Further Reading

I found a wonderful article by Matt Gemmell about common misconceptions sighted people have about making their apps accessible. It’s well-written and easy to read, and definitely worth it if the topic interests you even a little bit or if you are considering making your own app accessible. Here is the link.

Continue Reading with the next part here

Unity Accessibility Plugin – Update 1 – TalkBack, VoiceOver and Unity

As I mentioned in my last post, I’ve started developing a plugin for Unity that will make the UI (uGUI) accessible for blind and sight-impaired users. I’ve made some progress and done some more research, and it is definitely time for an update.

Google TalkBack and iOS VoiceOver

I really like Google as a company. As in, I would love to work there and I have Google stock in my investment portfolio, even though I usually stay away from technology companies. Unfortunately, in terms of accessibility on Android, they have some catching up to do.

A woman wearing a blindfold is trying to use an iPad.

Not as impossible as it looks.

The integrated screen-reader that comes with Android is called TalkBack. It kind of works, but not as smoothly as it could. I put my phone in accessibility mode with a blacked out screen for a few days. Then I had to give up – most of the apps are not accessible enough to be usable, at least not for a novice. I constantly got stuck somewhere. I am talking about Chrome, News, Music and Email, nothing fancy. More importantly, those are the in-house apps from Google, so I expected a smooth ride. After some research online (yes, googling, pun intended) I found that others share this experience. Blind users usually install additional software, such as different Email clients and browsers, that is more accessible. But from what I learned, the majority of users seem to choose an Apple device instead.

A hand is holding up an iPhone and pointing the Home button towards the viewer.

You can see the advantage of having an actual button. Get it? You can see it?

The iOS counterpart to TalkBack is called VoiceOver. It does more or less the same thing in terms of screen-reading, but I felt like the navigation gestures were a little more intuitive and I felt more in control. I still got stuck a few times using my iPad this way, but not as often as on Android. Another plus is the physical Home button that all iOS devices have – a quick, impossible-to-miss safety net when you get stuck. The advantage of having a palpable physical button when you can’t see the screen needs no explanation.

From a developer’s standpoint

I suppose both systems do their job – but as a Unity developer, TalkBack is a lot more difficult to deal with. Let me explain why.

Almost all of VoiceOver’s special gestures use multi-touch – the two-finger double tap or the two-finger swipe up, for example. This makes a lot of sense: there must be plenty of games and other apps that want to use single-finger swipes themselves, and a screen reader shouldn’t interfere with all of those apps.

TalkBack, however, uses only single-finger gestures. I suppose not all Android devices support multi-touch, so they ended up with the lowest common denominator. As a result, when TalkBack is enabled, it blocks all single-finger input. I am not sure how native apps deal with this, but for Unity this is detrimental. See – TalkBack doesn’t work with Unity itself, but it still blocks all the input. I cannot try to catch a swipe gesture if I never receive the finger-down event in the first place.

VoiceOver does much the same thing with multi-finger input – but with a major difference: it has a setting that turns this input blocking off, giving full control to the app, and that setting is enabled for Unity apps by default. So while the Unity app is active and in the foreground, VoiceOver sits quietly in the background. Unlike its iOS counterpart, TalkBack has no such setting as far as I could find.

This isn’t just bad news for my plugin. The Unity game I had already prototyped when I began this quest uses single-finger gestures as the main gameplay component. This is beyond frustrating and I haven’t found a way around it yet. My best solution for the time being is to detect when Talkback is running, and ask users to suspend it while they’re inside this app.
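Detecting whether TalkBack (or another screen reader) is running can be done by asking Android's AccessibilityManager. The Android class and method names below are real APIs; the surrounding structure is my own sketch, not the plugin's actual code:

```csharp
// Sketch: query Android's AccessibilityManager from Unity to detect
// whether touch exploration (the mode TalkBack uses) is active.
using UnityEngine;

public static class ScreenReaderDetector
{
    public static bool IsTouchExplorationActive()
    {
#if UNITY_ANDROID && !UNITY_EDITOR
        using (var unityPlayer = new AndroidJavaClass("com.unity3d.player.UnityPlayer"))
        using (var activity = unityPlayer.GetStatic<AndroidJavaObject>("currentActivity"))
        // "accessibility" is the value of Context.ACCESSIBILITY_SERVICE.
        using (var manager = activity.Call<AndroidJavaObject>("getSystemService", "accessibility"))
        {
            // Touch exploration is what swallows the single-finger input.
            return manager.Call<bool>("isTouchExplorationEnabled");
        }
#else
        return false;
#endif
    }
}
```

With a check like this, the app can at least show (and speak) a prompt asking the user to suspend TalkBack before continuing.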

And just to make matters worse, TalkBack lets users configure their own gestures for certain actions. I have not found a way to query these custom gestures, so I can only listen for those supported in the default settings.

Continue Reading with the next part here.


Unity Plugin for UI Accessibility

Updates on the game have been scarce of late, I know. I’ve been looking into making my mobile games accessible and as a result felt forced to take on a new side project.

Accessible Mobile Games

Not everybody realizes that blind people are avid smartphone and tablet users too, since both Android and iOS have integrated screen readers. These help users navigate around the screen, read out buttons and text, and prevent accidental clicks. There are also a few special gestures that can be drawn on screen to trigger the Back or the Home button.

Blind Man Walking with a stick

The problem is that I am using Unity for my games, which doesn’t produce apps that work with these screen-readers. This is perfectly alright – it’s just in the nature of the design on which Unity is built: Unity draws its UI itself instead of using the native UI widgets, so the screen readers have nothing to hook into.

No Unity Solution – yet

I searched up and down the Internet for a solution, but all I could find was other people facing the same problem. They either gave up on making their apps accessible, or had to re-implement their menus, and record additional audio or use Text-To-Speech engines. If you want to read a great story from one such developer, let me point you to this Gamasutra Article.

Woman standing excitedly in front of the Unity Asset Store.

My usual asset store experience.

My hope was to find a plugin to do some of the heavy lifting for me, but for the first time, the mighty Asset Store, my trusted friend, who usually has a solution for every problem known to man, came up empty. There are a number of Text-To-Speech engines, but nothing that automatically handles the UI navigation, reads out the elements, makes sure buttons can be triggered, sliders can be moved and toggles can be flipped – let alone listens for any magic gestures.

So I decided to write such a plugin myself.
I have no clue whether the world at large has any need for this (I guess not, or someone would have done it already), but I do, so let’s get started.

Feature Wishlist

In short, what I would like is a way to mark up my UI (uGUI) with accessibility tags. The plugin then has to automatically let users navigate the menus using gestures, interact with the UI, and have a text-to-speech system read out what’s on screen.

There will obviously be some manual setup involved, but a good plugin will make things work with minimal effort. Everything should work on Android and iOS, including Text-to-Speech.

First Steps – UI Navigation

Unfortunately, there seems to be no gold standard for UI navigation for the blind. Android and iOS each have their own way of navigating through the screen. Sure, both use swipes, but that is almost where the similarities end. On Android, you swipe down to jump to the next element on the screen. On iOS devices you swipe right, and swiping down jumps to the next UI container instead. On Android you jump to the top of the screen by swiping up and then immediately down; on iOS you read from the top of the screen with a two-finger upward swipe. The list goes on forever.

I can only assume that the respective users would like to keep using the gestures they are familiar with – so a good plugin will have to support the appropriate navigation depending on the platform. Heck, while I am at it, I might as well throw in listening to the keyboard and make everything work on PC as well.
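A per-platform gesture table is probably the simplest way to handle this. The types and mappings below are a sketch based on the gestures described above, not a finished design:

```csharp
// Sketch of platform-dependent gesture mapping (all names are assumptions).
public enum Dir { Up, Down, Left, Right }
public enum NavAction { NextElement, NextContainer, ReadFromTop }

public struct Gesture
{
    public int fingers;
    public Dir direction;
}

public static class GestureMap
{
    // Returns null when the gesture has no navigation meaning on this platform.
    public static NavAction? Resolve(Gesture g)
    {
#if UNITY_ANDROID
        // TalkBack-style: single-finger swipe down advances to the next element.
        if (g.fingers == 1 && g.direction == Dir.Down) return NavAction.NextElement;
#elif UNITY_IOS
        // VoiceOver-style: swipe right advances, swipe down jumps containers,
        // two-finger swipe up reads from the top of the screen.
        if (g.fingers == 1 && g.direction == Dir.Right) return NavAction.NextElement;
        if (g.fingers == 1 && g.direction == Dir.Down)  return NavAction.NextContainer;
        if (g.fingers == 2 && g.direction == Dir.Up)    return NavAction.ReadFromTop;
#endif
        return null;
    }
}
```

Keyboard input on PC would then just be one more table feeding the same NavAction values.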

Continue Reading with Part 2 here.

Dev Update #14 – Concept Graphics and Aspect Ratios

Have your base (classes) covered!

In spite of what I wrote in my previous blog post, I realized I need to create one more department before I can finally tackle the Build Menu: the Management department. Like the Design department, this one works differently from the production floors. This matters from the coding point of view, as I want a base class to take care of all floors and then handle only specialized stuff in derived classes. And like I said, I want all my bases covered.

Here is the mockup graphic for the Management floor:

The idea was that a Management floor can employ up to four development directors. Each director would allow for one game to be in production. This way the player can work on multiple games at the same time. But I made a pretty noobish mistake when creating the above mockup – I hadn’t given any thought to the aspect ratio of my floors.

Aspect Ratio, anyone?

When creating the actual game graphics for this floor based on this mockup, I ran into some trouble. I hadn’t really thought about aspect ratios and screen width when I created those images. That was a big mistake, as it turns out. The mockup screens were too wide. There was simply no room left on the right side of the screen for the two big purple buttons, not unless I scaled the height down. But then everything else in the floor would become tiny. To counteract that, I would have to scale up the characters. But then there wasn’t enough room above the managers’ heads for the actual game icons. My characters were simply too big.

So I redesigned that department completely. Here’s the result (middle floor):


Instead of placing the icons above the characters’ heads, they are now placed next to them, sorta like a PowerPoint screen in the background. It actually gives the floor a nice manager look and feel. And as a bonus, this frees up some screen space at the top of each floor, where I have been wanting to put a floor number. In another change, the Release and Trash buttons on the right are done away with. Instead, scripts can be released or trashed by clicking on them and opening up their details dialog.

I definitely learned my lesson and will create future mockups in a sensible aspect ratio.

Now, this time for real – next up is going to be the build menu.

Dev Update #13 – Code and Design Departments

It should have come as no surprise to me that spending my free time working on a game directly clashes with writing blog posts in that same free time. And one of these things simply comes more naturally to me than the other. So it’s been five months… Time for an update.

I really want to start working on the build menu, but before I can do that, I need a few floors that I can actually build. So I created basic prefabs for the Code and Design departments.

Here are the results.
The Code department layout (top) represents the basic layout for all the basic production floors. 3D Art, Animation, Sound etc. will all share the same basic setup with three queue slots. The Design department (bottom) works differently and needed a special setup. So it made sense to create these first and cover all my bases.

In my original game design document, the workers would all sit with their backs towards the player. This was mainly due to my poor drawing skills. But for the actual game I found this to be too boring. So instead, I turned them all around, so that they would be facing the camera.

The floors all share the same background graphic, which is actually grayscale. The graphic is tinted in realtime with the color of the floor.
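The tint trick needs almost no code: Unity multiplies a sprite's texture by its color, so a grayscale texture takes on any tint directly. A minimal sketch (component and field names are my own, not the game's actual code):

```csharp
// Sketch: tint a shared grayscale floor sprite with the floor's color at runtime.
using UnityEngine;

public class FloorTint : MonoBehaviour
{
    public Color floorColor = Color.cyan;   // assigned per floor type

    void Start()
    {
        // SpriteRenderer.color is multiplied with the sprite's texture,
        // so a grayscale texture is recolored without any extra materials.
        GetComponent<SpriteRenderer>().color = floorColor;
    }
}
```

One background texture in memory, any number of floor colors.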

Please keep in mind that the floors are not really finished yet. The coder, for example, doesn’t even have a computer in front of her yet. But it’s good enough to work with. Now I can start creating the menu that lets me build floors into my game studio tower.

‘Fastest Growing’ mobile device? That Means Nothing!

I just read an article on Flurry about Phablets being the fastest growing device type. Phablet usage grew a seemingly impressive 148% in one year. (You can read the article here)

But is it really that impressive? Should we all now rush to create apps specifically for phablets?

I like phablets. I am NOT comparing them to cockroaches in this comic.

Fastest Growing means nothing

Growth is measured in percentage, relative to your current size. Which means, the smaller you are, the easier it is for you to have incredibly large growth numbers.

Imagine a bored farmer starting a company to make cellphones. In his first year he makes one for himself. In his second year he sells one to his mother and one to his neighbor. Thus he has grown his sales by 100%. Sounds impressive, right?
(Explanation: He sold one phone in the first year, and two in the second.)

On the other hand: Apple sold roughly 170 million iPhones in 2014 (source here). If it now sold 171 million iPhones the next year – a whopping 1 million more devices! – iPhone sales would grow by only about 0.6%. Simply because the base number was already that high.

Clearly, the farmer has the faster growing product.
But, does that mean anything? Or at least, does it mean anything good? Is a total of three devices something to celebrate?
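The arithmetic behind both examples is the same one-line formula (the numbers are the ones from the text above):

```csharp
// Growth in percent, relative to the starting value.
using System;

class GrowthDemo
{
    static double GrowthPercent(double before, double after)
        => (after - before) / before * 100.0;

    static void Main()
    {
        // Farmer: 1 phone, then 2 phones.
        Console.WriteLine(GrowthPercent(1, 2));                     // 100%
        // Apple: 170 million iPhones, then 171 million.
        Console.WriteLine(GrowthPercent(170_000_000, 171_000_000)); // ~0.59%
    }
}
```

Same formula, wildly different impressions – which is exactly the point: the denominator does all the work.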

Which company would you rather invest your money in?
What cellphone operating system would you spend your time and money developing exclusive content for?

(Of course I wish the best of luck to the farmer in his endeavors!)

Small growth can be more impressive

Growth alone doesn’t mean much. But that changes when you provide surrounding data, ideally in the form of some absolute numbers.

If you grow your company by 10% and had 1000 employees before – congratulations! You just created 100 new jobs!
If you increase the amount of money you give to charity each year by 100%, and you gave $3 before… there’s not much to celebrate.

I am getting tired of reading ‘fastest growing’ everywhere as if it was somehow impressive all by itself. If anything, big growth usually means that whatever is growing so fast is currently really quite small.

It’s the small growth numbers that are often way more interesting. Something that is already big and still manages to grow 5%-10%? Now that is actually impressive.

And if something claims to be the fastest growing industry/company/religion/tech/whatever, but then doesn’t provide any real hard numbers, things start smelling fishy.

Here is a comic by XKCD that pretty much hits it on the spot:

Back to Phablets

Don’t get me wrong – I like Flurry’s insights articles a lot. And the core of the information is perfectly fine and interesting. More and more Phablets are being bought and used. They are probably here to stay and will become a permanent and relevant part of the mobile device market.

Unfortunately, the article doesn’t list any absolute numbers on how many phablets are out there, how often they are used each day, and so on. It just states that their usage is growing faster than that of iPhones over the same period. Which means nothing. Even without knowing the exact numbers, I can tell that there must be many times more iPhones out there than phablets. That makes it hard to judge how important phablets really are already.

From a developer standpoint it probably doesn’t matter much. Phablets are in between phones and tablets in screen size, so they’ll run all apps from these two markets just fine. No need to target phablets specifically.
There’s already apps for that.

Summing it all up, I’m really just ranting.
My point: Beware when something claims to be the ‘fastest growing’ in its field.


I mentioned Apple and iPhones in this article. This was purely for making a point. Neither do I promote buying Apple products, nor do I discourage from it.

Fun fact:
Did you know that Apple actually trademarked the phrase “There’s an app for that”?