I’ve previously written at some length about making your iOS apps accessible to visually impaired users, and the topic continues to be very important to me.
I’ve been enjoying using the App.Net social network lately, and have thus been trying out various new iOS clients for that service - of which there are many. It’s an exciting time, and reminds me very much of my first few months on Twitter.
An exciting time, that is, if your eyes are fully functional, but a frustrating one for those who rely on VoiceOver to read their iPhone or iPad’s interface to them. I’ve found two App.Net clients which illustrate the two extremes of VoiceOver support, and I wanted to graphically (in both senses) show the difference that a little bit of extra development work can make to visually impaired users.
Accessibility Villain: Netbot for iPhone
Tapbots, makers of probably the most beloved iOS (and indeed now also Mac) Twitter client, Tweetbot, have created an App.Net version of that app called Netbot. It supports both iPhone and iPad, just as its Twitter-focused sibling does, and to my continued disappointment has essentially no functional VoiceOver support whatsoever.
Consider the following screenshot, a description of which can be found below for those who can’t see it. It’s a collage of two side-by-side copies of the same screen from Netbot. The first one is untouched, and thus shows the visual interface that most of us are familiar with: the interface for sighted users.
The second instance of the screenshot has been annotated to reflect the interface as it appears to those using VoiceOver exclusively. The difference is stark and dismaying.
Most of Netbot’s interface is entirely opaque to VoiceOver, and the app cannot be used by visually impaired or blind users. The UIAccessibility API would allow Tapbots, the authors of the software, to make Netbot usable via VoiceOver with comparatively little effort, but this work has yet to be done in either Netbot or Tweetbot, on iPhone or iPad.
The annotated screenshot shows that whilst the header and contextual buttons are visible to VoiceOver, they are labelled in a developer-focused and confusingly worded way, because their accessibility titles and descriptions have not been set.
The entire toolbar along the bottom of the screen is not accessible at all, and the primary content area (containing all of the actual messages posted on App.Net) is simply a series of blank areas with no readable content, rendering the app entirely useless unless you have sufficient visual acuity to actually read the screen. It’s worth noting that, like Tweetbot, Netbot does allow increasing the point-size of the displayed text, but this only suits those with some degree of visual ability to begin with.
It’s a poor state of affairs for what is surely the de facto flagship iOS client for the service, from an extremely (and deservedly) well-respected software company.
Accessibility Hero: hAppy for iPhone
We can contrast Netbot’s VoiceOver support with that of hAppy, another App.Net iOS client (which is currently in private beta, at the time of writing on 26th October 2012).
hAppy’s VoiceOver spoken interface is exemplary. The annotated screenshot above shows that the app’s entire interface is exposed logically and concisely via VoiceOver, and is fully usable regardless of visual acuity or sightedness.
(Indeed, in some cases - such as the presumably temporary and ambiguous toolbar icons - the spoken interface is actually more discoverable and usable than the visual one.)
Whilst the developers continue to work on refining the beta towards an initial release, I can comfortably recommend it as an excellent App.Net client for VoiceOver users - I look forward to the 1.0.
Providing VoiceOver support in your app is very simple indeed, even if you’ve never done it before: in essence, you’re just specifying which elements on screen should be readable, and supplying the relevant text for each one. iOS and VoiceOver do the vast bulk of the work for you. Your contribution will be implementing a simple protocol (a couple of methods, in most cases), and you can probably do about 90% of the work in the Interface Builder environment within Xcode. It’ll take less time than you think.
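To make that concrete, here’s a minimal sketch of the kind of work involved, using the real UIAccessibility properties (shown in modern Swift for brevity; the `PostCell` class and its subviews are hypothetical examples, not taken from either app):

```swift
import UIKit

// A hypothetical timeline cell, illustrating the basic UIAccessibility steps.
class PostCell: UITableViewCell {
    let authorLabel = UILabel()
    let postLabel = UILabel()
    let avatarView = UIImageView()

    func configureAccessibility() {
        // Treat the whole cell as one readable element, so VoiceOver
        // speaks the post in a single swipe rather than piecemeal.
        isAccessibilityElement = true

        // accessibilityLabel is the text VoiceOver actually speaks.
        accessibilityLabel = "\(authorLabel.text ?? ""): \(postLabel.text ?? "")"

        // Hide purely decorative subviews from the accessibility tree.
        avatarView.isAccessibilityElement = false
    }
}

// Standalone controls just need a concise, human-readable label and the
// right trait — both also settable in Interface Builder’s inspector.
let replyButton = UIButton(type: .system)
replyButton.accessibilityLabel = "Reply"
replyButton.accessibilityTraits = .button
```

That really is the bulk of it for a typical app: decide which elements VoiceOver should expose, and give each one a sensible label.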
Of course, most people who use your apps are sighted, whether they use corrective lenses or not. Of the minority who have impaired vision, only a small percentage are legally or completely blind. I’m not going to try to make a convincing commercial argument for supporting accessibility; I’m not even sure that I could. But that’s just the money side of things. There’s another aspect.
For those with visual impairments, iOS devices are a lifeline. They’re a bionic enhancement - a pocket full of superpowers. The difference that they make to the life of a blind person is truly profound. They’re tools of independence, and of participation. Blind and VI people want to use your app, even if you can’t imagine how they’d possibly do so.
So, please, consider spending some time on accessibility; an hour or three will very likely be enough. It’s a truly worthwhile thing to do, and will have an enormous positive impact on those who don’t enjoy perfect vision - or any at all.
I offer an accessibility review service, if you’d like to improve your app’s VoiceOver support. I’d be delighted to work with you to assess your app’s accessibility, and offer concrete advice and suggestions on how to improve it (and I’m an iOS developer too, so my advice is specific, attainable, implementable, and suitable for developers).