Matt Gemmell

Accessibility for iPhone and iPad apps


Ensuring that your iPhone or iPad app is accessible (in this case, to visually impaired users) is the right thing to do, and thankfully it’s very easy – in many cases, there’s no work to do at all. This article for iOS developers describes how to implement accessibility support.

Update: I’m now offering Accessibility Review services for iOS apps via my business site, Instinctive Code. You can read about the services I offer here. I’ve also been informed that when Apple sends out their “Make Your App Accessible” email to companies, they link to this article.

If you’re the average sighted person, you probably have no idea what accessibility technologies are (other than being vaguely aware of text-to-speech, screen magnification, and suchlike). You probably implicitly believe that using a computer must be a nightmare if you’re visually impaired, and have a horror of the very idea of it. As a software developer, adding support for accessibility likely seems like a major hassle to benefit a minority of customers. I felt that way too, on all counts, but it’s just not the case.

Here’s the deal: if you’ll agree to just read this one article (do it over lunch or while you’re on the bus), I’ll absolve you of any moral obligation to think about accessibility for your software. If you decide to pursue the subject afterwards, that’s great – it’s completely up to you.

First question has to be: am I, the author of this piece, blind? No, I’m not. Second question is: do I have a vested interest? Absolutely. Time for a quick story (if you want to skip the story, jump down to the next heading).

I’ve needed to wear corrective lenses (either spectacles or contact lenses) since I was 5 years old; I’m very long-sighted, and have astigmatism (my left eye is much worse than my right). All the same, with appropriate specs/lenses I have excellent vision. I see things with perfect clarity, use a computer all day long, drive a car, and so forth.

Almost two years ago, sometime in January 2009, I had my regular optician’s appointment. Unexpectedly, they found a noticeable deterioration in the vision in my left eye, and I was referred to an ophthalmologist for further tests. The outcome was that, whilst there’s nothing active going on at the moment, it’s a strong possibility that I’ll suffer from a condition known as macular degeneration during my adult life. This condition usually affects much older people (and thus is commonly called age-related macular degeneration, or AMD). I’m 31, which is far younger than the average.

If AMD does indeed flare up and progress, it will slowly rob me of my central vision. The world will dissolve into a grey, distorted mush, with ever-lengthening permanent shadows clouding my sight. Eventually, I will be unable to drive a car, read a book, or see my partner’s face, making me legally blind. I will likely retain some peripheral vision as a ring of brightness around the void, but will be unable to ever again see anything in detail or directly. When AMD affects one eye, it’s very likely to affect the other one too. A distinctly gloomy prospect.

But, back to today. It’s the middle of December 2010, and my vision is OK (as much as it ever was; it’s 20/20 when I wear my spectacles or contact lenses). I can turn my head ninety degrees to the left and look out my office window, and see clearly that the dog 200m away is a boxer, and that the car parked down the hill and across the river is a silver Vauxhall Astra. I can walk to the other end of the room and read BBC News stories from my monitor comfortably (without increasing the default text size).

Every crease and line on my hands is all too visible, and grey hairs in my beard do not go unnoticed in the bathroom mirror. I play my PS3 and Wii all the time, as well as a fair bit of World of Warcraft. At my last optician’s checkup, my prescription hadn’t even changed. I check my vision at home weekly with an Amsler Grid. I’m fine – for the moment.

Some people are worse off. There are many who already suffer from visual impairment or even complete blindness, including some from birth. When I was diagnosed, I was shocked – it really threw me off balance. I researched the condition, and realised what the future might hold. It wasn’t a pleasant time. I envisioned the loss of my career and my independence, and imagined not being able to see Lauren’s face as I grew old.

After a couple of days, when I had calmed down sufficiently, my instinct as a scientist to fully understand the situation once again kicked in, and I decided to spend some time learning about VoiceOver, the accessibility technology built into Mac OS X and iOS (on iPads, iPhones and iPods).

When you first enable VoiceOver on a Mac, you’re asked if you’d like to take a brief tutorial; I did. After the first couple of minutes, I closed my eyes, and really used it. I wept.

These technologies are a lifeline for visually impaired users. Accessibility isn’t about providing simplified or alternative content, it’s about ensuring everyone has equal access to your existing content. VoiceOver (and similar technologies on other platforms) allow blind or partially-sighted users to fully experience your apps, and, by extension, to communicate and be productive and express themselves. In this day and age, visual impairment need not hinder use of a computer or the internet.

Myths about visually impaired users

Let’s start by listing a few popular myths about visually impaired users, and contrasting them with the reality.

  1. Myth: visually impaired users are blind.

    It’s strange how common this belief is; we just implicitly seem to assume the most extreme case. In reality, of course, there’s an entire spectrum of visual impairment, from the lowest prescription of corrective lenses right through to complete blindness. It’s worth realising that the needs of partially-sighted users will differ from those of blind users, and so on.

  2. Myth: visually impaired users access things sequentially.

    How laborious life would be if this were true. In reality, blind and partially-sighted users make every bit as much use of their memory of spatial location as sighted users do, because it accelerates target acquisition. When you decide to create a new email message, you don’t scan the whole screen to find the ‘New Message’ button; your hand pushes the mouse in that direction automatically.

    Similarly, visually impaired users will quickly learn the location of interface elements, and can access them directly. Don’t take steps to artificially linearise your content for such users, and equally don’t assume that you can arbitrarily move elements around either.

  3. Myth: visually impaired users listen to all on-screen text.

    You quickly learn this isn’t true the first time you either see someone using accessibility technologies, or use them yourself. Accessibility users listen to just enough to orient themselves, and make a decision, and then skip to the next element. Putting verbose descriptions or help as accessibility information is fruitless, and will only frustrate.

    Your users aren’t reading a book, they’re trying to use your app just the same as a fully sighted person would. I’ll revisit this point with some specific tips later.

So, visual impairment is an entire spectrum of acuity, and indeed a range of different types of conditions – perhaps affecting focus, clarity, colour-perception, peripheral or central vision, movement perception, and more. Accordingly, there are a wide range of different ways that software and hardware can assist visually impaired users.

Types of accessibility support

Accessibility aids for computing devices can be broadly split into three categories, as follows.

  1. Basic accessibility support, which will be present in any modern operating system without additional software. This includes features like text zoom, screen zoom, cursor magnification and highlighting, on-screen keyboard support, text-to-speech for alerts and other critical system events, spatial and/or monaural audio, and global white-on-black and/or grayscale display modes. Mac OS X, Windows and most (all?) of the various contemporary Linux distributions support the majority of these features by default.

  2. Advanced accessibility support. This tends to focus on ease of on-screen navigation and customised content presentation (often with particular emphasis on web browsing). On Mac OS X and iOS, these functions are provided by VoiceOver, which is a part of the operating system and thus available at no extra charge and without requiring any additional installation. On Windows, a variety of screen-reading and other accessibility software is available, the most prominent example being JAWS, which is commercial software costing many hundreds of dollars. On Linux, there are several free screen-readers available, which often need to be explicitly installed first (a relatively trivial process on most contemporary desktop Linux distributions).

  3. Assistive hardware. This includes devices such as refreshable braille displays, and a wide range of other input and output peripherals.

In this article, we’re focusing on iOS, thus we already have VoiceOver built-in, and so our goal is to facilitate the kind of advanced accessibility support described in point 2 above. It’s my opinion that in this area, iOS is a uniquely suitable platform.

iOS devices are ideal for visually impaired users

iOS devices are an excellent choice for visually impaired users. They’re portable, they’re fully self-contained and are designed to be used without external peripherals, and the iOS user experience is very functionally concise and goal-oriented. iOS will not present a proliferation of windows, toolbars and minuscule user interface elements. For the most part, they’re comprehensible, focused, and as intuitive as possible. These are all ideal qualities for accessible software, and indeed for software in general.

The question then becomes how we, as developers, can ensure that our applications are as accessible as possible, and that VoiceOver can provide the best assistance. The good news is that it’s incredibly easy to add accessibility support to your application (there’s no bad news, incidentally). The reality of adding accessibility support to your app is that:

  • About 80% of your app is probably accessible already, via the built-in VoiceOver support in UIKit.

  • You can probably boost that to around 95% simply by spending a few minutes in Interface Builder, without writing a single line of code.

  • And you can very likely reach 100% accessibility support by implementing some incredibly trivial methods, which will also take you just a few minutes.

Adding robust accessibility support is exceptionally easy. You don’t need to learn much, and it won’t take much time at all. You can very likely begin and complete your app’s accessibility support the same day. Before we get into the details, I want to briefly describe how VoiceOver actually works, in case you’ve never used it.

How VoiceOver works

In a nutshell, VoiceOver allows the user to navigate between, and interact with, accessible elements in an application. An accessible element might be a control, static text, an image or any other kind of content. It’s up to the app (with a lot of help from UIKit) to make suitable items available as accessible elements. Additionally, VoiceOver provides audio cues when the user moves between interface elements, or when the screen layout alters in some way.

The decision about what constitutes an accessible element is up to you, but generally you should aim to make your app available for use in the same way to a completely blind user as it is to a sighted user. This doesn’t always mean what you think it does. Let’s clarify:

  1. Make your app’s entire functionality available even to completely blind users; and

  2. Allow blind users to access similar expedients to sighted users.

That second point bears some discussion. If you’re a sighted user, close your eyes and try to imagine using your computer. You’ll probably experience a mild state of panic. You have no points of reference. No way to read ahead. No way to perform a visual parallel search of information. No way to skim-read to quickly determine what information is of interest or relevance. You also lack context. You’re suddenly placed in the position of having to rely intensely on memory rather than inspection.

Your feelings are natural, and some – not all – of them represent the challenges that visually impaired users face. But let’s turn down the anxiety a notch; it’s not actually as bad as all that.

Visually impaired or blind users can make use of the dozens of visual optimisations you enjoy when processing information; they just can’t necessarily use their eyes to do so. Blind users can still skim, and rely on muscle memory, and get the sense of the structure of a window or screen without exhaustively exploring it. We just need to make sure that our apps don’t stand in the way.

An excellent example is web browsing. Eye-tracking experiments show that people don’t read web pages from left to right, top to bottom (or whatever the cultural and linguistic norm is for the specific user). Instead, we skim. We pick out what looks important in terms of describing the structure and core topics of a page, and sample portions of those areas to make a decision about whether to continue reading, or jump elsewhere. We filter out extraneous information and learn to rapidly home in on what is likely to be of relevance to us.

With VoiceOver in Safari on iOS, visually impaired users can do that too. VoiceOver has a concept called the rotor, which you can think of as a mode-toggle. With VoiceOver enabled, you adjust the rotor by literally making a two-finger rotation gesture on screen. In Safari, the rotor changes which type of element on the page VoiceOver navigates between. In “Links” mode, VoiceOver will move between links. In “Headings” mode, it will move between headings on the page. There are modes for images, form elements, visited or unvisited links, and more. It allows users to perform a sort of parallel search without even looking (or scrolling). The rotor exists system-wide too, for example toggling between reading out characters or words when entering text, and so forth.

The rotor is an example of how robust accessibility support entails allowing differently-abled users to not only access the same functionality and content, but (crucially) also to benefit from similar navigational optimisations as unimpaired users. It’s a very important point to grasp.

Accessibility isn’t about providing a cut-down experience, but neither is it about making your app’s sighted experience available as-is. Always consider the implicit capabilities of the unimpaired user, and take steps to even the playing field. That’s what accessibility means. Not special treatment, but tailored access to the same treatment. Put another way, don’t just give special treatment to fully-sighted users.

Without further preamble, it’s time to make your app accessible. Let’s take a look at Interface Builder first.

Adding accessibility support via Interface Builder

As mentioned previously, UIKit controls are accessible by default. Buttons report that they are buttons, and switches know when they’ve been pressed. Text in labels and fields is read aloud automatically, and the user hears a cue when you push another view controller on screen. There are hundreds of other examples, but suffice to say that most of the hard work is already done. You just have to fill in a few details.

Accessible elements have three primary properties which are of interest for accessibility: an accessibility label, an accessibility hint, and one or more accessibility traits. Note that the accessibility label is independent of any other label the element might have (such as a button’s label), but that if you don’t specify an accessibility label and your control does have a regular label, VoiceOver will of course use it.

There are three further accessibility properties (the accessibility frame, which specifies the element’s location and size on screen; the accessibility value, which gives the element’s current value as a string; and the accessibility language, which specifies the language in which to speak the item’s accessibility information, if that language should differ from the user’s chosen default), but you’ll only need to set them if your control isn’t a subclass of a standard UIKit control or view. These additional properties are set programmatically – a topic we’ll touch on later.
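To make that concrete, here’s roughly what overriding those programmatic properties looks like. This is a minimal sketch under stated assumptions: MyProgressView and its progress variable are hypothetical, and a real on-screen UIView already inherits a sensible accessibility frame, so the frame rarely needs overriding.

    // A hypothetical custom progress view, not based on a standard UIKit control.
    @interface MyProgressView : UIView {
        float progress;  // 0.0 to 1.0, updated by the app (not shown)
    }
    @end

    @implementation MyProgressView

    - (BOOL)isAccessibilityElement {
        return YES;  // opt this plain UIView subclass into accessibility
    }

    - (NSString *)accessibilityValue {
        // The element's current value, expressed as a string for VoiceOver to speak.
        return [NSString stringWithFormat:@"%d percent", (int)(progress * 100.0f)];
    }

    // accessibilityFrame and accessibilityLanguage can be overridden in the same
    // way, but their inherited defaults (the view's on-screen frame, and the
    // user's own language) are usually correct already.

    @end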

You can set accessibility labels, hints and traits in the Identity inspector in Interface Builder. There’s a screenshot of it below. If you cannot see the screenshot for any reason, note that all options shown in it are fully described afterwards.

Screenshot of accessibility options in Interface Builder on Mac OS X

Let’s talk about what each of these attributes means, and how VoiceOver uses them.

  • Accessibility Enabled

    This is a master switch for whether the element is accessible or not. Plain UIViews, and any custom direct subclasses of UIView, are not accessible by default, whereas UIControls are. Elements which are not marked as accessible will be ignored by VoiceOver, and will be skipped when the user is navigating between accessible elements.

  • Accessibility Label

    This is a very brief textual label, title or name for the item, analogous to the title or label of a button or other control. It should never include information about what type of control the item is; such information belongs in the accessibility traits instead.

    Whilst VoiceOver will fall back on the control’s regular label if this is not specified, this attribute is extremely important for items that either have no label (such as buttons with icons or symbols instead of text), or those with a generic label whose meaning is implied by context which may not be apparent to visually impaired users, such as proximity to another element. A common example would be a “New” or “+” button.

    A good accessibility label will be one to three words in length (preferably one word), and will concisely describe the element, ideally in terms of its purpose. Examples are “Play”, “Favorites”, or “New event”. Remember that labels should be as short as possible, but that you must also provide context which may not be obvious to a visually impaired user – for example, it might not be immediately obvious that your “New” button is adjacent to your list of events, so the accessibility label “New event” is superior to simply “New”.

    In all cases, accessibility labels should start with a capital letter, and should not end in a period. This will allow VoiceOver to read it aloud with appropriate vocal inflection. Labels should be localised.

  • Accessibility Hint

    This is a brief description of what the element does. It should never include information which is already provided by the item’s accessibility traits (such as the type of control, whether it is selected or not, and such). An accessibility hint can be thought of as an expanded, clarified version of the accessibility label. The hint will be spoken after a brief pause, once the label and traits have been spoken; this allows users to skip to another element quickly, or to wait a moment and hear the full hint.

    Whilst accessibility labels should be provided for every control, accessibility hints are comparatively rarely needed. Only provide a hint if your label isn’t sufficiently explanatory in all situations. You should provide both a label and a hint instead of making your label longer; accessibility labels must always be as short as possible.

    A good accessibility hint should describe the outcome of interacting with the control. It should not describe the actual means of interaction (like tapping or swiping), because there are multiple ways for a VoiceOver user to interact with your control, and you can’t predict which they’ll use. Hints should begin with a third-person singular verb, if at all possible; for example, “Deletes the event”. Do not use the imperative form (such as “Delete the event”), because this will sound like a command or instruction to the user.

    Hints should be brief, without sacrificing grammar – don’t omit words like “a” or “the” just to save space. Hints should always begin with a capital letter and end with a period (unlike labels, which should not end with a period), and they should never include the name or type of the control. For example, you would not use a hint like “Button that saves the document”, or “Save will save your document”. Hints should be localised. (A brief code sketch after this list shows a label and hint being set programmatically.)

  • Accessibility Traits

    These are characteristics of the item’s nature and behaviour which are relevant to accessibility.

    You should use whatever combination of traits is necessary to accurately describe your control or view. The meanings of each trait and how they affect VoiceOver are described below.
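Incidentally, everything the Identity inspector offers can also be set in code, which is useful for controls you create programmatically. A minimal sketch, with illustrative strings, following the label and hint conventions above (a short capitalised label with no period; a hint which is a full sentence ending in one):

    // Setting the same attributes in code; the button and strings are illustrative.
    UIButton *newEventButton = [UIButton buttonWithType:UIButtonTypeRoundedRect];

    // UIControls are accessible by default, so this line is redundant for a
    // UIButton, but it's how you would enable accessibility on a plain view:
    newEventButton.isAccessibilityElement = YES;

    newEventButton.accessibilityLabel = NSLocalizedString(@"New event", nil);
    newEventButton.accessibilityHint = NSLocalizedString(@"Creates a new event in the calendar.", nil);

    // No traits to set here: UIButton already supplies the Button trait itself.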

Accessibility Traits

Customise a control’s traits only if you’re certain it’s appropriate. Use your judgement. Remember that traits are both semantic and behavioural; if you feel that a trait is appropriate and meaningful, by all means use it. The deciding factor should always be: will this add useful meaning to how the element is described to the user?

The available traits can be combined freely, but are logically split into two groups: control type traits, and behavioural traits. You should generally consider the control type traits to be mutually exclusive – use only one of them, unless you have a compelling reason otherwise. You can then use one or more of the behavioural traits to further describe the element.

The control type traits are:

  • Button: This trait indicates that the element is a button, and/or behaves substantially like a button.

  • Link: This trait indicates that the element is a link, or behaves like one. If interacting with the element will cause a URL to be launched, you should prefer the Link trait over the Button trait, since it’s more specific and thus more appropriate.

  • Search Field: This trait indicates that the element is a search field. There is an expectation that the user will be able to enter or choose a query of some kind, and will then be able to initiate a search which produces appropriate results. This trait may seem unusually specific and arbitrary, but search functionality is exceptionally important for visually impaired users, thus searching is considered a first-class trait for accessibility purposes.

  • Keyboard Key: This trait indicates that the element is a keyboard key, or behaves like one. This trait is automatically set on the system-supplied on-screen keyboards, as you’d expect. If you provide a control which modifies or otherwise affects on-screen keyboard-like input, you should consider applying this trait. For example, a control which (when active) changed the keys available on the on-screen keyboard would itself be considered a Keyboard Key. Likewise for controls which inserted predefined text, or switched between an alphanumeric keyboard and a formula-entry keypad, and so forth.

Always rely on your own judgement when specifying a control type trait; provide as much meaningful information as is reasonable, even if the definitions don’t fit exactly. Pick the most applicable one, and use it as long as it’s not too much of a stretch. The definitions are loose, and are intended to convey useful contextual hints rather than to be absolutely prescriptive.

The remaining traits describe the behaviour or state of the element, and can be combined freely. They are:

  • Static Text: This trait indicates that the element is static text, which is readable but cannot be interacted with and will not change. VoiceOver will automatically include the element if it is reading part of the screen aloud. UILabel controls are already considered static text, but if you use another type of view which happens to contain static text, you should indicate the fact using this trait.

  • Image: This trait indicates that the element is an image. You should supply an appropriate accessibility label to describe it in context. If interacting with the image does something, you can combine this trait with the Button or Link traits as appropriate.

  • Plays Sound: This trait indicates that the element plays a sound, and is commonly combined with the Button trait. It’s important to add this trait to elements which play sound, so that VoiceOver can automatically pause or reduce the volume of other sounds or speech which are playing at the time.

  • Selected: This trait indicates that the element is currently selected, and is usually dynamically applied depending on the state of the control. The Selected trait is appropriate for things like the currently-highlighted row in a table, the selected segment in a segmented control, a Keyboard Key which is currently active (such as a modifier or toggle key), and so on. The precise meaning of “Selected” is up to you, but it’s generally appropriate for currently chosen or active elements, which are not always in that state, and where the state has meaning and consequence for the user.

  • Summary Element: This trait indicates that the element provides a summary of the current situation or state when the application starts. A download manager app would use this trait for an element which held a summary of the number of current and completed downloads, for example, and a weather application would use this trait for an element which gave today’s forecast for the user’s home location. In each case, VoiceOver will read the element aloud to give the user some initial context when the app starts. This is an extremely useful trait when used judiciously. Be very sparing in your choice of summary elements.

  • Updates Frequently: This trait indicates that the element updates its label and/or value too often for VoiceOver to read out every single change. VoiceOver will take note of this trait and will instead poll for changes at suitable times. It is your responsibility to apply this trait appropriately to avoid overwhelming the user. Good examples of potentially frequently-updating elements would be progress bars, countdown timers, stopwatches, network throughput monitors and so forth.

  • Not Enabled: This trait indicates that the element is not enabled and does not respond to interaction. VoiceOver will also fall back on the control’s actual enabled state, as you’d expect. Note that this trait indicates that the control would normally be accessible and would support interaction, but that it is (currently) not enabled. If your element should never be accessible at all, do not use this trait – instead, simply mark the element as not being accessible, and it will be ignored by VoiceOver.

The following traits are also available programmatically, but not (at time of writing, in late 2010) in Interface Builder:

  • Starts Media Session: This trait indicates that interacting with the element starts a session during which VoiceOver should be silenced. An example would be a “record” button in a voice memo application, where you would not want assistive technologies to speak over your recording.

  • Adjustable: This trait indicates that the element can be adjusted continuously up or down through a range of values; an example would be a slider or a picker view. This allows VoiceOver to notify the user that they can interact with the element repeatedly to increase or decrease its value.

Choose the right combination of traits to most appropriately describe your app’s interface elements, and VoiceOver will do the rest; a short sketch of combining traits in code appears below. There are still some situations, however, in which Interface Builder isn’t sufficient. We cover these in the next section.
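Traits form a bit-mask, so combining them programmatically is a single bitwise OR. A minimal sketch; the view, label and tap handling here are illustrative:

    // Combining traits: a tappable thumbnail image which opens a web page,
    // making it both an Image and a Link.
    UIImageView *thumbnail = [[[UIImageView alloc]
        initWithFrame:CGRectMake(0.0f, 0.0f, 80.0f, 80.0f)] autorelease];
    thumbnail.userInteractionEnabled = YES;   // taps are handled elsewhere (not shown)
    thumbnail.isAccessibilityElement = YES;   // UIImageView isn't accessible by default
    thumbnail.accessibilityLabel = NSLocalizedString(@"Event photo", nil);
    thumbnail.accessibilityTraits = UIAccessibilityTraitImage | UIAccessibilityTraitLink;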

Adding accessibility support in code

In some cases, you won’t be able to provide the best possible accessibility support for a given control by only using Interface Builder. Three of the most common such situations are listed below.

  1. Your control’s accessibility information needs to change depending on the state of your app. For example, you might have a button which has a different effect in different contexts.

  2. You’ve created a new type of control, and its inherited accessibility information isn’t expressive enough. For example, perhaps you’ve created your own kind of split-view, or a custom tab-view, or some other kind of container or view-hierarchy. Or, perhaps you’ve created a three-stage switch button, or some other widget.

  3. You need to ensure that a visually impaired user is informed when some part of your interface changes, even if they’re not interacting with that specific part of the app at the time. For example, you may have a radio-button type of control which changes which view is displayed elsewhere in the window.

You can handle all of these situations easily, via some simple protocols and a handful of trivial function-calls.

For accessibility purposes, your app’s views and controls can be thought of as falling into one of three types: non-accessible elements, accessible elements, and containers for accessible elements. Containers are not themselves accessible or useful to the user, but they do contain sub-elements which should be accessible (indeed, they may contain sub-elements which are themselves containers, and so forth).

An example of an accessibility container would be a custom control which contains three related buttons, all drawn and managed by the same view. The view itself need not be accessible, but the three interactive elements within it should be treated as discrete accessible elements. This is what the concept of an accessibility container is intended for.

UIKit thus provides two informal protocols: UIAccessibility, and UIAccessibilityContainer. You can choose which one each of your elements should implement (or indeed neither). All standard UIKit controls are already configured appropriately, though you can of course customise their behaviour via subclassing.

The UIAccessibility protocol simply allows you to return appropriate values for each of the accessibility-related attributes listed previously, including whether the element is accessible at all. If you’re subclassing a UIKit control and implementing the accessibilityTraits method, be sure to combine your traits with the superclass’ implementation.
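As a sketch of both of those ideas, here’s a hypothetical UIButton subclass whose label depends on the app’s state (situation 1 in the list above), and which combines its traits with the superclass’s rather than replacing them:

    // A hypothetical play/pause button whose accessibility information changes
    // with app state. The 'playing' flag is toggled by playback code (not shown).
    @interface PlayPauseButton : UIButton {
        BOOL playing;
    }
    @end

    @implementation PlayPauseButton

    - (NSString *)accessibilityLabel {
        // The label describes what the button will do right now.
        return playing ? NSLocalizedString(@"Pause", nil)
                       : NSLocalizedString(@"Play", nil);
    }

    - (UIAccessibilityTraits)accessibilityTraits {
        // Combine with the superclass's traits, rather than replacing them.
        return [super accessibilityTraits] | UIAccessibilityTraitPlaysSound;
    }

    @end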

The UIAccessibilityContainer protocol is more interesting, though even briefer. In essence, it allows you to specify the number of accessible elements within the container, and then to provide each one in turn (as an instance of the UIAccessibilityElement class).

Instances of UIAccessibilityElement have properties corresponding to the accessibility label, hint, value, frame, traits and so on, as provided by the UIAccessibility protocol. You should set these to appropriate values on your UIAccessibilityElement objects, and VoiceOver will do the rest. It’s very, very easy, and should take little time.
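Here’s a compact sketch of the container pattern, for a hypothetical custom view which draws three “buttons” itself; the names, labels and geometry are all illustrative. Note that accessibilityFrame is expressed in screen coordinates, not view coordinates:

    // A hypothetical custom view drawing three buttons itself, exposed to
    // VoiceOver as three discrete accessible elements.
    @interface TransportControlView : UIView {
        NSMutableArray *accessibleElements;  // cached UIAccessibilityElements
    }
    @end

    @implementation TransportControlView

    - (BOOL)isAccessibilityElement {
        return NO;  // the container itself isn't accessible; its sub-elements are
    }

    - (NSArray *)accessibleElements {
        if (accessibleElements == nil) {
            accessibleElements = [[NSMutableArray alloc] init];
            // Localise these in a real app.
            NSArray *labels = [NSArray arrayWithObjects:@"Play", @"Pause", @"Stop", nil];
            for (NSUInteger i = 0; i < [labels count]; i++) {
                UIAccessibilityElement *element = [[[UIAccessibilityElement alloc]
                    initWithAccessibilityContainer:self] autorelease];
                element.accessibilityLabel = [labels objectAtIndex:i];
                element.accessibilityTraits = UIAccessibilityTraitButton;
                // Convert from view coordinates to screen coordinates.
                CGRect segment = CGRectMake(i * 60.0f, 0.0f, 60.0f, 44.0f);  // hypothetical layout
                CGRect inWindow = [self convertRect:segment toView:nil];
                element.accessibilityFrame = [self.window convertRect:inWindow toWindow:nil];
                [accessibleElements addObject:element];
            }
        }
        return accessibleElements;
    }

    // The three UIAccessibilityContainer methods simply defer to the cached array.
    - (NSInteger)accessibilityElementCount {
        return [[self accessibleElements] count];
    }

    - (id)accessibilityElementAtIndex:(NSInteger)index {
        return [[self accessibleElements] objectAtIndex:index];
    }

    - (NSInteger)indexOfAccessibilityElement:(id)element {
        return [[self accessibleElements] indexOfObject:element];
    }

    - (void)dealloc {
        [accessibleElements release];
        [super dealloc];
    }

    @end

If the view can move or resize, remember that the cached frames will go stale; recompute them (or clear the cache) when your layout changes.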

Lastly, you may occasionally need to inform the user that some aspect of the app’s layout has changed, or that they’ve passed over a boundary into a new area, or even to explicitly read some text aloud as a notification. There are a number of notifications and function calls defined by UIAccessibility which allow you to inform the user of certain pre-defined situations, to post custom textual notifications, and to be informed about changes in VoiceOver’s enabled status.

Use the notifications with restraint, where they’ll add value. You can get a very good idea of how the notifications trigger audio cues, and when it’s appropriate to post them, by simply using your iOS device with VoiceOver enabled (particularly the Home screen).
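For reference, these are the calls involved; a minimal sketch, with illustrative announcement text (the announcement and VoiceOver-status APIs require iOS 4 or later):

    // After a layout change, such as one control swapping the view shown
    // elsewhere on screen. Pass an accessible element instead of nil to move
    // the VoiceOver cursor to that element.
    UIAccessibilityPostNotification(UIAccessibilityLayoutChangedNotification, nil);

    // When the contents of the entire screen change:
    UIAccessibilityPostNotification(UIAccessibilityScreenChangedNotification, nil);

    // Ask VoiceOver to speak a short custom announcement:
    UIAccessibilityPostNotification(UIAccessibilityAnnouncementNotification,
                                    NSLocalizedString(@"Download complete.", nil));

    // Check VoiceOver's enabled status, and observe changes to it.
    // (voiceOverStatusChanged: is a hypothetical method of your own.)
    if (UIAccessibilityIsVoiceOverRunning()) {
        // e.g. allow extra time before auto-dismissing a transient message
    }
    [[NSNotificationCenter defaultCenter] addObserver:self
                                             selector:@selector(voiceOverStatusChanged:)
                                                 name:UIAccessibilityVoiceOverStatusChanged
                                               object:nil];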

Testing your app’s accessibility support

It’s essential to test your app’s accessibility support properly. There are two situations you’ll particularly want to look out for:

  1. Elements which either do not provide a suitable accessibility label, or are not marked as accessible at all.

  2. Elements which are read by VoiceOver (and are thus accessible) but which would be better ignored.

You can perform very basic testing in the Simulator by showing the Accessibility Inspector, but this won’t let you hear what your accessibility information sounds like to a visually impaired user. You should always test on the device itself; you can enable VoiceOver in the Settings application (or via iTunes, when your device is connected).

Testing on the device will also give you some perspective on how visually impaired users will interact with your app, and you’ll perhaps become aware of some ways to improve the workflow. Generally, if you find a way to improve the user experience for visually impaired users, that improvement will also benefit sighted users. And if you’re diligent about accessibility support, the reverse is true too: improvements aimed at sighted users will carry over to your visually impaired users.

Conclusion

Implementing accessibility support for iOS applications is easy and fast. You can do most of it in Interface Builder, and the rest very rapidly in code – without requiring any structural changes to your project. It’s the right thing to do, and I truly hope you’ll consider spending the extra couple of hours to make your app accessible.

I didn’t write this to bring you down, or to sadden or depress anyone. There’s no need to feel sorry for anybody, least of all for me. With these devices and technologies, there’s no need for visually impaired users to be kept from using your apps. They can read, write, chat, browse the web, play games, make music, develop software and pretty much anything else that sighted users can do, when given equal access. That’s a wonderful, amazing thing. These technologies are a lifeline, and as a developer it’s your duty and privilege to make it possible.

I’ve never asked for an article to be tweeted or dugg or whatever the fashionable method of sharing is at the moment. I do believe that this topic is truly important, and not just because someday the person whose life you improve with accessibility support may be yourself (or me). If you know someone who could benefit from reading this article (or even better, who could benefit others by reading and acting on it), then I hope you’ll share it with them.

If you’d like to read more about iOS accessibility for developers, I can highly recommend Apple’s Accessibility Programming Guide for iOS.

I have a speed-dial (and SMS, email and FaceTime) iPhone app available on the app store called Favorites; it’s fully accessible via VoiceOver using the techniques described in this article.

If you enjoyed this article, you may want to follow me (@mattgemmell) on Twitter; I always announce significant new articles via a tweet. You can also subscribe to this blog in your preferred RSS reader app; there’s a link near the top-right of this page, in the header section.

Feedback is most welcome, as always. You can post a comment using the comments form below, and you can also contact me directly – you can find my contact details on my About page.

You may also be interested in TouchTalk for iPhone, an app designed to help blind or deaf-blind people to communicate, using a modified version of the deaf-blind manual alphabet.