My notes from Sally Shepard's talk on Accessibility at iOSDevUK. It was really good; I thought I knew quite a bit about a11y, but really only VoiceOver, so I learned a lot. Her slides are here.
Sally is passionate about accessibility; there are accessibility issues in her family. Common excuses for not supporting accessibility:
- Not that many people.
- Time consuming.
- Too complicated.
- Don’t know how to test it.
1 in 7 people have some form of disability. It’s a growing population. This doesn’t include temporary impairments (break an arm, finger). Disabilities can make life extremely difficult. Can use technology to overcome these challenges.
- Complete blindness.
Wide range. How do these people use iOS?
- Replicates the UI for users who can’t see it.
- 36 languages.
- On iOS and OS X, iPod shuffle.
- Can also extend using braille.
- Braille displays.
- Braille keyboards.
- Makes a device that is completely usable for wide range of people that wouldn’t be able to otherwise.
- Single tap to hear. Double tap to open.
- Camera app. Demo – finally understand face detection.
- Wouldn’t have thought camera could be accessible.
- Demo: text editing.
- Demo: Flappy Bird. VoiceOver doesn't see anything on the screen.
- If an app isn’t accessible, it’s just like a blank screen.
Make views accessible using isAccessibilityElement. Also set accessibilityLabel: UIKit controls use their title by default, but image-based controls need it specified explicitly! Don't include the control type in the label.
accessibilityTraits: Combination of traits that best characterise the accessibility element.
accessibilityValue: Used when element has a dynamic value. Like a slider.
accessibilityHint: Describes the outcome of performing an action.
Adding support to xib or storyboard:
- Enable a11y.
- Fill out label.
- Add hint traits.
Adding support programmatically:
- Set label.
- Set hint.
- Set value.
- Set traits.
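The four programmatic steps above might look like this sketch in modern Swift (the RatingView control and its strings are illustrative, not from the talk):

```swift
import UIKit

// Hypothetical custom star-rating control, made accessible in code.
class RatingView: UIView {
    var rating = 3 {
        didSet { accessibilityValue = "\(rating) of 5 stars" }  // dynamic value
    }

    override init(frame: CGRect) {
        super.init(frame: frame)
        isAccessibilityElement = true
        accessibilityLabel = "Rating"                  // no control type in the label
        accessibilityHint = "Adjusts the star rating." // outcome of the action
        accessibilityTraits = .adjustable              // best-fitting trait
        accessibilityValue = "\(rating) of 5 stars"
    }

    required init?(coder: NSCoder) { fatalError("init(coder:) has not been implemented") }
}
```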
Most apps have moved beyond the basics: custom gestures, games. Handle this by finding out if the user has VoiceOver on and, if so, presenting something different.
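One way to sketch that check, using the UIAccessibility status API (modern Swift names; the function and its return values are illustrative):

```swift
import UIKit

// Choose an alternative interaction when VoiceOver is active.
func preferredGameInput() -> String {
    if UIAccessibility.isVoiceOverRunning {
        return "buttons"   // discrete, focusable controls instead of free-form gestures
    }
    return "gestures"
}

// React if the user toggles VoiceOver while the app is running.
let observer = NotificationCenter.default.addObserver(
    forName: UIAccessibility.voiceOverStatusDidChangeNotification,
    object: nil, queue: .main) { _ in
    // rebuild or re-present the UI for the new mode
}
```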
UIAccessibilityCustomAction: Can add multiple actions to an element, e.g. an array of actions on a table cell. In Apple's own apps since iOS 7, now in the API for iOS 8.
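A minimal sketch of the table-cell case (MessageCell and the action names are hypothetical):

```swift
import UIKit

// Expose swipe-style actions as VoiceOver custom actions on a cell.
// VoiceOver users pick one with a vertical swipe, then double-tap.
class MessageCell: UITableViewCell {
    func configureAccessibilityActions() {
        let reply = UIAccessibilityCustomAction(name: "Reply",
                                                target: self,
                                                selector: #selector(replyTapped))
        let delete = UIAccessibilityCustomAction(name: "Delete",
                                                 target: self,
                                                 selector: #selector(deleteTapped))
        accessibilityCustomActions = [reply, delete]
    }

    @objc func replyTapped() -> Bool { /* perform reply */ return true }
    @objc func deleteTapped() -> Bool { /* perform delete */ return true }
}
```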
UIAccessibilityContainer: Specify the order voiceover should go through the elements.
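The simplest form of container ordering is setting the accessibilityElements array on a parent view (the view and its subviews here are illustrative):

```swift
import UIKit

// Force a logical reading order that may differ from the visual layout.
class CheckoutView: UIView {
    let titleLabel = UILabel()
    let priceLabel = UILabel()
    let buyButton = UIButton(type: .system)

    func configureReadingOrder() {
        // VoiceOver visits elements in exactly this order.
        accessibilityElements = [titleLabel, priceLabel, buyButton]
    }
}
```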
accessibilityActivate, added in iOS 7. Gets called when the user double taps. Good when a gesture is normally used to activate.
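A sketch of overriding it on a view that is normally driven by a custom gesture (DrawingCanvas and startDrawingSession are hypothetical names):

```swift
import UIKit

// VoiceOver's double-tap calls accessibilityActivate() instead of the
// custom gesture the sighted user would perform.
class DrawingCanvas: UIView {
    override func accessibilityActivate() -> Bool {
        // Do the same work the custom gesture would have triggered.
        startDrawingSession()
        return true   // true = the activation was handled
    }

    func startDrawingSession() { /* ... */ }
}
```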
Direct interaction (the UIAccessibilityTraitAllowsDirectInteraction trait) passes touches straight through to the view. Have to be careful about how you use it.
A11y notifications: know if VoiceOver is speaking and when it has finished speaking. Can tell it to read things out at specific times.
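For example, posting an announcement and observing when it finishes might look like this (modern Swift names; the announcement text is illustrative):

```swift
import UIKit

// Tell VoiceOver to read something out at a specific time.
UIAccessibility.post(notification: .announcement, argument: "Upload complete")

// Find out when VoiceOver finished speaking the announcement.
let observer = NotificationCenter.default.addObserver(
    forName: UIAccessibility.announcementDidFinishNotification,
    object: nil, queue: .main) { note in
    let spoken = note.userInfo?[UIAccessibility.announcementStringValueUserInfoKey]
    let success = note.userInfo?[UIAccessibility.announcementWasSuccessfulUserInfoKey]
    // e.g. re-post if success is false because the user interrupted it
    _ = (spoken, success)
}
```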
Two-finger double tap (the "magic tap"), e.g. in Camera it takes a picture.
What if not using UIKit? Implement UIAccessibilityContainer protocol. VoiceOver just needs to know the frame of the contents and where they are on screen. Good sample code from WWDC.
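A sketch of that idea for a custom-drawn view, vending UIAccessibilityElement objects with frames (ChartView and the Bar model are hypothetical):

```swift
import UIKit

// A custom-drawn chart exposing each bar as its own accessibility element;
// VoiceOver only needs a label and a frame for each one.
struct Bar {
    let label: String
    let frame: CGRect   // in the chart view's own coordinate space
}

class ChartView: UIView {
    var bars: [Bar] = []

    override var accessibilityElements: [Any]? {
        get {
            bars.map { bar in
                let element = UIAccessibilityElement(accessibilityContainer: self)
                element.accessibilityLabel = bar.label
                // Frame given in the container's coordinate space;
                // UIKit converts it to screen coordinates for VoiceOver.
                element.accessibilityFrameInContainerSpace = bar.frame
                return element
            }
        }
        set { }
    }
}
```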
- Test plans
- User stories
- Use cases
- Do all of these with VO.
- Simulator good for debugging. Use accessibility inspector.
- A11y shortcut: triple-click the home button. Or ask Siri!
- Screen curtain: three-finger triple tap on the screen. Good way to conserve battery! Makes sure you are not cheating.
- WWDC labs
- Charities and local councils
- Support groups
Motor skills: maybe the user can't perform gestures, press buttons, or hold a phone. In that case, the device is a blank screen; they can't do anything with it.
AssistiveTouch: gives access to things like multi-finger gestures and shaking.
Switch control: In iOS 7. Allows people to use device by using a series of switches. Can be used by hands, feet, head, anything. One switch or multiple switches depending on abilities.
- Camera with switch control, take a picture.
- Flappy Bird with switch control: not very successful!
Amazing feature, very necessary, glad they added it.
Adding support for switch control:
- Find elements that have actionable behaviour
- If you've gone through the a11y APIs for VoiceOver, it should work.
- Could make it better: if you implemented the a11y container protocol, specify a better order.
Have to test on a device; the simulator only gives you the inspector.
Go through the same process: make sure you can do the things your app does.
Contact Apple; they're super happy to help with things like that. Talk to local charities or user groups.
Autism or cognitive disabilities: iOS can be distracting, because it's quite an engaging experience. How does someone use it?
Guided Access helps them focus. A parent or caregiver can specify what actions shouldn't be allowed.
UIAccessibilityIsGuidedAccessEnabled. New in iOS 8, you can also check:
- Is bold text enabled
- Reduce transparency
- Darker system colors
- Reduce motion
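Checking those settings might look like this sketch (modern Swift names for the iOS 8 APIs; the responses in the branches are illustrative):

```swift
import UIKit

// Adapt the app to the user's system accessibility settings.
func adaptToAccessibilitySettings() {
    if UIAccessibility.isGuidedAccessEnabled {
        // hide navigation a caregiver may have restricted
    }
    if UIAccessibility.isBoldTextEnabled {
        // prefer heavier font weights
    }
    if UIAccessibility.isReduceTransparencyEnabled {
        // replace blur/vibrancy with solid backgrounds
    }
    if UIAccessibility.isDarkerSystemColorsEnabled {
        // increase contrast of custom colors
    }
    if UIAccessibility.isReduceMotionEnabled {
        // swap parallax/zoom animations for cross-fades
    }
}
```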
Why add a11y?
- Good thing to do
- Apple really cares about it, e.g. Tim Cook's "When we work on making our devices accessible by the blind, I don't consider the bloody ROI"
- More users
- A11y is for everyone: lots of features, we take for granted. Siri, dictation, audio and video messages, are huge things for people with impairments.
- Craftsmanship: commitment to quality
- Most amazing thing you can do as a dev. Change someone’s life. Give them a voice.
Things to do:
- Push to OSS projects you use
- Talk about it more – blog about it
- Get involved
- Still a LOT to do
- Even if it seems like only a few people, you can make a big difference
- Spend a whole day with VoiceOver on (very few apps support it)
- Take one weekend to do something with a11y.
- Work with charity to run a hackathon or hack day
- As a dev it's up to you to make your app accessible
- It is a lot of people: 1 in 7
- Very simple to add
- No app is too complicated to be a11y
- Testing is straightforward