
#iOSDevUK – Sally Shepard: Beyond VoiceOver

Surprise Attack
Credit: DeviantArt / SkylerTrinityRapture

My notes from Sally Shepard’s talk on Accessibility at iOSDevUK. It was really good; I thought I knew quite a bit about a11y, but actually only VoiceOver really, so I learned a lot. Her slides are here.

Passionate about accessibility; there are accessibility issues in her family.

Myths:

1 in 7 people have some form of disability, and it’s a growing population. This doesn’t include temporary impairments (e.g. breaking an arm or a finger). Disabilities can make life extremely difficult, but technology can be used to overcome these challenges.

Vision:

Wide range of impairments. How do these people use iOS?

VoiceOver

Make views accessible using isAccessibilityElement. Can also set accessibilityLabel; UIKit controls use their title by default, but image-based controls need to specify this! Don’t include the control type in the label.

accessibilityTraits: Combination of traits that best characterise the accessibility element.

accessibilityValue: Used when an element has a dynamic value, like a slider.

accessibilityHint: Describes the outcome of performing an action.

Adding support to xib or storyboard:
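(In Interface Builder these live in the Identity inspector, which has an Accessibility section with an Enabled checkbox and fields for the label, hint, and traits; this describes the current Xcode UI rather than the talk’s slides.)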

Adding support programmatically:
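A minimal sketch of what this might look like, using the current Swift API (the button and slider here are illustrative, not from the talk):

import UIKit

// An image-based control: VoiceOver can't read anything off the image,
// so give it a label explicitly. Note: no control type in the label.
let favouriteButton = UIButton(type: .custom)
favouriteButton.setImage(UIImage(named: "star"), for: .normal)
favouriteButton.isAccessibilityElement = true
favouriteButton.accessibilityLabel = "Favourite"
favouriteButton.accessibilityTraits = .button
favouriteButton.accessibilityHint = "Adds this item to your favourites."

// An element with a dynamic value.
let volumeSlider = UISlider()
volumeSlider.accessibilityLabel = "Volume"
volumeSlider.accessibilityValue = "50 percent"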

Most apps have moved beyond the basics: gestures, games. Handle this by finding out if the user has VoiceOver on, and if so, present something different.
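Something like this, using the current Swift spelling (at the time of the talk this was the C function UIAccessibilityIsVoiceOverRunning(); the two functions called below are hypothetical):

import UIKit

func configureControls() {
    if UIAccessibility.isVoiceOverRunning {
        // e.g. replace a custom drag gesture with explicit buttons
        showAccessibleControls()
    } else {
        showGestureDrivenControls()
    }
}

func showAccessibleControls() { /* hypothetical: explicit buttons */ }
func showGestureDrivenControls() { /* hypothetical: gesture-driven UI */ }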

UIAccessibilityCustomAction: Can add multiple actions to an element, e.g. an array of actions on a table cell. In Apple’s own apps since iOS 7; now in the public API in iOS 8.
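A sketch of the table-cell case (the action names and selectors are illustrative):

import UIKit

class MessageCell: UITableViewCell {
    func configureAccessibilityActions() {
        let reply = UIAccessibilityCustomAction(
            name: "Reply", target: self, selector: #selector(replyToMessage))
        let delete = UIAccessibilityCustomAction(
            name: "Delete", target: self, selector: #selector(deleteMessage))
        // VoiceOver users swipe up/down on the cell to pick one of these.
        accessibilityCustomActions = [reply, delete]
    }

    @objc func replyToMessage(_ action: UIAccessibilityCustomAction) -> Bool {
        // Hypothetical: perform the reply; return true if it succeeded.
        return true
    }

    @objc func deleteMessage(_ action: UIAccessibilityCustomAction) -> Bool {
        // Hypothetical: delete the message.
        return true
    }
}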

UIAccessibilityContainer: Specify the order VoiceOver should go through the elements.
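The simplest way to do this today is the accessibilityElements array, added (I believe) in iOS 8. A sketch, with illustrative view names:

// Assumed: titleLabel, subtitleLabel, and actionButton are subviews
// of containerView. VoiceOver visits them in exactly this order,
// regardless of their position in the view hierarchy.
containerView.accessibilityElements = [titleLabel, subtitleLabel, actionButton]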

accessibilityActivate(), added in iOS 7. Gets called when the user double-taps. Good when a gesture is normally used to activate the element.
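For example, in a view that normally responds to a custom gesture (a sketch; toggleZoom() is hypothetical):

import UIKit

class ZoomableImageView: UIImageView {
    override func accessibilityActivate() -> Bool {
        // Called when a VoiceOver user double-taps this element,
        // instead of the custom gesture they may not be able to perform.
        toggleZoom()
        return true   // true means the activation was handled
    }

    private func toggleZoom() { /* hypothetical zoom behaviour */ }
}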

Direct interaction: the UIAccessibilityTraitAllowsDirectInteraction trait lets VoiceOver pass touches straight through to the view. Have to be careful about how you use it.

Accessibility notifications: know if VoiceOver is speaking, and when it has finished speaking. Can tell it to read things out at specific times.
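A sketch with the current Swift names (originally the UIAccessibilityPostNotification() function and friends):

import UIKit

// Ask VoiceOver to read something out at a specific moment.
UIAccessibility.post(notification: .announcement, argument: "Download complete")

// Find out when an announcement has finished being spoken.
NotificationCenter.default.addObserver(
    forName: UIAccessibility.announcementDidFinishNotification,
    object: nil,
    queue: .main
) { note in
    // note.userInfo carries the announcement text and whether it succeeded.
}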

Two-finger double tap (the “magic tap”): e.g. in Camera, will take a picture.
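An app opts in by overriding accessibilityPerformMagicTap() somewhere in the responder chain; a sketch (takePicture() is hypothetical):

import UIKit

class CameraViewController: UIViewController {
    override func accessibilityPerformMagicTap() -> Bool {
        // Do the app's single most important action.
        takePicture()
        return true
    }

    private func takePicture() { /* hypothetical capture code */ }
}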

What if you’re not using UIKit? Implement the UIAccessibilityContainer protocol. VoiceOver just needs to know the frame of the contents and where they are on screen. There’s good sample code from WWDC.
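A sketch of that idea for a view that draws its own content (the bar model here is hypothetical, not the WWDC sample):

import UIKit

class ChartView: UIView {
    // Hypothetical model: each bar knows its label and its frame in this view.
    var bars: [(label: String, frame: CGRect)] = []

    private var cachedElements: [UIAccessibilityElement]?

    override var accessibilityElements: [Any]? {
        get {
            if cachedElements == nil {
                cachedElements = bars.map { bar in
                    let element = UIAccessibilityElement(accessibilityContainer: self)
                    element.accessibilityLabel = bar.label
                    // VoiceOver needs the frame in screen coordinates.
                    element.accessibilityFrame =
                        UIAccessibility.convertToScreenCoordinates(bar.frame, in: self)
                    return element
                }
            }
            return cachedElements
        }
        set { cachedElements = newValue as? [UIAccessibilityElement] }
    }
}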

Testing VoiceOver:

User testing:

Motor skills: Maybe can’t perform gestures, press buttons, or hold a phone. In that case, the device is just a blank screen; they can’t do anything with it.

AssistiveTouch: Gives access to things like multi-finger gestures and shaking.

Switch Control: Introduced in iOS 7. Allows people to use the device via a series of switches, which can be operated by hands, feet, head, anything. One switch or multiple switches, depending on abilities.

Amazing feature, very necessary, glad they added it.

Adding support for switch control:

Have to test on a device; the simulator only gives you the Accessibility Inspector.

Go through, same thing: make sure you can do the things your app does.

Contact Apple; they’re super happy to help with things like that. Or talk to local charities or user groups.

Learning Difficulties

Autism, or cognitive disabilities. iOS can be distracting because it’s quite an engaging experience. How does someone use it?

Guided Access. Helps them focus. A parent or caregiver can specify what actions shouldn’t be allowed.

UIAccessibilityIsGuidedAccessEnabled(), new in iOS 8
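A sketch with the current Swift spelling (hideDistractions() is hypothetical):

import UIKit

func applyGuidedAccessSettings() {
    if UIAccessibility.isGuidedAccessEnabled {
        // e.g. simplify the screen while a parent or caregiver
        // has locked the device into this app
        hideDistractions()
    }
}

// React when Guided Access is turned on or off.
NotificationCenter.default.addObserver(
    forName: UIAccessibility.guidedAccessStatusDidChangeNotification,
    object: nil, queue: .main
) { _ in
    applyGuidedAccessSettings()
}

func hideDistractions() { /* hypothetical */ }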

Visual accommodations:

Why add a11y?

Things to do:

Try:

Facts:
