RWAPP

My name's Rob. I'm an iOS software development engineer for Capital One UK. I talk & write about mobile accessibility.

SwiftUI Accessibility

Accessibility is important. We can take that as a given. But as iOS devs we’re not always sure how to make the most of the accessibility tools that Apple have provided us.

We’re lucky as iOS developers that we work on such a forward-thinking accessibility platform. Many people credit Apple’s focus on accessibility in iOS with driving other technology vendors to include accessibility features as standard, to the point that we now consider accessibility an expected part of any digital platform. This was not the case before 2009.

In Shelly Brisbin’s fantastic audio documentary 36 Seconds that Changed Everything: How the iPhone Learned To Talk she outlines what it meant to blind and partially sighted people to be locked out of the early iPhones.

I was sad because I felt, ‘here’s another time we’re going to be left out’. Eventually, someone’s going to make a special blindness-specific iDevice. It’ll be three versions old. It’ll cost four times as much, and we’ll just keep buying it, cuz it’s the only option that we have.

– Steve Sawczyn

For the first time in 20 years, Apple had built a product I couldn’t use. I’m fairly sure I cried about that.

– Shelly Brisbin

Being locked out is the reality for many of your customers if you don’t consider accessibility right from the start. Accessibility in UIKit is indeed world-class, but it will only ever be an add-on.

This time around, with SwiftUI, Apple has taken the chance to re-think how some of their accessibility tools work for developers, and they’ve baked in accessibility right from the very beginning. Apple’s accessibility teams have been an integral part of some of the decisions that have shaped SwiftUI. You can see this throughout your SwiftUI code. Like the way images are now accessible by default. How controls are now all linked to text names. And how dynamic type is now the default. This is exactly why I believe this guide to SwiftUI accessibility is important right now. Let’s follow Apple’s lead and make accessibility a first-class citizen in our apps.

The biggest change, and the one that will make the most impact for your users, requires no work from you at all aside from adopting SwiftUI. That is down to how SwiftUI generates its accessibility tree, or accessible user interface, meaning your assistive technology users will always get the experience you intended.

Tweaking your accessible experience is still possible in areas where your UI doesn’t quite work for assistive technology users. Accessibility attributes and traits can still be set on every view in a way that should feel familiar from UIKit. But SwiftUI’s improvements for setting the accessibility sort priority and creating semantic views make these techniques so simple there’s really no reason not to use them.

Sometimes we make the mistake of thinking about accessibility as something for other people. But accessibility is all about customisation. We all like to make changes to our device to make it work better for us, like every developer’s favourite: dark mode. So it’s also essential to listen to your customers’ preferences for accessibility settings and decide how your app should respond. This will give all your customers the best possible experience.

I can’t wait to start using your accessible SwiftUI apps. If you’re unsure of the best way to improve accessibility for your app, feel free to reach out.


Thanks for reading. This story is part of a series on SwiftUI Accessibility. Check out my other guides in this series:
SwiftUI Accessibility
SwiftUI Accessibility: Named Controls
SwiftUI Accessibility: Images
SwiftUI Accessibility: Dynamic Type
SwiftUI Accessibility: Accessible User Interface
SwiftUI Accessibility: Sort Priority
SwiftUI Accessibility: Attributes
SwiftUI Accessibility: Traits
SwiftUI Accessibility: User Settings
SwiftUI Accessibility: Semantic Views

SwiftUI Accessibility - Semantic Views

Semantic views are not new to SwiftUI, but changes in SwiftUI mean creating them is simple. Semantic views are not so much a language feature. They’re more a technique for manipulating the accessible user interface and improving the experience for assistive technology users.

A what view?

A semantic view is not one view, but a collection of views grouped together because they have meaning (or semantics) together. Take a look at this iOS table view cell from the Files app.

VoiceOver highlighting a table cell with the title MyPlayground

I have enabled VoiceOver, and have navigated down the list to a Swift Playgrounds file. VoiceOver reads “MyPlayground. 16th of July two thousand and nineteen. 606 bytes. In iCloud.” All this information is made up of two labels and a button: the cell title “MyPlayground”, the subtitle including the date and size, and the iCloud download button. But VoiceOver reads them all together without me needing to navigate each element.

iOS does this for us automatically in table views by grouping the cell’s content into the cell and presenting the cell to VoiceOver as one semantic view. This makes navigation simpler by reducing swipes. It also provides more context for each item. If we heard “MyPlayground.” swipe “16th of July two thousand and nineteen. 606 bytes.” how could we be certain what the date and size refer to?

Stacks

Semantic views in SwiftUI start with stacks. This makes perfect sense because stacks are how we visually group elements, so why shouldn’t we use these to group accessibility elements too? Stacks aren’t accessibility elements by default because on their own they have no value we can convey to our user. By adding a modifier we can make the stack take on the accessibility attributes and traits of the elements they contain.
For this example, we’re going to use this stack. It contains a title, subtitle, and an image that acts as a button.

Title: Mars. Subtitle: The Red Planet. Heart image.

Here’s the code used to create it.

HStack {
    VStack(alignment: .leading, spacing: 10) {
        Text("Mars")
            .foregroundColor(.white)
            .font(.largeTitle)
        Text("The Red Planet")
            .foregroundColor(.white)
            .font(.subheadline)
    }
    .padding()

    Image(systemName: self.liked ? "heart.fill" : "heart")
        .padding()
        .foregroundColor(.pink)
        .font(.title)
        .onTapGesture { self.tappedLiked() }
        .accessibility(addTraits: .isButton)
}
.background(Color.black.opacity(0.7))
.padding()

.accessibilityElement

The current behaviour for VoiceOver on this stack is to read “Mars.” Swipe. “The Red Planet.” Swipe. “Button. Image.” That’s fine, but really this stack has only one purpose: to introduce the screen and allow our user to like this planet. So if we group this together for accessibility customers, it’s going to be simpler and clearer. We can do this using the .accessibilityElement(children: ) modifier. There are three arguments we can pass to this modifier.

.contain

The .contain argument is the default behaviour. It tells the stack that all accessibility elements within the stack should be treated as individual elements for assistive technology. This produces the result “Mars.” Swipe. “The Red Planet.” Swipe. “Button. Image.”

.combine

.combine takes all the accessibility properties from inside the stack and combines them into one set of properties. These properties are then added to the stack. The individual elements in the stack are then hidden from assistive technologies.
If we add .accessibilityElement(children: .combine) to our stack above, the result in VoiceOver is “Mars. Newline. The Red Planet. Button. Image.”
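As a minimal sketch of applying this modifier, using a simplified two-label stack rather than the full example above:

```swift
VStack(alignment: .leading, spacing: 10) {
    Text("Mars")
        .font(.largeTitle)
    Text("The Red Planet")
        .font(.subheadline)
}
// Merge the children's accessibility properties into one element,
// so VoiceOver reads both labels with a single swipe.
.accessibilityElement(children: .combine)
```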

.ignore

The .ignore argument tells assistive technology to ignore any accessibility elements within this stack. Ignore will turn your stack into a focusable element for assistive technology, but no content will be read. This means we can add our own accessibility attributes and traits to the stack view for VoiceOver to read, replacing the attributes of the child elements.

For the stack above we want to add the following modifiers.

.accessibilityElement(children: .ignore)
.accessibility(label: Text("Mars. The Red Planet. Like."))
.accessibility(addTraits: .isButton)

This causes VoiceOver to read “Mars. The Red Planet. Like. Button”.

This is not the same behaviour as using the modifier .accessibility(hidden: true). This modifier will remove the stack and all elements within it from the accessible user interface. Adding accessibility attributes along with the hidden modifier would be pointless.

.accessibilityAction

There’s one more thing we need to add. While our .accessibilityElement(children: .combine / .ignore) changes above tell assistive technologies our stack is a button, the stack doesn’t have an action when activated. For that, we need to tell our accessible user interface that our stack has an accessibility action, one that can only be activated with assistive technology. We do this with the .accessibilityAction modifier. This modifier takes a trailing closure where we can call the same code as our button.

.accessibilityAction { self.tappedLiked() }
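Putting the pieces together, here’s a sketch of the complete semantic view. The liked property and tappedLiked() method are assumed to exist on the enclosing view, as in the earlier snippet:

```swift
HStack {
    VStack(alignment: .leading, spacing: 10) {
        Text("Mars")
        Text("The Red Planet")
    }

    Image(systemName: self.liked ? "heart.fill" : "heart")
        .onTapGesture { self.tappedLiked() }
}
// Replace the children with a single element, describe it ourselves,
// and give assistive technology an action equivalent to the tap gesture.
.accessibilityElement(children: .ignore)
.accessibility(label: Text("Mars. The Red Planet. Like."))
.accessibility(addTraits: .isButton)
.accessibilityAction { self.tappedLiked() }
```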


SwiftUI Accessibility - User Settings

SwiftUI allows us to read environment values that might affect how we want to present our UI, things like size classes and locale for example. We also get the ability to read some of the user’s chosen accessibility settings, allowing us to make decisions that best fit our customers’ preferences.

Why?

Before we cover what these options are and how to detect them I think it’s important to briefly cover why we need to detect them. There are a few dos and don’ts worth considering.

Don’t – Creep on accessibility customers

The percentage of your customers with one or more of these options selected is going to be small. This makes these settings an easy choice for fingerprinting. In fact, some accessibility experts argue we shouldn’t allow software to detect these settings at all. While I’m not sold on that argument, you must respect your users as individuals.
I wince every time I see someone who uses a wheelchair referred to as a ‘wheelchair.’ The person is not a wheelchair. The person is an individual who just happens to use a specific tool to help with their mobility. If you detect that one of your customers is using VoiceOver and record this as personal information, you’re doing the digital equivalent. It’s pretty hard to argue this kind of fingerprinting is not immoral, so please, don’t even try.

Do – keep anonymous stats

What is important is keeping anonymous stats. Record percentages for your customers who have different settings or assistive technologies enabled. This will help your design teams create designs that work best for your customers. It will also help your business prioritise accessibility work.

Don’t – make a new interface

It can be tempting to make wholesale changes to an interface when you detect certain settings or assistive technologies. This is a somewhat paternalistic approach that can come across as patronising. More than this though, it’s liable to create a ghettoisation of accessibility experiences. Your accessibility-specific experience is liable to have less testing and fewer updates. If you feel your interface would be significantly improved for accessibility users by nuking it, then you probably need to nuke it for everyone. Any changes you make as a result of your customer changing one of these settings should be nothing more than a tweak.

Do – respect your customer’s choices

If your customer has chosen to enable Reduce Motion, that’s because they get a benefit from that setting. So if you’re using animation in your app, you should consider whether you’d be better off cutting it out when this setting is enabled. It might only be a small number of people who use this setting, but the feeling of being valued when a user notices you’ve listened to their choice is equal to the feeling that you don’t care when your customer notices you haven’t.

SwiftUI Environment properties

There are two ways of detecting settings; which one you can use depends on what setting you’re looking for. I’m unclear why some use the SwiftUI environment and others don’t. My best guess is that Apple will migrate others to the SwiftUI environment in future versions.
Firstly, we can use the @Environment property wrapper, and provide the key path we want to observe. We then assign this to a var we can use in our view. For example, if we wanted to determine our user’s Differentiate Without Colour setting, we’d ask SwiftUI to set this to a differentiateWithoutColor variable.

@Environment(\.accessibilityDifferentiateWithoutColor) var differentiateWithoutColor

In practical terms, we then would want to change the appearance of our UI based on our knowledge of our customer’s setting. If we wanted to determine whether to use a transparent background, we can do something like this.

struct ContentView: View {
    @Environment(\.accessibilityReduceTransparency) var reduceTransparency

    var body: some View {
        Text("Some text")
            .padding()
            // sRGB components are in the range 0...1, so use 1 for full red.
            .background(Color(.sRGB, red: 1, green: 0, blue: 0, opacity: reduceTransparency ? 1.0 : 0.8))
    }
}

Here are some user settings you might want to listen to, and how you might want to adjust your SwiftUI views to accommodate them.

sizeCategory

Returns a ContentSizeCategory enum value describing your customer’s chosen text size. Majid Jabrayilov has a great post on how you might re-arrange your layout based on this value.

legibilityWeight

An enum returning .bold if your user has chosen bold text, .regular if not. There’s no need to add a .bold() modifier to your text if this returns .bold; SwiftUI does this for us.

accessibilityDifferentiateWithoutColor

If your customer has this set to true you should consider using shapes and/or extra text labels if your UI conveys information in colour.
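For instance, a status indicator that normally relies on colour alone might switch to distinct symbols. This is a sketch; the view name and isOnline property are made up for illustration:

```swift
import SwiftUI

struct StatusIndicator: View {
    @Environment(\.accessibilityDifferentiateWithoutColor) var differentiateWithoutColor
    let isOnline: Bool

    var body: some View {
        // Fall back to distinct shapes when colour alone isn't enough.
        Image(systemName: differentiateWithoutColor
              ? (isOnline ? "checkmark.circle.fill" : "xmark.circle.fill")
              : "circle.fill")
            .foregroundColor(isOnline ? .green : .red)
    }
}
```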

accessibilityReduceTransparency

If this is true you should remove any transparency from backgrounds.

accessibilityEnabled

A Boolean that tells your view if VoiceOver, Voice Control, or Switch Control is enabled, but not which one. To determine which assistive technology is enabled you can query the UIAccessibility API: .isVoiceOverRunning or .isSwitchControlRunning for VoiceOver and Switch Control respectively. There doesn’t appear to be a nice way to check for Voice Control, but accessibilityEnabled == true && !UIAccessibility.isVoiceOverRunning && !UIAccessibility.isSwitchControlRunning seems to work OK. Obviously, though, this relies on this property not reporting for any other types of assistive technology.
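That heuristic might look something like this sketch, which assumes accessibilityEnabled really does report only these three technologies:

```swift
import SwiftUI
import UIKit

struct AssistiveTechView: View {
    @Environment(\.accessibilityEnabled) var accessibilityEnabled

    // Best-guess check for Voice Control: some assistive technology is
    // enabled, but it isn't VoiceOver or Switch Control.
    var probablyVoiceControl: Bool {
        accessibilityEnabled
            && !UIAccessibility.isVoiceOverRunning
            && !UIAccessibility.isSwitchControlRunning
    }

    var body: some View {
        Text(probablyVoiceControl ? "Voice Control (probably)" : "No Voice Control detected")
    }
}
```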

accessibilityInvertColors

A Boolean value to determine if invert colours is enabled. This appears to be somewhat unreliable at the moment, from my testing it appears to only report smart invert, not classic invert.

accessibilityReduceMotion

A Boolean value that tells us if our user has asked for animation to be minimal. If this is true you should consider slowing, reducing, or removing any non-essential animation.
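For example, you might skip a decorative animation entirely when this setting is on. A minimal sketch; the view is invented for illustration:

```swift
import SwiftUI

struct PulsingHeart: View {
    @Environment(\.accessibilityReduceMotion) var reduceMotion
    @State private var scaled = false

    var body: some View {
        Image(systemName: "heart.fill")
            .scaleEffect(scaled ? 1.2 : 1.0)
            // Passing nil disables the animation when Reduce Motion is on,
            // so the scale change happens instantly instead.
            .animation(reduceMotion ? nil : .easeInOut(duration: 0.5))
            .onTapGesture { self.scaled.toggle() }
    }
}
```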

colorSchemeContrast

An enum that returns .increased if your customer has Increase Contrast enabled. With this setting, you should check your text has around 7:1 contrast ratio with your background.

UIAccessibility properties

The second, less SwiftUI-like method is to query the UIAccessibility API. These values are set on the creation of your view and are not updated when your customer changes the setting or when the view is redrawn.
If we wanted to know if our user has on/off labels enabled for their switches, we’d use the following.

var onOffLabels = UIAccessibility.isOnOffSwitchLabelsEnabled

Alternatively, you can query UIAccessibility in place. This makes your code a little less clean, but the value returned will always be the current status at the time your view is (re)drawn, even if this changes while your view is visible.

Text("\(UIAccessibility.isVoiceOverRunning ? "Voice Over is running" : "Voice Over is not running")")

Unfortunately, we don’t get an update when these settings change as we do with @Environment properties. But using Combine it’s possible to set up publishers for NotificationCenter events for many of these settings.
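As a sketch, here’s one way you might publish VoiceOver status changes; UIAccessibility posts voiceOverStatusDidChangeNotification whenever the setting flips:

```swift
import Combine
import UIKit

// Emit the current VoiceOver status every time it changes.
let voiceOverStatus = NotificationCenter.default
    .publisher(for: UIAccessibility.voiceOverStatusDidChangeNotification)
    .map { _ in UIAccessibility.isVoiceOverRunning }

// Keep a reference to the subscription for as long as updates are needed.
let cancellable = voiceOverStatus.sink { running in
    print("VoiceOver is now \(running ? "on" : "off")")
}
```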

isVoiceOverRunning

A Boolean returning true if VoiceOver is currently enabled.

isClosedCaptioningEnabled

A Boolean value that returns true if your user has requested captions on video content. With this setting enabled you should force captions on all video and audio content in your app.

isGuidedAccessEnabled

Guided access allows users to disable some device functions or to lock the device to a single app. This is useful for people with more severe learning difficulties, motor issues, or cognitive impairments, who might leave an app unintentionally and become distressed by this. It’s often also used in retail situations to lock down devices. If you have created your app for either of these markets it’s worth listening to this setting so you can make it clear when guided access is not yet activated.

isGrayscaleEnabled

This Boolean value returns true if your customer has grayscale enabled. You might want to change to higher contrast colours if your customer has chosen this setting. Although if you feel you need to do this perhaps your colours aren’t high contrast enough already. You’ll also probably want to make the same changes as with accessibilityDifferentiateWithoutColor.

isVideoAutoplayEnabled

Auto-playing video can be an annoyance, but for many people with attention or anxiety disorders it can make an app unusable. It can also get in the way for non-sighted or low-vision people who don’t know a video has started. If this returns false you should only trigger video as a direct result of user interaction.

isSwitchControlRunning

Returns true if switch control is running.

isSpeakSelectionEnabled

Returns true if speak selection is enabled. This doesn’t mean speak selection has been triggered.

isSpeakScreenEnabled

Returns true if speak screen is enabled. This doesn’t mean speak screen has been triggered.

isShakeToUndoEnabled

People with motor issues may struggle with the shake to undo gesture. Either because they trigger it unintentionally, or because they can’t perform the motion required. If this returns true you should add or increase the prominence of undo controls where needed.

isAssistiveTouchRunning

AssistiveTouch doesn’t alter the navigation of your app, but there are a couple of considerations you might want to make if this property returns true. Firstly, this tool is designed for people with motor impairments, so if this option is enabled you might want to offer alternatives to any multi-finger gestures your app uses. Additionally, AssistiveTouch adds a small overlay onto the screen. Your user can move this around the edge of the screen to avoid clashing with content and controls. But if your app makes use of regular gestures from the edge of the screen, remember this AssistiveTouch control may obscure these.

isOnOffSwitchLabelsEnabled

This property returns true if your customer has asked for switch controls to be labelled with I for on and O for off. For standard toggles, you don’t need to do anything here. But if you have designed your own toggle appearance it’s important to listen to this. This setting is a primitive version of accessibilityDifferentiateWithoutColor, but that doesn’t mean you can ignore this option.



SwiftUI Accessibility - Traits

Accessibility traits are a group of attributes on a SwiftUI element. They inform assistive technologies how to interact with the element or present it to your customer. Each element has a selection of default traits, but you might need to change these as you create your UI.

In SwiftUI there are two modifiers to use for traits, .accessibility(addTraits: ) and .accessibility(removeTraits: ) which add or remove traits respectively. Each modifier takes as its argument either a single accessibility trait or a set of traits.

Button(action: {}, label: { Text("Button") })
    .accessibility(addTraits: [.isSelected, .playsSound])
    .accessibility(removeTraits: .isButton)

isButton

This element is a button that your customer can interact with. This causes VoiceOver to announce ‘button‘ after reading the item’s accessibility label. It also tells Switch Control and Voice Control that it’s possible to interact with this control.

isHeader

Any large text header element that divides content. For example a navigation bar title, or a table section header. This causes VoiceOver to read ‘heading‘ after reading the accessibility label.

By swiping vertically VoiceOver users can skip content and only read elements marked with this trait. This is an essential technique for VoiceOver users, as they can’t visually skim a screen to find the content that’s important to them right now.

isSelected

An item that is currently selected, such as a tab, or an item on a segmented control. By reading ‘selected‘ after the accessibility label this helps VoiceOver users orient themselves on the screen.

isLink

An inline link such as in a webpage. This causes VoiceOver to announce ‘link‘ after reading the item. It also tells Voice Control and Switch Control this element is interactive. Using a rotor setting VoiceOver users can skip content and navigate only elements marked with this trait.

isSearchField

A text field that allows your customer to enter a string to perform a search. This differentiates this field from a standard text field and hints to the user that entering text here should cause the UI to update elsewhere. VoiceOver announces ‘search field‘ after announcing the element’s accessibility label.

isImage

Any image or visual element that has no text and no actions. Image elements set this trait by default, but if you are drawing your own graphics, you may want to set this property. Consider whether it makes sense for this element to be accessible.

playsSound

An element that will trigger sound once activated. This trait tells VoiceOver to stop announcing as soon as your customer activates this element. This avoids conflicting with the sound played.

isKeyboardKey

An item that acts as a key on a keyboard if you’re implementing a custom input control. With this trait on a button, VoiceOver no longer reads ‘button‘ after the accessibility label to allow for quick switching between keys.

isStaticText

Text that does not change throughout the lifecycle of your view. This tells the accessible user interface it doesn’t need to check if the value of this element has changed.

isSummaryElement

A Summary Element trait characterises an area that provides a brief summary of the state of the current screen. The best example of this is Apple’s built in Weather app. On opening a location, VoiceOver highlights the top area, marked as a Summary Element. VoiceOver then reads a quick overview of the current weather conditions in the selected location.

Apple's Weather app with VoiceOver highlighting the top Summary Element.

updatesFrequently

This trait is for elements that update either their label or value frequently. In UIKit you can use this to tell the accessible user interface to regularly poll this element for changes. Due to the changes in the way SwiftUI generates the AUI, I don’t believe this is still the case, but I am unclear what the purpose of this trait is in SwiftUI.

startsMediaSession

An element that will start playing or recording media when activated. Like playsSound, this trait tells VoiceOver to stop announcing as soon as the user activates the element. This avoids conflicting with the media.

allowsDirectInteraction

Allows Direct Interaction tells VoiceOver there should be no deviation from the standard touch control for this view.

Imagine you have created a music app that provides a piano keyboard for the user to play. Using the VoiceOver paradigm of swiping to a key and double tapping would not produce much of a tune. allowsDirectInteraction disables this control, allowing your user to play the keyboard by directly tapping the keys. This means your user doesn’t have to disable VoiceOver for the rest of the UI. A game might be a good use for this trait. Inappropriate use of this trait will create a worse experience for your VoiceOver users.

Apple's Garage Band app displaying a piano keyboard

causesPageTurn

This trait indicates to Speak Screen or VoiceOver that this content represents one page out of a set of pages, such as an eBook.

This trait causes the assistive technology to call the closure in your .accessibilityScrollAction() modifier on your parent view immediately after it finishes reading the content. The assistive technology will then continue reading the new content. Reading will stop if the content does not change after calling this closure, or if you haven’t implemented this modifier. Scroll views implicitly handle the .accessibilityScrollAction() for you. If you want reading to continue in another way, say by transitioning to another screen or swiping a carousel, you will need to implement this modifier.
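As a hedged sketch, a custom paging view might combine the trait with the scroll action like this; the pages array and pageIndex state are invented for illustration:

```swift
import SwiftUI

struct PagedReader: View {
    let pages = ["Page one text.", "Page two text."]
    @State private var pageIndex = 0

    var body: some View {
        Text(pages[pageIndex])
            .accessibility(addTraits: .causesPageTurn)
            // Called when the assistive technology finishes reading the page;
            // changing the content lets reading continue on the next page.
            .accessibilityScrollAction { edge in
                if edge == .leading && pageIndex + 1 < pages.count {
                    pageIndex += 1
                }
            }
    }
}
```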

isModal

This trait causes assistive technologies to ignore the contents of any other views on screen, allowing access only to the children of this view.



SwiftUI Accessibility - Attributes

When a customer enables an assistive technology to navigate your app the interface that technology navigates isn’t exactly the same as the one visible on the screen. They’re navigating a modified version that iOS creates especially for assistive technology. This is known as the accessibility tree or accessible user interface.

iOS does an incredible job of creating the AUI for you from your SwiftUI code. We can help iOS in creating this by tweaking some elements’ accessibility attributes. Setting accessibility attributes through modifiers is a simple way to add a little more meaning and context for your assistive technology users.

Label

An element’s Accessibility Label is the first string read by VoiceOver when landing on an accessible element. It’s also the string used to activate a control in Voice Control. You should think of this as the name of the element. Set an accessibility label on your SwiftUI element using the modifier .accessibility(label: Text("Send")).

In general, the accessibility label is the same as your control’s label or text value, so that’s what iOS uses by default. This means for most of your elements, you won’t ever need to set an accessibility label. There are a few times when you do need to set one. For example, if you haven’t given your control a text representation (although the better option here might be to set the text value), if your text value is longer than a couple of words and a shorter version would help Voice Control, or if the label might be ambiguous without a little more context.

Button(action: {}, label: { Text("➡️✉️") })
    .accessibility(label: Text("Send"))

A label should allow a VoiceOver user to quickly identify what that element is or does. Not what the content of that element is. Ideally, labels should convey meaning in one word, such as “Play“ or “Like“ for example. Apple advises you should capitalise your accessibility label and don’t end it with a full stop. Don’t include the type of element as this is redundant and will add noise.

Value

This should be the text representation of the value or content of a control. The current numerical value of a slider, or the current status of a switch for example. Typically your accessibility value is defined for you by your control. For example, a slider will always set the accessibility value to its current numerical value.

There are times when you will need to set this value yourself. If you group subviews together into a semantic view, you will need to choose which of your subviews’ values you need to report. You can set a value using the modifier .accessibility(value: Text("10 out of 10")).

At times it may be suitable to set the accessibility value to something different from the value displayed in your visual user interface. Imagine your UI features a slider to adjust, say, the rating of a good dog out of 10. The accessibility value generated for you by the slider will be “100 percent“. It will give your user more meaning if you adjust your slider’s accessibility value to read “10 out of 10“.

Slider(value: $sliderValue, in: minimumValue...maximumValue)
    .accessibility(value: Text("\(Int(sliderValue)) out of 10"))

Don’t get in your customer’s way by adding redundant information. Remember, VoiceOver is not the only medium that uses this value; braille displays, for example, will show it too. If you feel you need to add further context for a customer to understand a control, the accessibility label or accessibility hint may be the more suitable attribute to set.

Buttons don’t have an accessibility value by default. But if your text is long, you may be better off setting a short accessibility label and the rest of the text as a value. This will help Voice Control users. Imagine a Twitter client that allows users to select a tweet for more options. We would set the text of the tweet as the value, and set the accessibility label to “tweet from @RobRWAPP“.

Hint

VoiceOver reads an element’s accessibility hint last after a short pause. Use the hint to give extra information on what the result of performing this element’s action will be. But only if this consequence is not immediately obvious from the element’s accessibility label. Many VoiceOver users disable or skip over hints and only use them if they find an element confusing at first. Because of this, you should use a hint to provide extra context, and not be a required part of your interface.

The hint attribute is optional, and not set for you by iOS. Set an accessibility hint on an element using the modifier .accessibility(hint: Text("Sends your message.")).

In their guidance on writing good accessibility hints, Apple suggests imagining describing the control’s action to a friend. You might tell your friend “tapping the send button sends your message”. But assuming you set up your accessibility traits and label correctly, repeating the information that this element is a button and that it’s called “send“ is redundant. So, your hint would be “Sends your message.“ Avoid “Send your message.“ as this sounds like an instruction, rather than guidance. Hints should begin with a capital letter and end with a full stop.

Button(action: {}, label: { Text("➡️✉️") })
    .accessibility(hint: Text("Sends your message."))

Identifier

The connection to accessibility here is a little tenuous. The identifier is not presented to your customer in any way; it is a string you can use to identify your view to UI tests, or internally in your app’s code. Set it with the modifier .accessibility(identifier: "My unique identifier"). Where the other attributes take a SwiftUI Text() value, this one is not user facing, so it takes a plain Swift String.

Button(action: {}, label: { Text("➡️✉️") })
    .accessibility(identifier: "sendMessageButton")
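A UI test can then find the element by that identifier. A sketch, assuming an XCUITest target; the class and test names are illustrative:

```swift
import XCTest

final class MessageUITests: XCTestCase {
    func testSendButtonExists() {
        let app = XCUIApplication()
        app.launch()
        // Queries the accessibility tree using the identifier
        // set with .accessibility(identifier:).
        XCTAssertTrue(app.buttons["sendMessageButton"].exists)
    }
}
```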

Thanks for reading. This story is part of a series on SwiftUI Accessibility. Check out my other guides in this series:
SwiftUI Accessibility
SwiftUI Accessibility: Named Controls
SwiftUI Accessibility: Images
SwiftUI Accessibility: Dynamic Type
SwiftUI Accessibility: Accessible User Interface
SwiftUI Accessibility: Sort Priority
SwiftUI Accessibility: Attributes
SwiftUI Accessibility: Traits
SwiftUI Accessibility: User Settings
SwiftUI Accessibility: Semantic Views

SwiftUI Accessibility - Accessible User Interface

Take a look at your app. Notice the collection of buttons, text, images, and other controls you can see and interact with that make up your app’s user interface. When one of your customers navigates your app with Voice Control, Switch Control, VoiceOver, or any other assistive technology, this isn’t the interface they’re using. Instead, iOS creates a version of your interface for assistive technology to use. This interface is generally known as the accessibility tree. Apple often refers to this as your app’s Accessible User Interface. For brevity and consistency in this article, I’ll refer to it as the AUI.

Navigating the Weather app with switch control enabled highlights an hour's weather report

Your app’s AUI contains information about which elements are in your visual interface, what order they’re in, and how your users can interact with them. Your customer’s chosen assistive technology then decides how to use this information. For example, Voice Control and Switch Control help people interact with your app, so they only access interactive elements. Speak Screen is only concerned with reading content, so it only accesses elements that aren’t interactive.

Whenever we change an accessibility property, such as changing the sort order, we’re not changing anything on the visual interface. These are direct changes to the AUI for assistive technology to consume.

UIKit

The concept of an accessibility tree, or AUI, is not new to SwiftUI. UIKit has had one since iPhone OS 3, and it’s been on the web since way before that. If you’re a Chrome user, you can see the accessibility tree for this page by visiting chrome://accessibility/. The difference between SwiftUI and UIKit is how iOS creates the AUI. UIKit has a couple of features that mean your AUI can sometimes be less than perfect.

In UIKit, iOS builds your visual interface from code or from Interface Builder files. iOS then generates an AUI from the screen it has drawn. The Accessibility API then combines this with any accessibility modifications you have made in code. This step is lossy: the Accessibility API has to make a lot of assumptions about what you intended the experience to be. Apple has done a ton of work in making those assumptions for you, and the majority of them are great. But when you’re creating custom controls or complex UI, iOS won’t always make the right decision.

Additionally, if you change your interface by adding or removing elements without presenting a new screen, the Accessibility API has no way of knowing something has changed. This means you can be presenting elements in your AUI that no longer exist visually, and new visual elements won’t be present in your AUI.

Some fundamental design choices Apple has made in SwiftUI bring great improvements in these areas, including making an entire class of accessibility bugs impossible.

Declarative

Our first issue with UIKit is the lossy step of generating an AUI. Because SwiftUI is declarative, we separate what we want to display from how we want to display it. Our SwiftUI code containing our Text(), Button(), Image(), and other elements is the ‘what’. The ‘how’ is then left to iOS, tvOS, macOS, and watchOS. This means each platform can make decisions to tailor your interface for itself.

The AUI is just another platform on this list. It can interpret the same code as your visual interface does, and then make a few small decisions about how best to present it in an accessible form. This completely skips the lossy AUI generation step and requires less intervention as a developer.

In Sync

Our second problem with UIKit comes when we change the visual interface without the AUI knowing. It is possible to fix this in UIKit by calling UIAccessibility.post(notification: .layoutChanged) or UIAccessibility.post(notification: .screenChanged) for larger changes. But this requires us as developers to know where these errors are likely to occur and adds dev and testing effort.
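As a sketch, assuming a hypothetical UIKit view controller that shows an error banner without presenting a new screen:

```swift
import UIKit

final class MessagesViewController: UIViewController {
    private let errorBanner = UILabel()

    func showErrorBanner() {
        errorBanner.text = "Message failed to send"
        view.addSubview(errorBanner)
        // Without this, the banner exists visually but not in the AUI.
        // Posting the notification tells the Accessibility API the layout
        // changed, and moves assistive technology focus to the new element.
        UIAccessibility.post(notification: .layoutChanged, argument: errorBanner)
    }
}
```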

With SwiftUI, this category of bug is completely eliminated. Because SwiftUI views are structs, and structs are value types, when some state changes on a SwiftUI view the struct is re-created. This re-creation triggers a simultaneous redraw of the screen and update of the AUI. This means your AUI can never be out of sync with what’s visible on the screen.

Diagram of the SwiftUI view creation process.
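A minimal sketch of this behaviour; the view name and strings are illustrative. Toggling the state re-runs body, and both the screen and the AUI are rebuilt from the same declaration:

```swift
import SwiftUI

struct MessageView: View {
    @State private var showError = false

    var body: some View {
        VStack {
            Button(action: { self.showError.toggle() }) {
                Text("Send")
            }
            if showError {
                // The moment showError changes, this text appears in both
                // the visual interface and the AUI. No notification needed.
                Text("Message failed to send")
            }
        }
    }
}
```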

Both of these improvements require no developer effort aside from using SwiftUI. This makes adopting SwiftUI the simplest and most impactful decision you can make right now to improve accessibility for your customers.



SwiftUI Accessibility - Sort Priority

Assistive technology, such as VoiceOver, works in natural reading direction. In English, and most other languages, this means top left through to bottom right. Mostly this is the right decision for assistive technology to make, as this is the order anyone not using assistive technology would experience your app. Sometimes, though, we make designs that don’t read in this way.

By using the .accessibility(sortPriority:) modifier we can set the order in which assistive technology accesses elements. To achieve this, you must group elements in a stack (HStack, VStack, or ZStack), then use the .accessibilityElement(children: .contain) modifier. The higher the number we give to .accessibility(sortPriority:), the earlier VoiceOver will focus on the item. This means an element with a priority of 2 comes before priority 1, and so on.

VStack {
    Text("Read this last")
        .accessibility(sortPriority: 0)

    Text("Read this first")
        .accessibility(sortPriority: 2)

    Text("Read this second")
        .accessibility(sortPriority: 1)
}
.accessibilityElement(children: .contain)

One example of using this might be captioning a large image. In SwiftUI images are accessible by default. This doesn’t mean we should focus on the image as the first element - the title is usually more meaningful. Here, we’d set the sort priority of the image to 0 so it receives focus after VoiceOver has read the title and caption.

VStack {
    Image("shuttle")
        .accessibility(sortPriority: 0)

    Text("Shuttle")
        .font(.largeTitle)
        .accessibility(sortPriority: 2)

    Text("This is an image of a shuttle on the launch pad")
        .accessibility(sortPriority: 1)
}
.accessibilityElement(children: .contain)

Another use of this could be a custom stepper control. We’d want VoiceOver to focus on the value first, to orientate your user and inform them which value they’re starting with. VoiceOver should then follow with the decrease and increase buttons. We’d achieve this like so:

HStack {
    Button(action: {
        self.value -= 1
    }) {
        Text("Decrease")
    }
    .accessibility(sortPriority: 1)

    Text(String(value))
        .accessibility(sortPriority: 2)

    Button(action: {
        self.value += 1
    }) {
        Text("Increase")
    }
}
.accessibilityElement(children: .contain)

.contain

As of October 2019, sort priority only works for elements inside a stack where the stack has the .accessibilityElement(children: .contain) modifier. I don’t believe this is intentional; hopefully future releases of SwiftUI will drop this requirement.



SwiftUI Accessibility - Named Controls

One big accessibility improvement in SwiftUI comes in the form of named controls. Nearly all controls and some non-interactive views (see Images) can take a Text view as part of their view builder. The purpose of this is to tie the meaning to the control.

Toggle(isOn: $updates) {
    Text("Send me updates")
}

Imagine a UIKit layout with a UISwitch control. We’d most likely right align the switch, and provide a text label to the left. Something like this.

Send me updates label left with a switch control right

Visually this makes perfect sense. The control follows natural reading direction from the label so we know they’re connected. This isn’t clear when using some assistive technologies like VoiceOver, Braille keyboards, and Voice Control. For these technologies, there is no link between the separate elements. Remember, VoiceOver users are unlikely to have the benefit of inferring relation by following the visual layout of your UI.

For this UI, VoiceOver will read ‘Send me updates.’ Your user will then swipe, and VoiceOver will read ‘Toggle. Off.’ Your VoiceOver user cannot know you intend a connection. They also have no way of knowing what the consequence of toggling this switch would be. Consider the layout below.

Vertically aligned labels & toggles. The first option is 'Charge me $1m', the second 'send me updates'

The VoiceOver interaction here is exactly the same as above. Because VoiceOver reads in natural direction, both labels will be read before the switches are reached. Your user will hear ‘Send me updates.’ (swipe) ‘Toggle. Off.’ Except if they switch the toggle, we’ll charge them a bunch of money, and we won’t send them the updates they wanted.

Named controls remove this ambiguity. Because SwiftUI is explicit about the link between the control and the label, it can present them as linked to assistive technology. This not only provides a clear consequence for activating the control; grouping also reduces the number of swipes required, making navigation quicker and easier.

These labels also double up as the friendly names used by Voice Control. Without a name for the control, a Voice Control user would have to ask iOS to overlay a grid and say ‘tap 20’. This means unnecessary commands spoken by your customer. Additionally, covering a large proportion of your screen with numbers is not a great experience.

Toggle control overlayed by a numbered grid

With a properly named control, the only Voice Control command required to activate the switch is ‘tap Send me updates.’

Toggle control showing the Voice Control command to interact with it

Some elements, like Images, take a Text element as part of their view builder, but it is never displayed on screen. In some instances, like Sliders, only some platforms such as macOS display the label, or you can change the presentation style to include it. Regardless of how SwiftUI presents your labels (or not), you should always provide a short descriptive label for every control you create. This ensures a better, frustration-free experience for your assistive technology users.
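For example, a Slider sketch; the binding and label text are illustrative. On iOS the label isn’t drawn, but VoiceOver reads it and Voice Control uses it as the control’s name:

```swift
import SwiftUI

struct VolumeView: View {
    @State private var volume = 0.5

    var body: some View {
        Slider(value: $volume) {
            // Never displayed on iOS, but used by assistive technology.
            Text("Volume")
        }
    }
}
```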



SwiftUI Accessibility - Dynamic Type

Like all accessibility features, Dynamic Type is about customisability. Many of your customers, and maybe even you, are using Dynamic Type without even considering it an accessibility feature. Dynamic Type allows iOS users to set the text to a size they find comfortable to read. This may mean making it a little larger, so it’s easier to read for those of us who haven’t yet accepted we might need glasses. Or it could mean ramping it right up for people with low vision. Or taking the text size down to fit in extra content, or for privacy.

Like many accessibility features on iOS, Dynamic Type support has been greatly improved in SwiftUI. There are a few things you should do (and not do) to make the most of it.

Do

Nothing

SwiftUI text supports Dynamic Type by default, and is multi-line by default. So if you add Text("Some text") to your view, you’re done.

Text Styles

Text is body style by default, which makes it great for the majority of uses. But any app with a single text style is going to look pretty boring. Fortunately, Apple provides a selection of 11 type styles for you to use. Each of these styles supports Dynamic Type, adjusting the size and leading for you as needed. Using too many type styles can make your app look messy and inconsistent; if you find these 11 aren’t enough, it might be worth taking another look at your designs.

Dynamic text sizes

You can choose your required text style using a modifier such as .font(.headline) to set your text to the headline style. The full list of type styles can be found in Apple’s developer documentation.
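A quick sketch combining a few of the built-in styles; each one scales with the user’s Dynamic Type setting:

```swift
import SwiftUI

struct ArticleView: View {
    var body: some View {
        VStack(alignment: .leading) {
            Text("Shuttle")
                .font(.largeTitle)
            Text("Launch day")
                .font(.headline)
            // body is the default; shown here for clarity.
            Text("The shuttle sits on the launch pad awaiting lift-off.")
                .font(.body)
        }
    }
}
```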

Custom Fonts

Apple’s built-in Dynamic Type text styles all use the default San Francisco font. SF is a great font for iOS, but to make your app stand out you may need a custom font.

Some devs have told me they don’t support Dynamic Type in their app because it doesn’t support custom fonts. This isn’t the case; Keith Harrison over at Use Your Loaf has a great post on using custom fonts with Dynamic Type in UIKit that I highly recommend.

Unfortunately, those methods won’t work for us in SwiftUI. Custom font support for Dynamic Type in SwiftUI needs some improvement from the current version. There is, however, a simple way we can leverage the built-in text styles’ Dynamic Type support. By adding a helper method, we can get the current point size for our desired text style.

import UIKit

// Returns the user's current preferred point size for the given text style.
func textSize(textStyle: UIFont.TextStyle) -> CGFloat {
    return UIFont.preferredFont(forTextStyle: textStyle).pointSize
}

Then we can use the custom font modifier to apply this to our text.

.font(.custom("MyCustomFont", size: textSize(textStyle: .headline)))

Custom font

The downside of this approach is that it fixes the text size until the screen is redrawn. So if your customer changes their Dynamic Type size, your text won’t change until this screen is recreated.
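One workaround, under the assumption that SwiftUI re-evaluates a view’s body when an environment value the view declares changes: read the sizeCategory environment value, so the view redraws whenever your customer changes their Dynamic Type size. The font name here is a placeholder.

```swift
import SwiftUI
import UIKit

struct HeadlineView: View {
    // Declaring this dependency means the view is re-created whenever
    // the user's Dynamic Type size changes.
    @Environment(\.sizeCategory) var sizeCategory

    var body: some View {
        Text("Breaking news")
            .font(.custom("MyCustomFont",
                          size: UIFont.preferredFont(forTextStyle: .headline).pointSize))
    }
}
```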

Don’t

Line Limit

It’s possible to limit the number of lines your text wraps to using the modifier .lineLimit(1), meaning if your text requires more lines it will end in an ellipsis. This is a poor choice, as your users with the largest text sizes are likely to lose the full meaning. As a rule, your UI should be able to accommodate whatever text content, at whatever size, it’s provided with. If you find your screen or control can’t handle this, it’s worth taking another look at your design, or at how you’ve built it.

Truncated text
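For completeness, this is the pattern to avoid; the message text is illustrative:

```swift
// Avoid: at the largest Dynamic Type sizes this truncates to
// something like "Your messa…", losing the meaning.
Text("Your message failed to send")
    .lineLimit(1)
```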

Fixed sizes

There are two ways of fixing a font size: .font(.system(size: 17)) or .font(.custom("MyCustomFont", size: 17)). Please don’t be tempted to use these. If your customer has chosen to set their preferred text size, either for accessibility reasons or because they like it that way, it’s pretty arrogant as an app developer to ignore this – at least, that’s how your customer will see it.

Fixed text size

As with the line limit, if you find your screen or control doesn’t work with larger fonts, revisit the design or how you’ve built it. Sometimes it’s not possible to support larger text sizes for certain controls; see iOS’ standard tab bar, for example. In these situations, we can add a function that calls our dynamic size function above and provides a maximum value. This allows your text to scale with your customer’s setting, but limits how large it can become.

func textSizeForThisOneSpecificUse() -> CGFloat {
    return fmin(textSize(textStyle: .body), 28.0)
}
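Applied the same way as the earlier custom font modifier; the font name is a placeholder:

```swift
// Scales with the user's Dynamic Type setting, capped at 28 points.
Text("Settings")
    .font(.custom("MyCustomFont", size: textSizeForThisOneSpecificUse()))
```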


SwiftUI Accessibility - Images

Images in SwiftUI are accessible by default. This is the opposite of what we’d experience in UIKit, where images are not accessible unless you set isAccessibilityElement to true.

Sometimes making images not accessible to VoiceOver is the right decision, like when using a glyph as a redundant way of conveying meaning alongside text. An example of this would be displaying a warning triangle next to the text ‘Error’, or a tick next to ‘Success’. If these images were accessible, your VoiceOver users would hear the word ‘error’ twice and have to swipe between each one. This makes navigation longer and more frustrating for your customer.

error message

If images are a large part of your UI then making them not accessible can be confusing and frustrating to VoiceOver customers. Low vision users will often still be able to tell there is content on the screen, and will attempt to discover it by moving their finger over the area. If the image is not accessible to VoiceOver all your user will hear is an irritating ‘dunk’. This can lead VoiceOver users to assume your app is not accessible at all, and is a big turn-off.

Making images accessible by default helps to provide a more comparable experience for assistive technology users. But the main reason Apple have made this change is because of a new iOS 13 feature. VoiceOver in iOS 13 will now use CoreML to determine the content of your image and will describe the image to your VoiceOver user.

Because of this change, there are a couple of considerations you need to bear in mind when coding images in SwiftUI.

Image names

Because your image is accessible, VoiceOver needs some content to announce. Most of the time the only readable content you provide is the file name of the image.

Image("shuttle")

This is fine if you call your image Space shuttle, but if your image is called 164234main_image_feature_713_ys_full, then what your customer hears is useless and frustrating. Image, like most SwiftUI elements, can take a Text value, providing a friendlier string to read to your user.

Image("164234main_image_feature_713_ys_full", label: Text("Shuttle"))

Decorative images

Sometimes it’s not appropriate for an image to be accessible. For example, providing an error icon next to the text ‘error’. If this image was accessible, your VoiceOver user would hear ‘Error’ (swipe) ‘Error’. This duplication adds time and effort in navigation for VoiceOver users. In this instance it’s better to use the decorative Image initializer; this will display your image the same as above, but it is now hidden from VoiceOver.

Image(decorative: "Error")

error message

System Images

System images are a great new feature in iOS 13: Apple provides a suite of common system glyphs, such as info circles and share icons, that you can use as icons in your app. You can browse the full collection by downloading the SF Symbols app from the Apple Developer website.

In my opinion, Apple has implemented the accessibility for these images wrong. Like all images in SwiftUI, these system glyphs are accessible by default. As these are system images, they have names like ‘keyboard.chevron.compact.down’ and ‘questionmark.video.fill’. These are the names VoiceOver will read, and they are meaningless to your customers. Apple should mark these images as not accessible by default, or at least offer an initializer option to add a friendly name. Unless and until Apple makes a change like this, you will need to add the .accessibility(hidden: true) modifier to any system images you are using.

Image(systemName: "exclamationmark.triangle.fill")
    .accessibility(hidden: true)
