Apple Voice Control Accessibility Case Study

Apple Voice Control and Custom Commands

This case study was originally created in 2020 and was edited in 2023 after noticing some band-aid fixes. It is still relevant: the main problems still occur, and the same heuristic violations still apply. Accessibility is imperative, and it makes using your iPhone simpler for people with limited mobility as well as for able-bodied people. To use your device with voice commands without the internet-dependent Siri, you can enable "Voice Control". However, if you already depend on Voice Control, customized gestures can be essential, even just for unlocking your device without Face ID. Unfortunately, many design heuristics are skimped on in the Accessibility settings. It is already quite difficult to traverse the accessibility functions, as some settings use technical terms rather than everyday language, such as "Switch Control" and "TTY", both terms most often used in the programming/tech community.


We should not assume that Apple users are all programming- and tech-savvy community members; Apple's target audiences and markets are diverse and are not specific to that community.

Accessibility Function: Voice Control

Immediately we can point out issues with the ease of accessibility in the design, as it skips over vital design heuristics. The Voice Control custom gesture editor only allowed up to 5 touches to be logged (in 2020) before it stopped logging. Tested again in 2023, it now goes up to 10 unique touches. However, adding more touches does not fix the heuristic violation: the interface never told you it maxed out at 5 touches, and it still doesn't announce to the user that it stops at 10 (in 2023). This forces the user to recall how many times they have touched the interface, which can be hard for older users or those with neurological conditions.
When speaking to friends and family with iPhones, I also found that most don't know the capabilities of Voice Control. Because general public knowledge of these settings is limited, they often go underused. There is no easy way to learn about the accessibility settings unless you find out about them through trendy TikToks or the Apple Knowledge Bank, which makes the feature rely on "recall" rather than "recognition". Voice Control can be crucial for users with cerebral palsy and/or limited motor function, such as my mom's little sister in the Big Sister Little Sister program in Canada.

Design Heuristic Violations

So let's talk about three main heuristics that have been broken, and some possible fixes!

1 Visibility of System Status:
There is no way to tell which gesture came first, which forces the user to rely on recall rather than recognition. There is also no way for the user to know whether they are creating the gesture in the right area of the screen!

Possible fixes:

  1. A touch counter.
  2. Colour-coded touches: in 2020 the recorded touches were not coloured; now they are, which makes it easier to distinguish which came first. However, after the gesture finishes playing back, everything returns to the same yellow colour, making the individual gestures hard to tell apart.
  3. A screen underlay: this would ensure the user is creating touches/gestures in the correct area of the screen for the custom gesture they want. Currently there are "hacks" posted online where people use tape, lip gloss, or lipstick to mark the areas they need. People shouldn't have to resort to this to set up a custom accessibility gesture.
  4. The ability to move the gesture playback slider.
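As a rough sketch of what "visibility of system status" could look like, here is a tiny gesture-recorder model (TypeScript, purely illustrative; the class and its names are my own invention, not Apple's API) that tells the user how many touches are recorded and announces when the limit is reached instead of failing silently:

```typescript
// Illustrative sketch only: a recorder that surfaces its own status to the
// user instead of silently ignoring touches past the limit.

type RecordedTouch = { x: number; y: number; order: number };

class GestureRecorder {
  private touches: RecordedTouch[] = [];

  constructor(private readonly maxTouches: number = 10) {}

  // Returns a status message the UI could display after each touch.
  addTouch(x: number, y: number): string {
    if (this.touches.length >= this.maxTouches) {
      return `Maximum of ${this.maxTouches} touches reached`;
    }
    this.touches.push({ x, y, order: this.touches.length + 1 });
    return `Touch ${this.touches.length} of ${this.maxTouches} recorded`;
  }

  get count(): number {
    return this.touches.length;
  }
}
```

The point is simply that the system, not the user's memory, keeps track of the count and the cap.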
2 Error Prevention:
If you make a mistake while creating your gestures, you have to start all over again to correct it.

Possible fixes:

  • An undo button!
  • A delete-gesture button!
3 Flexibility and Efficiency:

Ease of use doesn't exist here, and help and documentation are not easily accessible: you would have to already know about the Knowledge Bank to find it. I learned about it from Tom at the Apple Store, circa 2020.

The process of creating a customized accessibility gesture is also ridiculously long, and the path to reach this area is strange. A screenshot of each action is shown in sequence below; yes, screenshots, to emphasize how many darn steps there are:

[Screenshots: going through accessibility steps 1-3, steps 4-10, and steps 11-14]

Redesign Ideas

Idea Dump Sketch
Accessibility Custom Commands

A proper accessibility flow should be neither confusing nor redundant. Getting there would require user experience testing to better understand the accessibility community and their needs; it is currently obvious those needs are not being met, since even able-bodied people find the flow confusing. It would also require new workflows, less technical jargon, and better organization within the accessibility settings/folders (to reduce confusion and increase efficiency).

It is recommended to let users know their status rather than rely on recall:

  • Show the number of gestures used inside the custom gesture setup so the user can see their status, and state that the maximum is five (2020), or now 10 (2023).
  • Number the gestures themselves so they can be identified, or colour-coordinate them.
  • Allow for error correction with an "edit" button, and allow gestures to be moved around; this supports efficiency and advanced users.
  • Keep the timeline at the bottom of the custom gesture creation page so people can see the playback.
  • Allow a low-opacity view of the home screen or an app to appear behind the "custom gesture" creation page, so gestures can be placed accurately instead of guessing where they will need to go.
