Apple Adds Gesture Control, ASL Support, and a Ton of Other Accessibility Features

Apple has long expressed a commitment to accessibility, and today it announced a plethora of upcoming features that could make a huge difference in how people with disabilities interact with technology.

A big thing to note is that these features aren’t limited to a single device or disability. There are features designed for people with mobility, vision, hearing, and cognitive disabilities. While most will be coming later this year, starting tomorrow, customers will be able to use sign language when reaching out to AppleCare and Retail Customer Care online. The feature, called SignTime, is available in American Sign Language, British Sign Language, and French Sign Language. SignTime can also be used to remotely connect with an ASL interpreter at Apple Stores, without having to book an appointment beforehand.

Perhaps the most impressive feature is motion-control gestures for the Apple Watch. Dubbed AssistiveTouch, the feature is meant to let users with limited mobility or upper-body limb differences operate the Apple Watch one-handed via clenching or pinching gestures. It uses the watch’s built-in gyroscope and accelerometer, along with the optical heart rate sensor and on-device machine learning, to interpret how muscles and tendons are moving. In a video demo, you can see a person answer incoming calls, start a workout, and access settings. But the coolest thing is that AssistiveTouch also includes a motion-based cursor that lets you navigate menus by tilting your arm.
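
Apple hasn’t said exactly how that interpretation works under the hood, but conceptually it amounts to feeding motion-sensor streams into a gesture classifier. As a rough illustration, here’s a minimal Swift sketch using CoreMotion, with crude thresholds standing in for Apple’s on-device model; the WristGestureDetector class and its cut-off values are hypothetical, not anything Apple ships.

```swift
import CoreMotion
import Foundation

// Hypothetical stand-in for Apple's on-device model: AssistiveTouch's real
// classifier is not public, so this uses crude accelerometer/gyroscope
// thresholds to tell a clench from a pinch.
final class WristGestureDetector {
    enum Gesture { case clench, pinch }

    private let motionManager = CMMotionManager()
    private let onGesture: (Gesture) -> Void

    init(onGesture: @escaping (Gesture) -> Void) {
        self.onGesture = onGesture
    }

    func start() {
        guard motionManager.isDeviceMotionAvailable else { return }
        motionManager.deviceMotionUpdateInterval = 1.0 / 50.0  // sample at 50 Hz
        motionManager.startDeviceMotionUpdates(to: .main) { [weak self] motion, _ in
            guard let self, let motion else { return }
            // Treat a strong acceleration spike with little wrist rotation as
            // a clench and a milder one as a pinch; Apple's model also draws
            // on the optical heart rate sensor, which isn't simulated here.
            let a = motion.userAcceleration
            let magnitude = sqrt(a.x * a.x + a.y * a.y + a.z * a.z)
            guard abs(motion.rotationRate.x) < 0.5 else { return }
            if magnitude > 1.5 {
                self.onGesture(.clench)
            } else if magnitude > 0.8 {
                self.onGesture(.pinch)
            }
        }
    }

    func stop() { motionManager.stopDeviceMotionUpdates() }
}
```

A production system would learn these boundaries from real sensor traces rather than hand-picked thresholds, which is presumably where the on-device machine learning comes in.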

On the hearing front, Apple is adding support for bi-directional hearing aids in its Made for iPhone hearing devices program. These hearing aids feature microphones that let people who are deaf or hard of hearing have hands-free phone and FaceTime conversations. The company is also bringing audiograms to Headphone Accommodations, a setting that lets users amplify soft sounds or adjust certain frequencies to better suit their individual hearing needs.

Users will be able to customise their settings based on their latest hearing test results, which can be imported from paper or PDF audiograms. Neurodiverse people with aural sensitivities will also get the ability to play background sounds that help mask unwanted environmental noise.
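
Apple hasn’t detailed how Headphone Accommodations translates an audiogram into a tuning, but the gist of any such mapping is boosting the frequency bands where a hearing test shows the most loss. The Swift sketch below assumes a simple “half-gain” rule (a classic starting heuristic in hearing-aid fitting); the Audiogram type, headphoneGains function, and cap value are all illustrative assumptions.

```swift
// Illustrative only: Apple hasn't documented the real mapping, so this uses a
// simple "half-gain" rule as a stand-in for Headphone Accommodations' tuning.
struct Audiogram {
    // Hearing threshold in dB HL at each tested frequency (Hz); a higher
    // value means more hearing loss at that frequency.
    let thresholds: [Int: Double]
}

func headphoneGains(for audiogram: Audiogram) -> [Int: Double] {
    // Boost each band by half the measured loss, capped so already-audible
    // sounds aren't over-amplified. Both the rule and the cap are assumptions.
    audiogram.thresholds.mapValues { loss in min(loss / 2.0, 25.0) }
}

// Example: mild-to-moderate high-frequency loss, a common audiogram shape.
let gains = headphoneGains(for: Audiogram(thresholds: [
    500: 15, 1_000: 20, 2_000: 35, 4_000: 50, 8_000: 55,
]))
for (frequency, gain) in gains.sorted(by: { $0.key < $1.key }) {
    print("\(frequency) Hz: +\(gain) dB")
}
```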

For those who are blind or have low vision, Apple is also expanding its VoiceOver feature to surface more information from images. These users will be able to parse information from a photo of a receipt as if it were a table (i.e. navigating by row, column, or header name). VoiceOver will also provide more detailed descriptions of photos, such as where a person is positioned in the frame and what other objects appear alongside them.
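
VoiceOver’s table parsing isn’t something developers can peek inside, but the pipeline it implies (recognise the text in the image, then reconstruct rows from word positions) can be sketched with Apple’s public Vision framework. In this hypothetical describeReceiptRows function, the row-bucketing tolerance of 0.015 is an arbitrary assumption.

```swift
import Vision

// A rough sketch of the pipeline VoiceOver's receipt parsing implies: OCR the
// image with Vision, then bucket words into rows by vertical position.
// Apple's actual implementation is not public.
func describeReceiptRows(in image: CGImage) throws -> [[String]] {
    let request = VNRecognizeTextRequest()
    request.recognitionLevel = .accurate
    try VNImageRequestHandler(cgImage: image).perform([request])

    var rows: [(y: CGFloat, words: [String])] = []
    for observation in request.results ?? [] {
        guard let text = observation.topCandidates(1).first?.string else { continue }
        // Vision's bounding boxes are normalised with the origin at
        // bottom-left, so words on the same printed line share roughly
        // the same vertical midpoint.
        let midY = observation.boundingBox.midY
        if let index = rows.firstIndex(where: { abs($0.y - midY) < 0.015 }) {
            rows[index].words.append(text)
        } else {
            rows.append((y: midY, words: [text]))
        }
    }
    // Sort top-to-bottom so the rows read in natural order.
    return rows.sorted { $0.y > $1.y }.map(\.words)
}
```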

Some other features slated for this year include the ability for people who are non-speaking and have limited mobility to replace physical buttons and switches with mouth sounds. Memoji will also get an update to include oxygen tubes, cochlear implants, and soft helmets for headwear. Meanwhile, iPadOS will gain support for third-party eye-tracking devices, letting users control an iPad with just their eye movements. There are even more features being added to existing Apple apps and services, which you can read about in full here.

Updates like these are crucial to helping people with disabilities better interact with technology in a world that increasingly relies on it. According to Pew Research Center, 23% of Americans with disabilities never go online, roughly three times the rate of those without disabilities. And only 25% report having broadband along with multiple devices (a smartphone, computer, and tablet).