The 6 Best iOS 14 Accessibility Features

Image: Getty

It’s a very rare person who lives an extremely healthy and unimpeded life before abruptly dropping dead without ever needing accessibility features, such as subtitles, magnification or an easier way to reach something.

Whether you need these things yourself, or someone in your life might find them helpful, there are a bunch of really cool accessibility features built into iPhones and iPads in iOS 14 that you should know about.

In no particular order, here are six of the best.

iOS 14 Headphone Accommodations

If you listen to most of your music and podcasts using headphones with an H1 chip (second-generation AirPods, AirPods Pro, some Beats models), you can set up Headphone Accommodations, which can help make voices and music clearer depending on which frequencies you’ve lost.

Everyone loses higher frequencies as they age, and as this progresses it becomes harder to understand what people are saying and you start missing large sections of music.

By default, Headphone Accommodations makes fairly basic changes, taking you through three listening options and asking which sounds better.

If you would like more fine-tuned accommodations, or you just want to know more about the state of your hearing, you can get an audiogram from apps like The Mimi Hearing Test and SonicCloud, and then upload the audiogram into Headphone Accommodations.

Taking regular audiograms will not only make sure your headphones are customised for your needs, but give you a better picture of your hearing health when you’re between audiologist appointments.

It will also let you know if you need to make an appointment sooner.

How to toggle it:

  • Settings > Accessibility > Audio/Visual > Headphone Accommodations

Headphone Safety

Before getting to the point where you need to customise your sound, it’s worth trying to avoid hearing loss, or at least avoid making it worse. That can be hard to do if you’re either not great at judging volume, or if you already have some degree of hearing loss.

Headphone Audio Levels in the Health app keeps a record of how loud you listen to things, so you can make sure you’re staying within healthy limits.

It has lots of information so you can get a better idea of what healthy limits are. For example, a maximum of 40 hours at 80 dB over seven days, or four hours at 90 dB over seven days.
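Those limits follow the equal-energy rule commonly used in hearing-safety guidance: starting from a baseline of 80 dB being safe for 40 hours a week, every 3 dB increase in level halves the allowable listening time. As a rough sketch of that arithmetic (the baseline and 3 dB exchange rate are standard guidance figures, not necessarily the Health app's exact formula):

```python
# Equal-energy rule for safe listening: every 3 dB increase in level
# halves the allowable exposure time over the same period.
BASELINE_DB = 80.0     # level considered safe...
BASELINE_HOURS = 40.0  # ...for this many hours over seven days

def safe_hours(level_db: float) -> float:
    """Allowable listening hours over seven days at a given headphone level."""
    return BASELINE_HOURS / 2 ** ((level_db - BASELINE_DB) / 3.0)

print(safe_hours(80.0))            # 40.0 hours
print(round(safe_hours(90.0), 1))  # 4.0 hours
```

Note that 90 dB comes out at just under four hours, which matches the figures the Health app quotes.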

If you’re worried about any spikes in sound (like if you listen to a poorly mixed podcast so you need to have that on full blast and don’t want to be sent into next week by a loud notification), you can go to Headphone Safety in Settings and turn on Reduce Loud Sounds.

The handy slider for that setting not only tells you the volume of what you’re limiting, but gives it a comparison to a real-world sound. For example, 90 dB is as loud as a motorcycle, and 100 dB is as loud as an ambulance siren.

You can also set up the headphone volume checker to appear in Control Centre so you can view it at a glance: go to Settings > Control Centre and tap the + next to ‘Hearing’. You’ll then get a little button in Control Centre that will be either green or yellow, depending on the volume.

Just tap that button for more information.

How to toggle it:

  • Health > Headphone Audio Levels
  • Settings > Sounds & Haptics > Headphone Safety

Live Listen

How to use it:

  • Connect AirPods > Open Control Centre > Tap the ear icon > Tap Live Listen > Put your iPhone or iPad near the person you want to listen to.

According to several papers, people put off getting hearing aids for an average of eight years. Some experts even believe that to be an underestimation.

Whether that statistic is relevant to you, or you just have trouble distinguishing the sound you’re supposed to be focussing on when there’s a lot of background noise, Live Listen could be your jam.

Live Listen used to be just for Made for iPhone compatible hearing aids, using the iPhone or iPad as a microphone for them. But in iOS 12, Apple expanded that functionality to all AirPods and other headphones with an H1 chip.

This is really helpful if you’re having dinner with someone in a crowded restaurant, or if you’re in a lecture hall and can’t get a seat down the front.

Using AirPods Pro with noise cancelling and having the other person speak relatively close to the phone could also work for people with sensory sensitivities who want to be part of a conversation but can’t deal with one or more of the sounds in the background.

How to toggle it:

  • Settings > Control Centre > + Hearing

New iOS 14 Features in the Translate App

Mask wearing has really made it clear to a lot of people just how much they (consciously or unconsciously) relied on lip-reading to understand people.

While it’s not a perfect replacement, the Translate app can work offline to translate conversations in real time. It can translate English, Mandarin, French, German, Spanish, Italian, Japanese, Korean, Arabic, Portuguese and Russian.

This is also handy for when movies and TV shows don’t have subtitles.

How to toggle it:

  • Open the Translate app > Set both languages to English (or your language pair) > Turn the iPhone/iPad to landscape mode.

Back Tap

Back Tap was designed for people who might have trouble nimbly reaching various shortcuts on their phone. This could be due to arthritis, neurological disorders, or a simple lack of familiarity.

But it’s really a fantastic tool for anyone who would benefit from more and better shortcuts. In the settings there are more than two dozen actions available by default, and you can also make more using the Shortcuts app.

Functions include opening apps, people detection, muting, opening Notification Centre, taking screenshots, the Magnifier, and turning VoiceOver on/off.

There’s even more Back Tap functionality for people who rely on VoiceOver. With VoiceOver on you can also add things like ‘hang up phone call’, which makes a huge difference for people who can’t see the touch screen.

How to toggle it:

  • Settings > Accessibility > Touch > Back Tap

People Detection with LiDAR

This is brand new for iOS 14.2, and it only works with devices that have the LiDAR sensor.

LiDAR uses light to measure the distance between things, and pairing that with the on-board machine learning of the iPad Pro and iPhone 12 Pro line makes it an incredibly powerful tool, not just for AR but also for blind and vision-impaired people.
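The measurement itself comes down to time of flight: a pulse of light travels out, bounces off an object, and returns, and distance is half the round-trip time multiplied by the speed of light. As a rough illustration of that principle (not Apple's implementation, which combines many such measurements with machine learning):

```python
# Time-of-flight ranging: the principle LiDAR uses to measure distance.
# A light pulse travels to an object and back; distance is half the
# round-trip time multiplied by the speed of light.
SPEED_OF_LIGHT = 299_792_458  # metres per second

def distance_metres(round_trip_seconds: float) -> float:
    """Distance to a reflecting object, from a pulse's round-trip time."""
    return SPEED_OF_LIGHT * round_trip_seconds / 2

# A person standing about 1.5 m away reflects a pulse in roughly
# 10 nanoseconds, which is why the sensor needs such precise timing.
print(round(distance_metres(10e-9), 2))  # 1.5
```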

Once you’ve set up the shortcut, with VoiceOver turned on, you can triple-click the side button on your device (or use Back Tap), and it will tell you how far away the closest person is, or whether it detects any people at all.

This is amazingly useful during times when we need to stay 1.5m away from each other, or for people with low or no vision to be able to tell if the line at the coffee shop has moved, or to see if there’s someone sitting in that seat on the train.

How to toggle it:

  • On iPad Pro, iPhone 12 Pro or iPhone 12 Pro Max > Settings > Accessibility > Accessibility Shortcuts > People Detection

This article has been updated since its original publication.