Apple Has Suspended Humans Listening To Your Siri Recordings (For Now)


It’s commonly accepted that Siri is the most garbage of the smart AI assistants. To make matters worse, Apple has been getting human contractors to listen to the questions you ask her – regardless of how private or confidential the audio is. Following a public outcry, this dubious practice will soon become an optional “feature”.


Apple, like Google and Amazon, has been sending recordings of its voice assistant to contractors around the world in order to “better recognise what you say”, according to a report by The Guardian. Apple calls the practice “grading” and says it’s meant to improve the service, though no one who uses Siri was explicitly informed about it. In a statement to The Guardian, Apple said less than one per cent of Siri’s daily interactions are used for this purpose.

Following the report, Apple has suspended its Siri grading program while it conducts a review, and says it will give Siri users the option to opt out.

“As part of a future software update, users will have the ability to choose to participate in grading,” an Apple spokesperson told The Verge.

The initial report, based on the account of a whistleblower, detailed the extent of what’s allegedly being captured by Siri – and it went well beyond innocuous questions asked of the AI assistant.

“There have been countless instances of recordings featuring private discussions between doctors and patients, business deals, seemingly criminal dealings, sexual encounters and so on,” the whistleblower told The Guardian.

“These recordings are accompanied by user data showing location, contact details, and app data.”
