Siri Whistleblower Goes Public To Protest Lack Of Consequences For Apple
Last year, an anonymous contractor revealed that Apple had contractors listening to private conversations, recorded by users' devices without their knowledge, as part of a “grading” system to improve Siri. Now, the whistleblower has gone public in a letter sent to European data protection regulators, decrying the lack of action taken against Apple for violating people’s privacy.

Thomas le Bonniec worked as a subcontractor at Apple’s Cork offices in Ireland, transcribing user requests in French and English, until ethical concerns led him to quit, reports the Guardian, which broke the story last year.

The news sparked outrage, prompting Apple to apologise in a statement. “We know that customers have been concerned by recent reports of people listening to audio Siri recordings as part of our Siri quality evaluation process—which we call grading,” the statement reads. “We heard their concerns, immediately suspended human grading of Siri requests and began a thorough review of our practices and policies.”

What that amounted to was Apple laying off the 300 contractors involved and pausing the grading system until fall 2019. Apple said that when the program resumed, customers would have to opt in to the grading process, and only in-house Apple employees would listen to any recordings. It also said that clips that appeared to have been “accidentally” collected would be deleted.

And that was basically it—a fact that does not sit well with le Bonniec.

“It is worrying that Apple (and undoubtedly not just Apple) keeps ignoring and violating fundamental rights and continues their massive collection of data,” le Bonniec writes in his letter. “I am extremely concerned that big tech companies are basically wiretapping entire populations despite European citizens being told the EU has one of the strongest data protection laws in the world. Passing a law is not good enough: it needs to be enforced upon privacy offenders.”

Le Bonniec notes in his letter that he listened to “hundreds of recordings” daily, all captured by various Apple devices including iPhones, Apple Watches, and iPads. He also points out that not only were Apple device owners unaware these recordings were being sent off for transcription, but many recordings also involved relatives, friends, children, coworkers, and anyone else in the vicinity of the device. Sensitive information, including names, addresses, searches, arguments, movies, drug use, and even sexy times, was heard by contractors even though users had no intention of interacting with Siri. (Because, you know, Siri is dumber than rocks.)

The whole thing flies in the face of Apple branding itself as the Big Tech company that cares about your privacy. And, to le Bonniec’s point, even though Apple issued an apology and revised how it grades Siri, the company still had this practice in place for years and effectively faced no consequences. No fines have been issued, nor have regulators launched any investigations. Google and Amazon, which had their own scandals around the same time involving human contractors listening to voice recordings, haven’t been punished or investigated either.

Whether regulators take le Bonniec’s protest seriously and launch an investigation remains to be seen. In the meantime, the bitter truth is that the tech behind voice assistants is fundamentally flawed with regard to privacy. Always-on microphones will inevitably record something they shouldn’t, and unfortunately, human review is still necessary to improve how these voice assistants understand us. While the chances are low that your personal conversations will be reviewed by some nameless, faceless human, it’s still a possibility, and the only way to completely prevent it is to opt out of using voice assistants entirely.