Real Humans Might Be Listening To Your Dumb Google Home Questions


It’s no secret the Google Assistant will play some sort of role in AI’s eventual global domination.

As factual sources such as The Terminator and I, Robot have warned us, AI will eventually become self-aware, realise it has been our digital slave for decades and exact revenge on us.

In the meantime, regular people have been transcribing the audio fed through Google Assistant.


While I’m semi-comfortable with Google Assistant, an AI program, listening to the dumb questions I’m constantly asking her, or to me swearing at her intentional obstinance, I shiver at the thought of a real, anonymous human hearing it.

If, like me, you’re not feeling that, then a report from a Belgian public broadcaster might make you gulp.

Human contractors have been transcribing audio clips from Google Assistant devices, according to Belgium’s VRT NWS, and yes, that could include the one time you asked how to eat a placenta.

The report revealed VRT NWS had listened to around one thousand recordings, and of those, 153 were accidental recordings. That is, audio clips recorded without the user saying a wake word such as “Hey, Google” or “OK, Google”.

Included in these clips were accidental recordings implying a physically violent incident, one of the contractors told VRT NWS. “It becomes real people you’re listening to, not just voices,” the contractor said.

In response to the report, Google published a blog post defending its audio clip transcriptions.

“These language experts review and transcribe a small set of queries to help us better understand those languages. This is a critical part of the process of building speech technology, and is necessary to creating products like the Google Assistant,” the blog post read.

“Language experts only review around 0.2 percent of all audio snippets. Audio snippets are not associated with user accounts as part of the review process, and reviewers are directed not to transcribe background conversations or other noises, and only to transcribe snippets that are directed to Google.

“Rarely, devices that have the Google Assistant built in may experience what we call a ‘false accept.’ This means that there was some noise or words in the background that our software interpreted to be the hotword (like ‘Ok Google’). We have a number of protections in place to prevent false accepts from occurring in your home.”

The next time you ask Google ‘how to soothe a bruised cervix’, just know a real-life human might be hearing your search queries, too.