Google defends listening to private conversations on Google Home – but what intimate moments are recorded?
Company admitted voice assistant is sometimes triggered by accident and recordings could be sent to workers for analysis
Google has defended its practice of allowing workers to listen to customers’ audio recordings from its Google Home smart speakers.
The technology giant admitted this week that “snippets” of recordings are analysed by language experts, claiming it helps improve its artificial intelligence voice recognition systems.
Google said in a blog post defending the practice that its language analysts review only around 0.2 per cent of audio recordings.
However, it also revealed some of these recordings had been leaked by a worker in the Netherlands.
This led many people across social media to question what information smart speakers like Google Home and Amazon Echo actually collect.
Which recordings does Google listen to?
Google explained in its blog that Google Assistant only sends audio to Google after a device detects the “wake” words that indicate a person is interacting with the Assistant – for example, by saying “Hey Google” or by physically triggering the Google Assistant.
It also admitted the voice assistant is sometimes triggered by accident and any recordings made as a result could also be sent for analysis.
This means private conversations and intimate moments could be captured by the smart speaker and then sent to people employed by Google without the user even realising.
The same applies to accidental interactions with Amazon’s voice assistant Alexa, which shares snippets of conversations with language analysts to improve the “customer experience”.
Could your private conversations be leaked?
Both Amazon and Google say the recordings are anonymised before being sent to analysts, though only the device’s location data is stripped out, not the actual sound of the user’s voice.
There is no evidence any Amazon Echo recordings have ever been leaked, and Google’s revelations were the first time such an incident had been publicly acknowledged.
Google said it was investigating the situation, though it is not yet clear what changes – if any – will be made.
“Our Security and Privacy Response teams have been activated on this issue, are investigating, and we will take action,” Google wrote in the blog.
“We are conducting a full review of our safeguards in this space to prevent misconduct like this from happening again.”
Can you stop smart speakers from listening to private conversations?
Smart speakers must always be listening in order to function, as they constantly monitor for their “wake” word.
This does not mean they are always recording, though an always-on microphone is a tempting target for hackers.
Both Amazon and Google say they employ the highest security standards to prevent third parties from listening in, though as Google’s recent revelations prove, this does not prevent employees from listening to and sharing recordings.
“We hold ourselves to high standards of privacy and security in product development, and hold our partners to these same standards,” Google said.
“We also provide you with tools to manage and control the data stored in your account. You can turn off storing audio data to your Google account completely, or choose to auto-delete data after every 3 months or 18 months.”
Google Home users can configure their settings on their account page, while Amazon Echo users can review their settings by visiting Alexa Privacy.