April 12, 2019 05:57 PM
Updated on April 12, 2019 6:33 PM
Would you change your conversations with Alexa, Siri or Google Assistant if you knew that people might be listening to them?
Amazon, Apple and Google employ people who listen to recordings made by their customers' smart speakers and voice assistants, according to a recent Bloomberg report.
The three companies say on their websites that the recordings are "occasionally" reviewed by members of their teams in order to improve their systems.
Bloomberg's report drew on testimony from several Amazon employees who "reviewed" Alexa recordings. It says it spoke to seven people who reviewed audio files from Amazon Echo devices and the Alexa assistant.
However, reactions to the article suggest that many customers of Amazon and other brands did not know that people were listening to them.
Amazon's "reviewers"
Amazon's recordings are linked to the account number, the customer's first name and the serial number of the Echo device being used.
The reviewers' job is to transcribe and annotate voice clips to improve Amazon's speech recognition systems.
Reviewers cannot identify which customer a conversation belongs to, Amazon says | Photo: GETTY IMAGES
Some of those employees told Bloomberg that they shared some "funny" voice clips with each other in an internal chat room.
They also described sometimes listening in groups "to relieve stress" to clips with disturbing content, such as possible sexual assaults.
However, they said colleagues told them it was Amazon's policy not to intervene.
What does Amazon say?
The terms and conditions of Amazon's Alexa service state that voice recordings are used to "answer your questions, fulfil your requests and improve your experience with our services". Human reviewers are not explicitly mentioned.
In a statement, Amazon said it takes security and privacy seriously, and that it only added notes "to a very small sample of Alexa voice recordings."
"This information helps us train our speech recognition and natural language understanding systems, so that Alexa can better understand your requests and ensure the service works well for everyone," the company said in a statement.
"We have strict technical and operational safeguards and a zero-tolerance policy for abuse of our systems. Employees do not have direct access to information that could identify a person or account as part of this workflow."
What does Apple say about Siri?
Apple also uses human reviewers to ensure that its voice assistant, Siri, interprets requests correctly.
Siri records voice commands through the iPhone and the HomePod smart speaker.
Google Home, Amazon Echo and Apple HomePod are the best-selling smart speakers on the market | Photo: GETTY IMAGES
According to Apple's security policy, the voice recordings contain no personally identifiable information and are linked to a random identifier that is reset each time Siri is turned off.
Voice recordings kept for more than six months are stored without the random identifier.
Its human reviewers never receive personally identifiable information or the assigned identifier.
And what about Google and Assistant?
Google said its reviewers can listen to audio clips from Assistant, its voice assistant, which is built into most Android phones and its smart home speaker.
It also says the customer's voice is distorted.
Do all smart speakers record conversations?
A common fear is that smart speakers secretly record everything said in the house.
However, although they are technically always "listening", they do not usually "listen" to your conversations.
Almost all home assistants record and analyse short audio fragments locally in order to detect a wake word, such as "Alexa", "OK Google" or "Hey Siri". If no wake word is heard, the audio is discarded.
Some customers fear that smart speakers will record all conversations | Photo: AMAZON
But if a wake word is detected, the file is stored and the recording continues, so that the customer's request can be sent to the speech recognition service.
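The wake-word behaviour described above can be sketched as a toy simulation in Python (the wake phrases and buffer size here are illustrative, not any vendor's actual implementation):

```python
from collections import deque

WAKE_WORDS = {"alexa", "ok google", "hey siri"}  # illustrative wake phrases
BUFFER_SIZE = 3  # keep only the last few short audio fragments locally

def process_stream(fragments):
    """Simulate the wake-word loop: hold a short rolling buffer of audio
    fragments, discard them when no wake word appears, and only 'upload'
    the request that follows a detected wake word."""
    buffer = deque(maxlen=BUFFER_SIZE)  # old fragments fall off and are discarded
    uploaded = []       # what would be sent to the speech recognition service
    recording = False   # becomes True once a wake word is heard
    for fragment in fragments:
        if recording:
            uploaded.append(fragment)   # request audio is kept and sent on
        elif fragment.lower() in WAKE_WORDS:
            recording = True            # wake word detected: start recording
        else:
            buffer.append(fragment)     # analysed locally, then discarded
    return uploaded

# Everyday chatter never leaves the device; only the request after the
# wake word reaches the (simulated) server.
print(process_stream(["how was work", "fine", "alexa", "what's the weather"]))
# → ["what's the weather"]
```

The key point the sketch illustrates is that, absent a wake word, audio only ever lives in the small local buffer and is never transmitted.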
If the devices continuously streamed every conversation to a remote server for analysis, that traffic would be easy to detect; security researchers have found no evidence that this happens.
Can I stop the reviewers from listening to my conversations?
Amazon's Alexa privacy settings do not let you opt out of human review, but you can prevent your recordings from being used to "help develop new features". You can also listen to and delete previous recordings.
Google lets you listen to and delete recordings in the "My Activity" section. You can also turn off "Web & App Activity" and "Voice & Audio Activity" tracking, which the Google Assistant invites you to enable.
Apple, by contrast, does not let you listen back to Siri's recordings. Its privacy portal, which lets you download a copy of your personal data, says the data "is not personally identifiable" nor linked to your Apple ID.
To delete voice recordings created by Siri on an iOS device, go to the "Siri & Search" menu in Settings and disable Siri. Then go to the "Keyboard" menu (under "General") and turn off "Dictation".