Busted: Thousands Of Amazon Employees Listening To Alexa Conversations
ZeroHedge.com
Amazon employs thousands of people to listen in on what people around the world are saying to their Alexa digital assistant, according to what is sure to be a Congressional hearing-inspiring report by Bloomberg, which cites seven people who have worked on the program.
While their job is to “help improve” NSAlexa – which powers the company’s line of Echo speakers – the team “listens to voice recordings captured in Echo owners’ homes and offices,” which are then transcribed, annotated and fed back into the software to improve Alexa’s understanding of human speech and make interactions more successful. In other words, humans are effectively helping to train Amazon’s algorithm.
In marketing materials Amazon says Alexa “lives in the cloud and is always getting smarter.” But like many software tools built to learn from experience, humans are doing some of the teaching. -Bloomberg
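To make that feedback loop concrete, here is a minimal, purely illustrative sketch in Python of how human-corrected transcripts and intent labels could be fed back into a model as supervised training data. Every name in it (AnnotatedClip, ToyIntentModel, the labels) is invented for the example; this is not Amazon’s pipeline, only the general shape of human-in-the-loop training.

```python
# Purely illustrative sketch: a toy "human-in-the-loop" feedback loop of the kind
# the Bloomberg report describes. The data structures, labels and the keyword
# model below are invented for this example and are not Amazon's actual pipeline.
from collections import Counter, defaultdict
from dataclasses import dataclass


@dataclass
class AnnotatedClip:
    audio_id: str            # opaque identifier for the stored recording
    machine_transcript: str  # what the speech recognizer thought it heard
    human_transcript: str    # what the reviewer says was actually said
    intent_label: str        # the reviewer's annotation, e.g. "play_music"


def build_training_examples(clips):
    """Turn reviewer annotations into (text, label) pairs for retraining."""
    return [(c.human_transcript.lower(), c.intent_label) for c in clips]


class ToyIntentModel:
    """Stand-in for a real NLU model: per-intent keyword counts."""

    def __init__(self):
        self.counts = defaultdict(Counter)

    def train(self, examples):
        for text, label in examples:
            self.counts[label].update(text.split())

    def predict(self, text):
        words = text.lower().split()
        scores = {label: sum(counter[w] for w in words)
                  for label, counter in self.counts.items()}
        return max(scores, key=scores.get) if scores else "unknown"


if __name__ == "__main__":
    # Reviewers correct the machine transcript ("tailor" -> "taylor") and label the intent.
    reviewed = [
        AnnotatedClip("clip-001", "play tailor swift", "play taylor swift", "play_music"),
        AnnotatedClip("clip-002", "whats the weather", "what's the weather", "get_weather"),
    ]
    model = ToyIntentModel()
    model.train(build_training_examples(reviewed))
    print(model.predict("play Taylor Swift songs"))  # -> play_music
```

The point is simply that the “getting smarter” happens because humans listened to the clips and produced the labeled examples the software learns from.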
The listening team comprises part-time contractors and full-time Amazon employees based around the world, including in India, Romania, Boston and Costa Rica.
Listeners work nine-hour shifts, with each reviewing as many as 1,000 audio clips per shift, according to two employees from Amazon’s Bucharest office, located in the top three floors of the Romanian capital’s Globalworth building. The location “stands out amid the crumbling infrastructure” of the Pipera district and “bears no exterior sign advertising Amazon’s presence.”
While much of the work is boring (one worker said his job was to mine accumulated voice data for specific phrases such as “Taylor Swift,” letting the system know that the searcher was looking for the artist), reviewers are also listening in on people’s most personal moments.
Occasionally the listeners pick up things Echo owners likely would rather stay private: a woman singing badly off key in the shower, say, or a child screaming for help. The teams use internal chat rooms to share files when they need help parsing a muddled word—or come across an amusing recording. -Bloomberg
Occasionally, Amazon listeners come across upsetting or possibly criminal recordings; two workers, for example, say they listened in on what sounded like a sexual assault.
According to the report, when things like this happen, the workers will mention it in the internal chat room to “relieve stress.”
And while Amazon says it has procedures to follow when workers hear distressing things, two of the Romania-based employees say they were told “it wasn’t Amazon’s job to interfere” when they asked for guidance in such cases.
“We take the security and privacy of our customers’ personal information seriously,” said an Amazon spokesman in a statement provided to Bloomberg.
“We only annotate an extremely small sample of Alexa voice recordings in order to improve the customer experience. For example, this information helps us train our speech recognition and natural language understanding systems, so Alexa can better understand your requests, and ensure the service works well for everyone,” the statement continues. “We have strict technical and operational safeguards, and have a zero tolerance policy for the abuse of our system. Employees do not have direct access to information that can identify the person or account as part of this workflow. All information is treated with high confidentiality and we use multi-factor authentication to restrict access, service encryption and audits of our control environment to protect it.”
That said, Amazon does not explicitly disclose that humans are listening to recordings of some of the conversations picked up by Alexa. Instead, it offers a generic disclaimer in its FAQ: “We use your requests to Alexa to train our speech recognition and natural language understanding systems.”
What Amazon records
According to Amazon’s Alexa terms of use, the company collects and stores most of what you say to Alexa, including the product’s geolocation along with your voice instructions, as CNBC’s Todd Haselton reported last November.
Your messages, communication requests (e.g., “Alexa, call Mom”), and related instructions are “Alexa interactions,” as described in the Alexa Terms of Use. Amazon processes and retains your Alexa Interactions and related information in the cloud in order to respond to your requests (e.g., “Send a message to Mom”), to provide additional functionality (e.g., speech to text transcription and vice versa), and to improve our services. -Amazon Terms of Use
Alexa, are you spying on me?
Alexa: *coughs* No, of course not.
— Wrong, Brian (@leonidasmoderus) May 25, 2018
Last May, an Amazon Echo recorded a conversation between a husband and wife, then sent it to one of the husband’s phone contacts. Amazon claims that during the conversation someone used a word that sounded like “Alexa,” which caused the device to begin recording.
“Echo woke up due to a word in background conversation sounding like ‘Alexa,’” said Amazon in a statement. “Then, the subsequent conversation was heard as a ‘send message’ request. At which point, Alexa said out loud ‘To whom?’ At which point, the background conversation was interpreted as a name in the customer’s contact list. Alexa then asked out loud, ‘[contact name], right?’ Alexa then interpreted background conversation as ‘right’. As unlikely as this string of events is, we are evaluating options to make this case even less likely.”
The wife, Danielle, however, said that the Echo never requested her permission to send the audio. “At first, my husband was like, ‘No, you didn’t,’” Danielle told KIRO7. “And he’s like, ‘You sat there talking about hardwood floors.’ And we said, ‘Oh gosh, you really did!’”
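Amazon’s statement describes a chain of small misrecognitions, each of which advanced a messaging dialogue one more step. The sketch below is a hypothetical, heavily simplified dialogue flow, with an invented wake-word check, thresholds and contact list rather than Amazon’s actual code, showing how loosely matched background speech could plausibly walk a device through a “send message” request.

```python
# Purely illustrative sketch: a simplified dialogue flow showing how a chain of
# misheard inputs could walk through a "send message" request. The wake word
# check, thresholds, contact list and matching logic are invented for this
# example and are not Amazon's actual code.
import difflib

WAKE_WORD = "alexa"
CONTACTS = ["Mom", "John Smith"]


def sounds_like(heard, target, threshold=0.7):
    """Crude acoustic stand-in: treat similarly spelled strings as similar sounds."""
    return difflib.SequenceMatcher(None, heard.lower(), target.lower()).ratio() >= threshold


def best_contact_match(utterance):
    """Pick whichever contact name the background speech most resembles."""
    def score(contact):
        return max(difflib.SequenceMatcher(None, w.lower(), contact.lower()).ratio()
                   for w in utterance.split())
    return max(CONTACTS, key=score)


def simulate(background_speech):
    """Each step only needs a loose match against ongoing background conversation."""
    heard = iter(background_speech)

    # 1. A background word sounds enough like the wake word to start recording.
    if not sounds_like(next(heard), WAKE_WORD):
        return
    print("Device woke up and began listening.")

    # 2. The next fragment is heard as a 'send message' request.
    if "send" not in next(heard).lower():
        return
    print('Alexa: "To whom?"')

    # 3. Background speech is matched against the contact list.
    contact = best_contact_match(next(heard))
    print(f'Alexa: "{contact}, right?"')

    # 4. Another fragment is heard as confirmation, and the recording goes out.
    if sounds_like(next(heard), "right", threshold=0.5):
        print(f"Message sent to {contact}.")


if __name__ == "__main__":
    simulate(["Alexis", "send the samples over", "maybe John knows", "all right then"])
```

Because each step only needs an approximate match, a string of unlucky coincidences, however improbable, can carry the flow all the way to sending a recording.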
Can you disable it?
Amazon does allow users to stop sharing their voice recordings for the development of new features, and a screenshot reviewed by Bloomberg shows that the recordings provided to Alexa’s listeners do not include a user’s full name or address. They are, however, linked to an account number, the user’s first name, and the device’s serial number.
“You don’t necessarily think of another human listening to what you’re telling your smart speaker in the intimacy of your home,” said UMich professor Florian Schaub, who has researched privacy issues related to smart speakers. “I think we’ve been conditioned to the [assumption] that these machines are just doing magic machine learning. But the fact is there is still manual processing involved.”
“Whether that’s a privacy concern or not depends on how cautious Amazon and other companies are in what type of information they have manually annotated, and how they present that information to someone,” added Schaub.