How did an Amazon Echo end up recording a couple’s private conversation and sending it to an acquaintance without their knowledge? Alexa apparently misinterpreted some of the things they said.
“I felt invaded — like total privacy invasion,” said Danielle, the Portland woman affected by the incident, who was identified only by her first name. Danielle said she was bewildered when her husband’s colleague called to say he had been sent recordings of what appeared to be private conversations the couple was having in their home. (She added that it was hard to believe until he referenced a specific discussion they had about “hardwood floors.”)
Turns out the couple’s chatter had been picked up by their Amazon device and transmitted to one of their contacts. “I’m never plugging that device in again; I can’t trust it,” Danielle said.
In response to the news station’s report, Amazon called the incident “an extremely rare occurrence,” and added that it is “taking steps to avoid this from happening in the future.” But the story spread rapidly, underlining a crisis of trust that tech companies are facing as questions mount about the data they’re collecting from users.
How exactly did this happen in the first place?
The short answer: Amazon’s Alexa assistant misinterpreted the background noise it heard as a command.
Amazon says that the device, once activated, picked up phrasing in the couple’s conversation that it construed as a “send message” request. Alexa then went on to perceive another mention in the conversation as the naming of a specific recipient for that message. And so on and so forth.
Here’s Amazon’s breakdown:
Echo woke up due to a word in background conversation sounding like “Alexa.” Then, the subsequent conversation was heard as a “send message” request. At which point, Alexa said out loud “To whom?” At which point, the background conversation was interpreted as a name in the customer’s contact list. Alexa then asked out loud, “[contact name], right?” Alexa then interpreted background conversation as “right.” As unlikely as this string of events is, we are evaluating options to make this case even less likely.
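The chain of events Amazon describes can be thought of as a dialogue state machine that advances whenever background noise happens to resemble the next expected response. The sketch below is purely illustrative — the states, function names, and matching logic are assumptions for clarity, not Amazon’s actual implementation:

```python
def run_dialogue(heard_phrases, contacts):
    """Hypothetical model of the wake -> intent -> recipient -> confirm flow.

    Each phrase of overheard conversation is matched against whatever the
    device expects next; a coincidental match advances the dialogue.
    """
    state = "IDLE"
    recipient = None
    for phrase in heard_phrases:
        lowered = phrase.lower()
        if state == "IDLE" and "alexa" in lowered:
            state = "AWAKE"                      # wake word (mis)heard
        elif state == "AWAKE" and "send message" in lowered:
            state = "AWAITING_RECIPIENT"         # device asks "To whom?"
        elif state == "AWAITING_RECIPIENT":
            # Any phrase resembling a contact name is taken as the recipient.
            match = next((c for c in contacts if c.lower() in lowered), None)
            if match:
                recipient = match
                state = "AWAITING_CONFIRMATION"  # device asks "<name>, right?"
        elif state == "AWAITING_CONFIRMATION" and "right" in lowered:
            return f"message sent to {recipient}"
    return "no message sent"

# Background chatter that happens to contain each trigger in sequence:
chatter = ["...Alexa...", "...send the message later...",
           "...ask John about it...", "...yeah, right..."]
print(run_dialogue(chatter, contacts=["John", "Mary"]))
```

The point of the model is that each individual misinterpretation is plausible on its own; it is the full sequence lining up, with each confirmation prompt answered by coincidental chatter, that makes the incident so unlikely — which matches Amazon’s characterization of it as “an extremely rare occurrence.”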
This isn’t the first time voice-activated devices have struggled to accurately interpret what users are saying. Amazon and Google products have had trouble understanding requests from people with different accents, for example.
But beyond the technological limitations it highlights, this occurrence touches on deeper qualms people have expressed about such devices, including the fear that they’re always on and always listening. “My husband and I would joke and say I’d bet these devices are listening to what we’re saying,” Danielle told KIRO 7.
(Amazon’s response even indicates that the device is constantly paying attention — to a degree — since it’s listening for that “Alexa” wake word to be activated.)
What’s more, the timing of this debacle isn’t great. What feels like a Black Mirror plot point comes as tech companies including Facebook and Amazon strive to prove that they have what it takes to protect user privacy — an open question Alexa isn’t likely to answer.