Last week, a judge in New Hampshire ordered Amazon to hand over recordings of an Echo smart speaker found in the home where a double murder took place last year in Farmington.
Authorities believe the recordings may provide information that could put the murderer behind bars. If Amazon does hand over the private data of its users to law enforcement, it won’t just involve the tech company in a murder case. It will also be the latest incident to raise serious questions about how much data tech companies collect about their customers with and without their knowledge, how that data can be used, and what it means for privacy.
An Amazon Echo might be a key witness in a murder trial
Last January, Timothy Verrill was charged with first-degree murder by the New Hampshire attorney general in the deaths of two women, Christine Sullivan and Jenna Pellegrini. Police found the women’s bodies in the backyard of the home of Sullivan’s boyfriend, Dean Smoronk, whom local New Hampshire media reported Verrill knew.
Verrill was spotted on home surveillance video with both Sullivan and Pellegrini. He was also seen on video hours later buying cleaning supplies at a store and returning to the house. After Smoronk called 911 to report his girlfriend missing, police found the bodies and seized an Amazon Echo speaker in the kitchen, next to the spot where police believe Sullivan was killed. Last Friday, a judge ordered Amazon to hand over the recordings on the Echo, as well as any information from cellphones that were paired to the speaker on the date of the murders.
According to the Associated Press, prosecutors believe the Echo might have useful information to make the case against Verrill, whose trial begins in May 2019, including details about what happened during and after the murders, such as “possible removal of the body from the kitchen.”
In a statement to Vox, Amazon said it won’t “release customer information without a valid and binding legal demand properly served on us” and that “Amazon objects to overbroad or otherwise inappropriate demands as a matter of course.” It didn’t comment on whether the company will hand over the data once it’s served a motion, or if it will challenge the ruling.
While it’s entirely possible the Echo speaker will have nothing recorded that relates to the case, it may also very well have pertinent info. The speaker is activated by one of four wake words — “Alexa,” “Echo,” “computer,” and “Amazon” — and begins recording once it hears one, even when it isn’t being deliberately addressed. These recordings are then stored on an Amazon server, accessible to the company and to owners via the Alexa app.
There’s plenty of evidence that the devices record more than what Amazon says. After a woman in Portland found out that her Echo speaker had recorded a conversation she had with her husband and sent it to a random contact, Amazon admitted that its Alexa technology can misinterpret household noises like conversations, TV soundtracks, and music as wake words and start recording. The speaker also starts recording a few seconds before a command is issued, meaning the recordings likely contain more private information than customers are aware of.
If the prosecutors from the Farmington murder case do, indeed, find evidence on the Echo recordings, it raises a host of other questions: Will smart speakers be considered eyewitnesses? Should police start seizing all gadgets from crime scenes? And should they be allowed to use the data if the devices belong to private citizens?
Technology is starting to get intertwined with criminal cases
The New Hampshire judge’s order that Amazon hand over the recordings from Dean Smoronk’s Echo isn’t the first instance of a tech company — or even Amazon — being pulled into a criminal investigation. Last year, Amazon was subpoenaed to release the recordings of an Echo device that was present at a home in Arkansas where a murder took place. Amazon initially fought the order, claiming that it violated free speech. Once the defendant allowed Amazon to hand over his data, though, the tech giant released the recordings (the charges were eventually dropped).
Last summer, during a domestic violence incident in Albuquerque, law enforcement said an Echo called the police. The smart speaker reportedly overheard a boyfriend beating his girlfriend and shouting, “Did you call the sheriff?” which prompted it to phone the sheriff’s department. (Amazon and local law enforcement have conflicting stories about how the Echo was woken up, and who placed the call.) In a New York Times story about the case, the head of SAP National Security Services said most smart-speaker customers were unaware of how much of their information was being recorded, and that more explicit disclosures are needed for these gadgets.
Apple has faced similar situations. In 2015, a husband and wife opened fire in an office in San Bernardino, California, killing 14 people and injuring 22. After the couple was killed, the FBI asked Apple to help it unlock one of the shooters’ iPhones, which couldn’t be opened without his passcode. Apple refused, igniting a heated debate about the privacy of individuals in the face of national security risks.
The FBI made the case that the man could have been involved with ISIS, and wanted to know who he had been talking to, while Apple CEO Tim Cook wrote an open letter to customers, warning them that the FBI’s demand “threatens the security of our customers” and that the government could essentially use the same power to violate anyone’s privacy. The dispute made national headlines, and opinions were split; according to a Pew Research Center survey, 51 percent of people wanted Apple to unlock the phone and 38 percent didn’t. The FBI eventually hired hackers to get into the iPhone.
Sometimes tech companies willingly release data to law enforcement to help with investigations. For example, when investigators were searching over the summer for a 20-year-old Iowa college student named Mollie Tibbetts, they turned to her Fitbit data, since Tibbetts went missing while out for a jog (her murderer was later caught via video surveillance).
What all these incidents underline is that big tech companies hold a vast amount of information about us — with antitrust advocates arguing they likely wield more power than they should. We know they use it to target us to buy things. But it’s also clear they aren’t totally forthcoming about just how much data they’re collecting, and we’re only just beginning to see how this may unfold around matters of national security and criminal justice.
Currently, surveys show that Amazon is the second-most trusted institution in the US, just behind the military. Studies have also found that after the financial crisis, there’s more public trust in Amazon than in banks. That’s an enormous amount of trust to place in one company. But there aren’t any laws that outline exactly how or when the government is allowed to request and use the private data Amazon and other companies collect.
This can lead to concerning outcomes if customers start seeing their data handed over, with no legal oversight. And what is the limit? Will data be used for just violent crimes, or will law enforcement eventually want to pore over data for parking violations and minor misdemeanors?
Last year, after Fitbit data disproved the testimony of a husband who killed his wife and claimed they had been attacked, a police officer who works in the High Technology Division for the state of Virginia told the Washington Post that in about five to 10 years, police cases involving data from recording smart devices will become the new norm.
How these tech companies will balance requests from law enforcement against the privacy of their customers remains to be seen. As one privacy expert told the Washington Post, “Americans are just waking up to the fact that their smart devices are going to snitch on them. And that they are going to reveal intimate details about their lives they did not intend law enforcement to have.”