When Victor Collins was found dead, floating faceup in his friend James Bates' hot tub in Bentonville, Ark., one chilly morning in November 2015, police were quick to suspect foul play. Broken glass littered the patio, and blood was splattered on a brown vinyl pool cover nearby. But in the subsequent investigation, which led prosecutors to charge Bates, 32, with Collins' murder, some of the most crucial evidence was gleaned not from the crime scene but from an array of Internet-connected devices in Bates' home.
Data from his "smart" water meter, for example, indicated that someone had used 140 gal. of water between 1 a.m. and 3 a.m., a detail that seemed to confirm investigators' suspicions that the patio had been hosed down before they arrived. Records from Bates' iPhone suggested he had made phone calls long after he told police he'd gone to sleep. And audio files captured by Bates' Echo, Amazon's popular personal assistant that answers to "Alexa," promised to offer police a rare window into Bates' living room the night Collins died.
The case, which goes to trial in July, marks the first time that data recorded by an Echo, or any other artificial-intelligence-powered device, like Google's Home or Samsung's smart TV, will be submitted as evidence in court. The move has alarmed tech analysts and privacy advocates nationwide. The issue is not only that these new devices are equipped with so-called smart microphones that, unless manually disabled, are always on, quietly listening for a "wake word," like "Alexa" or "Hey, Siri." It's also that these now ubiquitous microphones live in our most intimate spaces: in our living rooms and kitchens, on our bedside tables. In a world in which these personal assistants are always listening for our voices and recording our requests, have we given up any expectation of privacy within our own homes?
Joel Reidenberg, a founding director of Fordham University's Center on Law and Information Policy, says the answer isn't straightforward. The explosion of these always-listening gadgets has outpaced much of the existing legal precedent on privacy. "We are living in an always-on, always-connected world," he says. "We are creating records that have never existed before."
And we continue to charge ahead. Between mid-2015 and last December, Amazon sold 11 million Echo devices, according to Morgan Stanley, and in April the company introduced a newer version, the Echo Look, which features a depth-sensing camera and LED lights and is designed to perch in your bedroom, where it can best offer fashion advice. Last May, Google unveiled its Google Assistant, which is capable of two-way conversations, and Apple is expected to release its version, powered by Siri, later this year.
U.S. privacy laws, as they have been interpreted over the past 40 years, offer no clear guidance for how to deal with these shiny new gadgets. The Fourth Amendment, along with a host of state and federal privacy statutes, has traditionally provided citizens with a powerful right to privacy within their own homes. But caveats loom. For example, the "third-party doctrine," the result of two Supreme Court cases in the 1970s, establishes that while Americans do indeed enjoy a "reasonable expectation of privacy" within their own homes, that changes if they share information with anyone or anything that constitutes a "third party." That means that if you dial a number on your phone or access a web page, you voluntarily offer that information to your phone company or Internet service provider, both third parties. In doing so, you relinquish any reasonable expectation of privacy.
"The pervasiveness of disclosures to third parties in an always-connected world eviscerates the Fourth Amendment," Reidenberg warns. "Because, of course, we are disclosing information to third parties all of the time."
The issue has not gone unnoticed. In 2012, Supreme Court Justice Sonia Sotomayor wrote that the third-party doctrine may simply be "ill-suited to the digital age." Other privacy advocates argue that in the context of devices like the Echo, whose microphones are always on and which live within the walls of our homes, we ought to rethink the scope of privacy altogether. After all, our interactions with an Alexa or a Siri are in many ways unprecedented. When we type something into a Google search, post a message on Facebook or agree to share our GPS location with a mapping app, we are usually actively interfacing with a screen. That feels different from asking Siri about the weather, or standing alone, half-dressed in our bedrooms, trying on clothes for the Echo Look.
The issue is further complicated by the nature of spoken interactions. If your iPhone mistakenly hears "Hey Siri" when you say "They seriously," you did not intend to interact with a third party, much less to create a record of your conversation. Nonetheless, it's there, transmitted and saved.
In July 2015, the Electronic Privacy Information Center, a research and advocacy group that has drawn support from both conservatives and liberals, pushed the U.S. Justice Department and Federal Trade Commission to weigh in on precisely this issue. "Americans do not expect that the devices in their homes will persistently record everything they say," the group's letter read. "It is unreasonable to expect consumers to monitor their every word in front of their home electronics. It is also genuinely creepy." Lee Tien, a senior staff attorney at the Electronic Frontier Foundation, went one step further. When we think about privacy, he told TIME, it should not be in the context of hiding embarrassing or incriminating data. We should have a reasonable expectation of privacy within our own homes, he said, unless we actively choose to waive it. "People should have the freedom to choose what they share," he said.
Many legal analysts and law-enforcement officials find themselves firmly on the other side of the debate. Voicing a search request to Alexa, they argue, is no different, legally or logically, from typing that same request into a search bar. There's no good reason devices with microphones instead of keyboards should be subject to different rules. That's perhaps especially true in the context of criminal justice. After all, if police present probable cause and receive a search warrant, they can often enter a suspect's home, request phone records and access recent browser history. How is that any different from searching the audio collected by a digital appliance? "There is not a rational or legal reason that we shouldn't be able to search that device," Nathan Smith, the Benton County prosecutor, told reporters, referring to Bates' Echo.
Indeed, there is already plenty of precedent for law-enforcement officials' culling through precisely the kind of ultrapersonal digital records that, as Reidenberg pointed out, didn't even exist five years ago. In February, for instance, police in Ohio strengthened a case against a man accused of arson and insurance fraud after the heart-rate data collected from his pacemaker appeared to contradict the story he'd told investigators. In April, prosecutors in Connecticut charged a man with murdering his wife in part because data from her Fitbit showed that she was home, walking around, long after he claimed an intruder had killed her.
There are no other pending court cases that promise to bring clarity to this issue, which is one reason the Bentonville case has drawn so much attention. When Smith first subpoenaed Bates' Echo recordings in 2016, Amazon refused to comply, saying it would not "release customer information without a valid and binding legal demand properly served on us." In February, the company filed a motion arguing that Alexa's recordings were protected speech under the First Amendment. But then, in March, Bates' lawyer released the records voluntarily, postponing a reckoning with the broader privacy dilemma a bit longer.
Meanwhile, Bentonville officials have spent the past two months sifting through Bates' Echo records. They have not yet said what they found. But even if Alexa knows nothing more than the name of the song playing at the time of death, or who requested it, there's something unsettling about calling her to the stand.