On Tuesday, the government obtained a court order compelling Apple to hack into an iPhone as part of the FBI’s investigation into the San Bernardino shooters. While the government’s investigation is an important one, the legal order it has obtained crosses a dangerous line: It conscripts Apple into government service and forces it to design and build what is, in effect, a master key that could be used as a mold to weaken the security of an untold number of iPhones.
The resulting order is not only unconstitutional; it also risks setting a precedent that would fundamentally undermine the security of all devices, not just the one iPhone being debated in the news.
A bit of background is necessary to understand this debate.
As part of its investigation, the FBI has apparently obtained an iPhone 5C used by one of the shooters. The bureau has said that the phone is encrypted and protected by a passcode, and that it needs Apple’s assistance to unlock the phone. Specifically, it has asked Apple to design and write custom software that would disable several security features on the phone.
While Apple has generally cooperated in the investigation, it has refused the FBI’s latest demand to write malware that would help the FBI hack the device. To its credit, Apple has poured incredible resources into securing its mobile devices. One consequence of that effort is that Apple does not have a ready way of breaking into its customers’ devices. In the words of Apple’s CEO, Tim Cook: “We have even put that data out of our own reach, because we believe the contents of your iPhone are none of our business.”
But the FBI is dismissive of that effort. According to its legal filing, the FBI believes that Apple could, if compelled, build a master key that would allow the FBI to try to break into iPhones like the one involved in the San Bernardino investigation. The FBI acknowledges that this would require Apple to write new software and then cryptographically “sign” that software (as the iPhone will accept only software updates signed by Apple).
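To make that signing requirement concrete, here is a minimal, purely illustrative sketch in Python. It is not Apple's actual update code; the install_update function, the assumption of an RSA vendor key, and the use of the third-party cryptography package are illustrative choices. The point it shows is that a device in this model refuses any update whose signature does not verify against the manufacturer's public key, which is why the FBI cannot simply load its own software and instead needs Apple to sign it.

```python
# Illustrative sketch of signed-update verification (an assumption for
# explanation, not Apple's actual mechanism). Requires the third-party
# "cryptography" package and assumes an RSA vendor key for simplicity.
from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives import hashes, serialization
from cryptography.hazmat.primitives.asymmetric import padding


def install_update(update_bytes: bytes, signature: bytes, vendor_public_key_pem: bytes) -> bool:
    """Accept an update only if it was signed with the vendor's private key."""
    public_key = serialization.load_pem_public_key(vendor_public_key_pem)
    try:
        # Check the signature over the update image against the vendor's public key.
        public_key.verify(signature, update_bytes, padding.PKCS1v15(), hashes.SHA256())
    except InvalidSignature:
        return False  # unsigned or tampered update: refuse to install
    # ...apply the verified update here...
    return True
```

In this sketch the gatekeeper is not the software itself but the vendor's private signing key, which only the manufacturer holds.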
A federal magistrate judge granted the FBI’s request the same day but gave Apple five days to object. Again to its credit, Apple has vowed to fight.
It is critically important that Apple win—for cybersecurity and for the fate of privacy in the digital age—for several reasons.
First, the government’s legal theory is unbounded and dangerous. The government believes it has the legal authority to force Apple into government service, even though the company does not actually possess the information the government is after. Of course, historically, the government has sought and obtained assistance from tech companies and others in criminal investigations—but only in obtaining information or evidence the companies already have access to.
The difference between those cases and Apple’s is a radical one. If Apple and other tech companies—whose devices we all rely upon to store incredibly private information—can be forced to hack into their customers’ devices, then it’s hard to imagine how any company could actually offer its consumers a secure product. And once a company has been forced to build a backdoor into its products, there’s no way to ensure that it’s only used by our government, as opposed to repressive regimes, cybercriminals or industrial spies.
Second, this debate is not about one phone—it’s about every phone. And it’s about every device manufactured by a U.S. company. If the government gets its way, then every device—your mobile phone, tablet or laptop—will carry with it an implicit warning from its manufacturer: “Sorry, but we might be forced to hack you.”
Some might accept that risk if it were possible to limit access to legitimate governmental purposes, overseen by a judge. But as Apple’s Cook points out, backdoors are uniquely dangerous: “Once the information is known, or a way to bypass the code is revealed, the encryption can be defeated by anyone with that knowledge.”
That risk is only growing every day as the “Internet of Things” expands. For the government, every device connected to the Internet will be more than just a novel convenience—it will be a new window into your home. The fridge that responds to your verbal commands might have a backdoor to allow for remote listening. The TV that allows you to video chat with your family might be commandeered into a ready-made spy camera.
These are the real stakes of the debate: Either American companies are allowed to offer secure products to their consumers, or the U.S. government is allowed to force those companies to break the security of their products, opening the door for malicious hackers and foreign intelligence agencies alike. For the sake of both our privacy and our security, the choice is clear.