It would never be OK for a judge to require you to sign a petition or a letter disavowing your views. Whether you believe that the country would be better off if everyone had a gun, or if guns were banned, no warrant could require you to sign a letter to the contrary. The First Amendment simply doesn’t allow the government to force you to become a hypocrite, or to substitute the views of a judge, a prosecutor or anyone else for your own. But that’s what the FBI is trying to make Apple do.
To access an iPhone used by one of the shooters in the San Bernardino attacks, the FBI wants to force Apple to write, and sign its name on, code that Apple believes would be profoundly dangerous for all of us who increasingly need to trust that smartphones and other digital devices are secure.
The question in this case is a key part of a raging political and technological debate. At its heart is whether we should have real security on our phones, laptops and tablets (and items including refrigerators, cars and home thermostats that are connected to the Internet), or whether the security we rely on to protect us against criminals, foreign governments and spies should be undermined so that the government has guaranteed access.
The FBI seems to say that when smartphone security gets in its way, that security has to go. But we know that if security goes away when the FBI wants it to, we’ll be much more vulnerable to the bad guys, too. There’s no way around that.
The Electronic Frontier Foundation (EFF) has been fighting for users’ digital privacy and security for more than 25 years. We oppose the FBI because the kinds of “backdoors” and other security compromises that it seeks to have Apple create in this case cannot be limited by law or technology to a single phone.
Instead, this request would undermine security and privacy for millions of people in the U.S. and around the world. It would also set a legal precedent that could be used again and again, by governments far less trustworthy than our own, to require backdoors for any device.
Apple has taken the strong position that true security, which neither Apple nor anyone else can break, is necessary to protect its users’ personal data and keep it safe from hackers and criminals.
The ongoing cybersecurity struggle to protect our phones and stored information from criminals of all stripes demonstrates that this position is correct. Apple has implemented that belief in the actual security it has built into the iPhone, including the requirement that no code will run on the phone unless it carries Apple’s specific endorsement in the form of a digital signature. Apple CEO Tim Cook told users in a letter: “It would be wrong to intentionally weaken our products with a government-ordered backdoor. If we lose control of our data, we put both our privacy and our safety at risk.”
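To make that mechanism concrete, here is a minimal sketch of how cryptographic code signing works in general, written in Python with the open-source cryptography library. The vendor key, the firmware image and the device_will_run function are hypothetical stand-ins used to illustrate the principle; they are not Apple’s actual implementation.

```python
from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

# Hypothetical vendor signing key, standing in for a device maker's key.
signing_key = Ed25519PrivateKey.generate()
verify_key = signing_key.public_key()   # the public half is baked into the device

firmware = b"example operating system image"   # stand-in for an OS update
signature = signing_key.sign(firmware)         # the vendor's endorsement of that code

def device_will_run(image: bytes, sig: bytes) -> bool:
    """In this model, the device refuses any code whose signature fails to verify."""
    try:
        verify_key.verify(sig, image)
        return True
    except InvalidSignature:
        return False

print(device_will_run(firmware, signature))           # True: signed by the vendor
print(device_will_run(b"tampered image", signature))  # False: the signature no longer matches
```

In a scheme like this, nothing runs unless the vendor has signed it, which is why the signature itself carries the vendor’s endorsement of the code.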
The FBI wants Apple to write and sign code that is diametrically opposed to this belief. Just like forcing you to sign a letter disavowing your views on guns, this would require Apple to sign code that undermines the security it has promised to its customers and believes, rightly, is vital.
The Supreme Court has rejected requirements that people put “Live Free or Die” on their license plates or sign loyalty oaths, and it has said that the government cannot compel a private parade to include views that organizers disagree with. That the signature and code in the Apple case are implemented via technology and computer languages rather than English makes no difference. For nearly 20 years in cases pioneered by EFF, the courts have recognized that writing computer code is protected by the First Amendment. In a brief from EFF and leading technology experts, we have told the court considering Apple’s case that forcing the company to write and sign a new operating system for the government is akin to the FBI dictating a letter endorsing backdoors and forcing Apple to sign its forgery-proof name at the bottom.
Our system rightly requires that companies like Apple provide relevant evidence in their custody, and Apple has done that. But it doesn’t allow the FBI to force Apple to become a hypocrite by writing and signing code that sends the message that backdoors, which will cripple security and endanger the privacy of millions of people around the world, are OK. They are not. They are incredibly dangerous, and forcing Apple to create and endorse them is not only counter to the strong security that all of us need; it also violates the First Amendment.