You’ve probably heard the warnings. Yet there you are: scrolling frantically through the pages of an app's Terms of Service for a glaring reason not to share your email or birth date — or, perhaps more likely, skipping right past it all and clicking “Agree.” The app makers know better than to bold anything or make anything clear — especially about how your actions will morph into marketing metadata, sprinkling a trail of "cookies" behind you. They don’t want anything to stand between you and your download — or them and your personal information. Besides, they know you probably won't get in their way.
And so your Google searches return, zombie-like, as ads. Your emails are mined for money-making chits. Elsewhere, your background, politics and even "ethnic affinity" are tracked. Meanwhile, retailers are notified, via Bluetooth and GPS, when you enter their store — along with what your income is and how much time you'll probably spend shopping.
Security technologist and cryptographer Bruce Schneier compares walking around with a smartphone to carrying a tracking device 24/7. “If we were told we had to inform the police when we made a new friend, we would never do that," he says. "Instead, we inform Facebook." He goes on, "Or [if we were told to] mail the police a copy of every bit of our correspondence? Just in case? We don’t, but Google stores it for us."
The irony is that Americans say they care deeply about protecting their data. Pew Research found that being in control of who can get information about us is “very important” to 74% of Americans.
But if we care so much, why do we keep giving our information away? Researchers call this the "privacy paradox": We do it because we reason that our future self will probably suffer no consequences. We figure that the worst that will likely happen is that we feel kind of violated by all the corporate algorithms (and maybe the government) tracking us, along with everyone else.
Tech companies find their opening in our short-term reasoning. They compile all of that data, somewhere, to boost their bottom line — now, or some time well in the future. And whenever that happens, it will be too late. Our future self will have no say.
So, what can we do?
Well, reading the Terms of Service isn’t going to help much. First of all, researchers estimate it would take 76 hours a year to read all the user agreements we encounter. And second, clicking "No" often means not using a tool that you may actually need — to navigate, communicate or work.
But there are some practical steps you can take.
Use unique passwords on all your accounts. No pet names, no birthdays. Long passwords — and a different one for everything. Add two-factor authentication, meaning the website or tool will send you a text to double-check that it's really you logging on. Download a password manager if you need one; it takes some setting up, but pays off in peace of mind.
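For the technically inclined, the "long and different on everything" rule can be sketched with Python's standard-library secrets module, which is built for cryptographically strong randomness. The function name and length here are illustrative, not taken from any particular password manager:

```python
import secrets
import string

def make_password(length=20):
    """Generate a random password from letters, digits and symbols.

    Using secrets (not random) ensures the choices are suitable
    for security purposes. 20 characters is an illustrative length;
    longer is stronger.
    """
    alphabet = string.ascii_letters + string.digits + string.punctuation
    return "".join(secrets.choice(alphabet) for _ in range(length))

# Generate a fresh one for each account -- never reuse a password.
print(make_password())
```

A password manager does essentially this for you, and remembers the result so you don't have to.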
Consider some encrypted tools, which make it almost impossible for unauthorized people to read your communications. The text messaging app Signal limits the data it collects, and encrypts your messages. It’s easy to use and free.
Yes, the recent WikiLeaks documents suggest the government has ways to access messages before they’re encrypted, but unless the CIA has compromised your device, the tools are still solid. You can also find encrypted backup services and encrypted file-sharing sites.
And you can change the way you behave online, too. Post less.
Go through the apps on your phone, and see if they really need access to your microphone, contacts, or location data. My friend Anil Dash recommends heterogeneity: Spread your data around so your search, email, documents and photos aren't all under one corporate umbrella. That makes it a little harder for marketers to pin you down. And shop small and local. Non-chain stores have less incentive — and capacity — to parse and sell your records.
But maybe most importantly, rethink the basic pact we've made — that digital connection is the only way to sustain a livelihood or connect with others and, therefore, disempowered, we must keep giving ourselves away.
"We need to return to the basic insight of the founding generation, which is that when people are under surveillance, their behavior changes," says Georgetown Law professor Laura Donohue, evoking the Fourth Amendment. "Their intimate relationships are affected. Their ability to question the world and their role in it is harmed."
You might feel like you have nothing to hide. And maybe you don’t. (Although who among us wants their bank details, medical records and snarky emails published online?) But the more people who use these tools, the more normal they become. And that helps protect people who really do need security online – human rights campaigners working under repressive regimes, or corporate whistleblowers looking to expose corruption. As Bruce Schneier told me, “If everybody uses a postcard, the envelope is suspicious. But if everyone uses an envelope, then it’s just an envelope.” These problems are not ones we can solve as individuals. They’re too big to fix download by download, user by user.
This is not how it was supposed to be. Not with the Internet, which was originally envisioned to promote equal access. As its inventor, Sir Tim Berners-Lee, told me, what we have now is "a huge, massive invasion of privacy." And not with the United States in particular. The Constitution's framers knew nothing of WikiLeaks or Instagram, but the Fourth Amendment stipulates: "The right of the people to be secure in their persons, houses, papers, and effects, against unreasonable searches and seizures, shall not be violated." Terms written more than 225 years ago, to which I would reply: Agree.