Fifty years ago, journalist Bob Woodward descended into a dark underground parking lot in Arlington, Virginia, for a rendezvous far too sensitive to be carried out over the phone. His covert meetings with a government source—later immortalized as “Deep Throat” in the film All the President’s Men—eventually helped Woodward and his colleague Carl Bernstein blow open the Watergate scandal that would change the course of American history.
In our interconnected world, underground car-park meetings may appear antiquated. But the internet’s revolution in connectivity has brought with it a rise in surveillance – from spyware targeting journalists and activists to governments pressuring technology companies to hand over users’ personal data.
In this new online world, the safest way to communicate is still in-person, without electronic devices. (When whistleblower Edward Snowden met with lawyers in a Hong Kong safe house in 2013, he urged them to put their cell phones in a refrigerator in a different room to prevent digital eavesdropping.) But since the latter half of the 2010s, apps marrying the convenience of digital communication with the security of an underground car-park meeting have exploded in popularity. These apps use end-to-end encryption – a mathematical way of scrambling data that makes it very easy for the sender and recipient of a message to decode it, but extremely difficult, if not impossible, for an eavesdropper to decipher.
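The core idea behind this kind of encryption can be illustrated with a toy key-agreement sketch. The snippet below is purely illustrative and uses deliberately small, hypothetical parameters – real systems like Signal rely on vetted elliptic-curve cryptography at far larger key sizes – but it shows the asymmetry described above: the two parties can each derive a shared secret easily, while an eavesdropper who sees only the exchanged public values cannot.

```python
import secrets

# Toy Diffie-Hellman key agreement. P and G are hypothetical demo
# parameters, far too small for real security.
P = 0xFFFFFFFB  # a public prime (largest prime below 2**32)
G = 5           # a public generator

# Each party keeps a private exponent secret and shares only G^x mod P.
alice_secret = secrets.randbelow(P - 2) + 1
bob_secret = secrets.randbelow(P - 2) + 1
alice_public = pow(G, alice_secret, P)
bob_public = pow(G, bob_secret, P)

# Both sides derive the same shared key from the other's public value.
# An eavesdropper sees only alice_public and bob_public, and recovering
# the key from those requires solving the discrete-logarithm problem.
alice_key = pow(bob_public, alice_secret, P)
bob_key = pow(alice_public, bob_secret, P)
assert alice_key == bob_key
```

In practice the shared key would then feed a symmetric cipher that scrambles the message contents, which is the step that makes messages easy for the recipient to decode and opaque to everyone else.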
Chiefly responsible for this trend is Signal – the private messaging app relied upon by hundreds of millions of users. Its gold-standard encryption technology also lies behind WhatsApp, which serves more than 2 billion people worldwide. Unlike WhatsApp, Signal encrypts data about your contacts – whom you’re messaging, when, and how often – meaning this crucial metadata, oftentimes more sensitive than the contents of the messages themselves, is equally safe.
As a journalist, I have long counted Signal among the most important tools in my toolbox. When I can’t meet a source in person, it’s the next-best option for a sensitive conversation. When I last wrote about Signal, in 2020, the non-profit that runs the app was riding high. After a year of political and social instability, many dissidents and protesters had begun using Signal, amid a growing tech backlash that was also leading others to question the security of their personal data and seek alternatives to Big Tech.
Read More: The Inside Story of How Signal Became the Private Messaging App for an Age of Fear and Distrust
I wanted to revisit Signal today, not only because it seems so much about the internet is in flux. It was also an excuse to speak with Meredith Whittaker—one of the sharpest critics of Big Tech—who earlier this year was appointed Signal’s President. Whittaker has a storied career, from leading protests inside Google against the firm’s work on military contracts to, most recently, advising FTC chair Lina Khan on the dangers of artificial intelligence. Now, Whittaker is helping lead Signal, one of the tech industry’s most promising outsiders, toward long-term financial stability. At the same time, she’s increasingly getting the recognition she’s due as torchbearer for a radical vision: not only for an internet where the surveillance business model is recognized as inherently dangerous, but one where alternatives are abundant and easy to use.
In this wide-ranging conversation, Whittaker spoke to TIME about the state of the tech landscape, where Signal is going next—and don’t miss her choice words about the crypto meltdown.
This conversation has been condensed and edited for clarity.
The internet as a whole is in a period of what seems like significant flux. Elon Musk has acquired Twitter and seems to be rewriting the rules about who can have access to large platforms. Companies like Meta are laying off large segments of their workforces. At a time when authoritarians around the globe are leveraging technology to strengthen social control, it appears Silicon Valley is turning away from the safeguards it belatedly erected to prevent these kinds of abuses. I’m curious what you make of the environment right now.
I think we are getting a really clear lesson that is bringing into stark relief the fact that what we think of as “tech” is a collection of companies, and they are controlled by people who have specific incentives: generally profit and growth. We have become, over the past couple of decades, dependent in pretty profound ways on computational infrastructures that are run at the whim of these companies that are, as a rule, predicated on mass surveillance. We don’t have any control over it in any meaningful sense.
Ultimately, they can call it a town square, all these corporate slogans that paint these products and services as socially beneficial. But I think we need to question the narrative that access to U.S. technology is inherently liberatory. We need to recognize that most of these technology companies conduct surveillance, and can be pushed by governments to turn over very, very sensitive information. We saw Apple turn off AirDrop in China recently, right before a wave of significant protests broke out. We saw Google pull the Navalny voting app from the Russian Google Play Store. Fifteen or so years ago we had the case in China where Yahoo! turned over sensitive information about dissidents that led to real personal harm – that’s one of the few cases where we have all the information so we can draw that connection.
I think we have to recognize that these tools are often collecting sensitive data, and that whether by interception or by coercing these companies directly, authoritarian governments can gain access to this data.
What is Signal doing differently?
Signal is working to provide the most widely used private messaging service on the planet.
To do that, we need to invest in research and development; we also need to structure our organization differently. We don’t think privacy is safe under shareholder-driven corporations. So we are structured as a nonprofit. That means we don’t have equity, we don’t have a board breathing down our neck to increase profits and growth, and we’re safer from the ‘founders take an unethical buyout’ possibility than we would be otherwise. So we’re not going to just sell Signal to a crypto exchange and go live on a yacht. That’s not actually a possibility, even if I decided to become evil.
Frankly, our operational revenue is different. We are funded by donations. And the premise there is that we are accountable to the people who rely on Signal for privacy, not to advertisers, not to customers behind the scenes who are paying for access to our users. We are directly accountable to our users.
Is it sustainable? Do you have enough donations to keep yourselves going?
We are just experimenting with and developing our donor-funded model. So at this point, we’re fully funded by donations, but part of those donations include a 0%-interest, long-term loan from [executive chairman of the Signal Foundation] Brian Acton. We have that generous contribution from Brian that’s allowing us the room to do that. We have some large donors, and then we are ramping up the pipeline of small donors. Our focus on small donors is because we want to be directly accountable to the people who rely on Signal. And given that providing robust privacy challenges those in power, we also want sources of funding that are as hard as possible to quickly disrupt. So we don’t want a single point of failure. Millions of people donating a little bit is the ideal model for us.
You recently said on Twitter that Signal will be launching usernames soon. How’s that going?
Usernames are a feature we’ve had requests for, for a long time. They’re a difficult feature to build in accordance with our strict privacy promises, but we’re working on them. And, you know, we’re hopeful that they’ll launch in the first half of 2023. What they’ll do is allow people to use Signal without giving anyone else their phone number. So you can give someone your username, and someone can connect with you on Signal via your username without ever knowing your phone number. It’s another layer of privacy preservation. We’ve heard, particularly from journalists, or folks who use Signal in a professional or maybe more public capacity, that they want to be able to broadcast their Signal information without broadcasting their phone number. So this allows that.
When I interviewed Signal’s executive chairman Brian Acton in 2020, we spoke about the multi-year effort to make the app more user-friendly so that privacy doesn’t require a sacrifice on the part of the user. Signal recently released Stories, the feature pioneered by Snapchat and since copied by Instagram, WhatsApp and many others. Why was it important for you to bring Stories to Signal, and have they had the uptake you’d hoped for?
Stories are ephemeral, ambient messages that don’t demand a response, that let people share their day with folks visually. They are really, really popular, particularly in Southeast Asia and South America. And we’d heard from those folks that there’s a real desire; that they’re a make-or-break feature. So we built them. We’re not in the business of prescribing the tools people prefer to communicate with. We’re in the business of making those tools private, so people who actually use them can benefit from privacy without having to be an ascetic staring at a command line.
People are using Signal Stories. We don’t collect any data about people, but we can see that the feature is being used. We can’t see anything about the accounts that are using it or anything that would be able to connect it back with individuals.
Our goal is that everyone in the world can pick up their device, and without thinking twice about it, or even having an ideological commitment to privacy, use Signal to communicate with anyone they want. We’re not growth hacking, we’re not trying to optimize for engagement. We’re not trying to get clicks. But we want to create that network effect of encryption, because privacy only works if the people you’re talking to use it.
If you don’t use the app, it doesn’t matter how much I care about privacy, I’m not going to be able to have a private communication with you. So we really need to take a hugely expansive view and put what people actually do, what they actually want, at the center of that. We’re not in the business of educating people until, browbeaten, they understand that privacy is important. We’re in the business of putting something in their hands they can use intuitively, that will genuinely protect their privacy, whether or not they believe it is important.
For so many years, the network effect has been one of the key dynamics keeping power in the hands of the giant tech corporations. If all your friends are on Facebook and Instagram it’s very hard to quit those services, even if you disagree with the business model or wish for better online spaces. I love the idea of subverting that dynamic and using it for privacy-protecting purposes. But on the other hand—and I don’t know if these ideas are slightly contradictory or not—the research you guys are doing is open source. Even as you’re building your own network effect, you’re not digging a moat to keep out competitors. You’re building an architecture that other people can take and freely integrate into their own services.
Yeah, absolutely. The Signal Protocol is the canonical example. It is the gold standard for messaging encryption. It was a significant and groundbreaking contribution to the field of cryptography. And it’s what WhatsApp uses; it’s what Google has used to encrypt its own messaging systems. We also developed metadata-protection methods for group chats, meaning we don’t know who’s sending a message to whom. That metadata protection deserves almost more emphasis than the encryption of the messages themselves. Our goal is to know as little as possible, and we’re constantly pushing the boundaries of how little is possible.
As a general rule we open-source our code, and we have a wonderful and robust community of people scrutinizing and validating it, finding issues and raising them in tickets on our GitHub. So a key element there is that you don’t have to take our word for it. The lack of trade secrecy and obscurity means that it’s actually validated.
Curveball question here: quantum computing seems to be advancing very fast. And one of the risks is that at some point in the future, quantum computers will be able to easily decrypt material that is strongly encrypted under today’s state-of-the-art protocols. No one knows when, so there’s quite a lot of uncertainty. How is Signal thinking about that risk?
We are investing in research to update the Signal Protocol for post-quantum security. It’s long-term, slow-burn research work that isn’t always publicized, but that we continue to do. But post-quantum robustness for the protocol and the other encryption methods we use is definitely something we’re working on. The hope and the plan is to roll that out before it’s necessary, so there are no gaps. I’m not a cryptographer, and I don’t spend all my time in the literature here. But I think the answer of ‘when’ is very much up in the air, based on what I do understand. There is a tech hype cycle playing fast and loose with terms like quantum computing, even as it’s very true that quantum is progressing, and we do need to be responsible and make the required investment to ensure we’re prepared.
Signal is currently beta-testing a way to transfer MobileCoin, a type of cryptocurrency that allows for anonymous payments, within the Signal app. But with the whole FTX collapse, and the increased scrutiny on crypto’s use case as a vehicle for money laundering, and for increasing the wealth of the wealthiest at the expense of some of the poorest in society—is crypto still something that Signal wants to connect its name and its reputation to?
I mean, look, crypto is a vehicle for extremely risky securities trading. Most of the money goes through centralized exchanges. A lot of these exchanges don’t seem to be adequately capitalized. This is a moment in crypto where I think it’s becoming clear that a lot of what was valorized as technical innovation is more akin to securities fraud. Crypto is a sh-t show right now.
But the goal of enabling people who are unbanked, or outside of traditional economies, to make payments securely is one that I do agree with. We’ve seen folks be deplatformed from major credit card platforms. The technological mechanisms that are centralizing surveillance around financial transactions have been used to deplatform sex workers and others, and render them unable to make a living. So I believe that cash and other forms of anonymous payments need to exist, because the coercive power of financial surveillance at the level we’re seeing is dangerous and can enable extreme forms of social control. But I think we’re watching crypto at a moment where a lot of the mask is off.
Signal is not MobileCoin. Signal is a separate organization. MobileCoin is a very, very experimental integration [into Signal]. But again, I will say, you know, Signal’s core focus is private messaging. I think anonymous payments are a necessary service. But currently crypto is a sh-t show that we are observing carefully.
Write to Billy Perrigo at billy.perrigo@time.com