In March 2018, Christopher Wylie blew the whistle on Cambridge Analytica, a political consultancy that worked for the Trump campaign. Cambridge Analytica, the Canadian data scientist revealed, had illegally obtained the Facebook information of 87 million people and used it to build psychological profiles of voters. Using cutting-edge research, Cambridge Analytica — which was funded by the hedge-fund billionaire Robert Mercer, and effectively run by Steve Bannon from 2014 onward — spread narratives on social media aiming to ignite a culture war, suppress black voter turnout and exacerbate racist views held by some white voters. (Trump’s campaign staff have denied Cambridge Analytica played a major role in the campaign.)
Wylie’s revelations caused a temporary meltdown in Facebook’s market value and set governments around the world scrambling to more stringently regulate social media, data collection and political campaigning. But 18 months later, with another U.S. presidential election approaching, Wylie says not enough has been done to prevent similar problems. In his new book, Mindf*ck: Cambridge Analytica and the Plot to Break America, out Oct. 8, Wylie tells the story of how he came to realize that the enormous amount of data we now share about ourselves daily on social media could be combined, synthesized and eventually weaponized to shape our thoughts, feelings and even voting habits — all without us noticing.
Wylie spoke to TIME about the dangers of Facebook, his fears for 2020 and his best advice for would-be whistleblowers.
TIME: You left Cambridge Analytica in 2014, well before the systems you helped design were used in the Trump campaign. But in the book you describe working on disturbing stuff, for example finding ways to exacerbate racist ideas inside certain target populations. Why did it take you so long to realize that what you were doing was unethical?
In the beginning, you’re just building databases. It feels very mundane. You’re asking people lots of questions, you’re playing with models, it doesn’t feel like you’re going to hurt anyone. In some ways I distanced myself from the reality that these are people. But then when you start to see things like video footage of some of the focus groups that Cambridge Analytica was running, you realize you were provoking paranoid ideation, racializing people’s thinking. It really starts to hit home that you have ended up contributing to manipulating these people’s worldviews to a point where they believe things that aren’t true, and are engaging in harmful actions and thinking harmful thoughts. You promote racialized thinking at scale or you provoke and encourage misogynistic viewpoints, and you end up harming society. That really bothered me, and I just sort of sat back and was like, what the hell am I doing? On top of that we had all kinds of really unusual meetings with some really unsavory people. It just built to a point where I was just like, I can’t do this. I’m not going to do this.
What did you hope to achieve by writing Mindf*ck?
As a journalist, you know that you’ve only got a certain amount of real estate that you can use, so everything becomes top-line. After I finished my testimony tour, as I’m calling it — giving evidence to governments around the world — I sat back and thought, there’s so many things that take so long to explain. That’s why a book is a nice format. There’s bits and pieces of new stuff in there, but one of the things I wanted to do was to use Cambridge Analytica as a case study of what can go wrong.
What can go wrong?
It’s both the profiling of people and the targeting of them, trying to dominate the informational environment around them. Once you can sever somebody’s ties with other sources of information, you put them into an environment where you have much more control over what information they actually see. That’s a very powerful thing, because they still feel like they’re in charge; in their heads they’re making the decision to click on something, share something or chat with some random account that they don’t actually know. They don’t see the thought process and the strategy behind that.
Even though Cambridge Analytica has dissolved, the capabilities are still there, the platforms are still there, the people are still there. What happens when China becomes the next Cambridge Analytica? Like anything, the second, third, fourth time you do something, you start to refine and perfect it. So my concern is that if you have a state-backed operation, they could fairly quickly reconstruct a capability that is similar to Cambridge Analytica’s, if not surpass it.
Do you see the rise of disinformation as an inevitable byproduct of our increased connectivity?
Disinformation has always existed. It’s not like all of a sudden we’ve just discovered this new thing called propaganda. But the irony of having these really open platforms is that you can actually accomplish very similar objectives to what the Soviet Union had in terms of dominating the information around somebody and crafting, curating and building their perceptions. You can just do that by dominating a platform, like Facebook, or particular people around them. So it’s not that propaganda is new or disinformation is new, but the barrier to entry is much lower now.
If the internet has lowered the barrier to entry, as you describe, and if these tools are available to more than just an obscure consultancy, as Cambridge Analytica was in 2016, is there any hope for democracy?
For all the problems that we’re experiencing, there’s a lot of good things that can come with highly interconnected people. The internet is a brilliant thing. But when it comes to the internet and platforms on the internet, civil society and the state, broadly, don’t actually have a say in it right now, because it’s all companies. And why should they be the ones that we trust? If this is our democracy, if you’re an American citizen, why are you even conceding the point that they should be in charge of protecting such a fundamental part of civic discourse? Why are we entitling a company to have that power?
Aside from helping Trump come to power and the Brexit vote occur, do you see a deeper coup achieved by Cambridge Analytica’s tactics: that, three years on, both countries are still deeply polarized?
Yeah, I do, and that’s the really unfortunate thing. That really gets to the heart of this idea of the Breitbart doctrine, which is that politics is downstream from culture. So, if you can change how people see themselves, their identity and how they see society and what’s happening, politics will just flow from that. There’s a lasting impact on what happens. God, that’s the understatement of the year. Cambridge Analytica’s tactics contributed to a world where people kind of hate each other, don’t want to talk to each other, don’t want to hear each other.
Do you agree that politics is downstream from culture?
I genuinely believe that. Because if you can change the cultural standpoint of people, then politics will respond to that. It’s a lot harder to undo culture than it is to undo a candidate.
Do you think Facebook has learned enough ahead of the 2020 election? Do you think they’ve done enough to counter similar disinformation campaigns?
No, because at every turn when you look at their behavior, they are not upfront about problems. They have a long history of obfuscating what’s actually going on inside their company.
But the loophole in Facebook’s systems that Cambridge Analytica exploited to gain intimate data on 87 million users is now closed. Does that mean the danger isn’t as great?
No, and this gets to the architecture and engineering question, which is that if you have a browser extension that pulls session cookies, for example, you could do the exact same thing that Cambridge Analytica did, because you could log into someone’s account as them and then mine all their friends’ data. It’s the very design of Facebook. It doesn’t matter if you put a new door on the data repository. As long as you can get in, everything is very open. So I have questions like: Why is it that people can see what you like? Why is it that people can see who you follow? Why is it that people can see who likes the things that you’re saying, or who’s commenting on it? This data can be used to profile you.
What do you fear could happen in 2020, knowing what you do about psychological profiling and the power of data?
When you look at 2016 as a case study, it was obvious that Russia was first off the blocks, in terms of realizing that you can use a lot of these platforms to manipulate voters. My concern is that it’s no longer just Russia. There’s going to be a wider constellation of threats: China, North Korea, Iran and, frankly, America’s allies, too. If you have a trade dispute, who’s to say Mexico might not start interfering?
What advice would you give to somebody who’s considering blowing the whistle?
Talk. To. A. Lawyer. As obvious as that piece of advice is, this is something that whistleblowers almost never do. They jump to handing over stuff to journalists or they go public and they don’t think things through. You will be a far more effective whistleblower if you operate within a legal framework.
In the book you talk about an agreement you struck with an “exceptionally wealthy individual” who agreed to pay your legal fees. Was that agreement predicated on them remaining anonymous?
Yeah, because there are safety issues involved. We did due diligence on our side to understand where this came from. The reason the person didn’t want to go public was that when you are going to rub salt into the wounds of the Trump administration, the Russian government and a company like Cambridge Analytica that has engaged hackers, there are safety issues there.
But now you’re speaking publicly about our right to know who’s interfering in our political campaigns. Isn’t it in the public interest to know who’s funding you?
It’s my legal defense. It’s never been used for anything other than legal defense, and I have a right to a legal defense.
Lastly, you memorably compared the Brexit campaign to a bar fight. Do you see a similar dynamic at play in the U.S.?
Yes, completely. I don’t know how regularly you are involved in bar fights, and I’m not saying I regularly engage in bar fights, either. But when you have somebody who is really angry, holding an empty beer bottle and yelling, and you start saying, “You’re being stupid,” or, “There’s going to be consequences,” you’re going to aggravate that person. When you look at Trump supporters in the alt-right in the United States, where it’s like, build the wall, shut down trade, and you say, “Don’t do it. It’s going to be bad for the economy. It’s going to be bad for you,” it’s not going to work, because people are angry. That’s the last thing they want to hear, because all they’re hearing is someone trying to prove them wrong. To take that metaphor even further, the bar is Facebook. If you are a responsible bar owner, you will step in and try to calm down the situation, right? But right now, Facebook is like some weird barman who is slightly sadistic and enjoys the bar fight. Who says, “They’ve already bought their beers, so I’m not going to kick them out. And actually, this bar fight is attracting more people to come into my bar — and buy more beer.”
This interview has been edited for length and clarity.
Write to Billy Perrigo at billy.perrigo@time.com