Wearing a blond wig and walking through the streets of central Vienna in October 2017, Julia Ebner reminded herself of her new identity: Jennifer Mayer, an Austrian philosophy student currently studying abroad in London. It was one of five identities that Ebner, an Austrian researcher who specializes in online radicalization and cumulative extremism, adopted in order to infiltrate far-right and Islamist extremist networks. That day in October, she met a local recruiter for Generation Identity (GI), the European equivalent of the American alt-right: a mostly online political movement that rejects mainstream politics and espouses ideas of white nationalism. GI is the main proponent of the Great Replacement theory, the baseless idea that white populations are being deliberately replaced through migration and the growth of minority communities. The theory has inspired several recent extremist attacks, including the murder of 51 people in Christchurch, New Zealand, last March, and the mass shooting at a Walmart in El Paso, Texas, last August, which left 22 people dead.
The meeting with GI’s local leader proved significant. Ebner learned how important the group considered social media to its strategy of expanding and recruiting members in schools, public baths and other public venues that young people visit. She found out that GI was planning to launch an app, “Patriotic Peer,” funded by donations from around the world, that would connect a “silent majority” (in the words of the leader).
Securing the meeting wasn’t easy. It took several months of setting up credible accounts within the various GI networks online and a couple of weeks of messaging with GI members. But it was necessary for Ebner’s research: the 28-year-old is a resident research fellow at the Institute for Strategic Dialogue (ISD), a London-based think tank that develops responses to all forms of hate and extremism. She has advised the U.N., parliamentary working groups, frontline workers and tech firms on issues around radicalization, and her first book, The Rage: The Vicious Circle of Islamist and Far-Right Extremism, was published in 2017.
Two years ago, Ebner started to feel she had reached the limits of her insight into the world of extremism. She wanted to find out how extremists recruit members, how they mobilize them to commit violence, and why people join and stay in the movements. Ebner believed she could only get her answers by being a part of these groups. Over the past two years, she has spent much of her spare time talking to people on online forums. They include a Discord server used by the alt-right to coordinate the violent Charlottesville rally in August 2017; the Tradwives (short for Traditional Wives), a network of some 30,000 far-right women who see gender roles in terms of a marketplace in which women sell and men buy sex; and Reconquista Germanica, an online trolling army that was active in the 2017 German federal election.
Ebner, whose new book Going Dark: The Secret Social Lives of Extremists is published Feb. 20, spoke with TIME about what she discovered. The conversation below has been edited for length and clarity.
TIME: There are a lot of tense moments in the book when you were close to being exposed, like when you dropped your real bank card in front of a GI member. What were the biggest challenges in going undercover?
Ebner: My first attempt at creating and maintaining a credible profile didn’t work. I was kicked out of a group and had to start all over again.
I found switching between different identities stressful and confusing. Remembering exactly what I had said in my online profiles, previous chats and real-life conversations in these various roles could get challenging. Sometimes staying in my role and not being able to talk back as my real self was also difficult. There were many moments when I wanted to debunk a crazy conspiracy theory, or say “you’re not funny!” instead of laughing at a racist joke, or convince younger members to cease their involvement with a group.
As you’d imagine, I made plenty of stupid mistakes. Dropping my real credit card was only one of them. Once I even signed an email with “Julia” instead of “Jenni.” I’m not a professional MI5 agent. I did acting in high school, but going undercover didn’t come naturally to me.
I received some tips from a friend who has done undercover investigations himself and also trained people to infiltrate dangerous groups. I probably did appear nervous but I imagine most people who go to a first recruitment meeting with a white nationalist group leader probably would be, so I didn’t think that it would be too suspicious.
What motivates people to join online extremist networks?
In many cases, they offer an escape from loneliness and a solution to grievances or fears. A lot of the time it was a fear of a relative loss of status, which the networks blamed on migration and changing demographics. They offered easy explanations, oversimplified rationalizations, for complex social and political issues.
The networks also offered support, consolation and counseling. They can turn into a kind of family. Some people spend so much time online that I doubt they socialize in the real world.
What were the recurring themes among users? Can you profile a recruit?
On the surface, there was no clear profile. Users were from different age groups, social classes, educational backgrounds and — depending on the group — different ethnic backgrounds. The lowest common denominator was people who were in a moment of crisis. The recruiters did a good job of tailoring their propaganda to pick up vulnerable individuals. The Tradwives reached women who had relationship grievances, Islamist extremists recruited alienated Muslims who’d experienced discrimination, and white supremacists exploited people who had security concerns.
It was a major part of the recruiters’ strategy. White supremacist networks, like the European far right, have a clear step-by-step radicalization manual, which they call “recruiting strategies.” The Tradwives, for example, made themselves seem like a self-help group, and I think that’s what attracted women from different ideological backgrounds, even those who don’t subscribe to traditional gender roles.
How do online extremist networks operate?
Some groups, the European Trolling Army for instance, had tightly organized hierarchical structures. Neo-Nazi groups often have military-like structures, with positions even named after military ranks, and a person could rise to the top by running hate campaigns against political opponents.
Other networks, like the ones used by the perpetrators of the Christchurch attack and the attack in Halle, Germany, last October, had looser structures. They would get together on an opportunistic basis, when they saw that something could be gained by cross-border cooperation. They use their own vocabulary and insider references when they decide to collaborate on a campaign or a media stunt; The Matrix is one of many internet-culture references, which range from Japanese anime to Taylor Swift. And they could be very effective at advancing these operations.
How is the new far right, like GI, different from the traditional far right in the language it uses?
Far-right groups have undergone a rebranding and have reframed the ideas held by traditional neo-Nazis. Generation Identity uses euphemisms like “ethnopluralism” instead of “racial segregation” or “apartheid,” and combines video game language with racial slurs, creating its own satirical language.
Not only are extremist groups better at spreading their real ideologies behind satirical memes, they’re also being given a platform by politicians. Language that mirrors the rhetoric of conspiracy theories like the Great Replacement is retweeted by politicians and repeated in their campaigns. This is likely to become more prevalent in the next few months in the run-up to the U.S. presidential election. The 2016 U.S. election proved to be one of the key turning points in uniting far-right groups globally.
You write that the Europe-based Generation Identity receives global support on Twitter, with U.S. alt-right vloggers playing a big role in making their hashtags trend. This also translates into financial support. The ISD found that most of the 200,000 euros that GI received in donations for its “Defend Europe” campaign came from U.S. sources, despite the campaign’s exclusive focus on European borders. What is the extent of transatlantic cooperation?
Transatlantic cooperation between the far right in Europe and the alt-right in the U.S. has been growing. Some of the ideologies that inspired GI and other far-right groups have been propagated by leading far-right figures in the U.S., and the European far right has adopted some of the gamification and propaganda strategies used by the American alt-right. They both see themselves as fighters in a war against white genocide, or the Great Replacement, and there is a loyalty between them that makes the idea of ultranationalism obsolete.
How do you see online extremism developing?
One of the biggest problems is in the infrastructure of social media and tech companies. Algorithms give priority to content that maximizes our attention and to content that causes anger and indignation. It’s like handing a megaphone to extremists. It has allowed fringe views to reach a much bigger audience. Developments in deepfakes, cyber warfare and hacking campaigns are likely to help extremists refine their strategies.
What kind of approach is needed to stop the spread of online extremism?
First, we need a global legal framework that forces all the tech companies, not just the big ones but also the fringe networks like 8chan and 4chan, to remove content that could inspire terrorism. After the shootings in Christchurch and Halle, the documents, the “manifestos,” left behind by the perpetrators were translated into several languages and shared in the fringe corners of the internet. We need a global approach because people can always find a way to circumvent national laws.
But content removal alone won’t work. In my book, I suggest 10 solutions for 2020. These include more digital literacy programs in education settings, which can enhance critical thinking skills, help internet users spot manipulation and ultimately weaken extremists. We also need more deradicalization projects that use social media analysis to identify and engage with radicalized individuals. Counter-disinformation initiatives, with the help of fact-checkers and social media campaigners, could be formed, as has been done in the Baltics, to debunk online manipulation.
Technology and society are intertwined, so our response has to be integrated. We need an alliance that includes not only politicians and tech firms but also civil society and social workers.