It turns out you don’t need much to meddle in a U.S. election. Some cheap cell phones. An Internet connection. Maybe a few airline tickets and a good grasp of the English language. That was enough for the Russian troll farm to get started on its U.S. operation back in 2015. And its trolls achieved what they set out to do.
Thirteen of them, mostly errand runners for the group known as the Internet Research Agency, have been charged with allegedly trying to skew the U.S. electoral process. The indictment against them, handed down on Friday by Special Counsel Robert Mueller, reads like both a warning and a potboiler. But it could also serve as an instruction manual, one that any determined group could use to replicate the operation. This is clearly not what the Special Counsel intended.
When it comes to catching criminals and deterring copycats, the indictment may yet succeed. It might at least become harder for the Internet Research Agency to recruit new trolls around its home base in St. Petersburg, especially now that some of them are wanted by the FBI and unable to travel outside Russia without fear of arrest and extradition. Their summer holidays may now be limited to the beaches of Sochi and Crimea.
But for the broader aims of the troll factory and its investors, the indictment could serve as a victory in disguise. Apart from providing a blueprint for their methods, it may further diminish public trust in the platforms people use to receive information, share ideas and engage in civic discourse. Spreading that kind of doubt has been the aim of Russian propaganda for years.
“It does not function like traditional propaganda,” says David Patrikarakos, the author of War in 140 Characters, a recent book on modern information warfare. It doesn’t seek to promote any ideology or convince people to join any single cause. Instead, says Patrikarakos, “It tries to muddy the waters. It tries to sow as much confusion and as much misinformation as possible, so that when people see the truth, they find it harder to recognize.”
Take, for example, one of the troll factory’s earlier campaigns in Russia, the one that followed the murder of Boris Nemtsov. On February 27, 2015, the Russian dissident and former Deputy Prime Minister was shot in the back while walking home a few steps from the Kremlin walls. Suspicion among his allies soon fell on the man he had spent his career trying to unseat: President Vladimir Putin, who denied any involvement.
The day after the killing, the staff at the Internet Research Agency received detailed instructions on how to spin the news. Their orders were to flood Russian news websites and social media with comments about Nemtsov’s killing, all in the hope of confusing the online discussion about who was responsible. “Technical instructions for Feb. 28,” the orders began, according to a copy that was later leaked to local journalists. “Create the opinion that Ukrainians could have been mixed up in the death of the Russian opposition figure.”
Other theories spouted that week by the Agency’s trolls put the blame on Nemtsov’s girlfriend, his fellow dissidents, his American allies and his former business partners. They did not focus on dispelling the notion that Putin or his allies could have been involved. They simply crowded the debate with so many theories and alternative facts that everything about the case began to seem suspicious. “Next they’ll say that space aliens did it,” Nemtsov’s personal assistant, Olga Shorina, told me after watching these theories spread on social media at the time. “I can’t even look at it anymore.”
About three weeks after Nemtsov’s death – when a decorated veteran of the Russian security services had already been arrested for pulling the trigger – an independent polling agency in Moscow found that only 15% of respondents believed the Russian authorities had been involved. Perhaps even more surprising, the same survey found that only 10% of respondents were even paying close attention to the highest profile political murder of the Putin era. A far larger number had simply tuned out.
The Kremlin’s main propaganda outlets – the television news – no doubt played a more powerful role in shaping public opinion around that case. But the role played by the Internet Research Agency suggested a shift in strategy. Long before Nemtsov’s killing, in 2011, Russia had overtaken Germany as the nation with the highest number of Internet users in Europe. Even then the public was beginning to turn off state TV and go online for uncensored news.
Across Russia, and especially in the big cities, the political debate was also migrating to the Web around that time, much of it to the blogging platform known as LiveJournal, whose audience in Russia had come to rival some of the state-run news networks by 2011 – 5 million Russian accounts with 30 million monthly readers. It wasn’t long before that space also came under attack. In April 2011, hackers targeted not just the blogs of the dissidents and opposition figures who were writing on LiveJournal; they took down the entire service.
“There’s no ideology at play here, unless you want to talk about an anti-blogging ideology,” Alexander Plushchev, one of Russia’s leading tech journalists, told me at the time. “These are clearly just Internet hit men who got the order to take out LiveJournal.” The aim, in other words, was to stop the conversation. And for a little while it worked. The raucous debates on LiveJournal ground to a halt as the site remained inaccessible for days, and many of its users began migrating to Facebook, which is a lot more difficult for hackers to knock offline.
The rise of the Internet Research Agency in 2013 was, at least in part, a reaction to that shift. Its managers recognized that trying to shut down the means of political debate was no longer enough. In the age of social media, people would just find another place to exchange ideas. The best way to stop them would be to infiltrate the discourse itself — and, whenever possible, to fill it with nonsense, conspiracies and lies.
The indictment of the Internet Research Agency shows in minute detail how easily this can be done. Reading through the schemes it describes – the fake accounts the suspects created on social media, the fake activist groups they formed, the fake causes they claimed to champion, and the phony protests they were able to organize in American cities – it is hard to avoid the tug of paranoia, the feeling that the civic discourse in any democracy is vulnerable to sabotage, and that every political statement is worthy of suspicion.
The reaction to such doubts could, in many cases, be a healthy sort of skepticism. It could remind people to check their sources of information and to question the voices that reach them online. But that sort of vigilance is hard to maintain. For many people, the easier option would be to withdraw from the debate for fear of being fooled again. And as the efforts of the Agency’s trolls have shown in the past, that outcome would serve their interests perfectly well.
With reporting by Sandra Ifraimova / New York