A digital forensics group has analyzed the 32 pages and accounts that Facebook recently removed from its platform for “coordinated inauthentic behavior.”
While the company stopped short of connecting the “bad actors” behind those pages to Russia when announcing the removal on July 31, the Atlantic Council’s Digital Forensic Research Lab (DFRLab) says that the accounts were “most probably” being run by a successor to the Russian operation that attempted to influence the 2016 election.
Facebook did identify the pages as part of a political influence campaign being operated ahead of the U.S. midterms. The new analysis, which TIME reviewed, picks apart pages in the campaign, outlining tactics that are “correlated with the behavior patterns and content we saw from the infamous St. Petersburg troll factory,” says the DFRLab’s Graham Brookie. Chief among those tactics, he says, was an attempt to amplify existing political discord in the United States. The pages posted content related to topics such as race, feminism and immigration.
The DFRLab, a non-partisan center established in 2016 within the Atlantic Council, a foreign policy think tank, has partnered with Facebook to analyze abuse on the platform. In the analysis, its experts note that the pages in the latest campaign not only had direct contact with accounts previously identified as part of the Russian Internet Research Agency but shared similar content and made similar grammatical errors in posts.
In one instance, a Facebook account (@warriorsofaztlan) had a very similar name to an account that Twitter identified as part of the Russian troll farm (@warriors_aztlan). Both were set up in the same month in 2017 and shared similar content related to “white crimes against Native Americans,” according to the analysis.
Though many posts shared by the accounts contained content that had been taken from elsewhere on the Internet — a technique that may help trolls better hide their identities — some original writing contained errors a native English speaker would be unlikely to make. One post referenced “a needed and important humans,” while another contained the muddled phrase “Since the beginning of the times.”
Most alarming, Brookie says, is that the latest campaign appeared to double down on the technique of turning online political debate into real-world protests. In several instances, a page titled “Resisters” created anti-Trump events and engaged legitimate political organizations in helping to promote them among thousands of social media users. The events had titles such as “Stop Ripping Families Apart,” an apparent reference to Trump’s now-defunct family separation policy, and “The Trump Nightmare Must End.” The inauthentic administrators attempted to engage users on a range of issues, including the transgender military ban, white supremacy and rape culture.
Facebook has said its own investigation into who is behind the campaign is ongoing. The DFRLab’s analysis is independent of work the tech company is doing to track disinformation agents on its platform, though Facebook is in contact with the organization and has dedicated funding to the DFRLab’s work on elections.
Brookie emphasizes that the analysis does not amount to “a hard ‘yes’” on whether the pages are definitely connected to Russia. The similarities could be a coincidence or the work of bad actors copying the techniques that Russian agents used.
“It’s nearly impossible to say with 100% degree of confidence that this was a Russian intelligence operation,” Brookie says, “but what we can say is it looked and acted much like Russian influence operations that we’ve seen before.”
If it is the same troll farm, both Facebook and the DFRLab say it’s doing a better job of covering its tracks than in years past, perhaps adapting to information that tech companies have shared about how they have tracked such operations. “They’re not paying for political ads in roubles anymore,” Brookie says.
Publishing a detailed analysis of the latest campaign, Brookie acknowledges, might serve as a kind of “manual to heighten their operation security.” But he says that raising awareness around the tactics that trolls are using to exacerbate political tensions, and to draw people into the street, is crucial in fighting disinformation.
“We create more resilience,” he says, “when people are more aware.”