Facebook could have prevented billions of views on pages that shared misinformation related to the 2020 U.S. election, according to a new report released Tuesday, which slams the platform for “creating the conditions that swept America down the dark path from election to insurrection.”
The report, by the online advocacy group Avaaz, found that if Facebook had not waited until October to tweak its algorithms to stem false and toxic content amplified on the platform, the company could have prevented an estimated 10.1 billion views on the 100 most prominent pages that repeatedly shared misinformation on the platform ahead of the election.
For much of the summer of 2020, at the height of anti-racism protests and amid a surge in COVID-19 cases, data from Avaaz shows that the top 100 “repeat misinformers” received millions more interactions on Facebook than the top 100 traditional U.S. media pages combined.
“The scary thing is that this is just for the top 100 pages—this is not the whole universe of misinformation,” says Fadi Quran, a campaign director at Avaaz who worked on the report. “This doesn’t even include Facebook Groups, so the number is likely much bigger. We took a very, very conservative estimate in this case.”
Avaaz defined the top 100 repeat misinformers as pages that had shared at least three pieces of misinformation (as defined by Facebook’s own third-party fact-checkers), including two within 90 days of each other. On average, the top 100 misinformers shared eight confirmed pieces of misinformation each—and refused to correct them after they were labeled by Facebook-affiliated fact-checkers. “Fact-checkers have limited resources, and can only fact-check a subset of misinformation on Facebook,” the report said, explaining the methodology as a way of identifying pages that were “highly likely to not be seeking to consistently share trustworthy content.”
In a statement, Facebook spokesperson Andy Stone disputed the report’s methodology. “This report distorts the serious work we’ve been doing to fight violent extremism and misinformation on our platform,” he said. “Avaaz uses a flawed methodology to make people think that just because a Page shares a piece of fact-checked content, all the content on that Page is problematic.”
Stone said Facebook has “done more than any other Internet company to combat harmful content,” banning militarized social movements including QAnon, and removing millions of pieces of misinformation about COVID-19 and election interference. “Our enforcement isn’t perfect, which is why we’re always improving it while also working with outside experts to make sure that our policies remain in the right place,” he said.
The 10.1 billion figure is intentionally broad, meant to “show Facebook’s role in providing fertile ground for and incentivizing a larger ecosystem of misinformation and toxicity,” Avaaz said. But the group also quantified the views accumulated by the 100 most popular pieces of content ahead of the election that were flagged as false or misleading by Facebook-affiliated fact-checkers: 162 million.
Zuckerberg heading to the Hill
The report’s findings increase pressure on Facebook CEO Mark Zuckerberg ahead of an important week in Washington. On Thursday he, along with Twitter’s Jack Dorsey and Google’s Sundar Pichai, will face Congress for the first time since the storming of the U.S. Capitol by angry Trump supporters and extremists on Jan. 6—an event that was partly planned on their platforms. Two subcommittees of the House Energy & Commerce Committee are expected to grill them on how their algorithms amplify disinformation and allow the spread of extremist ideologies.
Avaaz also said it found 118 pages, with nearly 27 million followers, still active as of March 19 on the platform, that had shared what the group said was “violence-glorifying content” related to the election. The group said that 58 were aligned with QAnon, anti-government militias, or Boogaloo, a far-right movement based on the idea of an impending civil war.
The posts included calls for “armed revolt,” memes about ambushing National Guard members to steal their ammunition, and other violent threats, according to Avaaz. All 118 of the pages were reported to Facebook by Avaaz during the election cycle, Quran said. Facebook removed 18 of them, including 14 after receiving an advance copy of the report on March 19, said Stone, the Facebook spokesperson. The rest did not violate Facebook’s policies, he said.
The report’s findings illustrate how quickly movements like QAnon and “Stop the Steal” groups, which connected ordinary Americans, political activists and far-right extremist groups in the same online ecosystem, were able to grow before Facebook took action. By the time it removed some of the largest QAnon groups last summer and fall, the movement was far too large to be contained, and its followers simply moved to other platforms like Parler, Telegram and Gab, where some went on to organize for the Jan. 6 insurrection.
Many of the lawmakers set to grill Zuckerberg in Washington on Thursday have signaled that after years of similar hearings, their patience is wearing thin. “We’ve had Mark Zuckerberg in front of the committee, and he gives us superficial answers and a sad face, but he doesn’t go back to the drawing board,” says Rep. Tony Cardenas, a California Democrat on the committee. He says he intends to use his time to question Zuckerberg about Facebook’s failure to stem the spread of Spanish language disinformation and conspiracy theories. “The bottom line is: he knows and he’s acknowledged with us that they can improve, but they don’t invest in those improvements.”
Will social media face tougher regulation?
The debate over accountability, content moderation, online misinformation and data privacy issues is likely to take center stage in other ways on Capitol Hill in the coming months as well. Democrats have indicated that they intend to make oversight of social media companies a top priority. Sen. Chris Coons, a Delaware Democrat who was named the new chairman of the Senate Judiciary Committee’s Subcommittee on Privacy, Technology, and the Law, has said he also expects to call on Zuckerberg and Twitter CEO Jack Dorsey to testify before his panel. Unlike Republicans, who spent hours in previous hearings pressing social media executives on alleged anti-conservative bias, Democrats plan to focus on the platforms’ role in allowing disinformation, hate speech and violent incitement by extremist groups to go unchecked.
Democrats including President Joe Biden have suggested revoking or rewriting Section 230 of the Communications Decency Act, a key legal provision that protects tech platforms from lawsuits for content posted by their users. Advocates for reform say that the law should be amended to make platforms more legally accountable for content including misinformation and incitement to violence. Although Facebook has publicly said it welcomes Section 230 reform, hostile lawmakers could make changes that would increase the company’s costs or force it to rethink its business model.
Facebook stepped up its efforts to reduce the reach of repeat sharers of misinformation only in the weeks leading up to the 2020 election, according to Avaaz. In October, it banned calls for coordinated interference at polling stations and posts using “militarized language” meant to intimidate voters. This included words like “army” or “battle,” Facebook’s vice president for content policy, Monika Bickert, told reporters at the time. The move came after then-President Trump’s son, Donald Trump Jr., was featured in campaign videos calling for “every able-bodied man and woman to join Army for Trump’s election security operation” to “defend their ballots.”
At the time, Facebook touted these last-minute changes as decisive actions. The company also said in October that it would display information about how to vote at the top of users’ feeds and add fact-checking labels to false information about the voting process or premature claims of victory by candidates.
The Avaaz report says even these late measures were implemented inconsistently, allowing millions of views on posts that slipped through the cracks between October and Election Day. It also found that copycats of misinformation posts that Facebook’s own fact-checking partners had debunked went undetected by the company’s AI, accumulating at least 142 million views.
“The message I have for Mark Zuckerberg is that Facebook needs to stop publicly scoring its own exams, and allow experts and democracies to audit the platform,” Quran says. “It’s time for Zuckerberg to stop saying ‘sorry,’ and start investing in proactive solutions to these problems.”