
Facebook Has Finally Banned Holocaust Denial. Critics Ask What Took Them So Long


Facebook updated its rules on Monday to explicitly ban any content that “denies or distorts” the Holocaust, after years of allowing people to deny that the genocide occurred.

The move reverses Facebook’s previous stance, which CEO Mark Zuckerberg articulated over years of interviews: he did not want his company to be an arbiter of truth.

“I’m Jewish, and there’s a set of people who deny that the Holocaust happened. I find that deeply offensive. But at the end of the day, I don’t believe that our platform should take that down because I think there are things that different people get wrong,” he told Vox’s Recode in 2018.

Zuckerberg’s position, and Facebook’s, has “evolved” since then, he said in a Facebook post published Monday. “I’ve struggled with the tension between standing for free expression and the harm caused by minimizing or denying the horror of the Holocaust. My own thinking has evolved as I’ve seen data showing an increase in anti-Semitic violence, as have our wider policies on hate speech.”

“Our decision is supported by the well-documented rise in anti-Semitism globally and the alarming level of ignorance about the Holocaust, especially among young people,” said Facebook’s vice president of content policy, Monika Bickert, in a statement.

Civil rights groups welcomed the news, but questioned Facebook’s timing. “As Facebook finally decides to take a stance against Holocaust denial and distortion, they claim it is because of their work with the Jewish community over the past year,” said Jonathan Greenblatt, the CEO of the Anti-Defamation League (ADL), in a statement. “We question this claim because if they had wanted to support the Jewish community, this change could have been implemented at any point in the last nine years.”

Although Facebook has over the past few years gradually imposed new guidelines on hate speech and content that the company defines as having the potential to incite violence, it had until recently largely stayed away from making decisions about individual conspiracy theories or claims to truth. (Last year, it said it would not fact-check political ads placed by politicians or pressure groups on its platform, for example.) The first sign of a shift came as the coronavirus pandemic spread around the world, when Facebook announced it would limit the spread of COVID-19 misinformation on its platform. Then, in August, Facebook announced it would begin to treat conspiracy theories about Jewish people “controlling the world” as bannable hate speech. And on Oct. 7, the company announced it would ban QAnon, a sprawling, false conspiracy theory with anti-Semitic elements that claims President Trump is bringing an elite cabal of child molesters to justice. Less than a week later, it banned Holocaust denial too.

“I half-heartedly applaud the move,” says Yael Eisenstat, Facebook’s former head of global elections integrity for political ads, who left the company in 2018. “The fact that Zuckerberg has finally, after years of advocacy from anti-hate groups like the ADL and others, accepted that Holocaust denial is a blatant anti-Semitic tactic is, of course, a good thing. The fact that it took him this long to accept that these organizations had more experience than him and knew what they were talking about is dangerous.”

Notably, Facebook’s statement also did not mention the role that its own platform has played in allowing hate speech, including anti-Semitism, to propagate more broadly.

“More importantly, [Zuckerberg] still seems to ignore why Facebook is so ripe for spreading hate speech and disinformation to begin with,” she tells TIME. “If he does not accompany this decision with what so many have been calling for, a complete retooling of how the business model works, then it will just be another whack-a-mole content moderation plan without changing any of the core mechanisms that encourage and amplify this kind of behavior.”

Read more: Sacha Baron Cohen: We Must Save Democracy From Conspiracies

That retooling of the platform, critics hope, would see Facebook change the criteria it uses to decide whether a post should be shown to more people. Currently, it prioritizes “engagement,” or how many people interact with a post, which critics say gives the most inflammatory statements an unfair advantage, contributing over the long term to a coarsening of public debate.

“Facebook could, if they wanted to, fix some of this,” Eisenstat also said in a TED Talk published in August. “They could stop amplifying and recommending the conspiracy theorists, the hate groups, the purveyors of disinformation and, yes, in some cases even our president. They could stop using the same personalization techniques to deliver political rhetoric that they use to sell us sneakers. They could retrain their algorithms to focus on a metric other than engagement, and they could build in guardrails to stop certain content from going viral before being reviewed. And they could do all of this without becoming what they call the arbiters of truth.”


Write to Billy Perrigo at billy.perrigo@time.com