
Facebook’s Oversight Board Is Reviewing Its First Cases. Critics Say It Won’t Solve the Platform’s Biggest Problems


When Facebook founder Mark Zuckerberg laid out his plan in 2018 for an independent oversight board to review employees’ decisions on controversial posts, he said the process would ensure the company was making decisions in the best interests of its community.

Now the Oversight Board, whose 20 members include lawyers, former heads of state, media professionals and human rights experts, has picked its first cases to rule on.

Some activists cautiously welcomed the board as a step in the right direction for transparency, but critics say its powers are too limited to take on Facebook’s big-picture problems: controlling the spread of misinformation and hate speech.

And at least one group that seeks to hold Facebook accountable says the Oversight Board’s cases show it will not actually address the disinformation, including about COVID-19 and the 2020 U.S. election, that is rampant on the social media platform, which has an estimated 2.7 billion users worldwide.


The six cases selected by the Oversight Board touch on several areas that Facebook has struggled to moderate in 2020. There are two pieces of alleged hate speech, one piece of alleged COVID-19 misinformation, one example of alleged incitement to violence, and one quote from a person deemed a “dangerous individual” by Facebook. There is also a dispute over a visible female nipple in a post about breast cancer. The board is not yet considering any posts that contain misinformation about the 2020 U.S. election.

Here’s what to know about the Facebook Oversight Board and the first cases it selected.

What is the Facebook Oversight Board?

The Facebook Oversight Board is an independent body set up by Facebook in May to review decisions to take down controversial posts.

Facebook executives hope the Oversight Board, a Supreme Court-like body, will relieve some of the pressure that has built over years of criticism of the company’s enforcement (or lack of enforcement) of its rules on hate speech and misinformation, among other issues.

The Board is “building a new model for platform governance,” Helle Thorning-Schmidt, one of its four co-chairs and a former Prime Minister of Denmark, told reporters on a call in May when the board members were announced.

Former Prime Minister of Denmark Helle Thorning-Schmidt speaks at the first-ever #SheIsEqual Summit on September 28, 2018 in New York City. (Leigh Vogel/Getty Images for Procter & Gamble)

But the board’s powers are limited. It can only rule on whether posts have been wrongly taken down, not on ones that are allowed to remain—though it says it will begin doing this in the future. It can only interpret Facebook’s current rules, not set new ones.

It also has no subpoena power to request internal documents from Facebook—for example, in cases where complaints have been made about staff decision-making, a spokesperson for the board confirmed to TIME.

And so far, it has selected only six cases out of more than 20,000 reported by users since submissions opened in October, and it has given itself three months to deliberate. The goal, the board said, is to set precedents by ruling on cases that are likely to affect many users.

“We don’t know if this board is actually going to make any sort of difference on the policy and behavior of the platform until they decide their first cases and we see how Facebook reacts,” says Dia Kayyali, associate director of advocacy at Mnemonic, a human rights group.

What are the cases the Facebook Oversight Board is considering?

The board shared details about its first six cases on Dec. 1, but did not release specifics that would identify the users involved.

Five of the Oversight Board’s cases are:

  • A case where a user shared two photos of a dead child, along with commentary asking why China’s treatment of Uyghur Muslims has not been met with retaliation.
  • A case of a user accusing Azerbaijanis of demolishing Armenian-built churches, alongside a caption indicating disdain for Azerbaijani people.
  • A case where a user shared photos of breast cancer symptoms, some of which included a visible female nipple, in what they said was an effort to raise awareness.
  • A case where a user shared a quote by Joseph Goebbels, the Nazi propaganda head, about the importance of appealing to emotions rather than intellect and the unimportance of truth in political propaganda. The user said the quote was meant as commentary on current U.S. politics.
  • A post within a Facebook group criticizing France’s COVID-19 strategy and claiming that hydroxychloroquine and azithromycin, drugs unproven as COVID-19 treatments, are cures for the disease.
All five posts were originally deleted by Facebook moderators, decisions that the users behind them argue were unfair.

In a sixth case announced Tuesday, a user had shared comments made by former Malaysian Prime Minister Mahathir Mohamad saying Muslims have the right to kill millions of French people—comments which were removed by Facebook for violating hate speech rules. The user posted the message without comment, in what they said was an effort to raise awareness of the “horrible words” Mohamad had used.

But on Friday, the Oversight Board issued an update saying that it could no longer consider the case involving messages by Malaysia’s former leader because the post in question had been removed by the user who posted it, making it ineligible for review.

The Oversight Board replaced that case with a post involving alleged incitement to violence in India: a photo of a man with a sword, with a caption “that discusses drawing a sword from its scabbard in response to ‘infidels’ criticizing the prophet.” The accompanying text appears to refer to recent comments by President Emmanuel Macron of France, which some Muslims considered Islamophobic. Facebook removed the post for incitement to violence, saying it contained a “veiled threat” against Macron.

Experts questioned the limited information that the Oversight Board has made available about each case. “I am pretty concerned that ultimately we are not going to have a good sense of what information they’re basing the decisions on,” Kayyali says. “Seeing it in action, it’s difficult to compare it to a court if this is the level of information that is provided.”

The board’s first six cases highlight some of the tricky decisions that Facebook’s moderators regularly face. Should a user be allowed to share a politician’s hate speech with the intent of highlighting how dreadful it is, even if the user doesn’t post any accompanying information? Should photographs of deceased children be allowed if they are used to drive home a political point? Where should the line be drawn between misinformation and the criticism of a national government’s COVID-19 policy?

“This model of independent oversight represents a new chapter in online governance, and we’re committed to implementing the board’s decisions,” said Brent Harris, Facebook’s director of governance and global affairs, in a statement. “We look forward to the board’s first decisions, which should be issued in the months to come.”

Who’s criticizing the Facebook Oversight Board?

Earlier this year, a group of tech policy experts and activists announced they were setting up an alternative group—confusingly called the “Real Facebook Oversight Board”—to hold Facebook to account over a process that they see as flawed.

The group has welcomed the added transparency around an otherwise opaque moderation process, but says the board is more of a publicity stunt than a solution to misinformation and hate speech, which it says remain rife on Facebook’s platforms.

“This is not an effort to actually fix Facebook’s problems in real time,” says Jessica González, who sits on the alternative board and is the co-CEO of the non-profit Free Press, which campaigns for media and Internet freedoms.

“It’s important for Facebook to have a process to review whether it’s appropriately taking down content,” she says. “But it’s not addressing the major problems that we’re seeing on Facebook, which are rampant amounts of disinformation about [COVID-19] and the U.S. election. Nothing comes before that board unless it has already been taken down from Facebook, and of course there is loads of content that remains on the platform.”

Posts casting doubt upon the results of the 2020 election are still widely available, and so is misinformation about COVID-19. While Facebook labeled 180 million posts ahead of Nov. 3 as election-related misinformation, and has flagged a similar number of posts for falsehoods about COVID-19, executives have resisted taking stricter measures to crack down on what even many of Facebook’s own employees see as content that is harmful to democracy and public health, according to a Nov. 24 New York Times report that cited unnamed employees.

In a stunt announced the same day the Facebook Oversight Board revealed its first cases, the alternative board said its members would hear three cases of their own, each concerning a decision that under current rules would be ineligible for consideration by Facebook’s board.

The first is whether to ban Steve Bannon for a post referencing the beheading of Dr. Anthony Fauci, the U.S. government’s leading infectious disease expert. The other two are the case of a Vietnamese activist whom Facebook banned after a request from the country’s government (a case ineligible for review by the Oversight Board), and a specific Facebook group where, according to the activists, COVID-19 misinformation is rampant. (Under current rules, the Oversight Board cannot rule on whether specific groups should be allowed on the platform.)

“Our goal is a democratically accountable Facebook,” a spokesperson for the activists’ alternative board said in a statement. “There is so much content causing so much harm on the site right now but none of this is eligible to be reviewed by the board. We have established a shadow governance process to hear the cases that Facebook simply will not allow their own Oversight Board to adjudicate.”


Write to Billy Perrigo at billy.perrigo@time.com