Donald Trump will not be getting his social media megaphone back just yet, but a decision by Facebook’s handpicked Oversight Board on Wednesday opens the door for a possible comeback.
Facebook banned Trump indefinitely following his Jan. 6 incitement of supporters to storm the U.S. Capitol building as lawmakers voted to finalize the results of the 2020 U.S. election, which he lost.
On Wednesday, Facebook’s Oversight Board issued a ruling that called the company “justified” in its suspension of Trump’s accounts after the Capitol riot—but said the company must decide whether to permanently ban the former President or give him a path to regaining control of his accounts.
The news that Trump would not be immediately returning to Facebook was swiftly criticized by some conservatives. But the Oversight Board’s decision to kick the responsibility for Trump’s fate back to Facebook also received a lukewarm response from progressives, many of whom have called the Trump case a distraction from the need for government regulation of the platform. Nor is it likely to be happily received by Facebook, which set up the Board in 2020 with the precise intent of offloading controversial decisions to a body it could keep at arm’s length. “In applying a vague, standardless penalty and then referring this case to the Board to resolve, Facebook seeks to avoid its responsibilities,” the Oversight Board said in a statement Wednesday. “The Board declines Facebook’s request and insists that Facebook apply and justify a defined penalty.”
The Oversight Board appeared to focus narrowly on Facebook’s decision to impose an “indefinite” ban. “It was not appropriate for Facebook to impose an ‘indefinite’ suspension,” the Board said. “It is not permissible for Facebook to keep a user off the platform for an undefined period, with no criteria for when or whether the account will be restored.”
In banning Trump “indefinitely” after the Capitol riot, Facebook did not go as far as Twitter, which banned him permanently, leaving no possibility of return. (Like Facebook, YouTube banned Trump only indefinitely, and has said it will reinstate his account when the risk of violence subsides.)
The Oversight Board ordered Facebook to “justify a proportionate response that is consistent with the rules that are applied to other users of its platform” within six months—potentially opening the door to Trump returning to the platform before the end of the year. Facebook could still decide to ban Trump permanently.
Wednesday’s news came a day after Trump launched his own “communications” website, with a feed of posts that are labeled as “from the desk” of the former President. “There won’t be any Big Tech censors trying to muzzle Conservatives for sharing FACTS,” said the Republican National Committee in an email sent to supporters.
Though Trump still wields great influence in the Republican Party, his public profile has been relatively small since he left office and had his social media access cut off—relying on radio and TV interviews and public appearances, which don’t give him the same unfiltered access to millions of people.
“If he is no longer on these platforms, you know, memories fade,” says Shanto Iyengar, a professor of political science and communication at Stanford University. “Trump may just disappear into semi-oblivion and someone else may emerge as the populist standard-bearer for 2024. For a political figure, not being in the limelight is a liability, period.”
Why Facebook stripped Trump of his megaphone
Facebook’s decision to suspend Trump came just days before the end of his presidency, during which he had used Twitter, along with Facebook, Instagram and YouTube, to spray a daily torrent of misinformation—including lies about COVID-19 and baseless claims of election fraud. Instead of banning him for those untruths, the platforms only went as far as labeling some of his statements as misinformation and removing several others. The free speech implications of banning a sitting President outright were clear—and so was the threat of retaliation from Trump, who had long complained of anti-conservative bias at Facebook and other Silicon Valley firms. (In fact, research shows that Facebook tends to benefit far-right voices more than other news sources.)
But after a mob stormed the U.S. Capitol, each of the big platforms finally ejected the President, casting their decisions as last-ditch measures aimed at protecting American democracy. “We believe the risks of allowing President Trump to continue to use our service during this period are simply too great,” Facebook said in a statement at the time, announcing the President’s ability to post new content would be suspended indefinitely. (His pages and old posts have remained accessible throughout the suspension.)
On Jan. 21, Facebook referred the Trump case to its new Oversight Board, which the company set up in May 2020 to rule on its most controversial decisions.
Announcing the decision, Nick Clegg, Facebook’s vice president for global affairs, cast the decision as good for democracy. “Many argue private companies like Facebook shouldn’t be making these big decisions on their own. We agree,” he said, adding that the company believed it would be better for lawmakers to set the rules. “But in the absence of such laws, there are decisions that we cannot duck. This is why we established the Oversight Board.”
So, what is the Facebook Oversight Board, and who gave it the right to make such a momentous decision?
What is the Facebook Oversight Board?
Mark Zuckerberg first publicly floated an idea for a Supreme Court-style body for Facebook in 2018. “I’ve increasingly come to believe that Facebook should not make so many important decisions about free expression and safety on our own,” he said in a blog post.
In May 2020, that idea became reality when Facebook announced the creation of an Oversight Board, whose 20 members have experience in fields including government, media, constitutional law, and human rights.
The Board is funded by a $130 million trust, set up by Facebook. The trust pays each of the Board’s members a six-figure sum, according to the New York Times. Facebook says the Board is legally independent, and that its rulings will be both binding and transparent.
In January, the Board ruled on its first six cases. It overruled Facebook’s original decision in five of them, forcing Facebook to reinstate content that it had removed.
Facebook is bound by the Board’s bylaws to abide by its decisions. But there are several limits on what the Board can currently rule on. It cannot tell Facebook to remove Groups, just individual pieces of content or pages. And it cannot tell Facebook to change the algorithms that decide which content is amplified in users’ newsfeeds.
Already, at least one member of the Facebook Oversight Board has publicly criticized its limited remit. “I think the Board will want to expand in its scope. I think we’re already a bit frustrated by just saying take it down or leave it up,” Alan Rusbridger, former editor of the U.K.’s Guardian newspaper, told a parliamentary committee in March. He also said the Board would eventually demand to see Facebook’s content-ranking algorithms: “At some point we’re going to ask to see the algorithm, I feel sure,” he said. “Whether we can understand it when we see it is a different matter.”
While some observers have welcomed the increase in transparency, the Oversight Board has also come in for criticism. One leading critic is Rashad Robinson, president of the civil rights group Color of Change. “Zuckerberg and Facebook want us to believe they’ve given real power to the Oversight Board, when in fact they have essentially made these people, who have deep credibility and years of work, into hall monitors,” he says.
“Trump’s absence from Facebook has created more room for conversations that are not centered around Trump. But this is not about Trump,” Robinson says. “It’s about platform design, and the incentive structure that benefits from a Trump. It’s like thinking we’ve dealt with racial injustice in policing after George Floyd’s murderer Derek Chauvin was convicted. That doesn’t change the incentive structure. So we’ve ended up with a charitable solution to a structural problem.”
What the Trump ruling means for other politicians on Facebook
Along with its ruling, the Oversight Board issued several non-binding recommendations to Facebook on how it should deal with violations of its rules by political leaders.
In a significant recommendation, the Board called on Facebook to “rapidly escalate content containing political speech from highly influential users to specialized staff who are familiar with the linguistic and political context. These staff should be insulated from political and economic interference, as well as undue influence.”
That recommendation comes after reports in the New York Times that Facebook’s policy staff in Washington, whose job responsibilities included government relations, repeatedly rejected or watered down several proposals aimed at reducing misinformation due to concerns about the measures disproportionately impacting conservatives. (Facebook denied the reports.)
In its offices around the world, including in countries with authoritarian leaders, the Facebook staff responsible for enforcing the site’s rules are also tasked with maintaining close relationships with governments and fending off punitive regulations.
The Board also stressed that the “same rules should apply to all users” of Facebook, but said Facebook should “act quickly” to enforce its rules against influential users, especially when their posts “pose a high probability of imminent harm.”
“The Board noted that heads of state and other high officials of government can have a greater power to cause harm than other people,” the Board said. “If a head of state or high government official has repeatedly posted messages that pose a risk of harm under international human rights norms, Facebook should suspend the account for a period sufficient to protect against imminent harm.”
“Really, the specifics of how the Board made the decision are more important than the headlines about Trump,” David Kaye, the former United Nations special rapporteur on freedom of expression, told TIME before the ruling.
“This is private oversight, this isn’t public oversight,” Kaye said. “We are living in a moment when there is real pressure for public regulation, and this isn’t it—this is self-regulation.”
How did the Oversight Board make its decision on President Trump?
In making its decision on Trump, the Oversight Board was confined to a narrow question: had the former President broken any of Facebook’s rules, and if so, was his ban justified?
To deliberate that question, the Board followed the same process it uses for all its cases. First, it randomly selects a panel of five members, including at least one from the country where the content originates (in this case, the U.S.). The identities of the panel members are not disclosed to the public.
The Board calls for evidence from the people involved; in the case of Trump, both Facebook and the former President submitted written arguments. (A spokesperson for the Board declined to comment when asked by TIME in February if Trump’s statement would ever be made public.) For each case, the Oversight Board also calls for input from the public. The Trump case garnered more than 9,000 public comments—many times more than all the Board’s previous cases combined.
The panel then looks at how the decision fits with Facebook’s existing policies. When panel members come to an agreement, they submit the decision to the rest of the Board’s members, who vote. If the ruling receives a simple majority, the Board announces its decision.
By putting the hot-button issue of Trump’s access to Facebook back in the company’s hands, the Oversight Board is forcing the social media giant to once again address an issue it had hoped was settled. “We look forward to receiving the Board’s decision,” Clegg said in January. “We hope, given the clear justification for our actions on January 7, that it will uphold the choices we made.”