
Facebook and Twitter Finally Locked Donald Trump’s Accounts. Will They Ban Him Permanently?


Twitter and Facebook imposed their toughest restrictions so far on President Donald Trump on Wednesday evening, after he incited his supporters to storm the U.S. Capitol in Washington in an attempt to overturn his election loss. Both companies temporarily suspended the President from posting on their platforms and removed several of his posts, but stopped short of permanently banning him.

The account suspensions are the farthest either platform has gone in restricting Trump from broadcasting his message directly to his tens of millions of followers. The moves come after years of calls for social media companies to do more to stop the President from spreading misinformation, conspiracy theories and threats that undermine democracy.

Twitter required Trump to delete three tweets that the company said violated its rules, and said it would suspend his account from posting for 12 hours after their removal. “If the Tweets are not removed, the account will remain locked,” Twitter said in a statement. The tweets are no longer visible on his profile. The company also said that if Trump violated its rules again, his account would be permanently banned.

After Twitter acted, Facebook suspended Trump from posting for 24 hours from Wednesday evening, and deleted two posts it said violated its rules. Instagram, owned by Facebook, did the same. Then, on Thursday, Facebook CEO Mark Zuckerberg said in a statement that Facebook would be extending the block on Trump indefinitely, and for at least two weeks until “the peaceful transition of power is complete.”


Twitter, Facebook and YouTube also removed a video posted by Trump in which he called on rioters to go home, but doubled down on his false claims that the election was stolen and told rioters he loved them. “This is an emergency situation and we are taking appropriate emergency measures, including removing President Trump’s video,” Facebook’s chief of safety and integrity, Guy Rosen, said in a statement. “We removed it because on balance we believe it contributes to rather than diminishes the risk of ongoing violence.”

A pro-Trump mob enters the Capitol Building after breaking into it on January 6, 2021, in Washington, D.C. Jon Cherry/Getty Images

Facebook and Twitter have long had rules against inciting violence on their platforms, but throughout Trump’s presidency they refused to suspend or ban him, even in cases where critics said he had fanned the flames of violence.

As Black Lives Matter protests spread around the country in late May 2020 after the police killing of George Floyd, Twitter limited the ways users could engage with a tweet by Trump that said “when the looting starts, the shooting starts,” which the company said violated its rules on incitement to violence. But it allowed the tweet to remain accessible behind a warning message, saying in a statement that it “determined that it may be in the public’s interest for the Tweet to remain accessible.” Facebook, meanwhile, refused to take any action against Trump’s post, prompting some employees to stage a walkout.


Chief among the platforms’ reasons for not banning Trump at that point was that, as President, his words were inherently worthy of public attention, scrutiny and discussion. That appeared to change after Wednesday’s assault on the Capitol. “There have been good arguments for private companies to not silence elected officials, but all those arguments are predicated on the protection of constitutional governance,” said Alex Stamos, Facebook’s former chief security officer, in a tweet on Wednesday shortly before Facebook and Twitter temporarily suspended Trump’s accounts. “The last reason to keep Trump’s account up was the possibility that he would try to put the genie back in the bottle but as many expected, that is impossible for him.”

The events of January 6, 2021 were, to the tech platforms at least, a glaring sign that the risk of violence, and the threat to democracy, now outweighed the case for continuing to give a sitting President a platform to speak. And for platforms steeped in the very American notion that freedom of speech is core to democracy, it was a belated acknowledgement of a key lesson from history: that sometimes a democratically elected leader can intentionally undermine democracy with inflammatory speech and a large platform. Many democracies carve out exceptions to free-speech rules to prevent exactly that. Facebook and Twitter have similar rules for most ordinary users, who can be banned for inciting violence. Until now, Trump has gotten away with little more than a slap on the wrist for using the platforms to do the same.

Zuckerberg justified waiting until only 13 days were left in Trump’s presidency to suspend him, saying Wednesday’s events had changed things. “Over the last several years, we have allowed President Trump to use our platform consistent with our own rules, at times removing content or labeling his posts when they violate our policies,” he said Thursday. “We did this because we believe that the public has a right to the broadest possible access to political speech, even controversial speech. But the current context is now fundamentally different, involving use of our platform to incite violent insurrection against a democratically elected government.”

Activists said the bans should just be the start of a tougher approach to big tech regulation by the Biden Administration. “The riots in D.C. yesterday demonstrate very clearly the consequences of disinformation amplified in social media with such incessant frequency that it becomes an alternative reality for those targeted with lies and conspiracy,” said Ben Scott, the executive director of Reset, a group lobbying for stricter regulation of tech platforms, in a statement to TIME. “Regulating these tech platforms should be a major priority for the Biden administration.”

Even after suspending Trump’s accounts, Facebook and Twitter remain platforms where misinformation circulates almost freely. On Thursday morning, a Facebook group called “Stop the Steal” with more than 14,000 members was still accessible on the platform. And while Facebook has, since 2019, used algorithms that (when they work properly) point users toward fact-checks when individual users post information verified as false, those protections are easily circumvented when users post screenshots of a piece of misinformation instead of linking to it directly.

And the relatively short lengths of the initial suspensions (12 hours from Twitter and 24 from Facebook) show just how reluctant the platforms still are to fully ban a sitting President. Still, neither platform has ruled out taking further action against Trump, and the possibility that he might be banned entirely after he leaves office remains open.

“This temporary ban doesn’t go far enough,” said Rashad Robinson, president of Color of Change, a group that has long called on social media platforms to ban President Trump. “Ban him permanently. He’s done enough damage. Do not allow him to return in a day to continue to spread dangerous misinfo.”


Write to Billy Perrigo at billy.perrigo@time.com