Online violence leads to real-world violence. The January 6th storming of the U.S. Capitol, and the role social media played in coordinating the riots, was not a surprise. The emerging details show it could have been mitigated, if not prevented, had Facebook taken years of warnings seriously.
BuzzFeed reported that for weeks “Women for America First” had been calling for violence on Facebook, Twitter and other platforms. The Washington Post reported that dozens of organizations used Facebook to organize bus rides and transportation to the riots, contradicting Sheryl Sandberg’s claim that these events were “largely organized on platforms that don’t have our abilities to stop hate, don’t have our standards and don’t have our transparency.”
Facebook also continues to allow Steve Bannon to broadcast despite his call for the beheading of a government official and his continued claims that the 2020 election was fraudulent. “Stop the Steal” groups persisted for months. Evidently, we should not take Facebook’s commitment to stop hate at face value.
This is the context for Facebook’s latest PR offensive, touting the new Facebook Oversight Board, and specifically the news that the Oversight Board will rule on Facebook’s January 7th indefinite ban of Donald Trump.
We applauded the decision to ban Trump. Last September, we joined a statement calling on Mark Zuckerberg to ban President Trump and asked whether blood needed to be spilled before he took action to stop hate and disinformation on his platforms. Sadly, America got its answer: five people died in the insurrection on January 6, some of whom had stewed for months in hate speech on Facebook. Activists and dissidents have died in Myanmar, Sri Lanka, the Philippines, India and Pakistan in part because of Facebook’s inaction. More than 400,000 people have died from COVID-19 in the U.S., many of them influenced by disinformation amplified by social media.
It’s clear that self-regulation has failed, and governments can no longer shirk responsibility for protecting citizens from harmful technology platforms. We recommend regulation along three dimensions: safety, privacy, and competition. To change incentives and make tech safe, there should be personal liability for tech engineers and executives who cause harm. To restore privacy and self-determination, there need to be much greater limits on the ownership and exploitation of personal data by businesses. And it is past time to break up tech monopolies and encourage business models that empower, rather than exploit, users.
The reality is that Facebook failed for years to take action over Donald Trump’s repeated use of its platform to incite violence, spread disinformation, and ultimately try to subvert the election. The organization we represent, the Real Facebook Oversight Board, was formed because our members—prominent civil rights advocates, experts and academics—feared Facebook was being used to organize a coup in real time. Our fears became reality.
Shouldn’t we celebrate the ban, then, and tout Facebook’s referral of Trump’s “case” to its Oversight Board for deliberation? Sadly, no. The ban may be temporary, and the Oversight Board is an effort to cloak harmful decisions in a veil of legitimacy.
There are four reasons why.
First, mandate. The Oversight Board has a quasi-legal structure and will make judgments based on arguments of legality with a bias towards Facebook’s extreme interpretation of free speech. The Trump ban is a matter of public safety, a perfect example of the limits of free speech. If the First Amendment is the only consideration, as is likely to be the case, the Oversight Board will recommend reinstating Trump. This is likely Facebook’s preferred outcome, as Trump is good for their business. The Oversight Board will provide cover for Facebook to do what it wants to do.
Second, lack of independence. Only one entity can refer cases to the Oversight Board and guarantee their review—Facebook. Cases are heard in private, by a hand-picked, paid board which reports findings back to Facebook for action. It gives the appearance of oversight, but the process is ultimately controlled by Facebook itself.
Third, structure. There are millions of problematic posts on Facebook every day, the vast majority of which leave the victims without recourse. The Oversight Board can take on only a handful of cases, always with a significant delay. The board is powerless by design to step in and address the constant churn of disinformation, hate speech or questionable content that’s live on the site. The sad truth is that harmful content is highly engaging and serves as a lubricant for Facebook’s core business.
Fourth, lack of legitimacy. Governments have failed to protect their citizens from harmful technology. Some authoritarian leaders, like Trump in the U.S. and Duterte in the Philippines, have harnessed Facebook to gain and consolidate their power. Despite terms of service to the contrary, Facebook permits harmful content to pervade its sites. Enforcement is inconsistent, when it happens at all. The Oversight Board has been structured to deal with individual posts, rather than the systemic flaws of Facebook. It is an illusion disguised as a remedy.
Facebook’s strategy appears to be alignment with winners, which may explain why the leaders of India and Brazil are not held to the same benchmark as Donald Trump and, even now, are using the Facebook group of companies to incite violence and spread disinformation.
Trump and his enablers remain unrepentant, which means they are still a threat to public safety. For that reason, we believe Facebook should permanently ban Trump and his enablers from all of its platforms.
Facebook’s business model harms public health, democracy, privacy, and competition. Its management refuses to acknowledge responsibility for these harms, much less show a willingness to make necessary changes. We should not pretend that the Facebook Oversight Board is more than a MacGuffin designed to distract us from serious issues. As citizens, we must demand that our leaders take action.
Roger McNamee is an early Facebook investor and author of the New York Times bestseller Zucked: Waking Up to the Facebook Catastrophe; Maria Ressa is a 2018 Time Person of the Year and co-founder of Rappler in the Philippines. Both are founding members of the Real Facebook Oversight Board.