If democracy is a river, or a forest, or a pristine meadow, then social media platforms are a factory spewing toxic pollutants into it. Even if you block the new effluent, the pollution that has already escaped won’t just go away. It needs to be cleaned up.
That’s the analogy used by Whitney Phillips, one of the world’s leading experts on the rise of the far right online.
Twitter and Facebook’s ban of President Trump last week, and the deplatforming of the rightwing social network Parler by Apple, Google and Amazon on Monday, are crucial first steps in stemming the flow of pollution, says Phillips, an assistant professor in Syracuse University’s department of communication and rhetorical studies. But more is still spilling out, and that’s before you even get to the question of how to clean up what has already escaped.
“The real thing that we have to deal with long term is how these platforms didn’t just allow, but actively incentivize the spreading of this pollution. For years and years and years and years, it was allowed to build up in the environment for so long, such that you now have this enormous percentage of the population that has really internalized so much of this pollution,” she says. “You can take away Parler. But that’s not going to take away the belief in tens of millions of people that the election was stolen.”
Phillips and others who research extremism on social media say the core algorithms and business models of the biggest social platforms like Facebook, Twitter and Google-owned YouTube are in large part to blame for the series of events that eventually led to a violent insurrectionist mob of President Trump’s supporters storming the seat of American democracy on January 6. Those platforms amplify content that provokes emotional reactions above all else, Phillips says. That means “historically, the platforms have actually done more to benefit and embolden the right” than any other political grouping.
On Monday, Amazon pulled Parler’s web hosting, which was provided through Amazon Web Services (AWS), forcing it offline. Google and Apple each suspended Parler from their app stores over the weekend, citing its lax moderation practices and the danger that violence was being planned on the platform. “We cannot provide services to a customer that is unable to effectively identify and remove content that encourages or incites violence against others,” Amazon said in a letter to Parler. In response, Parler filed a lawsuit against Amazon on Monday.
Parler’s takedown came shortly after President Trump himself was permanently banned from Twitter, two days after the storming of the Capitol. He was also suspended from Facebook until Joe Biden’s inauguration at the earliest. Justifying Trump’s suspensions, Twitter and Facebook said the President’s continued presence would have increased the risk of violence and potentially undermined the peaceful transition of power. Trump’s YouTube channel remains online, though YouTube deleted a video in which he praised rioters who stormed the Capitol.
Parler has risen in popularity over the past year as mainstream social networks like Facebook, Twitter and YouTube have slowly built up their guardrails against misinformation and conspiracy theories like QAnon and false claims of election fraud, and banned the users who violate their policies most egregiously. Despite those efforts, the mainstream platforms remain hotbeds of misinformation. Still, the rise of Parler (and now, the movement of many Parler users to the messaging app Telegram) is a sign that even if Facebook, YouTube and Twitter manage to eradicate the pollution from their platforms entirely, it will still exist, swilling around American democracy in the form of radicalized users, ready to be exploited by opportunistic politicians and unscrupulous media.
Researchers who study disinformation and the far right online say that deplatforming can be successful. They point to the main platforms’ bans of Alex Jones, the founder of the conspiracy theory site InfoWars, and Milo Yiannopoulos, a far-right former Breitbart editor, and to the shutting down of Reddit forums catering to incels or the most toxic of Trump supporters, as examples of successfully reducing the number of people such messages can reach. However, researchers point out, deplatforming may do nothing to deradicalize the most devoted users—or reduce the risk of violent attacks. (The FBI says that armed protests are being planned at all 50 state capitols and the U.S. Capitol in the days leading up to, and the day of, Biden’s inauguration.)
At this late stage, when so many people are already radicalized, the solutions have to be more complex than simply deplatforming people, says Phillips. “I think that they made the right call,” she tells TIME of the platforms’ decisions to deplatform Trump and Parler. But up until this point, she says, they have “continually made the wrong calls, or opaque calls, or inconsistent calls, or calls that ultimately allowed this to happen for so long.”
The tech companies’ eventual decisions to deplatform Trump have quickly fed into conspiracy theories about Silicon Valley unfairly censoring conservatives, a narrative pushed by Republicans and online conservatives over the past several years. Now, politicians like Trump are galvanizing their supporters with claims that their freedom of speech is being unfairly restricted by a cabal of companies bent on overturning Trump’s supposed election victory.
Experts in the field also remain troubled by the problem of big corporations like Facebook, Google and Amazon having the sole power to decide who can and can’t have an online voice. In Vietnam, Facebook has complied with requests from the authoritarian government to remove accounts of dissidents. In India, it has avoided banning ruling-party lawmakers even when they’ve broken its rules. Many are also troubled by the timing of the decision in the U.S.: neither Facebook nor Twitter decided to suspend Trump until after the Democrats won control of the Senate on Jan. 6 and Biden was confirmed by lawmakers as the next President. “It is hard to view this decision, and the timing, as anything other than trying to cozy up to power, as opposed to some form of responsible stewardship of our democracy,” said Yael Eisenstat, a former Global Head of Elections Integrity Operations for political advertising at Facebook, in a statement.
Facebook and Twitter did not immediately respond to TIME’s requests for comment. In his statement announcing Trump’s suspension, Facebook CEO Mark Zuckerberg said: “Over the last several years, we have allowed President Trump to use our platform consistent with our own rules, at times removing content or labeling his posts when they violate our policies.” This is true, but only because Facebook wrote an exemption into its own rules that allowed posts by public figures like Trump to remain on the platform even if they broke some of those rules.
In Russia, where a rightwing autocrat is in power, dissidents viewed Twitter’s decision to ban President Trump with extreme skepticism. “In my opinion, the decision to ban Trump was based on emotions and personal political preferences,” tweeted Russia’s main opposition leader, Alexey Navalny. “Of course, Twitter is a private company, but we have seen many examples in Russia and China of such private companies becoming the state’s best friends and enablers when it comes to censorship.” German Chancellor Angela Merkel also raised concerns about the move’s implications for free speech.
While the Biden Administration is mulling reform of Section 230, the law that shields platforms from legal liability for what their users post, tech policy experts say that it is low on the incoming President’s list of priorities. Biden’s point man on tech policy, Bruce Reed, has expressed a desire to reform Section 230 in the past. In December, Senator Mark Warner, a leading Democratic critic of Facebook, told TIME that Biden’s approach could include invoking civil rights laws to bring stricter penalties for people spreading hate speech or racist activity online. But the incoming team is also stacked with former employees of big tech companies, which has left many activists bracing for a fight over the shape of Biden’s tech policy. “Quite frankly, if people in the Biden Administration want to spend their time and energy fighting to help Mark Zuckerberg make more money, then that’s a fight I will take up,” says Rashad Robinson, President of Color of Change, one of the first civil rights groups to call for Trump to be deplatformed back in 2017.
On Monday, nine days before President-elect Joe Biden is set to be inaugurated, and after years of ignoring calls from civil society to appoint a senior executive with civil rights expertise, Facebook announced that a former Obama Administration official, Roy Austin Jr., would be its first ever vice president for civil rights, responsible for holding the company accountable on issues of racial discrimination and hate. He starts the day before Biden’s inauguration.
It may be inevitable that political pressure will have some bearing on the way platforms moderate themselves. In this case, the platforms are finally pivoting their enforcement in response to Democratic pressure, which happens to align somewhat with the demands of civil society, after years of largely ignoring those calls under the Trump Administration. But still, experts say, the core problem remains. “The underlying problem here is not the [platform] rules themselves,” writes technology columnist Will Oremus for OneZero, “but the fact that just a few, for-profit entities have such power over global speech and politics in the first place.”
Write to Billy Perrigo at billy.perrigo@time.com