Facing potentially existential crises over the spread of targeted misinformation, hate speech and violent content on Facebook, the company’s executives have in recent days been calling for new, wide-ranging rules to govern social media companies. “I believe we need new regulation,” Facebook CEO Mark Zuckerberg wrote in an op-ed published March 30 in The Washington Post. “People shouldn’t have to rely on individual companies addressing these issues by themselves.”
Zuckerberg’s stance is a reversal from years past, when Facebook’s motto was “move fast and break things” — a time when social media companies used freedom of speech as a shield against calls for them to remove unsavory content, and when “disruption” was considered a mark of Silicon Valley success. Facebook, Twitter and other social media sites built their businesses on the principle that users, not websites, would be legally responsible for illegal postings. Thanks in part to such protections, for example, Facebook faced no immediate legal ramifications after a mass shooter used the site to live-stream a massacre in Christchurch, New Zealand, in March.
Now, that is starting to change. The Facebook CEO’s op-ed was widely interpreted as an effort to get ahead of a regulatory process that some consider inevitable in the wake of the Christchurch shooting — and of Facebook’s other recent high-profile calamities, like Russian interference in the 2016 presidential election. But what Zuckerberg and others may not have expected is the speed at which lawmakers around the world are turning against social media firms. Australian officials have already moved to hold executives like Zuckerberg responsible for violent content on their sites; some in New Zealand wish to follow their lead. On Monday, the United Kingdom’s government went even further, releasing a vast and detailed proposal for new internet laws that would dramatically reshape the ways in which social media companies like Facebook operate. While the proposal remains preliminary and could be derailed by Brexit, it is the most drastic reimagining of internet laws yet suggested by a Western government, and could provide a template for other countries looking to police social media companies.
The proposal, which was anticipated even before the Christchurch shooting, aims to tackle “online harms,” and suggests giving the U.K. government sweeping powers to fine tech companies for hosting content like violent videos, misinformation, child exploitation and more. As with Australia’s rules, social media executives like Zuckerberg could even be held personally responsible if their platforms fail to fall into line.
“The era of self-regulation for online companies is over,” said U.K. Culture Secretary Jeremy Wright while announcing the plans. Indeed, the sense that such firms are failing to get a grasp on their societal impact has led to a growing consensus across the world that they must be better regulated. “Disinformation is clearly harmful to democracy and society as a whole,” said Damian Collins, a U.K. lawmaker who has called on the government to regulate social media. “The social media companies must have a responsibility to act against accounts and groups that are consistently and maliciously sharing known sources of disinformation.”
But some activists are criticizing the plans, saying they smack of Orwellian authoritarianism.
“This is an unprecedented attack on freedom of speech that will see internet giants monitoring the communications of billions and censoring lawful speech,” said Silkie Carlo, the director of civil liberties non-profit Big Brother Watch, in a statement to TIME.
Under the plans, the U.K. would set up an independent social media regulator — funded by a tax on social media companies — with the authority to decide what kinds of content are permissible. Currently, social media companies like Facebook, Twitter and YouTube have their own self-designed codes of conduct. Facebook, for example, has a detailed set of guidelines for policing hate speech on its platform, which has led to controversies — a leaked presentation for training moderators describes the difference between the statements “Keep the horny migrant teenagers away from our daughters” (allowed on Facebook) and “Muslim migrants ought to be killed” (not allowed). Social media companies have been criticized for their varied interpretations of their own rules, with users frequently complaining that such companies fail to remove content that clearly appears to violate their terms of service.
Facebook, Twitter and YouTube employ tens of thousands of content moderators, whose job is in part to enforce those codes of conduct by removing statements, videos and images that don’t comply (often at great psychological cost to the workers, who are frequently exposed to disturbing material). Under the proposed new rules, social media companies would still have to employ enough people to take down offending content, but those moderators would follow rules laid down by a state regulator rather than by their employer — potentially allowing social media companies to skirt accusations of violating freedoms of speech.
Such a regime falls in line with what Zuckerberg, at least, has claimed to want.
“He is putting it up to governments and parliaments to set the rules, not him,” says James Lawless, one of three Irish lawmakers who met with Zuckerberg in Dublin on April 2, just days after the Facebook boss urged countries to take a more proactive role in regulating social media. “He said he is uncomfortable with Facebook making decisions on content and issues of this type.”
One big question is whether a government setting rules around free speech is more palatable than a private company doing so. And even if such a system flies in the U.K., it might be difficult to export to countries with different cultural norms around speech.
“This nebulous concept of ‘harmful’ speech will likely be used to simply silence opinions people don’t like,” says Carlo. “These plans position the likes of Facebook and Twitter as policemen of free speech online, overseen by a regulator funded by themselves, and set an abysmal example for internet regulation globally.”
The U.K. proposal does not go so far as to change the fundamental legal basis on which companies like Facebook, YouTube (owned by Google) and Twitter grew into the behemoths they are today: because they’re considered “platforms” and not “publishers,” they have been able to build vast empires by pegging advertising to user-generated content, while passing legal responsibility for that content onto the people who post it. When challenged in the past over material hosted on their sites, Facebook, Twitter and YouTube have long argued they are open “platforms,” like public squares, as opposed to “publishers,” like magazines or television channels, which are legally responsible for what they print or broadcast. Under the U.K. proposals, social media companies would still be seen as platforms, but would have a new “duty of care” to their users. “Applying ‘publisher’ levels of liability to companies would not be proportionate,” the report notes. “Such an approach would force companies to check every piece of content before upload to ensure it was legal,” it says, suggesting that approach is incompatible with the giant scale of social media sites. Still, enforcing that duty of care could prove expensive.
If companies like Facebook, Twitter and YouTube do publicly welcome the law, it will be only after years of reckoning during which many executives have been forced to accept that an unregulated internet will not last forever. “I think [Zuckerberg] knows that legislation and regulation is coming,” says Hildegarde Naughton, another Irish lawmaker who was in the meeting with Zuckerberg. Lawless agrees. “Facebook is under pressure and on the back foot,” he tells TIME. “They are on a charm offensive of regulators and politicians right now.”
Write to Billy Perrigo at billy.perrigo@time.com