How Fixing Facebook’s Algorithm Could Help Teens—and Democracy


What does teen anorexia have to do with the crumbling of 21st-century democracy? It’s the algorithm, stupid.

On its surface, helping young girls feel better about their bodies doesn’t seem to have much to do with the deep polarization and disinformation threatening civil society around the world. But Tuesday’s testimony by Facebook whistleblower Frances Haugen suggests that both are symptoms of the social media platform’s flawed algorithm and corrupt business model, and that adjusting Facebook’s algorithm to tackle one problem could go a long way toward addressing the other.

Until Haugen’s whistleblower revelations, reported in the Wall Street Journal and on 60 Minutes, most of the conversation about regulating Facebook had focused on hate speech, disinformation, and the platform’s role in enabling the January 6 riot at the Capitol—a conversation that inflames tensions on both sides of the aisle and has led to a political impasse over how to handle the social media giant. But a bipartisan panel of lawmakers seemed uniformly appalled by Haugen’s testimony on Tuesday about Facebook’s potential to hurt kids, which could give Congress a way to move forward on regulating Facebook without getting caught in the controversial bog of censorship and free speech.

“There is political unanimity about protecting kids,” says Tom Wheeler, who served as chairman of the Federal Communications Commission from 2013 to 2017. “You can say, ‘I want to protect kids and should do that and my algorithms should be focused on that,’ and it’s the same process that also could make sure an algorithm shouldn’t spread lies or hate.”

In her blockbuster testimony to a Senate Commerce subcommittee, Haugen said that Facebook’s “amplification algorithms” and “engagement-based ranking” (the part of the algorithm that rewards posts that get the most likes, shares and follows) were driving children and teenagers to destructive online content, which was leading to body image issues, mental health crises, and bullying. Facebook, she said, was “buying its profits with our safety.” Haugen alleges that Facebook knew its algorithm was funneling teens towards harmful content but refused to take steps to stop it, and that it disbanded the Civic Integrity team shortly after the election, which allowed extremism to flourish on the platform ahead of January 6.


But the hearing was much more focused on the former than the latter, a shift in focus that marked a significant evolution in the narrative around what’s wrong with Facebook and what’s required to fix it. It effectively depoliticized the conversation, making it less about right-wing extremism and more about kids’ mental health. And it sidestepped the major partisan disagreement about the dangers of Facebook: Republicans are most concerned by Facebook’s purported attacks on free speech, while Democrats are more outraged by Facebook’s role in spreading hate speech and disinformation online.

Focusing on the potential harm to children puts the blame back on the platform itself, rather than on bad actors using it. “We’ve found a lever on the problem that allows actually both sides to engage on the level without it becoming a ‘what about anti-conservative bias’ or ‘what about the Russian trolls,'” says Jason Goldman, former chief digital officer in Barack Obama’s White House and part of the founding team at Twitter.

The algorithm rewards posts that provoke the most extreme reactions, often anger, rage or fear, because it was designed to keep users on the platform for as long as possible, no matter how it makes them feel or what it makes them think. Haugen referenced research that found that kids who started out looking for healthy recipes ended up down a rabbit hole of pro-anorexia content. Reformers like Haugen believe that any true solution would require adjusting the incentives at the crux of Facebook’s platform and business model. Focusing on reforming the algorithm, rather than policing the behavior of its users, is “definitely more dangerous to Facebook,” says Josh Miller, former director of product in the Obama White House, and former product manager and product lead at Facebook, “because it hits at the core of their entire existence.”
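
The mechanic at issue can be illustrated with a toy model. The sketch below is not Facebook’s actual code; the signal names, weights and example posts are all invented here to show how a score built purely from likes, shares and comments naturally pushes the most provocative content to the top of a feed.

```python
# Toy illustration of engagement-based ranking. This is NOT Facebook's
# algorithm; the weights and field names are assumptions for this sketch.
from dataclasses import dataclass

@dataclass
class Post:
    text: str
    likes: int
    shares: int
    comments: int

def engagement_score(post: Post) -> float:
    # Shares and comments are weighted more heavily than likes because
    # they tend to generate further time on the platform. Nothing in the
    # score asks whether the engagement comes from joy or from outrage.
    return post.likes + 5 * post.shares + 3 * post.comments

feed = [
    Post("Calm, factual update", likes=120, shares=4, comments=10),
    Post("Outrage-bait headline", likes=80, shares=60, comments=90),
]

# Ranking purely by engagement puts the provocative post first (score
# 650 vs. 170), regardless of its effect on the reader.
for post in sorted(feed, key=engagement_score, reverse=True):
    print(f"{engagement_score(post):6.0f}  {post.text}")
```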

To do this, Haugen recommended reforming Section 230, which protects tech companies from liability for third-party content posted on their platforms, “to exempt decisions about algorithms,” she said. “Modifying 230 around content is very complicated, because user-generated [content] is something that companies have less control over,” she said. “They have 100% control over their algorithms. And Facebook should not get a free pass on choices it makes to prioritize growth and virality and reactiveness over public safety.” Haugen also recommended building a regulatory structure specifically for social media companies, where tech veterans who understand the algorithms can make sure they’re working for the public good. (Wheeler, the former FCC chairman, has proposed a similar solution.)

Haugen’s revelations about Facebook’s algorithmic corruption changed the subject from the old back-and-forth over social media censorship, even as some conservatives still attempted to frame it as a free speech issue. Senator Ted Cruz, a Republican from Texas, started his questioning by asserting that Facebook’s targeting of children online was “a discrete issue” from what he called “political censorship.” But Haugen said taking steps to help kids would help the platform overall, and wouldn’t mean wading into the thorny world of litigating political content. “A lot of the things that I advocate for are around changing the mechanism of amplification, not around picking winners and losers in the marketplace of ideas,” she said, mentioning Twitter’s new requirement that users click on a link before sharing it. “Small actions like that friction don’t require picking good idea[s] or bad ideas, they just make the platform less twitchy, less reactive.”
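
The friction Haugen describes is content-neutral: it changes when a reshare goes through, not which posts are permitted. A minimal sketch of such a gate, with invented function and parameter names, in the spirit of Twitter’s read-before-you-share prompt:

```python
# Hypothetical sketch of content-neutral sharing friction. All names
# here are invented; this is not Twitter's or Facebook's implementation.
def attempt_reshare(user_opened_link: bool) -> str:
    if not user_opened_link:
        # The gate never inspects the content itself. It only slows the
        # reflexive reshare, making the platform "less twitchy."
        return "Prompt: open the article before sharing it."
    return "Reshare posted."

print(attempt_reshare(user_opened_link=False))
print(attempt_reshare(user_opened_link=True))
```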

But some experts say they’re concerned that government oversight of tech algorithms isn’t the right solution and could create more problems than it would solve. “I think the worst case scenario is the U.S. government regulates the algorithms of technology companies,” says Miller, adding that federal regulation of Facebook’s algorithm would be “scary,” because it would mean government influence over journalism and free speech in ways that are far bigger than shutting down a single newspaper or website. Instead, Miller argues, huge tech companies like Facebook need to be broken up so they aren’t as powerful in the first place. “This is an impossible thing to regulate, without precedent, and the only option is to break it up so that no single company wields this much influence,” he says.

Still, Haugen’s whistleblower testimony compelled Senators on both sides of the aisle to say they plan to act. “Here’s my message to Mark Zuckerberg,” said Senator Ed Markey, a Democrat from Massachusetts. “Your time of invading our privacy, promoting toxic content and preying on children and teens is over. Congress will be taking action.”


Write to Charlotte Alter at charlotte.alter@time.com