How Facebook Forced a Reckoning by Shutting Down the Team That Put People Ahead of Profits

Facebook’s civic-integrity team was always different from all the other teams that the social media company employed to combat misinformation and hate speech. For starters, every team member subscribed to an informal oath, vowing to “serve the people’s interest first, not Facebook’s.”

The “civic oath,” according to five former employees, charged team members to understand Facebook’s impact on the world, keep people safe and defuse angry polarization. Samidh Chakrabarti, the team’s leader, regularly referred to this oath—which has not been previously reported—as a set of guiding principles behind the team’s work, according to the sources.

Chakrabarti’s team was effective in fixing some of the problems endemic to the platform, former employees and Facebook itself have said.

But, just a month after the 2020 U.S. election, Facebook dissolved the civic-integrity team, and Chakrabarti took a leave of absence. Facebook said employees were assigned to other teams to help share the group’s experience across the company. But for many of the Facebook employees who had worked on the team, including a veteran product manager from Iowa named Frances Haugen, the message was clear: Facebook no longer wanted to concentrate power in a team whose priority was to put people ahead of profits.

Mark Zuckerberg on the cover of TIME. Illustration by TIME (Source photo: Getty Images)

Five weeks later, supporters of Donald Trump stormed the U.S. Capitol—after some of them organized on Facebook and used the platform to spread the lie that the election had been stolen. The civic-integrity team’s dissolution made it harder for the platform to respond effectively to Jan. 6, one former team member, who left Facebook this year, told TIME. “A lot of people left the company. The teams that did remain had significantly less power to implement change, and that loss of focus was a pretty big deal,” said the person. “Facebook did take its eye off the ball in dissolving the team, in terms of being able to actually respond to what happened on Jan. 6.” The former employee, along with several others TIME interviewed, spoke on condition of anonymity for fear that being named would ruin their careers.

Samidh Chakrabarti, head of Facebook's civic-integrity team, stands beside Katie Harbath, a Facebook director of public policy, in Facebook's headquarters in Menlo Park, California, on Oct. 17, 2018. Paul Morris—Bloomberg/Getty Images

Enter Frances Haugen

Haugen revealed her identity on Oct. 3 as the whistle-blower behind the most significant leak of internal research in the company’s 17-year history. In bombshell testimony to the Senate Subcommittee on Consumer Protection, Product Safety, and Data Security two days later, Haugen said the civic-integrity team’s dissolution was the final event in a long series that convinced her of the need to blow the whistle. “I think the moment which I realized we needed to get help from the outside—that the only way these problems would be solved is by solving them together, not solving them alone—was when civic-integrity was dissolved following the 2020 election,” she said. “It really felt like a betrayal of the promises Facebook had made to people who had sacrificed a great deal to keep the election safe, by basically dissolving our community.”

In a statement provided to TIME, Facebook’s vice president for integrity Guy Rosen denied the civic-integrity team had been disbanded. “We did not disband Civic Integrity,” Rosen said. “We integrated it into a larger Central Integrity team so that the incredible work pioneered for elections could be applied even further, for example, across health-related issues. Their work continues to this day.” (Facebook did not make Rosen available for an interview for this story.)

The defining values of the civic-integrity team, as described in a 2016 presentation given by Samidh Chakrabarti and Winter Mason. Civic-integrity team members were expected to adhere to this list of values, which was referred to internally as the “civic oath.” Impacts of Civic Technology Conference 2016

Haugen left the company in May. Before she departed, she trawled Facebook’s internal employee forum for documents posted by integrity researchers about their work. Much of the research was not related to her job, but was accessible to all Facebook employees. What she found surprised her.

Some of the documents detailed an internal study which found that, among teen girls who already felt bad about their bodies, 32% said Instagram, Facebook’s photo-sharing app, made them feel worse. Others showed how a change to Facebook’s algorithm in 2018, touted as a way to increase “meaningful social interactions” on the platform, actually incentivized divisive posts and misinformation. They also revealed that Facebook spends almost all of its platform-safety budget on English-language content. In September, the Wall Street Journal published a damning series of articles based on some of the documents that Haugen had leaked to the paper. Haugen also gave copies of the documents to Congress and the Securities and Exchange Commission (SEC).

The documents, Haugen testified Oct. 5, “prove that Facebook has repeatedly misled the public about what its own research reveals about the safety of children, the efficacy of its artificial intelligence systems, and its role in spreading divisive and extreme messages.” She told Senators that the failings revealed by the documents were all linked by one deep, underlying truth about how the company operates. “This is not simply a matter of certain social media users being angry or unstable, or about one side being radicalized against the other; it is about Facebook choosing to grow at all costs, becoming an almost trillion-dollar company by buying its profits with our safety,” she said.

Facebook’s focus on increasing user engagement, which ultimately drives ad revenue and staves off competition, she argued, may keep users coming back to the site day after day—but it also systematically boosts content that is polarizing, misleading and angry, and which can send users down dark rabbit holes of political extremism or, in the case of teen girls, body dysmorphia and eating disorders. “The company’s leadership knows how to make Facebook and Instagram safer, but won’t make the necessary changes because they have put their astronomical profits before people,” Haugen said. (In 2020, the company reported $29 billion in net income—up 58% from a year earlier. This year, it briefly surpassed $1 trillion in total market value, though Haugen’s leaks have since knocked the company down to around $940 billion.)

Asked if executives adhered to the same set of values as the civic-integrity team, including putting the public’s interests before Facebook’s, a company spokesperson told TIME it was “safe to say everyone at Facebook is committed to understanding our impact, keeping people safe and reducing polarization.”

In the same week that an unrelated systems outage took Facebook’s services offline for hours and revealed just how much the world relies on the company’s suite of products—including WhatsApp and Instagram—the revelations sparked a new round of national soul-searching and led some to question how one company can have such a profound impact on both democracy and the mental health of hundreds of millions of people. Haugen’s documents are the basis for at least eight new whistle-blower complaints to the SEC alleging that the company misled its investors. And they have prompted senior lawmakers from both parties to call for stringent new regulations.

Haugen urged Congress to pass laws that would make Facebook and other social media platforms legally liable for decisions about how they choose to rank content in users’ feeds, and force companies to make their internal data available to independent researchers. She also urged lawmakers to find ways to loosen CEO Mark Zuckerberg’s iron grip on Facebook; he controls more than half of the company’s voting shares, meaning he can veto any proposal for change from within. “I came forward at great personal risk because I believe we still have time to act,” Haugen told lawmakers. “But we must act now.”

Potentially even more worrying for Facebook, other experts it hired to keep the platform safe, now alienated by the company’s actions, are growing increasingly critical of their former employer. They experienced firsthand Facebook’s unwillingness to change, and they know where the bodies are buried. Now, on the outside, some of them are still honoring their pledge to put the public’s interests ahead of Facebook’s.

Inside Facebook’s civic-integrity team

Chakrabarti, the head of the civic-integrity team, was hired by Facebook in 2015 from Google, where he had worked on improving how the search engine communicated information about lawmakers and elections to its users. A polymath described by one person who worked under him as a “Renaissance man,” Chakrabarti holds master’s degrees from MIT, Oxford and Cambridge, in artificial intelligence engineering, modern history and public policy, respectively, according to his LinkedIn profile.

Although he was not in charge of Facebook’s company-wide “integrity” efforts (led by Rosen), Chakrabarti, who did not respond to requests for comment for this article, was widely seen by employees as the spiritual leader of the push to make sure the platform had a positive influence on democracy and user safety, according to multiple former employees. “He was a very inspirational figure to us, and he really embodied those values [enshrined in the civic oath] and took them quite seriously,” a former member of the team told TIME. “The team prioritized societal good over Facebook good. It was a team that really cared about the ways to address societal problems first and foremost. It was not a team that was dedicated to contributing to Facebook’s bottom line.”

Chakrabarti began work on the team by questioning how Facebook could encourage people to be more engaged with their elected representatives on the platform, several of his former team members said. An early move was to suggest tweaks to Facebook’s “more pages you may like” feature that the team hoped might make users feel more like they could have an impact on politics.

After the chaos of the 2016 election, which prompted Zuckerberg himself to admit that Facebook didn’t do enough to stop misinformation, the team evolved. It moved into Facebook’s wider “integrity” product group, which employs thousands of researchers and engineers to focus on fixing Facebook’s problems of misinformation, hate speech, foreign interference and harassment. It changed its name from “civic engagement” to “civic integrity,” and began tackling the platform’s most difficult problems head-on.

Shortly before the midterm elections in 2018, Chakrabarti gave a talk at a conference in which he said he had “never been told to sacrifice people’s safety in order to chase a profit.” His team was hard at work making sure the midterm elections did not suffer the same failures as in 2016, in an effort that was generally seen as a success, both inside the company and externally. “To see the way that the company has mobilized to make this happen has made me feel very good about what we’re doing here,” Chakrabarti told reporters at the time. But behind closed doors, integrity employees on Chakrabarti’s team and others were increasingly getting into disagreements with Facebook leadership, former employees said. It was the beginning of the process that would eventually motivate Haugen to blow the whistle.

Former Facebook employee Frances Haugen testifies during a Senate hearing entitled ‘Protecting Kids Online: Testimony from a Facebook Whistleblower’ in Washington, D.C., on Oct. 5, 2021. Drew Angerer—Getty Images

In 2019, the year Haugen joined the company, researchers on the civic-integrity team proposed ending the use of an approved list of thousands of political accounts that were exempt from Facebook’s fact-checking program, according to tech news site The Information. Their research had found that the exemptions worsened the site’s misinformation problem because users were more likely to believe false information if it were shared by a politician. But Facebook executives rejected the proposal.

The pattern repeated time and time again, as proposals to tweak the platform to down-rank misinformation or abuse were rejected or watered down by executives concerned with engagement or worried that changes might disproportionately impact one political party more than another, according to multiple reports in the press and several former employees. One cynical joke among members of the civic-integrity team was that they spent 10% of their time coding and the other 90% arguing that the code they wrote should be allowed to run, one former employee told TIME. “You write code that does exactly what it’s supposed to do, and then you had to argue with execs who didn’t want to think about integrity, had no training in it and were mad that you were hurting their product, so they shut you down,” the person said.

Sometimes the civic-integrity team would also come into conflict with Facebook’s policy teams, which have the dual role of setting the platform’s rules while also lobbying politicians on Facebook’s behalf. “I found many times that there were tensions [in meetings] because the civic-integrity team was like, ‘We’re operating off this oath; this is our mission and our goal,’” says Katie Harbath, a long-serving public-policy director at the company’s Washington, D.C., office who quit in March 2021. “And then you get into decision-making meetings, and all of a sudden things are going another way, because the rest of the company and leadership are not basing their decisions off those principles.”

Harbath admitted not always seeing eye to eye with Chakrabarti on matters of company policy, but praised his character. “Samidh is a man of integrity, to use the word,” she told TIME. “I personally saw times when he was like, ‘How can I run an integrity team if I’m not upholding integrity as a person?’”

Years before the 2020 election, research by integrity teams had shown Facebook’s group recommendations feature was radicalizing users by driving them toward polarizing political groups, according to the Journal. The company declined integrity teams’ requests to turn off the feature, BuzzFeed News reported. Then, just weeks before the vote, Facebook executives changed their minds and agreed to freeze political group recommendations. The company also tweaked its News Feed to make it less likely that users would see content that algorithms flagged as potential misinformation, part of temporary emergency “break glass” measures designed by integrity teams in the run-up to the vote. “Facebook changed those safety defaults in the run-up to the election because they knew they were dangerous,” Haugen testified to Senators on Tuesday. But they didn’t keep those safety measures in place long, she added. “Because they wanted that growth back, they wanted the acceleration on the platform back after the election, they returned to their original defaults. And the fact that they had to break the glass on Jan. 6, and turn them back on, I think that’s deeply problematic.”

In a statement, Facebook spokesperson Tom Reynolds rejected the idea that the company’s actions contributed to the events of Jan. 6. “In phasing in and then adjusting additional measures before, during and after the election, we took into account specific on-platform signals and information from our ongoing, regular engagement with law enforcement,” he said. “When those signals changed, so did the measures. It is wrong to claim that these steps were the reason for Jan. 6—the measures we did need remained in place through February, and some like not recommending new, civic or political groups remain in place to this day. These were all part of a much longer and larger strategy to protect the election on our platform—and we are proud of that work.”

Soon after the civic-integrity team was dissolved in December 2020, Chakrabarti took a leave of absence from Facebook. In August, he announced he was leaving for good. Other employees who had spent years working on platform-safety issues had begun leaving, too. In her testimony, Haugen said that several of her colleagues from civic integrity left Facebook in the same six-week period as she did, after losing faith in the company’s pledge to spread the team’s influence across the company. “Six months after the reorganization, we had clearly lost faith that those changes were coming,” she said.

After Haugen’s Senate testimony, Facebook’s director of policy communications Lena Pietsch suggested that Haugen’s criticisms were invalid because she “worked at the company for less than two years, had no direct reports, never attended a decision-point meeting with C-level executives—and testified more than six times to not working on the subject matter in question.” On Twitter, Chakrabarti said he was not supportive of company leaks but spoke out in support of the points Haugen raised at the hearing. “I was there for over 6 years, had numerous direct reports, and led many decision meetings with C-level execs, and I find the perspectives shared on the need for algorithmic regulation, research transparency, and independent oversight to be entirely valid for debate,” he wrote. “The public deserves better.”

Can Facebook’s latest moves protect the company?

Two months after disbanding the civic-integrity team, Facebook announced a sharp directional shift: it would begin testing ways to reduce the amount of political content in users’ News Feeds altogether. In August, the company said early testing of such a change among a small percentage of U.S. users was successful, and that it would expand the tests to several other countries. Facebook declined to provide TIME with further information about how its proposed down-ranking system for political content would work.

Many former employees who worked on integrity issues at the company are skeptical of the idea. “You’re saying that you’re going to define for people what political content is, and what it isn’t,” James Barnes, a former product manager on the civic-integrity team, said in an interview. “I cannot even begin to imagine all of the downstream consequences that nobody understands from doing that.”

Another former civic-integrity team member said that designing algorithms capable of detecting any political content in all the languages and countries in the world—and keeping those algorithms updated to accurately map the shifting tides of political debate—would be a task that even Facebook does not have the resources to achieve fairly and equitably. Attempting to do so would almost certainly result in some content deemed political being demoted while other posts thrived, the former employee cautioned. It could also incentivize certain groups to try to game those algorithms by talking about politics in nonpolitical language, creating an arms race for engagement that would privilege the actors with enough resources to work out how to win, the same person added.

Mark Zuckerberg, chief executive officer and founder of Facebook, speaks via video conference during a House Judiciary Subcommittee hearing in Washington, D.C., on July 29, 2020. Graeme Jennings—Bloomberg/Getty Images

When Zuckerberg was hauled to testify in front of lawmakers after the Cambridge Analytica data scandal in 2018, Senators were roundly mocked on social media for asking basic questions such as how Facebook makes money if its services are free to users. (“Senator, we run ads” was Zuckerberg’s reply.) In 2021, that dynamic has changed. “The questions asked are a lot more informed,” says Sophie Zhang, a former Facebook employee who was fired in 2020 after she criticized Facebook for turning a blind eye to platform manipulation by political actors around the world.

“The sentiment is increasingly bipartisan” in Congress, Zhang adds. In the past, Facebook hearings have been used by lawmakers to grandstand on polarizing subjects like whether social media platforms are censoring conservatives, but this week they were united in their condemnation of the company. “Facebook has to stop covering up what it knows, and must change its practices, but there has to be government accountability because Facebook can no longer be trusted,” Senator Richard Blumenthal of Connecticut, chair of the Subcommittee on Consumer Protection, told TIME ahead of the hearing. His Republican counterpart Marsha Blackburn agreed, saying during the hearing that regulation was coming “sooner rather than later” and that lawmakers were “close to bipartisan agreement.”

As Facebook reels from the revelations of the past few days, it already appears to be reassessing product decisions. It has begun conducting reputational reviews of new products to assess whether the company could be criticized or its features could negatively affect children, the Journal reported Wednesday. Last week, it paused work on its Instagram Kids product amid the furor.

Whatever the future direction of Facebook, it is clear that discontent has been brewing internally. Haugen’s document leak and testimony have already sparked calls for stricter regulation and improved the quality of public debate about social media’s influence. In a post addressing Facebook staff on Wednesday, Zuckerberg put the onus on lawmakers to update Internet regulations, particularly relating to “elections, harmful content, privacy and competition.” But the real drivers of change may be current and former employees, who have a better understanding of the inner workings of the company than anyone—and the most potential to damage the business. —With reporting by Eloise Barry/London and Chad de Guzman/Hong Kong
