Another Facebook Whistleblower Just Testified in British Parliament. Here’s What to Know About Her Allegations

While appearing before a committee of lawmakers in the British parliament on Monday, Facebook whistleblower Sophie Zhang, a former data scientist for the company, testified that the social media site is allowing authoritarian governments to manipulate political discourse.

Less than two weeks after Frances Haugen’s appearance before a U.S. Senate subcommittee, Zhang’s first public testimony further implicated Facebook in turning a blind eye to disinformation campaigns in order to prioritize profits.

“It’s a company whose official goal is to make money, [it’s] more focused on protecting itself,” Zhang said.

During the evidence session, which formed part of parliamentary scrutiny of a draft British bill aimed at tackling harmful content online, lawmakers questioned Zhang about her work as a data scientist on Facebook's Site Integrity fake engagement team, where she dealt with networks of bot accounts, often operated by government-backed agencies.


She told members of Parliament that even though removing fake accounts is part of Facebook policy, “there was a perverse effect in that, if I found fake accounts that were not directly tied to any political figure, they were often easier to take down than if I found fake accounts that were.” This effect, she said, “creates an incentive for major political figures to essentially commit a crime openly.”

Zhang was fired from Facebook in August 2020 for poor performance, which she claims was the result of her prioritizing the removal of civic fake engagement over management's orders.

Before she left, she posted an internal memo accusing the company of allowing self-interest and the pursuit of profit to interfere with its responsibility to protect democracy.

In April this year, feeling responsible for what she saw as Facebook's willingness to let fake engagement run rife through developing countries, Zhang went public in an interview with the Guardian. Her revelations included dangerous loopholes in the company's policies against political manipulation, as well as indifference and deflection from senior management.

“At Facebook, the people charged with making important decisions about what the rules are and how the rules are getting enforced are the same as those charged with keeping good relationships with local politicians and governmental members,” she told lawmakers. This “creates a natural conflict of interest.”

During her time as an employee focused on fake engagement (likes, comments, shares and reactions made by inauthentic accounts), she discovered a dangerous loophole in company rules: while Facebook requires personal accounts to be authentic, it has no policy regulating Pages set up for businesses, brands or individuals.

When political actors set up fake accounts on Facebook, they can use them to manipulate engagement on the platform, a practice Facebook calls "coordinated inauthentic behavior," or CIB. The most famous examples of CIB occurred during the 2016 U.S. election, when Russia's Internet Research Agency set up Facebook accounts posing as Americans and used them to influence political debates.

While the company knew about such activity by fake personal accounts, Zhang unearthed large CIB networks, in countries from Azerbaijan to Iraq, that were run through fake Pages.

In Honduras, she found thousands of fake Pages boosting the posts of the country's right-wing nationalist president, Juan Orlando Hernández, whose 2017 reelection is widely viewed as fraudulent. "It took 11 and a half months [for Facebook] to start the investigation" into the Honduras CIB campaign, Zhang told the committee.

When Zhang raised her findings with senior management, Guy Rosen, Facebook's vice president of integrity, told her that threat intelligence would only prioritize campaigns in "the US/western Europe and foreign adversaries such as Russia/Iran/etc." The company did not have "unlimited resources," he said.

"Facebook pays more attention to countries like the United States and Britain, but also to India, because of the importance of these countries to Facebook," she told the committee.

Facebook's "resources [to tackle misinformation or CIB] differ considerably between nations," Zhang said. For instance, "if you wanted an AI to determine if [a piece of] content is hate [speech], you need AI that can speak [the] language" of the country in question, which, she said, Facebook does not currently have.

Many of the issues could be solved by increased resources, Zhang argued. Teams that "work on integrity and investigations…are chronically under-resourced, which I think [is] a statement of the company's priorities," she said. "You don't hear about the ads marketing team at Facebook being chronically under-resourced, for instance."

Zhang explained that the impact of Facebook’s inaction was particularly significant in countries with authoritarian leaders who are “creating activity [on the platform] to manipulate their own citizenry.”

Zhang's public testimony is yet another headache for Facebook. In September, former employee Frances Haugen leaked tens of thousands of pages of internal company documents to the Wall Street Journal, revealing that the company had failed to act on problems such as harm to teen mental health.

During Haugen's subsequent appearance before the Senate Commerce subcommittee on Oct. 5, she said, "Facebook is prioritizing profit over people."

The former employee detailed how the heavy weighting of "meaningful social interactions" (content that generates strong reactions) in Facebook's algorithm has amplified divisive content on the platform, fostered hate speech and misinformation, and incited violence, including recent ethnic violence in Ethiopia.
