Facebook founder and chief executive officer Mark Zuckerberg will strike an apologetic tone when he appears before Congress this week, expressing contrition for the company’s mishandling of user data and its role in the 2016 election.
In testimony at a joint hearing of the Senate Judiciary and Commerce committees on Tuesday and before a House committee on Wednesday, the famously aloof social media executive will take responsibility for Facebook’s failure to handle its civic duties with more care.
“It’s clear now that we didn’t do enough to prevent these tools from being used for harm … That goes for fake news, foreign interference in elections, and hate speech, as well as developers and data privacy,” he will say, according to advance testimony to the House Committee on Energy and Commerce released Monday. “We didn’t take a broad enough view of our responsibility, and that was a big mistake. It was my mistake, and I’m sorry. I started Facebook, I run it, and I’m responsible for what happens here.”
The hearings come several weeks after it was reported that Cambridge Analytica, a data-mining firm linked to President Donald Trump’s 2016 presidential campaign, had improperly stored the private information of tens of millions of users.
The Cambridge Analytica scandal is only the latest to besiege the social media giant in the wake of the 2016 presidential election. Last fall, a senior Facebook executive (alongside representatives from Google and Twitter) admitted before a congressional hearing that Russian actors had exploited advertising and networking social media tools to “sow division and discord” in the American political conversation both before and after the 2016 election.
Zuckerberg has since faced loud and widespread public criticism, while Facebook’s stock has plunged.
In his remarks before Congress, Zuckerberg is expected to explain the conditions that allowed Cambridge Analytica to exploit private data and detail how Facebook plans to prevent similar breaches in the future.
In the written testimony submitted ahead of Wednesday’s House hearing, Zuckerberg explained that in 2013, a Cambridge University researcher launched a personality quiz on Facebook that drew on the personal information of the roughly 300,000 users who installed it, as well as that of millions of their friends. Zuckerberg stressed that it was only a year later, in 2014, that the company revised its privacy policy to prevent Facebook applications from accessing this data, but by that point, the damage was done: the researcher, Aleksandr Kogan, had already shared the trove of information with Cambridge Analytica.
“We have a responsibility to make sure what happened with Kogan and Cambridge Analytica doesn’t happen again,” Zuckerberg writes in the testimony. “We need to make sure that developers like Kogan who got access to a lot of information in the past can’t get access to as much information going forward.”
In the testimony, Zuckerberg articulates what those remedial strategies might look like. “We’re removing developers’ access to your data if you haven’t used their app in three months,” he writes. “We’re reducing the data you give an app when you approve it to only your name, profile photo, and email address. That’s a lot less than apps can get on any other major app platform.”
He will also address the matter of Russian interference in the 2016 election, a threat he writes his team had “been aware of … for years.” However, he notes that the tactics employed in 2016, namely the proliferation of fake accounts and misinformation, were unprecedented and unexpected. Today, he says, the company is vigilant in searching for, identifying, and removing such accounts. Facebook is also raising its authentication standards for those seeking to purchase political ads on the platform.
“I started Facebook when I was in college. We’ve come a long way since then,” Zuckerberg writes in his prepared testimony. “We now serve more than 2 billion people around the world, and every day, people use our services to stay connected with the people that matter to them most. I believe deeply in what we’re doing. And when we address these challenges, I know we’ll look back and view helping people connect and giving more people a voice as a positive force in the world.”