By Lisa Eadicicco
January 5, 2018

Each January, Facebook CEO Mark Zuckerberg rings in the New Year with a personal challenge, like learning Mandarin, running 365 miles, or building an artificially intelligent assistant for his home. His pledge for 2018, however, is far less personal, far more vital, and a goal that he should have prioritized sooner than the turn of this new year.

“The world feels anxious and divided, and Facebook has a lot of work to do — whether it’s protecting our community from abuse and hate, defending against interference by nation states, or making sure that time spent on Facebook is time well spent,” Zuckerberg wrote on his Facebook page on Jan. 4. “My personal challenge for 2018 is to focus on fixing these important issues. We won’t prevent all mistakes or abuse, but we currently make too many errors enforcing our policies and preventing misuse of our tools. If we’re successful this year then we’ll end 2018 on a much better trajectory.”



Indeed, 2017 turned out to be a landmark year for Facebook, but not in the way the company might have hoped. Despite Zuckerberg’s well-publicized intentions to use Facebook to make the world a better place, his company has found itself embroiled in one controversy after another.

Take last fall, when Facebook revealed it had found a connection between $100,000 in ad spending on its platform and phony accounts likely operated out of Russia. That spending, which occurred between June 2015 and May 2017, was linked to roughly 3,000 ads, and served as evidence of Russia’s desire to influence the outcome of the 2016 U.S. presidential election. “How did Facebook, which prides itself on being able to process billions of data points and instantly transform them into personal connections with its users, somehow not make the connection that electoral ads, paid for in rubles, were coming from Russia?” former Minnesota senator Al Franken, who recently resigned amid sexual misconduct allegations, asked when grilling company representatives during an October hearing on Russian election meddling.

The year before, an investigation by Gizmodo suggested that Facebook suppressed conservative news outlets in the website’s Trending sidebar, raising concerns over how the site determines which content gets shown to the one billion people who visit daily. “This is a problem because it demonstrates the ability to manipulate the marketplace of ideas from behind the scenes,” Kelly McBride, media ethicist and vice president of The Poynter Institute, wrote at the time. “That’s nothing new, but it undercuts social media proponents who say Facebook is a more democratic news distribution forum.”

Meanwhile, Facebook’s live-streaming video tool, known as Facebook Live, emerged as a vehicle for broadcasting horrific acts, a problem that plagues many livestreaming platforms. In one particularly gruesome instance, a man in Thailand broadcast himself killing his 11-month-old daughter on Facebook before committing suicide. BuzzFeed News reported in June that at least 45 instances of violence, including shootings and other brutal attacks, had been streamed on the platform since December 2015.

To be sure, Facebook has already taken some steps to address these and other issues. In May, it said it would hire thousands of workers to help prevent violent videos from appearing on its social network. The company has also published blog posts and white papers sharing details about the information it’s gathered regarding inauthentic activity and steps it’s taking to prevent and mitigate that problem. Facebook has also introduced several tools to combat fake news, including a feature that provides additional context about media outlets and stories appearing on the site. “We know we have a responsibility to prevent everything we can from this happening on our platforms,” Facebook Chief Operating Officer Sheryl Sandberg said in an October interview with Axios when asked about Russian activity on the site. “We’re determined. These are threats, these are challenges, but we will do everything we can to defeat them.”

During that same interview, Sandberg took a step toward reversing Facebook executives’ longstanding insistence that Facebook is not a media company, a stance widely seen as a way for the company to duck responsibility for what’s posted on its platform.

“We don’t write any news articles so certainly we’re different than a media company,” she said. “But that doesn’t mean we don’t have responsibility.”

That viewpoint aligns far better with reality (two-thirds of Facebook’s users get news on the platform, according to the Pew Research Center). Zuckerberg, too, now appears to be taking his company’s role in the world more seriously — his New Year’s resolution comes just a little more than a year after he dismissed the notion that Facebook played a role in President Donald Trump’s victory as a “pretty crazy idea.” Ultimately, Zuckerberg’s new goals are a sign that he’s finally coming to terms with one simple fact: He runs what may be the most influential company in history, and it’s no longer okay to simply “move fast and break things,” as Facebook’s motto once held.

Contact us at editors@time.com.
