Facebook is discontinuing its “fake news” alert feature, after discovering that the initiative, which was intended to identify and stop the spread of misinformation on the platform, was sometimes backfiring.
Instead of a red warning icon, the company will offer users additional related and fact-checked content in the hope of drawing readers toward more reliable news sources, according to Facebook project manager Tessa Lyons.
Facebook introduced the “Disputed” article flag in December 2016 to help users quickly identify articles from third-party websites that had failed a fact-checking review. But the indicator wasn’t effective at curbing misinformation, Facebook said, and sometimes even spurred readers to share dubious links more often.
Research showed that “putting a strong image, like a red flag, next to an article may actually entrench deeply held beliefs – the opposite effect to what we intended,” Lyons wrote in a blog post.
Now, before reading an article shared on Facebook, users will be offered a menu of fact-checked “Related Articles” from reliable sources in order to “give more context, which our research has shown is a more effective way to help people get to the facts,” Lyons wrote.
So far, the new approach appears to be working: redirecting users to “Related Articles” has led to “fewer shares of the hoax article than the disputed flag treatment,” the designers said.
Facebook also hinted at a new initiative to “better understand how people decide whether information is accurate or not,” focusing on which news sources users trust. The world’s largest social media network has come under fire over accusations that it has failed to properly monitor content shared on the platform, particularly during the 2016 presidential election. Last month, Facebook joined tech giants Google and Twitter to testify before Congress over efforts to stop the spread of misinformation and “extremist content” on their services.