Facebook is working to limit the reach of posts making “sensational health claims” in an effort to tamp down on misinformation, according to a new announcement from the social media network.
Social media sites are notorious breeding grounds for dubious health information, from anti-vaccine arguments to too-good-to-be-true wellness claims. A Wall Street Journal investigation published Tuesday, the same day Facebook made its announcement, found that Facebook and YouTube are rife with posts promoting questionable alternative cancer therapies, for example. Many of these therapies are unproven, and can be dangerous if patients rely on them instead of doctor-prescribed care.
Facebook’s new policy is meant to prevent the spread of this type of content, since, as the announcement notes, “people don’t like posts that are sensational or spammy, and misleading health content is particularly bad for our community.” The site is also cracking down on posts trying to sell products or services based on unsound health claims, the announcement says.
But rather than removing sensational health posts outright, Facebook will change the way they’re ranked in users’ News Feeds, so that fewer people see them. Posts that include certain commonly used phrases — ones that suggest the post either “exaggerates or misleads,” or that it’s using health claims to hawk products like weight-loss pills or medications — will show up lower in users’ News Feeds. The strategy is similar to the one Facebook uses to limit the influence of clickbait posted by publishers.
The announcement does not, however, mention Groups, where health-related misinformation often spreads and takes hold within established communities.
This isn’t Facebook’s first effort to improve the quality of health information on its site. Along with platforms like YouTube and Pinterest, Facebook has already implemented updates meant to limit the spread of false anti-vaccine content, which often proliferates online and can contribute to the kind of vaccine skepticism that has allowed measles to resurface in the U.S. In March, it announced that it’s working with groups like the World Health Organization and the Centers for Disease Control and Prevention to identify incorrect vaccine claims so that it can “take action” against the posters and limit the influence of their content.