What Would a Climate-Conscious Facebook Look Like?

A version of this story first appeared in the Climate is Everything newsletter.

After a summer of devastating hurricanes, heat waves and wildfires, Facebook’s new measures to address climate misinformation leave something to be desired. In fact, you might be forgiven for thinking they were a joke.

In a blog post headlined “Tackling Climate Change Together,” Facebook said it would be adding quizzes to its climate information center and donating $1 million to organizations that fight climate misinformation, among other measures. Those pledges, activists and disinformation experts say, are piddling compared to the amount of climate misinformation and paid pro-fossil fuel advertising on the site. In 2020 alone, U.S. oil companies spent nearly $10 million on Facebook ads promoting the continued use of fossil fuels, according to an August report from InfluenceMap, a nonprofit watchdog group. Another report, from the nonprofit Friends of the Earth, showed how false narratives about renewable energy spread far and wide on social media sites like Facebook following the Texas blackouts in February. “The initiatives that [Facebook] took are far too little, far too late,” says Michael Khoo, Friends of the Earth’s disinformation spokesperson. “It’s missing the big picture problem.”

There are smart, reasonable people working at Facebook—people well aware of the dire climate situation and the shrinking window of time we have to avoid its worst effects. So why haven’t they taken more serious action? Is addressing climate misinformation on Facebook a matter of a few simple fixes—something a programming team could accomplish in a week? Or would it require a rethinking of everything Facebook does and is? For that matter, what would a climate-conscious Facebook even look like? Would we know it if we saw it?

I put those questions to an array of climate activists and social media scholars. They uniformly agreed that making such changes is well within Facebook’s power. “You’ve got the smartest kids in the room and the most money on the planet. How on Earth can you not figure out a solution?” says Khoo. “They literally created this problem, so there’s no one better positioned to deconstruct the problem.”

Indeed, some of the fixes researchers and activists are talking about sound relatively straightforward. Faye Holder, a program manager at InfluenceMap, says part of the fix is simply for Facebook to enforce its own advertising policies by clamping down harder on obvious falsehoods in paid posts—like claims that climate change is a hoax. Facebook could also expand those policies to cover advertisements from fossil fuel companies that portray oil and gas as clean energy sources. “Facebook isn’t including all of these claims as misleading or misinformation, but there’s definitely scope [for them] to,” says Holder. “That’s something they need to address and figure out where they fall on this.”

There’s also a broader problem: Facebook’s advertising-based business model is powered by engagement—its algorithm promotes whatever content keeps people hooked. That system makes Facebook the perfect breeding ground for conspiracies and disinformation of all sorts, including climate denialism, because that kind of content is some of the most engaging, says Danny Rogers, co-founder of the nonprofit Global Disinformation Index and an adjunct professor at New York University. Changing the algorithm would carry a real financial cost for the company, which reported $10 billion in profit in its most recent quarter. “Facebook makes money by luring users to the platform and keeping them on the platform,” says Rogers. “If people are spending less time on the platform, [Facebook] makes less money.”

What, then, might a responsible Facebook look like? It might be a company that decides it’s in its own best interest—if not the rest of society’s—to stop exploiting engagement to the maximum extent, so that conspiracies and climate disinformation can no longer spread like wildfire. It might arrive at that decision because of government regulation, or because of a restructuring that ends the near-total control of its founder, Mark Zuckerberg. It might even choose to become part of the solution, giving an extra boost to posts that demonstrate positive climate action.

There could be a trade-off, though: With less emphasis on engagement above all else, this new version of Facebook might seem a little less exciting from a user’s perspective, with less emotional intensity and fewer comment-thread screaming matches—more of a school field trip vibe than an MMA cage match. “At first glance that sounds disappointing, like a healthier diet sounds more boring compared to a gluttonous feast,” says Rogers. “But ultimately it’s not a bad thing.”

Write to Alejandro de la Garza at alejandro.delagarza@time.com