Conspiracy theories, both powerful and enduring, can wreak havoc on society. In recent years, fringe ideas prompted a gunman to storm a Washington, D.C. pizzeria and may have motivated another to fatally shoot 11 worshippers inside a Pittsburgh synagogue. They are also largely to blame for a worldwide surge in measles cases that has sickened more people in the U.S. in the first half of 2019 than in any full year since 1994.
Now, the FBI says conspiracy theories “very likely” inspire domestic terrorists to commit criminal and sometimes violent acts and “very likely will emerge, spread and evolve” on internet platforms, according to an intelligence bulletin obtained by Yahoo News. The May 30 document from the FBI’s Phoenix field office—the first of its kind to examine the threat of conspiracy-driven extremists—also says the 2020 presidential election will likely fuel conspiracy theories, potentially motivating domestic extremists who subscribe to them.
“It’s increasingly becoming clear that lots and lots of people believe in them, and they have negative outcomes,” says Viren Swami, a social psychology professor at Anglia Ruskin University in the U.K., who has published several studies on conspiracy theories.
Millions of people all over the world—including, by one estimate, half of the U.S. population—believe in conspiracy theories. Today, that figure may be even higher, according to political scientists and psychologists who study the phenomenon. Since researchers have not tracked these trends over time, it’s difficult to determine whether the number of people who believe in conspiracy theories has risen over the years. But experts, and now the FBI, argue an average person’s exposure to them has certainly increased, in large part because conspiracy theories are now more easily disseminated on social media.
Among the most prominent peddlers of misinformation on social media, experts say, is President Donald Trump. Trump has repeatedly promoted falsehoods, using his personal Twitter account more than 100 times to voice doubts about the negative effects of climate change, contradicting an overwhelming consensus among scientists. Trump, who has more than 63 million Twitter followers, also spent years pushing the false narrative that former President Barack Obama was not born in America.
More recently, following Jeffrey Epstein’s apparent suicide in federal jail, Trump retweeted an uncorroborated theory that suggested the death of the well-connected financier, who was charged with sex trafficking of minors and conspiracy, was suspicious and somehow linked to former President Bill Clinton. When asked by reporters Tuesday whether he thinks the conspiracy theory he promoted is true, Trump said he has “no idea” but added that Clinton was a “very good friend” of Epstein who has been on Epstein’s private plane and perhaps to Epstein’s private Caribbean island, which locals reportedly dubbed “Pedophile Island.” Clinton spokesman Angel Urena called the claim “ridiculous and of course not true.”
“The chief conspiratorialist of the last 10 years is now the President of the United States,” says Harvard University researcher Joseph Vitriol, who studies political psychology. “Because of that, what we might be seeing is increased influence and pervasiveness of these beliefs.”
In 2016, when a gunman barged into the Washington, D.C. pizzeria, he had falsely believed children were trapped in a sex-trafficking ring led by Hillary Clinton—a fringe idea propagated by an anonymous user known as “Q,” who favors Trump. Thousands of people, including the actor Roseanne Barr, believe or acknowledge Q’s uncorroborated musings, which started on the controversial message board 4chan.
Platforms like YouTube and Facebook have also given life to conspiracy theories and allowed many to go viral. The children’s arcade chain Chuck E. Cheese was pressured to address and debunk an allegation that it resold leftover pizza slices in February after a YouTube star made the claim in a video, which now has more than 35 million views. The theory was mostly harmless but highlights how YouTube, which boasts more than 1 billion users, is part of the problem.
Critics argue the company’s opaque recommendation system—which chooses and automatically plays the user’s next video—often leads viewers down rabbit holes, pushing them toward questionable content they might not discover on their own. “It used to be a lot harder for things to go viral,” says Micah Schaffer, a technology policy consultant who crafted YouTube’s first policies when he worked for the company between 2006 and 2009. “Now, without any human intervention, you could have a machine that says a lot of people are watching this and put it on blast to a mass audience.”
The recommendation algorithms have been successful in keeping viewers on the site and luring them to watch more videos. Over 70% of the time people spend on YouTube is driven by recommendations, YouTube’s chief product officer Neal Mohan wrote in a 2018 Variety op-ed. And about 80% of YouTube users in the U.S. at least occasionally watch the videos suggested by the platform’s recommendation algorithm, a November 2018 Pew Research Center survey found.
In January, YouTube announced it would gradually start “reducing recommendations of borderline content and content that could misinform users in harmful ways.” Conspiracy theories were part of the targeted content, including “videos promoting a phony miracle cure for a serious illness, claiming the earth is flat, or making blatantly false claims about historic events like 9/11.” In June, YouTube said the number of views the harmful content gets from recommendations has dropped by over 50% in the U.S.
Facebook faced its own pressures to act in March after an Ohio teenager, who got vaccinated against his mother’s wishes, testified about the dangers of misinformation during a widely viewed Senate hearing. Ethan Lindenberger, 18, told lawmakers that his mother, an anti-vaccine advocate, mostly relied on Facebook for her information. Measles was declared eliminated from the U.S. in 2000, and public health officials have debunked the claim that vaccinations lead to autism. Yet since January, more than 1,180 measles cases have been confirmed in 30 states, the greatest number of cases reported in the U.S. in 25 years, according to the U.S. Centers for Disease Control and Prevention.
“Conspiracy theories are effective at doing the things that they do,” says Mike Wood, a lecturer at England’s University of Winchester, who specializes in the psychology of conspiracy theories. “They motivate people to take actions—to vote or to not vote, to vaccinate their kids or not to vaccinate their kids, to do all of these things that are important.”
Two days after the hearing, Facebook said it would do more to mute anti-vaxxers and scrub related ads. The company has also banned high-profile propagators of conspiracy theories, including Alex Jones, saying they promote or engage in violence or hate.
Vitriol says conspiracy theories are “tremendously problematic” because they undermine trust in institutions and change perceptions of what is real. “The further we deviate from an evidence-based understanding of reality, the less likely we’re able to deal with it,” he says.
So what is the best way to deal with conspiracy theorists, especially those who are not easily dissuaded? Researchers call it the million-dollar question. The first step is to avoid belittling them, Swami says. Diminishing deeply rooted beliefs may backfire, fueling propagators and their followers to shun mainstream explanations even more. “The problem with condemning conspiracy theories is that it plays into the conspiracy theorist’s mind,” he says. “It would entrench their beliefs.”
Instead, experts say it’s important to understand the science behind their mentality and the environment that fuels it. Conspiracy theories thrive in polarizing political climates, researchers say. According to Swami, they spring up when people who feel politically disenfranchised seek ways to explain what’s happening in the world. “Conspiracy theories don’t just emerge in a vacuum,” he says. “It simplifies events and gives you a sense of control of your life again.”
It may not always make sense for truth seekers to confront all conspiratorialists with factual evidence when trying to change their minds. (A 2017 study found people who believe in conspiracy theories may simply want to believe them.) But it’s still a worthwhile strategy, experts say. At least two experimental studies have shown that it works. According to the authors of a 2016 study published in the journal Frontiers in Psychology, pointing out the logical inconsistencies of conspiracy theories helped discredit them, especially if the person presenting the rational counterargument was perceived to be intelligent and competent.
The bottom line, experts say, is that ignoring conspiracy theories, over the longstanding fear of inflaming or promoting the ideas, is no longer an option. “The risk of doing nothing is that people who know nothing about the issue may adopt the conspiratorial account because there’s no alternative account,” Vitriol says. “If this was just a small portion of the general public in the darkest depths of the internet, perhaps you could get away with not doing anything. But it’s too mainstream now.”
“It’s too consequential for us not to deal with it,” Vitriol adds.