Anxieties over the growing coronavirus pandemic are making people increasingly susceptible to misinformation, with conspiracy theories linking 5G wireless technology to COVID-19 gaining traction in recent weeks. The theories have been spread by celebrities, including Woody Harrelson and John Cusack, as well as lesser-known influencers and online trolls. And in the last 10 days, they have had real-world consequences, with at least 20 phone masts vandalized across the United Kingdom.
The unfounded conspiracy theories reportedly began when a Belgian doctor speculated to a national newspaper about 5G masts in Wuhan, China, where the new coronavirus originated. Although the article was removed within hours because the comments were baseless, the narrative was picked up by conspiratorial Internet personalities and has spread across the Internet as the coronavirus fans anxieties around the world. (Contrary to the conspiracy theories, which claim 5G's high-frequency waves are harmful, the scientific consensus is that 5G signals are non-ionizing, sitting well below the frequencies that can damage human cells.)
Experts on conspiracy theories say it’s understandable that misinformation like the baseless 5G theories would spread in such an uncertain time. “When there are periods of great anxiety about unknown events and unknown threats, people become more conspiracy minded,” says Quassim Cassam, a professor of philosophy at the University of Warwick and author of the book Conspiracy Theories. The original 5G conspiracy theories, which circulated in fringe areas of the Internet before the COVID-19 outbreak, centered on the idea that the new phone masts the technology requires are somehow causing health problems that governments are concealing.
When the coronavirus began to spread around the world in January, Cassam says, conspiracy theorists seized on the uncertainties surrounding the virus to push the baseless theory to wider audiences searching for information about COVID-19. “The classic form of a conspiracy theory is that there’s a small group of powerful individuals who are doing stuff behind the backs of the people and endangering our futures,” he says. “If you take that as a template, you see this 5G theory is basically the same theory being recycled. People were saying the same about 4G, just without coronavirus.”
But 5G conspiracy theories have spread rapidly, making them far more influential than those surrounding 4G — largely thanks to their piggybacking on coronavirus, which has driven unprecedented online traffic for both reputable and disreputable news organizations. From their genesis in an obscure corner of the Internet, the theories spread onto mainstream social networks like Facebook and YouTube, where they were picked up by prominent celebrities including Harrelson, Cusack and British television personality Amanda Holden, increasing their reach. They were then amplified by social media algorithms “that were smart enough to spot a viral trend but dumb enough not to notice the idiocy of its content,” according to Wired, a technology magazine.
Some researchers have also said the theories were boosted by state-backed inauthentic activity — in other words, fake accounts masquerading as real people — though no perpetrator has yet been identified. However, the state-funded Russian television station RT has been spreading 5G conspiracy theories since at least 2019, perhaps to slow the spread of the technology in the West and allow Russia to catch up, according to the New York Times.
Tech companies are now moving to crack down on 5G conspiracy theories, following pressure from governments — especially the U.K., which has borne the brunt of attacks on phone masts. The theories are “dangerous nonsense,” said U.K. cabinet minister Michael Gove on Saturday, as the British government prepared to meet with social networks to call for more stringent monitoring.
The companies are already taking matters into their own hands, with many issuing new policies related to COVID-19 misinformation. “Any content that disputes the existence or transmission of COVID-19 … is in violation of YouTube policies,” a YouTube spokesperson said in a statement to TIME on Thursday, adding that this meant such content would be removed. “This includes conspiracy theories which claim that the symptoms are caused by 5G,” they said. Facebook has made a similar commitment. “Under our existing policies against harmful misinformation, we are starting to remove false claims which link COVID-19 to 5G technology and could lead to physical harm,” said a Facebook spokesperson.
Twitter is taking a different approach, saying it “will not take enforcement action on every Tweet that contains incomplete or disputed information about COVID-19.” Instead, a spokesperson told TIME, Twitter is “prioritizing the removal of content when it has a call to action that could potentially cause harm.” Twitter has removed more than 1,100 tweets for sharing misinformation about COVID-19 — though many thousands more are still online.
Some critics say the tech companies’ quick response to countering COVID-19 misinformation — as opposed to their often-slow pace in tackling posts targeting minorities or spreading politically motivated fake news — reveals biases in the companies. “Companies’ COVID-19 response is showing us that they could have made different choices about allocation of resources and takedowns of content that leads to offline violence and harm a long time ago,” said Dia Kayyali, Tech and Advocacy coordinator at Witness, a human rights group.
Cassam, who has studied conspiracy theories for years, says governments leaning on tech companies to remove 5G misinformation could feed into conspiracy narratives. “Governments should be doing what they can to tell people that these theories are not true,” he says. “But once you start telling tech companies to take this stuff down, the conspiracy theorists would begin to think of that as further evidence in support of their theory. People are suspicious that censorship is an indication that there’s something to hide.”
Nevertheless, the alternative is leaving misinformation to spread online to potentially millions of people. Deplatforming, while occasionally controversial, tends to work. After YouTube, Facebook and Twitter banned well-known conspiracy theorist Alex Jones, for example, he moved to less popular sites where he was accepted, but where his reach was largely limited to people who actively sought him out.
Write to Billy Perrigo at billy.perrigo@time.com