A passenger wears latex gloves as he travels on a MTR underground metro train during a Lunar New Year of the Rat public holiday in Hong Kong on January 27, 2020, as a preventative measure following a coronavirus outbreak which began in the Chinese city of Wuhan.
Photo by Anthony Wallace / AFP
January 30, 2020 7:04 PM EST

As a new form of coronavirus continues to infect a growing number of people around the world, medical professionals, scientists and tech giants are fighting the spread of another contagion — misinformation. Like a virus, it can be difficult to contain, and many working in medical and scientific fields are turning the very tools used to spread misinformation against it.

Though much misinformation spreads on platforms like Twitter and Facebook, researchers at the Johns Hopkins Center for Health Security, along with hundreds of other scientists and medical professionals studying the 2019 Novel Coronavirus, have been using social media to disseminate accurate information in real time, counter conspiracy theories and collaborate on research.

“Today, in this outbreak, we are sharing information almost to the second of its release,” says Crystal Watson, senior researcher and assistant professor at the Johns Hopkins Center for Health Security. “That allows a lot more collective thinking and decision making.”

Watson says that social media has made it possible for scientific information to be shared much more quickly. “In prior outbreaks before social media, often we had to wait for a publication in a journal, for example, to learn about some of what was going on,” she says.


Many working in scientific and medical fields noticed the spread of harmful misinformation at the beginning of the outbreak and began using their expertise to counter it. “I think there’s some [misinformation] that is intentionally harmful, either disseminating information about a false cure, for example, or spreading information that stigmatizes specific groups of people,” Watson says. “So it’s really important that we get on top of that and provide correct information and push it out as best we can.”

Misinformation on social media platforms like Facebook, Twitter and YouTube ranges from racially driven scapegoating to supposed cures for the virus. One inaccurate Facebook post, shared more than 500 times, falsely claimed that a vaccine exists for the new coronavirus. In fact, there are no vaccines for any of the seven types of coronavirus that humans are susceptible to, according to PolitiFact, quoting Amesh Adalja, senior scholar at the Johns Hopkins Center for Health Security.

Other false claims have involved inaccurate information about how to protect against the virus, including claims that a Chinese respiratory expert found that saline solution kills the virus, and that people should rinse their mouths out with it.

“That’s the risk that we run here when we deal with misinformation,” says Tara Kirk Sell, senior scholar and assistant professor at Johns Hopkins Center for Health Security. “It’s not just ‘oh, who cares what people are saying?’ If it undermines trust, then that’s a big problem.”

Sell has studied misinformation that spread after the 2014 outbreak of Ebola in West Africa. She says there are similarities between the misinformation spread during that outbreak and during the outbreak of the new coronavirus, known as 2019 Novel Coronavirus, which started in the city of Wuhan in central China. There are 8,236 total confirmed cases as of Thursday evening, most of them in China, and the Centers for Disease Control and Prevention (CDC) has confirmed five cases in the U.S. On Thursday, the World Health Organization (WHO) declared a public health emergency of international concern.

“There’s always overlays of politics,” Sell tells TIME. “Even though you think of [outbreaks] as health events, they’re an opportunity for some people to create discord and to cause people to become fearful and also to criticize different government actions.”

A spokesperson for Facebook tells TIME in an emailed statement that the company has partnered with third-party fact-checkers around the world to add warning labels to posts that contain false information and to promote articles that include fact-checked information. The company is also sending notifications to those who have already shared false content.

“This situation is fast-evolving and we will continue our outreach to global and regional health organizations to provide support and assistance,” the spokesperson said.

Representatives for Twitter and Google did not immediately respond to TIME’s request for comment, but a spokesperson for Twitter told The Washington Post that users searching for coronavirus on its platform were met with information from the CDC. Similarly, Google, which owns YouTube, is promoting content that contains accurate and verified information, according to The Post.

“It’s challenging because this information is being churned out very, very quickly,” says Antonia Ho, an infectious diseases physician and clinical senior lecturer at the University of Glasgow. “No one is an expert right now… Obviously, this Novel Coronavirus is so new that with all this information coming out, it takes a lot to control, and certainly misinformation may not be noticed until later on just because it takes time to verify.”

Still, Ho tells TIME, social media — Twitter in particular — has been a significant tool for scientists, who can counter misinformation with accurate information and research. The sharing of information and updates on the Novel Coronavirus by members of the science and medical communities has grown organically, and many scientists, doctors and other experts have accumulated thousands of followers. For example, Laurie Garrett, a Pulitzer Prize-winning author and expert on infectious diseases, tweets daily coronavirus updates and has a following of 44,000.

This sharing of information has also led to increased online interaction among those working on the virus. “There’s often months of delays when people do research… but now this is all coming out on Twitter, and in a way there is a self peer review,” Ho says.

“Scientists who work on this around the world are able to form collaborations and are having really interesting conversations. And there are sensible voices that are emerging. People that you would follow because you know that they’re the expert in so many things,” she adds.

WHO has also launched an initiative to counter misinformation known as the WHO Information Network for Epidemics (EPI-WIN). The initiative shares accurate, tailored information with targeted sectors impacted by the coronavirus, including health care; travel and tourism; business; and food and agriculture.

“The spread of misinformation has been challenging but WHO is prepared for this. While the organization is known for fighting epidemics, it’s also fighting ‘infodemics,'” a WHO spokesperson said in an emailed statement to TIME. “[EPI-WIN] allows the organization to cut through the ‘noise’ by rapidly sending information through existing and trusted sources to the public. It’s like an injection of information.”

Sell says that tech companies have some responsibility to combat misinformation, but that alone is not enough to stop the spread of falsehoods.

“Being able to talk freely and post freely — those things are important,” Sell says. “An appropriate tech response to dealing with misinformation is critical, but I don’t think it’s sufficient… We would rather ourselves be able to determine what’s true or not true.”

Write to Jasmine Aguilera at jasmine.aguilera@time.com.
