
A New Lab-Made COVID-19 Virus Puts Gain-of-Function Research Under the Microscope


On October 14, a team of scientists at Boston University released a pre-print study reporting that they had created a version of SARS-CoV-2 combining features of two different existing strains in a way that boosted its virulence and transmissibility. Scientists and the public raised questions about the work, which refocused attention on such experiments and prompted the U.S. government to investigate whether the research followed protocols for these kinds of studies.

The concerns center on what are known as gain-of-function studies, in which viruses, bacteria, or other pathogens created in the lab, whether intentionally or not, end up with more virulent and disease-causing features than are found in nature. The controversy is especially fraught in the context of COVID-19, as questions about where the virus originated, whether it jumped from animals to people or was created by scientists studying earlier coronaviruses at the Wuhan Institute of Virology, remain unresolved.

Those questions continue to shadow experiments involving SARS-CoV-2 and heighten scrutiny of them, especially from government regulators; work that might have gone unremarked had it involved other viruses now draws intense attention, says a scientist who requested not to be on the record. In fact, lab studies that push a virus toward resistance to known drugs are requested by the U.S. Food and Drug Administration, since such work gives doctors and patients a clearer idea of how likely a virus is to become resistant to new therapies.


The B.U. scientists were trying to answer a different but related question: what makes Omicron better able to escape the protection provided by the immune system and vaccines? To find out, they created chimeric viruses that contained some genetic material from the original SARS-CoV-2 virus and some from the Omicron BA.1 strain, focusing on the virus’ key feature, the spike protein, which spurs the immune system into action. By comparing the altered viruses to the original version of SARS-CoV-2, they could determine whether mutations in Omicron’s spike region were responsible for making the virus resistant to vaccines, or whether other sections of the viral genome contributed to this escape.

In the process, however, the team created a version of the virus that proved 80% lethal in lab mice. That finding was reported in the pre-print study, which had not yet been peer-reviewed by other scientists, and the Daily Mail cited the result, raising alarms about a lab-created, highly lethal version of SARS-CoV-2. The work exposed unresolved questions about what gain-of-function research entails, how it should be regulated, and who bears responsibility for such studies.

What Boston University researchers actually did

These questions aren’t new, nor is the B.U. study the first to focus attention on them. Most experts support the need to conduct such studies, arguing that they are essential for understanding new pathogens, from SARS-CoV-2 to HIV. Others, however, feel such work is an unnecessarily dangerous way of getting those answers, and adamantly believe alternative strategies should be used.

In the U.S., the Department of Health and Human Services (HHS), which oversees the National Institutes of Health (NIH), the largest funder of biomedical research, calls such entities enhanced potential pandemic pathogens, and has guidelines for reviewing such studies before they are approved—but only if the work is funded using public monies from that specific federal department. If not, then oversight responsibilities are unclear. “The layer of HHS oversight is over HHS grants,” says Marc Lipsitch, professor of epidemiology at the Harvard T.H. Chan School of Public Health. Lipsitch is among a number of experts who have advocated for stronger review of such studies since concerns were raised by similar experiments with the influenza H5N1 strain in the 2010s that generated more virulent versions of the virus in the lab. “If the grant is from another federal department, there is no required oversight. If you use private funding, there is no oversight.”

In a statement provided to TIME, the agency said that the National Institute of Allergy and Infectious Diseases (NIAID), which is part of NIH, “did not review nor issue awards” for the experiment described in the B.U. pre-print study that has triggered the current discussion. The NIH is investigating whether indirect federal dollars were used in conducting the experiment, and if so, whether B.U. scientists failed to follow federal policies governing research into potentially dangerous pathogens.

For its part, Boston University said the experiment was performed using university funding, and that NIAID was acknowledged in the manuscript because of “tools and platforms that were used in this research; they did not fund this research directly. We believe that funding streams for tools do not require an obligation to report.” B.U. also said in a statement that the research did not involve gain of function.

It’s a gray area, says Dr. David Ho, professor of microbiology and immunology and director of the Aaron Diamond AIDS Research Center at Columbia University, and that’s part of the problem when it comes to deciding if anyone should be overseeing such work, and if so, who. “I think this [work] is borderline gain-of-function,” he says. “It does provide a valuable scientific contribution in that they showed that the virulence factor is outside the spike chain. That science is important.”

What does ‘gain-of-function’ mean?

The back and forth over whether the experiment involved gain-of-function work, and what role, if any, government health officials should have in overseeing it, reflects ambiguities surrounding this precarious kind of research that have remained unresolved for decades. Even with government-funded studies, scientists don’t have clear-cut instructions on exactly what constitutes gain-of-function research requiring additional scrutiny.

Would modifying viruses to understand which mutations made them more virulent, and more able to evade drugs and vaccines, fall into this category? Virus experts do such work routinely, says Ho, and he himself has conducted such experiments for years with HIV, as well as with SARS-CoV-2. What’s more, these new versions of viruses and bacteria are constantly being created by nature as well, in response to natural selection pressures. That’s why scientists mutate viruses like SARS-CoV-2 to understand which changes the virus might develop next out in the real world, and what they would mean for existing vaccines and treatments. “These mutations are going to occur naturally,” says Ho. “We are trying to get ahead of it—that’s just routine for many virus studies. The problem right now is there is a lack of clear guidelines, both from the government and from the [scientific] journals.”

That lack of clarity is confusing researchers and hindering understanding of SARS-CoV-2, says Ho. His lab has scaled back some of its experiments exploring how the virus becomes resistant to existing vaccines and therapies out of concern that they might fall into the category of gain-of-function research. “Science is being slowed down a little bit because of these concerns,” he says. “In the lab, we are selecting for viruses with drug resistance and antibody resistance, and from my HIV days, these studies are all routine and in fact required by the FDA. How will you generate the next generation of drugs or antibody therapies if you don’t know which mutations contribute to resistance? To me, a lot of these studies are not gain-of-function, they are relevant studies to advance our knowledge of the virus to guide us to the next generation of therapeutics.”

The FDA’s requirement that scientists demonstrate what it might take for viruses to become resistant to drugs sits uneasily alongside what the HHS considers enhanced potential pandemic pathogens. HHS deems these to be “bacteria, viruses, and other microorganisms that are likely highly transmissible and capable of wide, uncontrollable spread in human populations and highly virulent, making them likely to cause significant morbidity and/or mortality in humans,” according to a fact sheet on the agency’s website. These include certain versions of the influenza virus capable of causing widespread disease, such as H5N1 and H7N9, as well as the original SARS and SARS-CoV-2 coronaviruses.

Boston University maintains that the version of the virus its scientists created at the university’s National Emerging Infectious Diseases Laboratories is actually less lethal, at 80%, than the original virus, which killed 100% of mice exposed to it at certain concentrations. The university also said its researchers were given permission to conduct the research by the university’s internal review board.

But Lipsitch says that such boards often don’t have the expertise to evaluate whether a study is likely to produce a potentially dangerous public health threat. “We’re all familiar with research that puts participants at risk, like vaccine trials,” he says. “But the idea of research that puts people who have no idea the risk is even happening, like people across the country who could get a virus if it spreads [from a lab] globally, that’s a relatively new phenomenon. And that’s why it’s so badly regulated, because we never really had to think about it before.”

Under the current system, the burden of alerting authorities, whether at a researcher’s own institution or at HHS, lies with the individual scientist. Ho says that if any of his research ended up creating something in the lab that was more virulent and potentially a threat to public health, he would inform his institution as well as the NIH and the Centers for Disease Control and Prevention, “regardless of whether the funding was coming from there. I think that’s what any responsible, diligent scientist would do.”

The problem is that the incentives aren’t necessarily aligned with sounding the alarm, since alerting authorities would almost certainly halt the research and could even trigger a wider review of the laboratory’s activities.

What experts say needs to change

In February 2022, the NIH and the White House Office of Science and Technology Policy asked the country’s biosafety board, the National Science Advisory Board for Biosecurity (NSABB), to review current policies regarding gain-of-function research, consider whether more oversight is needed even for studies that are not funded with government dollars, and issue recommendations by the end of the year.

The debate over how best to manage research on dangerous pathogens moved from government and academic circles into the public eye during the last major infectious disease epidemic, which involved influenza. In 2014, the White House Office of Science and Technology Policy issued a temporary ban on funding gain-of-function research involving flu and the MERS and SARS coronaviruses, halting 18 studies underway at the time. The moratorium stemmed from concerns over certain NIH-funded studies on H5N1 that could potentially create more virulent and even lethal versions of the virus, which could be devastating if they escaped and spread among the world’s population.

The ban was lifted by the NIH in December 2017, after the HHS issued new guidelines for reviewing such research, including the creation of an independent panel of experts to review any proposals for these types of studies submitted to the HHS. Those reviewers were tasked with considering whether such research was absolutely necessary, and whether there was a less risky alternative way to gain the same knowledge for both the scientists and society. Three research proposals have been approved under these guidelines: two involving influenza, for which the reviewers decided there was no alternative way to answer the scientific questions posed, and a third that was initially approved with additional security measures but was ultimately replaced by alternative studies not requiring the more stringent review.

Still, some experts feel more could be done to justify such studies, including being more transparent with the public about who is reviewing the experiments, what their comments are, and what the risks and benefits of the work would be. The NSABB’s recommendations are likely to reflect the recent worries over SARS-CoV-2’s origins and to attempt to provide clearer guidance for researchers interested in undertaking gain-of-function research. And based on how the scientific community has responded to previous biosecurity concerns, Ho says, the government is likely to lean toward requiring some type of review of all research that might lead to creating enhanced potential pandemic pathogens, even if the work is not paid for with public funds.

Better oversight is necessary, many in the field argue, because these types of studies remain essential in a world more easily threatened by virulent diseases. “I would not like to see a blanket ban on this kind of experiment, because we are learning things from it,” says Lipsitch. “I would like to see much more careful review of this type of experiment so we are doing them with the understanding of what the risks and benefits are.”
