Gain-of-Function Research: Balancing Science and Security
Gain-of-function research can help science get ahead of potential threats—but is it too much of a risk?
At the height of the 2014–15 flu season, Andy Pekosz had to shut down his influenza virus research. The professor of Molecular Microbiology and Immunology was experimenting with the seasonal flu virus to understand how it might evolve to outsmart the vaccine. The premise was simple: Let a flu virus evolve in the lab until it acquires mutations to evade the vaccine—and then use that information to design better vaccines.
But in late 2014, the NIH froze funding for research that involved tinkering with influenza, MERS, and SARS after a research group created a version of a bat coronavirus that could infect mice and humans. The study sparked public outcry. Scientists claimed the research would lead to better vaccines and treatments for coronavirus, but concern that the virus could escape the lab and cause a pandemic trumped the potential benefits. (The study was ultimately allowed to continue and was published in 2015.)
This kind of experiment is known as “gain-of-function” research—a term coined in 2011 after two groups of scientists showed they could modify H5N1 bird flu so that it could spread between ferrets. The 2014 bat coronavirus study reignited the debate and spurred the U.S. government to impose a moratorium on the research. Around 20 projects, including Pekosz’s, were paused as the NIH spent the next three years debating how to make the research safer.
The moratorium was lifted in 2017 with the implementation of new guidelines specifying that research that increases the transmissibility of pathogens with high potential to cause a pandemic—such as coronaviruses, Ebola, and influenza—would be carefully evaluated for biosecurity risks. For Pekosz’s research, it was too late. The flu mutated before he could study it, rendering his work obsolete. Nature beat him to the punch.
Now, gain-of-function research is back in the news. Prompted by theories that a lab leak led to the COVID-19 pandemic, regulatory agencies are once again reevaluating gain-of-function research guidelines. In January, the National Science Advisory Board for Biosecurity released the first draft of the guidelines—which recommend greater oversight of more types of research—inspiring a debate between those whose research could be stymied and those who favor tighter restrictions.
Defining the Threat
A central issue in this debate is that scientists don’t always agree on what “gain of function” means. “The whole term is super vague,” says Gigi Kwik Gronvall, PhD, a senior scholar at the Johns Hopkins Center for Health Security and an associate professor in Environmental Health and Engineering. When the term was coined, “gain of function” meant making a virus more transmissible and dangerous.
But over time the term has expanded to cover many different types of work, not all of it worrisome. In October 2022, for example, Boston University researchers caused a stir with their research on the Omicron variant, which had acquired a couple dozen mutations and become less deadly but more transmissible, even among vaccinated people. They wanted to answer a basic biology question: Were these changes related to mutations in the spike protein?
To better understand the molecular changes behind Omicron’s virulence and transmissibility, the researchers inserted its spike protein into the original SARS-CoV-2 virus. The new lab-made virus—dubbed Omicron-S—was more virulent than the naturally occurring variant: No mice infected with Omicron died, but 80% of those infected with Omicron-S died. Since both versions had the same spike protein, this told researchers that something else must be responsible for the virulence.
Meanwhile, the original SARS-CoV-2 virus killed 100% of infected mice but did not evade immunity. Omicron-S did evade immunity—suggesting that the spike mutations played a role in Omicron’s transmissibility.
Public outcry ensued, with critics claiming the researchers were doing dangerous gain-of-function research. But Gronvall says the research was misinterpreted. The researchers didn’t make the virus more deadly than it was in nature; they combined two naturally occurring viruses and got one with qualities of both—and made important discoveries about the spike protein.
‘Reasonable Anticipation’
Of top concern in the proposed guidelines is a recommendation for additional review of research that is “reasonably anticipated to enhance the transmissibility and/or virulence of any pathogen.” The problem is that scientists are still working out which molecular and genetic factors make viruses more deadly, and more research is needed to understand how mutations affect disease severity. As we saw with the pandemic, each time a new variant arose, public health officials had to wait and see how it would affect the population.
“If you put me and five other researchers in a room and threw us a couple of case studies, we would probably come to different opinions,” says Michael Imperiale, PhD, professor of Microbiology and Immunology at the University of Michigan Medical School. Additionally, scientists don’t always know what to expect when they start an experiment. They have a hypothesis, test it, and get the results.
“If I was able to ‘reasonably anticipate’ what happened in the lab, I would have gotten my PhD in six months instead of four years,” says Gronvall.
The current guidelines only apply to a short list of pathogens with pandemic potential, but the new guidelines could apply to many more species. Imperiale says they could have far-reaching consequences, impacting distantly related fields like microbiome research or even synthetic biology, in which microbes—such as E. coli, which is used today to produce synthetic insulin—are engineered to have special functions.
The new guidelines also remove an exemption for vaccine development and surveillance. Many scientists worry that putting hurdles around this research could impede progress that keeps us safe. “Those are the critical elements of defense,” says MMI chair and Bloomberg Distinguished Professor Arturo Casadevall, MD, PhD. “If you add more bureaucratic regulation to the steps that are being done for societal defense, I don’t think that leaves us more secure.”
Casadevall also worries that putting the onus on researchers to “reasonably anticipate” whether a virus could become more virulent may stifle important new discoveries. He points to the Johnson & Johnson coronavirus vaccine, which was created by inserting genes for a spike protein into an adenovirus that had been made less infectious. The spike protein genes gave the adenovirus a new function: generating immunity against SARS-CoV-2. “This is the technical definition of gain of function,” says Casadevall. Had some scientists predicted that this research would make the adenovirus more dangerous, we might not have the vaccine today.
In February, he published a paper highlighting many advances that can be credited to gain-of-function research—including the treatment of antibiotic-resistant bacteria, bacterial bioremediation tools, and drought- and pest-resistant crops. At least two FDA approvals stem from imbuing viruses with new functions. One is a cancer therapy called T-VEC, a modified herpes simplex virus that infects and destroys tumor cells. The other is the Johnson & Johnson coronavirus vaccine.
Reconcilable Differences
Even at the School, opinions are divided on the draft guidelines. CHS director Tom Inglesby, MD, thinks the new guidelines are warranted. “The work isn’t forbidden,” he says. “But if you are going to make a more transmissible strain of Ebola, then you need to have the work reviewed by the U.S. government.” Inglesby thinks government oversight will be reasonable and will consider the benefit of research, such as on vaccines, meant to protect the public. Along with colleagues at Stanford and Harvard universities, Center deputy director Anita Cicero, JD, and fellow Jaspreet Pannu, MD, Inglesby signed a letter commending the new guidelines and recommending additional changes. Similarly, the American Society for Microbiology, an organization that represents the interests of scientists, wrote a statement of support for the new regulations.
On the other side of the debate are scientists like Pekosz who believe existing regulations suffice. He notes that the CDC sets biosafety levels for labs and requires extensive training and oversight for those designated BSL-3 (such as labs working with the pathogens that cause anthrax and tuberculosis) and BSL-4 (e.g., Marburg virus). In addition, universities have institutional review entities, or IREs, that decide whether potentially dangerous research is warranted. A member of the Johns Hopkins IRE, Pekosz has declined to approve research he found too risky. He thinks the public would be reassured if researchers and review boards did more to inform them about the processes in place to protect them.
In the weeks and months ahead, scientists and regulatory agencies will be considering the pros and cons and looking for reasonable ways to balance them. “On one hand, you want to protect society against an accidental outbreak or a deliberate act that causes great harm. But, on the other hand, that research enterprise is also society’s first line of defense,” says Casadevall.