Author manuscript; available in PMC: 2016 Nov 1.
Published in final edited form as: J Med Ethics. 2015 Aug 28;41(11):901–908. doi: 10.1136/medethics-2014-102619

The Ethics of Biosafety Considerations in Gain-of-Function Research Resulting in the Creation of Potential Pandemic Pathogens

Nicholas Grieg Evans 1, Marc Lipsitch 2, Meira Levinson 3
PMCID: PMC4623968  NIHMSID: NIHMS724467  PMID: 26320212

Abstract

This paper proposes an ethical framework for evaluating biosafety risks of gain-of-function (GOF) experiments that create novel strains of influenza expected to be virulent and transmissible in humans, so-called potential pandemic pathogens (PPP). Such research raises ethical concerns because of the risk that accidental release from a laboratory could lead to extensive or even global spread of a virulent pathogen. Biomedical research ethics has focused largely on human subjects research, while biosafety concerns about accidental infections, seen largely as a problem of occupational health, have been ignored. GOF/PPP research is an example of a small but important class of research where biosafety risks threaten public health, well beyond the small number of persons conducting the research.

We argue that bioethical principles that ordinarily apply only to human subjects research should also apply to research that threatens public health, even if, as in GOF/PPP studies, the research involves no human subjects. Specifically we highlight the Nuremberg Code’s requirements of “fruitful results for the good of society, unprocurable by other methods,” and proportionality of risk and humanitarian benefit, as broad ethical principles that recur in later documents on research ethics and should also apply to certain types of research not involving human subjects. We address several potential objections to this view, and conclude with recommendations for bringing these ethical considerations into policy development.

Introduction

The United States Government announced in October 2014 a formal “deliberative process” on the risks and benefits of certain “gain-of-function” (GOF) experiments that enhance the mammalian virulence or transmissibility of influenza, SARS and MERS viruses. During this process, government funding for such experiments was suspended. This pause and deliberative process are the latest major developments in a growing debate over what we and others have called the creation of potential pandemic pathogens (PPPs) [1]: pathogenic microbes that combine genetic novelty with likely or known high virulence and efficient transmissibility in humans [2]. GOF experiments, in which new traits are conferred on existing viruses, are a common and generally uncontroversial method of scientific inquiry; the creation of PPPs, however, is a particularly controversial class of GOF experiment.

The roots of this discussion go back more than a decade, with the controversial attempt to create reassortant influenza viruses that combined genetic material from highly virulent H5N1 avian influenza with material from seasonal human influenza [3], and the 2005 synthesis of the highly virulent 1918 “Spanish flu” pandemic influenza strain in the laboratory [4]. A few studies involving creation of novel, potentially transmissible influenza viruses were published with little fanfare in the years that followed [5,6]. The debate was rekindled [2,7–10], however, by two publications in 2012 describing the creation of ferret-transmissible strains of influenza [11,12].

Initially, the debate over the experiments creating ferret-transmissible viruses focused on whether the results of these experiments should be published, and if so whether detailed results should be included in the publication. The main concern was whether the publication of these results might enable malevolent individuals, groups or states to create dangerous viruses as weapons [10]. This was the latest chapter in a decade-long discussion in which governments around the world have attempted to wrestle with the dual-use dilemma, in which one and the same piece of scientific research may help or harm humanity [13]. GOF research in fact initiated the discussion of dual-use in the life sciences; the first biological research identified as dual-use was a gain-of-function study in which Australian virologists created a strain of the mousepox virus that killed many or all mice exposed to it, including mice genetically resistant to natural mousepox strains and vaccinated mice [14]. This mousepox research had applications in efforts to eradicate rabbit and mouse populations, significant agricultural pests in Australia, but could also be used to augment viruses that infect humans (such as the smallpox virus) as a tool in acts of bioterrorism.

The eventual decision to publish the 2012 ferret-transmission H5N1 papers in full represented a decision by US governmental authorities (which had jurisdiction both because they had funded the work and because one of the papers was published in Science, a US-based journal) that the benefits of publishing outweighed the biosecurity risks, although not all participants in the decision were satisfied that such concerns had been resolved adequately [15–17].

Simultaneous with the publications, however, a further concern was raised: that of biosafety [9]. Biosafety pertains to unintentional exposure to, or accidental release of, laboratory pathogens, a foreseeable potential consequence of conducting an experiment with dangerous pathogens [18]. Biosafety is appropriately seen for most kinds of research as a branch of occupational health, because lab accidents generally pose a threat only or primarily to those working in the lab where they occur. In experiments involving PPP creation, however, accidental release could lead to widespread transmission, affecting many more people than those exposed in a particular laboratory. Biosafety for PPP research, therefore, constitutes a problem not only of occupational health, but also of public health (Table 1).

Table 1. Matrix of Biosafety and Biosecurity Concerns

Columns: biosafety concerns about harm from accidents in the conduct of research. Rows: biosecurity concerns about harm from malevolent use of the results of research.

| Biosecurity \ Biosafety | No | Yes – occupational health | Yes – public health |
| No | Most scientific research | Laboratory studies of dangerous pathogens not considered suitable for malevolent use (e.g., Streptococcus spp.) | ? |
| Yes | IL-4 mousepox | Studies of anthrax and other “weaponizable” but nontransmissible pathogens | PPP experiments |

Biosafety concerns in laboratories studying dangerous pathogens gained salience in the summer of 2014 following three biosafety incidents in US Government laboratories: the accidental human exposure to anthrax [2,7,19] and inadvertent contamination of viral samples sent to another laboratory with H5N1 highly pathogenic avian influenza [2,7,20] at the Centers for Disease Control, and the discovery of unsecured samples of the smallpox virus in a freezer on the National Institutes of Health campus [21]. Although none of these involved PPPs, individuals and groups concerned about biosafety in PPP research pointed to these incidents as evidence of the potential for mishaps that could lead to accidental infections in even the most respected laboratories. Congressional hearings on these incidents, as well as pressure from scientists and others concerned about biosafety in PPP experiments, contributed to the US Government’s decision to undertake the funding pause and deliberative process on such experiments. Recent incidents involving Ebola virus at CDC and Burkholderia pseudomallei at Tulane University [22,23] have helped to sustain this concern.

Biosecurity versus Biosafety: Distinct Ethical Concerns

Conceptually, it is valuable to distinguish the biosecurity and biosafety aspects of dual-use. The biosecurity dimension of dual-use is concerned with malevolent, deliberate use of the results or products of particular research to harm humanity [18,24]. The biosafety dimension, by contrast, is concerned with accidental infections, caused by unintentional exposure to or accidental release of a pathogen during the conduct of such research (including, potentially, the future replication of experiments) [18]. In the case of PPP studies, the chief biosafety concern is the risk of an accidental epidemic.

As Table 1 shows, these aspects are related but not coextensive. The Australian mousepox study was considered a biosecurity risk because of the potential for malevolent actors to create a similar strain of smallpox that could be a bioweapon, but the experiments themselves, using a mouse virus that does not readily infect humans, posed no special biosafety concerns. On the other hand, it is difficult to think of an area of research that raises public-health biosafety concerns without some biosecurity dimension. PPP experiments, as we have seen, combine both concerns.

The intertwining of the two concerns is most acute in areas of science where expertise is relatively common and costs relatively low, both properties that increasingly apply to pathogen research and synthetic biology in general [25]. This is because in such areas, the results of an experiment – typically the concern of biosecurity – may facilitate the conduct of similar experiments in laboratories with less stringent biosafety controls than the original laboratory, leading to risk of accident in those other laboratories. Access control, physical security, and inventory maintenance are regulatory approaches that can contribute to both safety and security.

Despite this overlap, there are strong reasons to keep the evaluation of biosecurity concerns about the results of experiments conceptually distinct from the framework for evaluating biosafety concerns about the conduct of experiments. Evaluating biosecurity concerns involves a number of speculative empirical claims about the capabilities and intentions of malevolent actors that are difficult to evaluate without expertise and (often classified) specialist knowledge. On the other hand, biosafety risks pertain to the predictable actions (including inadvertent ones) of presumably well-intentioned, benevolent researchers. They are assessments of the risk posed by the scientists’ own actions, rather than of the risk that others will use the research findings in malevolent ways. Biosafety risks can also, at least in part, be estimated and evaluated within the context of everyday civilian biomedical research. While estimating the likelihood and magnitude of the risks of such research requires some specialist knowledge and is subject to debate, the discussion can be held using publicly available data and information; indeed, it already is [26].

Yet biosafety concerns in life sciences research, to date, have not been the subject of sustained bioethical analysis, precisely because few biological experiments have been thought to carry public health risks. In the US, biosafety is generally assumed to be covered by the CDC’s manual Biosafety in Microbiological and Biomedical Laboratories [27]. The principles developed in that manual treat biosafety largely as an issue of occupational health and individual risk to the workers in the laboratory. The most serious cases—category 4 agents—are those that are considered high risk to “the community” [27 p.xxxii] but these are, consistent with the manual’s remit, approached as an issue of laboratory design and standard operating procedures for experiments, rather than an ethical issue.

What is not discussed is the conduct of experiments as an ethical issue relating to the safety of distant others. For example, the principles for which agents require the highest level of biosafety read in part as follows:

Biosafety Level 4 is required for work with dangerous and exotic agents that pose a high individual risk of aerosol-transmitted laboratory infections and life-threatening disease that is frequently fatal, for which there are no vaccines or treatments, or a related agent with unknown risk of transmission [27].

This standard only obliquely addresses (“related agent with unknown risk of transmission”) the risk that an accident could pose to persons far removed from the laboratory by sparking long chains of transmission. In their guidance on the manipulation of highly pathogenic avian influenza (HPAI) viruses, the CDC recommends biosafety level 3 (BSL-3) or enhanced agricultural BSL-3 laboratory standards [27]. This category of containment, and indeed the guidance on HPAI viruses, however, focuses on occupational health and agricultural impact, while saying nothing about the public health implications of such research.

While perhaps appropriate for regulating the safety of research that entails risks to the scientists performing it, such principles are inadequate to address the public health risks from highly transmissible agents, including GOF/PPP research. The recent incidents at the CDC and NIH campus emphasize that the rate at which accidental exposures occur in highly sophisticated, high-containment laboratories is far from negligible [2,28]. This fact was already clear, though perhaps less widely appreciated, from publicly available data on the frequency of laboratory-acquired infections [29,30]. Such rates of accidents may be acceptable when the consequences are limited to the originally exposed individuals, yet the same rates may become unacceptable when the potential consequences include the extensive or even global spread of PPPs.

When biosafety risks from research in the life sciences affect public health, the debate over whether and how to perform such research is one of bioethics. Ethical reasoning about the balance of benefits and risks of scientific research has a history that is relevant to the biosafety aspects of dual-use research. The debate ultimately turns on weighing the rights of scientists against the risk of harm to the public; on deciding what values are important to promote in pursuing risky research; and on evaluating substantive claims about what societal goals are worth achieving with science’s help [31].

Nuremberg’s Legacy

Discussions of research ethics have focused on human subjects research as the dominant kind of activity requiring oversight and review. This focus reflects the historic origins of the effort to codify research ethics. Contemporary research ethics has its normative and historical foundations in the Nuremberg Code (hereafter, “the Code”), written in the aftermath of the atrocities committed by Nazi doctors on political prisoners during the Second World War, and their revelation during the Doctors’ trial that followed. The Code, adopted in 1947, is a cornerstone of bioethics globally. It has been augmented, but not replaced, by the 1954 statutes of the World Medical Assembly, later revised into the 1964 Declaration of Helsinki and its subsequent updates. In the United States, the Belmont Report defines the bioethical commitments governing scientific research involving human subjects; it draws from both the Code and the Helsinki Declaration.

The Code and its direct descendants were developed specifically in response to medical research atrocities with human subjects, and so they focus explicitly on the ethics of scientific experiments involving direct human participants—what has now become known as human subjects research. This is historically justified. From a contemporary perspective, too, research involving human subjects—particularly in medicine—typically carries more serious and immediate risks to humans than other forms of scientific research. Hence the first article in the Code begins with “The voluntary consent of the human subject is absolutely essential,” situating it as a document primarily about research involving direct human participants.

However, some of the ethical principles embodied in the Code and its successor documents are clearly generalizable to scientific research that poses risk to human beings who are not direct participants in the research itself. Two principles of particular relevance are stated most succinctly as the second and sixth articles of the Code:

2. The experiment should be such as to yield fruitful results for the good of society, unprocurable by other methods or means of study, and not random and unnecessary in nature.

6. The degree of risk to be taken should never exceed that determined by the humanitarian importance of the problem to be solved by the experiment [32].

These principles recur in other statements of human subjects research ethics; indeed, the Belmont Report’s explication of the principle of beneficence notes the importance of maximizing benefits, minimizing risks, and achieving humanitarian benefits for the research enterprise as a whole:

The obligations of beneficence affect both individual investigators and society at large, because they extend both to particular research projects and to the entire enterprise of research. In the case of particular projects, investigators and members of their institutions are obliged to give forethought to the maximization of benefits and the reduction of risk that might occur from the research investigation. In the case of scientific research in general, members of the larger society are obliged to recognize the longer term benefits and risks that may result from the improvement of knowledge and from the development of novel medical, psychotherapeutic, and social procedures [33].

Furthermore, one of the strongest recent statements of the requirement for beneficence and risk analysis in research comes from a document that is about work on dangerous pathogens, rather than human subjects research: the InterAcademy Panel on International Issues’ Statement on Biosecurity. It reads in part:

Scientists have an obligation to do no harm. They should always take into consideration the reasonably foreseeable consequences of their activities [34].

This statement was approved by 74 national academies of science, presumably demonstrating widespread agreement that beneficence and risk assessment extend to research not involving human subjects. Its implications, however, seem to have received little attention in research ethics.

GOF/PPP research is an unusual example of research not involving human participants, yet entailing substantial risk to human health and life. It seems reasonable to subject such research to the same ethical criteria that apply to human subjects research more conventionally understood: that it be done only if it benefits society, if the same benefits could not be procured through less risky means, and if the anticipated benefits exceed the anticipated risks. In this respect, we argue that the above norms of the Code ought to apply to GOF/PPP research, and justify a level of oversight and review above the status quo (i.e., biosafety review as an occupational health concern).

This kind of oversight is justified for two reasons.

  1. Risk of harm to large numbers of persons is incurred through the conduct of scientific research. One necessary outcome of GOF/PPP research is the creation of pathogens that combine genetic novelty with likely or known high virulence and efficient transmissibility in humans. According to current estimates, there is a 0.01%–0.1% risk per laboratory-year, or a 0.05%–0.06% risk per full-time worker-year (2,000 hours), that a novel, transmissible form of influenza virus will infect a laboratory worker, escape control, and cause a disease outbreak. Given historic estimates of influenza’s spread across the globe (~24–38% of the world’s population), it is estimated that a pandemic of a highly virulent influenza strain, such as those created in GOF/PPP experiments, could cause between 20 million and 1.6 billion fatalities [26,35].

  2. It is questionable whether those at risk will receive benefits from the research. In a world of mass transit, influenza can travel great distances rapidly. During the 2009 H1N1 influenza outbreak, for example, laboratory-confirmed cases of the virus were found in 73 countries in the first six months of 2009. The 2002–2003 SARS outbreak travelled to 26 countries. The benefits of scientific research—including vaccines, therapeutics, and disease surveillance infrastructure—travel relatively slowly, by contrast, and are confined by considerations such as a lack of global health governance, commercial priorities of pharmaceutical manufacturers, political unrest, and a lack of enabling resources (such as consistent power, or well-maintained roads) [36]. Given the already uncertain possible benefits of GOF/PPP studies, their tenuous actual benefits, and the unequal distribution of those possible or actual benefits, many of the persons placed at risk by the pursuit of GOF/PPP studies have little hope of seeing benefits from those studies.
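The arithmetic behind the first point can be made explicit. The following sketch is illustrative only: the escape-probability and fatality bounds are the ranges quoted above from [26,35], and treating the product of the paired bounds as bounds on expected harm is a simplifying assumption, not a claim from the original analyses.

```python
# Illustrative expected-harm calculation using the ranges quoted in the text.
# Assumption: multiplying matched bounds gives rough bounds on expected harm.

p_escape_low, p_escape_high = 0.0001, 0.001  # per laboratory-year (0.01%-0.1%)
deaths_low, deaths_high = 20e6, 1.6e9        # fatalities if a pandemic results

# Expected fatalities per laboratory-year =
#   P(escape causing an outbreak) x fatalities given a pandemic
expected_low = p_escape_low * deaths_low     # most optimistic pairing
expected_high = p_escape_high * deaths_high  # most pessimistic pairing

print(f"Expected fatalities per laboratory-year: "
      f"{expected_low:,.0f} to {expected_high:,.0f}")
```

Even the most optimistic pairing of these bounds yields an expected harm on the order of thousands of deaths per laboratory-year, which is the quantitative core of the claim that GOF/PPP biosafety is a public-health rather than a purely occupational-health concern.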

We believe that the sixth precept of the Code has particular force in light of this marked inequality in the distribution of risk and benefit through the conduct of GOF/PPP research. The risks of the research are global: a pandemic following the accidental release of a novel influenza strain might infect almost 40% of the global population [26]. Yet the major purported benefits of these studies, better influenza virology [37] and more effective vaccine strain selection [38,39], benefit a much smaller subset of the population, mainly for reasons of affordability. This is prima facie evidence that the humanitarian benefits of the study are much smaller than the risks. Thus, the burden of proof lies on those who support such research to demonstrate that GOF/PPP research provides sufficiently broad benefits, or sufficiently limited risks, to meet the norms that require the risks of research to be outweighed by its benefits. Such reasoning is also supported by the Belmont Report, which explicitly cautions that “research should not unduly involve persons from groups unlikely to be among the beneficiaries of subsequent applications of the research” [33].

Human Subjects Research as a Continuum

It is uncontroversial to claim that research involving human subjects must be justified by humanitarian benefits that cannot be achieved by alternative means, and that these benefits must outweigh the risks that arise in the conduct of experimentation. But the force of these claims does not rely on human subjects research being a special class of research. Rather, it is the creation of risks to human beings that drives the necessity of considering the humanitarian benefits of this research. Any scientific experiment that poses considerable risks to humans ought to be subject to an assessment of its humanitarian importance relative to the risks it poses, and of the necessity of the experiment in achieving its stated aims.

Ethical concern for the risks to nonparticipants arising from the conduct of scientific research already has some precedent. Exclusion criteria are a standard way that all clinical trials reduce the risk to participants. But in a small subset of clinical trials, an individual’s participation may create risks to others, such as those for whom the participant cares at home or professionally. Such studies include challenge studies with live pathogens, and immunization with live attenuated vaccines. In both cases there may be a risk of transmission to someone who is vulnerable due to immunodeficiency. It has been argued that the risk of harm creates an ethical obligation to exclude persons from such studies if their participation would produce risk to others, or, if the study design relies on a certain kind of participant, to monitor and mitigate the chance of an accidental exposure (including, potentially, forced isolation) [40].

These exclusion and isolation criteria appear in the literature on live attenuated vaccine studies and live-pathogen challenge studies. In a 2013 study of a smallpox vaccine, researchers based their exclusion criteria on CDC guidance that included the exclusion of individuals who live with someone who has eczema [41], because the household member, who is not a trial participant, might be placed at risk from vaccine virus shed by the participant. Another smallpox vaccine study excluded anyone who had household contact, sexual contact, or occupational exposure to pregnant women, immunosuppressed persons, persons with eczema, or infants less than 12 months of age [42]. An ongoing vaccine trial of live attenuated influenza A/H7N9 vaccine requires participants to be in an isolation unit for the duration of the study [43]. A clinical study that deliberately infected consenting participants with Listeria monocytogenes to study the natural history of infection excluded subjects who lived with immunosuppressed individuals, children under four years of age, or people with chronic skin disorders [44].

Given examples such as these, we argue that it is a conceptual and ethical mistake to maintain a hard, binary distinction between research involving direct human participants and research without them. Rather, human subjects research, and the ethical mandates that attend it, should be understood as lying on a continuum. Some human subjects research presents risks only to participants; other research poses risks to participants and nonparticipants alike. GOF/PPP research presents considerable risks to distant others, but not to participants (since there are no human participants). GOF/PPP studies that risk significant harm to large numbers of people lie on this continuum, and ought, as a result, to be subject to additional scrutiny and oversight.

As a point on the continuum of research requiring ethical review, we argue, GOF/PPP research should be subject to rigorous scrutiny of its risks and benefits. This scrutiny, moreover, should incorporate the two normative principles of the Nuremberg Code and subsequent research ethics documents: GOF/PPP experiments may be justified when they yield fruitful results for the good of society, unprocurable by other methods or means of study, and not random and unnecessary in nature; and when the degree of risk does not exceed that determined by the humanitarian importance of the problem to be solved.

Objections

There are some foreseeable objections to our account, which we now address.

First, it could be argued that the bioethics of the last half-century, and the Nuremberg Code in particular, respond to a large degree to egregious abuses by doctors of vulnerable and unwilling participants, and that this confines the ethical force of these principles to research that takes place in the context of relationships involving identified human participants, rather than broad risk to unknown persons. On this account, it would be inappropriate to apply the principles of the Code, or subsequent elaborations thereof, to risks to persons other than human participants in a study, where the specific relationship between researcher and participants is liable to abuse.

We believe this is an inappropriately narrow reading of the principles’ ethical scope. While the Code itself originated historically in response to particular abuses by physician-experimenters, its normative foundations and those of successor documents are now widely regarded to encompass human subjects research writ large. While there are differences between research within and outside of clinical care [45], human subjects review now extends to nonclinical human subjects research, including many categories of research in the social sciences involving no preexisting relationship between the researcher and the participants.

Further, subsequent documents acknowledge the broader relevance of benefits and harms to society. The Belmont Report, for example, explicitly deduces its human subjects prescriptions from principles it states are broadly applicable:

The expression “basic ethical principles” refers to those general judgments that serve as a basic justification for the many particular ethical prescriptions and evaluations of human actions. Three basic principles, among those generally accepted in our cultural tradition, are particularly relevant to the ethics of research involving human subjects: the principles of respect for persons, beneficence and justice [33].

As noted above, the Belmont Report also claims that research conducted using public funds should not unduly risk persons from groups that are unlikely to be among the beneficiaries of subsequent applications of the research [33]. This form of reasoning exemplifies our claim that accounts of the normative force of human subjects research ethics must draw on deeper principles that apply beyond human subjects research.

More specifically, as we have noted, ethical considerations are already cited to restrict inclusion or require participant isolation in human subjects studies, in order to prevent risk to distant persons (such as the patients cared for by a participant who is also a health care provider). For all these reasons, we doubt that widely accepted principles of human subjects research ethics could be justified without reference to an ethical concern for individuals that precedes any relationship they may have with the researcher.

Second, it could be objected that the historical and normative foundations of human subjects research are less pertinent to assessing the ethics of PPP/GOF research than are, say, the context and outcomes of the Asilomar Conferences around recombinant DNA, or the debate on the ethics of the Human Genome Project. Both Asilomar and the Human Genome Project have been recognized by the National Research Council as examples of scientific self-governance that predate the issue of dual-use, and demonstrate the competence the scientific community possesses in self-governance [13,46]. It is therefore appropriate to question our choice of cases in mounting our argument.

Scientific self-governance along the lines modeled at Asilomar is an attractive approach to ethical judgment about novel scientific pursuits, particularly within the scientific community. For example, the Cambridge Working Group Consensus Statement on the Creation of Potential Pandemic Pathogens (PPPs), a statement coauthored with eighteen other scholars, referenced the Asilomar process as a valuable starting point for a deliberative process on the risks and benefits of a class of scientific activity [7]. However, Asilomar cannot provide a complete model for the current situation, for three reasons. The first is that Asilomar had a different scope: it dealt only with the risk of accidental creation of virulent pathogens [13], not the deliberate creation of novel, pathogenic organisms like PPPs. Our present situation is different because we are considering not the possibility that augmented PPPs might result from research with other aims, but research that deliberately seeks to create such strains. Where Asilomar was concerned primarily with the unintended consequences of otherwise worthwhile research, the Nuremberg Code and its successors are concerned with outlining what kinds of research are worth pursuing in the first place. In this way, the mindset and method of Asilomar are not sufficient for our current task.

Second, the Asilomar process was responding to a collective concern about the risks of rDNA research, but was decidedly narrow in its approach to scientific governance. Asilomar was publicly discussed, but the deliberation that ensued prior to public announcement was largely confined to the scientific establishment, some journalists, and those lawyers present at the meeting in California. On our account, this is not sufficient when the potential harms apply to large groups of people. Scientists are necessary participants within a deliberative process, but are far from sufficient in adequately examining the values at stake when comparing the risks and benefits of GOF/PPP research.

Third, the debate about GOF/PPP is not about the potential of a new technology to help or harm humanity, as was the case at Asilomar. It is well established that GOF is a valuable methodology for scientific inquiry that can answer certain types of biological questions definitively. Unlike recombinant DNA research in 1974, GOF is not a methodology in need of validation. Rather, the current debate is about whether certain uses of this methodology (specifically, GOF research that creates PPPs) are justified, given the risk of harm to humans and the purported benefits relative to alternative approaches.

A final potential objection to our argument is that extending human subjects research ethics to GOF/PPP research risks excessively broadening the purview of research ethics oversight to otherwise benign life sciences research. Critics would note that there are, in principle, many kinds of life sciences research that could pose some risk to others, and that our conclusion might therefore entail additional scrutiny for any research whose conduct risks harm to others. All scientific progress entails some risk, but this ought not to subject all research to burdensome and unnecessary oversight.

In response, we point out that the majority of life sciences research poses only occupational risks to researchers. These risks are limited, affect consenting researchers, can be mitigated by adequate biosafety measures, and are likely to be outweighed by the merits of doing the research in the first place. It is very unlikely that such risks would be severe enough to be flagged for review under our model. Most life sciences research would therefore not be burdened by increased regulation or oversight.

Research that poses population-level risks, however, is of a different magnitude from this occupational risk, and exposes a much larger group to potential harm. We agree that the class of research that could be captured is larger than GOF/PPP research alone, and hence that we are proposing expanded scrutiny of some scientific research. We also argue, however, that such scrutiny is justified given the potential risks to public health. Recent examples include the use of RNA-guided gene drives for the alteration of wild populations [47] and the proposed moratorium on human germline editing using CRISPR/Cas9 technologies [48]. Given that genetic engineering could pose serious risks to distant others, including future generations, we think that the recommendations we present below, based on our preceding analysis, would be a useful form of review. We view this as a strength of our model: the normative principles set out in the Code and its successors, which we suggest are applicable to GOF/PPP research, may be applied—with suitable refinements—to future issues as they arise.

Recommendations

We urge national and international regulatory bodies to bear in mind the principles normally applied to human subjects research when evaluating the risks and benefits of GOF/PPP experimentation. To this end, we offer six recommendations for designing a risk-benefit assessment of PPP research:

  1. Consider the broadest possible set of risks and benefits. The Code, and the family of statements in which it resides, places an emphasis on the “good of society.” This good, however, should accommodate a nuanced range of risks and benefits posed by GOF studies involving the creation of PPPs. This includes, for example, how gains in disease surveillance made possible by PPP research, and unavailable through other means, might benefit—or fail to benefit—different jurisdictions in the context of a disease outbreak. The possible uneven distribution of risks and benefits across populations – in particular, risks borne by populations with little hope of benefitting from the research – may be of ethical concern in itself, a point we flag but do not further explore here.

  2. Consider different classes of experiments differently. Applying a single set of ethical principles to the risks and benefits of various categories of experiments will not necessarily yield the same judgment of each category’s ethical status. In particular, some categories of experiments included in the US Government’s funding pause may be both less risky and more beneficial than the enhancement of transmissibility of virulent, novel avian influenza strains that triggered the recent controversy. There will be some common elements in any risk-benefit assessment of PPP research. Nonetheless, such assessment should be individualized to account for differences between work on different viruses, for example, and between enhancing transmissibility and enhancing pathogenicity [49]. Recognition of such distinctions is exemplified by the removal, following the first NRC meeting on this topic, of coronavirus gain-of-virulence studies from the funding pause [50].

  3. Consider alternatives to experiments within a portfolio. The Code notes that research should be done only when necessary to achieve some good for society that is not otherwise procurable. If research is deemed sufficiently risky, the aims it would achieve should be evaluated against a series of alternative approaches that achieve, individually or in concert, the same types of scientific or practical benefits. There are some scientific questions that can be answered only by GOF/PPP experiments [37]. There is a longer discussion to be had about whether answering these specific questions is of great importance to the broad understanding of influenza biology, or whether, as one of us has argued, similar and comparably important scientific questions can be answered using different techniques, or similar techniques with less dangerous viral strains (e.g. strains for which there is significant population-wide immunity and/or strains with lower virulence) [2]. Whatever one’s views on the scientific merits of GOF/PPP work, the humanitarian goal of preventing and mitigating influenza may be approached by a large number of alternative paths, some of which may have considerable practical advantages over GOF/PPP studies [2]. These alternatives ought to be considered, and GOF/PPP research pursued only if there is no reasonable alternative.

  4. Commit to prudent stewardship of the life sciences by funding research on laboratory biosafety. We acknowledge that there is a lack of data on the overall efficacy of current laboratory biosafety standards. Indeed, the extent of underreporting has been emphasized in the academic literature [51], while the fuller accounting of biosafety incidents has been left to investigative journalists armed with freedom-of-information requests [52,53]. In light of this, and beyond the yearlong timeline of the current deliberative process, there should be a concerted effort to develop better standards for assessing and reporting biosafety in laboratories around the country and beyond. With better evidence, we can make more informed decisions about which studies to pursue—something that should satisfy all sides of the current debate. The life sciences demand exceptional rigor when studying the behavior of viruses in nature; we should demand concomitant rigor when studying the life sciences themselves.

  5. Develop normative approaches to biosafety as a public health issue to inform policies and procedures for evaluating research that may raise concerns. This paper focuses on GOF/PPP research as a particularly clear case of research that involves no human participants yet has a public health biosafety dimension. The ethical basis of our claims raises a number of philosophical questions about the scope of researchers’ and ethics reviewers’ responsibility to protect humans who are not participants in the research. This issue has received little treatment in the literature. It is also important to better define the boundaries of such concern, so as to avoid overly burdensome regulatory processes while reliably flagging research that raises real concern. The growth of synthetic biology may present many more challenging cases for assessing whether a public health risk exists, and what the nature and magnitude of that risk is. These are both normative and empirical questions.

  6. Generate international partnerships. GOF experiments are international, even if the locus of the current controversy is the United States. For example, in 2013 Chinese researchers constructed 127 reassortant viruses of H5N1 and H1N1 and tested them for sustained transmission in guinea pigs. In addition to the US [11,12] and China [54], funders of GOF research include the European Union [11], private and public funders in the UK [55,56], and the governments of Japan [56] and the Netherlands [11]. The impact of creating new viruses in the laboratory should thus be assessed in terms of the international proliferation of these techniques and novel strains. This, we believe, will be best accomplished by partnering with foreign and international groups to develop common, global standards and information-sharing mechanisms. Countries such as Germany and Australia have already begun incorporating ethical considerations for dual-use life sciences research into their national medical research frameworks [57,58]; engaging with these processes early will make international harmonization more comprehensive. (This would also enable better confidence-building mechanisms in line with the commitments the US has under the Biological and Toxin Weapons Convention.)

Building a Better Life Sciences

The current ethical prescriptions set out by Nuremberg and its successors have not imperiled the biomedical sciences: during the last sixty-seven years, we have seen incredible growth in medical research and improvements in medical care and longevity. Whether this occurred because of, or despite, the Nuremberg Code’s legacy is not a question that can be easily answered. What can be stated is that the development of medicine has surpassed society’s expectations.

Likewise, we have little reason to believe that subjecting GOF/PPP experiments to heightened scrutiny, based on the principles currently applied to human subjects research, will harm the life sciences. We do have reason, however, to weigh the merits of this small class of experiments against the risks they pose to human health. Incorporating the principles of human subjects research ethics that are of particular relevance to GOF/PPP studies will mean considering the benefits of this research in the context of its capacity to benefit or harm humanity, and requiring that those potential benefits outweigh the risk of harm when the research is considered within a portfolio of alternatives. The benefits of science are often pursued in the national interest [59], but the costs of a laboratory release of a novel PPP could be global. Ensuring that science’s means match its legitimate, ethical ends may require trading off a method presently attractive to a subset of the life sciences community for alternatives that offer safer routes to achieving our justified goals.

The legacy of Nuremberg teaches us that the epistemic norms of science and medicine ought to be moderated by ethical concerns. The life sciences revolution should change our world for the better; its means should be as desirable and ethical as its ends.

Acknowledgements

We thank Annette Rid and Nir Eyal for helpful discussions that informed this paper. This work was supported by Award Number U54GM088558 from the National Institute Of General Medical Sciences. The content is solely the responsibility of the authors and does not necessarily represent the official views of the National Institute Of General Medical Sciences or the National Institutes of Health.

Contributor Information

Nicholas Grieg Evans, Department of Medical Ethics and Health Policy, Perelman School of Medicine, University of Pennsylvania, Philadelphia, PA 19104 USA. evann@mail.med.upenn.edu.

Marc Lipsitch, Center for Communicable Disease Dynamics Harvard T.H. Chan School of Public Health, Boston, MA, USA.

Meira Levinson, Harvard Graduate School of Education, Cambridge, MA, USA.

References

  • 1. Klotz LC, Sylvester EJ. The unacceptable risks of a man-made pandemic. Bulletin of the Atomic Scientists. 2012. http://thebulletin.org/unacceptable-risks-man-made-pandemic (accessed 23 Apr 2015).
  • 2. Lipsitch M, Galvani AP. Ethical Alternatives to Experiments with Novel Potential Pandemic Pathogens. PLoS Medicine. 2014;11:e1001646. doi:10.1371/journal.pmed.1001646.
  • 3. Enserink M. Tiptoeing Around Pandora's Box. Science. 2004;305:594–595. doi:10.1126/science.305.5684.594.
  • 4. Tumpey TM. Characterization of the Reconstructed 1918 Spanish Influenza Pandemic Virus. Science. 2005;310:77–80. doi:10.1126/science.1119392.
  • 5. Maines TR, Chen L-M, Van Hoeven N, et al. Effect of receptor binding domain mutations on receptor binding and transmissibility of avian influenza H5N1 viruses. Virology. 2011;413:139–147. doi:10.1016/j.virol.2011.02.015.
  • 6. Sorrell EM, Wan H, Araya Y, et al. Minimal molecular constraints for respiratory droplet transmission of an avian-human H9N2 influenza A virus. Proceedings of the National Academy of Sciences. 2009;106:7565–7570. doi:10.1073/pnas.0900877106.
  • 7. Attaran A, Bloom B, Casadevall A, et al. Cambridge Working Group Consensus Statement on the Creation of Potential Pandemic Pathogens (PPPs). Cambridge Working Group; 2014. http://www.cambridgeworkinggroup.org/ (accessed 5 Nov 2014).
  • 8. Evans NG. Great expectations--ethics, avian flu and the value of progress. Journal of Medical Ethics. 2013;39:209–213. doi:10.1136/medethics-2012-100712.
  • 9. Lipsitch M, Plotkin JB, Simonsen L, et al. Evolution, Safety, and Highly Pathogenic Influenza Viruses. Science. 2012;336:1529–1531. doi:10.1126/science.1223204.
  • 10. Berns KI, Casadevall A, Cohen ML, et al. Adaptations of avian flu virus are a cause for concern. Science. 2012;335:660–661. doi:10.1126/science.1217994.
  • 11. Herfst S, Schrauwen EJA, Linster M, et al. Airborne Transmission of Influenza A/H5N1 Virus Between Ferrets. Science. 2012;336:1534–1541. doi:10.1126/science.1213362.
  • 12. Imai M, Watanabe T, Hatta M, et al. Experimental adaptation of an influenza H5 HA confers respiratory droplet transmission to a reassortant H5 HA/H1N1 virus in ferrets. Nature. 2012;486:420–428. doi:10.1038/nature10831.
  • 13. National Research Council. Biotechnology Research in an Age of Terrorism. National Academies Press; 2004.
  • 14. Selgelid MJ, Weir L. The mousepox experience. EMBO Reports. 2010;11:18–24. doi:10.1038/embor.2009.270.
  • 15. Greenfieldboyce N. First Of Controversial Bird Flu Studies Is Published. National Public Radio. 2012. http://www.npr.org/blogs/health/2012/05/09/151870671/first-of-controversial-bird-flu-studies-is-published (accessed 8 Feb 2015).
  • 16. Osterholm MT. Letter to Dr. Amy P. Patterson. 2012:1–7.
  • 17. Relman DA. The Increasingly Compelling Moral Responsibilities of Life Scientists. Hastings Center Report. 2013;43:34–35. doi:10.1002/hast.156.
  • 18. World Health Organization. Responsible life sciences research for global health security: a guidance document. Geneva: World Health Organization; 2010. http://whqlibdoc.who.int/hq/2010/WHO_HSE_GAR_BDP_2010.2_eng.pdf?ua=1.
  • 19. Centers for Disease Control and Prevention (CDC). Report on the Potential Exposure to Anthrax. Centers for Disease Control and Prevention; 2014.
  • 20. Centers for Disease Control and Prevention (CDC). Report on the Inadvertent Cross-Contamination and Shipment of a Laboratory Specimen with Influenza Virus H5N1. 2014. http://www.cdc.gov/about/pdf/lab-safety/investigationcdch5n1contaminationeventaugust15.pdf.
  • 21. Kaiser J. The catalyst. Science. 2014;345:1112–1115. doi:10.1126/science.345.6201.1112.
  • 22. Centers for Disease Control and Prevention (CDC). Report on the Potential Exposure to Ebola Virus. CDC; 2015.
  • 23. Young A. Senators seek answers in bioterror laboratory accident. USA Today. 2015. http://www.usatoday.com/story/news/2015/03/25/senators-seek-answers-tulane-lab-accident/70453068/ (accessed 2 May 2015).
  • 24. National Science Advisory Board for Biosecurity. Proposed Framework for the Oversight of Dual Use Life Sciences Research: Strategies for Minimizing the Potential Misuse of Research Information. Washington, DC: Department of Health and Human Services; 2007.
  • 25. Tucker JB. Could terrorists exploit synthetic biology? The New Atlantis. 2011:69–81.
  • 26. Lipsitch M, Inglesby TV. Moratorium on Research Intended To Create Novel Potential Pandemic Pathogens. mBio. 2014;5:e02366-14. doi:10.1128/mBio.02366-14.
  • 27. Centers for Disease Control and Prevention (CDC). Biosafety in Microbiological and Biomedical Laboratories. 5th ed. Bethesda, MD: Centers for Disease Control and Prevention; 2009. http://www.cdc.gov/biosafety/publications/bmbl5/BMBL.pdf.
  • 28. Evans NG, Selgelid MJ. Biosecurity and Open-Source Biology: The Promise and Peril of Distributed Synthetic Biological Technologies. Sci Eng Ethics. Published online first 24 Sep 2014. doi:10.1007/s11948-014-9591-3.
  • 29. Henkel RD, Miller T, Weyant RS. Monitoring select agent theft, loss and release reports in the United States, 2004–2010. Applied Biosafety. 2012;17:171–180.
  • 30. Rhodes K. High-Containment Biosafety Laboratories. US Government Accountability Office, testimony before the Subcommittee on Oversight and Investigations, House Committee on Energy and Commerce; 2007.
  • 31. Miller S, Selgelid MJ. Ethical and Philosophical Consideration of the Dual-Use Dilemma in the Biological Sciences. Dordrecht: Springer; 2008.
  • 32. Trials of War Criminals before the Nuernberg Military Tribunals under Control Council Law No. 10, Nuremberg, October 1946–April 1949. US Government Printing Office; 1949.
  • 33. Ryan KJ, Brady JC, Cooke RE, et al. The Belmont Report: Ethical Principles and Guidelines for Research Involving Human Subjects. Department of Health and Human Services; 1979. http://www.hhs.gov/ohrp/humansubjects/guidance/belmont.html#xbenefit (accessed 24 Apr 2015).
  • 34. The InterAcademy Panel on International Issues. IAP Statement on Biosecurity. IAP; 2005. http://www.interacademies.net/File.aspx?id=5401 (accessed 25 Nov 2014).
  • 35. Lipsitch M, Inglesby TV. Erratum for Lipsitch and Inglesby, Moratorium on Research Intended To Create Novel Potential Pandemic Pathogens. mBio. 2015;6:e02534-14. doi:10.1128/mBio.02366-14.
  • 36. Evans NG. Dual-use decision making: relational and positional issues. Monash Bioethics Review. 2014;32:268–283. doi:10.1007/s40592-015-0026-y.
  • 37. Casadevall A, Howard D, Imperiale MJ. An Epistemological Perspective on the Value of Gain-of-Function Experiments Involving Pathogens with Pandemic Potential. mBio. 2014;5:e01875-14. doi:10.1128/mBio.01875-14.
  • 38. Davis CT, Chen L-M, Pappas C, et al. Use of Highly Pathogenic Avian Influenza A(H5N1) Gain-Of-Function Studies for Molecular-Based Surveillance and Pandemic Preparedness. mBio. 2014;5:e02431-14. doi:10.1128/mBio.02431-14.
  • 39. Schultz-Cherry S, Webby RJ, Webster RG, et al. Influenza Gain-of-Function Experiments: Their Role in Vaccine Virus Recommendation and Pandemic Preparedness. mBio. 2014;5:e02430-14. doi:10.1128/mBio.02430-14.
  • 40. Miller FG, Grady C. The ethical challenge of infection-inducing challenge experiments. Clinical Infectious Diseases. 2001;33:1028–1033. doi:10.1086/322664.
  • 41. Rotz LD, Damon IK, Becher JA, et al. Vaccinia (smallpox) vaccine: recommendations of the Advisory Committee on Immunization Practices (ACIP), 2001. Morb Mortal Wkly …. 2001;22:1–25.
  • 42. Frey SE, Couch RB, Tacket CO, et al. Clinical Responses to Undiluted and Diluted Smallpox Vaccine. New England Journal of Medicine. 2002;346:1265–1274. doi:10.1056/NEJMoa020534.
  • 43. University of Rochester. Safety and Immunogenicity of a Live Attenuated H7N9 Influenza Virus Vaccine in Healthy Adults. ClinicalTrials.gov; 2014. https://clinicaltrials.gov/ct2/show/NCT01995695?term=NCT01995695&rank=1 (accessed 2 May 2015).
  • 44. Hohmann EL. Transcutaneous Immunization With an Attenuated Listeria Monocytogenes Vector Vaccine. ClinicalTrials.gov; 2012. https://clinicaltrials.gov/show/NCT01311817 (accessed 21 Apr 2015). doi:10.1016/j.vaccine.2013.05.028.
  • 45. Joffe S, Miller FG. Bench to Bedside: Mapping the Moral Terrain of Clinical Research. Hastings Center Report. 2008;38:30–42. doi:10.1353/hcr.2008.0019.
  • 46. National Research Council. Globalization, Biosecurity, and the Future of the Life Sciences. National Academies Press; 2006.
  • 47. Esvelt KM, Smidler AL, Catteruccia F, et al. Concerning RNA-guided gene drives for the alteration of wild populations. eLife. 2014;3:e03401. doi:10.7554/eLife.03401.
  • 48. Baltimore D, Berg P, Botchan M, et al. A prudent path forward for genomic engineering and germline gene modification. Science. 2015;348:36–38. doi:10.1126/science.aab1028.
  • 49. Lipsitch M, Inglesby TV. Moratorium on Research Intended to Create Novel Potential Pandemic Pathogens. mBio. doi:10.1128/mBio.02366-14.
  • 50. National Research Council, Institute of Medicine. Potential Risks and Benefits of Gain-of-Function Research: Summary of a Workshop. Washington, DC: National Academies Press; 2015.
  • 51. Kimman TG, Smit E, Klein MR. Evidence-Based Biosafety: a Review of the Principles and Effectiveness of Microbiological Containment Measures. Clinical Microbiology Reviews. 2008;21:403–425. doi:10.1128/CMR.00014-08.
  • 52. Young A. New lab incidents fuel fear, safety concerns in Congress. USA Today. 2014. http://www.usatoday.com/story/news/nation/2014/09/22/biolab-safety-incidents-lassa-fever-h7n9-burkholderia/15908753/ (accessed 22 Oct 2014).
  • 53. Sample I. Scientists condemn ‘crazy, dangerous’ creation of deadly airborne flu virus. The Guardian. 2014. http://www.theguardian.com/science/2014/jun/11/crazy-dangerous-creation-deadly-airborne-flu-virus (accessed 14 Jun 2014).
  • 54. Zhang Y, Zhang Q, Kong H, et al. H5N1 Hybrid Viruses Bearing 2009/H1N1 Virus Genes Transmit in Guinea Pigs by Respiratory Droplet. Science. 2013;340:1459–1463. doi:10.1126/science.1229455.
  • 55. Shelton H, Roberts KL, Molesti E, et al. Mutations in haemagglutinin that affect receptor binding and pH stability increase replication of a PR8 influenza virus with H5 HA in the upper respiratory tract of ferrets and may contribute to transmissibility. Journal of General Virology. 2013;94:1220–1229. doi:10.1099/vir.0.050526-0.
  • 56. Watanabe T, Zhong G, Russell CA, et al. Circulating Avian Influenza Viruses Closely Related to the 1918 Virus Have Pandemic Potential. Cell Host & Microbe. 2014;15. doi:10.1016/j.chom.2014.05.006.
  • 57. German Ethics Council. Biosecurity — Freedom and Responsibility of Research. Berlin: German Ethics Council; 2014.
  • 58. Anderson W. Research Newsletter. National Health and Medical Research Council of Australia; 2013.
  • 59. Bush V. Science, the Endless Frontier. Washington, DC: United States Office of Scientific Research and Development; 1945.
