EMBO Reports. 2015 Dec 18;17(2):127–130. doi: 10.15252/embr.201541674

How likely is it that biological agents will be used deliberately to cause widespread harm?

Policymakers and scientists need to take seriously the possibility that potential pandemic pathogens will be misused

Thomas V Inglesby 1,2, David A Relman 3

Abstract

The fact that biological weapons have never been used—at least in recent history—is not sufficient reason to dismiss concerns that terrorists or nations could acquire and use dangerous pathogens as weapons. The ongoing discussion about gain‐of‐function experiments should take this very real prospect more seriously.


Subject Categories: Microbiology, Virology & Host Pathogen Interaction; S&S: Health & Disease; S&S: Politics, Policy & Law


During the past few years, there has been substantial debate concerning the risks and benefits of certain experiments with pathogens—initially motivated by two publications in 2012 that described laboratory efforts to enhance the mammalian transmissibility of the avian H5N1 influenza virus. One of these two reports was particularly noteworthy because the experiments were designed to yield new viruses with a set of properties that together might confer pandemic potential, such as high transmissibility, high pathogenicity, and resistance to commonly available countermeasures. Not all research on pathogens generates such concerns; in fact, it is only a rare experiment that might lead to the creation of a novel pathogen with pandemic potential (PPP). The term “gain‐of‐function” has also been used to describe this realm of research, but it refers to a much broader range of widely accepted, non‐controversial research techniques and goals. For that reason, we think it should not be used in this discussion, and we instead refer to this work with the more precise term PPP.

To say that no one would now or ever use PPP for deliberate misuse is grossly irresponsible guesswork

Proponents of such research argue that it is necessary to understand the evolution of pathogens and mechanisms of pathogenesis and transmission and that this knowledge can help public health authorities, vaccine manufacturers, and governments prepare for potential epidemics. Those concerned about PPP argue that this work is not critical for vaccine development or disease surveillance and that the accidental release of PPP—owing to insufficient biosafety or biosecurity or to laboratory accidents—could cause major outbreaks or even a pandemic 1, 2. Much less has been said or written, however, about the danger that such pathogens or their genome sequences could be deliberately misused to cause harm.

While the reporting of accidents and the collection and sharing of this safety information could (and should) be improved, it is possible to calculate a baseline probability of accidental releases from laboratories that perform PPP‐related research, using data based on existing records and statistics about biosafety and laboratory accidents in the USA and elsewhere 1. Such calculations, for example, suggest at least a 0.2% chance of a laboratory‐acquired infection per BSL3 laboratory-year. A similarly quantitative risk assessment of the intentional misuse of PPP, however, is not possible. Such a calculation would require reliable, quantitative data on a variety of probability assessments: the probability that a person, group, or country intends to release PPP; that a person, group, or country has the means to obtain the pathogen or has the capacity to generate one from published data; and that a person, group, or country has the means of distributing a PPP in a way that would start an epidemic. Those kinds of data are not presently available, nor will they be in the foreseeable future. However, other kinds of assessments could and should be made, including the human and political motivations that might lead to the misuse of PPP, the weaknesses of security systems, the global distribution and quality of research capacity, and the availability of published research information. All of these could provide insight into the risk posed by the deliberate misuse of PPP.
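To make the scaling behind this baseline accident figure concrete, the following minimal sketch shows how a fixed per‐laboratory‐year probability compounds across laboratories and years, assuming independent laboratory‐years. The 0.2% figure is the per‐BSL3‐laboratory‐year risk cited above; the laboratory counts and the 10‐year horizon are purely illustrative assumptions, not data from this article.

# Minimal sketch (illustrative assumptions only): cumulative probability of at
# least one laboratory-acquired infection, assuming independent laboratory-years
# and a fixed per-lab-year risk. The 0.2% figure comes from the text; the
# laboratory counts and time horizon are hypothetical.

def prob_at_least_one_infection(p_per_lab_year: float, labs: int, years: int) -> float:
    """P(at least one event) = 1 - (1 - p)^(labs * years)."""
    return 1.0 - (1.0 - p_per_lab_year) ** (labs * years)

if __name__ == "__main__":
    p = 0.002  # 0.2% chance of a laboratory-acquired infection per BSL3 lab-year
    for labs in (10, 100, 1000):  # hypothetical numbers of active laboratories
        risk = prob_at_least_one_infection(p, labs, years=10)
        print(f"{labs:>4} labs over 10 years: {risk:.1%}")

With these assumptions, ten laboratories over a decade would already imply roughly a one‐in‐five chance of at least one laboratory‐acquired infection, which is why the growth in the number of laboratories doing such work, discussed later in this piece, matters so much.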

The fact that a technology has not been misused is an unreliable predictor of its potential future misuse…

How can we therefore assess the risk that individuals, groups, or countries will start a pandemic with a PPP either now or in the future? Some involved in this debate have argued that since there have been no known attempts to use pathogens to start pandemics in recent times, there is little risk of it occurring in the future. Throughout history, however, there are examples of periods in which the potential of a new technology to be used for harm was not seen, or was denied up until the moment it was used as a weapon. Such moments often occur during periods of political, economic, or social upheaval, especially as the technology proliferates and disseminates. Tanks, for example, were initially seen as having very specific limited uses in battle, until the purpose and technologies related to tank warfare changed substantially in light of the trench warfare in World War I. Chlorine and its derivatives were first used to bleach textiles and anesthetize patients—until combatants introduced the large‐scale use of chemical weapons on the battlefield during World War I, beginning with the deadly use of chlorine gas. Commercial airplanes were a boon for international travel and commerce, and hijackings were uncommon—until the late 1960s when scores of hijackings occurred. Modern terrorists, it was broadly stated, only wanted to frighten people, not kill large numbers of them—until terrorists hijacked airplanes and flew them into buildings, or blew them up in mid‐air. Today, some extremist groups seek to kill as many of their enemy as possible—the attacks of 9/11 and many since have shown that clearly. The fact that a technology has not been misused is an unreliable predictor of its potential future misuse. Similarly, past actions of a particular terrorist group do not dictate what it will do in the future.

If a terrorist group or country were to place a high enough value on obtaining a PPP strain, there would be a successful theft

Some might say that because experts cannot agree on the likelihood of a PPP being used in a terrorist attack, and because that likelihood might be small, the reasonable path would be to assume that it will not happen. The counter‐argument, which is the one we support, is that the lack of agreement is a sign of the great complexity and uncertainty surrounding these issues. Given the potential consequences, we should err on the side of caution. There is also a frequent presumption that other people, institutions, and countries will act as we do. This idea is known as the Rational Actor Model of behavior, and reliance on it had serious consequences when it shaped decisions made during the Second World War and the Cuban Missile Crisis 3. We must avoid this pitfall when assessing the risks posed by the deliberate use of PPP.

There are other possibilities as to why people might deliberately use PPP to cause harm. Scientists could conceivably be co‐opted to do things against their will because of extraordinary pressure or threat brought to bear. Alternatively, scientists could be convinced or seduced unwittingly to do things that aid and abet someone else whose ultimate purpose they did not appreciate or support.

If the potential consequences of PPP were not so serious, then speculation about the motivations of various actors around the world would be less important, as the penalty for being wrong would not be so great. But given the potential consequences of the misuse of PPP, it is critical to admit how much we do not and cannot know. The world is a huge, heterogeneous, complicated mix of cultures, motivations, drivers, and decisions. To say that no one would now or ever use PPP for deliberate misuse is grossly irresponsible guesswork.

Under what conditions might a person or group choose to start a pandemic with a PPP? The Islamic State and its affiliates use apocalyptic rhetoric and have seemingly few limits to their brutality, as the recent, horrific attacks in Paris demonstrated. The Islamic State has also sought to recruit scientists to help meet its ends. Paris is only the latest in a long list of recent examples of mass killings. Suicide bombers working for religious extremist groups have targeted places of worship, markets, and schools throughout the world. A lone, suicidal airplane pilot killed hundreds of people as collateral to his own suicide. As a thought exercise, would you give the Islamic State, or suicidal or homicidal people, access to guns? Would you give them access to a virus that was both lethal and transmissible? You probably answered no to both, because you think it at least conceivable that people in these situations could make terrible decisions that seem inconceivable to most of the world.

Are there any conditions under which a country might choose to start a pandemic with a PPP? It would seem improbable, given that the consequences could devastate that country itself as the pandemic spreads. However, there are reasons why a country might consider it. Countries that wish to have an insurance policy against invasion might threaten use of a PPP in retaliation, as some do now with nuclear weapons. Countries could also use the prospect of PPP to compel other countries to act in certain ways, or to levy demands or extort concessions, as some countries that possess nuclear weapons now do. If a country were to develop a vaccine that was effective against a particular PPP and so could protect its own population, then it might have a lower threshold for using PPP for harm. Countries could even plan to use PPP in the case of defeat, as the former Soviet Union planned to do during the Cold War 4. As a thought experiment, do you think it would be prudent to disseminate PPP laboratory strains to every nation in the world for their own national research programs? You probably do not; perhaps not even to those nations you might be inclined to trust. Part of your reasoning might be that you have concerns about laboratory safety. But you also probably have other concerns and uncertainties about the possible fate of those PPP strains. And yet because of reverse genetics, publishing PPP genome sequences in the public domain is in some ways the same as distributing the virus itself.

If the number of laboratories doing this work grows, the opportunity to divert and obtain PPP strains will grow too in addition to the risks associated with potential laboratory accidents

Another consideration is that the line between countries and terrorist groups is not always distinct. It is clear that some terrorist groups are supported by nation‐states and vice versa. And it is evident that some terrorist groups act as proxies for nation‐states. In addition, leading scientists working within a country might not be under the control of national authorities, as was the case in the history of nuclear weapons proliferation (www.fas.org/sgp/crs/nuke/RL34248.pdf). What might seem implausible when considering the intentions of one specific entity might become plausible when considering the connections and relationships between individuals, terrorist groups, and countries.

Would a person, group, or country intending to start a pandemic be able to obtain a PPP? Some laboratories working on pandemic strains have sophisticated security measures to prevent theft, which present a considerable challenge for anyone intent on stealing them. But if an entity with the means were committed to obtaining those strains, would those security plans be insurmountable? Could anyone working in those laboratories be convinced through some means—bribery, extortion, or disgruntlement, for example—to steal strains from the laboratory? Humans design and operate security systems, which means these systems have vulnerabilities. Items with high value—money, art, weapons designs, new technologies, financial information—are stolen all the time. If a terrorist group or country were to place a high enough value on obtaining a PPP strain, there would be a successful theft. In the past, when countries wanted access to new weapons or business technologies to give them an edge, they used insiders at appropriate locations to obtain that information. If PPP strains were seen as similarly valuable for their potential to do harm, then similar efforts would be likely. These efforts would be all the more successful if they targeted laboratories with lesser degrees of physical and operational security.

These risks will increase if PPP research continues and expands. For now, research on PPP strains is conducted in only a few laboratories. But given the attention and high‐impact publications that have followed this work, other laboratories will want to initiate similar research programs. Some will have capabilities in viral reverse genetics to re‐create viral PPP starting with only the genomic sequence data. Currently, there are no global standards to say who will or will not be allowed to do this kind of work. Calls to try to limit PPP experiments have already been rejected as a misguided effort to control new technologies. If the number of laboratories doing this work grows, the opportunity to divert and obtain PPP strains will grow too, in addition to the risks associated with potential laboratory accidents.

Clearly, the vast majority of life scientists are dedicated to the search for new knowledge that might benefit the planet, or to the pursuit of cures and vaccines. But this is not universally true. Some scientists may have a morbid curiosity to learn whether an alleged finding or a claim holds water. Some have infected others with pathogens from their own laboratories 5. Some have cheated and misled their colleagues with false data 6. Various countries have employed scientists to create weapons from pathogens, in some cases at large scale 7. Scientists have joined terrorist groups, as was the case with Yazid Sufaat, who worked for Al Qaeda (http://www.weeklystandard.com/al-qaedas-anthrax-scientist/article/16989). Our planning to cope with the risks of PPP—as well as for other potential future challenges in the life sciences—needs to acknowledge this.

Even if interested parties were not able to obtain PPP directly, there is still a risk that a person, group, or country with the intention of starting a pandemic could create such agents based on publicly available information. One of the fundamental building blocks of scientific research is its reproducibility: if an experiment cannot be reproduced, the results will be called into question. Scientists publishing their work in peer‐reviewed scientific journals are therefore required to describe their methods and experiments in sufficient detail to enable their colleagues to repeat them. Unless this requirement is changed in the special case of PPP research—and there is no indication at this point that it will be—any existing and future publications on PPP will contain sufficient information for recreating novel strains of pathogens that are potentially lethal and transmissible in humans.

Countries around the world have a right to know where this work is being done given the risks it poses to their populations

Only a small number of laboratories can perform such experimental work under appropriate biosafety conditions; but if safety were no longer a major concern, then the work could be carried out in a broader variety of laboratory settings. There are thousands of academic, government, and private science laboratories around the world. The 100 leading universities in the world for microbiology, based on their research record and reputations, are located in 20 different countries on five continents (http://www.usnews.com/education/best-global-universities/microbiology?page=10). Participants and winners in the International Genetically Engineered Machine (iGEM) competition come from all over the world. More than three dozen BSL4 laboratories existed or were being constructed as of 2011, located in 18 countries (http://fas.org/programs/bio/biosafetylevels.html). More than 1,300 registered BSL3 laboratories existed in the USA alone as of 2007 (http://www.gao.gov/new.items/d08108t.pdf). Not only is there an abundance of laboratories with the equipment and infrastructure needed to conduct PPP research, there is also no shortage of expertise and trained personnel. Employment in the US life sciences industry alone totaled 1.62 million in more than 73,000 companies in 2012 (http://www.nature.com/nbt/journal/v33/n1/full/nbt.3116.html). Another report noted that there were at least 500,000 life scientists in the EU (https://ec.europa.eu/research/infrastructures/pdf/enabling-science.pdf).

Some have commented that any unsanctioned work to create PPP will not go unnoticed and will eventually draw the attention of laboratory members or superiors or outsiders. But it is not always straightforward to know what kind of work is going on in a given laboratory, even from within the same institution. From a distance, it will be all the more challenging. The former Soviet Union had a massive bioweapons program for decades that was a complete mystery to the rest of the world.

Another critical factor is that, unlike the pathogens themselves, which may be limited to a single location or even destroyed at some point, their genome sequences and the information on how to genetically manipulate them will be publicly available, in perpetuity, from the moment they are published. We have to consider not only risks for the present, but also possible risks for the future. It is important not only to acknowledge the limits of our own ability to make predictions, but also to acknowledge that those predictions are often wrong.

Given these considerations, the only reasonable and safe approach to continuing PPP research is to adopt two planning assumptions: first, that there may be people, groups, or countries motivated to obtain PPP and either threaten to use, or in fact use, them to start a pandemic; and second, that the means will be available to obtain PPP strains if they are created, or to re‐create them from published information.

What should be done about this? Given these risks—and the risks of laboratory accidents discussed elsewhere—deliberate efforts to create new PPP should not be pursued unless a compelling case can be made that the benefits of a particular experiment outweigh the risks, including the risks of deliberate misuse. We do not yet see such a compelling case for PPP, but it is possible that one could emerge.

If a decision is made nonetheless to proceed with PPP research, then a range of steps should be taken to reduce the risk of misuse. First, the risk of deliberate misuse should be taken more seriously. There has been little debate on the risks of deliberate misuse since the discussions about gain‐of‐function (GOF) research, dual‐use research of concern (DURC), and PPP started a few years ago. This risk has often been dismissed with facile mischaracterizations such as “people in caves can't do this work”. The discussion clearly requires a far more insightful analysis than that, and any entity funding or authorizing PPP work should have the best possible expert assessment of these issues before proceeding. An expert assessment—far beyond the considerations raised in this commentary—would have to include a determination of the level of scientific training that would be necessary to re‐create these strains based on published information. This kind of assessment would necessarily include scientists who understand how this work was conducted in the original setting, as well as whether and how it could be conducted in a variety of other settings.

A full assessment of the risk would also include a serious analysis of the conditions under which people, groups, and countries might consider the deliberate use of PPP. That kind of assessment would logically include social scientists, political scientists, and historians who have studied how technologies to do harm have evolved, dispersed, and been used. It would also draw on the talents of those who study the psychological elements of modern terrorism.

There are very few pathogens in the world with the potential to generate large‐scale human‐to‐human transmissible epidemics. Laboratories working with those pathogens, particularly with PPP strains, should undergo exceptional external evaluations of safety and security. Countries around the world have a right to know where this work is being done given the risks it poses to their populations. They should not just learn about the work when it is published in a scientific journal. They should know about the work before it is started. International norms guiding research that could result in a pandemic do not now exist, but should be pursued (http://www.upmchealthsecurity.org/our-work/publications/synopsis-of-biological-safety-and-security-arrangements).

We are hopeful that the US government review process that is now underway—and others that follow it elsewhere in the world—will include a thorough assessment of the prospect of the deliberate misuse of PPP. We are also hopeful that policymakers in the USA and around the world will start to take the possibility of deliberate misuse more seriously as part of an overall calculation of the risks and benefits of this narrow but highly consequential area of work. In doing so, we hope they will consider whether, given the risks, it is defensible to continue supporting such research.

Conflict of interest

The authors declare that they have no conflict of interest.

References

1. Lipsitch M, Inglesby TV (2014) Moratorium on research intended to create novel potential pandemic pathogens. mBio 5: e02366–14
2. Duprex WP, Fouchier RA, Imperiale MJ, Lipsitch M, Relman DA (2015) Gain‐of‐function experiments: time for a real debate. Nat Rev Microbiol 13: 58–64
3. Allison G, Zelikow P (1999) Essence of Decision: Explaining the Cuban Missile Crisis, 2nd edn. New York: Pearson
4. Hoffman D (2009) The Dead Hand: The Untold Story of the Cold War Arms Race and Its Dangerous Legacy. New York, NY: Doubleday
5. Carus WS (2001) Bioterrorism and Biocrimes: The Illicit Use of Biological Agents Since 1900. Washington, DC: Center for Counterproliferation Research, National Defense University
6. Fang FC, Steen RG, Casadevall A (2012) Misconduct accounts for the majority of retracted scientific publications. Proc Natl Acad Sci USA 109: 17028–17033
7. Carus WS (2015) The history of biological weapons use: what we know and what we don't. Health Secur 13: 219–255
