KEYWORDS: biosafety, conduct of research, gain of function, science policy
ABSTRACT
Our firm conducted a risk/benefit assessment of “gain-of-function” research as part of the deliberative process following a U.S. moratorium on the research (U.S. Department of Health and Human Services, U.S. Government Gain-of-Function Deliberative Process and Research Funding Pause on Selected Gain-of-Function Research Involving Influenza, MERS, and SARS Viruses, 2014). Because significant data were missing, though theoretically acquirable, our biosafety assessment faced limitations, and we could provide only a relative, rather than absolute, measure of risk (Gryphon Scientific, LLC, Risk and Benefit Analysis of Gain of Function Research, 2016). Here, we argue that many of these types of missing data represent large and stunning gaps in our knowledge of biosafety, and that these missing data, once acquired via primary research efforts, would improve biosafety risk assessments and could be incorporated into biosafety practices to reduce the risk of accidents. Governments invest billions in biological research; at least a small fraction of this support is warranted to prevent biological accidents.
PERSPECTIVE
In 2014, the White House issued a moratorium on so-called gain-of-function research involving influenza virus and severe acute respiratory syndrome (SARS) and Middle East respiratory syndrome (MERS) coronaviruses and initiated a deliberative process, managed by the National Science Advisory Board for Biosecurity (NSABB), that recently concluded with the release of the White House Office of Science and Technology Policy’s (OSTP) Recommended Policy Guidance document (1, 2). In an earlier stage of the process, our firm was contracted by the National Institutes of Health to perform a risk/benefit assessment of this research, independently investigating its biosafety and biosecurity risks as well as the scientific and public health benefits it may bring. As part of our biosafety risk assessment, we were asked to provide a measurement of the absolute risk of a biosafety incident. However, because significant data were missing, though theoretically acquirable, our work faced limitations, and we could provide only a relative, instead of absolute, measure of risk (3). Many of these types of missing data represent large and stunning gaps in our knowledge of biosafety. These missing data, once acquired via relatively simple primary research efforts, would not only improve biosafety risk assessments but could also be immediately incorporated into biosafety practices to reduce the risk of accidents. Governments invest billions of dollars to support biological research with the purpose of improving the human condition; clearly, at least a small fraction of this support should be used to prevent the biological accidents that imperil that basic goal.
LACK OF HUMAN RELIABILITY DATA
To the best of our knowledge, Gryphon Scientific’s biosafety risk assessment was the first to comprehensively consider the probability and types of human error that play a role in laboratory accidents, and we identified human error as the dominant component of laboratory biosafety risk. This dominance is based on multiple factors. First, most plausible pathways to laboratory-acquired infection or accidental release require a human error to set them in motion. For example, spills and needlesticks begin with a person making a motor control mistake. Second, humans fail more frequently than equipment. For example, in our aerosol release scenarios, the median chance of an exhaust fan failing was estimated to be 1.5 orders of magnitude less likely than the chance of someone improperly responding to an airflow alarm should the fan fail (4). These relative probabilities align with common sense: most laboratory scientists have experienced a simple human error, such as a dropped plate or tube, many times more often than they have experienced a random mechanical failure. The last, but perhaps most important, contribution of human error to laboratory incidents comes after an incident has already begun: human errors after an exposure, whether due to ignorance, panic, or expediency, can drastically exacerbate the consequences of that event. A laboratory worker who immediately notifies appropriate personnel and follows proper health surveillance and isolation protocols after an exposure is significantly less likely to cause secondary infections, limiting the consequences of the incident.
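To illustrate why human error dominates pathways like the aerosol release scenario above, consider a toy event-tree calculation. All probabilities below are invented placeholders chosen only to reflect the ~1.5-order-of-magnitude gap described in the text; they are not figures from our assessment:

```python
# Toy event-tree sketch for an aerosol release pathway (illustrative numbers
# only, not values from our assessment). A release requires the exhaust fan
# to fail AND the worker to respond improperly to the resulting alarm.
p_fan_failure = 1e-4   # assumed chance the exhaust fan fails
p_bad_response = 3e-3  # assumed chance of an improper alarm response,
                       # ~1.5 orders of magnitude more likely than fan failure

# Both events must occur for a release
p_release = p_fan_failure * p_bad_response

# Because the human-error term is ~30x larger, reducing it through training
# or system redesign moves the overall risk far more than marginal gains in
# fan reliability could.
print(f"P(release) = {p_release:.2e}")  # prints P(release) = 3.00e-07
```

In a full event tree, each branch would itself decompose into equipment and human contributions; the point of the sketch is only that the joint probability is dominated by whichever factor is largest and most improvable.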
Despite the dominance of human error in determining risk, we could find limited data to support a quantitative estimate of human failure rates. Publicly available quantitative biosafety risk assessments done by others focused primarily on detailed measurements of equipment and mechanical failure rates, and either omitted a quantitative treatment of human error entirely (5, 6) or used a flat failure rate for all types of errors in all incidents (7). The available primary data were similarly limited, and we identified only a single source of human error measurements taken directly from biological laboratories (8). Due to these data limitations, we instead analogized the rate of human failures in laboratories to that of similar errors in other industries, primarily aerospace and nuclear power (9).
Indeed, in contrast to the life sciences, these other industries invest heavily in the study of human error. For decades, many technical industries have turned to formalized assessments of human performance and mistake rates, termed human reliability assessments (HRAs) (10, 11). These HRAs provide analytical frameworks to rigorously identify the type and frequency of the mistakes humans make, as well as the circumstances in which they are most likely to make them. Armed with the knowledge of how and when things may go wrong, risk and safety managers can work to redesign systems and update practices to best prevent common and/or serious mistakes before they occur. These investments are made even though the scale of potential consequences of an accident in the power and transportation industries is dwarfed by the consequences of a global infectious disease outbreak.
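As a concrete, much-simplified example of what an HRA provides, the widely used HEART method assigns each generic task a nominal human error probability and scales it by multipliers for the error-producing conditions present. The sketch below uses HEART's standard adjustment formula, but the nominal probability, multipliers, and weightings are invented for illustration, not calibrated data:

```python
# Simplified HEART-style human reliability sketch. The adjustment formula is
# HEART's; the nominal HEP and condition values below are assumptions made
# for illustration only.

def adjusted_hep(nominal_hep, conditions):
    """Scale a nominal human error probability (HEP) by each error-producing
    condition (EPC), given as (max_multiplier, proportion_of_effect) pairs,
    with proportion_of_effect in [0, 1]."""
    hep = nominal_hep
    for multiplier, proportion in conditions:
        hep *= (multiplier - 1.0) * proportion + 1.0
    return min(hep, 1.0)  # a probability cannot exceed 1

# Assumed example: a routine manipulation (nominal HEP 0.003) performed under
# mild time pressure (multiplier 11, weakly present) while fatigued
# (multiplier 1.2, strongly present).
hep = adjusted_hep(0.003, [(11, 0.2), (1.2, 0.9)])
print(f"adjusted HEP = {hep:.4f}")  # prints adjusted HEP = 0.0106
```

Armed with estimates like these, a safety manager can rank which conditions (time pressure, fatigue, unfamiliar procedures) most inflate error rates in the laboratory and target those first.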
LACK OF HISTORICAL BIOSAFETY INCIDENT DATA
Not only is scholarship lacking on how mistakes may be made; data are also not collected on how mistakes have been made. Despite a clear need for keeping these records, the United States has no standardized or comprehensive system for tracking laboratory incidents or near misses in high-containment laboratories. Astoundingly, we appear to lack even the most basic knowledge of how many high-containment laboratories are currently in operation (12). Although some partial reporting systems exist (13, 14), none of these systems are sufficiently standardized, complete, or of high enough quality and detail to provide usable statistics about the types and magnitudes of incidents that occur in biological laboratories. This lack of centralization and standardization hinders the spread of lessons learned and best practices between laboratories, likely condemning multiple laboratories to repeat the same mistakes.
Other industries have discovered that centralized tracking of incidents can significantly reduce risk. In 1974, TWA flight 514 crashed outside Washington’s Dulles airport, killing 92 people (15). During the investigation, it was discovered that a United Airlines flight had a near miss due to the same cause just weeks earlier, and although United had alerted its pilots to the danger, the warning did not spread industry-wide. In the wake of the accident, the Federal Aviation Administration (FAA) created, and has maintained ever since, a no-fault system for reporting aviation incidents and mistakes, no matter how minor. This reporting correlates with a decrease in risk: airline industry sources show a substantial and continuing decrease in accident rates from 1976 to today (today’s rate is about a third of the rate in the early 1970s) (16, 17). Similar measures have been undertaken in the nuclear power and chemical manufacturing sectors, greatly improving the ability to predict potential accidents and prevent them (18, 19). Although these industries have had decades of additional time and experience beyond that of the relatively young life science industry to advance their state of risk assessment and prevention, we believe that the life sciences should strive, over time, to meet the same standards those industries have pioneered. Also, although those industries are dominated by a few giant firms, the fact that the government and a handful of companies are responsible for the large majority of research on dangerous pathogens enables a similarly small number of influential players to effect significant change.
BENEFITS OF GATHERING THESE DATA
The combination of prospective human reliability primary data gathering to assess what might go wrong and historical incident record keeping to assess what did go wrong provides a powerful path to reducing biosafety risk. Using these data, laboratory safety practices can be improved, lowering the risk to the researchers and to their surrounding communities. In addition, training, equipment, and safety systems can be redesigned to prevent common mistakes before they happen.
Beyond these benefits in risk reduction, gathering data on human reliability and biosafety incidents will also support the development of absolute biosafety risk assessments, which is desirable for several reasons. First, biosafety levels can begin to be defined by the maximum risk of an experiment allowable under that safety level, instead of being defined by organism phenotype, ensuring that the riskiest experiments remain tightly protected without unnecessarily burdening others. Second, absolute numbers enable comparisons across industries and activities, which contextualizes the level of biosafety risk we are assuming. By making risk less subjective, policy debates can be more focused and targeted. In our experience with the gain-of-function debate, our inability to provide absolute numbers left unresolved questions about the baseline acceptable level of risk, distracting from the policy questions at hand.
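As a sketch of the first point, with absolute numbers in hand a containment level could be assigned as the least restrictive one that brings an experiment's risk under a fixed ceiling. Every number below (the reduction factors, the risk ceiling, the example baseline risks) is a hypothetical placeholder, not a proposed regulatory value:

```python
# Hypothetical risk-based containment assignment. REDUCTION models how much
# each containment level lowers the chance of a community release; CEILING is
# an assumed maximum tolerable risk. All values are invented placeholders.
REDUCTION = {"BSL-2": 1e-2, "BSL-3": 1e-4, "BSL-4": 1e-6}
CEILING = 1e-9  # assumed max tolerable P(community infection) per experiment

def required_containment(baseline_risk):
    """Return the least restrictive level whose residual risk meets CEILING."""
    for level in ("BSL-2", "BSL-3", "BSL-4"):
        if baseline_risk * REDUCTION[level] <= CEILING:
            return level
    return "not approvable without additional mitigations"

print(required_containment(1e-4))  # prints BSL-4
print(required_containment(1e-7))  # prints BSL-2
```

Under a scheme like this, the riskiest experiments land in the highest containment regardless of organism phenotype, while low-risk work avoids unnecessary burden, which is exactly the property phenotype-based definitions cannot guarantee.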
RECOMMENDATIONS
In sum, gathering these data is achievable and would provide immense benefits. As such, we believe significantly more funding is urgently needed to support three basic thrusts: (i) development of a national incident reporting system, (ii) primary research programs focused on HRAs, equipment failures, and decontamination efficiencies, and (iii) sharing of best practices. We believe the reporting system should be structured like the FAA’s database and should include not only consequential incidents but also reports of near misses, defined as situations that required a response but in which no infections or other consequences resulted. Consequential incidents are often rare outcomes of a series of common mistakes or failures. By reporting near misses that contain these same common mistakes, the mistakes can be identified and remedies put into place before the mistakes precipitate high-consequence events. The NSABB’s recommendations for overseeing gain-of-function research included, due to prompting by us and others, a recommendation to create such a database of incidents and near misses, and we strongly agree with that recommendation (20). OSTP’s guidance document in response to NSABB’s recommendations (2) does not address this data gap, and we fear momentum for the creation of the database is at risk of being lost.
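To make the near-miss definition above concrete, a standardized report in such a database might capture fields like the following. This is an illustrative sketch, not a proposed schema; every field name is our assumption:

```python
# Illustrative sketch of a standardized incident/near-miss record for a
# no-fault, FAA-style reporting database. Field names are assumptions, not
# a proposed schema.
from dataclasses import dataclass, field
from datetime import date

@dataclass
class LabIncidentReport:
    reported_on: date
    containment_level: str  # e.g., "BSL-3"
    agent: str              # pathogen involved
    near_miss: bool         # a response was required, but no infection or
                            # other consequence resulted
    root_errors: list = field(default_factory=list)  # contributing mistakes
    exposure_occurred: bool = False
    secondary_cases: int = 0

# A reportable near miss: a spill demanded a response, nothing resulted.
report = LabIncidentReport(
    reported_on=date(2017, 3, 1),
    containment_level="BSL-3",
    agent="influenza A (H5N1)",
    near_miss=True,
    root_errors=["spill during centrifuge unloading"],
)
```

Aggregated across laboratories, a field like root_errors is what would let analysts spot the common precursor mistakes before they align into a consequential event.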
Creation of such a database is not without challenges. Currently, many institutions have expressed reluctance to share information about biosafety incidents, due to fears of how that information may be shared by organizations beyond their control, for example, in the series of articles USA Today has published (21, 22). In addition, some institutions have hesitated to share data on incidents due to a lack of standard definitions of what qualifies as reportable, and no institution wants to be penalized as unsafe for the good deed of overreporting. Although daunting, these challenges can—and should—be overcome. One first step would be to identify the largest current disincentives to reporting and potential approaches to removing them. A second step would be building consensus for a complete definition of what should be reportable. We believe efforts to accomplish these steps should be led by the biosafety and research communities, similar to how these same communities led the efforts at Asilomar in 1975 to establish biosafety guidelines for work with recombinant DNA (23).
To address biosafety knowledge gaps, federal grant funds should be made available for primary research studies. Although the federal government does fund biosafety education and training programs, there appears to be little spending on quantitative studies of laboratory biosafety practices and accidents, despite a National Institute of Allergy and Infectious Diseases (NIAID) operating budget of $4.7 billion in fiscal year 2016 (24). A new annual allocation of a mere 1/10th of 1% of that budget (roughly $5 million per year) set aside for biosafety research and tracking could make significant strides toward covering these gaps and greatly increase our ability to predict and prevent accidents. The funding for investigations of accidents and research into their causes by agencies whose primary mission is safety in the transportation sector (nearly $1 billion, via the National Transportation Safety Board [NTSB] and National Highway Traffic Safety Administration [NHTSA]) and the nuclear and chemical industries (more than $1 billion, via the Nuclear Regulatory Commission [NRC] and Chemical Safety Board [CSB]) demonstrates that the United States has already recognized the importance of these types of studies. In addition to studies investigating human reliability, applied, laboratory-based studies should also be conducted to gather additional data about failures of mechanical systems or decontamination protocols. Given the recent widespread attention on failures of inactivation protocols (25, 26), learning how these protocols fail will also enhance safety and our ability to ascertain risk. Finally, although failure rates for many types of personal protective equipment (PPE) are known, many kinds of data are still lacking. Primary research in this area would be straightforward and would further improve biosafety risk assessments.
Although incident recordkeeping and human reliability are the two largest and most urgent areas, other areas of biosafety research have also suffered from a lack of funding. In our experience visiting laboratories undertaking gain-of-function research, we noted that some institutions maintained a strong safety culture that likely played a significant role in reducing the risk of an accident in these labs. Yet how to create this culture, and the effect it has on overall safety, remain understudied. In addition, we noted that some labs put into place unique best practices, such as maintaining a separate entrance at the designated local clinic for potentially exposed laboratory personnel to limit public exposure, yet these practices were not widely known or shared within the biosafety community. Funding for the collection and dissemination of these best practices is sorely needed, would be relatively modest, and would have a nearly immediate return on investment.
Federal, state, and local governments, educational institutions, and industry participants do commit significant resources to biosafety and biorisk management, through biosafety officers, institutional biosafety committees, the Biosafety in Microbiological and Biomedical Laboratories manual (27), and a host of risk assessment practices, policies, and reporting requirements. These efforts play a critical role in ensuring that biological research is conducted safely. However, despite these efforts, large gaps in our knowledge on how accidents did and could occur in laboratories still exist, and this fact is surprising and inexcusable, given that an accident in a biological laboratory, while already extremely unlikely under today’s safety precautions, could lead to a global infectious disease outbreak that kills more people than all aviation and industrial accidents combined. Given the vast gaps in knowledge that exist, a significant return on investment could be expected in terms of reduced biosafety risk in the near term, making this one of the safest research investments the federal government could make. Transforming biosafety into a quantitative practice would ensure that our research enterprise can produce the cures and treatments to improve our quality of life without a single accident vitiating millions of hours of toil at the bench. It is time to make that transition a reality.
REFERENCES
- 1. US Department of Health and Human Services. 17 October 2014. U.S. Government gain-of-function deliberative process and research funding pause on selected gain-of-function research involving influenza, MERS, and SARS viruses. US Department of Health and Human Services, Washington, DC. https://www.phe.gov/s3/dualuse/documents/gain-of-function.pdf.
- 2. White House Office of Science and Technology Policy. 9 January 2017. Recommended policy guidance for potential pandemic pathogen care and oversight (P3CO). White House Office of Science and Technology Policy, Washington, DC. https://www.whitehouse.gov/blog/2017/01/09/recommended-policy-guidance-potential-pandemic-pathogen-care-and-oversight.
- 3. Ritterson R, Kazmierczak M, Isbell M, Hulme-Lowe C, Lauer E, Finnegan M, Krem H, Cerles A, Venugopalan G, Handoko R, Chu J, Fields D, Casagrande R. 2016. Supporting an estimate of absolute risk, p 161–164. In Risk and benefit analysis of gain of function research. Final report, April 2016. Gryphon Scientific, LLC, Takoma Park, MD. http://www.gryphonscientific.com/wp-content/uploads/2016/04/Risk-and-Benefit-Analysis-of-Gain-of-Function-Research-Final-Report.pdf.
- 4. Ritterson R, Kazmierczak M, Isbell M, Hulme-Lowe C, Lauer E, Finnegan M, Krem H, Cerles A, Venugopalan G, Handoko R, Chu J, Fields D, Casagrande R. 2016. Risk and benefit analysis of gain of function research, supplemental information for Chapter 6. Gryphon Scientific, LLC, Takoma Park, MD. http://www.gryphonscientific.com/wp-content/uploads/2015/12/Supporting-Information-Event-Tree-Details-and-Probabilities.pdf.
- 5. Fort Point Associates, Inc. 2013. BioSquare phase II supplemental final environmental impact report. Fort Point Associates, Inc., Boston, MA.
- 6. US Department of Energy. 2008. Final revised environmental assessment for the proposed construction and operation of a biosafety level 3 facility at Lawrence Livermore National Laboratory, Livermore, California. US Department of Energy publication no. DOE/EA-1442R. National Nuclear Security Administration, US Department of Energy, Washington, DC.
- 7. US Department of Homeland Security. 2012. Updated site-specific biosafety and biosecurity mitigation risk assessment, vol II. National Bio and Agro-Defense Facility, Science and Technology Directorate, US Department of Homeland Security, Washington, DC.
- 8. Nordgren LD, Gerberich SG, Alexander BH, Church TR, Bender JB, Ryan AD. 2014. Evaluation of risk and protective factors for work-related bite injuries to veterinary technicians certified in Minnesota. J Am Vet Med Assoc 245:434–440. doi:10.2460/javma.245.4.434.
- 9. Mauger P, Ritterson R, Casagrande R. 2016. Human reliability assessment in biological laboratories, p 505–511. In Risk and benefit analysis of gain of function research. Final report, April 2016. Gryphon Scientific, LLC, Takoma Park, MD. http://www.gryphonscientific.com/wp-content/uploads/2016/04/Risk-and-Benefit-Analysis-of-Gain-of-Function-Research-Final-Report.pdf.
- 10. Kirwan B. 1994. A guide to practical human reliability assessment. Taylor & Francis, London, United Kingdom.
- 11. Spurgin A. 2010. Human reliability assessment: theory and practice. CRC Press, Boca Raton, FL.
- 12. US Government Accountability Office. 2009. High-containment laboratories: national strategy for oversight is needed. US Government Accountability Office publication no. GAO-09-574. US Government Accountability Office, Washington, DC.
- 13. National Institutes of Health. 2016. NIH guidelines for research involving recombinant or synthetic nucleic acid molecules. Office of Science Policy, National Institutes of Health, Bethesda, MD.
- 14. Code of Federal Regulations. 2005. Title 7. Agriculture. Chapter III. Animal and Plant Health Inspection Service, Department of Agriculture. Part 331. Possession, use, and transfer of select agents and toxins. Part 331.19. Notification of theft, loss, or release. 7 CFR Part 331.19.
- 15. Reynard WD, Billings CE, Cheaney ES, Hardy R. 1986. The development of the NASA Aviation Safety Reporting System. NASA reference publication 1114. Scientific and Technical Information Branch, National Aeronautics and Space Administration, Washington, DC.
- 16. Airbus S.A.S. 2015. Commercial aviation accidents 1958–2014: a statistical analysis. Airbus S.A.S., Blagnac, France. http://asndata.aviation-safety.net/industry-reports/Airbus-Commercial-Aviation-Accidents-1958-2014.pdf.
- 17. Boeing Commercial Airplanes. 2016. Statistical summary of commercial jet airplane accidents: worldwide operations 1959–2015. Aviation Safety, Boeing Commercial Airplanes, Seattle, WA. http://www.boeing.com/resources/boeingdotcom/company/about_bca/pdf/statsum.pdf.
- 18. Barach P, Small SD. 2000. Reporting and preventing medical mishaps: lessons from non-medical near miss reporting systems. BMJ 320:759–763. doi:10.1136/bmj.320.7237.759.
- 19. Phimister JR, Oktem U, Kleindorfer PR, Kunreuther H. 2003. Near-miss incident management in the chemical process industry. Risk Anal 23:445–459. doi:10.1111/1539-6924.00326.
- 20. National Science Advisory Board for Biosecurity. 2016. Recommendations for the evaluation and oversight of proposed gain-of-function research. National Science Advisory Board for Biosecurity, Office of Biotechnology Activities, Office of Science Policy, National Institutes of Health, Bethesda, MD.
- 21. Young A. 30 June 2016. Hundreds of safety incidents with bioterror germs reported by secretive labs. USA Today, McLean, VA. http://www.usatoday.com/story/news/2016/06/30/lab-safety-transparency-report/86577070/.
- 22. Young A. 23 June 2016. CDC failed to disclose lab incidents with bioterror pathogens to Congress. USA Today, McLean, VA. http://www.usatoday.com/story/news/2016/06/23/undisclosed-cdc-lab-incidents/86305700/.
- 23. Berg P, Baltimore D, Brenner S, Roblin RO, Singer MF. 1975. Summary statement of the Asilomar Conference on Recombinant DNA Molecules. Proc Natl Acad Sci U S A 72:1981–1984. doi:10.1073/pnas.72.6.1981.
- 24. National Institute of Allergy and Infectious Diseases. 1 October 2016. NIAID budget data comparisons. National Institute of Allergy and Infectious Diseases, National Institutes of Health, Bethesda, MD. https://www.niaid.nih.gov/grants-contracts/niaid-budget-data-comparisons.
- 25. McCarthy M. 2014. Biosafety lapses prompt US CDC to shut labs and launch review. BMJ 349:g4615. doi:10.1136/bmj.g4615.
- 26. US Department of Defense. 2015. Review committee report: inadvertent shipment of live Bacillus anthracis spores by DoD. Committee for Comprehensive Review of DoD Laboratory Procedures, Processes, and Protocols Associated with Inactivating Bacillus anthracis Spores, US Department of Defense, The Pentagon, Arlington, VA. https://www.defense.gov/Portals/1/features/2015/0615_lab-stats/Review-Committee-Report-Final.pdf.
- 27. US Department of Health and Human Services. 2009. Biosafety in microbiological and biomedical laboratories, 5th ed. Public Health Service, Centers for Disease Control and Prevention, and National Institutes of Health, US Department of Health and Human Services, Washington, DC.