Abstract
Biosafety laboratory accidents are a normal part of laboratory science, but the frequency of such accidents is unclear due to current reporting standards and processes. To better understand accident reporting, a survey was created, with input from ABSA International, that included a series of questions about standards, requirements, and likely motivations for reporting or nonreporting. A total of 60 biosafety officers completed the survey. Respondents reported working with more than 5,000 people in laboratories, including more than 40 biosafety level 3 or animal biosafety level 3 laboratories, which work with higher-risk pathogens. Most of the respondents were located in the United States, Canada, or New Zealand, or did not identify their location. Notable results included that 97% of surveyed biosafety officers oversee laboratories that require the reporting of exposures to at least some pathogens. However, 63% relayed that the reports are not usually sent outside of the institution where they occurred. A slight majority (55%) stated that paper reports were used, with the rest reporting that they used a variety of computer systems. Even in laboratories that used paper-based reporting systems, 67% relayed that these reports were used alongside, or entered into, a digital system. While 82% of these biosafety officers agreed that workers understood the importance of reporting for their own safety, 82% also agreed that a variety of disincentives prevent laboratory workers from reporting incidents, including concerns about job loss and loss of funding.
Keywords: Laboratory safety, Laboratory acquired infections, Reporting, Legal requirements, Risk analysis
Introduction
Biosafety is a critical issue in biocontainment laboratories, and thankfully the standards and adherence to best practices have improved over the past several decades.1,2 Despite these improvements, concerns have recently been raised by some activists, such as Klotz,3 that “undetected or unreported laboratory-acquired infections” pose “clearly far too high a risk,” in part because of fears of unreported accidents. Existing analyses of such risks, including that of Klotz, include only reported events and publicly known incidents.4 Unreported accidents, however, would imply a greater frequency of incidents and a greater risk of undetected infections and dangerous consequences. It has been unclear whether unreported accidents pose a significant concern.
There are 3 ways in which an undercount of accidents is likely: (1) laboratory accidents may be fully undetected, which unfortunately cannot be captured in a survey; (2) accidents may go unreported by workers, despite being detected; and (3) laboratory accidents may be detected and reported but may be kept private and not shared publicly. The third possibility (private knowledge) is quantifiable, whereas the second (lack of reporting) is somewhat harder to quantify but can at least be qualitatively understood. Despite this, extant literature has not yet attempted to understand the scope of unreported events.
Methods
Survey Goals and Population
To address the gaps in current understanding of unreported laboratory accidents, a survey about incident recording and dissemination methods and possible nonreporting concerns was sent to the ABSA International biosafety email distribution list during the week of January 13, 2020.
To understand the lack of public records, the survey included questions about the process of reporting and the requirements for recordkeeping and reporting related to accidents and incidents in biological research laboratories. This included how accidents and incidents are recorded; what the recording requirements are; whether they are required to be reported externally, internally, or not at all; and how the reports are shared more broadly.
To better understand nonreporting, the survey asked about incentives that might affect what is and is not reported. This more qualitative portion asked whether there would be reason to suspect that accidents are not reported despite requirements to do so.
The population for the survey was the ABSA Biosafety Discussion mailing list, which is used primarily for discussions about biosafety issues. An earlier in-depth survey in 2016 included members of the mailing list and a variety of other people involved with ABSA (for instance, attendees at conferences), resulting in a total of 825 individuals, 712 of whom had current jobs involving responsibility for biosafety.5 Members of the mailing list and others surveyed in 2016 were largely located in the United States, likely reflecting the history of ABSA International, formerly known as the American Biological Safety Association.
Survey Design and Structure
The survey was designed in consultation with members of the ABSA board who provided feedback to ensure the question categories and response types were reasonable. A pdf of the full survey and a comma-separated values (CSV) file of the responses are available at www.liebertpub.com/doi/suppl/10.1089/hs.2021.0083.
The author sent the survey via an email invitation to the ABSA mailing list, and respondents completed it using Google Forms. The survey included an initial consent question, a final reconsent question, and 19 substantive survey questions. The substantive questions were divided into 4 sections, each with varying numbers of questions, as follows: (1) background information – location, laboratory types and sizes, and other comments; (2) reporting infrastructure – record system type, incident types, report recipients, comments; (3) incident recording and reporting practices and requirements – record requirements, potential exposure reporting, laboratory-acquired infection reporting, further dissemination, recipients, reasons for not sharing, comments; and (4) nonreported incidents – awareness, incentives, disincentives, comments on nonreporting, reconsent, and final comments.
Other than 2 respondents who elected not to have their responses recorded, all responses were included. Response timestamps were removed as potentially sensitive and not relevant for analysis; no other potentially identifiable information was included.
Ethical Considerations
The survey and the associated research were approved by the University of Haifa Ethics Committee Institutional Review Board (2209) under the project title “Eliciting Biosafety Experts on Lab Safety and Disclosures.” All respondents were asked to give their consent both at the beginning and at the end of the survey. Reporting methods suggested in the consent form filled out by respondents and those outlined in the ethics committee application and approval were followed.
Results
In total, 60 people completed the survey, of whom 59 were responsible for at least 1 biosafety level (BSL)-2 laboratory and 41 were responsible for at least 1 BSL-3 laboratory. No respondents listed BSL-4 laboratories, although as noted later in the Discussion section, this may be due in part to deliberate nonresponse.
Laboratory Sizes
The numbers of employees who have access to or use each type of laboratory were recorded in ranges, listed in the survey as 1 to 10, 10 to 25, 26 to 50, 50 to 100, and 100 or more. As shown in Figure 1, the majority of BSL-2 laboratories were large, whereas most BSL-3 laboratories were smaller.
Figure 1.
Number and size ranges of laboratories. Abbreviations: BSL-2, biosafety level 2; BSL-3, biosafety level 3.
Note that BSL-1 laboratories, such as those used in high school or undergraduate courses, often have no biosafety officer because human pathogens are not handled in such laboratories. BSL-1 laboratories were therefore not included in the survey. Despite this, 13 respondents listed BSL-1 laboratories at their facilities under “other,” in some cases along with other laboratory types. Specifically, in addition to the laboratory types included in the survey, 2 respondents listed plant BSL-3 laboratories, and several listed animal BSL-1 and BSL-2 laboratories.
Reporting Infrastructure
Over half (n = 33, 55%) of the 60 respondents listed multiple systems used for reporting exposures or accidents, which may overlap or may relate to different laboratories overseen by a single respondent. Even if they overlap, it is unclear whether all details of each report type are included in the other systems. Of the 33 respondents using multiple systems, several (n = 3, 9%) noted 3 reporting system types, while the remainder (n = 30, 91%) listed 2 types. Note that more systems for reporting may exist, but if they are of the same type, such as 2 different intranet reporting systems, they would have been captured as only 1 system type. Over half (n = 33, 55%) of respondents reported using paper incident report forms, handwritten reports, or similar, making paper the single most common system type, but most of those using paper reports (n = 22, 67%) also used computerized or online reporting systems. Of the 11 laboratories using only paper forms, 1 mentioned requiring notification via phone.
The digital reporting methods noted by all respondents included “spreadsheets, written computer-based reports, or similar documentation” (n = 21, 35%), which likely also includes email; “intranet or other online reporting system (eg, web-based portal or forms)” (n = 26, 43%); and “specialized software (eg, commercial software designed for reporting)” (n = 14, 23%).
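The overlapping counts above could, in principle, be recomputed from the released response data. The following is a minimal sketch, assuming a hypothetical column name and simplified keyword matching for the Google Forms CSV (the file in the supplementary materials should be checked for the exact question wording and answer labels):

```python
# Minimal sketch of tallying reporting-system types from the survey response CSV.
# The column name ("Reporting systems") and the keyword matching are hypothetical;
# the released CSV should be inspected for the actual wording.
import csv
from collections import Counter

SYSTEM_KEYWORDS = ["paper", "spreadsheet", "intranet", "specialized software"]

def tally_systems(path: str, column: str = "Reporting systems") -> None:
    type_counts = Counter()      # respondents selecting each system type
    per_respondent = Counter()   # respondents selecting 1, 2, 3, ... types
    with open(path, newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            answer = row.get(column, "").lower()
            selected = [k for k in SYSTEM_KEYWORDS if k in answer]
            type_counts.update(selected)
            per_respondent[len(selected)] += 1
    print("Respondents per system type:", dict(type_counts))
    print("Respondents by number of system types:", dict(per_respondent))

# Example (hypothetical file name): tally_systems("survey_responses.csv")
```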
Reporting methods did not differ significantly by laboratory size. Using the number of people with access to the BSL-2 laboratories as a proxy for overall size, Figure 2 illustrates the distributions and the similarity between the groups using each of the reporting methods mentioned, as well as those using only paper forms.
Figure 2.
Laboratory reporting methods by biosafety level 2 lab size. Abbreviation: BSL-2, biosafety level 2.
Reporting Requirements
Almost all (n = 58, 97%) of the respondents' laboratories require exposures to certain pathogens to be reported. Due to the wording of the question, however, the detailed data for this question may be misleading. For instance, of the 41 respondents with BSL-3 laboratories, most (n = 36, 88%) required reporting of exposures to risk group (RG)-3 pathogens, but few responded that their laboratories required reports of exposures to RG-4 pathogens. Of the 19 respondents without BSL-3 laboratories, about a third (n = 6, 32%) said they required reporting of exposures to RG-3 pathogens. The negative responses presumably indicate that the respondents have no such pathogens in their facility, rather than that they require reporting of exposures only to lower-risk pathogens; if such pathogens were present, exposures would presumably be reported.
A few (n = 6, 10%) respondents said that laboratory-acquired infections were not reported, but only 1 respondent said “other human injuries” were not reported. This implies that even most of the laboratories that do not report infections would report clinically relevant infections as injuries—although the single exception is worrying.
Respondents also volunteered a number of other event types that required reporting. Eight respondents entered text noting that they required reporting of near misses, although this seems to be an undercount given answers to later questions that asked about near miss reporting explicitly, discussed below. One respondent mentioned requiring reporting of “incidents involving deviation from federal requirements,” and 2 others mentioned reporting “any potential loss of containment.” Additional responses included events such as spills, chemical spills, and “incidents involving rDNA.” Finally, 3 respondents mentioned requiring reporting of “any incident,” but without specifying what qualifies as an incident.
Dissemination of Reports
Reporting of exposure often occurs, but what happens with the reports is a critical issue. Internal oversight is nearly universal (n = 58, 97%), with reports sent to some combination of a biosafety officer (n = 51, 85%), the “Institutional Biosafety Board or similar dedicated committee” (n = 35, 58%), or upper management (n = 35, 58%). One respondent did not report using any of these but explained that they reported internally using a different structure—both that respondent and a respondent who said they did not have internal oversight noted that they report what is legally required but did so only minimally. The first noted that “only those incidents which are mandatory to report are reported externally,” and the other said, “We report to whichever entity we MUST (sic) report to.”
External reporting was generally less common. Of the 19 respondents with no BSL-3 laboratories, only a few (n = 3, 16%) said they sent any reports to a national agency or occupational health agency, and none said they sent reports to a local or state health agency or authority. The 41 respondents with BSL-3 laboratories did so more often, but sending reports externally to either national (n = 14, 34%) or state or local agencies (n = 12, 29%) was still uncommon. While no question was asked on the topic, 2 respondents noted that they also report any injuries to their workers' compensation carrier.
Even when exposure reports are not sent externally, there may still be aggregate data on incidents. Over half (n = 32, 53%) of respondents said that reports were shared more broadly, but few noted whether this depended on the nature of the incident, which may indicate an undercount. Responses to subsequent questions about reporting of high-risk incidents are consistent with a higher rate of reporting. About two-thirds (n = 40, 67%) of the respondents said that either public or nonpublic aggregate data were compiled, although again, the unexpectedly low number may reflect respondents not realizing that the question included severe incidents reported in other ways.
Recordkeeping and Reporting Requirements
One obvious reason for recordkeeping and reporting is a legal or policy requirement to do so. The survey therefore included questions about what requirements exist for a variety of event types. Figures 3 and 4 summarize the respondents' requirements for recording and reporting various incidents.
Figure 3.
What record-keeping requirements exist for each of the following types of incidents?
Figure 4.
What type of incident reporting is required for each of the following types of incidents?
Required reporting is laudable, but for incident types other than high-consequence exposures and laboratory-acquired infections involving “select agents” (pathogens with the potential to pose a severe threat) in the United States, the survey shows that required records are generated but often not reported externally. Worse, even where legal requirements exist, it is clear from at least 1 known previous event that reporting is sometimes deferred or does not occur at all, as corroborated by the results in Figure 5.6
Figure 5.
Are you aware of nonreporting of exposures involving high-risk pathogens? (N = 60)
Additionally, the survey results reveal a disparity in legal requirements across laboratories. The reasons are complex, and likely due to a variety of causes. First, there are differences in local, state, and national government requirements. Second, there is uncertainty on the part of biosafety officers about what must be reported. And finally, there is a question of compliance, which is not captured in these questions. Laboratories with the most relaxed policies are presumably the least likely to have highly engaged biosafety officers who respond to a survey like this one.
Near-Miss Reporting
Reporting of nonaccident events is another practice that provides useful evidence about how reporting and risk management are done. Near-miss reporting is a critical part of hazard analysis and safety engineering.7 Such reports allow for better safety practices in the future and are certainly a public benefit even if they are not mandated. This benefit is greatly increased if near misses are reported and shared publicly so that laboratories and standards organizations can learn from them. Despite the benefits, 49 (82%) of the respondents noted that near-miss reporting was either an internal policy only (n = 33, 55%) or a best practice (n = 16, 27%), as shown in Figure 3. Only 2 respondents said it was a legal requirement, and 4 said they specifically did not keep such records.
Policies that encourage this type of reporting are a good start, but near misses still may not be reported. One respondent addressed this directly in a comment: “At our institution, we encourage reporting of all near misses and incidents, although anecdotally I might estimate that [much less than] 5% of near misses are reported.” However, they noted that the reporting rate was likely far higher, “perhaps above 95%” for more dangerous pathogens such as ones that are designated as select agents in the United States.
Reporting Motives
A key issue with understanding incident frequency based on reporting is that nonreported accidents are (by definition) not included. To better understand this potential issue, the survey included questions about why workers would or would not be motivated to report accidents.
Obviously, a key reason for reporting is the workers' own safety, and a great majority (n = 49, 82%) of biosafety officers agreed that workers understood the importance of reporting for their own safety. A majority of respondents who did not select workers' own safety as a reason for reporting selected only 1 checkbox in each question, perhaps misunderstanding the question format, which allowed multiple selections. This issue is discussed in more detail in the next section. One-quarter of respondents (n = 15, 25%) noted legal penalties for nonreporting as a reason workers would report accidents, and some (n = 9, 15%) noted that monitoring and other processes made it infeasible not to report accidents.
At the same time, the 49 respondents who agreed that there were significant disincentives for workers to report identified the key concerns as job loss (n = 25, 51%), loss of funding (n = 29, 59%), and other factors, including the complexity and difficulty of reporting (n = 25, 51%) and social pressure (n = 24, 49%).
It is worth considering how these motivations interact. For instance, 9 respondents said both that workers would report due to legal penalties for nonreporting, and that nonreporting might be caused by fear of job loss. The interaction is not contradictory but does point to some real tensions between the incentives. It is therefore unsurprising that a survey of laboratory workers found that workers are nervous about violating regulations.8
One key issue for reporting is the benefit of nonpunitive reporting. Such reporting is very helpful but can be difficult to implement. For example, 1 respondent noted that changing their “safety culture from punitive to understanding” encouraged reporting, but fear of job loss was still a disincentive. The concerns are reasonable on the part of both employers and workers, since employers do not want accident-prone or careless workers. Unfortunately, reluctance to remove punitive measures will certainly disincentivize reporting. On the other hand, penalties for nonreporting are certainly compatible with not punishing those who report, and a respondent noted that the reasons workers would report included legal penalties for nonreporting and that “there must be a nonpunitive system for it to work.”
Survey Sample and Limitations
Nonresponse bias is potentially a significant issue, given that only 62 responses were received and 2 of the respondents chose not to have their responses included. It is unclear how many people subscribe to the mailing list or how many of them read its high volume of posts, and not all mailing list members would be valid respondents. Another survey sent out the same week had only 32 responses,9 so the low response rate likely reflects this context rather than an issue specific to this survey. Email surveys often have low response rates generally,10 and the relatively high volume of mail on this list likely compounded the problem. Based on the 2016 survey,5 which included both mailing list participants and others and found 712 members responsible for laboratories, a conservative estimate of the effective response rate is 8.7%.
The sample was also unfortunately geographically concentrated, reflecting the distribution of the population found in the 2016 survey, of which 90% were located within the United States.5 In the current survey, the United States accounted for 91% of responses where a country was selected. However, a significant minority (n = 14, 23%) of respondents in the current survey did not select any location, an issue likely in part due to the long drop-down list of countries and alphabetical ordering placing the United States near the end. In addition, the survey was available only in English.
In the 2016 survey,5 334 of 712 (47%) respondents indicated that they “provide biosafety support to activities involving highly pathogenic agents.” It is unclear how well that category corresponds to the current survey; if it includes all BSL-3 laboratories, then such respondents were overrepresented here, but more likely only a portion of BSL-3 laboratories would claim to deal with “highly pathogenic agents,” making it difficult to determine whether they are over- or underrepresented. It is also unclear whether respondents tended to be those more willing to admit to issues in their laboratories or those who chose to respond because they thought their laboratory was better than most others.
In the responses to the current survey, several anomalous reports were likely due to survey design. Specifically, respondents who had no laboratories equipped to handle RG-3 or RG-4 pathogens often said they would not report such exposures, despite reporting exposures to RG-2 pathogens. For this reason, graphs of results and reported percentages assume that reporting requirements stated for lower-risk pathogens would also apply to higher-risk pathogens. The raw survey data have, of course, not been altered.
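As a minimal illustration of the adjustment described above, the following sketch (with hypothetical field names rather than the survey's actual column labels) treats a reporting requirement stated for a lower risk group as carrying over to higher risk groups for plotting purposes, while leaving the raw responses untouched:

```python
# Sketch of the adjustment described in the text: if a respondent requires reporting
# for a lower risk group but answered "no" for a higher risk group (presumably because
# no such pathogens are handled at the facility), treat the requirement as applying to
# the higher group when graphing. Field names are illustrative only.
RISK_GROUPS = ["RG-2", "RG-3", "RG-4"]  # ordered from lower to higher risk

def carry_forward_requirements(reports: dict[str, bool]) -> dict[str, bool]:
    """Return a copy where a True for any lower risk group implies True for groups above it."""
    adjusted = dict(reports)
    requires = False
    for rg in RISK_GROUPS:
        requires = requires or adjusted.get(rg, False)
        adjusted[rg] = requires
    return adjusted

# Example: a lab that reports RG-2 exposures but handles no RG-3 or RG-4 pathogens
print(carry_forward_requirements({"RG-2": True, "RG-3": False, "RG-4": False}))
# -> {'RG-2': True, 'RG-3': True, 'RG-4': True}
```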
Another possible issue is that for the 2 multiple-choice questions involving motivations for reporting, a surprising number of respondents (n = 16, 27%) checked only 1 checkbox. Although none of these respondents checked only 1 box on the earlier multiple-choice questions, it is possible that they misunderstood that the prompt allowed more than 1 answer. Alternatively, the limited selections could have been due to survey fatigue, as these questions were located toward the end of the survey.
Noninclusion of Biosafety Level 4 Laboratories
While the small sample size might account for the absence of any BSL-4 laboratories in the data, since there are very few such laboratories overall, this explanation seems unlikely because the biosafety officers of BSL-4 facilities are overrepresented in many discussions on the ABSA list. Two explanations seem plausible: such individuals chose not to respond to the survey at all, or they responded but deliberately left those questions blank. All but 7 respondents (n = 53, 88%) selected the option indicating they had no BSL-4 laboratories, while the remaining 7 left the answer blank.
Notably, the only 2 respondents reporting more than 100 people working in BSL-3 laboratories left the question about having a BSL-4 laboratory blank, and unlike some other respondents, they left no other questions blank. Deliberate nonresponse seems plausible: most sites with BSL-4 laboratories also conduct research at lower biosafety levels, there is increased sensitivity about reporting standards at these high-security laboratories, the small number of such sites provides far less anonymity, and public data showing that these laboratories have faulty or nonpublic records would carry greater consequences.
Discussion
This appears to be the first survey to examine how accidents at biological research laboratories are reported, rather than examining injuries, exposures, or infections themselves. This is important because the vast majority of laboratory accidents in biocontainment facilities do not involve any human injury but are still useful for informing our understanding of what goes wrong.
In previous literature, it is clear that serious cases are not uncommon, including accidental needlesticks, spills, or splashes; exposure without sufficient protective gear; or scratches or bite injuries from laboratory animals.11,12 None of these cases are likely to occur without being detected, but if they are unreported, they could lead to not only an infection of the exposed individual, but also a lack of treatment and monitoring, potentially resulting in further exposure to the wider community. Furthermore, previous work has shown that such underreporting of occupational accidents and injuries in general is common.13 In cases of exposure to a dangerous pathogen, the individuals exposed have good reason to report the incident, if only to receive treatment themselves. However, they may nonetheless fail to do so for a variety of reasons, some of which have been explored in this article.
Reporting Standards
The issue of different reporting standards has been highlighted previously by other researchers such as Gronvall and Rozo,14 but harmonizing standards is challenging in general,15 and attempted panaceas are likely to fail, as Munroe humorously notes in their webcomic.16 Although most respondents to our survey were located in the United States, the substantial variation in responses shows that consistent standards are elusive even within a single nation. The fact that biosafety is an international issue makes this even more challenging.
Because reporting is not standardized, any imposed categorization will be somewhat ambiguous. To ameliorate the issue, the survey design phase included soliciting and incorporating feedback from ABSA board members about how to improve the questions and wording. For the full text of the survey questions and the responses, see the supplemental materials at www.liebertpub.com/doi/suppl/10.1089/hs.2021.0083. Despite our attempt to use clear wording, perceived ambiguities in the questions seemed noteworthy to a few respondents, as noted in the next section.
Finally, some terms used in the survey could be ambiguous. For example, “incident,” “accident,” and “release” can all refer to the same event or to different event types. Clearly defining these may be useful, and while there was discussion of the problem, most of the ABSA board members agreed it would be better not to do so on a survey like this, instead relying on expert understanding of the terms. This lack of definition was not highlighted as a key issue even by those who noted other problems.
Survey Design Comments and Concerns
Despite working with ABSA to ensure that survey questions were phrased appropriately for the audience, 2 respondents (3%) commented that the questions did not reflect how things were managed at their institution. One noted that “the selection of potential answers did not accurately describe the situation at my institution,” and the other said “the choices didn't reflect how things are done at my institution and it was difficult to provide an answer with a fair representation.” In addition, some respondents noted issues with the questions about incident reports because their requirements and processes did not align with the answer choices. Finally, 1 respondent thought that asking whether those completing the survey were “aware of nonreporting of exposures involving high-risk pathogens” was not a good question and noted that the “incentive questions don't make sense as incentives.”
While the survey seems to have captured the situation at the majority of respondents' institutions, other research methods, including qualitative interviews and more individualized elicitations, would be useful for better addressing this heterogeneity. Such methods could also greatly enhance the understanding of incentives.
Considerations for Future Risk Analysis and Mitigation
The survey results inform our understanding of current laboratory practice and provide useful insight for future risk analyses. For instance, it is not unreasonable that most reports do not leave the institutions where they occur, but the resulting lack of understanding about the frequency of accidents potentially contributes to the concerns of biosafety officers and workers about accident reporting. Counteracting many of the arguments about the public relations risks of reporting is the fact that some jurisdictions, such as Canada, already require disclosure.17
When incidents that may have led to exposure are properly reported, standard precautions including testing, monitoring, and treatment are usually pursued. However, these are local and one-time responses. More systematic responses require understanding the broader regulatory, reporting, and oversight ecosystem. A greater degree of transparency about accidents across laboratories would allow more confidence in safety and better oversight of risks. For that reason, several types of data seem useful.
First, more available data on the types and causes of accidents would help identify unsafe practices and the areas of most pressing concern. Second, because many types of accidents are rare, as mentioned earlier,18 data on near misses are potentially critical. In fact, safety engineers note that “near misses, or almost-accidents, can be even more revealing than an actual accident.”7 Lastly, data on the relative frequency of accidents at different laboratories would help identify whether a given laboratory's safety practices and culture are performing reasonably, are exemplary and should be a source of best practices, or whether the laboratory should seek ways to improve them.
On the other hand, not publicly discussing the reports is also strongly incentivized, weighing heavily against the previous arguments for collecting and publishing data to enable broader transparency. As a respondent noted, “oftentimes [...] those not familiar with the field like to judge the incident, but never get the rest of the story.[...] Incident reporting is not one step, it's a process.” There are also important issues with legal liability and public relations that can result from disclosure. Laboratories are concerned with the nonmedical risks to their operations and research that might emerge from disclosure.
The concern about liability and public relations illustrates a more general tension between transparency and self-interest. Accidents are obviously bad, but no system will ever be foolproof, and there will always be room for improvement. The question of how to weigh transparency and public benefit against regulatory burden, cost, and misinformed public speculation is critical, and finding a balance requires careful investigation. Still, our preliminary review indicates that in following Canada's model,17 more transparency would offer significant benefits to the community as a whole, and over the longer term it would help institutions build trust that is based on facts instead of secrecy. However, even putting that aside, the costs to individual institutions of admitting imperfection in their safety record are far smaller than the benefits to the public.
Burdensome Requirements
As noted earlier, several respondents said they were required to report everything (all incidents or similarly broad categories). This is potentially concerning: respondents also agreed that burdensome reporting might be a disincentive to reporting at all, and extensive requirements are likely to be seen by scientists and workers as a waste of time, add administrative overhead costs, and may in fact be unnecessary. Having multiple reporting systems, as many respondents reported, could impose similar additional burdens.
The desire for minimal burden certainly contrasts with the desire to have data, and this tension is likely not fully resolvable. At the same time, if reporting is made easier by using computers to automate parts of the forms or eliminate duplicates, data collection could plausibly be less burdensome and more complete than it is at present. To understand whether this is possible, or advisable, a better understanding of specific computer system requirements is needed.
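As one illustration of how computerized systems might reduce duplicate effort, the sketch below uses purely hypothetical record fields and a hypothetical matching key (not drawn from the survey) to flag likely duplicates when paper forms are transcribed into the same system as direct online submissions:

```python
# Illustrative sketch only: one way a computerized system might merge transcribed
# paper forms with online submissions while removing likely duplicates. The record
# fields and the (date, laboratory, incident_type) matching key are hypothetical.
from dataclasses import dataclass

@dataclass(frozen=True)
class IncidentReport:
    date: str           # e.g., "2020-01-15"
    laboratory: str     # laboratory or room identifier
    incident_type: str  # e.g., "needlestick", "spill", "near miss"
    source: str         # "paper" or "online"

def deduplicate(reports: list[IncidentReport]) -> list[IncidentReport]:
    """Keep one report per (date, laboratory, incident_type), preferring online entries."""
    merged: dict[tuple, IncidentReport] = {}
    for r in sorted(reports, key=lambda r: r.source != "online"):  # online entries first
        merged.setdefault((r.date, r.laboratory, r.incident_type), r)
    return list(merged.values())
```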
Next Steps
Many issues highlighted by the results of our survey have yet to be resolved, as noted in the discussion. Given the relatively low response rate to the survey, and the relatively limited understanding as a result of response types and the small sample, follow-up work should include qualitative interviews. Such work is currently being planned.
Conclusion
This survey is a step toward better understanding the current state of reporting and disclosure in biosafety laboratories, which is a critical concern for understanding laboratory safety more generally. There are 2 key conclusions: (1) reporting standards and methods vary greatly between labs, as do both requirements and dissemination standards, and (2) there are clear areas for improvement.
The survey results show that many important questions remain unanswered, including how and why laboratories report differently, how best practices and internal policy standards are developed or applied, and how some of the identified causes of potential underreporting can be addressed, and hopefully minimized. The results also make it clear that intrinsic ambiguity in definitions and variation between laboratories make some of the quantifiable results less than ideal. Qualitative interviews would clearly be useful for further understanding of the complex issues involved. Results from the survey represent only a snapshot in time, and they mostly represent the situation in the United States. At the same time, it seems clear that reporting, and the lack thereof, are critical in understanding laboratory accidents. Further discussion and understanding of what is and is not publicly reported and why, and how the deficiencies can be remedied, are critical to the ongoing process of enhancing the safety of biological research laboratories.
Acknowledgments
Thank you to David Gillum and Barbara Johnson for early suggestions about how to write and field the survey, and to the ABSA council for approving the survey and allowing it to be sent to the mailing list. Thanks are also due to David Gillum, Ed Stygar, Karen Byers, Domenica Zimmerman, and Julie Savage, all at ABSA, who were very helpful in providing iterative feedback on the survey design. Thank you to all of the respondents. Thank you also to Gregory Lewis, Daniel Greene, and Robert Dean Smith, all of whom graciously provided feedback on a draft of the paper, and to Angela Baggio, for assistance with proofreading. Finally, tremendous thanks are due to the journal editors for assisting with the process of turning a survey into a paper, and to Nancy Connell, who provided invaluable feedback on the discussion and the writing.
References
- 1. National Research Council. Biosafety in the Laboratory: Prudent Practices for the Handling and Disposal of Infectious Materials. Washington, DC: The National Academies Press; 1989. Accessed November 10, 2021. doi:10.17226/1197
- 2. US Department of Health and Human Services (HHS), Centers for Disease Control and Prevention, National Institutes of Health. Biosafety in Microbiological and Biomedical Laboratories. 5th ed. Washington, DC: HHS; 2009. Accessed September 8, 2019. https://www.cdc.gov/labs/pdf/CDC-BiosafetyMicrobiologicalBiomedicalLaboratories-2009-P.PDF
- 3. Klotz L. Human error in high-biocontainment labs: a likely pandemic threat. Bull Atomic Sci. February 25, 2019. Accessed September 23, 2021. https://thebulletin.org/2019/02/human-error-in-high-biocontainment-labs-a-likely-pandemic-threat/
- 4. Gillum D, Krishnan P, Byers K. A searchable laboratory-acquired infection database. Appl Biosaf. 2016;21(4):203-207.
- 5. Gillum D, Mendoza IA, Mancini C, Fletcher J. Are biosafety credentials beneficial? Appl Biosaf. 2016;21(4):193-197.
- 6. Smith S. BU delayed reporting possibly lethal exposure. Boston Globe. January 20, 2005. Accessed September 23, 2021. http://archive.boston.com/news/education/higher/articles/2005/01/20/bu_delayed_reporting_possibly_lethal_exposure/
- 7. Bahr NJ. System Safety Engineering and Risk Assessment: A Practical Approach. Boca Raton, FL: CRC Press; 2015.
- 8. Sutton V. Survey finds biodefense researcher anxiety—over inadvertently violating regulations. Biosecur Bioterror. 2009;7(2):225-226.
- 9. Gardiner L, Altmann S. Biosecurity survey. Published January 16, 2020. http://archive.mail-list.com/absabiosafety/message/20200116.171530.0fb32107.en.html
- 10. Shih TH, Fan X. Comparing response rates in e-mail and paper surveys: a meta-analysis. Educ Res Rev. 2009;4(1):26-40.
- 11. Wurtz N, Papa A, Hukic M, et al. Survey of laboratory-acquired infections around the world in biosafety level 3 and 4 laboratories. Eur J Clin Microbiol Infect Dis. 2016;35(8):1247-1258.
- 12. Trim JC, Elliott TS. A review of sharps injuries and preventative strategies. J Hosp Infect. 2003;53(4):237-242.
- 13. Leigh JP, Marcin J, Miller TR. An estimate of the U.S. government's undercount of nonfatal occupational injuries and illnesses. J Occup Environ Med. 2004;46(1):10-18.
- 14. Gronvall GK, Rozo M. Addressing the gap in international norms for biosafety. Trends Microbiol. 2015;23(12):743-744.
- 15. Pelkmans J. The new approach to technical harmonization and standardization. J Common Mark Stud. 1987;25(3):249-269.
- 16. Munroe R. Standards. xkcd.com. Accessed September 23, 2021. https://xkcd.com/927/
- 17. Lien A, Abalos C, Atchessi N, Edjoc R, Heisz M. Surveillance of laboratory exposures to human pathogens and toxins, 2019. Can Commun Dis Rep. 2020;46(9):292-298.
- 18. Barach P, Small SD. Reporting and preventing medical mishaps: lessons from non-medical near miss reporting systems. BMJ. 2000;320(7237):759-763.