Since the events of September 11, 2001 (9/11), health-care institutions have been encouraged to enhance their readiness for disasters. The Joint Commission (previously the Joint Commission on Accreditation of Healthcare Organizations) has, since 2001, required member hospitals to complete an annual hazard vulnerability analysis (HVA), which is expected to provide a foundation for emergency planning efforts. A literature search revealed that little has been published on HVA since that requirement came into effect, and no known investigation of current HVA procedures has been completed.
To begin to address this gap, researchers from the Harvard School of Public Health and the Southern Maine Regional Resource Center for Public Health Emergency Preparedness (SMRRC) interviewed staff members at eight hospitals in Maine to document current HVA processes and develop recommendations for improvement. SMRRC is one of three regional nonprofit hospital-based centers in Maine guiding health systems and public health preparedness activities.
BACKGROUND AND OBJECTIVES
Hospitals and other health-care organizations have always had to prepare for and respond to a wide array of routine emergency and catastrophic disaster events. Since the terrorist attacks of 9/11 and subsequent attention and funding from the U.S. Department of Health and Human Services and Department of Homeland Security, hospitals have been urged to substantially expand their response plans and overall readiness for disasters. Hospitals are now expected to develop, implement, train, and exercise comprehensive all-hazards emergency management and operations plans. These planning efforts need to be inclusive of all four phases of emergency management: mitigation, preparedness, response, and recovery.
Emergency management programs and their associated emergency operations plans are only as good as the assumptions upon which they are based. This is especially true at the local level, where planning must take into account specific risks unique to the immediate environment. Local priorities need to be considered in addition to those required by federal and state authorities and detailed in the goals, objectives, and deliverables tied to all funding streams. However, local priorities based on opinion alone, rather than on objective data, provide a weak foundation for planning. Reliance on expert clinical or administrative opinion alone can result in waste, duplication, missed opportunities, siloing, and confusion over the true priorities in terms of threat, vulnerability, and risk.
In the 2001 edition of its Comprehensive Accreditation Manual for Hospitals, the Joint Commission significantly revised the existing standard for emergency management.1 For the first time, the Joint Commission was guiding hospital emergency preparedness efforts “into the same arena as emergency management in the community as a whole.”2 Hospitals were now expected to function as an “integrated entity within the scope of the broader community.”
The 2001 standard specified that hospital response plans must be “based on a hazard vulnerability analysis (HVA) performed by the hospital.” Although HVA was a relatively new term for hospital staff, the concept itself was not.2 The Joint Commission defined HVA as “the identification of hazards and the direct and indirect effects these hazards may have on the hospital.” The actual or anticipated hazards are analyzed in the context of the population at risk to determine the vulnerability to each specific hazard.
Hospital emergency managers have long performed HVAs in their heads, as “much of the process is highly intuitive.” For example, hospitals in the Midwest do not need to plan for hurricanes, while those along the Atlantic Coast must. Even the way risk is defined for hospitals, both qualitatively and quantitatively, varies widely in scope and use. As a result, “risk may be one of the most elusive concepts in health emergency management.”3
While mandating that hospitals perform an HVA, the 2001 Joint Commission standard neither formalized the process nor offered a specific tool to standardize it across hospitals. The American Society for Healthcare Engineering (ASHE) of the American Hospital Association offered the first standard methodology for performing a hospital HVA in 2001,2 but a wide array of other tools and methods also became available for hospitals to use for risk and vulnerability assessment.3
Later in 2001, Kaiser Permanente developed a modified tool, the Medical Center Hazard and Vulnerability Analysis.4 This tool expanded both the guidance and the scope of hazard “events” that hospitals should consider. Specifically, it expanded the risk measures to include human impact, property impact, and business impact; each measure was rated separately for each event and weighted in the final vulnerability score. Likewise, the mitigation measure was expanded from the ASHE tool, which simply rated preparedness as “poor,” “fair,” or “good.” The new tool broke mitigation down into preparedness (preplanning), internal response (time, effectiveness, and resources), and external response (community/mutual aid staff and supplies). This final measure reflected the intended outcome of the new Joint Commission standard by assessing hospitals as community organizations rather than stand-alone institutions.
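To make this kind of scoring concrete, the following minimal Python sketch shows how separately rated measures might be combined into a single vulnerability score. The 0–3 scales, field names, and simple probability-times-severity weighting are illustrative assumptions for this example only, not the actual formulas embedded in the Kaiser spreadsheet.

```python
# Illustrative Kaiser-style hazard scoring (hypothetical scales and weights).
from dataclasses import dataclass

@dataclass
class HazardRating:
    """Ratings for one hazard event, each on a 0 (none/low) to 3 (high) scale.

    For the three mitigation measures, a higher rating means a weaker
    posture (e.g., poor preplanning), so it raises the final score.
    """
    probability: int        # likelihood the event occurs
    human_impact: int       # potential for death or injury
    property_impact: int    # physical losses and damages
    business_impact: int    # interruption of services
    preparedness: int       # gaps in preplanning
    internal_response: int  # gaps in time, effectiveness, resources
    external_response: int  # gaps in community/mutual aid staff and supplies

def relative_risk(rating: HazardRating) -> float:
    """Return probability x severity, normalized to a 0-100% scale.

    Severity averages the six impact/mitigation ratings; multiplying by
    probability means an impossible event (probability 0) scores zero
    no matter how severe its potential impact.
    """
    severity = (rating.human_impact + rating.property_impact
                + rating.business_impact + rating.preparedness
                + rating.internal_response + rating.external_response) / 18.0
    return (rating.probability / 3.0) * severity * 100.0

# Example: a hurricane as an Atlantic Coast hospital might rate it.
hurricane = HazardRating(probability=2, human_impact=2, property_impact=3,
                         business_impact=3, preparedness=1,
                         internal_response=1, external_response=2)
print(f"Relative risk: {relative_risk(hurricane):.0f}%")  # prints 44%
```

A multiplicative form of this kind lets a single event's score be compared across hazards, which is what allows the tool to rank, say, a hurricane against a utility failure on one list.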
The following year, HCPro, Inc., a private health-care regulation and compliance product and service provider, published its own HVA Toolkit for hospitals.5 Similar to the Kaiser tool, this toolkit facilitates the evaluation of each potential event in three categories: probability, risk, and preparedness. Like the others, the kit allows the user to add events as necessary. To determine probability, users are encouraged to consider known risk, historical data, and manufacturer/vendor statistics. The Joint Commission does not provide this level of detail or guidance; rather, individual private publishers offer HVA tools with this level of specificity. While helpful, these modifications make it difficult to draw comparisons among hospitals, or across jurisdictions or states.
While the Joint Commission continues to refine and expand its emergency management standards, it has yet to provide a standardized method or tool for conducting HVAs. Nor do any of the available tools offer a standardized method for collecting or using HVA data at the hospital or community level. Hospitals are left on their own to determine how they will collect information on probability and severity, how they will process that information within the institution, and what to do with the results.
The primary objective of this study was to investigate how institutions at the local level, in particular hospitals in Maine, currently implement HVA, in an effort to encourage future research on this topic to ultimately improve HVA efficacy.
METHODS
During 2005 and 2007, the SMRRC invited eight hospitals in the Southern Maine region to participate in a regional HVA process. The Southern Maine region includes acute care and mental health hospitals within York, Cumberland, Sagadahoc, and Lincoln counties, most of which are Joint Commission accredited. An electronic copy of the Medical Center HVA template and instructions was provided to each hospital's emergency preparedness contact. These individuals participate regularly in SMRRC activities and preparedness efforts. They represent a variety of departments from their institutions, including hospital administration, planning, safety, infection control, and facilities management.
Administration of the HVA tool was customized to best meet the needs and available resources of each facility. If a facility had recently completed an HVA, its staff members were encouraged to use those data to aid in the completion of the SMRRC version. Other facilities distributed the HVA forms to individual members of their internal Environment of Care or Emergency Preparedness Committees and then convened as a group to reach consensus for the organization. The HVA tool used in this study was based on the model developed by Kaiser Permanente and modified for use by the SMRRC.
During April 2008, we conducted a series of face-to-face, semi-structured, in-depth interviews with staff from each of the participating hospitals who were identified as having a key role in the HVA process at their facility. Two interviewers attended each discussion and subsequently compared notes to ensure objectivity. The questions were largely drawn from a paper entitled “Risk and Risk Assessment in Health Emergency Management.”3 Beyond the issues suggested by this paper, the interviewers discussed the HVA results produced in each hospital and changes in results from year to year.
RESULTS
The lack of standardization in the HVA process from hospital to hospital became apparent as the interviews progressed. Specifically, we found the following:
The scope of risk varied a great deal across the institutions. Some hospital staff considered the scope to be limited to the institution's campus, while others had an expanded view and considered risks to the hospital's entire service area.
The planning time frame was rarely clarified and often varied from institution to institution. In some hospitals, staff believed that they were planning for one year, while in other hospitals they believed that they were planning for a longer time frame (e.g., three to five years).
The individuals facilitating the process had a large impact on the results. For example, regarding scope of risk, staff members with hospital engineering backgrounds focused on the institution, while others with public health exposure and training tended to focus on the community. An individual's personal experience with disasters also had a substantial impact on the results. Changes in HVA results from period to period tended to occur at hospitals with substantial changes in the staff responsible for the HVA.
The level of resources committed to HVA differed greatly. None of the institutions prepared a budget specifically targeting this activity. The number of hospital staff substantially involved in the deliberations varied from one person to 20 people, and the difference was not consistently related to the size of the institution. In addition, while some hospitals invited community experts (e.g., fire, emergency medical services, police, and emergency management personnel) into the process, most limited participation to their employees. Only one hospital staff member used information available at the county emergency management agency office, despite that office's staff and knowledge base being available to all participants.
The decision-making process was usually informal. The process of arriving at decisions was rarely made explicit. No minutes were kept in any of the institutions to record, for example, differences of opinion regarding risk, although many of the individuals interviewed could recall differences, including animated debates.
Changes in results appeared to be highly associated with whether the process was framed and managed as incremental. In some hospitals, the results from prior years were available during the discussion of the current year's risks; in others, the issue was considered without reference to previous results.
The results of the HVA process were not widely shared. Hospital staff rarely communicated results outside the institution beyond the Regional Resource Center that requested them. Within the institution, the results were nearly always communicated to established committees (e.g., safety committees), but only a few hospitals channeled results to the Chief Executive Officer (CEO) and Board of Trustees for discussion.
HVA results affected preparedness activities very differently from institution to institution. In one hospital, the results were only communicated to the external Regional Resource Center, and never passed on internally. That hospital's staff members believed that the Regional Resource Center needed the information for regional planning purposes and did not understand that the HVA was completed primarily for internal planning and accreditation purposes. In contrast, at another hospital, staff members completed an annual action plan detailing how they were going to respond to each of the risks identified.
The commitment of individual hospital senior leaders, including the CEO, had a substantial impact on the HVA process, influencing both the level of resources committed and the management of results.
CONCLUSIONS AND RECOMMENDATIONS
We believe the efforts presented in this article are among the first exploratory investigations into this important issue. We encourage other public health professionals to pursue investigations covering more health-care institutions and employing more rigorous research methods. In addition, we offer the following recommendations:
The HVA process should be developed to achieve a greater degree of standardization. For example, the scope of risk and planning time frames should be clarified and applied consistently across hospitals. Guidelines should also encourage greater use of other community experts and available information.
The level and types of expertise required should be addressed. The HVA was added to the Joint Commission requirements because emergency planning has taken on greater importance. Improving the quality of planning also requires input from diverse areas, including facility management, public health, emergency management, administration, nursing, and medical care.
The Joint Commission should address the issue of periodicity. Currently, hospitals are expected to complete an HVA on an annual basis. We believe that the process should be changed from annual to every second or third year unless a serious alteration in conditions occurs (e.g., construction of a nuclear power plant nearby). Too-frequent assessments tend to dull the process and reduce it to an insubstantial incremental procedure with little impact.
Each hospital should be encouraged to pursue the following steps when completing the HVA:
- Research into vulnerability through public safety, emergency management agencies, and other sources of information;
- Organizational meeting of individuals to be involved in the deliberative process that would clarify the decision-making process as well as its importance within and outside the institution;
- Individual completion of the assessment instrument in private to encourage differing opinions;
- Group discussion and consensus;
- Documentation of discussion, including minority opinions and overall results;
- Documentation of action planning to address identified gaps; and
- Wide distribution of the results both outside and within the institution, including to the most senior decision makers.
ACKNOWLEDGMENTS
This article was supported by funding awarded to the Harvard School of Public Health (HSPH) Center for Public Health Preparedness under Grant/Cooperative Agreement #3U90TP124242-05 from the Centers for Disease Control and Prevention (CDC).
The contents of this article are solely those of the authors and do not necessarily represent the views of CDC, the U.S. Department of Health and Human Services, or any partner organizations, nor does mention of trade names, commercial practices, or organizations imply endorsement by the U.S. government.
REFERENCES
1. Joint Commission on Accreditation of Healthcare Organizations. Comprehensive accreditation manual for hospitals: the official handbook. Oakbrook Terrace (IL): Joint Commission Resources, Inc.; 2008.
2. American Society for Healthcare Engineering of the American Hospital Association. Hazard vulnerability analysis [Healthcare Facilities Management Number: 055920]. Chicago: ASHE; 2001.
3. Arnold JL. Risk and risk assessment in health emergency management. Prehosp Disaster Med 2005;20:143-54. doi: 10.1017/s1049023x00002363.
4. Kaiser Permanente. Medical center hazard and vulnerability analysis. Kaiser Foundation Health Plan, Inc. [cited 2010 Jun 16]. Available from: URL: http://www.calhospitalprepare.org/sites/epbackup.org/files/resources/Hazard%20&%20Vulnerability%20Analysis_kaiser_model.xls
5. HCPro, Inc. Hazard vulnerability analysis toolkit: assessing risk to patients and preparing for all disasters. Marblehead (MA): Opus Communications, Inc.; 2002.