Abstract
Hospital disaster resilience can be defined as “the ability of hospitals to resist, absorb, and respond to the shock of disasters while maintaining and surging essential health services, and then to recover to its original state or adapt to a new one.” This article aims to provide a framework which can be used to comprehensively measure hospital disaster resilience. An evaluation framework for assessing hospital resilience was initially proposed through a systematic literature review and Modified-Delphi consultation. Eight key domains were identified: hospital safety, command, communication and cooperation system, disaster plan, resource stockpile, staff capability, disaster training and drills, emergency services and surge capability, and recovery and adaptation. The data for this study were collected from 41 tertiary hospitals in Shandong Province in China, using a specially designed questionnaire. Factor analysis was conducted to determine the underpinning structure of the framework. It identified a four-factor structure of hospital resilience, namely, emergency medical response capability (F1), disaster management mechanisms (F2), hospital infrastructural safety (F3), and disaster resources (F4). These factors displayed good internal consistency. The overall level of hospital disaster resilience (F) was calculated using the scoring model: F = 0.615F1 + 0.202F2 + 0.103F3 + 0.080F4. This validated framework provides a new way to operationalise the concept of hospital resilience, and it is also a foundation for the further development of the measurement instrument in future studies.
Keywords: disaster, evaluation framework, factor analysis, hospital resilience, preparedness
1. Introduction
Disasters, including extreme weather events, natural disasters, bioterrorism, and pandemics, are having an increased global impact [1]. Hospitals play an important role during disasters, as they provide “lifeline” services that reduce disaster-associated mortality and morbidity, and thus minimize the impact of disasters on the community [2,3,4,5]. Efficient hospital disaster management is considered essential for hospitals to supply continuous health services during disasters, even if the hospitals are directly affected by the disaster [6,7]. Hospital resilience is an emerging concept in the hospital disaster management context [2,8,9,10,11,12]. It can be defined as “the ability of hospitals to resist, absorb, and respond to the shock of disasters while maintaining and surging essential health services (e.g., on-site rescue, pre-hospital care, emergency health treatment, critical care, decontamination and isolation), and then to recover to its original state or adapt to a new one” [10]. It implies a comprehensive perspective of a hospital’s ability to cope with disasters, encompassing inherent strength (the ability to resist and absorb disasters) and adaptive flexibility (strategies for maintaining and surging essential health services, and for adaptation to future disasters) [13,14,15]. Four criteria of disaster resilience can be adapted to hospitals, namely, robustness, redundancy, resourcefulness, and rapidity [5,8,9,12]. More specifically, hospital resilience aims to improve a hospital’s pre-event strength, thus promoting the rapidity of response and recovery through redundant resources and resourceful strategies [10,16,17,18,19,20].
Defining and measuring hospital resilience has been the subject of recent research [2,5,8,9,10,12,21]. Most of these measures were derived from an engineering perspective and rely on mathematical calculations; there is little empirical evidence of a hospital resilience evaluation instrument that can be used easily by hospital managers and health administrators. Recently, the primary focus of studies in the area of health disaster management has been the evaluation of disaster preparedness [22,23,24,25,26,27]. However, such disaster preparedness studies tend to adopt the particular perspective of the pre-disaster stage, instead of examining the institution’s overall ability to confront disasters. For example, previous research emphasizes structural components (e.g., personnel, equipment, resources) but undervalues functional components, such as safety infrastructure and medical response and surge capacity [10,28,29,30]. Another weakness of most existing frameworks is that they have not been validated with empirical data [10].
Most studies view hospital disaster ability piecemeal, using concepts such as hospital safety, preparedness, response, recovery, business continuity, surge capacity, and other specific abilities [26,27,28,31,32]. These concepts have not achieved consensus around a comprehensive framework for evaluating a hospital’s ability through all phases of disaster management [28,29]. Thus, there is value in developing a comprehensive instrument that can be used as an evaluation tool by hospital practitioners, managers and government administrators to understand and assess the overall ability of hospitals to cope with disasters. The concept of disaster resilience (a measure of the return of functions) is new but important, and useful for devising this instrument [12].
This article aims to develop and validate a framework for hospital resilience, and assess its validity and reliability using empirical data from tertiary hospitals. It has four objectives: (1) to develop a preliminary evaluation framework of hospital resilience; (2) to extract key component factors from the primary domains of the framework for further conceptualization and measurement of hospital resilience; (3) to examine the construct validity and internal consistency of this framework; and (4) to establish scoring models to measure the level of hospital disaster resilience.
2. Methods
2.1. Study Design
This study was a cross-sectional study of tertiary hospitals in Shandong Province in China. The survey was conducted between January 2013 and June 2013. Ethical approval to conduct the research was obtained from the Queensland University of Technology (approval No. 1200000170) and written informed consent was obtained from each participating hospital.
2.2. Study Setting and Sample
In China, a tertiary hospital is defined as a health facility covering a large administrative area and capable of providing comprehensive and specialized medical care. A tertiary hospital should have more than 500 beds, with more than 1.03 physicians and 0.4 nurses per bed, respectively. Tertiary hospitals can be further classified into three subgroups: Grade A, Grade B, and Grade C. This classification was derived from the national hospital evaluation according to each hospital’s comprehensive score (out of a total of 1000), aggregated across perspectives such as service levels, size, medical missions, medical technology, medical equipment, management, and medical quality [33,34]. Tertiary A hospitals achieved higher comprehensive scores than tertiary B and tertiary C hospitals; thus, tertiary A hospitals are considered the most advanced hospitals in an area [34]. A total of 50 tertiary hospitals in Shandong Province were selected using stratified random sampling according to their subgroups (i.e., Grade A, B, or C). The hospital selection used a contact list obtained from the provincial health department. A total of 41 fully completed questionnaires were received, representing an 82.0% response rate. The respondents accounted for approximately half (53.9%) of all tertiary hospitals in this province.
2.3. Study Protocol
A preliminary framework for assessing hospital resilience was proposed through a systematic literature review [10]. This review used the four criteria of disaster resilience (i.e., robustness, resourcefulness, redundancy and rapidity) as inclusion criteria for selecting key domains and indicators from the included studies [10]; more details regarding this selection can be found in the review [10]. The potential indicators in this framework were then developed and validated by a Modified-Delphi study undertaken in China, which collected experts’ opinions regarding key indicators of hospital resilience [35]. Through this systematic approach, a preliminary evaluation framework was established, within which eight key domains were identified that can be used to reflect the level of hospital resilience. Then, based on the preliminary framework, a hospital survey questionnaire was designed to collect the data. The resultant questionnaire consisted of nine sections, with more than 100 items in total.
As the study was conducted in China, a Chinese language questionnaire was used to capture the responses; the instrument was subsequently translated back into English for the final reporting. Brislin’s model of translation was used to ensure the accuracy and appropriateness of the translation of the questionnaires [36]. The survey questionnaire was forward-translated and then back-translated between the source language (English) and the target language (Chinese), by two independent translators (Shuang Zhong, Yuli Zang). The English version of the questionnaire can be found in the Supplementary Information.
The feasibility and suitability of the questionnaire were evaluated, and the instrument refined, through a pilot study at three hospitals. The questionnaire, accompanied by an official letter from the Provincial Health Department outlining the importance of the survey, was sent to each of the sample hospitals by both email and post. Each hospital was asked to designate a director in the management department to be responsible for distributing and coordinating the completion of the questionnaire. The questionnaires were completed and signed off by suitably knowledgeable staff from relevant departments within the hospital, including the emergency department, human resource management, pharmacy, resource logistics, and facility maintenance. Each returned questionnaire was reviewed for completeness and consistency; for those that were incomplete and/or contained inconsistent responses, follow-up telephone calls were made.
2.4. Measurements
The first section of the questionnaire covers demographic information, while sections 2–9 represent the eight key domains identified in the preliminary evaluation framework for hospital disaster resilience, namely: hospital safety standard and procedures; emergency command, communication and cooperation system; disaster plan; disaster resource stockpile and logistics management; emergency staff capability; emergency services and surge capability; training and drills; and recovery and adaptation strategies. Most questions are binary variables that can be answered “yes” or “no” (e.g., is there a disaster plan?). Some questions are numeric variables that are answered with quantities (e.g., how many staff are there in the rescue teams?).
The numeric indicators were mainly used for a qualitative description of hospital status [37], while the primary binary variables were included in the calculation of the resilience index. For the binary variables, the options “yes” and “no” were assigned scores of “1” and “0”, respectively. The score for each domain was then calculated by summing the scores of all relevant questions, and a total score was calculated by summing the scores across all eight domains, which serves as a proxy for an institution’s overall disaster resilience. The higher the total score, the better the hospital’s disaster resilience.
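As a minimal sketch of this scoring scheme (the domain and item names below are hypothetical illustrations, not items from the actual questionnaire), the binary answers can be summed per domain and then across domains:

```python
# Illustrative scoring of binary questionnaire items: each "yes" (True)
# scores 1 and each "no" (False) scores 0; domain scores are sums of item
# scores, and the total score is the sum across all domains.

def score_domains(responses: dict[str, dict[str, bool]]) -> dict[str, int]:
    """Sum 'yes' (True) answers within each domain."""
    return {domain: sum(items.values()) for domain, items in responses.items()}

def total_score(domain_scores: dict[str, int]) -> int:
    """Overall proxy score: the sum across all domains."""
    return sum(domain_scores.values())

# Hypothetical responses from one hospital.
responses = {
    "disaster_plan": {"has_plan": True, "plan_reviewed": False},
    "stockpile": {"has_ppe_reserve": True, "has_drug_reserve": True},
}
scores = score_domains(responses)
print(scores)               # {'disaster_plan': 1, 'stockpile': 2}
print(total_score(scores))  # 3
```

A higher total simply reflects more "yes" answers across the framework's indicators, matching the proxy interpretation described above.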
2.5. Data Analysis
The data from the returned questionnaires were transferred into a Microsoft Office Access 2007 database. The data were checked, cleaned, and analyzed using SPSS Statistics [38]. The study used several steps and methodologies during the data analysis. Firstly, to test construct validity, factor analysis was chosen to analyze the empirical data. Factor analysis can be used to identify the underlying component factors linking the measured variables and the latent constructs, thereby allowing the formation and refinement of theory. It can also provide construct validity evidence for self-report questionnaires [39]. The steps of the factor analytical process included assessing the suitability of factor analysis, computing correlation matrices, factor extraction, choosing the number of factors to retain, factor rotation, computing the component score coefficient matrix, and factor interpretation [40]. Secondly, the discriminant validity of the construct was tested. It was assumed that the level of a hospital’s ability to cope with a disaster is associated with its categorization [25,41]. For instance, it was expected that tertiary A hospitals would report a generally higher level of resilience than tertiary B hospitals, as the missions of regional disaster centers are primarily assigned to tertiary A hospitals. The scores of each factor, according to hospital categorization, were compared using independent-sample t-tests. Thirdly, a self-assessment scale was developed, based on the items in the questionnaire, to categorize hospitals into different resilience levels. A model for calculating the overall level of disaster resilience among hospitals was also established by weighting each component factor. Finally, to test the internal consistency of key domains, Cronbach’s coefficient alpha analysis was performed [42].
In addition, the current status of the sample was investigated using descriptive analysis, and was reported in a published article [37].
3. Results
3.1. Inspection
The value of the Kaiser-Meyer-Olkin (KMO) measure of sampling adequacy in the current study is 0.792, which is over the threshold of 0.7. The result of Bartlett’s test (p < 0.01) revealed that the eight primary domains are not independent and have a high level of correlation. Thus, factor analysis is suitable for extracting component factors in this study [42].
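Both checks can be computed directly from a data matrix. The sketch below implements the standard formulas for Bartlett's test of sphericity and the KMO measure with NumPy/SciPy; it is an illustrative reimplementation, not the SPSS routine used in the study:

```python
import numpy as np
from scipy.stats import chi2

def bartlett_sphericity(data: np.ndarray) -> tuple[float, float]:
    """Bartlett's test that the correlation matrix is an identity matrix."""
    n, p = data.shape
    corr = np.corrcoef(data, rowvar=False)
    statistic = -(n - 1 - (2 * p + 5) / 6) * np.log(np.linalg.det(corr))
    df = p * (p - 1) / 2
    return statistic, chi2.sf(statistic, df)

def kmo(data: np.ndarray) -> float:
    """Kaiser-Meyer-Olkin measure of sampling adequacy."""
    corr = np.corrcoef(data, rowvar=False)
    inv = np.linalg.inv(corr)
    # Partial correlations are derived from the inverse correlation matrix.
    d = np.sqrt(np.outer(np.diag(inv), np.diag(inv)))
    partial = -inv / d
    np.fill_diagonal(corr, 0)
    np.fill_diagonal(partial, 0)
    return (corr ** 2).sum() / ((corr ** 2).sum() + (partial ** 2).sum())
```

With the eight domain scores per hospital as input, a KMO above 0.7 and a significant Bartlett statistic would support the use of factor analysis, as reported here.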
3.2. Extraction of Component Factors
The factor solution was determined by the number of factors generated with eigenvalues greater than 1, as well as by theoretical considerations. The domain scores were used as the input variables for component extraction. Four factors with eigenvalues greater than one were extracted to represent all the domains. The 4-factor solution was compared to solutions with larger and smaller numbers of factors, and appeared more meaningful theoretically than those alternatives, which were difficult to interpret. As illustrated in Table 1, the 4-factor solution accounts for about 86.9% of the cumulative variance, and all the domains loaded highly (at least 77.0%).
Table 1. Extraction values and total variance explained for the eight domains.
Domains | Extraction | Eigenvalues | % of Variance | Cumulative % |
---|---|---|---|---|
1. Emergency command, communication and cooperation system | 0.770 | 2.201 | 52.653 | 52.653 |
2. Disaster plans | 0.874 | 2.121 | 17.685 | 70.338 |
3. Disaster stockpiles and management | 0.924 | 1.199 | 9.050 | 79.388 |
4. Emergency staff | 0.895 | 1.035 | 7.552 | 86.940 |
5. Emergency training and drills | 0.789 | 0.604 | 4.906 | 91.846 |
6. Emergency services and surge capacity | 0.911 | 0.311 | 3.883 | 95.729 |
7. Hospital safety standard and procedures | 0.973 | 0.181 | 2.259 | 97.989 |
8. Recovery and adaptation strategies | 0.821 | 0.161 | 2.011 | 100.000 |
Note: Extraction method: principal component analysis.
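The Kaiser criterion used above (retain components with eigenvalues greater than one) can be sketched as follows; this is an illustrative reimplementation with NumPy, not the SPSS routine used in the study:

```python
import numpy as np

def kaiser_retained(data: np.ndarray) -> tuple[np.ndarray, np.ndarray, int]:
    """Eigenvalues of the correlation matrix in descending order, the
    percentage of variance each explains, and the number of components
    retained under the Kaiser criterion (eigenvalue > 1)."""
    corr = np.corrcoef(data, rowvar=False)
    eigvals = np.sort(np.linalg.eigvalsh(corr))[::-1]  # descending
    pct = 100 * eigvals / eigvals.sum()
    n_retained = int((eigvals > 1).sum())
    return eigvals, pct, n_retained
```

Applied to the eight domain scores, this procedure would reproduce the eigenvalue and percentage-of-variance columns of Table 1 and select the four components with eigenvalues above one.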
3.3. Rotation
The relationship between the initial domains and the extracted components was not obvious. Thus, varimax rotation was used to redistribute the variance across the extracted components, making their relationship with the domains clearer and easier to explain [42]. These principal components can be interpreted meaningfully; therefore, in this study they are referred to as component factors.
According to the rotated component matrix, each of the four factors extracted the majority of its information from different domains (the proportion of information extracted from each domain is given in parentheses). The first factor contains information mainly from three domains: essential service maintenance and surge capacity (0.901), emergency staff capability (0.891), and training and drills (0.625). These three domains reflect or impact upon a hospital’s medical response performance during disasters, so the factor is named “emergency medical response capability”. The second factor was closely related to the domains of disaster plans (0.611), crisis leadership and cooperation (0.762), and recovery and adaptation strategies (0.878), and reflects “hospital disaster management mechanisms”. The third factor includes hospital infrastructural strength and safety strategies to cope with disasters, and represents “hospital safety and vulnerability” (0.951). The fourth factor was mainly representative of the domain of disaster stockpiles (0.896), which focuses on a hospital’s “disaster resources”. Hence, the four factors were identified and named: emergency medical response capability (F1), disaster management mechanisms (F2), infrastructural safety (F3), and disaster resources (F4). Together they explain most of the overall level of a hospital’s disaster resilience.
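Varimax rotation maximizes the variance of the squared loadings within each factor, which concentrates each domain's loading on a single factor. A minimal NumPy sketch of the standard algorithm (illustrative only; the study used SPSS) is:

```python
import numpy as np

def varimax(loadings: np.ndarray, gamma: float = 1.0,
            max_iter: int = 100, tol: float = 1e-6) -> np.ndarray:
    """Orthogonal varimax rotation of a (domains x factors) loading matrix."""
    p, k = loadings.shape
    rotation = np.eye(k)
    var = 0.0
    for _ in range(max_iter):
        rotated = loadings @ rotation
        # Update the rotation via SVD of the varimax criterion gradient.
        u, s, vt = np.linalg.svd(
            loadings.T @ (rotated ** 3
                          - (gamma / p) * rotated
                          @ np.diag((rotated ** 2).sum(axis=0)))
        )
        rotation = u @ vt
        new_var = s.sum()
        if new_var - var < tol:
            break
        var = new_var
    return loadings @ rotation
```

Because the rotation is orthogonal, each domain's communality (its row sum of squared loadings) is unchanged; only the distribution of loadings across factors becomes sharper.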
3.4. The Construction of the Framework
The results of the factor analysis are largely consistent with the original, theoretically developed framework established from the literature review and the Modified-Delphi Method. In the validated framework, the 43 measurable indicators were divided into four factors for a conceptual understanding of hospital resilience. A summary of these results is presented in Table 2.
Table 2. Construct of measurable items in the validated framework.

Factor 1: Emergency Medical Response Capability
- Domain 1: Emergency Services and Surge Capacity (includes on-site rescue, hospital treatment, and surge capacity)
- Domain 2: Emergency Staff Capability (refers to emergency staff specialties, qualifications and supporting strategies)
- Domain 3: Emergency Training and Drills (disaster training and drills involved in hospital daily work)

Factor 2: Disaster Management Mechanisms
- Domain 4: Emergency Command, Communication and Cooperation System
- Domain 5: Disaster Plans (plans to prepare for all-hazards disasters in advance)
- Domain 6: Recovery and Adaptation Strategies (for recovery and improvement after disasters)

Factor 3: Disaster Resources
- Domain 7: Disaster Stockpiles and Logistics Management (stockpile and management of emergency supplies and medications)

Factor 4: Hospital Infrastructural Safety
- Domain 8: Hospital Safety Standard and Procedures (refers to hospital infrastructural safety, surveillance system, and network backup)
3.5. Discriminant Validity of the Framework
A comparison of the scores (total score and mean score of each factor) among different categories of hospitals is shown in Table 3. As illustrated, the mean score of each factor for tertiary grade A hospitals is higher than that for tertiary grade B hospitals, and the difference is statistically significant for all factors. The mean scores of the four factors for general hospitals are higher than those for specialized hospitals, except for the factor of hospital safety; however, the differences are statistically significant only for emergency medical response capability (F1) and disaster resources (F4). The mean score of each factor for hospitals with the mission of “regional disaster rescue” is higher than for hospitals without this mission, with statistically significant differences for most factors. The overall scores for tertiary A hospitals, general hospitals, and hospitals with the mission are higher than those for tertiary B hospitals, specialized hospitals, and hospitals without the mission, respectively.
Table 3. Comparison of mean factor scores across hospital categories.

Variables | Category | No. | Emergency Response (F1) | Management Mechanisms (F2) | Safety (F3) | Resources (F4) | Overall Score (F) |
---|---|---|---|---|---|---|---|
Level | Tertiary A | 27 | 29.26 | 12.89 | 8.07 | 2.26 | 52.48 |
 | Tertiary B | 14 | 24.29 | 7.86 | 6.93 | 1.29 | 40.36 |
Type | General | 27 | 31.60 | 11.30 | 7.63 | 2.00 | 52.52 |
 | Specialized | 14 | 19.79 | 10.93 | 7.79 | 1.79 | 40.29 |
Disaster Mission | Assigned | 13 | 37.31 | 13.92 | 8.23 | 2.38 | 61.85 |
 | No mission | 28 | 23.04 | 9.89 | 7.43 | 1.71 | 42.07 |
Total | | 41 | 27.56 | 11.17 | 7.68 | 1.93 | 48.34 |
Notes: Values are mean scores (MS). Highest possible scores: emergency medical response capability = 51; disaster management mechanisms = 17; hospital infrastructural safety = 9; disaster resources = 4. * p < 0.05; differences tested by non-parametric test (Mann-Whitney test).
3.6. Scoring and Modelling
3.6.1. Hospital Self-evaluation Scale Card (Unweighted)
When the questionnaire is used by hospitals for self-assessment, the scale card set out in Table 4 can be used to readily assess hospital vulnerabilities. There are 81 binary questions in the questionnaire that reflect the key indicators in the evaluation framework. When all parts of the questionnaire are completed, the scores of these questions (“0” or “1”) can be summed. The highest possible score for each factor is given in the notes to Table 3. The overall resilience score is obtained by adding the scores of the four factors (the highest score is 81). The quartiles (25%, 75%) were used to categorize the levels of the overall resilience score and each dimension score. These classification criteria were derived from the research approach of the Torrens Resilience Institute, which in 2012 developed a scorecard toolkit to measure community disaster resilience [47]. As a result, if the overall score is over 60, the hospital is likely to be extremely resilient to disasters. If the overall score is less than 20, the hospital is likely to be severely impacted by a disaster and/or to have greater difficulty recovering. Particular attention needs to be paid to the scores in the four components of resilience. If the score for one component falls in the low zone, that aspect of resilience is likely the highest priority for action. Thus, the scores can be very useful in highlighting those aspects of resilience that most need attention from hospital managers and relevant decision-makers. For instance, applying the data from this study sample (Table 3) to the scale card (Table 4), the hospitals’ average overall resilience was found to be located in the “moderate zone” (score = 48.34). Among the four factors, only the score for hospital safety (F3) was located in the “high zone” of resilience (score = 7.68), indicating a good level of resilience for that factor.
The scores of the other three factors (F1, F2, and F4) were located in the “moderate zone”.
Table 4. Hospital self-evaluation scale card for levels of disaster resilience.
Factors | Low Zone | Moderate Zone | High Zone |
---|---|---|---|
Overall | 25% (0–20) | 26%–75% (21–60) | 76%–100% (61–81) |
Emergency medical response | 25% (0–12) | 26%–75% (13–38) | 76%–100% (39–51) |
Management mechanisms | 25% (0–4) | 26%–75% (5–12) | 76%–100% (13–17) |
Safety and vulnerability | 25% (0–2) | 26%–75% (3–6) | 76%–100% (7–9) |
Resources | 25% (0–1) | 26%–75% (2–3) | 76%–100% (4) |
Notes: Low zone (poor level of resilience); Moderate zone (moderate level of resilience); High zone (Resilient). The classification criteria of 25% and 75% were derived from the research approach of the Torrens Resilience Institute, in 2012 regarding the development of a scorecard toolkit to measure community disaster resilience [47].
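The quartile-based zoning logic of the scale card can be sketched as a simple rule (a hypothetical helper, not part of the published toolkit; the thresholds follow the 25%/75% cut-offs described above):

```python
# Classify a summed score into the scale card's resilience zones using the
# 25% and 75% quartile cut-offs of the maximum attainable score.

def classify(score: int, max_score: int) -> str:
    if score <= 0.25 * max_score:
        return "low"
    if score > 0.75 * max_score:
        return "high"
    return "moderate"

print(classify(48, 81))  # moderate (sample's average overall score, max 81)
print(classify(8, 9))    # high (safety factor F3, max 9)
```

With these cut-offs, the zone boundaries reproduce the ranges listed in Table 4 (e.g., an overall score of 0–20 is "low" because 20 ≤ 0.25 × 81, and 61–81 is "high" because 61 > 0.75 × 81).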
3.6.2. Scoring Model for Comparing and Ranking Hospitals in an Area (Weighted)
To compare or rank disaster resilience among hospitals, the weight of each factor was taken into consideration, as different factors make different contributions to the overall resilience [37].
The score for each factor for each of the sample hospitals was obtained by regression analysis based on the component score coefficient matrix. Each factor score for the study hospital sample can be expressed using the following model [11] (data from Zhong et al. 2014 [37]), where X1–X8 denote the standardized scores of the eight domains:

F1 = 0.524X1 + 0.570X2 + 0.191X3 − 0.080X4 − 0.220X5 − 0.083X6 − 0.147X7 + 0.080X8
F2 = 0.030X1 − 0.315X2 + 0.138X3 + 0.443X4 + 0.182X5 + 0.659X6 − 0.214X7 − 0.211X8
F3 = −0.253X1 − 0.116X2 + 0.093X3 + 0.007X4 + 0.364X5 − 0.318X6 + 0.936X7 − 0.199X8
F4 = −0.144X1 + 0.288X2 − 0.109X3 − 0.119X4 + 0.148X5 − 0.167X6 − 0.148X7 + 0.972X8   (1)
The weight of each factor was assigned according to the proportion of its variance contribution to the cumulative variance contribution of the four primary factors (as in Table 1) [42]. The evaluation model of hospital disaster resilience was then established, and the overall level of hospital disaster resilience (F) can be calculated using the following model:
F = 0.615F1 + 0.202F2 + 0.103F3 + 0.080F4   (2)
The weight for emergency medical response capability was 0.615; for disaster management mechanisms, 0.202; for hospital infrastructural safety, 0.103; and for disaster resources, 0.080. The disaster resilience score for each hospital in the study was calculated accordingly, and their relative ranks are listed in Table 5. Due to ethical considerations and respect for confidentiality, each hospital’s name was replaced with a hospital ID.
Table 5. Overall resilience score (F) and rank of each sample hospital.
ID | F | Rank | ID | F | Rank | ID | F | Rank | ID | F | Rank |
---|---|---|---|---|---|---|---|---|---|---|---|
1 | −0.554 | 32 | 12 | 0.482 | 11 | 23 | 1.120 | 1 | 34 | 0.733 | 7 |
2 | −0.997 | 37 | 13 | −0.195 | 26 | 24 | −0.093 | 23 | 35 | 0.331 | 16 |
3 | 0.384 | 13 | 14 | −0.970 | 36 | 25 | 0.095 | 19 | 36 | 0.733 | 8 |
4 | −0.206 | 27 | 15 | −1.189 | 41 | 26 | −0.262 | 30 | 37 | 1.088 | 2 |
5 | −0.750 | 35 | 16 | −1.126 | 40 | 27 | −0.085 | 22 | 38 | 0.932 | 3 |
6 | −0.147 | 24 | 17 | −1.005 | 38 | 28 | 0.774 | 6 | 39 | −0.176 | 25 |
7 | 0.049 | 20 | 18 | −1.116 | 39 | 29 | 0.623 | 9 | 40 | 0.360 | 14 |
8 | −0.239 | 29 | 19 | −0.04 | 21 | 30 | −0.572 | 33 | 41 | −0.729 | 34 |
9 | 0.262 | 17 | 20 | 0.8624 | 4 | 31 | −0.235 | 28 | |||
10 | 0.550 | 10 | 21 | 0.7885 | 5 | 32 | 0.412 | 12 | |||
11 | −0.33 | 31 | 22 | 0.3393 | 15 | 33 | 0.0990 | 18 |
During the analysis, regression factor scores were standardized by construction; thus, the scores shown are relative. A negative score means that the hospital’s resilience was rated below the sample average, and a score closer to 0 means that overall resilience was closer to the sample average [48]. According to Table 5, 20 hospitals had positive comprehensive scores (F), accounting for approximately 50% of the sample. The results indicate that hospital disaster resilience in the province was insufficient, with considerable variation among the sampled hospitals. Similarly, the scores for each of the four factors (F1, F2, F3, and F4) can be calculated and ranked respectively, assisting in identifying the highest-priority areas for strengthening resilience.
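Equations (1) and (2) together define a linear map from the eight standardized domain scores to the overall resilience score. A sketch of that computation follows; the coefficients are transcribed from the equations above, and the input is assumed to be a hospital's standardized domain scores X1–X8:

```python
import numpy as np

# Coefficients of Equation (1): rows F1-F4, columns X1-X8 (standardized
# domain scores), from the component score coefficient matrix.
COEF = np.array([
    [ 0.524,  0.570,  0.191, -0.080, -0.220, -0.083, -0.147,  0.080],
    [ 0.030, -0.315,  0.138,  0.443,  0.182,  0.659, -0.214, -0.211],
    [-0.253, -0.116,  0.093,  0.007,  0.364, -0.318,  0.936, -0.199],
    [-0.144,  0.288, -0.109, -0.119,  0.148, -0.167, -0.148,  0.972],
])
WEIGHTS = np.array([0.615, 0.202, 0.103, 0.080])  # Equation (2)

def resilience_score(x: np.ndarray) -> float:
    """Overall resilience F for one hospital's standardized domain scores."""
    factors = COEF @ x               # F1..F4, Equation (1)
    return float(WEIGHTS @ factors)  # F, Equation (2)
```

Because the domain scores are standardized, a hospital scoring exactly at the sample mean on every domain (x = 0) obtains F = 0, which is why roughly half the hospitals in Table 5 have positive scores and half negative.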
3.7. Internal Consistency
The results indicate that the four extracted component factors had a generally acceptable level of internal consistency (Cronbach’s alpha based on standardized items = 0.744). The eight domains also had good overall internal consistency (Cronbach’s alpha = 0.780). The internal consistency (Cronbach’s alpha) of most domains was over 0.6; only one domain, emergency training and drills, fell below 0.6 (0.406).
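Cronbach's alpha compares the sum of the item variances with the variance of the total score. A minimal reimplementation (illustrative only; the study used SPSS) is:

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """Cronbach's alpha for an (n_respondents, n_items) score matrix."""
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)
    total_var = items.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)
```

Values above roughly 0.7 are conventionally regarded as acceptable internal consistency, consistent with the overall figures reported above; perfectly correlated items yield an alpha of 1.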
4. Discussion
4.1. Significance
The concept of hospital resilience provides a comprehensive approach to improving a hospital’s ability to withstand the impact of a disaster, and to reducing disaster-associated mortality and morbidity through efficient health responses. In this study, a principal component analysis was conducted, yielding a four-factor solution for hospital resilience.
The significance of this study is three-fold. Firstly, the proposed framework launches an important discussion regarding the conceptual understanding of what constitutes hospital disaster resilience. Departing from the majority of existing research, which focuses on only one or two aspects of a hospital’s disaster ability (such as preparedness), the framework captures the principal components needed to depict a hospital’s overall ability to cope with disasters comprehensively. It is proposed that a multidimensional measurement is more reliable and comprehensive for measuring a hospital’s overall resilience than examining its disaster capability through only one or a limited number of aspects. Thus, the proposed framework measures preparedness before disasters, and also describes the underlying elements of disaster functions and health outcomes, i.e., hospital emergency medical response, and recovery and adaptation strategies.
Secondly, the results of the factor analysis are encouraging. The analysis yielded factors largely consistent with the conceptualization of hospital resilience and the construction of its primary domains, generating a four-factor structure: emergency medical response capability (F1), disaster management mechanisms (F2), hospital infrastructural safety (F3), and disaster resources (F4). The four factors appear to move from the micro level to the macro level. F1 was at the ground level and concerned immediate “doing”, encompassing emergency medicine capability, surge capacity, and staff capability. F2 related to management, plans and cooperation; it was not immediate “doing”, but rather about having policies and procedures in place that can be enacted at the right time. F3 and F4 concerned infrastructure and resources, which involve macro-level planning for safety, such as standards, procedures, and disaster resources.
Finally, the four-factor structure offers a way of modelling the overall level of hospital resilience and the level of four factors independently. There are two ways of modelling: one way is to use the self-assessment scales for categorizing the level of hospital resilience. It uses the questionnaire as a simple checklist for hospital self-evaluation. It can provide an overall score, as well as provide scores on particular areas of resilience. Thus, it can enable hospitals to drill down to identify areas for improvement and can be used as a basis for measuring improvement over time. Further, the scales can be adjusted and reconstructed with more items being added into the questionnaire for future studies. Even if the highest scores (i.e., total number of questions) are different, the same classification criteria (25%, 75%) can still be used for the classification of the levels [47]. Alternatively, the regression modelling of different weights for each factor can be used. The modelling also has regional application as it enables regions or provinces to drill down into the resilience of their hospitals. The scores can provide a comparison of hospital resilience for hospitals of similar size and function within and across regions. Thus, the regions and provinces can build capacity in a targeted and coordinated way.
The strength of this study is that the preliminary evaluation framework and the derived questionnaire were developed through a systematic approach (an international literature review, the Modified-Delphi Method in China, and empirical data). The internal consistency of the framework and questionnaire was relatively high, and its construct validity was supported to some extent. This preliminary framework can provide a foundation for further similar studies or evaluations.
Furthermore, according to the experts’ opinions in the Delphi study, in addition to routine treatment, tertiary hospitals are also expected to have outreach functions and to cope with major disasters (e.g., earthquakes, bioterrorism, and epidemics) that can cause mass casualties on-site. Hospital resilience is an essential element of resilient communities; therefore, hospitals need to provide immediate healthcare services to affected communities [49]. Moreover, critical healthcare services include both the transport of patients into hospital and the delivery of care to patients on site [50]. Thus, during major disasters, tertiary hospitals need to be capable of deploying expert teams, equipment, and medical care services into the field for on-site medical rescue, as well as supporting the treatment provided by local hospitals for large numbers of patients [51,52]. At present, such functions are commissioned by the local government from local tertiary hospitals, in order to build a district-coordinated medical response network for major disasters [52].
4.2. Model Justification
The weights of these four factors were calculated using factor analysis. However, the calculated weights needed to be justified against the resilience concept and the actual resilience status of hospitals. Hospital resilience is a comprehensive concept; it includes structural components (e.g., facility and infrastructural safety), non-structural components (e.g., network, staff, equipment, medication), emergency healthcare service components (e.g., medical response and treatment, continuity of medical service, surge capacity, training and drills), and disaster management (e.g., plan and procedure, command, cooperation, crisis communication, recovery and adaptation strategies) [10,12,53]. Thus, the eight primary domains and the extracted four factors complied with the concept and made intuitive sense.
Among the four factors, hospital medical response capability (F1) was identified as the main component factor, as it was relevant to emergency healthcare performance and was health outcome-oriented in times of disasters. F1 merged three domains (i.e., emergency service capability and surge capacity, emergency staff capability, and disaster training and drills); the combination made intuitive sense. The domains of emergency service capability (e.g., medical equipment, on-site rescue, triage of patients, and prioritization of functions) and surge capacity (i.e., surge beds, supplies, and staff) were essential for hospital medical performance during disasters. The presence of key staff, and ensuring that they were suitably trained and ready, also impacted upon the hospital’s disaster performance [27,44]. Hence, F1 was about the “doing” capacity at the ground level, and linked directly to the hospital’s medical care functions during disasters.
Hospital disaster management mechanisms (F2) was the second most highly weighted factor; it consisted of disaster plans, the command and control system, cooperation, communication, and recovery and adaptation strategies. F2 reflects the basic disaster management system, which refers to the resourceful strategies (adaptive flexibility) that can be used to improve hospital operational functions and, thus, to achieve hospital disaster resilience. F2 serves to efficiently recover a hospital’s affected essential functions, using a wide variety of disaster management approaches including “various disaster plans”, “command and control system”, “communication protocols”, “cooperation with community facilities”, “strategies to prioritize and maintain essential health functions”, “strategies to share and manage emergency supplies and staff”, and “recovery and adaptation strategies after disasters” (these illustrated indicators are presented in Table 2). Thus, F1 and F2 were the key components influencing the speed with which the recovery of a hospital’s essential functions was achieved (rapidity); consequently, they should be the core components of hospital disaster resilience [21].
The factors of hospital infrastructural safety (F3) and hospital disaster resources (F4) were weighted less highly. Physical infrastructure safety was found to have a significant effect on maintaining hospital functionality during various disasters (e.g., earthquakes, floods, fires, and epidemics), and network redundancies can increase hospital resilience [54,55]. F3 in this study refers to two issues: the infrastructure (or procedures) of a hospital being compliant with construction standards to guarantee the hospital’s internal safety; and using the redundancies of utility networks (e.g., equipment, communication, power, and water pumping) to continue hospital functions during disasters. It may be controversial that hospital infrastructure did not weigh more prominently. After all, hospital infrastructure is essential to resilience, as surge capacity and other response solutions could not occur in the face of infrastructure collapse [54,55,56]. However, the evaluation framework of hospital resilience in this context was meant to be used by hospital managers and governmental administrators. Thus, the engineering perspective, especially of nonstructural infrastructure components (e.g., transportation systems, elevators, ceilings, and utility systems) [12,54,55,56], was likely understated, as more items focused on the perspective of disaster management, emergency medicine capability, and capacity.
F4 reflects the critical resources, including medications and supplies, used to achieve continuity of health functions. The stockpile of these resources should comply with the hospital emergency medicine catalog and the emergency resources reservation plan issued by the local health department. The critical supplies included food, water, hand hygiene supplies, stretchers, wheelchairs, ventilators, IV pumps, tourniquets, personal protective equipment, etc., while the critical medications included antimicrobial agents, cardiac medications, insulin, anti-hypertensive agents, IV fluids, etc. [57]. Although different levels of hospitals may have wide variation in disaster resources and stockpiles, hospitals should be prepared with self-sustained supplies and medications for 2–3 days of a disaster, regardless of the level of structural damage [57,58]. Broadly based strategies for resource logistics management (e.g., sharing with other facilities, contracting with pharmaceutical companies, and priority distribution) should also be applied to maintain and surge these critical resources. In summary, F3 and F4 together can be used to reflect a hospital’s robustness (the extent to which function is maintained) to withstand disasters, initially without external assistance.
4.3. Limitations
There are several limitations to this study. Firstly, the framework was tested on a relatively small sample (n = 41) of tertiary hospitals in one province in China, which restricts the generalizability of the findings. Thus, the results may not be directly transferable to hospitals in other areas or countries, or to smaller facilities. However, as this work is preliminary and exploratory in nature, it was appropriate to test the framework in a single region or province prior to a larger roll-out, such as a national study. Hence, the proposed framework needs to be adapted to other contexts (e.g., other areas or countries) for further validation. Secondly, due to ethical considerations, the surveyed hospitals were guaranteed anonymity. It was, therefore, not possible to compare and validate their resilience scores (see Table 5) against their real status of emergency medicine and disaster preparedness. Thirdly, the questionnaire measures hospital resilience across different dimensions, with each dimension contributing a different weight to overall resilience. The weights were calculated from the empirical data, so they would benefit from further validation against experts’ opinions and additional empirical studies. Finally, the framework was derived from a systematic review and a Delphi study in China. Thus, the indicators in the framework may not capture all possible indicators, and its generalizability is limited. With the addition of further expert perspectives (e.g., engineering and emergency medicine experts from other countries), as well as newer information that has become available since the survey was conducted, more items may be added to the construct and retested for validation through ongoing research. Additionally, the self-evaluation scales and the scoring model could be adjusted accordingly.
Despite the limitations, the framework can be regarded as a checklist to evaluate key indicators of hospital vulnerability and to identify priority practices that could better prepare the facilities for future disasters. The methodological framework and some of the agreed indicators may inform the development of indicators of hospital resilience in other countries or areas.
5. Conclusions
The comprehensive framework provides a way to conceptualize hospital resilience, and a foundation for developing a user-friendly instrument for measuring it. A four-factor structure of hospital resilience was identified. The reliability and construct validity of this framework was tested. Although additional work is still needed, the findings provide a basic framework and foundation for future research.
Acknowledgments
Shuang Zhong is supported by a QUT-CSC joint Ph.D. scholarship. Funds for covering the costs to publish in open access are supported by School of Public Health and Social Work & Institute of Health and Biomedical Innovation, Queensland University of Technology, Australia.
Supplementary Files
Author Contributions
Shuang Zhong and Gerard FitzGerald designed the study and developed the questionnaire. Shuang Zhong and Yuli Zang supervised the data collection and data entry process. Shuang Zhong performed data checkup, data analysis and drafted the manuscript. All authors participated in writing, revision and approval of the final manuscript.
Conflicts of Interest
The authors declare no conflict of interest.
References
- 1.Guha-Sapir D., Vos F., Below R., Ponserre S. Annual Disaster Statistical Review 2011: The Numbers and Trends. The Centre for Research on the Epidemiology of Disasters (CRED); Brussels, Belgium: 2012. [Google Scholar]
- 2.Albanese J., Birnbaum M., Cannon C., Cappiello J., Chapman E., Paturas J., Smith S. Fostering disaster resilient communities across the globe through the incorporation of safe and resilient hospitals for community-integrated disaster responses. Prehosp. Disaster Med. 2008;23:385–390. doi: 10.1017/s1049023x00006105. [DOI] [PubMed] [Google Scholar]
- 3.Braun B.I., Wineman N.V., Finn N.L., Barbera J.A., Schmaltz S.P., Loeb J.M. Integrating hospitals into community emergency preparedness planning. Ann. Intern. Med. 2006;144:799–811. doi: 10.7326/0003-4819-144-11-200606060-00006. [DOI] [PubMed] [Google Scholar]
- 4.Paturas J.L., Smith D., Smith S., Albanese J. Collective response to public health emergencies and large-scale disasters: Putting hospitals at the core of community resilience. J. Bus. Contin. Emer. Plan. 2010;4:286–295. [PubMed] [Google Scholar]
- 5.Cimellaro G.P., Reinhorn A.M., Bruneau M. Framework for analytical quantification of disaster resilience. Eng. Struct. 2010;32:3639–3649. doi: 10.1016/j.engstruct.2010.08.008. [DOI] [Google Scholar]
- 6.Sauer L.M., McCarthy M.L., Knebel A., Brewster P. Major influences on hospital emergency management and disaster preparedness. Disaster Med. Public Health Prep. 2009;3:S68–S73. doi: 10.1097/DMP.0b013e31819ef060. [DOI] [PubMed] [Google Scholar]
- 7.Barbera J.A., Yeatts D.J., Macintyre A.G. Challenge of hospital emergency preparedness: Analysis and recommendations. Disaster Med. Public Health Prep. 2009;3:S74–S82. doi: 10.1097/DMP.0b013e31819f754c. [DOI] [PubMed] [Google Scholar]
- 8.Bruneau M., Chang S.E., Eguchi R.T., Lee G.C., O’Rourke T.D., Reinhorn A.M., Shinozuka M., Tierney K., Wallace W.A., von Winterfeldt D. A framework to quantitatively assess and enhance the seismic resilience of communities. Earthq. Spectra. 2003;19:733–752. doi: 10.1193/1.1623497. [DOI] [Google Scholar]
- 9.Cimellaro G.P., Reinhorn A.M., Bruneau M. Seismic resilience of a hospital system. Struct. Infrastr. Eng. 2010;6:127–144. doi: 10.1080/15732470802663847. [DOI] [Google Scholar]
- 10.Zhong S., Clark M., Hou X.-Y., Zang Y.-L., Fitzgerald G. Development of hospital disaster resilience: Conceptual framework and potential measurement. Emerg. Med. J. 2013 doi: 10.1136/emermed-2012-202282. [DOI] [PubMed] [Google Scholar]
- 11.Zhong S., Clark M., Hou X.-Y., Zang Y., FitzGerald G. Establishing healthcare resilience to cope with disasters: Theory, definition and conceptual framework. Emergencias. 2014;26:69–77. [Google Scholar]
- 12.Bruneau M., Reinhorn A. Exploring the concept of seismic resilience for acute care facilities. Earthq. Spectra. 2007;23:41–62. doi: 10.1193/1.2431396. [DOI] [Google Scholar]
- 13.Carthey J., de Leval M.R., Reason J.T. Institutional resilience in healthcare systems. Qual. Saf. Health Care. 2001;10:29–32. doi: 10.1136/qhc.10.1.29. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 14.Jeffcott S.A., Ibrahim J.E., Cameron P.A. Resilience in healthcare and clinical handover. 2009;18:256–260. doi: 10.1136/qshc.2008.030163. [DOI] [PubMed] [Google Scholar]
- 15.McDaniels T., Chang S., Cole D., Mikawoz J., Longstaff H. Fostering resilience to extreme events within infrastructure systems: Characterizing decision contexts for mitigation and adaptation. Global Environ. Change. 2008;18:310–318. doi: 10.1016/j.gloenvcha.2008.03.001. [DOI] [Google Scholar]
- 16.McAslan A. Organisation Resilience: Understanding the Concept and Its Application. [(accessed on 18 March, 2014)]. Available online: http://torrensresilience.org/images/pdfs/organisational%20resilience.pdf.
- 17.Rogers P. Development of resilient Australia: Enhancing the PPRR approach with anticipation, assessment and registration of risks. Aust. J. Emerg. Manag. 2011;26:54–59. [Google Scholar]
- 18.Kahan J.H., Allen A.C., George J.K. An operational framework for resilience. J. Homel. Secur. Emerg. Manag. 2010;6:1–48. [Google Scholar]
- 19.Sternberg E. Planning for resilience in hospital internal disaster. Prehosp. Disaster Med. 2003;18:291–300. doi: 10.1017/s1049023x00001230. [DOI] [PubMed] [Google Scholar]
- 20.Bruneau M., Filiatrault A., Lee G., O’Rourke T., Reinhorn A., Shinozuka M., Tierney K. White Paper on the SDR Grand Challenges for Disaster Reduction. MCEER; Buffalo, NY, USA: 2007. [Google Scholar]
- 21.Cimellaro G.P., Reinhorn A.M., Bruneau M. Performance based metamodel for healthcare facilities. Earthq. Eng. Struct. 2011;40:1197–1217. [Google Scholar]
- 22.Higgins W., Wainright C., Lu N., Carrico R. Assessing hospital preparedness using an instrument based on the Mass Casualty Disaster Plan Checklist: Results of a statewide survey. Amer. J. Infect. Control. 2004;32:327–332. doi: 10.1016/j.ajic.2004.03.006. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 23.Kaji A.H., Lewis R.J. Hospital disaster preparedness in Los Angeles county. Acad. Emerg. Med. 2006;13:1198–1203. doi: 10.1111/j.1553-2712.2006.tb01648.x. [DOI] [PubMed] [Google Scholar]
- 24.Kollek D., Cwinn A.A. Hospital emergency readiness overview study. Prehosp. Disaster Med. 2011;1:1–7. doi: 10.1017/S1049023X11006212. [DOI] [PubMed] [Google Scholar]
- 25.Li X., Huang J., Zhang H. An analysis of hospital preparedness capacity for public health emergency in four regions of China: Beijing, Shandong, Guangxi, and Hainan. BMC Public Health. 2008;8 doi: 10.1186/1471-2458-8-319. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 26.World Health Organization . World Health Organization (WHO) Regional Office for the Western Pacific; Geneva, Switzerland: 2010. Safe Hospitals in Emergencies and Disasters: Structural, Non-Structural and Functional Indicators. [Google Scholar]
- 27.World Health Organization . World Health Organization (WHO) Regional Office for Europe; Copenhagen, Denmark: 2011. Hospital Emergency Response Checklist—An All-Hazards Tool for Hospital Administrators and Emergency Managers. [Google Scholar]
- 28.Nelson C., Lurie N., Wasserman J., Zakowski S. Conceptualizing and defining public health emergency preparedness. Amer. J. Public Health. 2007;97:S9–S11. doi: 10.2105/AJPH.2007.114496. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 29.Asch S.M., Stoto M., Mendes M., Valdez R.B., Gallagher M.E., Halverson P., Lurie N. A review of instruments assessing public health preparedness. Public Health Rep. 2005;120:532–542. doi: 10.1177/003335490512000508. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 30.Devlen A. How to build a comprehensive business continuity programme for a healthcare organisation. J. Bus. Contin. Emer. Plan. 2009;4:47–61. [PubMed] [Google Scholar]
- 31.Krogstad U., Hofoss D., Hjortdahl P. Continuity of hospital care: Beyond the question of personal contact. BMJ. 2002;324:36–38. doi: 10.1136/bmj.324.7328.36. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 32.Kaji A., Koenig K.L., Bey T. Surge capacity for healthcare systems: A conceptual framework. Acad. Emerg. Med. 2006;13:1157–1159. doi: 10.1111/j.1553-2712.2006.tb01641.x. [DOI] [PubMed] [Google Scholar]
- 33.Guo Z. Hospital Management. People’s Health Publishing House; Beijing, China: 1990. [Google Scholar]
- 34.Chinese Ministry of Health . General Hospital Hierarchical Management Standard. Medical Administration Department, Ministry of Health; Beijing, China: 1989. (in Chinese) [Google Scholar]
- 35.Zhong S., Clark M., Hou X.-Y., Zang Y., FitzGerald G. Development of key indicators of hospital resilience to cope with disasters: A consensus study in China. J. Health Serv. Res. Policy. 2014 doi: 10.1177/1355819614561537. under review. [DOI] [PubMed] [Google Scholar]
- 36.Brislin R.W. The Wording and Translation of Research Instruments. Sage Publications, Inc.; Thousand Oaks, CA, USA: 1986. [Google Scholar]
- 37.Zhong S., Hou X.-Y., Clark M., Zang Y.-L., Wang L., Xu L.-Z., FitzGerald G. Disaster resilience in tertiary hospitals: A cross-sectional survey in Shandong Province, China. BMC Health Serv. Res. 2014;14 doi: 10.1186/1472-6963-14-135. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 38.Norusis M.J. SPSS 15.0 Guide to Data Analysis. Prentice Hall; Englewood Cliffs, NJ, USA: 2007. [Google Scholar]
- 39.Pett M.A., Lackey N.R., Sullivan J.J. Making Sense of Factor Analysis: The Use of Factor Analysis for Instrument Development in Health Care Research. Sage; Thousand Oaks, CA, USA: 2003. [Google Scholar]
- 40.O’Brien K. Factor analysis: An overview in the field of measurement. Physiother. Can. 2007;59:142–155. doi: 10.3138/ptc.59.2.142. [DOI] [Google Scholar]
- 41.Hui Z., Jian-Shi H., Xiong H., Peng L., Da-Ling Q. An analysis of the current status of hospital emergency preparedness for infectious disease outbreaks in Beijing, China. Amer. J. Infect. Control. 2007;35:62–67. doi: 10.1016/j.ajic.2006.03.014. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 42.Norusis M. SPSS 16.0 Guide to Data Analysis. Prentice Hall Press; Englewood Cliffs, NJ, USA: 2008. [Google Scholar]
- 43.Braun B.I., Darcy L., Divi C., Robertson J., Fishbeck J. Hospital bioterrorism preparedness linkages with the community: Improvements over time. Amer. J. Infect. Control. 2004;32:317–326. doi: 10.1016/j.ajic.2004.01.003. [DOI] [PubMed] [Google Scholar]
- 44.Niska R.W., Shimizu I.M. Hospital preparedness for emergency response: United States, 2008. Natl. Health Stat. Report. 2011;24:1–14. [PubMed] [Google Scholar]
- 45.Thome C.D., Levitin H., Oliver M., Losch-Skidmore S., Neiley B.A. A pilot assessment of hospital preparedness for bioterrorism events. Prehosp. Disaster Med. 2006;21:414–422. doi: 10.1017/s1049023x0000412x. [DOI] [PubMed] [Google Scholar]
- 46.Hospital Safety Index: Evaluation Forms for Safe Hospitals. PAHO, WHO; Washington, DC, USA: 2008. [Google Scholar]
- 47.Torrens Resilience Institute Developing a Model and Tool to Measure Community Disaster Resilience: Community Disaster Resilience Scorecard Toolkit. [(accessed on 7 May 2014)]. Available online: http://torrensresilience.org/images/pdfs/toolkit/tritoolkit.pdf.
- 48.DiStefano C., Zhu M., Mindrila D. Understanding and using factor scores: Considerations for the applied researcher. Pract. Assess. Res. Eval. 2009;14:1–11. [Google Scholar]
- 49.Berkes F. Understanding uncertainty and reducing vulnerability: Lessons from resilience thinking. Nat. Hazards. 2007;41:283–295. doi: 10.1007/s11069-006-9036-7. [DOI] [Google Scholar]
- 50.Farmer J.C., Carlton P.K., Jr. Providing critical care during a disaster: The interface between disaster response agencies and hospitals. Crit. Care Med. 2006;34:S56–S59. doi: 10.1097/01.CCM.0000199989.44467.2E. [DOI] [PubMed] [Google Scholar]
- 51.Shi Y., Zheng S. The strategic supporting role of a regional state-level hospital during medical rescue after Wenchuan earthquake. Chin. J. Evid. Based Med. 2008;8:380–382. doi: 10.1111/j.1756-5391.2008.00002.x. (in Chinese) [DOI] [PubMed] [Google Scholar]
- 52.Zhang L., Liu X., Li Y., Liu Y., Liu Z., Lin J., Shen J., Tang X., Zhang Y., Liang W. Emergency medical rescue efforts after a major earthquake: Lessons from the 2008 Wenchuan earthquake. Lancet. 2012;379:853–861. doi: 10.1016/S0140-6736(11)61876-X. [DOI] [PubMed] [Google Scholar]
- 53.Veterans Health Administration . VHA Comprehensive Emergency Management Program Analysis Capabilities Description. Veterans Health Administration; Washington, DC, USA: 2008. [Google Scholar]
- 54.Jacques C.C., McIntosh J., Giovinazzi S., Kirsch T.D., Wilson T., Mitrani-Reiser J. Resilience of the Canterbury Hospital System to the 2011 Christchurch Earthquake. [(accessed on 13 June 2014)]. Available online: http://www.scoop.co.nz/stories/AK1211/S00008/resilience-of-christchurch-hospitals.htm.
- 55.Myrtle R.C., Masri S.F., Nigbor R.L., Caffrey J.P. Classification and prioritization of essential systems in hospitals under extreme events. Earthq. Spectra. 2005;21:779–802. doi: 10.1193/1.1988338. [DOI] [Google Scholar]
- 56.Mitrani-Reiser J., Mahoney M., Holmes W.T., de la Llera J.C., Bissell R., Kirsch T. A functional loss assessment of a hospital system in the Bio-Bio province. Earthq. Spectra. 2012;28:S473–S502. doi: 10.1193/1.4000044. [DOI] [Google Scholar]
- 57.Kollek D. Disaster Preparedness for Health Care Facilities. McGraw-Hill Europe; New York, NY, USA: 2013. [Google Scholar]
- 58.Kirsch T.D., Mitrani-Reiser J., Bissell R., Sauer L.M., Mahoney M., Holmes W.T., Santa Cruz N., de la Maza F. Impact on hospital functions following the 2010 Chilean earthquake. Disaster Med. Public Health Prep. 2010;4:122–128. doi: 10.1001/dmphp.4.2.122. [DOI] [PubMed] [Google Scholar]