Health Services Research. 2006 Aug;41(4 Pt 2):1654–1676. doi: 10.1111/j.1475-6773.2006.00570.x

Improving Patient Safety in Hospitals: Contributions of High-Reliability Theory and Normal Accident Theory

Michal Tamuz, Michael I Harrison
PMCID: PMC1955347  PMID: 16898984

Abstract

Objective

To identify the distinctive contributions of high-reliability theory (HRT) and normal accident theory (NAT) as frameworks for examining five patient safety practices.

Data Sources/Study Setting

We reviewed and drew examples from studies of organization theory and health services research.

Study Design

After highlighting key differences between HRT and NAT, we applied the frames to five popular safety practices: double-checking medications, crew resource management (CRM), computerized physician order entry (CPOE), incident reporting, and root cause analysis (RCA).

Principal Findings

HRT highlights how double checking, which is designed to prevent errors, can undermine mindfulness of risk. NAT emphasizes that social redundancy can diffuse and reduce responsibility for locating mistakes. CRM promotes high reliability organizations by fostering deference to expertise, rather than rank. However, HRT also suggests that effective CRM depends on fundamental changes in organizational culture. NAT directs attention to an underinvestigated feature of CPOE: it tightens the coupling of the medication ordering process, and tight coupling increases the chances of a rapid and hard-to-contain spread of infrequent, but harmful errors.

Conclusions

Each frame can make a valuable contribution to improving patient safety. By applying the HRT and NAT frames, health care researchers and administrators can identify health care settings in which new and existing patient safety interventions are likely to be effective. Furthermore, they can learn how to improve patient safety, not only from analyzing mishaps, but also by studying the organizational consequences of implementing safety measures.

Keywords: Double-check, crew resource management, computerized physician order entry, incident reporting, root cause analysis


The Institute of Medicine (IOM) report To Err Is Human introduced many patient safety advocates to the idea of developing hospitals into high-reliability organizations (HROs) (Kohn, Corrigan, and Donaldson 2000). The HRO model is appealing, in part, because it helps health care organizations incorporate lessons learned from high hazard industries, such as aviation and nuclear power. In contrast, normal accident theory (NAT), another research perspective that examines similar industries, did not receive such widespread attention from the health care sector. Although high reliability theory (HRT) and NAT were first cast as competing perspectives, they are now considered complementary (Perrow 1999a; Weick 2004).

The assumptions, concepts, and empirical predictions of HRT and NAT are best viewed as providing two distinctive frames for understanding patient safety (Weick 2004).1 HRT and NAT are bodies of theory, research, and recommendations for practice and policy that evolved largely in parallel. Hence, rather than offering competing hypotheses, the two approaches often diverge in their assumptions and in the organizational features they treat as critical.

Each frame poses significant questions and offers valuable insights into the pursuit of patient safety. Previous studies compared the two perspectives by applying them to disasters (e.g., Roberts 1990) or near disasters (e.g., Sagan 1993); we instead apply them to five popular patient safety practices. We aim to identify the distinctive contributions that HRT and NAT make to understanding the organizational conditions affecting patient safety in hospitals and the prospects for transforming hospitals into HROs. To accomplish this, like Snook (2000), we expand NAT beyond its original system-level focus to include processes and interactions among units and individuals. Moreover, we apply NAT to understanding incidents and component failure accidents in hospitals, not just to system accidents.

COMPARING HIGH-RELIABILITY AND NORMAL ACCIDENT THEORIES

Building on Sagan (1993), we compare and contrast the two frames and their applications to hospitals in Table 1.2 As the Assumptions entries indicate, HRT argues that the features of HROs can be identified and adopted by other organizations seeking to attain high reliability (Roberts 1990). In contrast, as NAT scholars uncover enduring and inherent risks in high-hazard industries, they raise doubts about whether the prototypical HROs in some high-hazard industries deserve imitation by others. One way to view this debate is to see NAT authors as critics of HRT, because they raise concerns about features, such as redundancy, training, and an integrated safety culture, in which HRO analysts put considerable trust (e.g., Weick, Sutcliffe, and Obstfeld 1999). Another view credits HRT with drawing attention to the realm of shared cognition and culture (e.g., Weick 1987; Roberts 1993), whereas NAT adds awareness of the effects on safety of system features including coupling, interactive complexity, and politics (Sagan 1994). Coupling refers to the degree of dependence among system components (e.g., procedures, equipment, and the people who operate them). Interactive complexity is the extent to which interactions among such components are unexpected, unplanned, or not visible.

Table 1.

Comparing HRT and NAT and Their Applications to Hospital Organizations

Assumptions

Main concern
  HRT: Improve reliability in high-hazard settings (e.g., airlines, nuclear power)
  NAT: Raise awareness of the unavoidable risk of major system failures in industries using tightly coupled, interactively complex technologies (e.g., nuclear power)

Orientation
  HRT: Optimistic and melioristic; focuses on internal organizational practices and culture
  NAT: Pessimistic; focuses on industries and encourages political elites to abandon or radically restructure systems based on high-risk technologies

Applications

Objectives
  HRT: Reliability is the first priority
  NAT: Safety competes with other objectives
  Hospitals: Administrators confront competing objectives; providers are guided by divergent safety goals

Redundancy
  HRT: Technical and social redundancies enhance reliability
  NAT: Redundancy can contribute to accidents when it:
    • Lacks independence
    • Increases complexity
    • Obscures operating processes
    • Diffuses personal responsibility
  Hospitals: There are many social redundancies and some technical ones; some redundancies enhance reliability, others reduce it

Structure and processes
  HRT: Reliability enhanced by:
    • Rules and SOPs
    • Training in rule applications
  NAT: Limited impact of rule enforcement and training
  Hospitals: Professional controls are applied more frequently than rule enforcement; clinicians train through apprenticeship

  HRT: Decision making migrates toward expertise
  NAT: Decision making migrates toward the powerful
  Hospitals: Decisions sometimes migrate toward the powerful

  HRT: Flexible structure enables rapid response
  NAT: Key structural concepts include:
    • Interactive complexity
    • Tight and loose coupling
  Hospitals: Decision making tends to be decentralized

  HRT: Lacks discussion of complexity and interdependence
  NAT: Interactive complexity and tight coupling create potential for catastrophic (major system) failure
  Hospitals: Hospitals tend to be:
    • Complex
    • Differentiated
    • Loosely coupled
    • At low risk of catastrophes

Culture
  HRT: Cultural norms enhance reliability and safety
  NAT: Safety culture is necessary, but not sufficient, for safety
  Hospitals: Hospital cultures are characterized by:
    • Multiple subcultures
    • Conflicting beliefs and norms

Assumptions about risk
  HRT: Managers assume that risk exists and that they can devise strategies to cope with it
  NAT: Politics and personal interests influence the interpretation of risk
  Hospitals: Sources of risk are ambiguous; developing risky new procedures and applications enhances hospitals' and providers' reputations

Rewards
  HRT: Rewards should be consistent with desired behavior
  NAT: The reward system influences and is influenced by politics
  Hospitals: External organizations influence the internal allocation of rewards

Cognition
  HRT: Emphasizes cognition and developing a culture of mindfulness; top managers see the big picture; individuals engage in valid and reliable sensemaking
  NAT: Limited treatment of cognition; organizational conditions can distort or undermine mindfulness; barriers to top managers gathering information from front lines; a history of success undermines current vigilance
  Hospitals: Few empirical studies of cognition

HRT, high reliability theory; NAT, normal accident theory.

As the Hospitals entries in Table 1 indicate, hospital organization and practice diverge substantially from the elements of HROs (Gaba 2000). Hospital managers typically pursue multiple and conflicting goals. Clinicians' objectives and practices may diverge from management's espoused goals for safety and quality. Many technical and social features of hospitals exhibit redundancy, but not all of these contribute to safety and reliability (e.g., Lingard et al. 2004). Much of the gap between hospital realities and the HRO model reflects the fact that hospitals are professional bureaucracies (Mintzberg 1979), where norms and routines are learned through professional socialization and authority flows through professional hierarchies. In addition, whereas clinicians readily shift decision-making responsibility in response to changing conditions (e.g., emergency codes), hospitals usually do not (e.g., Meyer 1982).

Hospitals tend to be loosely coupled. Loose coupling of routine activities enables providers to notice problems and intervene before they cause harm. Similarly, changes in one unit do not necessarily affect others. Except for emergencies, hospitals tolerate time delays (e.g., in a patient being sent for imaging tests), and the sequencing of procedures is often flexible (e.g., scheduling imaging tests and medication administration).

Hospitals do not ordinarily provide fertile ground for the development of well-integrated and cohesive cultures of reliability. Hospitals and health care as a whole are very complex (Gaba, Maxwell, and DeAnda 1987; Gaba 2000) and may be growing more so (Lake et al. 2003). Hospitals often encompass a myriad of subcultures that mirror the structural complexity of the hospital system and its occupational differentiation (Edmondson 1996; Degeling, Kennedy, and Hill 2001; Sexton et al. 2001; Singer et al. 2003; Ferlie et al. 2005). Furthermore, some professional beliefs and norms clash with HRO norms (Thomas and Helmreich 2002; Leape and Berwick 2005).

APPLYING HRT AND NAT PERSPECTIVES TO PATIENT SAFETY PRACTICES

How can the HRT and NAT frames contribute to a fresh look at five popular and promising patient safety practices? We will examine each of these practices in turn, as summarized in Table 2.

Table 2.

HRT and NAT Analyses of Patient Safety Practices

Double-checking medications
  HRT: Incorporates redundancy; exemplifies a cultural norm; creates formal procedures to assure reliability
  NAT: Constrained by the limits of social redundancy; can hinder and delay problem detection; can lower vigilance

Crew resource management (CRM)
  HRT: Enables people with critical expertise and information to make decisions; facilitates flexible responses to unexpected situations; incorporates reward systems and cultural norms that support speaking up to authority
  NAT: Seeks to make risky technologies safer, not to reduce their catastrophic potential; better suited to loosely coupled technologies; relies on interpersonal communication skills, which are necessary but not sufficient to identify safety threats

Computerized physician order entry (CPOE)
  HRT: Provides a method for gathering error data for top managers; may hinder open communication among different professionals
  NAT: May reduce interactive complexity, but will increase tight coupling; reduces errors from simple component failures; adds to the risk of infrequent, high-consequence errors affecting many patients; illustrates the limitations of redundancy "added on" to the original design

Incident reporting
  HRT: Requires an end to the "culture of blame"; relies on individual capacity to engage in valid sensemaking; provides a method to integrate individual heedfulness with organizational-level assessment; enables top management to assess the big picture
  NAT: The "politics of blame" hinders reporting; incentives for reporting incidents are lacking; promotes interorganizational exchange of safety-related reports; pressures from the external environment may influence the internal reward system and enhance (or inhibit) reporting

Root cause analysis (RCA)
  HRT: Fits the HRO emphasis on learning from adverse events; supports sharing expertise from the front lines; works better in organizations with a culture of reliability; requires a reward system that does not blame or punish those involved in adverse events; provides the big picture to top management
  NAT: Constrained by the difficulties of learning from adverse events; interpretations of adverse events, their causes, and solutions can be shaped by political and personal interests; fosters overlooking problems that lack available solutions or solutions preferred by management; identifying problems can be hindered by complexity and multiple layers of redundancy; may lead participants to choose solutions based on ease of implementation

HRT, high reliability theory; NAT, normal accident theory

Double-Checking Medications

Conducting double checks, in which one provider reviews and signs off on another's task, is a form of social redundancy that is pervasive in nursing (e.g., Cohen and Kilo 1999; Griffin 2003) and pharmacy (e.g., Cohen et al. 1996) and is required in particular situations by the Joint Commission on Accreditation of Healthcare Organizations (JCAHO). Despite the widespread practice of nurses double-checking medications, it has not been widely studied in hospitals (ISMP 2004). Double-checking requires that one fallible person monitor the work of another imperfect person. Because people tend to hear what they expect to hear and see what they expect to see, the effectiveness of such checks is reduced (e.g., Reason 1990; ISMP 2003). Applying the HRT and NAT frames calls attention to the social and organizational implications of double-checking.

From an HRT perspective, when two professionals double-check a hazardous medication, they embody three key attributes of an HRO: (1) redundancy, (2) adherence to patient safety norms embedded in a culture of reliability, and (3) use of formal procedures that reinforce culturally expected behavior. Yet despite Institute for Safe Medication Practices (ISMP) norms for double-checking high-hazard medications, one survey reported that such norms were routinely followed in only 45 percent of the hospitals surveyed (Smetzer et al. 2003). Furthermore, HRO proponents are aware of the limits of relying solely on prevention as a means of averting harm (Weick and Sutcliffe 2001). Over-reliance on double-checking can actually reduce mindfulness of safety risks. For example, if a hospital is not selective in its medication double-checking requirements (ISMP 2004), providers may consider the pervasive requirement to be a "superficial routine task" and not check independently (ISMP 2003).

The NAT frame also underscores the limits of redundancy, as embodied in double-checking medications. Even if nurses double-check medications independently, as instructed by ISMP alerts, they both can make the same mistake; both providers may be stymied by equipment or other environmental design flaws, such as a confusing drug label (ISMP 2004). Furthermore, double-checking, like other backup procedures, can compensate for underlying problems, and thus, delay their discovery and correction (Reason 1997). For example, if a mistake is detected and corrected during a routine double-check procedure in the hospital pharmacy, it is not classified as an error, and thus, the underlying cause may go unnoticed by pharmacy management (Tamuz, Thomas, and Franchois 2004).

NAT researchers argue that social redundancy, such as double-checking, may inadvertently undermine safety protections because of the inherent difficulties of expecting people to act as backups. When people are aware that others are duplicating their efforts, redundancy can diffuse responsibility and lead individuals to overlook safety checks (Snook 2000; Sagan 2004b). Instead of conducting an independent double-check, pharmacy “staff learn to rely upon the checker to catch problems” (ISMP 2004). Alternatively, a pharmacist who trusts the quality of a colleague's work may fail to conduct a thorough, independent double-check because of overconfidence (Smetzer 2005). Effective duplication can also be subverted by differences in status and responsibility, such as when the nurse who double-checks defers to the nurse with the primary drug administration responsibility.
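The weight that both frames place on independence can be made concrete with a simple probability sketch. The sketch below is our illustration, not an analysis from the studies cited above, and the miss rates and the degree of correlation are assumed values chosen only to show the direction of the effect: when the second check is not truly independent, the protection added by double-checking shrinks sharply.

```python
# Illustrative sketch: all probabilities are assumed values, not empirical estimates.

p_first_miss = 0.05       # assumed chance the first nurse misses a given error
p_second_miss = 0.05      # assumed chance the second nurse would miss it, checking alone

# Fully independent double check: the error reaches the patient only if both miss.
p_slip_independent = p_first_miss * p_second_miss

# Non-independent check: if the first nurse misses (e.g., both read the same
# confusing label, or the checker defers to the primary nurse), assume the
# second check now misses 60% of the time instead of 5%.
p_second_miss_given_first_miss = 0.60
p_slip_correlated = p_first_miss * p_second_miss_given_first_miss

print(f"Error slips past both checks (independent): {p_slip_independent:.4f}")
print(f"Error slips past both checks (correlated):  {p_slip_correlated:.4f}")
print(f"Relative increase: {p_slip_correlated / p_slip_independent:.0f}x")
```

Under these assumed numbers, the correlated double check lets an error through roughly twelve times as often as a truly independent one, which is the arithmetic behind both the HRT concern with superficial routine checks and the NAT concern with social redundancy.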

Crew Resource Management (CRM)

CRM is a form of interpersonal communication training developed for and by commercial airline pilots (e.g., Wiener, Kanki, and Helmreich 1993; Hamman 2004), based on group dynamics research (Hackman 1990, 1993), and adapted as teamwork training in simulated operating room settings (Gaba, Maxwell, and DeAnda 1987; Gaba 1989; Helmreich and Schaefer 1994). CRM practices include briefings, in which the person in charge reviews the tasks facing the team and highlights potential threats, and interpersonal communication methods. CRM instructs subordinates on how to raise safety concerns and question the actions of authority figures without challenging their authority. CRM is one of the "proven methods" of teamwork training for health care providers (Kohn, Corrigan, and Donaldson 2000, p. 149). Hospitals have implemented both briefings and instruction in interpersonal communication (Leonard, Graham, and Bonacum 2004; McFerran et al. 2005); however, these efforts vary in thoroughness and depth.

Although teamwork is not considered a key element of HRT, CRM training fits well with the HRO model (e.g., Weick and Sutcliffe 2001). CRM techniques support “migrating decision making,” in which decision makers defer to the person with the relevant expertise, rather than the one with the highest-ranking authority. Furthermore, CRM can make it easier to identify anomalies and, thus, respond flexibly to an unexpected situation. For instance, a team member is expected to speak up when confronted with a potential threat to patient safety (Sutcliffe, Lewton, and Rosenthal 2004).

HRT highlights how reward systems and organizational culture influence the effectiveness of CRM implementation (Musson and Helmreich 2004). Successful CRM implementation depends on removing disincentives for speaking up. If CRM training consists of “one-shot, day-long classroom lectures” (Musson and Helmreich 2004, p. 29), it is unlikely to be sufficient to produce cultural change and overcome prevailing norms against speaking up to authority. Effective CRM would have to grow out of or be customized to fit a hospital's cultures.

Perrow (1999a), the originator of NAT, underscores the broad, societal implications of supporting CRM and other methods designed to improve safety through modifying work group culture. He raises concerns that methods for improving teamwork “ask how we can make risky systems with catastrophic potential more safe” (Perrow 1999a, p. 379), but fail to raise more fundamental questions about the implications of pursuing efficiency goals in industries with catastrophic potential. Perrow's concerns may be less relevant to hospitals than to other high-hazard organizations because loose coupling among hospital equipment, procedures, and units reduces the potential for catastrophic system accidents.

NAT's emphasis on coupling and interactive complexity draws attention to important structural conditions that may affect CRM effectiveness in hospitals. CRM techniques are likely to prove more effective when systems and procedures are loosely coupled, because team members have time to identify hazards and to intervene before negative consequences occur. In contrast, tightly coupled, time-dependent technologies (e.g., chemical plants, heart-lung machines) provide fewer opportunities for intervention (Perrow 1984, 1999b). Therefore, we hypothesize that CRM methods will vary in their effectiveness depending on the degree of coupling that characterizes specific medical, surgical, and emergency response procedures.

Furthermore, under conditions of interactive complexity, even if CRM enhanced communication between authority figures and their subordinates, team members might still fail to recognize an unsafe situation or identify emerging threats. When practitioners cannot predict all the conditions under which potentially hazardous interactions might occur, they lack access to information that could be critical to collective decision making. Practitioner assessments of a situation may also be obscured by multiple layers of redundant safety measures. For example, in one hospital, the pharmacy computer system frequently generated a flurry of redundant but irrelevant warnings, making it difficult for the pharmacists to notice critical warning messages (Tamuz and Thomas 2006). Thus, in loosely coupled systems, NAT would view efforts to improve interpersonal communication through CRM as a necessary, but not sufficient, condition for improving patient safety.

Computerized Physician Order Entry (CPOE)

Researchers and health policy analysts recommend CPOE implementation as a means of reducing medication errors that lead to adverse drug events (e.g., Aspden et al. 2004). CPOE eliminates handoffs among physicians, nurses, clerks, and pharmacists and reduces errors caused by illegible handwriting, similar-sounding drug names, and predictable drug interactions. CPOE can also accurately and efficiently collect data on the frequencies of particular errors and disseminate such drug-related information. (See Kaushal, Shojania, and Bates 2003 for a review.)

HRT and NAT do not directly address CPOE. However, both frames suggest constructive insights. The HRT perspective would highlight CPOE's impact on information flow among decision makers. CPOE can contribute to HRO development by providing clinicians and higher-level managers with accurate data on error frequencies and adverse drug events. A disadvantage of CPOE is that current software and commercial products may not improve, and can even complicate, data entry, retrieval, and exchange, as well as communication among providers (Ash, Berg, and Coiera 2004; Miller and Sim 2004; Schuerenberg 2005). Thus, implementing CPOE may enhance data access for top managers while hindering communication among clinicians with expertise and first-hand experience.

NAT draws attention to the implications of CPOE for system design. CPOE has the potential to produce fundamental design changes in the medication process; these changes would reduce interactive complexity and tighten coupling between medication ordering and dispensing. CPOE would reduce the potential for unexpected interactions in the medication process by eliminating the involvement of some personnel (e.g., clerks who copy and fax the doctors' orders) and equipment (e.g., fax machines). Coupling would tighten because an order change would more directly and rapidly affect drug dispensing, proceeding from the physician's keyboard to the pharmacy computer with fewer possibilities for people to alter or stop the order.

In practice, however, CPOE systems do not yet conform to designers' expectations. The difficulties that have emerged can be readily understood within the context of NAT. First, according to NAT, system breakdowns can result from an array of different conditions, ranging from simple, recognized component failures to a multiplicity of unanticipated and unpredictable interactions. Second, redundancy in technologies like CPOE not only can enhance safety precautions, but also may undermine them.

To illustrate these two issues, we draw examples from research on an early CPOE system (Koppel et al. 2005). The researchers found that CPOE eliminated some medication error risks but gave rise to an array of other, unanticipated risks. In NAT terms, some of these errors would be classified as simple component failures, such as the failure of the computer program to cancel a test-related drug order after the physician cancelled the order for the test. Other risks documented in this CPOE system illustrate how instances of interactive complexity can occur even in a relatively linear CPOE system (Perrow 1984). For example, residents diligently followed the dosing guidelines on the computer screen, but the minimal doses appearing on the screen reflected the pharmacy's purchasing procedures (e.g., drugs purchased as 10 mg tablets) rather than clinical standards (e.g., the minimum effective dose). Thus, a process designed for use by one component (i.e., the pharmacy) interacted in an unexpected way with another (i.e., the house staff). From a NAT viewpoint, simple component failures are less troublesome; once identified, they can be corrected. But unanticipated interactions among system components cannot be completely predicted, averted, or designed away.

This CPOE study also illustrates the difficulties of adding redundancy onto an existing organizational system, a recurrent theme in NAT research (e.g., Sagan 2004b). Some of the problems reported by Koppel and colleagues emerged from the design of the new CPOE technology and its unanticipated interactions with components (e.g., equipment, operators) in the existing medication ordering process (see also Han et al. 2005). CPOE was added to an existing system in which nurses continued to use handwritten reminders (e.g., to renew antibiotic orders) and attached them to patients' charts. However, because the physicians were entering orders in electronic patient records, they did not notice the nurses' written reminders. This illustrates how adding CPOE to an existing system resulted in unexpected interactions among system components.

One of the advantages of CPOE is that it replaces social redundancy (e.g., nurses checking doctors' orders) with technical redundancy (e.g., computerized error detection). However, for allergy monitoring, "House staff claimed post hoc [allergy] alerts unintentionally encourage house staff to rely on pharmacists for drug-allergy checks, implicitly shifting responsibility to pharmacists" (Koppel et al. 2005, p. 1200). This illustrates how technical redundancy can generate social redundancy and thereby increase the potential for error.

NAT points to a third potential problem with CPOE that has not been widely discussed: the safety trade-offs associated with making technologies more tightly coupled. To reduce routine errors, CPOE tightens the coupling of the medication ordering process. An unanticipated consequence of tighter coupling may be greater risk of infrequent, but potentially widespread and harmful errors. For example, a mistake in a decision rule programmed into the computer has the potential to harm many patients simultaneously.
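To see why tighter coupling can turn a rare programming mistake into a widespread hazard, consider the rough arithmetic below. This is our illustrative sketch, not an analysis drawn from the CPOE studies cited above; the order volumes, detection time, and interception rates are assumed values intended only to contrast tightly and loosely coupled ordering processes.

```python
# Illustrative sketch: all figures are assumed, hypothetical values.

orders_affected_per_day = 200   # assumed orders per day touched by a faulty dosing rule
days_until_detected = 3         # assumed time before the faulty rule is noticed and fixed

# Tightly coupled, automated path: the faulty rule is applied to every affected order,
# and there are few downstream opportunities to intercept it before dispensing.
p_intercept_tight = 0.05        # assumed chance each bad order is caught downstream
reach_tight = orders_affected_per_day * days_until_detected * (1 - p_intercept_tight)

# Loosely coupled path: hand-offs add delay, but each one (pharmacist review,
# nurse double check) is another chance to stop the bad order.
p_intercept_loose = 0.70        # assumed combined chance of interception per order
reach_loose = orders_affected_per_day * days_until_detected * (1 - p_intercept_loose)

print(f"Orders reaching patients under tight coupling: {reach_tight:.0f}")
print(f"Orders reaching patients under loose coupling: {reach_loose:.0f}")
```

The point of the sketch is not the particular numbers but the structure: tight coupling multiplies a single upstream error across many orders before anyone can intervene, which is the trade-off weighed in the paragraphs that follow.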

We support implementation of CPOE but emphasize that NAT sounds an important cautionary note about the trade-offs in implementing tightly coupled systems. We need to better understand the conditions under which hospitals should tighten the coupling between departments and procedures, for example, as a means of reducing multiple, error-prone hand-offs. We also need to specify conditions under which hospitals can allow loose coupling and thereby provide more time to diagnose, respond to, and reverse potentially hazardous situations. Slowing down and decoupling operations can provide time for safe recovery, but at the cost of efficiency.

In hospitals, tight coupling is likely to occur in four types of procedures, which are shown in Table 3. Emergency procedures tend to be tightly coupled because they are time-dependent. In technology-based procedures, such as anesthesiology (Gaba 1989), tasks are time-dependent and the sequence of tasks cannot be easily changed. Chemical processes are tightly coupled because they are time-dependent, invariant in their sequences, and follow a standard path. Furthermore, tight coupling of these procedures reduces the feasibility of using slack (buffers or redundancies that may mitigate negative outcomes). Finally, automation often further tightens the coupling in technology-based and chemical processes, reducing the availability of alternative paths to implementation and the slack resources necessary for recovery. The risk in automation is that a low-probability error, such as the introduction of an incorrect component in a chemical process, can rapidly propagate into a wave of high-consequence errors.

Table 3.

Tight Coupling in Hospital Procedures

Conditions Conducive to Tight Coupling

Emergency procedures. Examples: responding to a life-threatening "code"; treating a heart attack patient; dispensing urgent "stat" medications.

Technology-based procedures. Examples: anesthesia in surgery; heart-lung machine in open-heart surgery; dialysis.

Chemical processes. Examples: batch lab tests; batch blood tests; drugs whose effects cannot be mitigated or reversed after administration.

Automation. Examples: CPOE; robotic drug dispensing.
Tight coupling elements*
  Delays are detrimental: emergency procedures, technology-based procedures, chemical processes
  Invariant sequences: technology-based procedures, chemical processes, automation
  One path to goal: technology-based procedures, chemical processes, automation
  Little slack allowed: chemical processes, automation

* See Perrow (1984, pp. 93–4).

Incident Reporting

The 2000 IOM report (Kohn, Corrigan, and Donaldson 2000) identified underreporting as a patient safety issue and recommended that hospitals develop nonpunitive environments to promote incident reporting. Patient safety advocates (e.g., Kaplan and Barach 2002) called for intensifying the reporting and analysis of near-miss data, and some hospitals implemented a variety of near-miss reporting systems modeled, in part, on the aviation experience (Battles et al. 1998; Etchegaray et al. 2005; see Wald and Shojania 2001 for an overview of incident reporting systems).

Both HRT and NAT stress the importance of learning from errors and near misses. However, the proponents of the two perspectives differ in their assessment of the feasibility of gathering information about these safety-related events and learning from them.

Although HRT does not explicitly promote incident reporting systems as a safety measure, incident reporting systems are consistent with elements of HRT. Incident reporting provides a method for clinicians to relay first-hand data about potential patient safety threats to key decision makers, provided that the clinicians can engage in valid “sensemaking” (i.e., accurately interpret what they observed). In addition, top-level HRO managers could gather and analyze incident data to assess emerging patient safety problems and evaluate existing ones; they could use the incident data to maintain a “big picture” of potential threats to patient safety. HRT advocates (e.g., LaPorte and Consolini 1991) are optimistic that organizations can create reward systems that support meaningful incident reporting and promote the capacity to learn from errors.

HRT researchers also recognize that when organizations do not fully enact these HRO precepts, they can hinder the gathering and use of incident reporting data. Making sense of errors can be problematic in HROs (Weick and Sutcliffe 2001); this might be reflected in health care providers' expressions of confusion over what constitutes a medical error (e.g., Wakefield, Wakefield, and Uden-Holman 2000; Taylor et al. 2004). There are also concerns about the reliability of incident report data because of the tendency toward underreporting (e.g., Aspden et al. 2004). Furthermore, hospitals undermine the incentives for incident reporting when they "blame and shame" those who make mistakes (Roberts, Yu, and van Stralen 2004).

NAT researchers recognize that incident reporting systems can provide the feedback organizations need to learn from their experience, but they tend to be pessimistic that organizations will succeed in modifying their internal reward systems to promote blame-free incident reporting and learning (Sagan 1994, 2004a; Perrow 1999b). “The social costs of accidents make learning very important; the politics of blame, however, make learning very difficult” (Sagan 1994, p. 238). High-hazard organizations usually do not create incentives for individuals to report their errors or for departments to share incident data with one another. Despite these difficulties, airlines have developed innovative methods of reducing disincentives for incident reporting. Pilots who self-report incidents are sheltered from company disciplinary measures and full regulatory prosecution (Tamuz 2000).

Whereas HRT focuses on the “culture of blame” and NAT on “the politics of blame,” both sets of researchers concur that misguided reward systems discourage incident reporting. Surveys of health care providers suggest that fear, particularly of implicating others (e.g., Taylor et al. 2004) or of litigation (e.g., Vincent, Stanhope, and Crowley-Murphy 1999), contributes to underreporting. Similarly, nurses are less likely to disclose their errors if they perceive their unit leader is not open to discussing mistakes (Edmondson 1996).

Given the formidable barriers to gathering data within the organization, NAT directs attention beyond the organization's boundaries. The organizational environment provides alternative methods for incident reporting, as well as a source of pressure for internal change. Perrow (1999b, p. 152) recommends “constant feedback about errors and a system-wide sharing of near misses.” He focuses on gathering data and disseminating information among organizations, not within them. Such industry-level, nonregulatory, interorganizational reporting systems are exemplified by the Aviation Safety Reporting System (ASRS) (Tamuz 2001) and an ASRS-based transfusion medicine reporting system (Battles et al. 1998).

NAT researchers also suggest that agencies in the external environment can exert influence on intractable internal organizational interests. These agencies can create pressures and incentives to adopt safety practices (Perrow 1999b). For example, JCAHO has created incentives for hospitals to promote incident reporting and adopt patient safety practices (Devers, Pham, and Liu 2004). Unfortunately, external forces, such as the tort system and professional licensing boards, can also block organizational learning in high-hazard industries when external agents assume that incompetent individuals cause most errors and adverse events (Tasca 1990). Therefore, NAT highlights the roles of agencies in the external environment in shaping internal incident reporting and patient safety practices.

Root Cause Analysis (RCA)

An RCA is a formal investigation of an adverse event or a potential adverse event (i.e., one in which the patient was not injured but could have suffered harm). RCA programs rely on rational decision-making processes to provide impartial, analytical tools for adverse event analysis. The nuclear power industry developed methods for investigating the root causes of hazardous events (e.g., Perin 2004). Similar RCA techniques have been adapted for Veterans Affairs hospitals (Bagian et al. 2002) and disseminated as a model for U.S. hospitals. Specific RCA methods have been devised for U.K. health care settings (Donaldson 2000) and tailored to analyzing transfusion medicine mishaps (Kaplan et al. 1998).

The HRT perspective highlights the potential contributions of RCAs. RCAs can be seen as a forum for “migrating decision making” by providing an opportunity for people with first-hand knowledge of an event to share their expertise with upper-level managers. Developing a culture of reliability and mindfulness would be a necessary condition for holding effective RCAs and also would be consistent with expanding the RCA focus to include potential adverse events, not just patient injuries.

HRT also focuses on communication with top management and within management teams. HRT would lead us to ask what top managers know about the RCA events and about any plans to avert their recurrence. Ideally, in a HRO, information resulting from a RCA would contribute to development of management's “big picture” of the hospital's operations.

NAT highlights how applications of rational problem-solving techniques, such as RCA, are affected by decision making under conditions of ambiguity and politics. Political considerations can affect critical choices about: (1) the events to be considered in an RCA, (2) the investigation and interpretation of what went wrong, and (3) corrective actions. When decision makers choose events for RCAs, they often do so under conditions of ambiguity (Marcus and Nichols 1999). In hospitals, it is often unclear whether an adverse event could have been prevented, whether it is a rare aberration or likely to recur, or, in the case of a near miss, whether it could have resulted in harm (March, Sproull, and Tamuz 1991).

Ambiguity gives managers room for interpretation. They may choose to investigate events that advance their personal or professional interests, whether to engineer a certain success or distract attention from a failure. Alternatively, they may decide not to analyze a threatening event. Furthermore, decision makers may choose to analyze an event because they can devise a solution for it (Carroll 1998) or because it matches a solution they want to implement (Kingdon 1995).

Interpreting the causes of and solutions for an accident can be a highly political process (Tasca 1990). When alternative solutions conflict, NAT would predict that the choice will migrate to the most influential participants, not necessarily to those with the most expertise. In addition, when a patient is harmed the stakes are high, and clinicians seek to protect "their own." In one hospital (Franchois 2003), provider groups protected their own professional interests by choosing solutions in response to a patient injury and, in some cases, implementing their preferred solutions before the first RCA meeting was held. RCA participants may also join in producing a "success" that identifies a proximate, simple, and visible cause and thereby avoids in-depth treatment of the issues, like other forms of limited learning in hospitals (Tucker and Edmondson 2003). Despite its name, an RCA can allow the participants to choose simple fixes rather than searching for more complicated underlying causes. Thus, the HRT frame highlights the potential contributions of an RCA, while the NAT frame illuminates the limitations of implementing an RCA in practice.

CONCLUSION

HRT and NAT raise fundamental issues surrounding the introduction of safety practices in hospitals, particularly those adopted from other industries. Each distinctive frame focuses attention on some organizational conditions affecting safety while overlooking others. Each frame has strengths and can make a valuable contribution to improving patient safety. We sought to highlight the most productive applications of the frames, underscore their pitfalls, and call attention to their blind spots. Our approach may help policy makers, managers, and clinicians avoid putting confidence in solutions that might not produce the expected results and could actually divert attention from safety threats and needed changes.

Health care researchers and administrators might find it useful to apply the HRT and NAT frames to help assess the trade-offs associated with patient safety practices and to identify contexts in which certain patient safety interventions are more likely to be effective. In particular, administrators might find it useful to apply these frames when deciding whether to adopt safety practices from other industries. For example, NAT directs attention to the organizational conditions under which the practice originated, as well as the conditions in the hospital to which it will be adapted. Applying these frames can help administrators and practitioners learn not only from medical mishaps, but also from the hospital's choice and implementation of safety measures. By examining how organizations adopt and adapt new patient safety practices, administrators, as well as researchers, can also gain insight into organizational conditions affecting patient safety.

Despite the value of NAT and HRT, practitioners and researchers should treat both as frames and not as blueprints; they are sensitizing devices and not roadmaps (see Schon 1983). In the final analysis the theories, practices, and hypotheses that flow from HRT and NAT need to be tested empirically—both through research and through action—by formulating ideas, trying them out in practice, gathering data on the effects of these practices, and reformulating the ideas in keeping with the findings.

Acknowledgments

We are grateful to Eleanor T. Lewis, Eric J. Thomas, Ross Koppel, and the staff of AHRQ's Information Resource Center for their contributions to this paper. The first author acknowledges the funding support of AHRQ grant #1PO1HS1154401.

Disclaimers: The views in this paper are those of the authors and do not reflect the views of the institutions with which they are affiliated.

NOTES

2. We constructed the four sets of concepts by regrouping concepts presented by Roberts and her colleagues (Roberts, Yu, and van Stralen 2004). We focus mainly on the systematic HRT framework presented in the Patient Safety Handbook because it is widely disseminated in the health care community. See Schulman (2004) for a discussion of variations on HRT, and Weick and Sutcliffe (2001) for their application of HRT to business organizations. For NAT, we draw primarily on Perrow (1984, 1999a) and Sagan (1993, 1994). For comparisons of HRT and NAT and their applicability to health care, see Gaba (2000); Gaba, Maxwell, and DeAnda (1987); and Hoff, Pohl, and Bartfield (2004).

REFERENCES

1. Ash J S, Berg M, Coiera E. Some Unintended Consequences of Information Technology in Health Care: The Nature of Patient Care Information System-Related Errors. Journal of the American Medical Informatics Association. 2004;11:104–12. doi: 10.1197/jamia.M1471.
2. Aspden P, Corrigan J M, Wolcott J, Erickson S M. Patient Safety: Achieving a New Standard for Care. Washington, DC: National Academies Press; 2004.
3. Bagian J P, Gosbee J, Lee C Z, Williams L, McKnight S D, Mannos D M. The Veterans Affairs Root Cause Analysis System in Action. Joint Commission Journal on Quality Improvement. 2002;28(10):531–45. doi: 10.1016/s1070-3241(02)28057-8.
4. Battles J B, Kaplan H S, Van der Schaaf T W, Shea C E. The Attributes of Medical Event-Reporting Systems: Experience with a Prototype Medical Event-Reporting System for Transfusion Medicine. Archives of Pathology and Laboratory Medicine. 1998;122(3):231–8.
5. Bolman L G, Deal T E. Reframing Organizations: Artistry, Choice, and Leadership. 3rd Edition. New York: John Wiley; 2003.
6. Carroll J S. Organizational Learning Activities in High-Hazard Industries: The Logics Underlying Self-Analysis. Journal of Management Studies. 1998;35(6):699–717.
7. Cohen M R, Anderson R W, Attilio R M, Green L, Muller R J, Pruemer J M. Preventing Medication Errors in Cancer Chemotherapy. American Journal of Health-System Pharmacy. 1996;53(7):737–46. doi: 10.1093/ajhp/53.7.737.
8. Cohen M R, Kilo C M. High-Alert Medications: Safeguarding Against Errors. In: Cohen M R, editor. Medication Errors: Causes, Prevention, and Risk Management. Washington, DC: American Pharmaceutical Association; 1999. pp. 5.1–5.4.
9. Degeling P, Kennedy J, Hill M. Mediating the Cultural Boundaries between Medicine, Nursing, and Management—The Central Challenge in Hospital Reform. Health Services Management Research. 2001;14(1):36–48. doi: 10.1177/095148480101400105.
10. Devers K J, Pham H H, Liu G. What Is Driving Hospitals' Patient-Safety Efforts? Health Affairs. 2004;23(2):103–15. doi: 10.1377/hlthaff.23.2.103.
11. Donaldson L. An Organisation with a Memory: Report of an Expert Group on Learning from Adverse Events in the NHS. London: The Stationery Office; 2000.
12. Edmondson A C. Learning from Mistakes Is Easier Said Than Done: Group and Organizational Influences on the Detection and Correction of Human Error. Journal of Applied Behavioral Science. 1996;32(1):5–28.
13. Etchegaray J M, Thomas E J, Geraci J M, Simmons D, Martin S K. Differentiating Close Calls from Errors: A Multidisciplinary Perspective. Journal of Patient Safety. 2005;1(3):133–7.
14. Ferlie E, Fitzgerald L, Wood M, Hawkins C. The Nonspread of Innovations: The Mediating Role of Professionals. Academy of Management Journal. 2005;48(1):117–34.
15. Franchois K E. Can Healthcare Organizations Learn from Medication-Related Safety Events? Unpublished Master of Public Health thesis. Houston, TX: The University of Texas Health Science Center at Houston School of Public Health; 2003.
16. Gaba D M. Human Error in Anesthetic Mishaps. International Anesthesiology Clinics. 1989;27(3):137–47. doi: 10.1097/00004311-198902730-00002.
17. Gaba D M. Structural and Organizational Issues in Patient Safety: A Comparison of Health Care to Other High-Hazard Industries. California Management Review. 2000;43(1):83–102.
18. Gaba D M, Maxwell M, DeAnda A. Anesthetic Mishaps: Breaking the Chain of Accident Evolution. Anesthesiology. 1987;66(5):670–6.
19. Griffin E. Safety Considerations and Safe Handling of Oral Chemotherapy Agents. Clinical Journal of Oncology Nursing. 2003;7(suppl 6):25–9. doi: 10.1188/03.CJON.S6.25-29.
20. Hackman J R, editor. Groups That Work (and Those That Don't): Creating Conditions for Effective Teamwork. San Francisco: Jossey-Bass; 1990.
21. Hackman J R. Teams, Leaders, and Organizations: New Directions for Crew-Oriented Flight Training. In: Wiener E L, Kanki B G, Helmreich R L, editors. Cockpit Resource Management. San Diego: Academic Press; 1993. pp. 47–69.
22. Hamman W R. The Complexity of Team Training: What We Have Learned from Aviation and Its Applications to Medicine. Quality and Safety in Health Care. 2004;13(suppl 1):72–9. doi: 10.1136/qshc.2004.009910.
23. Han Y Y, Carcillo J A, Venkataraman S T, Clark R S B, Watson R S, Nguyen T C, Bayir H, Orr R A. Unexpected Increased Mortality after Implementation of a Commercially Sold Computerized Physician Order Entry System. Pediatrics. 2005;116(6):1506–12. doi: 10.1542/peds.2005-1287.
24. Harrison M I, Shirom A. Organizational Diagnosis and Assessment: Bridging Theory and Practice. Thousand Oaks, CA: Sage; 1999.
25. Schuerenberg B K. CPOE Progress: No Guts, No Glory. Health Data Management. Available at http://www.healthdatamanagement.com/html/current/CurrentIssueStory.cfm?PostID=19652 [accessed July 6, 2005].
26. Helmreich R L, Schaefer H G. Team Performance in the Operating Room. In: Bogner M S, editor. Human Error in Medicine. Hillsdale, NJ: Lawrence Erlbaum Associates; 1994. pp. 225–53.
27. Hoff T J, Pohl H, Bartfield J. Creating a Learning Environment to Produce Competent Residents: The Roles of Culture and Context. Academic Medicine. 2004;79(6):532–40. doi: 10.1097/00001888-200406000-00007.
28. Institute for Safe Medication Practices. The Virtues of Independent Double Checks—They Really Are Worth Your Time! ISMP Medication Safety Alert! March 6, 2003. Available at http://www.ismp.org/MSAarticles/TimePrint.htm.
29. Institute for Safe Medication Practices. ISMP Medication Safety Alert! Nurse Advise-ERR. 2004;2(12). Available at http://www.ismp.org [accessed December 2004].
30. Kaplan H, Barach P. Incident Reporting: Science or Protoscience? Ten Years Later. Quality and Safety in Health Care. 2002;11:144–5. doi: 10.1136/qhc.11.2.144.
31. Kaplan H S, Battles J B, Van der Schaaf T W, Shea C E, Mercer S Q. Identification and Classification of the Causes of Events in Transfusion Medicine. Transfusion. 1998;38(11–12):1071–81. doi: 10.1046/j.1537-2995.1998.38111299056319.x.
32. Kaushal R, Shojania K G, Bates D W. Effects of Computerized Physician Order Entry and Clinical Decision Support Systems on Medication Safety: A Systematic Review. Archives of Internal Medicine. 2003;163(12):1409–16. doi: 10.1001/archinte.163.12.1409.
33. Kingdon J W. Agendas, Alternatives, and Public Policies. 2nd Edition. New York: HarperCollins College Publishers; 1995.
34. Kohn L T, Corrigan J M, Donaldson M S, editors. To Err Is Human: Building a Safer Health System. Washington, DC: National Academy Press; 2000.
35. Koppel R, Metlay J P, Cohen A, Abaluck B, Localio A R, Kimmel S E, Strom B L. Role of Computerized Physician Order Entry Systems in Facilitating Medication Errors. Journal of the American Medical Association. 2005;293(10):1197–203. doi: 10.1001/jama.293.10.1197.
36. Lake T, Devers K, Brewster L, Casalino L. Something Old, Something New: Recent Developments in Hospital–Physician Relationships. Health Services Research. 2003;38(1, part 2):471–88. doi: 10.1111/1475-6773.00125.
37. LaPorte T R, Consolini P M. Working in Practice but Not in Theory: Theoretical Challenges of High Reliability Organizations. Journal of Public Administration Research and Theory. 1991;1(1):19–48.
38. Leape L L, Berwick D M. Five Years after 'To Err Is Human': What Have We Learned? Journal of the American Medical Association. 2005;293(19):2384–90. doi: 10.1001/jama.293.19.2384.
39. Leonard M, Graham S, Bonacum D. The Human Factor: The Critical Importance of Effective Teamwork and Communication in Providing Safe Care. Quality and Safety in Health Care. 2004;13(suppl 1):85–90. doi: 10.1136/qshc.2004.010033.
40. Lingard L, Espin S, Whyte S, Regehr G, Baker G R, Reznick R, Bohnen J, Orser B, Doran D, Grober E. Communication Failures in the Operating Room: An Observational Classification of Recurrent Types and Effects. Quality and Safety in Health Care. 2004;13:330–4. doi: 10.1136/qshc.2003.008425.
41. March J G, Sproull L S, Tamuz M. Learning from Samples of One or Fewer. Organization Science. 1991;2(1):1–13.
42. Marcus A A, Nichols M L. On the Edge: Heeding the Warnings of Unusual Events. Organization Science. 1999;10(4):482–99.
43. McFerran S, Nunes J, Pucci D, Zuniga A. Perinatal Patient Safety Project: A Multicenter Approach to Improve Performance Reliability at Kaiser Permanente. Journal of Perinatal and Neonatal Nursing. 2005;19(1):37–45. doi: 10.1097/00005237-200501000-00010.
44. Meyer A D. Adapting to Environmental Jolts. Administrative Science Quarterly. 1982;27(4):515–37.
45. Miller R H, Sim I. Physicians' Use of Electronic Medical Records: Barriers and Solutions. Health Affairs. 2004;23(2):116–26. doi: 10.1377/hlthaff.23.2.116.
46. Mintzberg H. The Structuring of Organizations: A Synthesis of the Research. Englewood Cliffs, NJ: Prentice Hall; 1979.
47. Morgan G. Images of Organization. 2nd Edition. Thousand Oaks, CA: Sage; 1996.
48. Musson D M, Helmreich R L. Team Training and Resource Management in Health Care: Current Issues and Future Directions. Harvard Health Policy Review. 2004;5(1):25–35.
49. Perin C. Shouldering Risks: The Culture of Control in the Nuclear Power Industry. Princeton, NJ: Princeton University Press; 2004.
50. Perrow C. Normal Accidents: Living with High-Risk Technologies. New York: Basic Books; 1984. 2nd Edition. Princeton, NJ: Princeton University Press; 1999a.
51. Perrow C. The Limits of Safety: The Enhancement of a Theory of Accidents. Journal of Contingencies and Crisis Management. 1994;2(4):212–20.
52. Perrow C. Organizing to Reduce the Vulnerabilities of Complexity. Journal of Contingencies and Crisis Management. 1999b;7(3):150–5.
53. Reason J. Human Error. Cambridge, U.K.: Cambridge University Press; 1990.
54. Reason J. Managing the Risks of Organizational Accidents. Aldershot, U.K.: Ashgate Publishing; 1997.
55. Roberts K H. Some Characteristics of One Type of High Reliability Organization. Organization Science. 1990;1(2):160–76.
56. Roberts K H. Cultural Characteristics of Reliability Enhancing Organizations. Journal of Managerial Issues. 1993;5(2):165–81.
57. Roberts K H, Madsen P, Desai V, Van Stralen D. A Case of the Birth and Death of a High Reliability Healthcare Organisation. Quality and Safety in Health Care. 2005;14:216–20. doi: 10.1136/qshc.2003.009589.
58. Roberts K H, Yu K, van Stralen D. Patient Safety as an Organizational Systems Issue: Lessons from a Variety of Industries. In: Youngberg B J, Hatlie M, editors. Patient Safety Handbook. Sudbury, MA: Jones and Bartlett Publishers; 2004. pp. 169–86.
59. Sagan S D. The Limits of Safety: Organizations, Accidents and Nuclear Weapons. Princeton, NJ: Princeton University Press; 1993.
60. Sagan S D. Toward a Political Theory of Organizational Reliability. Journal of Contingencies and Crisis Management. 1994;2(4):228–40.
61. Sagan S D. Learning from Normal Accidents. Organization and Environment. 2004a;17(1):15–9.
62. Sagan S D. The Problem of Redundancy Problem: Why More Nuclear Security Forces May Produce Less Nuclear Security. Risk Analysis. 2004b;24(4):935–46. doi: 10.1111/j.0272-4332.2004.00495.x.
63. Schon D A. The Reflective Practitioner. New York: Basic Books; 1983.
64. Schulman P R. General Attributes of Safe Organisations. Quality and Safety in Health Care. 2004;13(suppl 2):39–44. doi: 10.1136/qshc.2003.009613.
65. Sexton J B, Thomas E J, Helmreich R L, Neilands T B, Rowan K, Vella K, Boyden J, Roberts P R. Frontline Assessments of Healthcare Culture: Safety Attitudes Questionnaire Norms and Psychometric Properties. Technical Report 04-01. The University of Texas at Houston Center of Excellence for Patient Safety Research and Practice; 2001. Available at http://www.utpatientsafety.org.
66. Singer S J, Gaba D M, Geppert J J, Sinaiko A D, Howard S K, Park K C. The Culture of Safety: Results of an Organization-wide Survey in 15 California Hospitals. Quality and Safety in Health Care. 2003;12(2):112–8. doi: 10.1136/qhc.12.2.112.
67. Smetzer J L. Reducing At-Risk Behaviors. Joint Commission Journal on Quality and Patient Safety. 2005;31(5):294–9. doi: 10.1016/s1553-7250(05)31037-3.
68. Smetzer J L, Vaida A J, Cohen M R, Tranum D, Pittman M A, Armstrong C W. Findings from the ISMP Medication Safety Self-Assessment for Hospitals. Joint Commission Journal on Quality and Safety. 2003;29(11):586–97. doi: 10.1016/s1549-3741(03)29069-9.
69. Snook S A. Friendly Fire: The Accidental Shootdown of U.S. Black Hawks over Northern Iraq. Princeton, NJ: Princeton University Press; 2000.
70. Sutcliffe K M, Lewton E, Rosenthal M M. Communication Failures: An Insidious Contributor to Medical Mishaps. Academic Medicine. 2004;79(2):186–94. doi: 10.1097/00001888-200402000-00019.
71. Tamuz M. Defining Away Dangers: A Study in the Influences of Managerial Cognition on Information Systems. In: Lant T K, Shapira Z, editors. Organizational Cognition: Computation and Interpretation. Mahwah, NJ: Lawrence Erlbaum Associates; 2000. pp. 157–83.
72. Tamuz M. Learning Disabilities for Regulators: The Perils of Organizational Learning in the Air Transportation Industry. Administration & Society. 2001;33(3):276–302.
73. Tamuz M, Thomas E J, Franchois K. Defining and Classifying Medical Error: Lessons for Reporting Systems. Quality and Safety in Health Care. 2004;13:13–20. doi: 10.1136/qshc.2002.003376.
74. Tamuz M, Thomas E J. Monitoring Mishaps: Aviation and Patient Safety Reporting. Presentation at the Twelfth Annual Organization Science Winter Conference, Steamboat Springs, CO; 2006.
75. Tasca L. The Social Construction of Human Error. Unpublished dissertation. Stony Brook, NY: Department of Sociology, State University of New York; 1990.
76. Taylor J A, Brownstein D, Christakis D A, Blackburn S, Strandjord T P, Klein E J, Shafii J. Use of Incident Reports by Physicians and Nurses to Document Medical Errors in Pediatric Patients. Pediatrics. 2004;114(3):729–35. doi: 10.1542/peds.2003-1124-L.
77. Thomas E J, Helmreich R L. Will Airline Safety Models Work in Medicine? In: Sutcliffe K M, Rosenthal M M, editors. Medical Error: What Do We Know? What Do We Do? San Francisco: Jossey-Bass; 2002. pp. 217–34.
78. Tucker A L, Edmondson A C. Why Hospitals Don't Learn from Failures: Organizational and Psychological Dynamics that Inhibit System Change. California Management Review. 2003;45(2):55–72.
79. Vincent C, Stanhope N, Crowley-Murphy M. Reasons for Not Reporting Adverse Incidents: An Empirical Study. Journal of Evaluation in Clinical Practice. 1999;5(1):13–21. doi: 10.1046/j.1365-2753.1999.00147.x.
80. Wakefield B J, Wakefield D S, Uden-Holman T. Improving Medication Administration Error Reporting Systems: Why Do Errors Occur? Ambulatory Outreach. 2000:16–20.
81. Wald H, Shojania K G. Incident Reporting and Root Cause Analysis. In: Markowitz A J, Wachter R M, editors. Making Health Care Safer: A Critical Analysis of Patient Safety Practices. Rockville, MD: Agency for Healthcare Research and Quality; 2001. pp. 41–56.
82. Weick K E. Organizational Culture as a Source of High Reliability. California Management Review. 1987;29(2):112–27.
83. Weick K E. Normal Accident Theory as Frame, Link, and Provocation. Organization and Environment. 2004;17(1):27–31.
84. Weick K E, Sutcliffe K M. Managing the Unexpected: Assuring High Performance in an Age of Complexity. San Francisco: Jossey-Bass; 2001.
85. Weick K E, Sutcliffe K M, Obstfeld D. Organizing for High Reliability: Processes of Collective Mindfulness. Research in Organizational Behavior. 1999;21:81–123.
86. Wiener E L, Kanki B G, Helmreich R L, editors. Cockpit Resource Management. San Diego: Academic Press; 1993.
