The Milbank Quarterly. 2011 Mar;89(1):4–38. doi: 10.1111/j.1468-0009.2011.00623.x

Counterheroism, Common Knowledge, and Ergonomics: Concepts from Aviation That Could Improve Patient Safety

Geraint H Lewis, Rhema Vaithianathan, Peter M Hockey, Guy Hirst, James P Bagian
PMCID: PMC3160593  PMID: 21418311

Abstract

Context: Many safety initiatives have been transferred successfully from commercial aviation to health care. This article develops a typology of aviation safety initiatives, applies this to health care, and proposes safety measures that might be adopted more widely. It then presents an economic framework for determining the likely costs and benefits of different patient safety initiatives.

Methods: This article describes fifteen examples of error countermeasures that are used in public transport aviation, many of which are not routinely used in health care at present. Examples are the sterile cockpit rule, flight envelope protection, the first-names-only rule, and incentivized no-fault reporting. It develops a conceptual schema that is then used to argue why analogous initiatives might be usefully applied to health care and why physicians may resist them. Each example is measured against a set of economic criteria adopted from the taxation literature.

Findings: The initiatives considered in the article fall into three themes: safety concepts that seek to downplay the role of heroic individuals and instead emphasize the importance of teams and whole organizations; concepts that seek to increase and apply group knowledge of safety information and values; and concepts that promote safety by design. The salient costs to be considered by organizations wishing to adopt these suggestions are the compliance costs to clinicians, the administration costs to the organization, and the costs of behavioral distortions.

Conclusions: This article concludes that there is a range of safety initiatives used in commercial aviation that could have a positive impact on patient safety, and that adopting such initiatives may alter the safety culture of health care teams. The desirability of implementing each initiative, however, depends on the projected costs and benefits, which must be assessed for each situation.

Keywords: Medical error; safety management; health knowledge, attitudes, practice; human engineering; costs and cost analysis


The comparative safety records of commercial aviation and health care have been widely publicized, and proposals to borrow safety concepts from aviation abound. In modern aviation, only one passenger's life is lost per 10 million flights, compared with one iatrogenic death for every one hundred to three hundred hospital admissions (Hall 2006; Levinson 2010). Moreover, when safety concepts have been systematically adopted from aviation, the impact on patient safety has sometimes been substantial. For example, the Surgical Safety Checklist of the World Health Organization (WHO) is based on a concept introduced in aviation seventy years earlier (Godlee 2009). An evaluation found that it was relatively quick and cheap to implement and that in some settings, it reduced deaths and complications for surgical patients by more than a third (Haynes et al. 2009). Likewise, catheter-related bloodstream infections were eliminated from a surgical intensive care unit with a package of interventions used in aviation, including a checklist, a common layout for equipment, and the empowerment of junior staff to call for a procedure to be abandoned if guidelines were violated (Berenholtz et al. 2004).

Given these apparent successes, patient safety organizations such as the U.S. Department of Veterans Affairs (VA) National Center for Patient Safety and Britain's National Patient Safety Agency (NPSA) have been championing a range of initiatives drawn from aviation, including checklists, crew resource management (CRM), team self-review, and close-call reporting (Neily, Dunn, and Mills 2004; NPSA 2004). Might there, however, be other error countermeasures used in aviation that could usefully be borrowed?

As an industry, health care is clearly unique in several respects, so researchers and policymakers should proceed with caution. Nonetheless, Marshall noted that successful quality and safety initiatives in many high-risk industries have the same set of underlying principles (Marshall 2009). He argued that the health care sector should continue borrowing from other industries because clinicians are able both to identify which concepts will transfer well and to make any adaptations that may be necessary.

In this article, we describe fifteen safety practices used routinely in aviation. Each initiative might have applications in health care, but only some of these examples are currently used widely in hospitals. From this list we identify three themes that we use to create a conceptual framework for classifying error countermeasures. As with all policy prescriptions, not all proposals are applicable to all situations, so we turn to the economics literature for a cost-benefit analysis and to taxation policy for a framework for gauging which safety initiatives are most likely to transfer well from aviation to health care. Finally, we return to our classification to consider the different reasons why initiatives borrowed from aviation may be met with indifference or antipathy among some doctors.

Safety Strategies from Aviation

Passengers flying on a modern commercial aircraft are protected by a plethora of advanced safety measures. Some of these practices and devices would appear to have little or no direct relevance to health care; for example, those relating to hijacks and other forms of sabotage. Using our combined aviation and clinical experiences, however, we have drawn up a list of fifteen airline safety measures that we believe may be applicable to health care (see Table 1). Some of these initiatives, such as safety checklists and crew resource management, are already used by many health care providers around the world. But others, such as incentivized no-fault reporting and the obligatory use of first names among clinicians, are not currently in widespread use.

Table 1.

Safety Initiatives Used in Commercial Aviation and Their Applications or Potential Applications to Health Care

1. Checklists

Use in commercial aviation: Pilots use three types of checklists to ensure safety:

a. Read and do checklists, which typically are used in unusual circumstances in which a rigid sequence of actions is necessary, such as confirming that the appropriate memory items have been completed correctly after an engine fire. Pilots simply tick off a series of items on an electronic checklist, or they verbally respond to the “read out” checklist, either at the time of the incident or immediately afterward.a

b. Challenge and response checklists, which are used during routine events, such as the after-takeoff checklist. These involve both pilots: one pilot asks the questions, and the other provides specific answers to items that have already been completed.

c. Aide memoire checklists, such as the list used for the predeparture briefing, which serve as an agenda to ensure that all relevant topics are covered.

Application to health care: The anesthetic equipment checklist (Association of Anaesthetists of Great Britain and Ireland 2004) is an example of a “read and do” checklist; the WHO surgical safety checklist (WHO 2008b) is essentially a “challenge and response” checklist; and the preoperative team briefing and postoperative debriefing that form part of the VA's Medical Team Training approach are examples of when an “aide memoire” is the preferred type of checklist (Dunn et al. 2007). Although it was designed to be almost universally applicable, the WHO has encouraged adaptation of the surgical safety checklist. For example, there is now an Australia and New Zealand edition, as well as a specific WHO surgical safety checklist for cataract surgery (Gough 2010; NPSA 2010). However, the WHO cautions against uncritical adaptations to the checklist, advising that any modifications be focused, brief, actionable, verbal, collaborative, tested, and integrated with existing safety processes (WHO 2008c). (An illustrative sketch of the three checklist types appears after this table's notes.)
2. Crew resource management

Use in commercial aviation: A seminal study published by NASA in 1979 found that most airline disasters were caused by failures of communication, leadership, and teamwork rather than by a lack of technical skill or by mechanical failure (Cooper, White, and Lauber 1979). This led to the development of a training program for pilots called Cockpit Resource Management, later renamed Crew Resource Management (CRM). CRM emphasizes the importance of using all available resources—information, equipment, and people—to ensure the safe operation of an aircraft. Early generations of CRM were criticized by pilots for being “too psychological” and were perceived as simply encouraging the crew to be “team players” (Helmreich and Merritt 1998). More recent iterations of CRM explicitly recognize that errors and threats to safety are inevitable and accordingly teach specific skills for dealing with these inescapable problems. The latest version, called Threat and Error Management (TEM), is predicated on the inevitability of errors and teaches pilots how to avoid, trap, and mitigate hazards before they become serious or catastrophic incidents. In many countries, pilots must now demonstrate their competency in CRM as part of their reaccreditation each year (EASA 2009).

Application to health care: CRM in health care was first used in anesthetics, but it has since been implemented across a range of health care disciplines. There is evidence that CRM has improved communication in, and attitudes toward, health care and that operating room efficiency may be improved by early identification of issues with equipment or staffing (Croskerry et al. 2008). Only more recently have changes in work behavior (Sax et al. 2009) and reductions in surgical mortality (Neily et al. 2010) been demonstrated. As with aviation, health care will never eradicate unexpected events, so TEM may be particularly well suited to it.
3. Joint safety briefings

Use in commercial aviation: Before each departure, the crew of a commercial aircraft gathers together for a joint safety briefing. Detailed arrangements differ from airline to airline, but these sessions often include an educational update on a safety topic, and the crew may be required to read and sign any safety notices that are in force. These may pertain to potential hazards on their route or to potential issues with their particular aircraft type.

Application to health care: Analogous joint briefings may be useful in health care before the start of clinics, ward rounds, and handoffs, as well as at the start of the operative day. Staff may be required to gather for a focused educational update on a relevant patient safety topic and be asked to sign any safety notices directly relevant to their practice. In the United Kingdom, these notices might include any applicable “Rapid Response Reports,” “Patient Safety Alerts,” or “Safer Practice Notices” issued by the National Patient Safety Agency (NPSA 2009a). Alternatives to joint briefings—for example, issuing alerts individually or posting them on notice boards—are inferior because members of the team cannot be sure who has read which alerts and therefore may be less willing to challenge their colleagues over any perceived safety breaches.
4. Minimum safety requirements

Use in commercial aviation: The pilots of an aircraft are entitled to specific safety installations at all airports to which they fly, including the type of fire cover and the availability of different navigational aids. The exact specifications vary according to the aircraft type, and in exceptional circumstances, an aircraft can land at an airport with a lower safety rating. However, a minimum standard always applies.

Application to health care: The availability of many clinical safety features varies substantially not only from hospital to hospital, but also according to the time of day and the day of the week. Indeed, there is evidence that certain patients admitted to hospitals on the weekend are at a higher adjusted risk of dying than if they were admitted on a weekday (Bell and Redelmeier 2001). Work-hour restrictions and minimum nurse staffing ratios are examples of safety requirements that have been introduced by various jurisdictions. However, there may be advantages to prescribing additional detailed sets of minimum safety requirements across all hospitals. For example, at VA facilities, concern about inconsistencies in the management of patients who are difficult to intubate led to a rule that a minimum number of competent staff must be available twenty-four hours a day, seven days a week. Patient safety authorities could play a useful role in drawing up additional evidence-based minimum requirements, for example, by specifying a maximum bed occupancy rate (Borg 2003). Such codification would make it clear when a hospital was failing to provide the necessary staffing, equipment, or support. This, in turn, might encourage clinicians to report deficiencies rather than persevering regardless of the problem. The analysis and publication of such complaints could play an important role in tackling systemic shortcomings. The authorities should therefore encourage complaints of this type by advertising what provision is to be expected and whom to call or email if any equipment or staff are missing.
5. Sterile cockpit rule

Use in commercial aviation: During the safety-critical phases of a flight, such as when taxiing on the ground or flying below an altitude of 10,000 feet, the pilots and cabin crew must refrain from all nonessential activities such as reading newspapers or chatting idly. This requirement is known as the sterile cockpit rule, and its contravention has been implicated in many aviation disasters (FAA 1981; Levin 2009). The entire crew is informed about when the rule is in force through warnings or alert systems, whose details differ by airline. Crewmembers are taught how to call, without awkwardness, for the sterile cockpit rule to be implemented at additional times when particular concentration becomes necessary. Likewise, they receive training to ensure that they neither inadvertently violate the rule nor wrongly fail to voice potentially important information when the rule is in force.

Application to health care: Clinicians currently are exposed to considerable distractions at work. An analogous rule prohibiting nonessential activities during the critical phases of medical practice might therefore be expected to improve safety, for example, during operations and handoffs (Healey, Primus, and Koutantji 2007). The most hazardous stages for different types of clinical practice could be determined in advance. For an operation, these might include the times of prebriefing, incision, closure, the suturing of any crucial anastomoses, and postoperative debriefing; and on ward rounds, the rule might apply when the management plan for each patient was being formulated. Of course, many clinical teams already implement this type of rule, and there is evidence that it can reduce error rates (Pape 2003). For example, some hospitals require nurses to wear brightly colored vests when they are administering medications, as a means of signaling to staff and visitors that they should not be interrupted. Such an unambiguous signal is needed both to indicate when the rule is in operation and to ensure that staff members know that their colleagues are aware it is in force.b The strength of the sterile cockpit rule is that it is a public, explicit, and formalized step, which goes beyond the traditional approach of simply asking staff to be more careful.
6. Alternation of roles

Use in commercial aviation: In modern airlines, captains and first officers typically alternate between flying and nonflying duties for each journey, although there are certain conditions under which the captain will always fly the aircraft. Moreover, the captain holds the legal authority to decide who will fly each segment, bearing in mind the experience of the first officer. One of the reasons for this alternation is that it promotes a nondeferential working environment, for example, when the flying pilot asks the nonflying pilot to adjust the sun visor for her or him. This helps the aviation industry flatten the hierarchy on board commercial aircraft, thereby suppressing the individual valor of pilots and instead promoting team cohesion.

Application to health care: In medicine, the current practice often is for the most senior doctor in a team to lead the ward round or to perform the operation, and for more junior medical staff to provide assistance, for example, by taking notes, looking up results, or retracting during an operation. Although many hospitals already alternate the leading role between the two most senior doctors, especially in surgery, we believe this practice could be extended to other specialties and systematized further. Such alternation would need to be recorded and audited carefully. It would also require some flexibility, since it may not be possible to alternate for some complex or unusual cases or when new trainees first begin their duties. However, alternation can have substantial educational benefits and might help counter the reduced training opportunities that have resulted from shorter working hours.
7. Standard layout

Use in commercial aviation: The instruments in the cockpit of an aircraft are set out in a standardized way, based on years of safety and ergonomic research by the aircraft manufacturer. In the Kegworth air disaster, a Boeing 737-400 crashed into a major highway. This collision was blamed partly on incorrect decisions made by the pilots based on their experience of flying previous Boeing 737 models in which certain instruments and other aircraft systems were laid out and configured differently (Air Accidents Investigation Branch 1990). Pilots who are accredited for one type of aircraft (e.g., Airbus A320) may fly similar aircraft within the same “family” (e.g., Airbus A319/A321), but if they wish to switch to a completely different plane (e.g., Boeing 767), they must undergo months of retraining, known as a “type conversion” (Lande 1997; Lauda Air 1999).

Application to health care: We believe that there may be safety benefits from researching and standardizing a wider range of health care devices and layouts, particularly since many junior doctors frequently change institutions as part of their training. Authorities in the United States recently have begun promoting ergonomic standards across the health care industry (Pronovost et al. 2009). Likewise, in the United Kingdom, the National Health Service has already taken steps to standardize the design of infusion pumps; it has mandated a common telephone number for calling the cardiac arrest team (2222); and in Wales, a standard drug chart is used in all NHS hospitals (Pownall 2009). We would suggest exploring how ergonomic research and standardization might be extended to cover the layouts of resuscitation trolleys; sterile packs; recovery, resuscitation, treatment, and operating rooms; monitors; infusion devices; and so on. There may also be important safety benefits from standardizing clinical IT systems across different hospitals. Clinicians working in many health care settings currently use a plethora of idiosyncratic clinical software packages, which may be particularly hazardous when staff move to other organizations. These dangers may increase as more organizations become paperless and as more staff work across hospital and ambulatory care settings. The risks involved might be mitigated by adopting a single, standard IT interface in the same way that the VistA® computerized patient record system is used throughout all VA facilities, spanning primary, secondary, and social care.c
8. Black box

Use in commercial aviation: The flight recorder, or black box, on an aircraft typically incorporates three components: a “quick access recorder” that records hundreds of flight parameters, a “voice recorder,” and a “crash recorder,” which is designed to be recovered only after a catastrophic event. In the United States, the Federal Aviation Administration stipulates that at least eighty-eight different parameters be recorded by the flight recorder, although the latest devices may record more than three hundred parameters, and some can stream data continuously in real time to an analysis center on the ground (Sims 2010). Computers read the quick access recorder after every trip, and a warning is triggered if any parameters are exceeded. In the United Kingdom, if any of these parameters are either inexplicable or dangerous, a representative from the British Airline Pilots’ Association will contact the crew for an explanation. Pilots therefore know that all their actions are being monitored and that everything they do and say is being recorded—a fact that may also encourage civility between pilots, and between pilots and air traffic controllers. When black boxes were first introduced, pilots fiercely resisted them, worried that they would be used to spy on the crew, but they are now compulsory in civilian passenger aircraft worldwide (Williamson 2010).

Application to health care: We believe there is a need for more medical monitors (such as those that record blood pressure, blood oxygen saturation, and pulse) to incorporate “quick access recorders,” and for these to be routinely analyzed.d In theory, hospital telephone conversations also could be recorded. This already happens with some nurse-led telephonic services (such as NHS Direct in England), and many primary care practices already record all conversations made on their telephone systems. As in the aviation industry, there would need to be a clear audit trail and robust safeguards determining who could access this information and under what circumstances; for example, access to recordings might be restricted to nonpunitive safety investigations. An additional advantage of recording hospital telephone conversations is that it might discourage the aggressive behavior that health care staff sometimes face when making a referral or requesting an investigation over the telephone, especially late at night. Reducing the likelihood of such hostility could encourage staff to seek investigations and specialist input more readily, with potential benefits for patients’ safety.
9. Corporate responsibility for training

Use in commercial aviation: Airlines arrange, pay for, and ensure the quality of all the safety training they require of their pilots. Shift rosters are designed so that pilots can attend required training sessions, and airlines keep a close record of when pilots attend these courses, together with their renewal dates. If pilots miss or fail any training or proficiency checks, they will face restrictions until the shortfall has been corrected. This may include losing the ability to exercise the privileges of their pilot's license (Civil Aviation Authority 2010).

Application to health care: In contrast, physicians often have to arrange many elements of their own core training, such as Advanced Life Support courses. They are also often obliged to pay for these courses themselves because of limited or nonexistent study budgets and to rearrange or swap shifts or clinical sessions with their colleagues in order to attend. Although it might be administratively burdensome, we believe that there is a strong case for requiring health care organizations to maintain detailed databases of all the ongoing training competencies required of their staff; to provide (or arrange and pay for) all the necessary training; and to arrange shifts or rosters so that staff can attend. Responsibility for these tasks at doctors’ practices might be assumed by an entity such as an independent practice association (U.S.) or a health board or primary care consortium (U.K.). Making the team and the system accountable for safety training reinforces the message that safety is not solely an individual responsibility.
10. First-names-only rule

Use in commercial aviation: Until the 1970s, junior pilots addressed their seniors as “Captain,” “Sir,” or “Ma’am,” but in today's cockpits, only first names are routinely used. It has been suggested that this change came about partly because of the perhaps counterintuitive evidence that aircraft are in fact safer when flown by the first officer rather than by the more senior captain (Gladwell 2008; NTSB 2006). The theory is that when the captain is flying, the more junior first officer may be reluctant to question a dubious decision, in part because speaking up could be perceived as challenging the captain's authority. Using first names can help flatten the social hierarchy in the cockpit and thereby foster a culture in which colleagues feel more comfortable questioning one another, regardless of rank. During announcements, pilots often introduce themselves to passengers using their titles and surnames, and they wear uniforms that visibly signify their status and rank. But in the cockpit, formal titles are used only in extreme circumstances to draw attention to the gravity of a situation, such as when a co-pilot follows the PACE protocol to challenge the performance or behavior of a captain (Besco 1999).

Application to health care: It could be argued that the current practice in most hospitals, in which junior nurses and doctors typically refer to the senior staff by their title and surname, creates an unhelpful barrier between members of the team, which may inhibit critical feedback.e Clearly, the issue is complex, not least because many patients express a preference for formality in their relationships with their physicians. Nevertheless, requiring all staff to address one another using only first names might improve patients’ safety if it promoted a culture in which even the most junior members of the team felt more comfortable about questioning their seniors over perceived hazards. As part of the WHO surgical safety checklist, all team members are required to introduce themselves by name and role before skin incision occurs (WHO 2008b). However, neither the current edition of the checklist nor its implementation manual stipulates that first names be used (WHO 2008a).
11. Incentivized no-fault reporting

Use in commercial aviation: In the United States, NASA operates an Aviation Safety Reporting System (ASRS) that offers the incentives of anonymity and immunity to pilots who report an unsafe situation within ten days of its occurrence (NASA 2007). If a report is submitted within this timeframe, the reporting pilot is issued an ASRS reference number. All identifying information in the report is then removed before the incident is investigated and any lessons are publicized. Later, if the Federal Aviation Administration (FAA) chooses to take enforcement action against the pilots involved in the incident, the pilots can present their ASRS reference number as evidence of a “constructive safety attitude,” and the FAA will not impose any penalties as long as the mistakes were inadvertent and did not constitute a criminal offense.

Application to health care: Many health care systems already operate successful error-reporting systems (NPSA 2009b). For example, between 1999 and 2010, more than 700,000 incident reports were submitted to the VA's National Center for Patient Safety (personal communication from the VA's National Center for Patient Safety, December 14, 2010). Moreover, NASA is now using its experience with ASRS to offer a confidential and nonpunitive reporting mechanism for employees of federal and private health care facilities (NASA 2009). We believe that the wider use of incentives such as anonymity and immunity in health care reporting systems could help move away from the current culture in which filing an adverse incident report is often regarded as “disloyal” to colleagues or as a mechanism for criticizing or punishing individual staff members.
12. Bottle-to-throttle rule

Use in commercial aviation: Evidence shows that alcohol has effects up to fourteen hours after consumption, with hangovers adversely influencing visuospatial skills, dexterity, managerial skills, and task completion (Wiese, Shlipak, and Browner 2002; Yesavage and Leirer 1986). For this reason, strict arrangements are in place to ensure that pilots do not fly when hung over. In some countries, for example, pilots must not consume more than five units of alcohol in the twenty-four hours before they report for duty and, in addition, are prohibited from drinking any alcohol at all in the eight hours before they fly (FAA 2006). Such bottle-to-throttle rules are widely publicized within the airline industry to generate common knowledge, in the same way that drunk-driving limits have become common knowledge and thereby have helped change the culture so that it is now socially unacceptable in many countries to drive with a blood alcohol level above the legal limit (Söder 1991).

Application to health care: Although the adverse effects of alcohol and hangovers may be particularly serious for pilots, owing to the nature of their work and the potentiating effects of altitude, physicians working while hung over can clearly also pose a hazard (Collins, Mertens, and Higgins 1987). The practicalities of introducing an analogous rule in health care would need to be carefully considered. For example, unlike long-haul pilots, who may work only a few days each month, many senior physicians provide almost continuous cover. We believe that health care organizations should go out of their way to encourage employees with drug or alcohol problems to come forward for counseling, rehabilitation, and detoxification, without the threat of disciplinary action. But there are also strong arguments for considering explicit restrictions on alcohol consumption by health care staff in the hours before starting work.
13. Mistake-proofing

Use in commercial aviation: Mistake-proofing refers to designing a system so that the user finds it difficult or impossible to make a mistake. It aims to prevent human errors caused by forgetfulness, lack of experience, sloppiness, misunderstanding, or inattention due to fatigue, stress, or work overload. An example is the lever controlling a Boeing 747's landing gear, which is automatically locked in the “down” position whenever the airplane is on the ground, based on the readings of sensors located on the nosewheel.

Application to health care: Mistake-proofing is already used widely in health care. A well-known example is the design of medical gas supply connectors, which prevent misconnections such as connecting the oxygen line to the nitrous oxide supply.
14. Forcing functions

Use in commercial aviation: Forcing functions try to correct human errors as they occur. An example is the Traffic Alert and Collision Avoidance System: when the system detects that two planes are on course for a midair collision, it calculates a set of avoidance maneuvers, which it orders the pilots to follow.

Application to health care: Many electronic prescribing systems incorporate a forcing function that requires a manual override when a potentially harmful drug interaction is detected.
15. Flight envelope protection

Use in commercial aviation: Flight envelope protection (FEP) is a set of limits on the controls of an aircraft that prevent the pilots from commanding the plane so vigorously that it exceeds its structural and aerodynamic operating limits. Its purpose is to liberate pilots so that they can use maximum control forces in an emergency without fear of endangering the safety of their aircraft through their own actions. An example is the rearward side-stick command, which is used to pitch the aircraft nose up: if a pilot attempts to pitch the aircraft beyond the stalling angle, the computers providing the flight envelope protection will cause the aircraft to ignore the command. Such protection allows the pilot to make rapid evasive maneuvers in response to a potential hazard.

Application to health care: The principles of FEP might be further developed in health care. Examples of when such protection could be useful for physicians are in guiding (a) the concentration of oxygen delivered to patients in type 2 respiratory failure; (b) the rate of fluid resuscitation for patients with heart failure; and (c) the speed of rewarming for hypothermic patients.

Notes:

a. When the situation requires extremely urgent action, pilots are required to act according to memorized protocols and then to tick off the checklist after the emergency is over, to ensure that everything necessary has indeed been done.

b. In the operating room, a foot-operated light switch or buzzer might be used to signal the sterile cockpit rule, and on ward rounds, holding up a colored card could be used to indicate that the rule is in effect.

c. Such standardization would require unprecedented cooperation across the IT industry, but in the United Kingdom, the national program for health IT (Connecting for Health) is developing an open-source common user interface (see http://www.cui.nhs.uk).

d. This would require patients to be registered on all such monitors, which might be achieved by incorporating a scanner to read bar-coded patient wristbands.

e. In Sweden, following the “Du reforms” championed by the director general of Sweden's National Board of Health and Welfare, only first names are used (Oksaar 1998).
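To make the distinction among the three checklist types in Table 1 concrete, here is a minimal illustrative sketch in Python. It is not drawn from any aviation or WHO artifact: the items, wording, and function names are invented for illustration only.

```python
# Illustrative sketch: the three checklist types from Table 1, expressed
# as minimal interaction protocols. All items below are hypothetical.

READ_AND_DO = [             # performed step by step, ticking off each action
    "Turn oxygen flow to maximum",
    "Check breathing-circuit connections",
]

CHALLENGE_AND_RESPONSE = [  # one team member challenges, another confirms
    ("Patient identity confirmed?", "Confirmed"),
    ("Antibiotic prophylaxis given?", "Given within the last 60 minutes"),
]

AIDE_MEMOIRE = [            # an agenda of topics, covered in any depth required
    "Anticipated critical events",
    "Equipment concerns",
]

def run_read_and_do(items):
    # A rigid sequence: each action is executed, then ticked off.
    for item in items:
        print(f"DO:     {item} ... done")

def run_challenge_and_response(pairs):
    # Two roles: one reads the challenge; the other gives the specific reply.
    for challenge, expected_reply in pairs:
        print(f"ASK:    {challenge}")
        print(f"REPLY:  {expected_reply}")

def run_aide_memoire(topics):
    # No fixed responses: the list simply ensures nothing is forgotten.
    print("Briefing agenda: " + "; ".join(topics))

run_read_and_do(READ_AND_DO)
run_challenge_and_response(CHALLENGE_AND_RESPONSE)
run_aide_memoire(AIDE_MEMOIRE)
```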

Conceptual Framework

While analyzing the fifteen examples listed in Table 1, we identified three pervasive themes for safety initiatives used in public transport aviation: counterheroism, common knowledge, and ergonomics.

Counterheroism

Many of the aviation safety initiatives listed in Table 1 are specifically designed to minimize the responsibility of individual pilots and instead to emphasize the importance of the team and the system as a whole for ensuring safety (Reason 2000).

The field of health care may often be characterized by a culture of individual “heroism.” For example, in their ethnographic study of operating rooms, Waring and colleagues observed that surgeons would respond to system deficiencies by finding ways around each problem they encountered (Waring, Harrison, and McDonald 2007). For instance, when items of surgical equipment were missing, surgeons modified, reshaped, or adjusted equipment designed for other uses. The surgeons described these actions as “adventurous,” “daring,” and necessary for “getting the job done.” In contrast, other members of the operating room team said they felt anxious about these “heroic” modifications of established practice. However, the frequency of system failures had bred a culture in which complaining about them was regarded as a criticism of the surgeons’ ability to innovate rather than as a condemnation of the failures themselves.

Amalberti and colleagues found that the frequency of unforeseen events in an operating room limits the degree of safety that can be provided through good systems, since individuals will inevitably be called on to respond to unpredicted and unpredictable events (Amalberti et al. 2005). In contrast, Woods asserted that an organization's ability to react to surprise events is, in fact, an important characteristic of a dynamically safe system (Woods 2006). Our contention is that the manner of a physician's response to an error or threat to safety is closely tied to medicine's current culture of heroism. We further contend that system failures and a culture of heroism may be self-reinforcing. In aviation, the team and the system are central to the safety culture. Fostering a comparable culture in health care therefore requires curbing individual heroism, which may be achieved through increased codification and other measures that downplay the role of individuals in ensuring safety. These restrictions will inevitably be seen as challenging the current “heroic” culture, and so opposition from some doctors is to be expected.

In modern aviation, rules and protocols are so deeply ingrained in the culture that if a pilot or a mechanic encounters an unusual situation, his or her response is not to try jury-rigging a way around the problem but, rather, to follow set procedures and to report the problem through official channels. Gawande observed that as a result, the “rock star status” of the first daring aviators has been diminished by tight regulations such as preflight checklists, and he noted that a similar transformation is now under way in medicine (Gawande 2007). Some of the mechanisms the aviation industry uses to encourage this nonheroic culture, such as promoting the use of first names only among staff, would be inherently challenging to clinicians. We believe, however, that the increased use of counterheroic measures in health care might foster a less obsequious workplace culture in the long run. In a less deferential, less hierarchical workplace, senior staff may not feel so pressured to make ad hoc adjustments in response to system failures, and junior staff may feel less awkward about raising concerns when they believe their seniors are about to make a mistake.

It is important to note that by proposing curbs on heroism, we are referring not to “heroic” treatments for patients with low survival probabilities, to “heroic” personal sacrifices by clinicians, or indeed to the systematic development of novel practices. Rather, we are proposing constraints only on makeshift adjustments made by individuals in response to hazards and adverse events.

Common Knowledge

In economics, the concept of common knowledge is used to denote values or information that not only are known to members of a group but also are known to be known, and are known to be known to be known, ad infinitum (Lewis 1969). For instance, the joint safety briefing for the pilots and cabin crew before a flight creates common knowledge because all members of the team are reminded not only of what they should do in the event of an emergency but also of what is expected of them by their colleagues. After a rule has become common knowledge, if one member of the team is seen to violate that rule, then the other members of the team are “authorized” to bring this to the attention of the violator, whereas previously they might have felt less confident in doing so.
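For readers who prefer a formal statement, this hierarchy has a standard formalization in the economics and epistemic-logic literature (the notation below is conventional and is not taken from this article). Writing $K_i\varphi$ for "team member $i$ knows $\varphi$," and defining "everyone in group $G$ knows" as

$$E_G(\varphi) \;=\; \bigwedge_{i \in G} K_i(\varphi),$$

a proposition $\varphi$ is common knowledge in $G$ when every level of the hierarchy holds:

$$C_G(\varphi) \;=\; \bigwedge_{n=1}^{\infty} E_G^{\,n}(\varphi) \;=\; E_G(\varphi) \,\wedge\, E_G(E_G(\varphi)) \,\wedge\, \cdots$$

A briefing that the whole team attends together establishes not merely $E_G(\varphi)$ but $C_G(\varphi)$, which is what "authorizes" any member to challenge a violation.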

Publicizing a rule creates common knowledge and, in so doing, causes an important cultural change, even for rules that already have been tacitly acknowledged by all. For example, the European Aviation Safety Agency (EASA) mandates that all pilots be assessed on their CRM skills based on a detailed description of CRM methods and terminology (EASA 2009). It is our belief that these assessment criteria—being tightly codified and known to pilots, their colleagues, trainers, and employers—make it easier to insist on acceptable behavior in the cockpit. Other examples of common knowledge generated by the aviation sector are publicizing the “sterile cockpit” rule and the “bottle-to-throttle” rule and publishing detailed minimum safety requirements.

Ergonomics

Ergonomics, also called human factors engineering (HFE), is the science of designing products, processes, systems, and environments that take explicit account of the capabilities and behaviors of the people who will interact with them (Gosbee 2002). The aviation industry uses HFE extensively: examples are mistake-proofing, forcing functions, and flight envelope protection.

HFE is already used in health care as well, a notable example being medical gas connectors that are designed to prevent mistakes. Nonetheless, we believe that ergonomics can be applied much more widely to health care. For example, automatic identification technologies such as bar coding and radio frequency identification (RFID) could be used more extensively to reduce wrong-patient, wrong-drug, and wrong-dose errors. Other ways in which the aviation sector improves safety by design are standardizing instrument layouts and using flight recorders (“black boxes”) to encourage safe behaviors.
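To illustrate the difference between two of these ergonomic devices, the following short Python sketch contrasts a forcing function (which interrupts the workflow and demands an explicit, documented override) with envelope-style protection (which silently clamps a command to a safe range). Everything here is hypothetical: the interaction table, oxygen limits, and function names are invented for illustration and are not taken from any real prescribing system or clinical guideline.

```python
# Illustrative sketch only. All drug names, thresholds, and APIs are hypothetical.

HARMFUL_PAIRS = {frozenset(["drug_a", "drug_b"])}  # hypothetical interaction table

def prescribe(new_drug, current_drugs, override_reason=None):
    """Forcing function: a flagged interaction halts the workflow until
    the prescriber supplies an explicit, documented override."""
    for existing in current_drugs:
        if frozenset([new_drug, existing]) in HARMFUL_PAIRS:
            if override_reason is None:
                raise PermissionError(
                    f"{new_drug} interacts with {existing}: "
                    "manual override with a documented reason is required"
                )
            print(f"Override recorded: {override_reason}")
    print(f"{new_drug} prescribed")

def set_oxygen_fraction(requested, floor=0.24, ceiling=0.28):
    """Envelope-style protection: the clinician may request any setting,
    but the delivered value is clamped to a preconfigured safe range
    (imagined here for a patient flagged with type 2 respiratory failure)."""
    delivered = min(max(requested, floor), ceiling)
    if delivered != requested:
        print(f"Requested FiO2 {requested:.2f} clamped to {delivered:.2f}")
    return delivered

prescribe("drug_c", ["drug_a"])          # no interaction: proceeds normally
set_oxygen_fraction(0.60)                # out-of-envelope request is clamped
try:
    prescribe("drug_b", ["drug_a"])      # blocked pending an override
except PermissionError as err:
    print(err)
prescribe("drug_b", ["drug_a"], override_reason="benefit judged to outweigh risk")
```

The design point mirrors the aviation originals: the forcing function tolerates rare, deliberate exceptions but makes them costly and auditable, whereas envelope protection removes the dangerous region of the control space altogether.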

Clearly, many of the initiatives listed in Table 1 involve more than one of the three themes we have identified, and Table 2 shows what we believe is the relative importance of each theme for the fifteen initiatives listed in Table 1.

Table 2.

Relative Importance of Counterheroism, Common Knowledge, and Ergonomics to Each Initiative Listed in Table 1

Counterheroism (Overcomes Excessive Individualism) Common Knowledge (Imposes or Enhances Group Knowledge) Ergonomics (Careful Design to Overcome Human Error)
1. Checklists ••• •• ••
2. Crew resource management •••
3. Joint safety briefings •••
4. Minimum safety requirements ••• ••
5. Sterile cockpit rule •• ••
6. Alternation of roles •••
7. Standard layout •••
8. Black box •• ••
9. Corporate responsibility for training •• ••
10. First-names-only rule •••
11. Incentivized no-fault reporting
12. Bottle-to-throttle rule ••
13. Mistake-proofing •••
14. Forcing functions •• •••
15. Flight envelope protection •••

Key: • = low, •• = moderate, and ••• = high relative importance of the theme for each initiative, as perceived by the authors.

An Economic Framework for Assessing Safety Initiatives

Although at an individual level, clinicians have an obligation to do no harm to their patients, the initiatives we discuss in this article concern changes at the organizational level. Given the infinite demands placed on health care budgets and the high costs of implementing certain safety initiatives, the opportunity costs may be substantial. So, to be acceptable to health care organizations, the safety initiatives advocated in this article would have to provide better value than other competing medical interventions. In other words, these safety initiatives must be cost-effective. Economists would regard as optimal any safety initiative whose marginal safety benefit outweighed its marginal implementation cost. In this section, we describe a framework for considering the cost-effectiveness of patient safety initiatives.
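Stated formally (in our notation; the rule itself is the standard marginal condition from welfare economics), an organization should adopt a safety initiative only while

$$\Delta B \;\geq\; \Delta C,$$

where $\Delta B$ is the marginal safety benefit and $\Delta C$ the marginal implementation cost, with the optimal level of safety investment $q^{*}$ reached where $MB(q^{*}) = MC(q^{*})$. Beyond that point, further safety spending displaces other health care that would have produced greater benefit.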

The benefits of the safety initiatives listed in Table 1 depend on (1) the frequency and severity of the safety problem they are designed to address, (2) their effectiveness in mitigating that safety threat, and (3) the value that society places on safety in that context. One reason why commercial aviation is an atypical industry is that the reputational harm ensuing from an aircraft disaster is so great that the benefits of aviation safety interventions are almost infinite. As a result, the optimal level of safety in public transport aviation approaches complete safety.1 In contrast, other industries, such as ground-based forms of public transport like trains and taxis, do not go to such extreme lengths to eliminate risks. For example, train passengers are generally not required to wear seat belts, and taxi passengers are not issued with crash helmets. This suggests that for ground-based forms of public transport, officials place a greater weight on price, convenience, and quality when determining safety policy.

We believe that in health care, patient safety policy is similarly multidimensional. For this reason, it is unlikely to be optimal to attempt to eliminate all iatrogenic risks, because the opportunity costs would be so great. Therefore, the benefits of introducing any patient safety initiative need to be weighed against the benefits of other forms of health care, as well as against its costs.

Calculating benefits

In aviation, officials at the Federal Aviation Administration (FAA) use a system called Security Risk Management to estimate the cost-effectiveness of prospective safety programs (FAA 2002). In health care, one way to calculate the relative benefits of a patient safety initiative is to determine its incremental cost-effectiveness ratio, which is measured in units of cost per quality-adjusted life year (QALY) gained. An alternative method, given the high cost of many iatrogenic complications, is to quantify the benefits as the costs of averted adverse incidents (Semel et al. 2010).
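Concretely, comparing a proposed initiative with current practice, the incremental cost-effectiveness ratio takes the usual form (standard health-economics notation, not specific to this article):

$$\text{ICER} \;=\; \frac{C_{\text{initiative}} - C_{\text{current}}}{E_{\text{initiative}} - E_{\text{current}}},$$

where $C$ denotes total cost and $E$ denotes effectiveness measured in QALYs; an initiative is attractive when its ICER falls below the payer's willingness-to-pay threshold per QALY. Under the alternative accounting mentioned above, the costs of averted adverse incidents are instead credited against $C_{\text{initiative}}$.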

Calculating costs

To calculate the costs of introducing a patient safety initiative, one approach might be to apply the criteria that economists use for assessing the implementation costs of taxes—taxes and patient safety initiatives being analogous in that both involve encumbrances borne by individuals for the greater good. These criteria are the compliance costs, the administration costs, and the costs of any ensuing behavioral distortions (Musgrave and Musgrave 1973). Table 3 sets out what these different costs might be for each of the fifteen safety initiatives listed in Table 1. In the discussion that follows, we use the WHO's Surgical Safety Checklist for purposes of illustration.

Table 3.

Categories of Costs for Safety Interventions

1. Checklists. Compliance costs: low (a simple procedure that takes only a few minutes per operation to complete). Administration costs: low (training has been shown to be straightforward, educational materials are available centrally, and copies of the checklist cost only a few cents per patient; the overall cost has been estimated at $11 per patient) (Semel et al. 2010). Potential behavioral distortions: staff might try to complete the checklists for every operation in a single batch at the start of the operating list; surgical and anesthesia teams might dispense with other safeguards that they previously used (Vats et al. 2010).
2. Crew resource management. Compliance costs: moderate (behavior change). Administration costs: high (organizing, paying for, auditing, and scheduling CRM training). Potential behavioral distortions: peer pressure not to follow the CRM approach; complacency that the “team” is in charge, so individuals are less alert to threats.
3. Joint safety briefings. Compliance costs: high (time spent preparing and conducting briefings). Administration costs: moderate (preparation of briefings and notifications; audit trail of signed briefings). Potential behavioral distortions: briefings skipped or performed in a perfunctory way; overcautious decision making might also be bad for care.
4. Minimum safety requirements. Compliance costs: high (more antisocial working patterns). Administration costs: high (employing additional staff and paying out-of-hours supplements). Potential behavioral distortions: reclassification of patients as being less acutely unwell.
5. Sterile cockpit rule. Compliance costs: moderate (behavior change). Administration costs: moderate (cost of vests and foot-operated signals; time lost by cleaning staff waiting, with noisy floor polishers idle, for a ward round to finish; etc.). Potential behavioral distortions: no one instigates the sterile cockpit rule, and those who do are treated as “overcautious.”
6. Alternation of roles. Compliance costs: high (involves significant behavior change). Administration costs: moderate (junior doctors may be slower than their senior colleagues). Potential behavioral distortions: increased exception reporting (e.g., an operation deemed too complex for the junior doctor); token compliance (the senior doctor actually runs the ward round).
7. Standard layout. Compliance costs: high initially (when learning the new layout), then low (standard layout used throughout). Administration costs: high initially (replacement of trolleys and surgical packs, redesign of anesthetic rooms, etc.), then low (economies of scale from standard layouts). Potential behavioral distortions: staff rearrange the layout back to the previous design.
8. Black box. Compliance costs: low (need to enter patients’ details into the monitor). Administration costs: high (recording devices, backup, monitoring). Potential behavioral distortions: staff less willing to use monitoring; staff use mobile phones rather than hospital phones to communicate.
9. Corporate responsibility for training. Compliance costs: low or zero. Administration costs: high (organizing, paying for, auditing, and scheduling training). Potential behavioral distortions: staff stop using their own initiative to arrange training; training becomes less bespoke.
10. First-names-only rule. Compliance costs: moderate (involves a simple behavior change). Administration costs: low (dissemination, training, monitoring). Potential behavioral distortions: lack of respect or authority leading to reduced discipline or refusal to obey orders.
11. Incentivized no-fault reporting. Compliance costs: moderate (although there are tangible incentives for complying). Administration costs: moderate (many elements of the system are already in place). Potential behavioral distortions: reporting of trivial events; system swamped with reports; potential immunity causes clinicians to act with impunity.
12. Bottle-to-throttle rule. Compliance costs: low or zero for most staff. Administration costs: moderate (dissemination, spot checks). Potential behavioral distortions: increased absenteeism among hung-over staff.
13. Mistake-proofing. Compliance costs: low or zero. Administration costs: high initially (cost of designing, purchasing, and implementing new equipment), then potentially low (fewer medical errors). Potential behavioral distortions: although specifically designed to take account of human behaviors, mistake-proofing could encourage complacency and overreliance on systems that may not be absolutely fail-safe.
14. Forcing functions. Compliance costs: low. Administration costs: high initially (cost of designing, purchasing, and implementing new equipment), then potentially low (fewer medical errors). Potential behavioral distortions: staff might seek ways of overriding the forcing function.
15. Flight envelope protection. Compliance costs: low or zero. Administration costs: high initially (cost of designing, purchasing, and implementing new equipment), then potentially low (fewer medical errors). Potential behavioral distortions: could encourage reckless behavior.

Note: High, moderate, and low costs as perceived or anticipated by the authors. Costs would vary according to the characteristics and structure of the health care provider implementing the safety initiative.

Compliance Costs

Compliance costs are the costs borne by people in complying with the rules. For a tax, these may include the costs to an individual of producing accounts and filing tax returns.2 Similarly, patient safety initiatives that require changes in clinical behavior impose compliance costs on the individual practitioner.

Compliance costs can be divided into one-off and ongoing costs. One-off compliance costs are usually associated with training and with changes in procedures or processes and often require considerable time and effort. Such costs can be minimized by simplifying protocols, by increasing flexibility so that individuals can comply in a wide variety of ways that are most convenient to them, and by providing a range of learning tools and resources.

Ongoing compliance costs are associated with demonstrating compliance, for example, by filling out forms. Recent advances in behavioral economics suggest that the best way to improve compliance may not necessarily be by increasing the sanctions on noncompliant behavior. Rather, ongoing compliance may be encouraged by creating social norms and group censure for noncompliant behavior.

In health care, policymakers should consider the costs of a proposed new patient safety initiative in regard to the time and effort required for the staff to comply with it. For instance, the surgical safety checklist has relatively low compliance costs because it is a simple procedure that takes only a few minutes per operation to complete.

Administration Costs

Administration costs refer to those costs incurred by an organization in imposing a particular initiative. In taxation policy, this refers to the costs borne by the Internal Revenue Service to collect taxes in a fair, efficient, and equitable manner. For a patient safety initiative, administration costs are those costs borne by a health care provider for ensuring that practices and processes are changed and for ongoing monitoring to ensure compliance. In health care, the administration costs of patient safety initiatives therefore relate to dissemination, training, monitoring, and auditing. Our example, the surgical safety checklist, has been found to have relatively low administration costs because training was straightforward, educational materials such as a training video were made available centrally, and copies of the checklist cost only a few cents per patient. But some of the other interventions listed in Table 1 might have considerable administrative costs (see Table 3).

Behavioral Distortions

Policymakers should be aware of two categories of behavioral distortion that may ensue from the introduction of a new patient safety intervention. First, people may deliberately undermine the efficacy of an initiative, which in the tax literature is referred to as evasion. In taxation, this is the cost imposed by individuals who restructure their affairs to avoid paying a tax, for instance, by resorting to black-market transactions in order to avoid paying sales taxes. Second, individuals may change or stop other behaviors, which we refer to as deadweight losses. For example, because a tax on labor discourages people from working, the result will be an avoidable reduction in economic activity.

Both types of behavior can be difficult to predict, and their detection requires long-term monitoring. For instance, opportunities for evading the surgical safety checklist may be relatively limited because it is used in a public place—the operating room—compared with the opportunities for evading safety initiatives applied in the privacy of the examination room, such as physicians washing their hands before seeing the next patient. The checklist is therefore less likely to be ignored or poorly administered, although at least one team member must still be prepared to insist on it if others fail to begin. Vats and colleagues documented several ways in which the WHO checklist can be evaded, such as by completing the checklist when some key members of the team are absent or by providing dismissive answers (Vats et al. 2010). Although going through the checklist in a perfunctory way might speed things up, it would also clearly reduce the checklist's efficacy (Neily et al. 2009). Deadweight losses might result if, for example, the surgical and anesthesia teams dispensed with other safeguards that they had used in the operating room before the checklist was introduced.

The lesson from taxation policy is that some degree of behavioral distortion generally is tolerable in that it is unlikely to be cost-effective to eradicate every last behavioral distortion. The challenge, therefore, is to minimize these costs rather than to eradicate them.

Resistance from Physicians

Given the large volume of safety research that has been conducted in aviation over many decades, we believe that numerous ideas might usefully be adopted and disseminated more widely across the health care sector (Loukopoulos, Dismukes, and Barshi 2009). Nonetheless, we would expect resistance from both individual doctors and organized medicine. The framework we have presented for classifying patient safety initiatives may help explain the reasons for such resistance in the following ways.

Counterheroism

Safety initiatives that discourage individual heroism may be unpopular with doctors because they downplay the role of decisive, autonomous decision making in the face of uncertainty, which is seen as a core component of medical professionalism (Royal College of Physicians 2005). Given the prevailing culture of heroism, doctors might perceive any attempts to introduce tighter codification in health care, such as through more standardized layouts, as restrictions on their ability to innovate and hence as a threat to their professional status.

Common Knowledge

One of the hallmarks of a profession is the existence of a clearly defined, specialist knowledge base that is common knowledge to members of the profession but not to outsiders. Any measures that expand the circle of people who share that knowledge base so that it includes, for example, nurses, managers, patients, and/or visitors, thus may be viewed as undermining the status of the medical profession. At a time when doctors are facing many other challenges to their professional standing, such as through independent prescribing by nurses and pharmacists, any new measures that increase common knowledge may be unwelcome if they are seen as yet another threat to a physician's status.

An example is a video produced by the U.S. Centers for Disease Control and Prevention (CDC), which is designed to encourage patients to ask their physician or nurse to wash their hands (CDC 2008). In the video, a patient asks a physician to wash his hands; the physician complies and explains that he does not mind being reminded. The point of the video is to make the importance of hand washing common knowledge between doctor and patient, with the intention that this shared knowledge will encourage doctors to wash their hands even when no one reminds them.

Ergonomics

Finally, physicians may respond negatively to calls for better safety through design if they feel powerless to instigate the necessary design changes themselves. In the United Kingdom, the British Medical Journal is attempting to address this problem through an initiative in which doctors are encouraged to “pitch” their patient safety ideas to a panel, with the winning entries being published in the journal and considered for implementation across the health service by the National Patient Safety Agency (NPSA).

Conclusion

One drawback to borrowing concepts from other industries is that their implementation is affected by the deep-seated cultural attitudes of health care staff toward safety (Bosk et al. 2009). But we believe that the process of adopting safety concepts from aviation could in itself alter the culture of health care teams if the interventions dissuaded heroic actions, increased common knowledge, or encouraged safety by design. In other words, some degree of initial resistance from physicians is predictable, but it may be expected to give way over time to a change in workplace culture simply by virtue of the intervention itself. Such a cultural shift recently was demonstrated in a study of crew resource management programs in health care, in which personal behaviors and empowerment were seen to improve over the course of several years (Sax et al. 2009).

Clearly, the fact that a safety concept works well in commercial aviation does not necessarily mean that it is needed in health care, or that it would be effective, let alone cost-effective, there. For example, some of the suggestions listed in this article might be too draconian or too simplistic for health care, or they might create an atmosphere of mistrust. Although we listed in Table 3 some possible drawbacks to our examples, we suspect that there are many others. Very careful consideration, troubleshooting, evaluation, and monitoring will therefore be required before any of our examples are adopted and promoted more systematically.

Acknowledgments

Jennifer Dixon, Alan Garber, Sue Osborn, Martin Marshall, and three anonymous reviewers provided helpful comments on earlier drafts of this article.

Endnotes

1. This may be due to the public's “extreme” reaction to aircraft disasters, based on an element of cognitive dissonance.

2. Compliance costs do not include the actual costs of the taxes themselves, since these were the direct intention of the policymakers rather than an unfortunate side effect.

References

  1. Air Accidents Investigation Branch. Report on the Accident to Boeing 737-400, G-OBME, Near Kegworth, Leicestershire, on 8 January 1989. 1990. Report 4/1990. Available at http://www.aaib.gov.uk/publications/formal_reports/4_1990_g_obme.cfm (accessed December 15, 2010)
  2. Amalberti R, Auroy Y, Berwick D, Barach P. Five System Barriers to Achieving Ultrasafe Health Care. Annals of Internal Medicine. 2005;142(9):756–64. doi: 10.7326/0003-4819-142-9-200505030-00012.
  3. Association of Anaesthetists of Great Britain and Ireland. Checklist for Anaesthetic Equipment. 2004. Available at http://www.aagbi.org/publications/guidelines/docs/checklista404.pdf (accessed December 15, 2010)
  4. Bell CM, Redelmeier DA. Mortality among Patients Admitted to Hospitals on Weekends as Compared with Weekdays. New England Journal of Medicine. 2001;345(9):663–68. doi: 10.1056/NEJMsa003376.
  5. Berenholtz SM, Pronovost PJ, Lipsett PA, Hobson D, Earsing K, Farley JE, Milanovich S, et al. Eliminating Catheter-Related Bloodstream Infections in the Intensive Care Unit. Critical Care Medicine. 2004;32(10):2014–20. doi: 10.1097/01.ccm.0000142399.70913.2f.
  6. Besco RO. PACE: Probe, Alert, Challenge, and Emergency Action. Business and Commercial Aviation. 1999;84(6):72–74.
  7. Borg MA. Bed Occupancy and Overcrowding as Determinant Factors in the Incidence of MRSA Infections within General Ward Settings. Journal of Hospital Infection. 2003;54(4):316–18. doi: 10.1016/s0195-6701(03)00153-1.
  8. Bosk CL, Dixon-Woods M, Goeschel CA, Pronovost PJ. Reality Check for Checklists. The Lancet. 2009;374(9688):444–45. doi: 10.1016/s0140-6736(09)61440-9.
  9. CDC (Centers for Disease Control and Prevention) Hand Hygiene Saves Lives: Patient Admission Video. 2008. Available at http://www2c.cdc.gov/podcasts/player.asp?f=9467# (accessed December 15, 2010)
  10. Civil Aviation Authority. Standards Document 24 (version 08). U.K. Civil Aviation Authority. 2010. Available at http://www.caa.co.uk/docs/33/srg_l&ts_Stds%20Doc%2024_v8.pdf (accessed December 15, 2010)
  11. Collins WE, Mertens HW, Higgins EA. Some Effects of Alcohol and Simulated Altitude on Complex Performance Scores and Breathalyzer Readings. Aviation, Space, and Environmental Medicine. 1987;58(4):328–32.
  12. Cooper GE, White MD, Lauber JK. Resource Management on the Flight Deck (NASA Conference Publication 2120). Moffett Field, CA: NASA Ames Research Center; 1979.
  13. Croskerry P, Cosby KS, Schenkel SM, Wears RL. Patient Safety in Emergency Medicine. Philadelphia: Lippincott Williams & Wilkins; 2008.
  14. Dunn EJ, Mills PD, Neily J, Crittenden MD, Carmack AL, Bagian JP. Medical Team Training: Applying Crew Resource Management in the Veterans Health Administration. Joint Commission Journal on Quality and Patient Safety. 2007;33(6):317–25. doi: 10.1016/s1553-7250(07)33036-5.
  15. EASA (European Aviation Safety Agency) Notice of Proposed Amendment No. 200902C, Sections AMC OR.OPS.030.FC and OR.OPS.130.FC. 2009. Available at http://www.easa.europa.eu/ws_prod/r/doc/NPA/NPA%202009-02C.pdf (accessed January 7, 2011)
  16. FAA (Federal Aviation Administration) Code of Federal Regulations, Title 14, Part 121, Section 121.542; and Part 135, Section 135.100. 1981. Available at http://rgl.faa.gov/Regulatory_and_Guidance_Library/rgFAR.nsf/MainFrame?OpenFrameSet (accessed December 15, 2010)
  17. FAA (Federal Aviation Administration) Security Risk Management Guide. 2002. Available at http://fast.faa.gov/Riskmgmt/Secriskmgmt/secriskmgmt.htm (accessed December 15, 2010)
  18. FAA (Federal Aviation Administration) Code of Federal Regulations, Title 14, Part 91, Section 91.17. 2006. Available at http://rgl.faa.gov/Regulatory_and_Guidance_Library/rgFAR.nsf/MainFrame?OpenFrameSet (accessed December 15, 2010)
  19. Gawande A. The Checklist: If Something So Simple Can Transform Intensive Care, What Else Can It Do? New Yorker. 2007 December 10. Available at http://www.newyorker.com/reporting/2007/12/10/071210fa_fact_gawande (accessed December 15, 2010)
  20. Gladwell M. Outliers: The Story of Success. Boston: Little, Brown; 2008.
  21. Godlee F. Human as Hero. BMJ. 2009;338:b238.
  22. Gosbee J. Human Factors Engineering and Patient Safety. Quality and Safety in Health Care. 2002;11:352–54. doi: 10.1136/qhc.11.4.352.
  23. Gough I. A Surgical Safety Checklist for Australia and New Zealand. ANZ Journal of Surgery. 2010;80(1/2):3–5. doi: 10.1111/j.1445-2197.2009.05166.x.
  24. Hall S. Medical Error Death Risk 1 in 300. The Guardian. 2006 November 7. Available at http://www.guardian.co.uk/society/2006/nov/07/health.lifeandhealth (accessed December 15, 2010)
  25. Haynes AB, Weiser TG, Berry WR, Lipsitz SR, Breizat AH, Dellinger EP, Herbosa T, et al. A Surgical Safety Checklist to Reduce Morbidity and Mortality in a Global Population. New England Journal of Medicine. 2009;360(5):491–99. doi: 10.1056/NEJMsa0810119.
  26. Healey AN, Primus CP, Koutantji M. Quantifying Distraction and Interruption in Urological Surgery. Quality and Safety in Health Care. 2007;16:135–39. doi: 10.1136/qshc.2006.019711.
  27. Helmreich RL, Merritt A. Culture at Work in Aviation and Medicine: National, Organizational, and Professional Influences. Brookfield, VT: Ashgate; 1998.
  28. Lande K. Standardization of Flight Decks—Operational Aspects. In: Soekkha H, editor. Aviation Safety. Ridderkerk: Ridderprint; 1997. pp. 189–201.
  29. Lauda Air. Training Manual for Boeing 767 Crews. 1999. Available at http://www.aero-pack.de/aviation/767Manuals/767LaudaAir_Training%20Manual.pdf (accessed December 15, 2010)
  30. Levin A. Cockpit Chatter Cited in Six Crashes. USA Today. 2009 October 1. Available at http://www.usatoday.com/news/nation/2009-10-01-pilot-speak_N.htm (accessed December 15, 2010)
  31. Levinson DR. Adverse Events in Hospitals: National Incidence among Medicare Beneficiaries. Washington, DC: U.S. Department of Health and Human Services; 2010 November. Available at http://oig.hhs.gov/oei/reports/oei-06-09-00090.pdf (accessed December 15, 2010)
  32. Lewis DK. Convention: A Philosophical Study. Cambridge, MA: Harvard University Press; 1969.
  33. Loukopoulos LD, Dismukes RK, Barshi I. The Multitasking Myth: Handling Complexity in Real-World Operations. Farnborough: Ashgate; 2009.
  34. Marshall M. Applying Quality Improvement Approaches to Health Care. BMJ. 2009;339:b3411. doi: 10.1136/bmj.b3411.
  35. Musgrave R, Musgrave P. Public Finance in Theory and Practice. New York: McGraw-Hill; 1973.
  36. NASA (National Aeronautics and Space Administration) Aviation Safety Reporting System: Confidentiality and Incentives to Report. 2007. Available at http://asrs.arc.nasa.gov/overview/confidentiality.html (accessed December 15, 2010)
  37. NASA (National Aeronautics and Space Administration) Patient Safety Reporting System. 2009. Available at http://www.psrs.arc.nasa.gov/web_docs/PSRS_Brochure09.pdf (accessed December 15, 2010)
  38. Neily J, Dunn E, Mills PD. Medical Team Training—An Overview. Topics in Patient Safety. 2004;4(5):1–3. Available at http://www.patientsafety.gov/TIPS/Docs/TIPS_NovDec04.pdf (accessed December 15, 2010)
  39. Neily J, Mills PD, Eldridge N, Dunn EJ, Samples C, Turner JR, Revere A, DePalma RG, Bagian JP. Incorrect Surgical Procedures Within and Outside of the Operating Room. Archives of Surgery. 2009;144(11):1028–34. doi: 10.1001/archsurg.2009.126.
  40. Neily J, Mills PD, Young-Xu Y, Carney BT, West P, Berger DH, Mazzia LM, Paull DE, Bagian JP. Association between Implementation of a Medical Team Training Program and Surgical Mortality. JAMA. 2010;304(15):1693–700. doi: 10.1001/jama.2010.1506.
  41. NPSA (National Patient Safety Agency) Seven Steps to Patient Safety: The Full Reference Guide. 2004. Available at http://www.nrls.npsa.nhs.uk/EasySiteWeb/getresource.axd?AssetID=59971&type=full&servicetype=Attachment (accessed December 15, 2010)
  42. NPSA (National Patient Safety Agency) Alerts. 2009a. Available at http://www.nrls.npsa.nhs.uk/resources/type/alerts/ (accessed December 15, 2010)
  43. NPSA (National Patient Safety Agency) National Reporting and Learning System. 2009b. Available at http://www.nrls.npsa.nhs.uk/report-a-patient-safety-incident/about-reporting-patient-safety-incidents/ (accessed December 15, 2010)
  44. NPSA (National Patient Safety Agency) Safer Surgery Checklist for Cataract Surgery Only. 2010. Available at http://www.nrls.npsa.nhs.uk/EasySiteWeb/getresource.axd?AssetID=74125&type=full&servicetype=Attachment (accessed December 15, 2010)
  45. NTSB (National Transportation Safety Board) Additional Flight Crew-Related Accident Information. In Aircraft Accident Report NTSB/AAR-06/01 (chapter 1.18.2) 2006. Available at http://www.ntsb.gov/publictn/2006/AAR0601.pdf (accessed December 15, 2010)
  46. Oksaar E. Social Networks, Communicative Acts and the Multilingual Individual: Methodological Issues in the Field of Language Change. In: Jahr EH, editor. Language Change: Advances in Historical Sociolinguistics. Berlin: Mouton de Gruyter; 1998. pp. 3–20.
  47. Pape TM. Applying Airline Safety Practices to Medication Administration. Medical-Surgical Nursing. 2003;12:77–93.
  48. Pownall M. Complex Working Environment, Not Poor Training, Blamed for Drug Errors. BMJ. 2009;339:b5328. doi: 10.1136/bmj.b5328.
  49. Pronovost PJ, Goeschel CA, Olsen KL, Pham JC, Miller MR, Berenholtz SM, Sexton JB, et al. Reducing Health Care Hazards: Lessons from the Commercial Aviation Safety Team. Health Affairs. 2009;28(3):w479–89. doi: 10.1377/hlthaff.28.3.w479.
  50. Reason J. Human Error: Models and Management. BMJ. 2000;320(7237):768–70. doi: 10.1136/bmj.320.7237.768.
  51. Royal College of Physicians. Doctors in Society: Medical Professionalism in a Changing World. London: 2005. Available at http://www.rcplondon.ac.uk/pubs/books/docinsoc/docinsoc.pdf (accessed December 15, 2010)
  52. Sax HC, Browne P, Mayewski RJ, Panzer RJ, Hittner KC, Burke RL, Coletta S. Can Aviation-Based Team Training Elicit Sustainable Behavioral Change? Archives of Surgery. 2009;144(12):1133–37. doi: 10.1001/archsurg.2009.207.
  53. Semel ME, Resch S, Haynes AB, Funk LM, Bader A, Berry WR, Weiser TG, Gawande AA. Adopting a Surgical Safety Checklist Could Save Money and Improve the Quality of Care in U.S. Hospitals. Health Affairs. 2010;29(9):1593–99. doi: 10.1377/hlthaff.2009.0709.
  54. Sims J. Flight Recorder: The Witness Box. The Independent. 2010 September 22. Available at http://www.independent.co.uk/life-style/gadgets-and-tech/features/flight-recorder-the-witness-box-2085594.html (accessed December 15, 2010)
  55. Söder JCM. Evaluatie Onderzoek VVN-Campagne Alcohol in Het Verkeer 1986–1991 [Evaluation Study of the VVN Campaign on Alcohol in Traffic 1986–1991] (VK 91-10). Haren: Traffic Research Centre, University of Groningen; 1991. Quoted in European Transport Safety Council. Reducing Traffic Injuries Resulting from Alcohol Impairment. Brussels: 1995. Available at http://www.etsc.eu/documents/Reducing%20traffic%20injuries%20resulting%20from%20alcohol%20impairment.pdf (accessed December 15, 2010)
  56. Vats A, Vincent CA, Nagpal K, Davies RW, Darzi A, Moorthy K. Practical Challenges of Introducing WHO Surgical Checklist: UK Pilot Experience. BMJ. 2010;340:b5433. doi: 10.1136/bmj.b5433.
  57. Waring J, Harrison S, McDonald R. A Culture of Safety or Coping? Ritualistic Behaviours in the Operating Theatre. Journal of Health Services Research and Policy. 2007;12(supp. 1):3–9. doi: 10.1258/135581907780318347.
  58. WHO (World Health Organization) Implementation Manual Surgical Safety Checklist. 1st ed. 2008a. Available at http://www.who.int/entity/patientsafety/safesurgery/tools_resources/SSSL_Manual_finalJun08.pdf (accessed December 15, 2010)
  59. WHO (World Health Organization) Surgical Safety Checklist. 1st ed. 2008b. Available at http://www.who.int/entity/patientsafety/safesurgery/tools_resources/SSSL_Checklist_finalJun08.pdf (accessed December 15, 2010)
  60. WHO (World Health Organization) The WHO Surgical Safety Checklist: Adaptation Guide. 2008c. Available at http://www.who.int/patientsafety/safesurgery/checklist_adaptation.pdf (accessed December 15, 2010)
  61. Wiese JG, Shlipak MG, Browner WS. The Alcohol Hangover. Annals of Internal Medicine. 2000;132(11):897–902. doi: 10.7326/0003-4819-132-11-200006060-00008.
  62. Williamson M. David Warren: Inventor and Developer of the “Black Box” Flight Data Recorder. The Independent. 2010 July 31. Available at http://www.independent.co.uk/news/obituaries/david-warren-inventor-and-developer-of-the-black-box-flight-data-recorder-2040070.html (accessed December 15, 2010)
  63. Woods DD. Essential Characteristics of Resilience. In: Hollnagel E, Woods DD, Leveson N, editors. Resilience Engineering: Concepts and Precepts. Farnham: Ashgate; 2006. pp. 21–34.
  64. Yesavage JA, Leirer VO. Hangover Effects on Aircraft Pilots 14 Hours after Alcohol Ingestion: A Preliminary Report. American Journal of Psychiatry. 1986;143:1546–50. doi: 10.1176/ajp.143.12.1546.
