Abstract
Safety in aviation has often been compared with safety in healthcare. Following a recent article in this journal, the UK government set up an Independent Patient Safety Investigation Service to emulate a similar well-established body in aviation. On the basis of a detailed review of relevant publications that examine patient safety in the context of aviation practice, we have drawn up a table of comparative features and a conceptual framework for patient safety. Convergence and divergence of safety-related behaviours across aviation and healthcare were derived and documented. Key safety-related domains that emerged included Checklists, Training, Crew Resource Management, Sterile Cockpit, Investigation and Reporting of Incidents and Organisational Culture. We conclude that whilst healthcare has much to learn from aviation in certain key domains, the transfer of lessons from aviation to healthcare needs to be nuanced, with the specific characteristics and needs of healthcare borne in mind. On the basis of this review, it is recommended that healthcare should emulate aviation in its resourcing of staff who specialise in human factors and related psychological aspects of patient safety and staff wellbeing. Professional and post-qualification staff training could specifically include Cognitive Bias Avoidance Training, as cognitive bias appears to play a key part in many errors relating to patient safety and staff wellbeing.
Keywords: Medical error, patient safety, patients
Comparisons have often been made between safety management in aviation and healthcare.1,2 This emulation is in the context of major achievements in the field of aviation – despite the number of worldwide flight hours more than doubling over the past 20 years (from approximately 25 million in 1993 to 54 million in 2013), the number of fatalities has fallen from approximately 450 to 250 per year.3 This stands in contrast to healthcare, where in the USA alone there are an estimated 200,000 preventable medical deaths every year – roughly 550 deaths a day which, assuming a fatal crash of a large airliner claims around 200 lives, amounts to the equivalent of almost three fatal airline crashes per day. As the renowned airline pilot Chesley Sullenberger noted,4 if such a level of fatalities were to happen in aviation, airlines would stop flying, airports would close, there would be congressional hearings and there would be a presidential commission. No one would be allowed to fly until the problem had been solved.
In this article, we present a comprehensive review of similarities and differences between aviation and healthcare and the application to healthcare of lessons learned in aviation.
Aviation versus healthcare: how comparable?
Table 1 summarises how aviation compares with healthcare. Some authors have expressed reservations about the analogies between aviation and healthcare,5–9 and others have noted that industries such as mining10 and metal manufacture11 may provide just as valuable safety lessons as aviation. Amalberti et al.12 have pointed to some inbuilt features of healthcare which may mean that it can never be as ultrasafe as industries such as aviation. In contrast to aviation, Reason13 has referred to the close personal contact in healthcare and to the ‘lethal convergence of benevolence’, which may result in the bypassing of protocols, barriers and safeguards, often with patients’ best interests at heart.
Table 1.
Distinctive features of aviation and healthcare.
Review framework
We provide a narrative review of the application of aviation-based human factors interventions in healthcare. As our guiding framework, we have adapted the models developed by Helmreich14 and by Lawton et al.15 (Figure 1).
Figure 1.
This Figure provides a framework for the approach offered in this paper. It is adapted from the models described by Helmreich14 and by Lawton et al.15 We distinguish between background ‘Latent’ factors and more current, situational ‘Active’ factors. Active failures include lapses, mistakes and violations. We also allow for an analysis of adverse events, but we adopt the more neutral term ‘Performance Analysis’ to allow for the analysis of high levels of excellence, so that lessons can be learned from such ‘positive’ behaviours as well as from ‘negative’ behaviours, which have traditionally been the primary focus of investigations.
Latent factors and organisational culture
At least three safety-related cultural attributes appear to distinguish aviation from healthcare. First, aviation has much more of a blame-free culture around reporting and owning up to safety incidents. Second, in healthcare there more often appear to be competing demands between economic factors and safety, with financial pressures and considerations constantly making news headlines. Third, safety permeates all levels of the business of airlines, whereas in healthcare it is still regarded as the priority of some, not the obligation of all. What is common to both industries is the concept of professionalism, but paradoxically this may sometimes lend itself to corners being cut and to social fragmentation between professional groups.16
A safety culture toolkit developed in the UK after railway accidents identified the following key features – leadership, two-way communication, employee involvement, learning culture and attitude towards blame.17 It is widely accepted that along these dimensions the organisational culture in aviation has changed dramatically over the past 30–40 years, but in healthcare organisations such as the NHS in the UK there is still the feeling that hierarchies and fear of speaking out persist, and that the lack of accountability for those who have transgressed, together with the absence of any apology, perpetuates these cultural limitations.18 Sullenberger19 has referred to an era in aviation where pilots ‘acted like gods with a little “g” and cowboys with a capital “C”’. Sadly, some of this culture would still appear to remain in parts of healthcare. As Timmons et al.20 have argued, full and successful implementation of human factors initiatives may be stalled if the culture in an organisation is not accommodating. They found that a six-day human factors training course taken by emergency and perioperative staff appeared to be valued and considered helpful by staff who took part, but that implementation of behavioural changes on the ground was impeded by long-standing cultural and organisational issues. Sullenberger4 has powerfully argued for patient safety to be embedded in board and financial decision making in healthcare – and noted,
Safety should be part and parcel of everything we do … Every decision that is made, whether it’s administrative, budgetary, or otherwise, should take safety implications into account because there is such an important business case for doing so … What we have right now, quite frankly, in healthcare are islands – visible islands of excellence in a sea of invisible failures, with risk lurking just below the waterline. We need to widen those islands of excellence. We need to connect these islands with more dry land. We need to address these areas of risk. That is going to require transparency, it’s going to require data, it’s going to require personal story telling, and it’s going to require effective use of health IT.
Implicit in healthcare comparisons with other safety-critical industries is the message that staff wellbeing, morale and motivation are key to the safe, successful and profitable delivery of a service and to a supportive organisational culture. As Paul O’Neill, former US Treasury Secretary and CEO of the metal company Alcoa, stated, ‘I don’t think you can be habitually excellent at everything unless you begin with caring about your workers’.11 Staff may suffer distress and ill-health for a variety of reasons, ranging from distress following major complications of a treatment they have carried out21 to suicide in the context of undergoing investigations by a regulatory body.22 The Francis Report into whistleblowing in the NHS18 referred to many cases of whistleblowers and others being badly treated, and sometimes being subject to ‘kangaroo courts’ by NHS management, with no allowance for Plurality, Independence and Expertise principles to ensure fairness. Such cases may not only impinge on patient safety and staff wellbeing but may also involve significant expenditure from public funds, coupled with financial hardship for staff who have to pay their own legal costs. Legal settings, such as employment tribunals, are not concerned with the implications of such cases for patient safety and staff wellbeing, and may sometimes be seen as weighted in favour of NHS employers, who have the financial resources to maximise a legal case, to take an unfavourable ruling to a higher court, and so on. In recent years, there have been prominent cases of NHS staff in the UK who have suffered as a result of extreme stress – including Eva Clark, the nurse at Mid-Staffordshire hospital who committed suicide after being bullied at work,23 and Jacintha Saldanha, who committed suicide in December 2012 after the humiliation of mistakenly answering a hoax phone call, purporting to be from the Queen, to the ward where the Duchess of Cambridge was a patient.24 In both of these cases, the level of support that should have been provided to staff was apparently absent.
The Public Administration Select Committee of the UK House of Commons recommended25 that the government adopt the proposal set out by Macrae and Vincent26 for an independent Patient Safety Investigation Agency, and this recommendation has been accepted by the government. When adverse events in healthcare seriously affect staff wellbeing, morale and motivation – regardless of whether the origins are poor patient outcome, poor management, etc. – such events need to be given the same urgency as when patients suffer. In line with the message propounded by Paul O’Neill above, it is worth considering whether, in addition to an Independent Patient Safety Investigation Service, a parallel body should be put in place – an Independent Staff Investigation and Support Service – so that lessons can be learned when healthcare staff suffer in major ways in the clinical workplace, and so that staff support mechanisms can readily be put in place.27 The current UK Health Secretary is quoted as stating in June 2015, ‘The performance of the NHS is only as good as the support we give to the staff’ (https://abetternhs.wordpress.com/2015/06/10/supervision/), and this needs to be translated into practical changes on the ground.
Active factors
Checklists
The need for checklists is based on the premise that in the execution of procedures the human brain is subject to three key cognitive limitations: we may forget to retrieve one of a number of steps in a procedure; we may retrieve a step but, for one reason or another (e.g. distraction, fatigue), fail to carry it out; or we may retrieve the step, remember to carry it out, but execute the action incorrectly. In aviation, there is usually far more procedural documentation of immediate relevance, such as Airline Operations Manuals or Quick Reference Handbooks, and Toff28 has proposed making similar systems available in healthcare. In aviation, there appear to be three forms of checklist: one for simple, routine operations; one for more complex operations; and one for emergency procedures (where the checklist may be ‘do, then verify’ rather than ‘read, then verify’). Checklists also vary between types of aircraft. Checklists have traditionally been an integral part of aviation workflow, whereas in medical disciplines such as surgery they are a more recent innovation; to this extent, they may be seen to represent a form of ‘time out’ during an established routine. Medical applications of checklists have included the fields of surgery and infection control,29–31 and there have also been attempts to reap the benefits of checklists in avoiding errors in medical diagnosis.32,33
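The distinction between the two checklist execution styles can be made concrete with a short sketch. The code below is purely illustrative – the class, item and checklist names are hypothetical and correspond to no actual aviation or healthcare system – but it captures the two modes described above: reading and then verifying each item in turn during routine operations, and verifying after the fact once time-critical emergency actions have been performed from memory.

```python
# Illustrative sketch only: two checklist execution styles.
from dataclasses import dataclass, field

@dataclass
class ChecklistItem:
    description: str
    done: bool = False  # whether the action has been carried out

@dataclass
class Checklist:
    name: str
    items: list[ChecklistItem] = field(default_factory=list)

    def run_read_verify(self):
        """Routine mode: read each item aloud, carry it out, then confirm.
        Guards against forgetting a step or forgetting to execute it."""
        for item in self.items:
            print(f"READ: {item.description}")
            item.done = True  # in practice, confirmed by a second team member
            print(f"  VERIFIED: {item.description}")

    def run_do_verify(self):
        """Emergency mode: actions are first performed from memory under time
        pressure; the checklist is then read to confirm nothing was missed."""
        for item in self.items:
            if not item.done:
                print(f"MISSED – now completing: {item.description}")
                item.done = True

# Usage: a routine pre-operative checklist, read and verified item by item.
sign_in = Checklist("Sign-in", [ChecklistItem("Confirm patient identity"),
                                ChecklistItem("Confirm surgical site is marked")])
sign_in.run_read_verify()
```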
Catchpole et al.34 used both aviation and Formula 1 pit-stop expertise to inform the use of checklists to ensure smooth handover between surgery and intensive care. Low et al.35 focused on applying checklists at key transition points in surgery (‘flow checklists’), so that high-risk points such as departure from the operating room do not suffer from lapses in the execution of procedures. Wadhera et al.36 showed how such an approach, applied to key stages of cardiovascular surgery with high cognitive demands, can yield benefits. In a similar vein, Federwisch et al.37 combined staff shift change-over times with a form of checklist, with incoming and outgoing nurses jointly noting items such as identification bracelets and IV catheter sites. Schelkun38 extended the checklist concept to implementing a form of aviation-style planning in surgical settings – plan the operation taking into account the patient, the injury/illness and the goals of the operation; decide on the details of the operation, noting the surgical approach, equipment needed, etc.; put together a surgical equipment checklist; and ensure good communication at every stage of the procedure, including debriefing afterwards to review what went well and what could have been improved. On the more cautious side, Clay-Williams and Colligan39 argued that the evidence on the efficacy of checklists in healthcare is variable, that checklists may not be applicable in more complex clinical settings (cf.6) and that over-reliance on checklists may detract from other forms of safety. In a similar vein, Catchpole and Russ40 argued that a checklist is a ‘complex socio-technical intervention that requires attention to design, implementation and basic skills required for the task’, and that checklists may succeed or fail in healthcare for a variety of reasons.
Training
Training in aviation and training in fields such as surgery have been compared, with aviation training and competency assessment generally considered more rigorous and more regimented.41–43 Initial pilot training normally takes around three years, and becoming a captain usually takes around a further 10 years. Training to become a doctor usually takes around five years, generally with a further 10 years before becoming a consultant. Keeping up with the explosion of knowledge in healthcare is daunting but necessary, even for experienced consultants; this is less of an issue in aviation. Pilot training takes place in a variety of settings – on the ground, in an aircraft and, always, in a simulator – and simulation has also been extended to teamwork and debriefing. Simulators are overall less used in medical training, or are used less systematically. Pilots have to undergo proficiency checks, usually in a simulator, every six months; doctors in the UK now undergo revalidation every five years. Pilot training is broken down into core competency skills, and this form of behavioural analysis of skill-training needs is becoming more common in healthcare. Non-technical skills, such as leadership, team working, decision making, situational awareness, managing stress and coping with fatigue, are extensively taught in pilot training, with well-established protocols for behavioural measurement of crew while in flight.44 It is only in recent years that behavioural marker systems capturing the non-technical skills of healthcare professionals have been developed in medicine, with some areas, such as anaesthesia and surgery, particularly embracing their value.28,45–47 When unexpected or emergency situations arise, both doctors and pilots will benefit from a commitment to life-long learning, a good understanding of disasters and how to deal with them, and an ability to think flexibly.48,49 What is more, the personality of the pilot has been considered as part of determining risk profiles during training, but as yet this has not happened in medicine.50 In surgery, Lewis et al.51 have argued that macho and ‘heroic’ attitudes may persist among surgeons, whereby improvising or finding a solution over-rides seeking or heeding advice from others in the team.
Crew resource management and sterile cockpit
Crew resource management essentially refers to how members of a team interact and are aware of factors that influence performance. Seager et al.52 noted five features of crew resource management – cooperation, leadership, workload management, situational awareness and decision making. The ‘team’ in aviation may primarily be just the pilot and co-pilot, with a degree of hierarchy between the two, whereas the team in surgery or other medical settings may be more diverse, with more distinct roles and a variable degree of hierarchy. Communication failures may be more likely to occur in healthcare than in aviation cockpit settings for a variety of reasons, including the wide range of staff and the distractions/interruptions that are prevalent in many clinical interactions. In healthcare, there is probably a wider range of information, with the reliability and dynamic nature of such information differing from that in aviation. In addition, the effects of introducing aviation-style teamwork training into medicine may vary according to the speciality,53 and may be determined in part by organisational and attitudinal factors.54 Although there are usually clear differences in knowledge, skills and experience between a pilot and co-pilot, safety in aviation is encouraged to take priority over deference, with simple measures such as the use of first names in interactions.51 This is not common practice in healthcare, which is inherently hierarchical, with resultant barriers to assertiveness.55 As Ornato and Peberdy56 argued, some healthcare settings may well benefit from the implementation of aviation procedures such as cross-checks, read-back and the ‘two-challenge rule’ (another team member is allowed to over-ride someone if that person has been challenged twice but has failed to respond appropriately). Seager et al.52 have noted features of crew resource management that could readily be applied to healthcare settings such as the operating theatre; these include peer monitoring, briefings, defining operating procedures and standards, recognition of fatigue as a factor in performance, regular ‘check rides’ in the form of assessment in a simulator, a blame-free reporting culture, use of checklists and application of the principle of a ‘sterile cockpit’. Briefings before and after surgery may be particularly helpful both in encouraging members of the team to stand back and appraise procedures and in fostering mutual respect and team bonding between members.57–59 Good communication within crew resource management involves respect for each other’s roles, as well as simple measures such as direct eye contact, introducing each other, using non-judgemental words and putting safety before self-esteem.
A ‘sterile cockpit’, which essentially refers to an environment free of unnecessary distractions, may improve patient safety if applied at key points in clinical procedures.36 A distraction-free environment is especially important when a critical or complex procedure is being carried out, whether it be an intricate stage of a surgical procedure in healthcare or taking off/landing in aviation. There is a high frequency of distractions and interruptions in the work of healthcare professionals,60 with a negative impact on patient safety.61,62 A number of studies, such as that by Federwisch et al.,37 have successfully applied the sterile cockpit idea to medication delivery, with ‘DO NOT DISTURB’ tabards or signs visible during medication rounds so as to reduce the number of distractions. When emergencies arise in a cockpit or in a surgical setting, multiple alarms may be activated, and the ability to notice and respond to key alarms, and to think flexibly, is key to safe outcomes – analogies can readily be made here between airline and medical settings.49,63–65
Performance analysis
Investigation of incidents
In the UK, an investigation report by the Air Accidents Investigation Branch can involve several months of work, with field investigations where appropriate and detailed background information sought on the equipment and individuals involved. The usual structure of an Air Accidents Investigation Branch report is as follows:
- A factual summary of the key features of the incident, including detailed information about the aircraft and the pilot
- A synopsis of the report
- An exposition of all the relevant facts of the incident, often with graphs and photographs
- An analysis of the data gathered, with a view to understanding what could have contributed to the incident
- Conclusions and safety recommendations
Woloshynowych et al.66 have documented the types of investigations and analyses that are carried out for critical incidents and adverse events in healthcare, and studied 138 papers that provided relevant evidence. They cited systems such as the Australian Incident Monitoring System, the Critical Incident Technique, Significant Event Auditing, Root Cause Analysis, the Organizational Accident Causation Model and the Comparison with Standards approach. They concluded that:
There was little or no information on the training of investigators, how the data was extracted or any information on quality assurance for data collection and analysis … In most papers, there was little or no discussion of implementation of any changes as a result of the investigations. (p. iii)
Macrae and Vincent26 have pointed to major limitations in the quality of investigations, and in the monitoring of the implementation of recommendations for improvement, in healthcare compared with other industries such as aviation. They have argued for an independent investigations agency in the NHS, comparable to the Air Accidents Investigation Branch and to its parallel body in the USA, the National Transportation Safety Board – a recommendation that has been accepted by the UK government. In the USA, a specific aviation safety body, the Commercial Aviation Safety Team, was set up in 1998 to bring together stakeholders in government and industry. This team identifies top safety areas through analysis of accident and incident data; it charters joint teams of experts to develop methods to fully understand the chain of events leading to accidents; and it identifies and implements high-leverage safety interventions to reduce the fatality rate in these areas. Pronovost et al.67 have argued for a similar body to be set up within healthcare.
Reporting of incidents
Reporting of incidents has many dimensions, which include the extent to which reporting is blame-free; the readiness to produce a report; the documentation of near-misses; the particular reports which are investigated; the format, investigation and dissemination of reports; the body that investigates and reports on serious incidents; positive or negative consequences for those who have contributed to or highlighted an adverse event; and the resulting action plans. In healthcare, Morbidity & Mortality meetings, where they happen, are often a forum where problematic cases are reported and discussed, and where deaths and serious complications ought to be reviewed to promote learning and improvements in practice. In terms of national reporting, in the UK there is the National Reporting and Learning Service, which is one of the largest reporting systems of its type in the world. A key criticism of reporting within healthcare is that the link from error to learning has often not materialised, and few mechanisms are put in place to ensure that changes have been implemented and errors are not repeated. In aviation, a major incident is often followed by the causes being simulated and becoming part of training, and particular equipment design, procedural or training recommendations being put in place, such as happened after the 2009 Air France plane disaster.68
In clinical practice, adverse events such as complications are often considered routine, and thus may not be reported. Apart from blame, some doctors may not report near-miss adverse events because of a sense of pride or self-esteem, or for fear of litigation. There may also be a lack of time for reporting, a high workload, a lack of understanding of why reporting is needed, concerns that no beneficial action will follow and, in some countries, a lack of confidentiality or the absence of adequate reporting systems.69,70 As has been found in aviation,71 near-misses may often be as instructive as adverse events.72 It may be worth translating into healthcare the aviation system of immunity from disciplinary action for the reporting of adverse incidents, apart from cases of gross or wilful negligence.73 The system used in aviation, the Confidential Human Factors Incident Reporting Programme, has now been emulated in the field of surgery – the Confidential Reporting System for Surgery (CORESS) – and has been found to work well.74 Similar schemes, which also encourage the reporting of near-misses, have adopted user-friendly online reporting formats.75
Ferroli et al.76 described how, with the support of aviation specialists, they designed a Patient Incident Reporting System form which was used to record near-misses in a neurosurgical setting. They analysed 14 such incidents and were able to distinguish different types of failures – human factors (the most common), technological factors, organisational factors and procedural factors. Their reporting and analysis system appeared to encourage a no-blame reporting culture. Clinicians rarely keep an audio or video record of their interactions with patients, and the introduction of such recordings is a matter of debate.77 However, in aviation, ‘black boxes’ – which record flight data and cockpit conversations – are carried in all commercial aircraft. The idea of documenting all safety failures, however minor, was also highlighted by Bowermaster et al.,78 who likened their approach to that of using the ‘black box’ principle in aviation (cf.79). Helmreich14 has described a ‘Line Operations Safety Audit’ that involved expert observers in the cockpit during normal flights. As well as potential safety threats, such as mountains and adverse weather, types of human error were documented, and fell into several groups – violations (e.g. conscious failure to adhere to procedures), procedural errors (e.g. erroneous entry into flight computer), errors in communication (e.g. misunderstood altitude clearance), lack of proficiency (e.g. inability to program computer-based device), and poor decisions (e.g. decision to navigate through adverse weather). There is scope for emulating aviation by including direct observation of clinical staff as part of routinely evaluating quality of care.80
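To illustrate how a near-miss reporting form of this kind can support aggregate learning, the minimal sketch below tags reports with the four failure categories distinguished by Ferroli et al.76 and tallies them across a series of reports; the record fields, example incidents and function names are hypothetical rather than taken from their system.

```python
# Illustrative sketch only: classifying and tallying near-miss reports.
from collections import Counter
from dataclasses import dataclass
from enum import Enum

class FailureType(Enum):
    HUMAN = "human factors"
    TECHNOLOGICAL = "technological factors"
    ORGANISATIONAL = "organisational factors"
    PROCEDURAL = "procedural factors"

@dataclass
class NearMissReport:
    description: str
    failure_type: FailureType
    reporter: str = "anonymous"  # anonymity supports a no-blame culture

def tally(reports):
    """Count reports per failure category to highlight recurring weak points."""
    return Counter(r.failure_type for r in reports)

reports = [
    NearMissReport("Wrong-side image displayed at start of case",
                   FailureType.HUMAN),
    NearMissReport("Navigation system failed to register",
                   FailureType.TECHNOLOGICAL),
    NearMissReport("Implant availability not confirmed pre-operatively",
                   FailureType.ORGANISATIONAL),
]
print(tally(reports))  # highest counts indicate where to focus improvement
```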
Implications for healthcare
There are many opportunities for safety measures and concepts in high-risk industries such as aviation to be considered for adoption in healthcare, with a need for actions to be proactive and generative, rather than solely reactive to adverse events.81 A focus on systems rather than individuals, and an examination of ‘latent risk factors’ that may result in adverse events, are other lessons that we can learn from aviation.82,83 Naturally, adopting measures from aviation without adapting them for the unique healthcare environment would be unwise, but where this has been done in a systematic but flexible way, clear benefits have been found.84 Issues such as privacy and patient confidentiality are particularly important in healthcare. In the finance-driven world of healthcare, any safety improvements should ideally have a good economic argument to accompany them, but – as Lewis et al.51 have argued – making such a case should be relatively easy to do, especially bearing in mind the huge litigation costs of clinical negligence claims.
As happens in safety-critical industries such as aviation, human factors training and related psychological training in patient safety and staff wellbeing need to be an integral part of all NHS staff work-plans, from the board-room to the bedside, with dedicated human factors/patient safety psychologists in post. Most major airlines have well-established departments staffed by large teams of psychologists/human factors specialists, whereas this is the exception rather than the rule for major NHS hospitals. The psychology of patient safety and staff wellbeing should be an integral part of the professional training curricula of healthcare staff, staff selection, induction, appraisal, revalidation, merit awards and Continuing Professional Development, so as to gradually develop an appreciation within the healthcare community of the impact of human factors, psychological variables and non-technical skills on safety. Cognitive Bias Avoidance Training could form a key component of such training curricula, in view of the key part cognitive decision making plays in a number of adverse incidents85 and the potential effectiveness of Cognitive Bias Avoidance Training in reducing diagnostic errors.86,87 Key bodies, such as NHS England, the Care Quality Commission and the Department of Health, as well as regulatory bodies such as the General Medical Council, should have resident expertise in human factors and the psychology of safety, together with an ethos that embraces and rewards clinical excellence (cf.88–90).
In a recent television interview, Captain Chesley Sullenberger, the senior crew member of the Hudson River aircraft incident, is reported as stating,
We have purchased at great cost lessons literally bought with blood that we have to preserve as institutional knowledge and pass on to succeeding generations. We cannot have the moral failure of forgetting those lessons and have to relearn them.
It behoves all of those involved in healthcare delivery to have this same urgency of purpose.
Acknowledgments
We are grateful to the following for reading and providing helpful comments on the manuscript: Dr Veronica Bradley, Dr Ciaran O'Driscoll and Ms Dalia Levi.
Declarations
Competing interests
NS delivers team and patient safety assessment and training interventions to hospitals in England and internationally via London Safety and Training Solutions Ltd, which he directs. NK is a member of the Royal College of Surgeons' Confidential Reporting System for Surgery (CORESS) advisory committee. The remaining authors report no competing interests in respect of this study.
Funding
NS's research was supported by the National Institute for Health Research (NIHR) Collaboration for Leadership in Applied Health Research and Care (CLAHRC), South London at King's College Hospital NHS Foundation Trust. NS is a member of King's Improvement Science, which is part of the NIHR CLAHRC South London and comprises a specialist team of improvement scientists and senior researchers based at King's College London. Its work is funded by King's Health Partners (Guy's and St Thomas' NHS Foundation Trust, King's College Hospital NHS Foundation Trust, King's College London and South London and Maudsley NHS Foundation Trust), Guy's and St Thomas' Charity, the Maudsley Charity and the Health Foundation. TS's work represents independent research supported by the UK's National Institute for Health Research (NIHR) Imperial Patient Safety Translational Research Center (RD PSC 79560). The views expressed are those of the authors and not necessarily those of the NHS, the NIHR or the Department of Health.
Ethical approval
Not required
Guarantor
NK
Contributorship
NK proposed the idea for this paper and produced an initial draft. AP and TS worked on the next draft of the paper. TR and NS contributed to subsequent drafts of the paper. All authors read and made suggestions in respect of later drafts of the paper.
Provenance
Not commissioned; peer-reviewed by Philemon Gukop
References
1. Thomas E, Helmreich R. Will airline safety models work in healthcare? In: Rosenthal M, Sutcliffe K (eds). Medical error: What do we know? What do we do? San Francisco: Jossey Bass, 2002, pp. 217–234.
2. Gordon S, Mendenhall P, O'Connor B. Beyond the checklist. What else health care can learn from aviation teamwork and safety. Ithaca: Cornell University Press, 2013.
3. Boeing Commercial Airlines. Statistical summary of commercial jet airplane accidents: worldwide operations 1959–2014. Seattle, Washington State: Aviation Safety, Boeing Commercial Airlines, 2014.
4. Sullenberger CB. 'Sully' Sullenberger: making safety a core business function. Healthc Financ Manage 2013; 67: 50–54.
5. Rissmiller R. Patients are not airplanes and doctors are not pilots [Letter]. Crit Care Med 2006; 34: 2869.
6. Rogers J. Have we gone too far in translating ideas from aviation to patient safety? Yes. BMJ 2011; 342: c7309.
7. Gaba D. Have we gone too far in translating ideas from aviation to patient safety? No. BMJ 2011; 342: c7309.
8. Reader T, Cuthbertson B. Teamwork and team training in the ICU: where do the similarities with aviation end? Crit Care 2011; 15: 1–6.
9. Ricci R, Panos A, Lincoln J, et al. Is aviation a good model to study human errors in health care? Am J Surg 2012; 203: 798–801.
10. Shaw J, Calder K. Aviation is not the only industry: healthcare could look wider for lessons on patient safety. Qual Saf Health Care 2008; 17: 314.
11. Pillow M, O'Neill P. An interview with Paul O'Neill. Jt Comm J Qual Patient Saf 2014; 40: 428–432.
12. Amalberti R, Auroy Y, Berwick D, et al. Five systems barriers to achieving ultrasafe health care. Ann Intern Med 2005; 142: 756–764.
13. Reason J. A life in error. From little slips to big disasters. Farnham: Ashgate, 2013.
14. Helmreich R. On error management: lessons from aviation. BMJ 2000; 320: 781–785.
15. Lawton R, McEachan R, Giles S, et al. Development of an evidence-based framework of factors contributing to patient safety incidents in hospital settings: a systematic review. BMJ Qual Saf 2012; 21: 369–380.
16. Holtman M. Paradoxes of professionalism and error in complex systems. J Biomed Inform 2011; 44: 395–401.
17. Health and Safety Executive. A review of safety culture and safety climate literature for the development of the safety culture inspection toolkit. London: Her Majesty's Stationery Office, 2005.
18. Francis R. Freedom to speak up. London: UK Department of Health, 2015.
19. Sullenberger C. Foreword. In: Gordon S, Mendenhall P, O'Connor B (eds). Beyond the checklist. What else health care can learn from aviation teamwork and safety. Ithaca: Cornell University Press, 2013, p. vii.
20. Timmons S, Baxendale B, Buttery A, et al. Implementing human factors in clinical practice. Emerg Med J 2015; 32: 368–372.
21. Pinto A, Faiz O, Bicknell C, et al. Acute traumatic stress among surgeons after major surgical complications. Am J Surg 2014; 208: 642–647.
22. Horsfall S. Doctors who commit suicide while under GMC fitness to practise investigation. London: General Medical Council, 2014.
23. Francis R. Public inquiry into Mid-Staffordshire Hospital NHS Foundation Trust. Volume 3. London: Her Majesty's Stationery Office, 2013, p. 1510.
24. BBC News, 12 December 2012. http://www.bbc.co.uk/news/uk-20696610 (accessed 2 November 2015).
25. Public Administration Select Committee. Investigating clinical incidents in the NHS. London: House of Commons, 2015.
26. Macrae C, Vincent C. Learning from failure: the need for independent safety investigation in healthcare. J Roy Soc Med 2014; 107: 439–443.
27. Kapur N. The NHS needs a Staff Support Commission. Health Serv J 2014; 124: 22–23.
28. Toff N. Human factors in anaesthesia: lessons from aviation. Br J Anaesth 2010; 105: 21–25.
29. Haynes A, Berry W, Gawande A. What do we know about the safe surgery checklist now? [Editorial]. Ann Surg 2015; 261: 829–830.
30. Pronovost P, Goeschel C, Colantuoni E, et al. Sustaining reductions in catheter related bloodstream infections in Michigan intensive care units: observational study. BMJ 2010; 340: c309.
31. World Health Organization. The World Health Organization Surgical Safety Checklist 2008. http://www.who.int/patientsafety/safesurgery/ss_checklist/en/ (accessed 2 November 2015).
32. Ely J, Graber M, Croskerry P. Checklists to reduce diagnostic errors. Acad Med 2011; 86: 307–313.
33. Winters B, Aswani M, Pronovost P. Reducing diagnostic errors: another role for the checklist? Acad Med 2011; 86: 279–281.
34. Catchpole K, De Leval M, McEwan A, et al. Patient handover from surgery to intensive care: using Formula 1 pit-stop and aviation models to improve patient safety and quality. Pediatr Anesth 2007; 17: 470–478.
35. Low D, Reed M, Geiduschek J, et al. Striving for a zero-error patient surgical journey through adoption of aviation-style challenge and response flow checklists: a quality improvement project. Pediatr Anesth 2013; 23: 571–578.
36. Wadhera R, Parker S, Burkhart H, et al. Is the 'sterile cockpit' concept applicable to cardiovascular surgery critical intervals or critical events? The impact of protocol-driven communication during cardiopulmonary bypass. J Thorac Cardiovasc Surg 2010; 139: 312–319.
37. Federwisch M, Ramos H, Adams S. The sterile cockpit: an effective approach to reducing medication errors? Am J Nurs 2014; 114: 47–55.
38. Schelkun S. Lessons from aviation safety: 'plan your operation – operate your plan'. Patient Saf Surg 2014; 8: 1–3.
39. Clay-Williams R, Colligan L. Back to basics: checklists in aviation and healthcare. BMJ Qual Saf 2015; 24: 428–431.
40. Catchpole K, Russ S. The problem with checklists. BMJ Qual Saf 2015; 24: 1–5.
41. Sommer K-J. Pilot training: what can surgeons learn from it? Arab J Urol 2014; 12: 32–36.
42. Schwaitzberg S, Godinez C, Kavic S, et al. Training and working in high-stakes environments: lessons learned and problems shared by aviators and surgeons. Surg Innov 2009; 16: 187–195.
43. Dekker S. Patient safety. A human factors approach. London: CRC Press, 2011.
44. Harris D. Improving aircraft safety. Psychologist 2014; 27: 90–94.
45. Flin R, O'Connor P, Crichton M (eds). Safety at the sharp end. Aldershot: Ashgate, 2008.
46. Musson D. Teamwork in medicine: crew resource management and lessons from aviation. In: Croskerry P, Cosby K, Schenkel S, et al. (eds). Patient safety in emergency medicine. Philadelphia: Lippincott Williams and Wilkins, 2009, pp. 188–194.
47. Hull L, Arora S, Kassab E, et al. Observational teamwork assessment for surgery: content validation and tool refinement. J Am Coll Surg 2011; 212: 234–243.
48. Assael L. 'Sully' Sullenberger and the 'miracle on the Hudson': a lesson in heroism for oral and maxillofacial surgeons. J Oral Maxillofac Surg 2009; 67: 711–712.
49. Bhangu A, Bhangu S, Stevenson J, et al. Lessons for surgeons from the final moments of Air France Flight 447. World J Surg 2013; 37: 1185–1192.
50. Sommer K. Learning from errors. Applying aviation safety concepts to medicine. Eur Urol 2013; 64: 680–681.
51. Lewis G, Vaithianathan R, Hockey P, et al. Counter-heroism, common knowledge, and ergonomics: concepts from aviation that could improve patient safety. Milbank Q 2011; 89: 4–38.
52. Seager L, Smith D, Patel A, et al. Applying aviation factors to oral and maxillofacial surgery – the human element. Br J Oral Max Surg 2013; 51: 8–13.
53. McCulloch P, Mishra A, Handa A, et al. The effects of aviation-style non-technical skills on technical performance and outcome in the operating theatre. Qual Saf Health Care 2009; 18: 109–115.
54. Catchpole K, Dale T, Hirst D, et al. A multicentre trial of aviation-style training for surgical teams. J Patient Saf 2010; 6: 180–186.
55. Lyndon A. Communication and teamwork in patient care: how much can we learn from aviation? J Obstet Gynecol Neonatal Nurs 2006; 35: 538–546.
56. Ornato J, Peberdy M. Applying lessons from commercial aviation safety and operations to resuscitation. Resuscitation 2014; 85: 173–176.
57. Mathieu J, Heffner T, Goodwin G, et al. The influence of shared mental models on team process and performance. J Appl Psychol 2000; 85: 273–283.
58. Gillespie BM, Chaboyer W, Longbottom P, et al. The impact of organisational and individual factors on team communication in surgery: a qualitative study. Int J Nurs Stud 2010; 47: 732–741.
59. Leonard M, Graham S, Bonacum D. The human factor: the critical importance of effective teamwork and communication in providing safe care. Qual Saf Health Care 2004; 13(Suppl 1): i85–i90.
60. Rivera-Rodriguez AJ, Karsh B-T. Interruptions and distractions in healthcare: review and reappraisal. Qual Saf Health Care 2010; 19: 304–312.
61. Sevdalis N, Undre S, McDermott J, et al. Impact of intraoperative distractions on patient safety: a prospective descriptive study using validated instruments. World J Surg 2014; 38: 751–758.
62. Wheelock A, Suliman A, Wharton R, et al. The impact of operating room distractions on stress, workload, and teamwork. Ann Surg 2015; 261: 1079–1084.
63. Dehais F, Causse M, Regis N, et al. Failure to detect critical auditory alerts in the cockpit: evidence for inattentional deafness. Hum Factors 2014; 56: 631–644.
64. Edworthy J. Alarms are still a problem [Editorial]. Anaesthesia 2013; 68: 791–803.
65. De Man F, Greuters S, Boer C, et al. Intra-operative monitoring – many alarms with minor impact. Anaesthesia 2013; 68: 804–810.
66. Woloshynowych M, Rogers S, Taylor-Adams S, et al. The investigation and analysis of critical incidents and adverse events in healthcare. Health Technol Assess 2005; 9: 1–143.
67. Pronovost P, Goeschel C, Olsen K, et al. Reducing health care hazards: lessons from the Commercial Aviation Safety Team. Health Affair 2009; 28: 479–489.
68. BEA (Bureau d'Enquêtes et d'Analyses pour la sécurité de l'aviation civile). Final report on the accident on 1 June 2009 to the Airbus A330-203, Flight AF 447. Paris: BEA, 2012.
69. Holmstrom A-R, Airaksinen M, Weiss M, et al. National and local medication error reporting systems – a survey of practices in 16 countries. J Patient Saf 2012; 8: 165–176.
70. Vincent C, Stanhope N, Crowley-Murphy M. Reasons for not reporting adverse events: an empirical study. J Eval Clin Pract 1998; 5: 13–21.
71. Macrae C. Analyzing near-miss events: risk management in incident reporting and investigation systems. Economic & Social Research Council, Discussion Paper 47, 2007.
72. Jeffs L, Berta W, Lingard L, et al. Learning from near misses: from quick fixes to closing off the Swiss-cheese holes. BMJ Qual Saf 2012; 21: 287–294.
73. Wilf-Miron R, Lewenhoff I, Benyamini Z, et al. From aviation to medicine: applying concepts of aviation safety to risk management in ambulatory care. Qual Saf Health Care 2003; 12: 35–39.
74. Lewis A, Smith F, Tait P, et al. UK surgery already applies aviation safety practice [Letter]. BMJ 2011; 342: d1310.
75. Bilimoria K, Kmiecik T, DaRosa D, et al. Development of an online morbidity, mortality, and near-miss reporting system to identify patterns of adverse events in surgical patients. Arch Surg 2009; 144: 305–311.
76. Ferroli P, Caldiroli D, Acerbi F, et al. Application of an aviation model of incident reporting and investigation to the neurosurgical scenario: method and preliminary data. Neurosurg Focus 2012; 33: 1–8.
77. Elwyn G, Buckman L. Should doctors encourage patients to record consultations? BMJ 2015; 350: g7645.
78. Bowermaster R, Miller M, Ashcraft T, et al. Application of the aviation black box principle in pediatric cardiac surgery: tracking all failures in the pediatric cardiac operating room. J Am Coll Surg 2015; 220: 149–155.
79. Ross J. Aviation tools to improve patient safety. J Perianesth Nurs 2014; 29: 508–510.
80. Symons N, Almoudaris A, Nagpal K, et al. An observational study of the frequency, severity, and etiology of failures in post-operative care after major elective surgery. Ann Surg 2013; 257: 1–5.
81. Hudson P. Applying the lessons of high risk industries to health care. Qual Saf Health Care 2003; 12(Suppl): i7–i12.
82. Van Beuzekom M, Boer F, Akerboom S, et al. Patient safety: latent risk factors. Br J Anaesth 2010; 105: 52–59.
83. Van Beuzekom M, Boer F, Akerboom S, et al. Patient safety in the operating room: an intervention study on latent risk factors. BMC Surg 2012; 12: 1–11.
84. De Korne D, van Wijngaarden J, Hiddema U, et al. Diffusing aviation innovations in a hospital in the Netherlands. Jt Comm J Qual Patient Saf 2010; 36: 339–347.
85. Fargen K, Friedman W. The science of medical decision making: neurosurgery, errors, and personal cognitive strategies for improving quality of care. World Neurosurg 2014; 82: e21–e29.
86. Croskerry P, Singhal G, Mamede S. Cognitive debiasing 1: origins of bias and theory of debiasing. BMJ Qual Saf 2013; 22: ii58–ii64.
87. Croskerry P, Singhal G, Mamede S. Cognitive debiasing 2: impediments to and strategies for change. BMJ Qual Saf 2013; 22: ii65–ii72.
88. Kapur N. On the pursuit of clinical excellence. Clin Governance 2009; 14: 24–37.
89. Kapur N. Mid-Staffordshire hospital and the Francis report: what does psychology have to offer? Psychologist 2014; 27: 16–20.
90. Kapur N. The health secretary needs a psychologist appointment. Health Serv J 2014; 124(6421): 18.