Introduction
Surgery and aviation have many similarities. Both disciplines have made the extraordinary ordinary – involving teams of specialists using expensive equipment to perform previously unimaginable tasks in life-threatening situations. But perhaps what is most miraculous about these two astonishing human endeavours is not what they entail, but rather how frequently they occur. On any one day, an estimated 85,000 flights and 550,000 operations are completed worldwide. What are the implications of scaling up life-or-death events to such astronomical levels?
One consequence of such large throughputs is that even tiny risks are magnified. If even a small percentage of patients and passengers die, this amounts to a significant number of families affected by preventable deaths. Every year around 500 people die in aviation accidents, and the World Health Organization (WHO) estimates that a staggering 1 million people die in the perioperative period,1 many due to avoidable mistakes. So, what is being done to minimize these risks?
Checklists
The parallels between surgery and aviation make the airline industry an ideal source of inspiration for researchers involved in surgical safety.2 Indeed, Lucian Leape, a former surgeon and father of the modern patient safety movement, recommends that the medical profession should adopt three traits of the aviation industry: error assumption; procedure standardization; and institutionalized safety.3 In response to this and other calls for safer healthcare, the WHO launched the ‘World Alliance for Patient Safety’ in 2004, headed by Professor Sir Liam Donaldson.4 His innovative team has devised a series of ‘Global Patient Safety Challenges’, which encapsulate the zeitgeist of public health: clean care; safer surgery; and tackling antimicrobial resistance.
The second challenge, ‘Safer Surgery’, has been headed by Dr Atul Gawande, a visionary surgeon based at the Brigham and Women's Hospital and the Harvard School of Public Health. His group is planning several interventions to improve performance and reduce complications in surgery. One such intervention is a Safe Surgery Checklist that aims to ensure proven procedures – such as administering antibiotics before surgery – are carried out at the appropriate time. The checklist will be used at three stages: before the induction of anaesthesia; before the first incision is made; and before the patient leaves the operating room.
It was modelled on a checklist that Boeing developed in the 1930s to assist pilots in carrying out routine procedures at a time when flying was becoming more complex. The results were impressive: the checklist dramatically reduced the number of crashes, saving both money and lives from the outset. The question is: could a similar checklist be as effective in surgery, which is also becoming more complex by the day?
Last June, the Safe Surgery Checklist was launched in Washington DC, and it has since attracted intense media interest in anticipation of its potential benefits. The checklist has been shown to reduce complications by around one-third in an eight-centre study involving surgeons in both the developed and developing world.5 In the UK alone, it is estimated that around 80,000 complications could be avoided each year.
Some surgeons will have their doubts. Is the checklist not just another product of an interfering ‘nanny state’ and a threat to the autonomy of theatre staff? Furthermore, in the eight-centre trial, all the hospitals had volunteered to be involved and were keen to implement the checklist. Will outcomes be as dramatic when hospitals are obliged to use it?
Teamwork
I was in theatre myself last Thursday, assisting with a total hip replacement. As the patient was being set up in stirrups, I noticed something was wrong – we were about to operate on the wrong hip. The cause of the confusion was a breakdown in communication and a failure of teamwork: the site had not been marked. The error might well have been spotted by someone else; somebody, one hopes, would have noticed. But what if they had not? Is hope really good enough when we are dealing with money, time and, most importantly, lives?
A fascinating recent study compared error, stress and teamwork in different professional subgroups, including pilots and hospital staff.6 Its results are telling. Independent adjudicators rated surgeons worse than pilots in several respects: admitting fatigue; embracing flatter hierarchies; and working in teams. Exactly what is it that makes pilots work better together than theatre staff?
Mr Peter McCulloch and colleagues at the Nuffield Department of Surgery, Oxford, have set up a research group to address this very question. Their Quality, Reliability, Safety and Teamwork Unit is using interdisciplinary approaches to develop interventions that will bring surgery in line with other high-risk disciplines, such as Formula One and aviation.7,8 For instance, they have adapted a crew resource management (CRM) training programme, used in aviation, to improve teamwork in theatre. Their tripartite model of patient safety builds on James Reason's Swiss Cheese model of accident causation,9 and adds to a growing body of literature on teamwork and error reduction.10
Last summer, I spent a few days with STAT MedEvac, the air-ambulance service for the Johns Hopkins Hospital, Baltimore, USA. The crew were kind enough to let me eat, sleep and fly with them, giving me a unique insight into the way pilots think and work.
One thing that interested me was what pilots call ‘sterile cockpit’. When a pilot gives this order, he is demanding silence so that he can focus his full attention on a particularly tricky part of the flight. Of course, to some extent, surgeons do this already, but with the increasing use of monitoring and imaging devices in the operating room, and given the number of ancillary staff often present during surgery, perhaps there is a need for a similar standardized signal to alert all staff to a critical phase of the operation.
Another difference I noticed was the flow of communication between the crew. The entire flight crew took a moment to pause and run through what the flight would entail, who would do what, and how things should be done. At any point it was acceptable for even the most junior team member to make suggestions and correct the pilot. Supporters of a steep hierarchy might like to consult the work of Kurt Lewin, a pioneering psychologist of the last century, who found that leaders who favour ‘authoritarian climates’ are less successful than those who create ‘democratic climates’.11
Such a democratic climate is what is needed to make the surgical checklist work. At three points, the team leader (interestingly, this need not be the surgeon; the anaesthetist or a senior nurse can do it too) will call a halt to proceedings. Soon, ‘sign in’, ‘time out’ and ‘sign out’ will be as familiar to theatre staff as ‘sterile cockpit’ is to flight crews. During these brief periods, everyone in theatre will pause activities and conversations, to introduce themselves, make checks and be clear about the operation ahead. The improvements in staff satisfaction that will arise from such structured and inclusive conversations, while less important than the primary aim of improving patient safety, are not insignificant and should not be overlooked.
Simulators
We have already seen how the aviation industry is informing surgical practice, but how about surgical training?
In his best-selling book, Outliers, Malcolm Gladwell popularized the ‘10,000-hour rule’. Quite simply, it states that to be exceptionally good at anything, from playing the cello to kicking a football, one has to dedicate at least 10,000 hours to intense practice. In real terms, this amounts to approximately three hours every day for 10 years. A similar moral can be drawn from the recent emergency landing of a commercial flight on the Hudson River in New York, the so-called ‘miracle on the Hudson’. The hero of flight 1549, Captain Chesley Sullenberger, had 19,000 hours of flight time with US Airways under his belt, not to mention seven more years with the US Air Force. The duration and diversity of his experience no doubt contributed to his successful landing.
This is relevant to surgery, because opportunities to operate are often scarce. In the UK, this has been compounded by the Calman Report and the recent European Working Time Directive, which limits working hours to an average of 48 per week. A senior surgeon recently told me that it was not rare for his generation of surgeons to spend 30,000 hours operating before they became consultants. He estimated that my generation of surgeons would spend just 6000 hours before reaching the same post. In a specialty that demands supreme technical expertise and breadth of experience, exactly how can we produce good surgeons under such time restrictions?
One solution is to use simulators. Simulators are a safe, time-efficient and relatively cheap way to practise risky procedures. Again, the aviation industry got there first, with the Sanders Teacher in 1910. By today's standards, it was crude – merely comprising parts of old aircraft mounted on a moveable joint, with stormy conditions recreated by assistants pulling on the wings like stagehands – but it has spawned today's Full Flight Simulators, which are an everyday part of training. Many commercial pilots must demonstrate competency on flight simulators annually.
One hundred years after the Sanders Teacher, surgery is finally catching up. Lord Ara Darzi leads a pioneering research group based at St Mary's Hospital, which is investigating the potential use of simulators in surgical training.12 They are developing several types of surgical simulator, including MIST-VR (the minimally invasive surgical trainer – virtual reality), which uses virtual reality to mimic laparoscopic surgery. Mr Jonathan Sackier, himself a minimally invasive surgeon and amateur pilot, has commented that ‘the first time that a resident deals with crisis management should not be a situation of true crisis’.13
The black box
Regardless of how many preventative measures are in place, situations of true crisis will inevitably occur. The importance of learning from such disasters has only recently been formally recognized.14 Some doctors are good at debriefing with peers and senior colleagues, while others keep diaries to log their experiences. We also use critical incident forms, which were inspired by systems of ‘near miss’ reporting developed by the airline industry.15,16 Perhaps the best tools we have for learning from such events are M&M (morbidity and mortality) meetings. These provide doctors with a regular, safe forum in which the details of catastrophic cases are picked over. These measures have served surgeons well, but are there better ways to learn from tragedy?
In 1954, the aviation industry was at crisis point. A series of fatal accidents forced the British manufacturer de Havilland to ground the entire fleet of DH106 Comets, the first commercial jet airliner. Without a discovery to prevent such disasters, de Havilland – and the commercial airline industry as a whole – might not have survived. Their saviour was Dr David Warren, an Australian chemist who thought of designing a device to record the pilots' conversations and pre-crash data. He came up with the ‘red egg’, a robust flight data recorder (FDR), painted orange-red to help it stand out in wreckage. A newer version, the so-called ‘black box’, is still used today, and its retrieval continues to help us identify the causes of crashes and prevent further tragedies.
Ostensibly, there is no need for such a device in surgery; unlike pilots, surgeons do not put their own lives at immediate risk when they work, so they survive to recall all the important information about what happened. But is this really the case? In stressful cases, is it fair that we rely entirely on the surgeon to recall all the details of the operation? Could beat-to-beat physiological fluctuations be recorded electronically, and an automated record generated?
Already, obstetricians use this concept to learn from difficult births. Instead of the black box, they have the cardiotocogram and the birth chart, which record the progress of the baby and mother. Senior and junior obstetricians regularly sit around these charts and painstakingly go over every decision that was, or should have been, made. Might a surgeon whose patient unexpectedly dies also benefit from this kind of slow motion replay?
For obvious reasons, many surgeons are resistant to such close observation. In the healthcare system's current ‘blame culture’, where patient safety is often linked to an individual's performance rather than to system errors, such a review process would not be in a surgeon's own interests. The philosopher Dame Onora O'Neill has offered potential solutions to this ‘crisis of trust’.17 Rather than opting for ‘Herculean micromanagement’ and making such data available to the public, she suggests that public servants should be ‘allowed some margin for self-governance’, since few can argue with being supportively and safely judged by their peers.
Surgery: an industry or a calling?
We can see, then, how surgery has learnt, and is still learning, from the airline industry – most notably with the use of checklists, teamwork training, simulators and critical incident reporting.
However, while I have laboured several parallels between surgery and aviation for the purpose of this article, it goes without saying that there are important differences. Robert Helmreich, a Professor of Psychology who has published widely on the subject of error in aviation and surgery, reminds us that ‘aircraft tend to be more predictable than patients’.18 Surgery involves factors that are hard to simulate, namely the physiological and psychological aspects of being human. It is impossible to predict exactly how one patient's body will react to surgery compared to another, and how satisfied they will be with what the surgeon might deem a successful operation. It is this human aspect that makes surgery so challenging and so fascinating.
Furthermore, critics might argue that it is unfair to compare surgery – with its sick patients and often suboptimal working environments – to civil aviation, which deals with healthy people in close-to-ideal conditions. Fighter jets and naval vessels in war zones are probably better correlates, but these too use communication and teamwork systems similar to those of commercial airlines to avoid disaster. For instance, naval officers have their own variant of the ‘sterile cockpit’ command; when the officer in charge blows a whistle, only three predesignated people are allowed to talk.
It is perhaps time to consider what surgery is. Should it be thought of as an industry, like aviation, or do we concur with Sir William Osler when he said ‘the practice of medicine is an art, not a trade; a calling, not a business’?19 More than three decades after Ivan Illich warned us that ‘irreparable damage accompanies industrial expansion’, we seem to be responding to the problem of iatrogenesis (a term Illich popularized) not by ‘reduc[ing] professional intervention to the minimum’, as he recommended, but rather by further intensifying our interference in people's health; in his words, ‘expropriat[ing] the potential of people to deal with their human condition’.20 In our campaign to ensure that we are ‘doing the thing right’, we must not forget to ask whether we are also ‘doing the right thing’.
Whether interventions to improve patient safety will be welcomed by surgeons and other theatre staff is uncertain, but we will not have to wait long to find out. By this time next year, most hospitals in the UK (and a significant proportion worldwide) will be using the Safe Surgery Checklist, and other radical changes in surgical practice are bound to follow. One thing is for sure, however: there is one group of people who will certainly not be complaining when surgeons are practising on simulators, using checklists with their teams, and doing all they can to learn from their mistakes – and that is our patients.
Footnotes
DECLARATIONS
Competing interests None declared
Funding None
Ethical approval Not applicable
Guarantor NS
Contributorship NS is the sole contributor
Acknowledgements
The author wishes to thank Peter McCulloch, Nuffield Department of Surgery, Oxford
References
- 1. World Alliance for Patient Safety. WHO Guidelines for Safe Surgery. Geneva: World Health Organization; 2008
- 2. Rutherford W. Aviation safety: a model for health care? Qual Saf Health Care 2003;12:162–3
- 3. Leape LL. Error in medicine. JAMA 1994;272:1851–7
- 4. Donaldson LJ, Fletcher MG. The WHO World Alliance for Patient Safety: towards the years of living less dangerously. Med J Aust 2006;184(Suppl. 10):S69–S72
- 5. Haynes AB, Weiser TG, Berry WR, et al. A surgical safety checklist to reduce morbidity and mortality in a global population. N Engl J Med 2009;360:491–9
- 6. Sexton JB, Thomas EJ, Helmreich RL. Error, stress, and teamwork in medicine and aviation: cross sectional surveys. BMJ 2000;320:745–9
- 7. Catchpole K, Mishra A, Handa A, McCulloch P. Teamwork and error in the operating room: analysis of skills and roles. Ann Surg 2008;247:699–706
- 8. Catchpole KR, De Leval MR, McEwan A, et al. Patient handover from surgery to intensive care: using formula one pit-stop and aviation models to improve safety and quality. Paediatr Anaesth 2007;17:470–8
- 9. Reason J. Human error: models and management. BMJ 2000;320:768–70
- 10. Makary MA, Sexton JB, Freischlag JA, et al. Operating room teamwork among physicians and nurses: teamwork in the eye of the beholder. J Am Coll Surg 2006;202:746–52
- 11. Lewin K. Frontiers in group dynamics: concept, method and reality in social science; social equilibria and social change. Human Relations 1947;1:5–41
- 12. Torkington J, Smith SGT, Rees BI, Darzi A. The role of simulation in surgical training. Ann R Coll Surg Engl 2000;82:88–94
- 13. Sackier J. Evaluation of technical surgical skills: lessons from minimal access surgery. Surg Endosc 1998;12:1109–10
- 14. Donaldson LJ. An organisation with a memory. Clin Med 2002;2:452–7
- 15. Billings C, Cook RI, Woods DD, et al. Incident reporting systems in medicine and experience with the aviation safety reporting system. In: A Tale of Two Stories: Contrasting Views of Patient Safety. Report from a workshop on assembling the scientific basis for progress on patient safety. Chicago, IL: National Patient Safety Foundation, AMA; 1998
- 16. Barach P, Small SD. Reporting and preventing medical mishaps: lessons from non-medical near miss reporting systems. BMJ 2000;320:759–63
- 17. O'Neill O. A Question of Trust: The BBC Reith Lectures 2002. Cambridge: Cambridge University Press; 2002
- 18. Helmreich RL. On error management: lessons from aviation. BMJ 2000;320:781–5
- 19. Osler W. Aequanimitas with Other Addresses to Medical Students, Nurses and Practitioners of Medicine. 2nd edn. Philadelphia, PA: Blakiston; 1930
- 20. Illich I. Medical nemesis. Lancet 1974;303:918–21