Critical Care. 2010 Mar 9;14(2):217. doi: 10.1186/cc8858

Patient safety and acute care medicine: lessons for the future, insights from the past

Peter G Brindley
PMCID: PMC2887110; PMID: 20236461

Abstract

This article is one of ten reviews selected from the Yearbook of Intensive Care and Emergency Medicine 2010 (Springer Verlag) and co-published as a series in Critical Care. Other articles in the series can be found online at http://ccforum.com/series/yearbook. Further information about the Yearbook of Intensive Care and Emergency Medicine is available from http://www.springer.com/series/2855.

Introduction

"All truth passes through three stages. First it is ridiculed. Second it is violently opposed. Third it is accepted as self-evident." [1]

Arthur Schopenhauer (1788-1860)

It is estimated that approximately 40,000-100,000 Americans die annually from medical errors [2]. Thousands more suffer harm, and still others are exposed to errors but are lucky enough to suffer no obvious harm [3]. In fact, medical errors are now the eighth leading cause of death in the USA, and data from other nations are no less alarming [4]. Regardless of the exact figures, it seems that patient safety is far from adequate. Crudely put, if medicine were a patient, we physicians would say it is time to admit there is a problem. We would expect urgent action, and we would welcome any ideas, rather than tolerate further delays. This chapter aims to provide a call-to-arms and, most importantly, a range of ideas, both new and old, to achieve the sort of care that our patients deserve.

'The missing curriculum' [3]

Albert Einstein stated: "you can never solve a problem by using the same thinking that created it" [5]. Accordingly, the first step is to emphasize that medical errors are rarely the result of mere negligence, sloppiness, incompetence, or poor motivation. Instead, we should accept that health care is amongst the world's most complex social systems [3]. Coupled with the complexity of medical diagnosis, and the need to make decisions despite time pressure and incomplete information, the shocking patient safety figures make more sense. Perhaps the complexity of the task ahead is also a little clearer.

The slogan states that "Safety is no accident" [3]; put another way, errors in healthcare are rarely random, unpredictable events. Some errors may ultimately be rooted in our organizations and perpetuated by our traditions. Like many complex systems, medicine is Janus-faced [6]: its traditions are both our greatest asset and our greatest liability. For example, the laudable tradition of self-reliance and patient-ownership means that physicians usually stay until the work is done, and diligently follow patients from admission to discharge. However, the downsides include the dangerous effects of fatigue, and a reluctance to permit input from others. It has also created a system where we appreciate that errors occur, just not at a personal level! Centuries of pedagogy also mean we have been slow to implement innovative methods of training. For example, despite functioning in multi-professional teams that require nuanced coordination and communication skills, these skills are rarely deliberately taught, or sought after in applicants [7]. Our traditions also mean that while medical graduates are versed in the science of medicine, and acquire the skills to look after individual patients, few are trained to tackle systemic safety issues, or to understand how humans work in large groups or complex systems. One way to do so is to be open to innovative ideas, regardless of their source (Table 1). Another is to change the very way we regard our work.

Table 1. Insights for acute care medicine from diverse sources

Source                     Insight
Engineering                Most errors are neither random nor unpredictable
                           Benefits of Standard Operating Procedures
                           Usefulness of second opinions, fail-safes, and time-outs
                           Benefits of a systems approach to error and education
                           The Swiss-cheese model applied to understanding error
Cognitive psychology       Benefits and detriments of Gestalt, the Law of Prägnanz,
                           premature closure, and the availability and anchoring heuristics
Human/machine interface    Humans excel at pattern-recognition; computers excel at
                           calculation and vigilance; the best system mitigates the
                           shortcomings of each
Chess                      Need to concurrently manage multiple threats
                           Two patterns of attention: focus of the predator; gaze of the prey
                           Benefits of risk-free simulation

Engineering and acute care medicine

A favorite debate is whether medicine is more 'science' or 'art'. However, safe patient care could instead be understood as 'engineering'. After all, engineering means "applying the best current technical, scientific, and other knowledge to design and implement structures, machines, devices, systems, and processes to safely realize an objective" [8]. Commercial aviation is far from perfect, and there are differences between scheduled flights and unscheduled medical crises. However, aviation has achieved a log reduction in fatalities, largely by applying engineering principles. In fact, there is now one fatal crash per 4.5 million take-offs, and the most dangerous part of many a pilot's day is the airport commute, rather than the subsequent flight [9]. The same cannot be said for patients entering a hospital. An engineering approach would also mandate Standard Operating Procedures (such as protocols and checklists) and implement redundancies (such as double-checks, fail-safes, and time-outs). Engineering theory also means accepting that the complexity of the system exceeds the ability of any one individual; this means encouraging second opinions and practicing teamwork [3]. Engineering also means accepting continuous updates, and utilizing the best current information even if imperfect (i.e., "a good solution now is better than a perfect solution later"). In contrast, with our current medical model, imperfect research offers an excuse not to change. With an engineering model, near misses also represent an opportunity to improve the system, especially if they are freely discussed, and especially if all are permitted to contribute and learn. An open approach fosters a sense of responsibility and empowerment, rather than resignation.
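By way of illustration only, the hypothetical sketch below encodes one such redundancy as executable logic: a pre-procedure 'time-out' that refuses to proceed until every checklist item has been confirmed by at least two different team members. The checklist items, roles, and function names are invented for the example and are not drawn from any published protocol.

```python
# Illustrative sketch only: a minimal 'time-out' in which every checklist item
# must be confirmed by at least two different team members before proceeding.
# Item names and roles are hypothetical, not a validated protocol.

CHECKLIST = ["correct patient", "correct site", "consent signed", "allergies reviewed"]

def time_out(confirmations):
    """confirmations maps each checklist item to the set of roles that verified it."""
    for item in CHECKLIST:
        verifiers = confirmations.get(item, set())
        if len(verifiers) < 2:  # fail-safe: demand an independent double-check
            return False, f"HOLD: '{item}' still needs a second, independent check"
    return True, "Time-out complete: proceed"

ok, message = time_out({
    "correct patient":    {"surgeon", "nurse"},
    "correct site":       {"surgeon", "nurse"},
    "consent signed":     {"nurse"},              # only one verifier: the case is held
    "allergies reviewed": {"anesthetist", "nurse"},
})
print(message)  # -> HOLD: 'consent signed' still needs a second, independent check
```

The point of the sketch is that the fail-safe is built into the system rather than left to individual vigilance: the process simply cannot continue until the redundancy is satisfied.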

The goal of aviation is safe, efficient, and predictable travel from point A to point B. There is no reason why medicine should not similarly promote safe, efficient, and predictable care from A to D (admission to discharge). Aviation passengers do not mind if pilots divide their task into take-off, flight, and landing. It does patients no disservice if healthcare workers similarly divide hospital care into input, throughput, and output. Furthermore, seeing ourselves as 'product safety engineers' redefines our role as coordinating the safe transit of a patient through the system, rather than being responsible for making every minor decision, or performing every minor treatment.

Engineering and error prevention

Under the engineering model, errors are better conceptualized using a systems approach [7,10]. For example, in a typical commercial airline crash, there might be a technical problem, but this alone is rarely enough to cause a crash. The crew might also be tired, such that decision-making skills erode and things are missed that would otherwise not be. The plane might be behind schedule, adding stress and a reluctance to invest extra time in safety. In addition, many crews have not flown together, and so are unfamiliar with each other's style. The sum of these minor stresses is a team that is 'maxed out', with nothing left if adversity strikes. Most of the time they will be lucky. Some of the time they will not.

An old proverb states that "failing to plan" is "planning to fail" [11]. This is why engineers and pilots also talk about enhancing situational awareness [12,13]: identifying a discrepancy between what is happening and what should be happening is often the first indication of an error. Enhanced situational awareness promotes a proactive, rather than reactive, approach. Pilots talk about "flying ahead of the plane", because they realize that optimal crisis management begins before a crisis erupts. Defenses against error include personnel, technology, training, and administration [3,7]. Most important, however, is culture: the collective attitudes, beliefs, and values [3]. Ideally, the combined layers of defense are impermeable. In reality, there are weaknesses, and the layers are, to borrow another analogy from engineering, like slices of Swiss cheese that contain holes. Fortunately, because there are multiple layers, a single error (i.e., a single hole) does not normally cause a bad outcome. In contrast, when mishaps occur, the holes have lined up, at least momentarily [3]. This is why a minor technical problem, fatigue, or time pressure alone would rarely cause a disaster, but combined they can. In fact, when errors are dissected (whether following plane crashes, power station meltdowns, or medical mishaps), it is typical to find three or more minor issues combining to produce one major error [3].
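The arithmetic behind the analogy can be made concrete with a toy simulation. The sketch below assumes arbitrary, purely illustrative 'hole' probabilities for each layer of defense; it shows why a single hole rarely reaches the patient, and how stresses that widen every hole at once (fatigue, time pressure, unfamiliar teams) multiply the chance that the holes line up.

```python
# Toy Monte Carlo illustration of the Swiss-cheese model. The per-layer 'hole'
# probabilities are arbitrary, illustrative values, not measured data.
import random

def hazard_reaches_patient(layer_hole_probs):
    """A mishap occurs only if every defensive layer happens to have a hole."""
    return all(random.random() < p for p in layer_hole_probs)

def estimated_mishap_rate(layer_hole_probs, trials=100_000):
    hits = sum(hazard_reaches_patient(layer_hole_probs) for _ in range(trials))
    return hits / trials

random.seed(1)
print("One layer, 10% holes:        ", estimated_mishap_rate([0.10]))      # roughly 0.1
print("Four layers, 10% holes each: ", estimated_mishap_rate([0.10] * 4))  # roughly 0.0001
print("Four layers, each widened to 30% by fatigue and time pressure:",
      estimated_mishap_rate([0.30] * 4))                                   # roughly 0.008
```

Under these assumed numbers, adding independent layers cuts the mishap rate by orders of magnitude, whereas degrading every layer simultaneously erodes much of that protection, which is exactly the pattern described above.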

When an adverse event occurs, a systems approach means that corrective efforts should focus less on who was involved, and more on how it happened, why the defenses failed, and what can be done to prevent it happening in future. This contrasts with the traditional medical approach, where the focus is on assigning responsibility (the so-called 'name, blame, shame'). Traditional efforts to reduce error, however well intentioned, emphasize discipline and retraining, but ignore the context in which the error occurred [3]. This is also why they are less likely to prevent recurrence [12,13].

Understanding the basics of human error

The most common cause of commercial aviation crashes is human error [9,12-14]. The same appears to be true in acute care medicine [2,3,12-14]. Engineering therefore incorporates more than just mechanical know-how. A comprehensive strategy also means teaching situational awareness, improved communication, appropriate task distribution, and optimal teamwork [12-14]. This skill set, collectively known as Crew Resource Management, is widely taught in aviation. In contrast, medicine's equivalent, Crisis Resource Management, is rarely included in the standard medical curriculum [12-14]. Physicians, like engineers, should also be taught the basics of why errors occur if we are ever to mitigate them. What follows is a very basic introduction to the field of cognitive psychology.

The 'Gestalt effect' is the tendency to recognize whole objects or patterns instead of, for example, only seeing lines or curves [15,16]. Pattern-recognition is an essential part of our ability, and one of our greatest sources of insight [12,14]. The ability to see connections between seemingly disparate pieces of information enables our cleverest diagnoses and most innovative thought. A simple example of pattern-recognition is the way we recognize that an aging male with chest discomfort, breathlessness, and arm pain likely has an acute coronary syndrome. Early clinical training is all about pattern-recognition. Later on, we gain sufficient experience to pattern-recognize automatically, almost without thinking. Unfortunately, as with any action that involves decision-making with minimal thinking, errors can occur [17].

Pattern-recognition is essential for efficient and expeditious medical care, but it requires that we prioritize some pieces of information while downplaying others. In other words, when we look 'here', we risk missing 'there'. Most medical practitioners are familiar with the benefits of Occam's razor [18], whereby we appropriately favor the simplest sufficient explanation. However, we are less familiar with the detriments of the Law of Prägnanz, whereby we also subconsciously organize information into the simplest form possible [15,17]. We also search for patterns in order to avoid the extra effort required for complex thought or calculation. Moreover, we subconsciously process information to maintain a sense of order and a feeling of competence. We downplay contrary evidence, and are reluctant to pursue alternatives (so-called 'premature closure') [16,17]. We may even judge the likelihood of a diagnosis by how easily the idea springs to mind (the so-called 'availability heuristic') [17,19,20]. We then tend to stick with our initial assumptions (the so-called 'anchoring heuristic') [17,19,20]. This means that we tend to favor diagnoses that we are comfortable treating, overlook more serious possibilities, and even favor the excuse that it is "not my problem" [17]. Overall, an engineering approach means building systems to mitigate cognitive errors, rather than assuming they result from mere arrogance, stupidity, or sloth. For example, cockpits are now deliberately configured to be operated by two people, encouraging a system where each checks the other and offers a second input. We have yet to consider the design of acute care areas in similar terms. In the meantime, there is no reason why we could not start by modifying medical education and training.

Educating for safety

Learning from others could also change how we educate [7,10]. For example, rather than relying upon teachers to simply cover their favorite topics, with minimal attention to relevance, curricula would be deliberately matched to the goal of safer care. Routine audits would establish major problem areas (i.e., common shortfalls, or steps that require particular precision or the coordination of many people). The results would then be widely shared, rather than remaining the purview of a select few. A curriculum would then be drafted (using all relevant experts and a modified-Delphi approach) and alpha-tested in order to produce a polished product. Next, wide-scale dissemination would occur using the optimized material (i.e., beta testing) [10]. The process then begins again. In this way, educators are not merely passing facts from one generation to another, but are in fact running the patient safety laboratory (or 'crash-test site') of the modern hospital [7,10]. Accordingly, educators become important agents of change, valued as highly as good researchers or clinicians.

Maximizing the best of human and machine

As outlined above, modern hospital care mandates an understanding of both human factors and technology; understanding the interface between the two is therefore vital. The 1997 chess match between world champion Garry Kasparov and IBM's Deep Blue offers intriguing insights [21]. Kasparov (an example of the human mind) won the first game and Deep Blue (an example of technology) won the second. This demonstrates that both are capable of impressive performance. However, it is more important to look at their respective strengths and weaknesses. For example, Deep Blue was capable of evaluating 200 million positions per second, whereas Kasparov could evaluate only a handful, and overlooked certain moves when overly focused. In medicine, this inability to pick up on clues is known as a fixation error, and is a major source of error even for experienced practitioners [12,13].

Deep Blue never fatigued or succumbed to emotion; Kasparov had to be nourished and rested. Deep Blue possessed a superior opening and end-game; Kasparov could think abstractly and plan long-term strategies. Using pattern-recognition, Kasparov recognized fragments from previous games in order to choose the most appropriate few things on which to focus. When Kasparov won, he did so by maximizing the middle game, where there are too many pieces (variables) on the chessboard for a computer to calculate all possibilities. When Deep Blue won, it was through consistency, aided by impeccable memory [21].

Humans excel at pattern-recognition. In contrast, we are often poor at recognizing, or responding to, gradual deterioration. When stressed, we are particularly prone to tunnel-vision (ignoring additional clues because of excessive focus) [12,13,17]. We are also weak at calculation (11 × 24 = ?). Computers are worse at pattern-recognition, but excel at calculation and vigilance. The lesson for health care from Kasparov versus Deep Blue is to leverage each in its area of strength: humans to recognize constellations of symptoms, and computers to monitor vital signs and activate a response to gradual changes or concerning trends.
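As a hedged sketch only (the vital-sign values, alert threshold, and function names below are invented for illustration), this is the kind of vigilance task that suits a computer: quietly fitting a trend to serial readings and raising the alarm when a gradual decline crosses a pre-agreed limit.

```python
# Illustrative sketch of machine 'vigilance': fit a simple trend to recent
# vital-sign readings and flag gradual deterioration that a busy clinician
# might overlook. Readings and the alert threshold are invented examples.
from statistics import linear_regression  # Python 3.10+

def trend_alert(readings, threshold_per_hour=-2.0):
    """readings: list of (hours_elapsed, value) pairs; alerts on a sustained decline."""
    hours, values = zip(*readings)
    slope, _intercept = linear_regression(hours, values)
    return slope <= threshold_per_hour, slope

# Systolic blood pressure drifting downwards over six hours (hypothetical data)
sbp = [(0, 128), (1, 126), (2, 123), (3, 121), (4, 118), (5, 115), (6, 112)]
alert, slope = trend_alert(sbp)
print(f"Trend: {slope:.1f} mmHg/hour -> "
      f"{'ALERT: gradual deterioration' if alert else 'no alert'}")
```

No single reading in this hypothetical series would trigger a conventional threshold alarm, yet the fitted trend (roughly -2.7 mmHg per hour) would prompt an early review, which is precisely the complementarity argued for above.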

Additional insights include how Kasparov and Deep Blue's programmers learnt to mitigate their respective weaknesses. For example, Kasparov used computer chess engines to analyze positions objectively, while Deep Blue's programmers teamed up with chess masters who recommended certain strategic moves based upon their collective experience. It could be argued that both man and machine were actually 'cyborgs': functional hybrids of each other [22]. Regardless, another lesson from Kasparov and Deep Blue's programmers is that harnessing the best of the human-technology hybrid created more than the sum of its parts [21,22]. Similarly, we should learn that this is not a battle of human independence versus technological dominance, but a search for synergies in order to achieve excellence. Maximizing the best of the human and the technology is the real victory. Hopefully, the patient will be the ultimate victor.

Other lessons from the chessboard

Engineering and aviation are well known for their use of simulation as a key strategy to improve safety. However, the game of chess is probably amongst the oldest examples of simulation, and was likely developed to hone military skills [23]. Chess has, therefore, been cited by proponents to emphasize that simulation is well established, not an untested departure [23]. It is also remarkable how this archetype of simulation offers other prescient lessons for acute care medicine, even 6,000 years on.

The ability to manage concurrent threats is essential in chess and in medicine. Interestingly, it is also essential for animals throughout nature. Two classic types of attention exist [24]. The first is the predator's focused gaze. Whether this means a predator moving in for the kill, a chess player quickly capturing an opponent's queen, or a physician resuscitating a patient, there is a need to attend only to the most pressing issues, to ignore less important stimuli...and, hopefully, to know the difference. The second type of attention is a less discriminate vigilance. This is illustrated by the generalized watchfulness of prey, the caution shown during chess's opening moves, or the ability to attend to many non-acute issues during routine medical moments, such as daily rounds. In this case, there is a need to be more open to clues, to watch how others react, and to make a more measured response. Presumably, good chess players, trusted acute care clinicians, and even wild animals that live to old age possess both styles; success also means having the versatility to switch between the two.

The fact that 'play' is so widespread in both humans and animals suggests it serves an important role; otherwise, natural selection would have selected against it as a waste of scarce energy. Harmless games, like chess, may be beneficial precisely because they result in less harm: they allow practice in an environment where mistakes can be made with minimal consequences for those involved. This is presumably why play is so common in nature, and also why many medical societies now strongly endorse medical simulation [3,7]. However, compared again to other high-risk professions, medicine lags far behind [25]. Medical simulation is not yet a routine or mandated part of medical training or ongoing practice. Increasingly, the question is not why we should simulate, but why we do not.

Conclusion

If we are really serious about designing safer patient care for the future, then we should be open to lessons from all possible sources. Accordingly, the modest intent of this review was to offer insights from the profession of engineering, from the field of cognitive psychology, and even from games such as chess. The conclusion should be obvious: diverse ideas already exist and, therefore, medicine need not 'reinvent the wheel'. The question yet to be answered is whether, as a profession, we have the insight, the will, and the humility to use them. No other high-risk industry has waited for, or expected, this level of unequivocal proof before making changes [25]. That change is needed should indeed be "self-evident" [1]. Whether the increasing call for change will be "ridiculed" or "violently opposed" [1] represents the next stage in the evolution of acute care medicine and patient safety.

Competing interests

The author declares that he has no competing interests.

References

1. Arthur Schopenhauer quotes. http://www.brainyquote.com. Accessed Dec 2009.
2. Kohn LT, Corrigan J, Donaldson MS. To Err is Human: Building a Safer Health System. Washington: National Academy Press; 2000.
3. Aron D, Headrick L. Educating physicians prepared to improve care and safety is no accident: it requires a systematic approach. Qual Saf Health Care. 2002;11:168-173. doi: 10.1136/qhc.11.2.168.
4. Baker GR, Norton PG, Flintoft V. The Canadian Adverse Events Study: the incidence of adverse events among hospital patients in Canada. Can Med Assoc J. 2004;170:1678-1686. doi: 10.1503/cmaj.1040498.
5. Albert Einstein quotes. http://www.brainyquote.com/words/so/solve221543.html. Accessed Dec 2009.
6. St Pierre M, Hofinger G, Buerschaper C. Basic principles: error, complexity and human behavior. In: St Pierre M, Hofinger G, Buerschaper C, editors. Crisis Management in Acute Care Settings: Human Factors and Team Psychology in a High Stakes Environment. New York: Springer; 2008. pp. 1-16.
7. Dunn W, Murphy JG. Simulation: about safety, not fantasy. Chest. 2008;133:6-9. doi: 10.1378/chest.07-1719.
8. Engineering. Wikipedia, the free encyclopedia. http://www.en.wikipedia.org/wiki/engineering. Accessed Dec 2009.
9. Fatal airline crashes drop 65%. New York Times, October 1, 2007. http://www.nytimes.com/2007/10/01/business/01safety.html. Accessed Dec 2009.
10. Barry R, Murcko A, Brubaker C. The Six Sigma Book for Healthcare: Improving Outcomes by Reducing Errors. Chicago: Health Administration Press; 2002.
11. "Failing to plan is planning to fail". http://thinkexist.com/quotation/failing_to_plan_is_planning_to/175849.html. Accessed Dec 2009.
12. Gaba DM, Fish KJ, Howard SK. Crisis Management in Anesthesiology. New York: Churchill Livingstone; 1994.
13. Rall M, Gaba D. Human performance and patient safety. In: Miller R, editor. Miller's Anesthesia. Philadelphia: Elsevier Churchill Livingstone; 2005. pp. 3021-3072.
14. Leonard M, Graham S, Bonacum D. The human factor: the critical importance of effective teamwork and communication in providing safe care. Qual Saf Health Care. 2004;13:185-190. doi: 10.1136/qshc.2004.010033.
15. Koontz NA, Gunderman RB. Gestalt theory: implications for radiology education. Am J Roentgenol. 2008;190:1156-1160. doi: 10.2214/AJR.07.3268.
16. Gestalt psychology. Wikipedia, the free encyclopedia. http://en.wikipedia.org/wiki/Gestalt_psychology. Accessed Dec 2009.
17. Berner ES, Graber ML. Overconfidence as a cause of diagnostic error in medicine. Am J Med. 2008;121(5 Suppl):S2-23. doi: 10.1016/j.amjmed.2008.01.001.
18. Occam's razor. Wikipedia, the free encyclopedia. http://en.wikipedia.org/wiki/Occam's_razor. Accessed Dec 2009.
19. Schwab AP. Putting cognitive psychology to work: improving decision-making in the medical encounter. Soc Sci Med. 2008;67:1861-1869. doi: 10.1016/j.socscimed.2008.09.005.
20. Elstein AS. Heuristics and biases: selected errors in clinical reasoning. Acad Med. 1999;74:791-794. doi: 10.1097/00001888-199907000-00012.
21. Miah A. A Deep Blue grasshopper: playing games with artificial intelligence. In: Hale B, editor. Philosophy Looks at Chess. Chicago: Open Court; 2008. pp. 13-24.
22. Hartmann J. Garry Kasparov is a cyborg, or what ChessBase teaches us about technology. In: Hale B, editor. Philosophy Looks at Chess. Chicago: Open Court; 2008. pp. 39-64.
23. Rosen KR. The history of medical simulation. J Crit Care. 2008;23:157-166. doi: 10.1016/j.jcrc.2007.12.004.
24. Proctor RN. Agnotology: a missing term to describe the cultural production of ignorance (and its study). In: Proctor RN, Schiebinger L, editors. Agnotology: The Making and Unmaking of Ignorance. Stanford: Stanford University Press; 2008. pp. 1-36.
25. Gaba DM. Improving anesthesiologists' performance by simulating reality. Anesthesiology. 1992;76:491-494.
