Croatian Medical Journal. 2008 Oct;49(5):689–692. doi: 10.3325/cmj.2008.5.689

Pilot, Swiss Cheese, and Cash Machine

Karmen Lončarek
PMCID: PMC2582364  PMID: 18925705

There is hardly a topic as odious to doctors as that of their own errors. This attitude stems from the prejudice that medical staff should be infallible and, more broadly, from today’s culture, which discourages admitting mistakes. The prejudice is instilled in health practitioners as early as medical school, where students absorb the belief: “If I study and work diligently enough, I will not make a mistake; if I have made one, it means I did not learn or work diligently enough.”

Whoever acts makes mistakes. To err is human. Mistakes in medicine, although sometimes fatal, are simply unavoidable.

Of course, there are bad apples among doctors too, although their number is negligible. The majority of medical errors are errors of the system; purely individual errors are rare. In other words, a doctor may be the most experienced expert, doing everything in his or her power, and still make a mistake.

What can a doctor learn from a cash machine?

How can it happen that someone who is perfectly capable of performing a task reliably, who has performed the same task many times before, and who is well aware of the damaging consequences of a mistake, still makes one? To illustrate, take an example everyone is familiar with – the automatic teller machine. There are two types of these machines: those that first give you the money and then the card, and those that give the money only after the user has pulled out the card. Even though the vast majority of cash machine users are fully capable of handling the task, have done it many times, and are aware of the grave consequences of a possible mistake, they still keep forgetting their cards in the machines that first give the money and then the card.

That is why machines that first give the money are gradually being replaced by the other type. This is an example of a system correcting an error that was built into it. A human being, however, is not a machine, and it is extremely difficult to change human nature so that people stop making mistakes. Nevertheless, it is possible to build systems that reduce the chance of a mistake being made. Let us first consider the data on the frequency of medical errors.

Epidemiology of errors

In 1999, the Institute of Medicine of the US National Academy of Sciences estimated that as many as 98 000 patients die of medical errors in American hospitals alone (1); that is, more patients die of errors than of breast cancer or AIDS. A further 1 million patients sustain some harm.

A 1995 study of 14 000 patients in 28 Australian hospitals showed that adverse events or harm occurred in 17% of cases, causing permanent health damage in 1 in 7 of the affected patients, while 1 in 20 died (2). More than half of these events could have been avoided. Another study, in the USA, found medication errors (with or without consequences) in more than 7% of all cases (3). Health care workers in one Israeli intensive care unit were shown to make on average 1.7 errors per patient per day (4). Medical errors increased the average cost per hospital patient by US $2300 and prolonged hospital stays by two days on average; other studies have found costs higher by US $4700 and stays longer by 5 days (5).

Errors occur most often in surgery (most probably because there they become evident very quickly), in the emergency department (because it demands instant action), when new methods are being introduced, and among inexperienced doctors.

Of course, medical errors do not happen only in hospitals, but also in outpatient clinics, home care, nursing homes, and pharmacies. American research found drug-related complications in 18% of patients (6); other studies have estimated that three out of four such complications are avoidable.

The culture of insecurity

The safety culture in health systems is far behind that in other high-risk activities, such as aviation, nuclear power plants, or oil platforms. Health care has a great deal to learn from them.

As already said, medical professionals are trained in the belief that with enough personal effort they can become infallible in their work, a belief based on the assumption that the main cause of errors lies hidden in the person. Doctors very often deny that exhaustion, stress, overload, and time pressure increase the risk of error. Pilots are also prone to overestimate their capabilities under similar conditions, but their training includes methods for recognizing these conditions and coping with them (7).

Although it is often said that doctors are only human, technological wonders, the extreme precision of laboratory tests, and inventions that allow illness to be visualized have created “great expectations” of absolute perfection. In such an atmosphere, patients and doctors alike deny that medical errors are possible. Hospital administrations tend to treat each error as an anomaly to be solved by blaming an individual, while promising that such an error will never happen again. Paradoxically, such reactions reduce the chance of discovering the systemic causes of the error and of improving the system.

Crashing and burying

Contemporary medical culture is not a culture of safety but a culture of guilt and blame. Generally speaking, it discourages open dialogue about errors: errors are discussed unwillingly, and such discussions are met with reluctance. There are also powerful hierarchical obstacles – nurses, medical students, trainee specialists, and junior doctors often avoid telling a senior doctor that he or she has made a mistake, even when there is an evident threat to the patient’s safety.

Aviation workers, by contrast, are trained to detect and anticipate errors through simulations of risky situations, which enables them to discuss errors quite openly with superiors and other crew members (8). It is thus unthinkable that a flight attendant or a lower-ranking crew member would hold their tongue about an obvious error made by the pilot. It is almost equally unthinkable that a scrub nurse would point out the surgeon’s obvious mistake during an operation. Some would say the motivations of the doctor and the pilot differ: the pilot crashes together with his mistake, while the doctor buries his.

The Swiss cheese

Although the immediate cause of an incident is a particular error or omission, a more detailed analysis usually reveals a chain of events and deviations from safe procedures, each of them in one way or another a product of a broader organizational context – a complex picture. Such a view is only beginning to be accepted in health care and is unfortunately still rarely applied in the investigation of individual incidents. Research has shown that one fifth of the total responsibility lies with the individual and four fifths with the system (1). In most cases, both kinds of causes are present. Their combination can be illustrated by the Swiss cheese model (9). If we cut the cheese into slices and stack them at random, it is most likely impossible to see through all the slices at once. With a little patience, however, the holes can be aligned so that one can see straight through the whole stack.
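
The protective logic of the model can be put in simple numbers – a minimal sketch not taken from the article, which assumes k independent defensive layers, each of which happens to have a “hole” at a given moment with the same probability p; an incident requires all the holes to line up at once:

% Illustrative arithmetic only: the independence assumption and the value p = 0.1
% are hypothetical, not data from the cited studies.
\[
P(\text{incident}) \;=\; p_1 p_2 \cdots p_k \;=\; p^{k} \quad (\text{if } p_i \equiv p),
\qquad p = 0.1:\;\; k = 4 \Rightarrow P = 10^{-4}, \quad k = 3 \Rightarrow P = 10^{-3}.
\]

In this toy calculation, removing a single layer of defense multiplies the risk tenfold, which is why weakening several barriers at once – fatigue, silence in the hierarchy, incompatible equipment – is so much more dangerous than any single lapse.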

Similarly, a hole opens in the safety system when individual and systemic elements line up in a particular way. Here is an example from the United Kingdom. An eight-year-old boy died during a simple eardrum operation: the day before, the hospital’s technical service had changed the temperature probe connectors without informing the anesthesiologists; the new temperature probes were not compatible with the monitor; the anesthesiologist was exhausted and fell asleep during the operation; the nurses did not dare to wake him because they feared a confrontation; and the surgeon noticed that the patient was breathing five times faster than normal, but went on operating. The boy died a few hours later of intraoperative hypothermia, caused by a banal malfunction of the heater on the breathing tube – something the exhausted anesthesiologist did not notice and the instruments did not register (10).

Systemically arranged holes

The immediate responsibility, of course, lies with the negligent anesthesiologist. But what about his superior, who put him to work in such an exhausted state; the technical service, which did not tell him it had installed incompatible connectors; the nurses, who hesitated to wake him; or the surgeon, who went on operating although he noticed the patient’s condition worsening? It is far easier, and emotionally more satisfying, to blame an individual than the system. The system is diffuse and amorphous; it is difficult to grasp as a whole or to direct emotion toward. Still, the system contributed a great deal to arranging all the holes in the Swiss cheese so that a patient died during a simple ear operation.

The dogma of infallibility

In Croatia, the Croatian Medical Chamber has made an arrangement with a law firm to defend physicians against charges related to mistakes at work. Such an undertaking deserves praise, but it is no less important to establish a service, such as exists in New Zealand and the Scandinavian countries (11,12), that would collect anonymous reports of situations in which grave omissions nearly happened – not to press charges, but to study such situations and discover the mechanisms that brought them about.

Some developed countries have a system of “no-fault compensation,” which ensures financial compensation for most patients without charges being pressed against medical workers. Although this form of compensation may seem designed to protect negligent doctors, it in fact protects the harmed patients. The legal system is not very efficient at compensating them: only 6% of harmed patients receive substantial compensation, even though three fifths of them go to court.

Modern medicine has its myths, prejudices, and dogmas, and one of them is the belief that an individual’s utmost diligence must be enough to prevent him or her from making mistakes. The truth is quite different: whoever works makes mistakes, especially since doctors are human beings rather than machines and medicine is not an entirely exact science. But this does not mean that one cannot learn from one’s mistakes. Exaggerated fear of litigation and the reluctance to admit errors prevent medicine from learning from them, and thus prevent the development of patient safety.

References

1. Kohn L, Corrigan J, Donaldson M. To err is human. Washington: National Academy Press; 2000.
2. Wilson RM, Runciman WB, Gibberd RW, Harrison BT, Newby L, Hamilton JD. The quality in Australian health care study. Med J Aust. 1995;163:458–71. doi: 10.5694/j.1326-5377.1995.tb124691.x.
3. Bates DW, Cullen DJ, Laird N, Petersen LA, Small SD, Servi D, et al. Incidence of adverse drug events and potential adverse drug events. Implications for prevention. ADE Prevention Study Group. JAMA. 1995;274:29–34. doi: 10.1001/jama.274.1.29.
4. Donchin Y, Gopher D, Olin M, Badihi Y, Biesky M, Sprung CL, et al. A look into the nature and causes of human errors in the intensive care unit. Crit Care Med. 1995;23:294–300. doi: 10.1097/00003246-199502000-00015.
5. Bates DW, Spell N, Cullen DJ, Burdick E, Laird N, Petersen LA, et al. The costs of adverse drug events in hospitalized patients. Adverse Drug Events Prevention Study Group. JAMA. 1997;277:307–11. doi: 10.1001/jama.277.4.307.
6. Gandhi TK, Burstin HR, Cook EF, Puopolo AL, Haas JS, Brennan TA, et al. Drug complications in outpatients. J Gen Intern Med. 2000;15:149–54. doi: 10.1046/j.1525-1497.2000.04199.x.
7. Sexton JB, Thomas EJ, Helmreich RL. Error, stress, and teamwork in medicine and aviation: cross sectional surveys. BMJ. 2000;320:745–9. doi: 10.1136/bmj.320.7237.745.
8. Barach P, Small SD. Reporting and preventing medical mishaps: lessons from non-medical near miss reporting systems. BMJ. 2000;320:759–63. doi: 10.1136/bmj.320.7237.759.
9. Reason J. Human error: models and management. BMJ. 2000;320:768–70. doi: 10.1136/bmj.320.7237.768.
10. Helmreich RL. On error management: lessons from aviation. BMJ. 2000;320:781–5. doi: 10.1136/bmj.320.7237.781.
11. Bismark M, Dauer E, Paterson R, Studdert D. Accountability sought by patients following adverse events from medical care: the New Zealand experience. CMAJ. 2006;175:889–94. doi: 10.1503/cmaj.060429.
12. Erichsen M. The Danish patient insurance system. Med Law. 2001;20:355–69.
