The Permanente Journal
2017 Jun 14;21:16-181. doi: 10.7812/TPP/16-181

To Err is Human: Can American Medicine Learn from Past Mistakes?

Jeffrey B Ritterman 1
PMCID: PMC5478585  PMID: 28633732

Abstract

The history of medicine includes many errors. Some persisted for decades and caused great harm. Several are highlighted in this article, including the mythical thymic diseases: thymic asthma and status thymicolymphaticus. Some medical mistakes, such as the diet-heart hypothesis of Ancel Keys, continue to cause harm. To avoid future errors and their associated harm, I suggest a cultural shift encouraging professional humility and greater questioning of medical dogma. Medical education focused on teaching students this history may help with this cultural shift.

INTRODUCTION

During my medical training, we were taught that stress and lifestyle factors caused gastritis and peptic ulcer disease. We accepted without question the idea that bacteria could not live in the highly acidic environment of the stomach. Patients with severe ulcer disease would be offered surgery. We now know, thanks to the pioneering work of Marshall and Warren,1 that peptic ulcer is caused by a bacterium, Helicobacter pylori.

Warren discovered the curved bacteria in the stomachs of patients with peptic ulcer disease and gastritis in 1979.2 But it wasn’t until his research partner, Marshall, deliberately infected himself with the bacterium and developed gastritis that their findings were taken seriously.

Marshall’s ability to take a fresh look at these gastric bacteria as etiologic agents, rather than uncritically accepting the stress theory of ulcer disease, stemmed in part from his lack of experience. Having begun his study of gastroenterology in 1981, Marshall had an easier time than more seasoned researchers in overcoming a “set of well entrenched beliefs that conflicted with the new ideas.”3

It took a generation for Marshall and Warren’s pioneering work to be recognized. They first published their findings on H pylori in 1984. More than a decade later, in 1995, only 5% of American physicians were prescribing antibiotics for treatment of peptic ulcer disease.3 In 2005, Marshall and Warren received the Nobel Prize in Physiology or Medicine for their discovery, 26 years after Warren first observed H pylori.2

This problem of mistaken ideas persisting despite scientific evidence to the contrary has been present since the onset of the scientific method. In 1633, Galileo was sentenced to house arrest for the crime of proclaiming that the sun, not the earth, was the center of our planetary system.4

Three hundred years later, Nobel prize-winning physicist Max Planck5 stated: “A new scientific truth does not triumph by convincing its opponents and making them see the light, but rather because its opponents eventually die, and a new generation grows up that is familiar with it.”

Or more succinctly: “Science advances one funeral at a time.”6

This problem is of particular concern in medical science, where outmoded ideas translate into excess morbidity and mortality. How can medicine learn from its mistakes and make these timely corrections? Perhaps a few additional examples will help make clear the importance of doing so.

A CAUTIONARY TALE: SUDDEN INFANT DEATH SYNDROME AND THE “ENLARGED” THYMUS GLAND

In the first half of the 19th century, physicians were becoming alarmed by sudden infant death syndrome (SIDS). Healthy infants would be put to bed and found dead in the morning. In 1830, pathologists noted that SIDS-affected infants had enlarged thymus glands compared with “normal” autopsy specimens.7 It seemed logical to conclude that these “enlarged” glands were in some way responsible for the deaths.

In 1830, Kopp introduced the term thymic asthma, suggesting that the “enlarged” thymus occluded the trachea.8 The existence of this fictitious disease was quickly and widely accepted, and it persisted for at least a century. The thymic syndrome was further modified by the Austrian physician Paltauf, who added the term status thymicolymphaticus to the medical lexicon in 1889.8 Paltauf believed that a systemic disorder leading to vascular collapse caused the sudden deaths. The enlarged thymus, it was believed, caused this unexplained vascular collapse, often precipitated by minor stress.

Descriptions and case reports of these thymus “diseases” appeared in medical articles and textbooks.9,10 There was even a list of physical characteristics that accompanied these syndromes, including changes in incisor teeth, heart size, and skin color. The 1924 edition of Management of the Sick Infant claimed that the clinical picture of thymic asthma was “so characteristic that once seen, it is unlikely to be mistaken.”8

If an enlarged thymus was leading to sudden infant death, removal of the thymus might be of preventive value. Radiology had advanced to the point at which physicians began making the diagnosis of thymic enlargement from x-ray films. After radiographic diagnosis, thymectomy was initially recommended, but the mortality rate was unacceptably high. Thymus irradiation became the treatment of choice.8

The first “successful” use of irradiation to shrink the thymus was reported by Friedländer in 1907.11 Thousands of children eventually received radiation to prevent status thymicolymphaticus. Some physicians advocated prophylactic irradiation for all neonates.8

There was only one slight problem. It turned out to be deadly.7

The cadavers used by anatomists to determine the “normal” thymus size were from the poor, most having died of highly stressful chronic illnesses such as tuberculosis, infectious diarrhea, and malnutrition. What was not appreciated at the time was that chronic stress shrinks the thymus gland. The “normal” thymus glands of the poor were abnormally small. Here is where the fatal mistake occurred: because the autopsied thymus glands of the poor were regarded as normal in size, the SIDS-affected infants were erroneously believed to have thymic enlargement.7,8

The thyroid gland, which is highly sensitive to irradiation, sits close to the thymus. The increased risk of thyroid malignancy in the patients who had undergone thymic irradiation was first recognized in 1949.12 The patients subjected to thymic radiation “therapy” also experienced higher rates of breast cancer.13–15

The regular practice of thymic irradiation was finally halted in the 1940s, almost four decades after Friedländer irradiated the first patient. In the first edition of his radiology textbook in 1945,16 John Caffey, MD, a pioneer in pediatric radiology, proclaimed that “a causal relationship between hyperplasia of the thymus and sudden unexplained death has been completely refuted. … [I]rradiation of the thymus … is an irrational procedure at all ages.”16

More than 10,000 deaths caused by thyroid cancer resulted from this treatment.7

Rudolf Virchow, the father of cellular pathology, a man who stood at the top of the academic medical world for 50 years, was one of those who endorsed the mistaken therapy.7 Virchow, the man who first explained the pathophysiology of pulmonary embolus, the man who named leukemia, and a founder of social medicine, got it wrong!17

A CAUTIONARY TALE: FAT

Perhaps there is no better modern medical example of our capacity for serious error than the fact that we have given the wrong dietary advice since shortly after President Eisenhower’s heart attack in 1955. Not only has our advice been wrong, it has been dangerously wrong.18

As in the case of the supposed thymic disorders, once again a mistake has led to great harm.

Ancel Keys, PhD, a physiologist, studied American and European diets after World War II. Examining the epidemiology of cardiovascular disease (CVD), he noted that American business executives had high rates of CVD,19,20 whereas heart disease rates in postwar Europe had fallen sharply, presumably because of reduced food supplies. He postulated that these different rates of CVD resulted from markedly different rates of dietary fat consumption. Keys was convinced that dietary fat led to elevated cholesterol levels, which then caused CVD.21 He presented his diet-heart hypothesis to the World Health Organization in 1955. His research was epidemiologic and could prove only an association, not causality. But Keys was a convincing salesman at a time when the country was searching for ways to prevent the sudden deaths caused by this newly recognized killer. In January 1961, Keys became a cultural hero, his picture gracing the cover of Time Magazine, and the diet-heart hypothesis was widely accepted.22

In 1978, Keys published his data in support of dietary fat as the cause of CVD, in the Seven Countries Study.23 Unfortunately, he excluded data from 15 countries and 4 indigenous tribes that did not fit well with his hypothesis.24

While Keys was proposing dietary fat as the cause of CVD, Brown and Goldstein were advancing our understanding of cholesterol and fatty acid metabolism, work for which they received the Nobel Prize in 1985.25 Working with skin cells from patients with a rare genetic disorder, familial hypercholesterolemia, Brown and Goldstein25 demonstrated the presence of the low-density lipoprotein (LDL) cholesterol receptor. Patients with the disorder lacked the normal number of receptors, had high serum cholesterol levels, and were at high risk of heart attack early in life. The new knowledge seemed to fit well with Keys’ diet-heart hypothesis. Because LDL cholesterol correlated with the risk of CVD and dietary fat increased blood LDL cholesterol levels, it seemed logical to conclude that dietary fat was the cause of CVD.

Once again, incomplete knowledge led down a dangerous path. In the case of the dietary guidelines, epidemiologic research that showed an association was wrongly assumed to prove causality. In addition, evidence contrary to Keys’ diet-heart hypothesis was ignored. There never was any association between dietary fat and all-cause mortality; if dietary fat were the cause of CVD, one would certainly expect such an association. In the single randomized controlled trial that compared a 10% saturated fat intake with a diet of unrestricted saturated fat, the subjects with low fat intake had a higher death rate from all causes, including heart disease.26

In 1977, the McGovern Commission, chaired by then Senator George McGovern, issued dietary guidelines in keeping with the diet-heart hypothesis.27 Decades later, we have continued to follow these guidelines.28 Americans have been repeatedly told to consume no more than 30% of total calories from fat and no more than 10% from saturated fat.28

When the food companies responded to the guidelines by removing the fat from food, the taste went with it. The solution: add sugar, and lots of it. This worked well economically, as the invention of high-fructose corn syrup provided an endless supply of cheap sugar. The result of admonishing people to eat less fat was that sugar consumption skyrocketed.24,29,30 This substitution of sugar for fat has been the major driver of the diabetes epidemic31–33 and has played a key role in causing coronary heart disease,34–36 strokes,37 fatty liver disease,38 obesity,39 hypertension,40 and some cancers.41 In addition, as Americans began avoiding fat, they also increased their intake of simple starches. Like sugar, diets high in refined starches are associated with an increased risk of obesity, CVD, and Type 2 diabetes.42–44

Now the so-called “French paradox” makes sense.45,46 People in France eat a high-fat diet but do not have correspondingly high rates of CVD. It isn’t a paradox. There simply is no connection between CVD and dietary fat.

Many physicians continue to warn their patients to avoid dietary fat despite accumulating evidence that refined carbohydrates cause metabolic syndrome and its related illnesses. In 2015, the Dietary Guidelines Advisory Committee Report47 for the first time began to change course and to exonerate fat and saturated fat. Instead, the report focuses our attention on fructose and other simple carbohydrates as the real culprits in diet-related illness. It took 100 years for the faux thymic conditions to be recognized as a gross medical error. How many more years will it take before we correct our mistaken dietary advice?

EMBRACING PROFESSIONAL HUMILITY

During a leadership training session that I attended, a National Aeronautics and Space Administration (NASA) scientist explained that the July 1969 Apollo Mission to the moon was on the ideal flight path only 3% of the time. Great achievements depend not on perfection, but on our ability to quickly notice when we are off course and to make adjustments.

As a profession, we have failed miserably to notice that we were terribly off course in both the fictitious thymic disease tragedy and the dietary guidelines mishap. In the first instance, the error persisted for more than 100 years; in the second, for many decades. In each case, innumerable people were harmed, and many died.

To prevent similar tragedies in the future, we will need a cultural shift in medicine. Coulehan48 has critiqued our present medical culture as “characterized by arrogance and entitlement.” Berger49 pointed out that the arrogance goes beyond the individual physician and is systemic:

The physician has become a “provider” and the patient a “health consumer.” This distancing of the doctor from the patient breeds a kind of “system arrogance,” in which the patient is no longer seen as a human being but simply as a job to be done cost-effectively.

The late Franz Ingelfinger,50 former editor of the New England Journal of Medicine, stated: “Efficient medical practice, I fear, may not be empathic medical practice, and it fosters, if not arrogance, at least the appearance of arrogance.”

If the toxin is professional arrogance, the antidote is professional humility.

One area in health care in which we have witnessed a cultural shift is in our understanding of how to provide competent care to patients from different backgrounds. Tervalon and Murray-Garcia51 have challenged us to go beyond “cultural competency” and to embrace “cultural humility.” They explain:

… cultural competence in clinical practice is best defined not by a discrete endpoint but as a commitment and active engagement in a lifelong process that individuals enter into on an ongoing basis with patients, communities, colleagues, and with themselves. ... It is a process that requires humility as individuals continually engage in self-reflection and self-critique as lifelong learners and reflective practitioners.

The underlying principle is that, given the great diversity of cultural practices and beliefs, humility is the appropriate mindset. Practitioners should be humble enough “to say that they do not know when they truly do not know and to search for and access resources … .”51 The practitioner is both a teacher and a student.

This model holds for the general practice of medicine as well. Humility is both a personal virtue and a professional necessity. Personal humility is essential for good doctoring.52–55 Professional humility promotes the questioning of medical dogma, leading to the scientific testing of hypotheses.

William Osler,56 considered by many the father of American Medicine, addressed the question of humility in a 1906 lecture to medical students at the University of Minnesota:

In these days of aggressive self-assertion, when the stress of competition is so keen and the desire to make the most of oneself so universal, it may seem a little old-fashioned to preach the necessity of this virtue, but I insist for its own sake and for the sake of what it brings, that a due humility should take the place of honour on the list [of virtues] … since with it comes not only reverence for truth, but also proper estimation of the difficulties encountered in our search for it. … [T]his grace of humility is a precious gift.

The more humble the medical profession is, the more likely we are to avoid costly errors.

To facilitate this cultural shift, we will need to unlearn old behaviors and replace them with new ones. This will require a major re-education effort for those already in practice, and the development of a robust curriculum to reach those in training. To be successful, we will need to have an impact on all layers of the medical hierarchy, including nonphysician health care workers, students, physicians-in-training, and those in positions of authority.

Our aim must be to create a safe learning environment in which questions and alternative points of view are encouraged. The curriculum in medical and allied health professional schools should include courses on medical history that highlight past medical errors and stress the importance of questioning current medical practice.57 Medical and allied health professional students should be required to research an area of medical care to determine whether current practices are consistent with the latest medical science.

Continuing medical education courses should be developed to reach those who have already completed their formal medical education. When it became clear that physicians in practice were not well educated in end-of-life care and in pain management, training in both areas became mandatory for medical license renewal. We can do the same for professional humility.

It will be crucial to this effort for the leaders in American medicine to embrace this cultural shift. Those in authority must be open to new ideas, even if those ideas challenge paradigms associated with their own success. Medical students and physicians-in-training will find it much easier to raise important questions if they feel encouraged to do so.

Would the terrible health outcomes from thymus irradiation have been avoided if a medical student had felt empowered to ask, “Dr Virchow, are we sure that the thymus gland is abnormally enlarged in infants with SIDS?”

MUCH LABOUR AND TIME

In medicine (what men are scarcely aware of until they become somewhat severely practical), it requires as much labour and time fairly to lay hold of an error, and uproot it, and have done with it, as to learn and settle a truth, and abide by it.

— Peter Mere Latham, MD, 1789–1875, British physician and medical educator, physician extraordinary to Queen Victoria

Acknowledgment

The author thanks Vivien Feyer for editorial assistance and Charlie Clements, MD, for suggesting the inclusion of the Helicobacter pylori story.

Kathleen Louden, ELS, of Louden Health Communications provided editorial assistance.

Footnotes

Disclosure Statement

The author has no conflicts of interest to disclose.

References

