Everyone makes mistakes. But our reliance on cognitive processes prone to bias makes treatment errors more likely than we think.
Psychologists have studied the cognitive processes involved in decision making extensively and have identified many factors that lead people astray. Because doctors' decisions have profound effects on their patients' health, these decisions should be of the best possible quality. All doctors should therefore be aware of possible pitfalls in medical decision making and take steps to avoid these unnecessary errors. In this article, I present five examples of cognitive biases that can affect medical decision making and offer suggestions for avoiding them.
Psychology of decision making
Doctors often have to make rapid decisions, either because of a medical emergency or because they need to see many patients in a limited time. Psychologists have shown that rapid decision making is aided by heuristics—strategies that provide shortcuts to quick decisions—but they have also noted that these heuristics frequently mislead us.1 Good decision making is further impeded by the fact that we often fall prey to various cognitive biases.
To make correct decisions in clinical practice, doctors must first gather information on which to base their judgments. According to the decision making experts Russo and Schoemaker,2 the best way to do this is to ask the most appropriate questions, to interpret the answers properly, and to decide when to stop searching for further information. Straightforward though this sounds, misleading heuristics and cognitive biases create pitfalls throughout this process.
Doctors may believe that, as highly trained professionals, they are immune to these pitfalls. Unfortunately, they are just as prone to errors in decision making as anyone else.3-5 Even worse, the people who are most prone to cognitive biases often believe that they are good decision makers.2 As Shakespeare put it, “The fool doth think he is wise, but the wise man knows himself to be a fool.”w1 Studies based on both simulated cases and questionnaires show that doctors are susceptible to decision making biases,6,7 including insensitivity to known probabilities,7 overconfidence,w2 a failure to consider other options,w3 the attraction effect,w4 and the availability heuristic.w5 The good news is that training in recognising these dangers can reduce the probability of flawed medical decision making.w6
Pitfall 1: the representativeness heuristic
The representativeness heuristic is the assumption that something which resembles the members of a certain category is itself a member of that category. Kahneman and Tversky demonstrated this heuristic in a classic experiment in which they presented participants with descriptions of people drawn from a fictitious group of 30 engineers and 70 lawyers (or vice versa).8 The participants then rated the probability that the person described was an engineer. Their judgments were driven far more by how closely the description matched the stereotype of an engineer (for example, “Jack is conservative and careful”) than by the base rate information (only 30% were engineers): representativeness outweighed knowledge of the probabilities.
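Framed in Bayes' terms, the experiment shows how far intuition drifts from what the base rate allows. The sketch below is purely illustrative: the likelihood ratio of 4 for the stereotypical description is an assumed figure, not one reported in the study.

```python
# Illustrative sketch of the engineer/lawyer task in Bayes' odds form.
# The likelihood ratio is assumed; the original study did not report one.

def posterior_engineer(base_rate: float, likelihood_ratio: float) -> float:
    """P(engineer | description), given the base rate of engineers and the
    ratio P(description | engineer) / P(description | lawyer)."""
    prior_odds = base_rate / (1 - base_rate)
    posterior_odds = prior_odds * likelihood_ratio
    return posterior_odds / (1 + posterior_odds)

# Suppose "conservative and careful" is four times as likely to describe an
# engineer as a lawyer (assumed figure).
print(f"{posterior_engineer(0.30, 4.0):.2f}")  # 0.63: far from the near certainty participants reported
```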
The representativeness heuristic has also been shown in nursing. Nurses were given two fictitious scenarios of patients with symptoms suggestive of either a heart attack or a stroke and asked to provide a diagnosis.9 The heart attack scenario sometimes included the additional information that the patient had recently been dismissed from his job, and the stroke scenario sometimes included the information that the patient's breath smelt of alcohol. The additional information had a highly significant effect on the diagnosis and made it less likely—consistent with the representativeness heuristic—that the nurses would attribute the symptoms to a serious physical cause. The effect of the additional information was similar for both qualified and student nurses, suggesting that training had little effect on the extent to which heuristics influenced diagnostic decisions.
How can we avoid being led astray by the representativeness heuristic? The key is to consider not only how likely a particular event (such as a stroke) is given the situational information (such as alcohol on the breath), but also how likely it is in the absence of that information. In other words, it is important to know the base rates at which conditions occur and to avoid giving too much weight to any one piece of information. By the same token, if a disease is extremely rare, it may still be an unlikely diagnosis even when a patient has the typical signs and symptoms of that disease.
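To make the rare disease point concrete, here is a minimal sketch assuming illustrative figures for prevalence, sensitivity, and specificity; none of these numbers come from the article.

```python
# Positive predictive value for a rare disease (all figures assumed).
# Even a presentation that is highly sensitive and specific for the disease
# yields a low probability of disease when the base rate is 1 in 10,000.

def positive_predictive_value(prevalence: float, sensitivity: float,
                              specificity: float) -> float:
    true_positives = prevalence * sensitivity
    false_positives = (1 - prevalence) * (1 - specificity)
    return true_positives / (true_positives + false_positives)

ppv = positive_predictive_value(prevalence=0.0001, sensitivity=0.99,
                                specificity=0.95)
print(f"{ppv:.1%}")  # 0.2%: the rare disease remains an unlikely diagnosis
```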
Pitfall 2: the availability heuristic
When we use the availability heuristic, we place particular weight on examples of things that come to mind easily, perhaps because they are easily remembered or recently encountered. In general, this guides us in the right direction, as things that come to mind easily are likely to be common, but it may also mislead. The availability heuristic is apparent after a major train crash, when some people choose to travel by car instead of by rail, in the incorrect belief that it is safer.w7
In the medical setting, one study asked doctors to judge the probability that medical inpatients had bacteraemia. The probability was judged to be significantly higher when doctors had recent experience of caring for patients with bacteraemia.10 Another example is the documented tendency of doctors to overestimate the risk of addiction when prescribing opioid analgesics for pain relief and to undertreat severe pain as a result.11-13 w8-w11 The risk of addiction is actually low when patients receive opioids (particularly controlled release formulations) for pain,14,15 but opioid addiction tends to receive high publicity and so—through the availability heuristic—its likelihood may be overestimated.
To avoid falling prey to the availability heuristic, doctors should try to be aware of all the diverse factors that influence a decision or diagnosis. They should ask if their decision is influenced by any salient pieces of information and, if so, whether these pieces of information are truly representative or simply reflect recent or otherwise particularly memorable experiences. Knowing whether information is truly relevant, rather than simply easily available, is the key.
Rules for good decision making
Be aware of base rates
Consider whether data are truly relevant, rather than just salient
Seek reasons why your decisions may be wrong and entertain alternative hypotheses
Ask questions that would disprove, rather than confirm, your current hypothesis
Remember that you are wrong more often than you think
Pitfall 3: overconfidence
To use our knowledge effectively, we must be aware of its limitations. Unfortunately, most of us are poor at assessing the gaps in our knowledge, tending to overestimate both how much we know and how reliably we know it (see bmj.com for an example). Research has shown that almost all of us are more confident about our judgments than we should be. Because medical diagnosis typically involves some uncertainty, it follows that almost all doctors make more diagnostic mistakes than they think they do. Overconfidence also comes into play when doctors rate their clinical skills: Larue et al found that both primary care doctors and medical oncologists rated their ability to manage pain highly, even though they had serious shortcomings in their attitudes toward and knowledge of pain control.16
The dangers of overconfidence are obvious. Doctors who overestimate how well they manage a condition may continue to prescribe suboptimal treatment, unaware that their management could be improved. Overconfidence in diagnostic ability may also lead to a diagnosis being made too hastily, when further tests are needed. It is critical, therefore, to be aware of the limits of your knowledge and to keep that knowledge up to date. Awareness of your shortcomings makes it more likely that you will gather further information. It can also be helpful to make a habit of seeking the opinions of colleagues.17
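One concrete safeguard is a calibration check: record diagnostic judgments together with a stated confidence, then compare each confidence level with the proportion of judgments that turned out to be correct. The sketch below uses hypothetical records; overconfidence shows up as actual accuracy falling below stated confidence.

```python
# Calibration check on hypothetical (confidence, correct?) records.
from collections import defaultdict

judgments = [  # stated confidence, whether the diagnosis proved correct (hypothetical)
    (0.9, True), (0.9, False), (0.9, True), (0.9, False),
    (0.7, True), (0.7, False), (0.7, False),
    (0.5, True), (0.5, False),
]

buckets: dict[float, list[bool]] = defaultdict(list)
for confidence, correct in judgments:
    buckets[confidence].append(correct)

for confidence in sorted(buckets, reverse=True):
    outcomes = buckets[confidence]
    accuracy = sum(outcomes) / len(outcomes)
    print(f"stated {confidence:.0%} -> actual {accuracy:.0%} (n={len(outcomes)})")
# A well calibrated doctor's actual accuracy matches the stated confidence.
```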
Pitfall 4: confirmatory bias
Confirmatory bias is the tendency to look for, notice, and remember information that fits with our pre-existing expectations. Similarly, information that contradicts those expectations may be ignored or dismissed as unimportant.1,2 Confirmatory bias has been shown to affect peer-reviewers' assessments of manuscripts. Mahoney sent fictitious manuscripts with identical methods but different results to reviewers.18 Reviewers gave significantly better ratings to the methods section when the results supported their pre-existing beliefs.
Once again, doctors are not immune to confirmatory bias. In taking medical histories, doctors often ask questions that elicit information confirming their early judgments. Even worse, they may stop asking questions altogether once they have reached an early conclusion, thus failing to unearth key data. More generally, the interpretation of information obtained towards the end of a medical work-up may be biased by earlier judgments.19
Confirmatory bias can also lead to treatment errors. It is natural to expect that the drug you are about to administer is the correct one. Apparently obvious evidence that you have the wrong drug—for example, a label marked ephedrine instead of the expected epinephrine—may be ignored or misinterpreted to confirm your expectation that the drug is correct.20
Summary points
Psychologists have extensively studied the cognitive processes involved in making decisions
Heuristics and biases that lead to poor decisions are widespread, even among doctors
Awareness of the cognitive processes used to make decisions can reduce the likelihood of poor decisions
Although the danger of confirmatory bias is greatest when making decisions about diagnosis, ongoing treatment decisions are also affected. It is thus critical to remain constantly vigilant for any information that may contradict your existing diagnosis, and to give any such information careful consideration, rather than dismissing it as irrelevant. It is also a good idea to try to think of specific reasons why your current theory might be wrong and to ask questions that could potentially disprove your hypothesis. Always be aware of alternative hypotheses and ask yourself whether they may be better than your current ideas.
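One way to give contradicting information its due weight is to think in the odds form of Bayes' theorem: each new finding multiplies the odds on the working diagnosis by its likelihood ratio, so a finding that is much more likely under an alternative should pull your probability down sharply rather than being explained away. A minimal sketch, with all figures assumed:

```python
# Updating a working diagnosis with a disconfirming finding (figures assumed).

def update(probability: float, likelihood_ratio: float) -> float:
    """Bayes in odds form: posterior odds = prior odds * likelihood ratio."""
    odds = probability / (1 - probability)
    odds *= likelihood_ratio
    return odds / (1 + odds)

p = 0.80             # confident working diagnosis
p = update(p, 0.1)   # finding ten times more likely under an alternative diagnosis
print(f"{p:.2f}")    # 0.29: one contradicting result justifies real doubt
```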
Pitfall 5: illusory correlation
Illusory correlation is the tendency to perceive two events as causally related when the connection between them is in fact coincidental or even non-existent. (It overlaps with confirmatory bias in that the causes we notice tend to be those that fit our pre-existing ideas.) Homoeopathy provides an excellent example. Homoeopaths often notice that patients improve after being treated with a homoeopathic remedy and claim this as evidence that the treatment works. However, no convincing evidence exists that homoeopathic treatments are effective.w12 w13 Illusory correlation is probably at work: homoeopaths are likely to remember the occasions when their patients improved after treatment and to forget the occasions when they did not.
Falling prey to illusory correlation can reinforce incorrect beliefs, which in turn can lead to the persistence of suboptimal practices. Ask yourself whether any instances fail to fit your assumed correlations. A straightforward way to do this is to keep written records of the events you believe to be correlated, making sure that all relevant instances are recorded, including those in which only one of the events, or neither, occurred.
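The record keeping suggested above amounts to filling in a 2×2 contingency table; illusory correlation thrives when only the memorable cell (treated and improved) gets counted. A minimal sketch with hypothetical counts:

```python
# Checking an assumed correlation against all four cells of a 2x2 table
# (counts are hypothetical).

counts = {
    ("treated", "improved"): 30,       # the memorable cases
    ("treated", "not improved"): 10,
    ("untreated", "improved"): 30,     # the cells illusory correlation ignores
    ("untreated", "not improved"): 10,
}

def improvement_rate(group: str) -> float:
    improved = counts[(group, "improved")]
    total = improved + counts[(group, "not improved")]
    return improved / total

print(f"improved if treated:   {improvement_rate('treated'):.0%}")    # 75%
print(f"improved if untreated: {improvement_rate('untreated'):.0%}")  # 75%: no real association
```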
Conclusions
Doctors often have to make decisions quickly. However, the greatest obstacle to making correct decisions is seldom insufficient time but rather distortions and biases in the way information is gathered and assimilated. Being aware that decisions can be biased is an important first step towards overcoming those biases. In real life, of course, biases may not fit neatly into any one of the categories described above but may result from a complex interaction of several factors, which increases the potential for poor decisions still further. The good news is that you can train yourself to be vigilant for these errors and so improve your decision making (box).
Supplementary Material
References w1-w13 and an example of how we overestimate our knowledge are on bmj.com
I thank Adam Jacobs of Dianthus Medical for help in preparing the manuscript.
Contributors and sources: JGK has a PhD in social psychology and has spent much of her academic career conducting research on biases in impression formation. Sources cited in this article were derived from extensive searches of Medline and Embase.
Funding: This paper was prepared with financial assistance from Janssen-Cilag.
Competing interests: JGK has received speaking and consultancy fees from Janssen-Cilag, which manufacture an opioid analgesic patch.
References
1. Kahneman D, Slovic P, Tversky A, eds. Judgement under uncertainty: heuristics and biases. Cambridge: Cambridge University Press, 1982.
2. Russo JE, Schoemaker PJH. Winning decisions: how to make the right decision the first time. London: Piatkus, 2002.
3. Bornstein BH, Emler AC. Rationality in medical decision making: a review of the literature on doctors' decision-making biases. J Eval Clin Pract 2001;7:97-107.
4. McDonald CJ. Medical heuristics: the silent adjudicators of clinical practice. Ann Intern Med 1996;124:56-62.
5. Dawson NV. Physician judgment in clinical settings: methodological influences and cognitive performance. Clin Chem 1993;39:1468-78 (discussion 1478-80).
6. Hershberger PJ, Part HM, Markert RJ, Cohen SM, Finger WW. Development of a test of cognitive bias in medical decision making. Acad Med 1994;69:839-42.
7. Borak J, Veilleux S. Errors of intuitive logic among physicians. Soc Sci Med 1982;16:1939-47.
8. Kahneman D, Tversky A. On the psychology of prediction. Psychol Rev 1973;80:237-51.
9. Brannon LA, Carson KL. The representativeness heuristic: influence on nurses' decision making. Appl Nurs Res 2003;16:201-4.
10. Poses RM, Anthony M. Availability, wishful thinking, and physicians' diagnostic judgments for patients with suspected bacteremia. Med Decis Making 1991;11:159-68.
11. Weinstein SM, Laux LF, Thornby JI, Lorimor RJ, Hill CS Jr, Thorpe DM, et al. Physicians' attitudes toward pain and the use of opioid analgesics: results of a survey from the Texas Cancer Pain Initiative. South Med J 2000;93:479-87.
12. Morgan JP. American opiophobia: customary underutilization of opioid analgesics. Adv Alcohol Subst Abuse 1985;5:163-73.
13. Potter M, Schafer S, Gonzalez-Mendez E, Gjeltema K, Lopez A, Wu J, et al. Opioids for chronic nonmalignant pain: attitudes and practices of primary care physicians in the UCSF/Stanford Collaborative Research Network. J Fam Pract 2001;50:145-51.
14. McCarberg BH, Barkin RL. Long-acting opioids for chronic pain: pharmacotherapeutic opportunities to enhance compliance, quality of life, and analgesia. Am J Ther 2001;8:181-6.
15. Brookoff D. Abuse potential of various opioid medications. J Gen Intern Med 1993;8:688-90.
16. Larue F, Colleau SM, Fontaine A, Brasseur L. Oncologists and primary care physicians' attitudes toward pain control and morphine prescribing in France. Cancer 1995;76:2375-82.
17. Koriat A, Lichtenstein S, Fischhoff B. Reasons for confidence. J Exp Psychol Hum Learn Mem 1980;6:107-18.
18. Mahoney MJ. Publication prejudices: an experimental study of confirmatory bias in the peer review system. Cognit Ther Res 1977;1:161-75.
19. Wallsten TS. Physician and medical student bias in evaluating diagnostic information. Med Decis Making 1981;1:145-64.
20. Nott MR. Misidentification, in-filling and confirmation bias. Anaesthesia 2001;56:917.