The emergency department (ED) is a high‐risk environment where diagnostic error is not uncommon. Most errors (70%) are due to faulty reasoning.1 Decision making occurs through two primary pathways: 1) pattern recognition, which is fast, intuitive, heuristically driven, and largely unconscious; and 2) analytic thinking, which is slow, deliberate, and under conscious control. When functioning optimally, expert clinicians toggle back and forth between these two systems depending on the complexity of the case and the demands of the environment. Systematic errors (known as biases) can interfere with reasoning via either pathway but predominantly affect the abbreviated decision making associated with pattern recognition. Thus, a critical feature of cognitive bias mitigation is the deliberate “switching” from intuitive to analytic processing and the deliberate use of debiasing strategies.2, 3
Figure 1. Model of reasoning and debiasing.
Prominent cognitive psychologist Daniel Kahneman (Thinking, Fast and Slow) holds the largely pessimistic view that physicians are incapable of employing bias mitigation strategies to overcome their flawed intuition.4 Recent research, however, offers strong converging evidence that doctors do have the means to overcome bias through education.5 This Med Ed download focuses on some of the most common biases among ED providers so that you can more effectively recognize and mitigate bias in yourself and in your learners. The aim is to help teachers and learners develop a common language around bias to make you STOP, THINK about the thinking that underlies these errors, and ACT by proposing debiasing strategies to address them.
Key Points (See Table 1):
Table 1.
| Bias | Description/Example | Debiasing Strategy |
| --- | --- | --- |
| Aggregate bias | A belief that aggregate data (e.g., practice guidelines) do not apply to individual patients, which can lead to unnecessary testing. | Routinely apply guidelines/clinical decision rules; their superiority over clinical judgment has been demonstrated (e.g., PERC rule, NEXUS criteria). |
| Anchoring bias | Anchoring onto particular features early in a presentation is normal, but bias occurs when we persist with the initial anchor and fail to adjust when new data suggest another diagnosis. | Avoid sticking with early impressions, judgments, and preconceptions. Seek more information. Revisit the diagnosis as new data arrive. Mnemonics (e.g., VINDICATES) can help broaden the differential. |
| Availability bias | A tendency to judge things as more likely if they readily come to mind. Recent exposure to a disease increases the likelihood of it being diagnosed, whereas not seeing a disease for a long time decreases the likelihood. | Judge cases on their own merits rather than on recent experience. Be aware of the recency effect. Question the objective basis for clinical decisions. |
| Confirmation bias | An inclination to seek evidence to support a diagnosis rather than refute it, e.g., allowing N/V and photophobia to confirm migraine headache rather than seeking clues that would refute the diagnosis of SAH (gradual onset). | Consider the opposite. Try to disconfirm the initial hypothesis. Ensure alternatives are considered. Argue the case for and against. |
| Triage cueing | A predilection to allow the triage designation to signal subsequent diagnosis and management, e.g., assuming that patients placed in nonacute areas are not sick. | See the patient yourself and form your own impressions before reading the triage summary or nurses' notes or hearing a learner's case presentation. Two heads (or many) are better than one; each of you will invariably pick up important data that the other did not, and collectively this information forms a more complete picture of the case. Use “group think” for difficult cases: ask a colleague for an independent assessment or a second opinion, and do not “frame” the patient to a colleague; give objective data. |
| Diagnosis momentum | A propensity for labels or diagnoses to “stick” once they have been applied. This process may start with anyone (the patient, EMS, nurses, medical students, residents, other attendings) and continues as data are relayed from person to person; the diagnosis gathers momentum, often without gathering evidence. | See triage cueing above; the same strategies apply. |
| Premature closure | A readiness to accept a diagnosis before it has been fully verified. | Force consideration of alternative possibilities. Generate and work through a reasonable differential diagnosis. Always ask, “What else might this be?” and always rule out worst‐case scenarios (ROWS). |
| Representativeness restraint | A habit of looking for prototypical manifestations of disease, such that atypical variants may be missed. | Be aware of individual variation and atypical presentations. What looks like a duck, walks like a duck, and quacks like a duck may still not be a duck. |
| Search satisficing | A readiness to call off a search once something is found. | Remember that the most commonly missed fracture is the second one. Always consider comorbidities and precipitants; for example, in a patient presenting with diabetic ketoacidosis, what was the trigger? |
| Psych‐out error | An impulse to assume a psychiatric etiology and overlook serious medical conditions (e.g., hypothyroidism misdiagnosed as depression; chest pain attributed to anxiety). | Employ “until proven otherwise” to ensure that you do not make a psychiatric diagnosis until other diagnoses have been systematically excluded. Return to a broad differential diagnosis before settling. |
| Visceral bias | A disposition to be influenced by affective sources of error. Countertransference may take the form of negative feelings toward particular patient populations (e.g., patients who are obese, have chronic pain, or are chronically intoxicated) or of positive emotions (e.g., “this patient reminds me of my mom”). | Remember to act calm no matter how you feel, and be aware of the effect of emotion on decision making. Take extra time to examine all the data and employ evidence‐based medicine; objective scientific data, rather than feelings, should guide analytic decisions. |
VINDICATES = vascular, infection, neoplastic, drugs/toxins, inflammatory/idiopathic, congenital, autoimmune, trauma, endocrine/environmental, something else/psychological.
- More than 100 cognitive and affective biases have been described.
- Raising awareness of common biases affecting emergency physicians is important to prevent diagnostic error.
- Pattern recognition is most vulnerable to bias and suboptimal decision making.
- Debiasing strategies may include cognitive forcing techniques applied to individual cases.
References
- 1. Graber M. Diagnostic errors in medicine: a case of neglect. Jt Comm J Qual Patient Saf 2005;31:106–13.
- 2. Croskerry P, Singhal G, Mamede S. Cognitive debiasing 1: origins of bias and theory of debiasing. BMJ Qual Saf 2013;22:ii58–ii64.
- 3. Croskerry P. When I say … cognitive debiasing. Med Educ 2015;49:656–7.
- 4. Kahneman D. Thinking, Fast and Slow. New York: Farrar, Straus and Giroux, 2011.
- 5. Croskerry P. Our better angels and black boxes. Emerg Med J 2016;33:242–4.
- 6. Croskerry P, Singhal G, Mamede S. Cognitive debiasing 2: impediments to and strategies for change. BMJ Qual Saf 2013;22:ii65–ii72.
- 7. Croskerry P. The importance of cognitive errors in diagnosis and strategies to minimize them. Acad Med 2003;78:775–80.