Med J Islam Repub Iran. 2022 Oct 26;36:124. doi: 10.47176/mjiri.36.124

Anchoring Errors in Emergency Medicine Residents and Faculties

Helen Dargahi 1,2, Alireza Monajemi 3,*, Akbar Soltani 4, Hooman Hossein Nejad Nedaie 5, Ali Labaf 5
PMCID: PMC9700406  PMID: 36447549

Abstract

Background: Clinical reasoning is the basis of all clinical activities in the health care team, and diagnostic reasoning is perhaps the most critical of a physician's skills. Despite many advances, medical errors have not been reduced. Studies have shown that most diagnostic errors made in emergency rooms are cognitive errors, and anchoring error has been identified as the most common cognitive error in clinical settings. This study aimed to determine the frequency of anchoring bias and to compare its percentage between faculty members and residents in the emergency medicine department.

Methods: In this quasi-experimental study, emergency medicine faculty members and residents were evaluated in clinical reasoning using nine written clinical cases. The clinical data for each case were presented to the participants over three pages, mirroring the way clinical and para-clinical information is received in real situations. At the end of each page, participants were asked to write down their diagnoses. Data were analyzed using the one-way ANOVA test. SPSS software (Version 16.0) was used to conduct the statistical tests, and a P value < 0.05 was considered statistically significant.

Results: Seventy-seven faculty members and residents of the emergency medicine department volunteered to participate in this study. The data showed that faculty members wrote significantly more correct diagnoses than residents (66% vs. 41%), but the anchoring error ratio was significantly lower in residents (33% vs. 75%). In addition, the number of written diagnoses, the time taken to write them, and clinical experience were compared between faculty members and residents.

Conclusion: The findings showed that increasing clinical experience increased diagnostic accuracy but also changed the pattern of cognitive errors. The anchoring error ratio was higher in faculty members than in residents. This error could be the result of greater exposure and more frequent decision-making in the heuristic, or intuitive, mode of thinking among faculty members.

Keywords: Clinical Decision-Making, Diagnostic Errors, Clinical Reasoning, Emergency Medicine, Medical Faculty

Introduction

What is “already known” on this topic:

According to dual process theory (DPT), cognitive errors often occur in the intuitive mode of thinking. Even though mental shortcuts can lead to appropriate judgments, they are usually made by relying on instinctive first impressions and can result in severe errors.

What this article adds:

Although increasing clinical experience increases diagnostic accuracy, it also changes the pattern of cognitive errors. Faculty members are more prone to anchoring errors than residents. This error could result from their greater exposure and more frequent decision-making in the heuristic, or intuitive, mode of thinking.

Clinical reasoning as a cognitive process is the basis of all clinical activities in the health care team (1,2), and diagnostic reasoning is perhaps the most critical of a physician's skills (2-5). Despite many advances in technology and the growth of evidence-based medicine, medical errors have not declined over the last century (3), and they are still the eighth leading cause of death in the US (6). Makary and Daniel estimate an even higher toll, arguing that medical error is the third leading cause of death in the US (7). More than one million injuries and one hundred thousand deaths due to medical errors are reported annually (8), and a considerable proportion of these are due to diagnostic errors (3). Short encounters with the most severely ill patients in a busy emergency department (ED) setting create a rich environment for medical errors (9).

Cognitive errors underlie most diagnostic errors made in emergency rooms (10-13). According to the Institute of Medicine report "To Err is Human: Building a Safer Health System," 70%-82% of errors in the ED are preventable (8,9). These are the most common diagnostic errors and can lead to death or permanent disability (8).

Even though emergency medicine seems to be a procedural discipline, it is highly dependent on cognitive skills (14) and on the need to think deliberately (15). Emergency physicians make decisions with limited information and time, under high acuity, pressure, and decision density (10,16), and increasingly rely on intuitive thinking (17,18). However, emergency physicians treat patients across a spectrum of illness severity and use both analytical and non-analytical thinking in their decision-making processes (19). This leads us to dual-processing theory. Kahneman and Tversky introduced a dual-system theoretical framework to explain decision-making under uncertainty (20).

Dual process theory (DPT) is accepted as the dominant description of the cognitive processes underlying human decision-making (21). In DPT, decisions are made through two modes of thinking, one rapid, automatic, and high-capacity and the other slow, conscious, and deliberative, called the intuitive (System 1) and analytical (System 2) modes of thinking, respectively (21-24). However, the systems are not used in isolation; they exist on a cognitive continuum and are engaged to varying degrees in different situations (2,3,25). In the intuitive mode, the brain takes shortcuts to facilitate problem-solving and simplify decision-making (26). Although this is very efficient and time-saving, cognitive biases that may lead to errors arise during intuitive processing (16,27,28).

One of the most important biases in emergency medicine is anchoring (29). Anchoring bias refers to the excessive weighting of initial information and the inability to adjust the initial diagnostic hypothesis when further information becomes available (30-32). In the study by Ogdie et al (2012), anchoring bias was identified as the most common cognitive error in medicine (33). It may occur across a wide range of ED activities, from triage to diagnostic labeling (29). Another study has shown that anchoring is one of three judgmental heuristics used for decision-making in uncertain situations, when information is insufficient; the other two are representativeness and availability (34). Moreover, some cognitive biases are exacerbated by anchoring, and others contribute to anchoring (10,11,35).

The goal of this study was to determine the frequency of anchoring bias and to compare its percentage between faculty members and residents in the department of emergency medicine. Since different educational strategies for cognitive debiasing are used for each cognitive medical error (36), the results of this study will help us provide a more appropriate training program for more accurate decision-making at the different levels of emergency medicine.

Methods

Participants

A total of 77 participants, including faculty members and residents in emergency medicine, took a clinical reasoning test. In our study, an emergency medicine faculty member is an emergency medicine specialist who has at least two years of post-graduation clinical experience in the emergency department, and an emergency medicine resident is a general physician who has passed the entrance exam and is being trained in a 3-year residency program.

The study was approved by the Vice-Chancellor for Research Affairs, Tehran University of Medical Sciences (Ethics code IR.TUMS.VCR.REC.1395.171). Participation in the test was voluntary, and after the study was completed, the results were shared with the participants confidentially.

A set of nine written clinical cases was used in this study. Each case consisted of a short description of a patient's medical history, physical examination, and laboratory and imaging test results, presented to participants over three pages.

The cases were designed based on the diagnoses with the highest probability of diagnostic error in the emergency department. They were then adapted from real patients admitted to Imam Khomeini Hospital of TUMS. A correct definitive diagnosis and the most common incorrect diagnosis were identified for each case.

During several consecutive sessions, the cases were reviewed, and nine of twelve finalized cases were approved by an expert panel (24). The panel consisted of experts in emergency medicine, anesthesiology, internal medicine, endocrinology, and cardiology. Finally, the research team reviewed and validated all cases. The expert panel rated the clinical difficulty of each case from 1 (easiest) to 7 (hardest) (37). Cases were selected equally from three groups: easy (score 1-4), moderate (score 5-5.5), and difficult (score 5.5-7). The test was built with web-based software and presented to the participants on a computer.

Procedure

The study was conducted in the emergency medicine departments of Tehran, Iran, and Kerman Universities of Medical Sciences. Seventy-seven people participated in the study from April to December 2017. The data for each case were presented over three pages. The first two pages contained part of the disease data, ordered to mirror the way clinical and para-clinical information is received in real situations. At the end of the first and second pages, participants were asked to write up to a maximum of three possible diagnoses. Writing initial diagnoses at the end of the first page was optional. After writing diagnoses on the second page, it was possible to move to the third page. At the end of the third page, once all the data had been revealed, participants were required to enter a final diagnosis. Different types of adjustment were assessed by comparing the initial and final diagnoses. Each participant had to complete all three pages within a maximum of five minutes; this time limit was determined in a pilot study.

After writing the diagnoses for all nine clinical cases, participants were asked to rate their level of direct clinical experience with 18 diagnoses on a 7-point Likert scale, ranging from 1 (no experience with the disease) to 7 (highly experienced with the disease). In the final stage, they completed a demographic information form. The total time spent by each participant was up to 50 minutes.

Data analysis

All 77 participants finished the test and provided consent. Every case had at least one confirmed diagnosis that was used to evaluate the accuracy of the diagnoses provided by the participants. In addition, two experts in emergency medicine independently assessed all of the diagnoses written by participants and again judged the accuracy or inaccuracy of the provided answers. For each correct diagnosis, a score of 1 point was assigned. In total, the written diagnoses for 690 clinical cases were assessed, and the answers were categorized as No Answer or one of several categories of adjustment: (a) Insufficient Adjustment, when an incorrect or correct initial diagnosis is followed by an incorrect final diagnosis; (b) Sufficient Adjustment, when an incorrect initial diagnosis is followed by a correct final diagnosis; and (c) No Adjustment, when the correct diagnosis is entered as both the initial and final diagnosis (34).

There were two types of Insufficient Adjustment: the incorrect initial and final diagnoses were the same, or they were different. The first type was classified as the anchoring error (34).
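As an illustration, the minimal sketch below (in Python, with hypothetical function and argument names) applies this classification rule to a single answer; it simplifies the test to one initial diagnosis per case, whereas participants could actually write up to three.

```python
def classify_answer(initial_dx, final_dx, correct_dx):
    """Classify one answer for one case into the categories described above.
    Arguments are hypothetical diagnosis labels (strings); final_dx is
    None when no final diagnosis was written."""
    if final_dx is None:
        return "No Answer"
    if final_dx == correct_dx:
        # Correct final diagnosis: kept from the start (No Adjustment) or
        # reached by revising an incorrect initial one (Sufficient Adjustment).
        return "No Adjustment" if initial_dx == correct_dx else "Sufficient Adjustment"
    # Incorrect final diagnosis: Insufficient Adjustment. Repeating the same
    # incorrect diagnosis from start to finish is counted as the anchoring error.
    if initial_dx == final_dx:
        return "Insufficient Adjustment (anchoring error)"
    return "Insufficient Adjustment (different incorrect)"


# Hypothetical example, echoing the case discussed later in the paper: an
# incorrect initial diagnosis of MI is kept even though the correct answer
# is aortic dissection.
print(classify_answer("myocardial infarction", "myocardial infarction", "aortic dissection"))
# -> Insufficient Adjustment (anchoring error)
```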

Other variables analyzed in this study were the number of diagnoses written by each participant, the time taken to answer each case, and clinical experience. Since participants could write multiple diagnoses at the end of the first and second pages, the number and mean of entered diagnoses were calculated. Participants' clinical experience with the 18 clinical diagnoses, comprising the nine most likely (correct) diagnoses and the nine most common incorrect diagnoses, was also analyzed.

To describe the data, descriptive statistics including frequency, percentage, mean, and standard deviation were used. The Shapiro-Wilk test was employed to examine the data distribution. The one-way ANOVA and Tukey's post hoc test were used for between-group comparisons. SPSS software (Version 16.0) was used to conduct the statistical tests, and a P value of less than 0.05 was considered statistically significant in all cases.
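For illustration only, an equivalent analysis could be run in Python with SciPy and statsmodels (the study itself used SPSS 16.0). The data frame below is a hypothetical toy sample, not the study data:

```python
import pandas as pd
from scipy import stats
from statsmodels.stats.multicomp import pairwise_tukeyhsd

# Hypothetical per-participant data: group label and anchoring error ratio.
df = pd.DataFrame({
    "group": ["PGY-1"] * 3 + ["PGY-2"] * 3 + ["PGY-3"] * 3 + ["Faculty"] * 3,
    "anchoring_ratio": [0.20, 0.30, 0.25, 0.40, 0.35, 0.40,
                        0.30, 0.40, 0.35, 0.70, 0.80, 0.75],
})

# Shapiro-Wilk test of the outcome's distribution.
print(stats.shapiro(df["anchoring_ratio"]))

# One-way ANOVA across the four groups.
groups = [g["anchoring_ratio"].values for _, g in df.groupby("group")]
print(stats.f_oneway(*groups))

# Tukey's post hoc test for pairwise between-group comparisons.
print(pairwise_tukeyhsd(df["anchoring_ratio"], df["group"], alpha=0.05))
```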

Results

Seventy-seven participants volunteered for this study: ten faculty members (mean age 37.9±5.2 years, range 31-49) and 67 emergency medicine residents, of whom 18 were in their first year (mean age 34.3±6.8, range 27-51), 26 in their second year (mean age 36.2±5.7, range 26-47), and 23 in their third year (mean age 36.9±5.7, range 28-48) of the residency program. The faculty members' mean clinical experience after finishing the residency course was 5.7±3 years (range 2-10).

The number of diagnoses

The mean number of diagnoses entered on each of the three pages was calculated for first-, second-, and third-year residents and faculty members, and ANOVA showed no significant difference in the number of responses (Table 1). This result supports the general validity of the procedure and of the participant selection, since if the difference had been significant, the group writing more diagnoses would have had a greater chance of writing the correct diagnosis.

Table 1. The mean number of entered diagnoses on each page.

Participants (N) | Page 1, mean (SD) | Page 2, mean (SD) | Page 3, mean (SD)
PGY*-1 (18) | 2.5 (0.8) | 1.5 (0.7) | 0.9 (0.5)
PGY-2 (26) | 2.5 (0.7) | 1.7 (0.4) | 1 (0.4)
PGY-3 (23) | 2.6 (0.7) | 1.7 (0.4) | 1 (0.2)
Faculty (10) | 2.2 (0.4) | 1.6 (0.4) | 1 (0.1)
Total (77) | 2.5 (0.7) | 1.6 (0.5) | 1 (0.3)
P value | 0.7 | 0.5 | 0.4

*Postgraduate Year

Time for writing diagnoses

The average time taken to write the diagnoses for each case was 4.7±0.6 minutes (range 2.4-5). Analysis of variance (ANOVA) showed that the difference between faculty members (mean±SD = 3.6±0.5) and residents (mean±SD = 4.9±0.5) was significant (p<0.001); that is, faculty members answered in a shorter time.

Clinical exposure to diagnoses

Participants reported their level of clinical exposure to the 18 diagnoses (the nine correct diagnoses and the nine most common incorrect diagnoses). Based on the one-way ANOVA test, the difference between residents and faculty members in reported experience with the diagnoses presented as correct diagnoses (N=9) was not significant, but there was a significant difference for the diagnoses presented as incorrect diagnoses (N=9): faculty members had more clinical exposure to the incorrect suggestions (p<0.001).

Correct diagnoses and anchoring error

According to the one-way ANOVA test, there was a significant difference between faculty members and residents in the anchoring error (the same incorrect initial and final diagnosis) (p<0.001). Tukey's post hoc test showed that faculty members provided significantly more correct diagnoses than all three groups of residents (p<0.001), but the anchoring error ratio was significantly lower in residents (p=0.001). As expected, the first-year residents were significantly weaker at providing correct diagnoses; however, there was no significant difference in the anchoring error ratio among first-, second-, and third-year residents (Tables 2 and 3). As seen in Table 3, the mean percentage of anchoring errors (anchoring errors/total errors × 100) in first-, second-, and third-year residents was 25%, 38%, and 35%, respectively, while this rate was 75% in faculty members.

Table 2. The classification of faculty members' and residents' answers to the clinical reasoning test.

Participants (N) | No Answer (%) | Insufficient adjustment: Same incorrect* (%) | Insufficient adjustment: Different incorrect (%) | Sufficient adjustment (%) | No Adjustment (%) | Total (%)
PGY**-1 (18) | 49 (28.8%) | 29 (17.1%) | 54 (31.7%) | 26 (15.3%) | 12 (7.1%) | 170 (100%)
PGY-2 (26) | 31 (13.7%) | 48 (21.2%) | 46 (20.4%) | 67 (29.6%) | 34 (15%) | 226 (100%)
PGY-3 (23) | 19 (9.2%) | 31 (15%) | 51 (24.7%) | 78 (37.7%) | 28 (13.5%) | 207 (100%)
Faculty (10) | 1 (1.1%) | 23 (26.4%) | 7 (8%) | 37 (42.5%) | 19 (21.8%) | 87 (100%)
Total (77) | 100 (14.5%) | 131 (19%) | 158 (22.9%) | 208 (30.1%) | 93 (13.5%) | 690 (100%)
Values are frequency (percent).

Incorrect answer = No answer + Insufficient adjustment, Correct answer = Sufficient adjustment + No Adjustment

* Same incorrect initial and final diagnoses = Anchoring error

**Postgraduate Year

Table 3. Comparison of the ratio of incorrect answers and anchoring errors in residents and faculty members.

Participants (N) | Incorrect answers, mean (SD) | P value | Anchoring error, mean (SD) | P value
PGY-1 (18) | 0.88 (0.2) | <0.001 | 0.25 (0.23) | 0.001
PGY-2 (26) | 0.66 (0.2) | | 0.38 (0.3) |
PGY-3 (23) | 0.49 (0.2) | | 0.35 (0.27) |
Faculty (10) | 0.34 (0.2) | | 0.75 (0.34) |
Total (77) | 0.57 (0.24) | | 0.38 (0.3) |
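As a rough consistency check on these ratios, the pooled counts in Table 2 can be used to recompute the percentage of anchoring errors among all errors (anchoring errors/total errors × 100, with total errors = No Answer + Insufficient Adjustment). A minimal sketch, assuming this pooled definition (Table 3 itself reports the mean of per-participant ratios, so the figures differ slightly):

```python
# Pooled counts from Table 2: (No Answer, Same incorrect, Different incorrect).
errors = {
    "PGY-1":   (49, 29, 54),
    "PGY-2":   (31, 48, 46),
    "PGY-3":   (19, 31, 51),
    "Faculty": (1, 23, 7),
}

for group, (no_answer, same_incorrect, different_incorrect) in errors.items():
    total_errors = no_answer + same_incorrect + different_incorrect
    anchoring_pct = 100 * same_incorrect / total_errors
    print(f"{group}: {anchoring_pct:.0f}% of errors were anchoring errors")

# Prints roughly 22%, 38%, 31%, and 74%, close to the per-participant
# means of 25%, 38%, 35%, and 75% reported in Table 3.
```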

Discussion

This study investigated anchoring errors in emergency medicine. The results, similar to those of other studies (38-40), showed that the style of decision-making changes with increasing experience. Since faculty members have more clinical experience than residents, the two groups differed in the errors they made.

One of these differences was the anchoring error ratio, which, based on dual process theory, could be the result of decision-making in the intuitive mode of thinking. The results showed that the anchoring error rate in faculty members was markedly higher than in residents (Table 3). Although experts are better than residents at focusing on relevant information and at generating more links between critical cues (41), their diagnostic decision-making is dominated by heuristic thinking (42). Heuristics are efficient cognitive strategies or mental shortcuts that ignore part of the available information (40,43) and are used to make decisions faster and more frugally, reducing effort while improving judgments and decisions in uncertain situations (43,44). Of course, they can also lead to systematic and predictable errors (44,45). They act as decision-making facilitators (45). In this study, faculty members, because of their greater experience in uncertain situations, made decisions more quickly than residents, which could reflect greater reliance on heuristic thinking.

Since a "definitive correct diagnosis" and a "most common incorrect diagnosis" were identified for each case, the level of exposure to these diagnoses was evaluated in both groups. The results indicated that faculty members' clinical exposure to the diagnoses reported as anchoring errors was markedly higher than that of residents. In other words, as clinical exposure increases, the probability of anchoring error also increases. Interestingly, there was no considerable difference between the groups in the level of exposure to the diagnoses presented as correct diagnoses.

For example, in a scenario where the correct diagnosis was aortic dissection and the incorrect diagnosis was myocardial infarction (MI), the faculty members had more exposure to MI, and they anchored to MI in error. In our study, 75% of faculty members' errors were anchoring errors, while this ratio was lower in residents (Table 3).

Another indicator in this study was "anchoring and adjustment," which is an indicator of expertise in non-analytical decision-making (46). "Adjustment refers to the process of consciously moving the estimated value away from the anchor toward one thought to be more accurate" (47), and "anchoring and adjustment" means that final opinions are sensitive to the initial diagnosis (the anchor), so that revision up or down from this anchor produces the final diagnosis (48). In this study, the data were provided to participants on three consecutive pages, and at the end of each page, participants were asked to write their own diagnoses. The sequence of diagnoses was then evaluated, and the results showed that the faculty members used sufficient adjustment more than the residents to reach the final diagnosis. In addition to "Sufficient Adjustment," faculty members also performed better than residents in the other categories, including "No Answer," "No Adjustment," and "Insufficient Adjustment." Therefore, it can be concluded that although increasing clinical experience may change the pattern of cognitive errors, it leads to more accurate diagnoses. Other studies (49-52) have reported the same results.

The inability of residents to provide a final diagnosis should also be taken into consideration. After all the data had been presented, in 42% of cases they either did not come up with an answer or gave different wrong answers at each step without reaching a conclusion about the diagnosis. The corresponding figure was 9% for faculty members.
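These figures can be checked against the pooled counts in Table 2, where cases without a conclusive diagnosis correspond to the No Answer and different-incorrect Insufficient Adjustment columns. A minimal sketch (the small gap from the 42% quoted above presumably reflects per-participant averaging):

```python
# Pooled counts from Table 2: (No Answer, Different incorrect, Total cases).
groups = {
    "Residents": (49 + 31 + 19, 54 + 46 + 51, 170 + 226 + 207),
    "Faculty": (1, 7, 87),
}

for label, (no_answer, different_incorrect, total) in groups.items():
    pct = 100 * (no_answer + different_incorrect) / total
    print(f"{label}: {pct:.0f}% of cases without a conclusive diagnosis")

# Prints roughly 41% for residents and 9% for faculty members.
```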

Conclusion

This study showed that although increasing clinical experience increased diagnostic accuracy, it also changed the form of the errors: the anchoring error ratio in faculty members was higher than in residents, which could be due to their greater clinical exposure to diagnoses in emergency situations. This error can be the result of faculty members making more decisions in the heuristic, or intuitive, mode of thinking. Since experienced emergency physicians may not be aware of the correctness of their diagnoses within the limited time available, they do not look for ways to strengthen and support their decisions. The results of this study help us provide a more appropriate training program for cognitive debiasing at the different levels of emergency medicine. It is suggested that subsequent research consider other common cognitive errors that are important in emergency medicine, as well as other clinical environments where cognitive errors are critical. Moreover, it is essential to perform educational interventions and to evaluate their effectiveness in reducing cognitive error.

Limitations of the study

Because of the laboratory conditions, the generalizability of the findings of this study to real situations may be limited. However, we tried to take into account factors that bring the laboratory conditions closer to real situations.

Acknowledgment

The authors are thankful to the faculty members and residents in the emergency medicine departments of Tehran, Iran, and Kerman Universities of Medical Sciences, who dedicated their scarce time to participating in this study.

Conflict of Interests

The authors declare that they have no competing interests.

Cite this article as: Dargahi H, Monajemi A, Soltani A, Hossein Nejad H, Labaf A. Anchoring Errors in Emergency Medicine Residents and Faculties. Med J Islam Repub Iran. 2022 (26 Oct);36:124. https://doi.org/10.47176/mjiri.36.124

References

1. Norman G. Research in clinical reasoning: past history and current trends. Med Educ. 2005;39(4):418–27. doi: 10.1111/j.1365-2929.2005.02127.x.
2. Pelaccia T, Tardif J, Triby E, Charlin B. An analysis of clinical reasoning through a recent and comprehensive approach: the dual-process theory. Med Educ Online. 2011;16:5890. doi: 10.3402/meo.v16i0.5890.
3. Croskerry P. A Universal Model of Diagnostic Reasoning. Acad Med. 2009;84(8):1022–8. doi: 10.1097/ACM.0b013e3181ace703.
4. Croskerry P, Nimmo GR. Better clinical decision making and reducing diagnostic error. J R Coll Physicians Edinb. 2011;41:155–62. doi: 10.4997/JRCPE.2011.208.
5. Simpkin AL, Vyas JM, Armstrong KA. Diagnostic Reasoning: An Endangered Competency in Internal Medicine Training. Ann Intern Med. 2017;167(7):507–8. doi: 10.7326/M17-0163.
6. Zhang J, Patel VL, Johnson TR, Shortliffe EH. A cognitive taxonomy of medical errors. J Biomed Inform. 2004;37:193–204. doi: 10.1016/j.jbi.2004.04.004.
7. Makary MA, Daniel M. Medical error-the third leading cause of death in the US. BMJ. 2016;353:i2139. doi: 10.1136/bmj.i2139.
8. Croskerry P, Sinclair D. Emergency Medicine: A practice prone to error? Can J Emerg Med. 2001;3(4):271–6. doi: 10.1017/s1481803500005765.
9. Okafor N, Payne VL, Chathampally Y, Miller S, Doshi P, Singh H. Using voluntary reports from physicians to learn from diagnostic errors in emergency medicine. Emerg Med J. 2016;33:245–52. doi: 10.1136/emermed-2014-204604.
10. Croskerry P. Achieving Quality in Clinical Decision Making: Cognitive Strategies and Detection of Bias. Acad Emerg Med. 2002;9(11):1184–204. doi: 10.1111/j.1553-2712.2002.tb01574.x.
11. Croskerry P. The Cognitive Imperative: Thinking About How We Think. Acad Emerg Med. 2000;7(11):1223. doi: 10.1111/j.1553-2712.2000.tb00467.x.
12. Croskerry P. Cognitive Forcing Strategies in Clinical Decision Making. Acad Emerg Med. 2003;41:110–20. doi: 10.1067/mem.2003.22.
13. Payne VL, Crowley RS. Assessing Use of Cognitive Heuristic Representativeness in Clinical Reasoning. AMIA Annu Symp Proc. 2008:571–5.
14. Morgenstern J. "Cognitive theory in medicine: A brief overview", First10EM blog, September 14, 2015. Available at: https://first10em.com/cognitive-overview/.
15. Geary U, Kennedy U. Clinical Decision-Making in Emergency Medicine. Emergencias. 2010;22:56–60.
16. Probst MA, Kanzaria HK, Schoenfeld EM, Menchine MD, Breslin M, Walsh C, et al. Shared decision making in the emergency department: a guiding framework for clinicians. Ann Emerg Med. 2017;70:688–95. doi: 10.1016/j.annemergmed.2017.03.063.
17. Croskerry P. ED cognition: any decision by anyone at any time. Can J Emerg Med. 2014;16(1):13–9. doi: 10.2310/8000.2013.131053.
18. Coget JF, Keller E. The Critical Decision Vortex: Lessons from the Emergency Room. J Manag Inq. 2010;19(1):56–67.
19. Cabrera D, Thomas JF, Wiswell JL, Walston JM, Anderson JR, Hess EP, et al. Accuracy of 'My Gut Feeling:' Comparing System 1 to System 2 Decision-Making for Acuity Prediction, Disposition and Diagnosis in an Academic Emergency Department. West J Emerg Med. 2015;16(5):653–7. doi: 10.5811/westjem.2015.5.25301.
20. Saposnik G, Redelmeier D, Ruff CC, Tobler PN. Cognitive biases associated with medical decisions: a systematic review. BMC Med Inform Decis Mak. 2016;16(1):138. doi: 10.1186/s12911-016-0377-1.
21. Djulbegovic B, Hozo I, Beckstead J, Tsalatsanis A, Pauker SG. Dual processing model of medical decision-making. BMC Med Inform Decis Mak. 2012;12:94. doi: 10.1186/1472-6947-12-94.
22. Croskerry P. Clinical cognition and diagnostic error: applications of a dual process model of reasoning. Adv Health Sci Educ. 2009;14:27–35. doi: 10.1007/s10459-009-9182-2.
23. Norman G. Dual processing and diagnostic errors. Adv Health Sci Educ. 2009;14:37–49. doi: 10.1007/s10459-009-9179-x.
24. Payne VL. Effect of a metacognitive intervention on cognitive heuristic use during diagnostic reasoning. Doctoral Dissertation, University of Pittsburgh, 2011.
25. Custers E. Medical Education and Cognitive Continuum Theory: An Alternative Perspective on Medical Problem Solving and Clinical Reasoning. Acad Med. 2013;88:1074–80. doi: 10.1097/ACM.0b013e31829a3b10.
26. Richie M, Josephson SA. Quantifying Heuristic Bias: Anchoring, Availability, and Representativeness. Teach Learn Med. 2018;30(1):67–75. doi: 10.1080/10401334.2017.1332631.
27. O'Sullivan ED, Schofield SJ. Cognitive bias in clinical medicine. J R Coll Physicians Edinb. 2018;48(3):225–32. doi: 10.4997/JRCPE.2018.306.
28. Croskerry P, Singhal G, Mamede S. Cognitive debiasing 1: origins of bias and theory of debiasing. BMJ Qual Saf. 2013;22:ii58–ii64. doi: 10.1136/bmjqs-2012-001712.
29. Waldrop RD. Medical Decision-Making Errors Due to Faulty Heuristics in the Pediatric Emergency Department and the Use of Mindfulness. J Pediatr Neonatal Care. 2017;7(4):00299.
30. Nendaz M, Perrier A. Diagnostic errors and flaws in clinical reasoning: mechanisms and prevention in practice. Swiss Med Wkly. 2012;142:w13706. doi: 10.4414/smw.2012.13706.
31. Ehrlinger J, Readinger WO, Kim B. Decision-making and cognitive biases. Encyclopedia of Mental Health. 2016;12(3):83–7.
32. Nurius PS, Gibson JW. Clinical observation, inference, reasoning, and judgment in social work: An update. Social Work Research and Abstracts. 1990;26(2):18–25.
33. Ogdie AR, Reilly JB, Pang WG, Keddem S, Barg FK, Von Feldt JM, et al. Seen Through Their Eyes: Residents' Reflections on the Cognitive and Contextual Components of Diagnostic Errors in Medicine. Acad Med. 2012;87(10):1361–7. doi: 10.1097/ACM.0b013e31826742c9.
34. Ellis MV, Robbins ES, Schult D, Ladany N, Banker J. Anchoring Errors in Clinical Judgments: Type I Error, Adjustment, or Mitigation? J Couns Psychol. 1990;37(3):343–51.
35. Etchells E. Anchoring Bias with Critical Implications. AORN J. 2016;103(6):658–31. doi: 10.1016/j.aorn.2016.03.012.
36. Croskerry P, Singhal G, Mamede S. Cognitive debiasing 2: impediments to and strategies for change. BMJ Qual Saf. 2013;22:ii65–ii72. doi: 10.1136/bmjqs-2012-001713.
37. Van den Berge K, Mamede S, van Gog T, Romijn JA, van Guldener C, van Saase JL, et al. Accepting diagnostic suggestions by residents: a potential cause of diagnostic error in medicine. Teach Learn Med. 2012;24:149–54. doi: 10.1080/10401334.2012.664970.
38. Berner ES, Graber ML. Overconfidence as a Cause of Diagnostic Error in Medicine. Am J Med. 2008;121(5A):S2–S23. doi: 10.1016/j.amjmed.2008.01.001.
39. Friedlander ML, Phillips SD. Preventing anchoring errors in clinical judgment. J Consult Clin Psychol. 1984;52:366–71. doi: 10.1037//0022-006x.52.3.366.
40. Norman GR, Monteiro SD, Sherbino J, Ilgen JS, Schmidt HG, Mamede S. The Causes of Errors in Clinical Reasoning: Cognitive Biases, Knowledge Deficits, and Dual Process Thinking. Acad Med. 2017;92(1):23–30. doi: 10.1097/ACM.0000000000001421.
41. Joseph GM, Patel VL. Domain knowledge and hypothesis generation in diagnostic reasoning. Med Decis Mak. 1990;10:31–46. doi: 10.1177/0272989X9001000107.
42. Regehr G, Norman GR. Issues in Cognitive Psychology: Implications for Professional Education. Acad Med. 1996;71(9):998–1001. doi: 10.1097/00001888-199609000-00015.
43. Gigerenzer G, Gaissmaier W. Heuristic Decision Making. Annu Rev Psychol. 2011;62:451–82. doi: 10.1146/annurev-psych-120709-145346.
44. Tversky A, Kahneman D. Judgment under uncertainty: heuristics and biases. Science. 1974;185:1124–31. doi: 10.1126/science.185.4157.1124.
45. Patel VL, Kaufman DR, Kannampallil TG. Diagnostic reasoning and decision-making in the context of health information technology. Rev Hum Factors Ergon. 2013;8(1):149–90.
46. Epley N, Gilovich T. The Anchoring-and-Adjustment Heuristic: Why the Adjustments Are Insufficient. Psychol Sci. 2006;17(4):311–8. doi: 10.1111/j.1467-9280.2006.01704.x.
47. Lighthall GK, Guillamet CV. Understanding Decision Making in Critical Care. Clin Med Res. 2015;13(3-4):156–68. doi: 10.3121/cmr.2015.1289.
48. Elstein AS. Heuristics and biases: selected errors in clinical reasoning. Acad Med. 1999;74(7):791–4. doi: 10.1097/00001888-199907000-00012.
49. Croskerry P. Diagnostic Failure: A Cognitive and Affective Approach. In: Advances in Patient Safety: From Research to Implementation. Rockville, MD: Agency for Healthcare Research and Quality (Publication No. 050021); 2005;2:241–54.
50. Graber M. Diagnostic errors in medicine: A case of neglect. Jt Comm J Qual Patient Saf. 2005;31(2):106–13. doi: 10.1016/s1553-7250(05)31015-4.
51. Kuhn GJ. Diagnostic errors. Acad Emerg Med. 2002;9(7):740–50. doi: 10.1111/j.1553-2712.2002.tb02155.x.
52. Friedman CP, Gatti GG, Franz TM, Murphy GC, Wolf FM, Heckerling PS, et al. Do physicians know when their diagnoses are correct? Implications for decision support and error reduction. J Gen Intern Med. 2005;20:334–9. doi: 10.1111/j.1525-1497.2005.30145.x.
