Abstract
Cognitive biases can cause diverse medical errors and lead to malpractice, with potential harm to patients. Some cognitive biases stem from social behavior, professional specialization, and personal experience, leading to commission or omission in medical conduct. We would like to propose a previously undescribed cognitive bias called “Schrödinger’s cat bias.” In 1935, Erwin Schrödinger proposed a thought experiment, based on quantum mechanics, in which a cat could be dead and alive at the same time. “Schrödinger’s cat bias” is the situation in which a physician decides to request an unnecessary exam or procedure, exposing the patient to an unforeseen risk. After the procedure, if there is a good outcome, the patient will be grateful for it. However, if there is a bad outcome, the patient will still be grateful for the effort to find the etiology. This cognitive bias will, most of the time, favor therapy over the decision not to treat.
Keywords: cognition, bias, epidemiology
Introduction
Cognitive biases can be the source of up to 75% of medical errors [1]. Clinicians often make diagnoses or clinical decisions based on their clinical expertise. A study demonstrated that 25% of emergency physicians formulate their diagnostic hypotheses before meeting the patient, and 75% do so within the first five minutes of the consultation [2].
Kahneman and Tversky proposed a dual-system framework to explain human judgments and decisions [3]. The first path is an easier and quicker way to decide, based on previous experiences. The second path is a more rational and deliberate way, based on critical thinking. Most of the cognitive biases that lead to medical mistakes arise from decisions made through the first path: clinical guesses based on automated associations with previous cases [4]. Another common cause of medical mistakes is harmful omission and commission. Omissions in medicine are accompanied by questions such as “Am I right about my diagnosis? Should I wait for a second opinion?”, which can delay action in emergencies, such as placing a chest tube for a pneumothorax. Commissions, on the other hand, usually come along with the thought “better safe than sorry” [5].
With the medical community’s awareness of cognitive biases rising, and more than 32 types already described [6], we would like to propose a new one, called “Schrödinger’s cat bias.”
Materials and methods
A non-systematic review was conducted by searching the PubMed/MEDLINE database for articles published at any time discussing cognitive biases, particularly in clinical practice. Search terms included “cognitive biases” in the title, abstract, or publication type, combined with “clinical practice” in the title or abstract. These terms were further combined with one or more of the following terms in the title or abstract fields: “cognitive heuristics” or “counterfactual thinking”. The resulting articles were cross-referenced for other pertinent articles not identified in the initial search that discussed specific biases, such as commission bias, or that were written by eminent scientists in this field. For the examples chosen to illustrate the bias, cases reflecting the author’s routine as an interventional cardiologist were described, and a direct search was performed for the relevant evidence on each case.
Results
The Schrödinger's cat cognitive bias
In 1935, Erwin Schrödinger proposed a thought experiment in which quantum mechanics is applied to a more complex, macroscopic system. The system comprises a cat and a radioactive particle whose decay triggers the release of a poison that kills the cat. This creates a state of superposition in which the cat is dead and alive at the same time: if the particle has not decayed, the cat is alive; if it has, the cat is dead [7].
The bias proposed in this paper arises because an individual does not have the chance to live parallel timelines: lacking practical experience of the alternative, he cannot analyze what would have happened to him if he or his doctor had made a different decision.
This is particularly relevant in medicine. Patients usually overestimate benefits and underestimate risks. If a patient comes out of an intervention alive, he may feel grateful for it because, even if his condition worsened, at least he did not live the timeline in which no intervention was performed.
Discussion
Examples of Schrödinger's cat bias
Situation 1: The Exercise Treadmill Test
The exercise treadmill test has a sensitivity of around 60%-70% and a specificity of around 71% [8], corresponding to a positive likelihood ratio of roughly 2.3 and a negative likelihood ratio of roughly 0.46.
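As a consistency check, these likelihood ratios follow directly from the reported test characteristics; taking a mid-range sensitivity of about 67% as an illustrative assumption:

\[
\mathrm{LR}^{+} = \frac{\text{sensitivity}}{1-\text{specificity}} = \frac{0.67}{1-0.71} \approx 2.3,
\qquad
\mathrm{LR}^{-} = \frac{1-\text{sensitivity}}{\text{specificity}} = \frac{1-0.67}{0.71} \approx 0.46.
\]

Applied to a hypothetical asymptomatic patient with a pre-test probability of 10% (a figure assumed only for illustration), a positive test raises the pre-test odds of \(0.10/0.90 \approx 0.11\) to \(0.11 \times 2.3 \approx 0.26\), i.e., a post-test probability of only about 20%, which underscores how weakly discriminating the test is.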
A patient comes to a medical appointment asking for an exercise treadmill test. There are five possible timelines after this appointment (Figure 1).
Figure 1. Situation 1 – The exercise treadmill test.
Different timelines that may unfold when the patient asks for a simple treadmill test. The reader must keep in mind that coronary angiography, the exam that confirms coronary artery lesions, is not free of complications (bleeding, stroke, contrast-induced nephropathy) and that a stent may not reduce the future risk of acute myocardial infarction or death and has complications of its own (stent thrombosis or restenosis, and bleeding).
Timeline 1.1: The physician refuses to request the exercise test; the patient gets upset and seeks a different doctor, who offers the exam and states: “What could go wrong in an exercise treadmill test? Who knows if there is myocardial ischemia?” The result is negative, and the patient never returns to the original doctor.
Timeline 1.2: The physician requests the exercise test, and it is negative for ischemia. The patient now feels protected and praises the doctor for taking care of his life.
Timeline 1.3: The physician requests the exam, and there is a true-positive result for ischemia. The patient now undergoes a percutaneous coronary intervention (PCI). He thanks the doctor for finding a disease that caused no symptoms but could have ended his life. The patient does not know that PCI probably does not reduce his risk of acute myocardial infarction (AMI) or death [9,10], so only the risk of the procedure remains.
Timeline 1.4: The physician requests the exam, and there is a false-positive result for ischemia. The patient undergoes coronary angiography; it does not show any obstructive coronary artery disease, and the patient feels safe again.
Timeline 1.5: The physician requests the exam, and there is a false-negative result for ischemia. Two sub-timelines may be imagined.
1.5.1: The patient and his family are satisfied with the doctor until the patient suffers a massive AMI. He and his family will not blame the doctor, who did his best to diagnose a silent disease.
1.5.2: The patient and his family return to the medical office, still concerned and seeking further exams. The doctor is driven to request increasingly invasive exams, which can culminate in Timeline 1.3, Timeline 1.4, or a loop of further exams and findings.
From the patient’s point of view, it is difficult to recognize that the physician in Timeline 1.1 was working to avoid unnecessary exams and procedures. Terms like “overdiagnosis” and “overtreatment” are not universally known, and a decision like this may be misinterpreted as an omission [11,12].
In fact, the risk of AMI is the same in all timelines, because an AMI does not require the pre-existing inducible ischemia that a positive exercise test detects. Timelines 1.3 and 1.4, however, add risk for the patient: coronary angiography is an invasive exam, and there is no clear evidence that invasive treatment, such as PCI, decreases AMI risk or mortality [9,10]. Timeline 1.3 will presumably be interpreted by the patient as a concerned doctor saving his life. The subject in Timeline 1.4 will be comforted that the angiography was negative and will possibly never consider that the exam that led to it need never have been performed. Schrödinger’s cat bias almost invariably encourages commission bias [1,13]. This example also illustrates two reasons why doctors and patients struggle to recognize and abandon medical reversals, i.e., previously established standards later statistically proven not to be beneficial: the Schrödinger’s cat bias itself, and the fact that contradicting mainstream practice undermines trust in the medical system [14].
Situation 2: The Surgery
A hypothetical surgery is indicated for a patient because, if it is not done, the patient will have a five percent risk of dying from the disease within 10 years. The patient does not know it, but his risk of dying during surgery or in the postoperative period is four percent. Over the following 10 years, one percent of operated patients will die from the disease. That means that, at the end of 10 years, neither strategy has a statistical benefit over the other.
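Under the stated hypothetical risks, the arithmetic behind this equivalence is straightforward:

\[
P(\text{death} \mid \text{surgery}) = 0.04 + (1 - 0.04) \times 0.01 = 0.0496 \approx 0.05 = P(\text{death} \mid \text{no surgery}).
\]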
At this time, two distinct timelines can take place (Figure 2).
Figure 2. Situation 2 – The surgery.
Different timelines that may unfold when surgery is proposed. The hypothetical surgery depicted carries a 4% risk of death during surgery or the postoperative period. Without surgery, the risk of dying from the disease is 5% over 10 years; for patients who survive the surgery, it is 1%, so the cumulative 10-year risk is the same under both strategies.
Timeline 2.1: The surgery was done, and the patient ran an immediate statistical risk of death of four percent but survived. His postoperative course, however, was troubled. Because of altered laboratory tests, he was started on an antibiotic regimen that ended up prolonging his hospital stay. After that, decubitus ulcers appeared. He was discharged 45 days later in fair condition, requiring home care, which lasted six months. After this period, his statistical risk of dying from the condition was one percent over the next 10 years. Six months before the 10th anniversary of the surgery, the patient felt sick. Tests revealed a worsening of the disease for which he had been operated on. He went through six hard months, with progressive functional limitation, until he died.
Timeline 2.2: The surgery was not done, and the patient remained at home, going about his daily activities for nine years and six months. Six months before the 10th anniversary of the decision not to operate, his illness worsened. He went through six hard months, with progressive functional limitation, until he died.
The quality of life described in Timeline 2.2 was superior to that in Timeline 2.1. Moreover, the patient in Timeline 2.1 was exposed to a four percent risk of immediate death that the patient in Timeline 2.2 never faced. Situations like this may be clearly recognizable in surgeries or invasive procedures that lack evidence of statistical benefit.
Because the patient perceived that the surgery saved his life (after all the complications, he left the hospital alive), it rarely, if ever, occurs to him that another timeline might have existed in which his life was free of complications. Note that both timelines carried the same five percent cumulative chance of dying. Timeline 2.2, in fact, may be more correlated with dissatisfaction with the doctor who did not indicate the surgery 10 years earlier.
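Because each patient lives only one of these branches, the equivalence is invisible at the individual level and emerges only in aggregate. A minimal simulation sketch (hypothetical code, using only the illustrative risks stated in Situation 2) makes this concrete:

```python
import random

def ten_year_death(operate: bool) -> bool:
    """Simulate one patient's single lived 'timeline' under the
    hypothetical risks of Situation 2."""
    if operate:
        if random.random() < 0.04:       # 4% perioperative mortality
            return True
        return random.random() < 0.01    # 1% 10-year disease mortality if operated
    return random.random() < 0.05        # 5% 10-year disease mortality if not operated

n = 1_000_000
with_surgery = sum(ten_year_death(True) for _ in range(n)) / n
without_surgery = sum(ten_year_death(False) for _ in range(n)) / n
print(f"10-year mortality with surgery:    {with_surgery:.2%}")    # ~4.96%
print(f"10-year mortality without surgery: {without_surgery:.2%}")  # ~5.00%
```

No individual run reveals the counterfactual; only the aggregate comparison does, and that is precisely the information the bias hides from a single patient.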
The problem
For many diseases, patients and physicians do not have a probabilistic sense of the problem. For example, in the patient’s perception, the presence of a partial coronary lesion virtually determines an imminent, inexorable infarction, although this is not correct. Since patient and doctor can each live only one of these timelines, and especially in the current era of defensive medicine, the tendency is to think more favorably of the timeline in which the exam or intervention was performed than of the one in which it was not, even if complications arise.
Counterfactual reasoning consists of mental representations of alternatives to past events [15]. The bias described in this paper refers to the absence of counterfactual reasoning in both patients and physicians, which may make them more inclined to value tests and therapies regardless of their outcomes and risks, because patients and doctors place excessive and naive reliance on tests and interventions [16]. Cognitive biases have been described in which counterfactual thinking contributes to unsatisfactory outcomes, for example when a doctor’s traumatic past experience affects his decisions in the next comparable case [17]. However, in our literature review, we did not find any other reference to a cognitive bias that results from the lack of counterfactual thinking.
Conclusions
The Schrödinger’s cat bias is a new cognitive bias, proposed in this paper, that denotes the difficulty patients and doctors have in imagining their situation had a drug or interventional therapy not been indicated. Once the patient has undergone treatment, the timeline in which he did not undergo it never existed, so neither the patient nor the doctor gained practical experience of that alternative reality. This cognitive bias favors therapy over the decision not to treat, notably in the era of “defensive medicine” in which we live.
The authors have declared that no competing interests exist.
Human Ethics
Consent was obtained or waived by all participants in this study.
Animal Ethics
Animal subjects: All authors have confirmed that this study did not involve animal subjects or tissue.
References
1. Cognitive bias in clinical medicine. O’Sullivan ED, Schofield SJ. J R Coll Physicians Edinb. 2018;48:225–232. doi: 10.4997/JRCPE.2018.306.
2. How and when do expert emergency physicians generate and evaluate diagnostic hypotheses? A qualitative study using head-mounted video cued-recall interviews. Pelaccia T, Tardif J, Triby E, Ammirati C, Bertrand C, Dory V, Charlin B. Ann Emerg Med. 2014;64:575–585. doi: 10.1016/j.annemergmed.2014.05.003.
3. Judgment under uncertainty: heuristics and biases. Tversky A, Kahneman D. Science. 1974;185:1124–1131. doi: 10.1126/science.185.4157.1124.
4. Toward an instance theory of automatization. Logan GD. Psychol Rev. 1988;95:492–527.
5. Cognitive errors detected in anaesthesiology: a literature review and pilot study. Stiegler MP, Neelankavil JP, Canales C, Dhillon A. Br J Anaesth. 2012;108:229–235. doi: 10.1093/bja/aer387.
6. The importance of cognitive errors in diagnosis and strategies to minimize them. Croskerry P. Acad Med. 2003;78:775–780. doi: 10.1097/00001888-200308000-00003.
7. Nobel lecture: superposition, entanglement, and raising Schrödinger’s cat. Wineland DJ. Rev Mod Phys. 2013;85:1103–1114. doi: 10.1103/RevModPhys.85.1103.
8. III Diretrizes da Sociedade Brasileira de Cardiologia sobre teste ergométrico (article in Portuguese). Meneghelo RS, Araújo CGS, Stein R, et al. Arq Bras Cardiol. 2010;95:1–26. doi: 10.1590/S0066-782X2010000800001.
9. Optimal medical therapy with or without percutaneous coronary intervention to reduce ischemic burden: results from the Clinical Outcomes Utilizing Revascularization and Aggressive Drug Evaluation (COURAGE) trial nuclear substudy. Shaw LJ, Berman DS, Maron DJ, et al. Circulation. 2008;117:1283–1291. doi: 10.1161/CIRCULATIONAHA.107.743963.
10. Initial invasive or conservative strategy for stable coronary disease. Maron DJ, Hochman JS, Reynolds HR, et al. N Engl J Med. 2020;382:1395–1407. doi: 10.1056/NEJMoa1915922.
11. Omission and commission in judgment and choice. Spranca M, Minsk E, Baron J. J Exp Soc Psychol. 1991;27:76–105.
12. A matter of perspective: choosing for others differs from choosing for yourself in making treatment decisions. Zikmund-Fisher BJ, Sarr B, Fagerlin A, Ubel PA. J Gen Intern Med. 2006;21:618. doi: 10.1111/j.1525-1497.2006.00410.x.
13. Countering cognitive biases in minimising low value care. Scott IA, Soon J, Elshaug AG, Lindner R. Med J Aust. 2017;206:407–411. doi: 10.5694/mja16.00999.
14. Reversals of established medical practices: evidence to abandon ship. Prasad V, Cifu A, Ioannidis JPA. JAMA. 2012;307:37–38. doi: 10.1001/jama.2011.1960.
15. The functional theory of counterfactual thinking. Epstude K, Roese NJ. Pers Soc Psychol Rev. 2008;12:168–192. doi: 10.1177/1088868308316091.
16. Taleb NN. Antifragile: Things That Gain From Disorder. New York, USA: Random House Publishing Group; 2012.
17. Pitfalls of counterfactual thinking in medical practice: preventing errors by using more functional reference points. Petrocelli JV. J Public Health Res. 2013;2:e24. doi: 10.4081/jphr.2013.e24.