Several observations in clinical practice, including the practice of digestive diseases, are difficult to explain on a rational basis. These include doing "too much," such as repeated testing, parallel testing, and expensive, extensive testing for unlikely rare disorders, as well as doing "too little," such as neglecting screening for common cancers, vaccination, or counseling. Some of these practices seem baffling at first glance, especially given the availability of decision-analysis models, cost-effectiveness models, and, occasionally, evidence-based published guidelines that delineate rational medical decision making. Conventional economics (i.e., rational utility theory) assumes that decisions are arrived at through a "rational" process of weighing benefits against costs with the goal of maximizing one's utility. Medical decision making is modeled on similar assumptions; namely, physicians (and patients) should make utility-maximizing (i.e., rational economic) decisions about each patient who presents to them, considering all the available data and balancing benefits/harms against costs. However, there has been growing interest in behavioral economics, which concerns itself with describing how decisions are actually made. Behavioral economics assumes that we (people) are "predictably irrational"(1) in the way we make decisions in many domains, including those pertaining to health, and that understanding the basis for this irrationality might be used to make us "do the right thing" despite ourselves. Daniel Kahneman was awarded the Nobel Prize in Economics in 2002 for his and Amos Tversky's empirical work describing how individuals do not make decisions through a rational, utility-maximizing process. Kahneman and Tversky (1973) demonstrated that individuals use mental shortcuts (heuristics) to make decisions efficiently and reliably, but in ways that often run contrary to models that determine preferences by weighing benefits against costs.
Furthermore, using the term "prospect" to refer to a set of outcomes with a probability distribution, they showed that when winning is possible but not probable (i.e., when probabilities are low), most people choose the prospect that offers the larger gain. Conversely, given the same quantitative probability of a loss, most people choose the prospect that offers the smaller loss. Their prospect theory (1979) illustrated how individuals predictably overweigh losses and underweigh gains.
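The asymmetry between losses and gains can be made concrete with a small numerical sketch. The value function below uses the parameter estimates commonly cited from Tversky and Kahneman's later work (an exponent of about 0.88 for diminishing sensitivity and a loss-aversion coefficient of about 2.25); these specific numbers are an illustrative assumption on our part, not figures from this article.

```python
# Illustrative sketch of a prospect-theory value function.
# alpha (diminishing sensitivity) and lam (loss aversion) are the
# commonly cited Tversky-Kahneman estimates, used here only as an
# assumption for illustration.
def prospect_value(x, alpha=0.88, lam=2.25):
    """Subjective value of a gain (x > 0) or loss (x < 0)."""
    if x >= 0:
        return x ** alpha
    return -lam * ((-x) ** alpha)

gain = prospect_value(100)    # subjective value of gaining 100
loss = prospect_value(-100)   # subjective value of losing 100
# For symmetric amounts, the loss looms larger than the gain
# by exactly the loss-aversion factor lam:
print(abs(loss) / gain)
```

Under these assumed parameters, a loss of a given size carries roughly 2.25 times the subjective weight of an equal-sized gain, which is the sense in which losses are "overweighed."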
We will illustrate these concepts with an example of seemingly non-rational behavior in GI practice, highlight the relevant explanations from behavioral economics, and conclude with proposed ways to intervene based on these theories.
Observation
Patients with Barrett's esophagus (BE) have a very small risk of esophageal adenocarcinoma, approximately 0.5% per year. Endoscopic surveillance of patients with BE but no dysplasia, even at 5-year intervals, is an expensive practice. Models of cost-effectiveness have shown that endoscopic surveillance programs in the United States either do more harm than good compared with no surveillance or are unlikely to be cost-effective at usual levels of willingness to pay. Similarly, in UK studies, non-surveillance dominated surveillance (i.e., it cost less and conferred more benefit)(2); in other words, surveillance programs do more harm than good(3).
Yet patients and providers remain highly concerned. Surveillance of BE is responsible for a large volume of endoscopy and office visits. One possible explanation is that reimbursement and defensive medicine drive(4) many of these seemingly irrational practices. However, surveillance of BE and/or esophageal adenocarcinoma has been reported in healthcare systems where there are no financial incentives to do more and medical malpractice cases are rare.
Explanation
Cognitive heuristics are rules of thumb used to make various types of judgments. They can be used properly to simplify judgment under uncertain circumstances. However, people make systematic errors in judgment based on common cognitive biases(5). Lay people think about risk in ways that deviate from those of expert risk assessors. Clinicians, despite their medical expertise, are susceptible to the same traps when making clinical assessments because they are not trained in, or proficient at, estimating and calculating risk. The following classic heuristics explain much apparently non-rational behavior.
1) Default options
People in general are far more likely to accept customary or non-action defaults than to actively choose an alternative (non-default) action, even when the alternative action is good for them(6). Rational economics assumes that people weigh utilities for every given choice frame and then pick the utility-maximizing choice. By this logic, it would be clear that screening and surveillance in BE is not a good choice. However, behavioral economics says that we overweigh default options even when we assign higher utility to the non-default option. For the patient, this makes the physician's recommendation to undergo surveillance a very important factor in opting for surveillance. For the physician, expectations set by professional society guidelines and community practice standards may play a similar role.
2) Anchoring heuristic
The presence or absence of risk estimates in health information creates "anchors" for our choices. The initial "anchor," or first impression, has a disproportionate bearing on ultimate judgments, often because people do not make sufficient adjustments from that anchor. Base rates of cancer, including in our example of BE, are salient cognitive and affective anchors; when initial anchors are inaccurate, however, inadequate adjustment produces the "base-rate fallacy"(7). A survey of patients with BE participating in an endoscopic surveillance program found that 68% of patients overestimated their 1-year risk of cancer, with a mean estimated 1-year cancer risk of close to 14%(8). People who perceive themselves to be at greater risk will seek out more information, but inconsistent information creates ambiguity, and perceiving high personal risk can also produce more anxiety.
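The size of the anchoring error in the cited survey can be quantified against the article's ~0.5% annual risk figure. The comparison below is our own simple calculation, not a result reported in the survey itself.

```python
actual_one_year = 0.005     # ~0.5% per year, as cited in the text
perceived_one_year = 0.14   # mean 1-year estimate from the survey (ref 8)

# How many times larger the perceived risk is than the actual risk:
overestimate = perceived_one_year / actual_one_year
print(f"{overestimate:.0f}-fold overestimate")
```

A mean perceived 1-year risk of 14% is roughly a 28-fold overestimate, illustrating how far judgments can drift when the initial anchor is inaccurate.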
3) Availability heuristic
People often make judgments of frequency or risk based on whatever information is most accessible or available to them. Vivid or sensational events leave a more lasting impression than common, mundane events(5). Therefore, physicians who have been "burned" by a missed diagnosis of esophageal adenocarcinoma are more likely to overweigh the risk of cancer in BE and consequently test too many people, including many who do not fit the risk profile. Similarly, patients who have a family member with esophageal adenocarcinoma may show a disproportionate effect on risk-averse behavior toward this cancer. This heuristic biases the computation of risk by overweighing the co-occurrence of a risk factor (BE) with an outcome (cancer) and by overweighing opinions about risk when determining one's chance of an outcome. Furthermore, patients with additional anxiety related to improper or unadjusted anchors may further overweigh risks based on the recurrent mental image of esophageal cancer. The attention given to the small risk of esophageal cancer, created by the availability and anchoring heuristics, may be disproportionate compared with the attention directed at screening for common and more likely cancers (e.g., colon cancer). The interaction of availability bias and the base-rate fallacy is an important theme in physicians' explanations for why they often do not follow established clinical guidelines(9).
4) Endowment effects
Once a patient’s symptoms or complaints are given a diagnostic label, the diagnosis produces an endowment effect on the patient and physician. Endowment effects skew decisions based on the perception of gain or loss (10). In the example of BE without dysplasia, patients and physicians who “own” the diagnosis of BE will overweigh the chance of dysplasia and cancer and underweigh the probability that no harm will occur. Patient and physician behaviors related to cancer screening and treatment are more consistent with underlying values for “avoiding regret” or “loss aversion” rather than a rational calculation of utilities or weighted preferences(11).
The presence of endowment effects moderates the interaction between perceptions of disease risk and health behaviors, especially in cancer prevention behavior. The presence of real or perceived risk factors for cancer (endowment effects) correlates with perceived cancer risk, with the most frequently cited general factors being diet, family history, general health, and experiences of cancer (12). Patients who overestimated cancer risk in BE were more likely to have more symptomatic reflux (8).
So we are irrational about our risk estimates; what can we do about it?
Behavioral economics advocates policy decisions that help us break these cognitive biases. The following proposed strategies exploit the same cognitive biases that produce the behavioral distortions in the first place.
1] Structural barriers and facilitators are commonly described in the cancer screening literature
These can be configured in ways that make screening of low-risk patients more difficult and screening of high-risk patients easier without interfering in the dynamics of the doctor-patient relationship. For low-risk patients, such as those with BE without dysplasia, endoscopic surveillance (or at least strategies that involve repeated frequent endoscopy) can be made non-default options with bureaucratic obstacles to scheduling and even higher insurance co-pays. Screening for high-risk patients can be made a default option with no co-pays and automatic scheduling and timely reminders.
Clearly, endorsement by professional societies and possibly modification of current BE surveillance guidelines are required. In fact, professional society guidelines can be designed in such a way that the "right" choice (defined by evidence) is the default one. While these interventions may appear paternalistic at first glance, they do not directly interfere with the doctor-patient encounter or physicians' determinations of medical appropriateness. Furthermore, the paternalism is asymmetric in that patients and doctors who maintain strong preferences for low-effectiveness strategies are still free to pursue them, in the same manner that patients in managed care plans can still see "out-of-network" physicians(13).
2] Avoidance of labeling pre-cancerous conditions
Notwithstanding the pathophysiological terms required for research in this area, patients with endoscopic and histological findings consistent with BE without dysplasia should not be labeled with a "diagnosis," and in particular the term "pre-cancerous" can be avoided. Technically accurate information about the endoscopic and histological findings and the dysplasia/cancer risk estimates can be given to provide information without endowing the label of a pathological condition.
3] Point of care decision support
Decision support tools can be included as standard language on endoscopy or pathology reports. These tools can target both physicians and patients using simple language and multiple presentations of the same numerical estimates of 5-year or lifetime probabilities of cancer. These numerical estimates can serve as accurate risk anchors that frame future discussions. Along with these risk anchors, the potential harms related to frequent surveillance can also be clearly described. Availability bias can be reduced by presenting absolute risk numbers for patients with BE who never progressed to cancer juxtaposed against the much smaller proportion of those with progression.
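One concrete way a decision-support tool could present absolute risk numbers, juxtaposing patients who progress against those who do not, is as natural frequencies. The sketch below is our illustration only, using the ~0.5%-per-year figure from the text, a hypothetical 10-year horizon, and a hypothetical helper name; it assumes a constant annual risk.

```python
# Hypothetical natural-frequency framing for a report footer.
# Assumes a constant, independent annual risk (a simplification).
def natural_frequency_statement(annual_risk, years, cohort=1000):
    cumulative = 1 - (1 - annual_risk) ** years
    progress = round(cohort * cumulative)
    return (f"Of {cohort} patients like you followed for {years} years, "
            f"about {progress} would develop cancer and "
            f"{cohort - progress} would not.")

print(natural_frequency_statement(0.005, 10))
```

Presenting "49 of 1000" alongside "951 of 1000" in this way gives both the progression and non-progression counts equal salience, which is the juxtaposition the text proposes for countering availability bias.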
In conclusion, concepts and lessons learned from behavioral economics can be used to explain decision making in clinical practice and to design interventions aimed at improving these decisions.
Reference List
- 1. Ariely D. Predictably Irrational. New York: Harper Collins; 2009.
- 2. Somerville M, Garside R, Pitt M, Stein K. Surveillance of Barrett’s oesophagus: is it worthwhile? Eur J Cancer. 2008 Mar;44(4):588–599. doi: 10.1016/j.ejca.2008.01.015.
- 3. Garside R, Pitt M, Somerville M, Stein K, Price A, Gilbert N. Surveillance of Barrett’s oesophagus: exploring the uncertainty through systematic review, expert workshop and economic modelling. Health Technol Assess. 2006 Mar;10(8):1–iv. doi: 10.3310/hta10080.
- 4. Rubenstein JH, Saini SD, Kuhn L, McMahon L, Sharma P, Pardi DS, et al. Influence of malpractice history on the practice of screening and surveillance for Barrett’s esophagus. Am J Gastroenterol. 2008 Apr;103(4):842–849. doi: 10.1111/j.1572-0241.2007.01689.x.
- 5. Kahneman D, Slovic P, Tversky A. Judgment Under Uncertainty: Heuristics and Biases. New York: Cambridge University Press; 1982.
- 6. Halpern SD, Ubel PA, Asch DA. Harnessing the power of default options to improve health care. N Engl J Med. 2007 Sep 27;357(13):1340–1344. doi: 10.1056/NEJMsb071595.
- 7. Fagerlin A, Zikmund-Fisher BJ, Ubel PA. How making a risk estimate can change the feel of that risk: shifting attitudes toward breast cancer risk in a general public survey. Patient Educ Couns. 2005 Jun;57(3):294–299. doi: 10.1016/j.pec.2004.08.007.
- 8. Shaheen NJ, Green B, Medapalli RK, Mitchell KL, Wei JT, Schmitz SM, et al. The perception of cancer risk in patients with prevalent Barrett’s esophagus enrolled in an endoscopic surveillance program. Gastroenterology. 2005 Aug;129(2):429–436. doi: 10.1016/j.gastro.2005.05.055.
- 9. Cavazos JM, Naik AD, Woofter A, Abraham NS. Barriers to physician adherence to nonsteroidal anti-inflammatory drug guidelines: a qualitative study. Aliment Pharmacol Ther. 2008 Sep 15;28(6):789–798. doi: 10.1111/j.1365-2036.2008.03791.x.
- 10. Ubel PA. Free Market Madness: Why Human Nature is at Odds with Economics—and Why it Matters. Boston, MA: Harvard Business Press; 2009.
- 11. Collins ED, Moore CP, Clay KF, Kearing SA, O’Connor AM, Llewellyn-Thomas HA, et al. Can women with early-stage breast cancer make an informed decision for mastectomy? J Clin Oncol. 2009 Feb 1;27(4):519–525. doi: 10.1200/JCO.2008.16.6215.
- 12. Robb KA, Miles A, Wardle J. Perceived risk of colorectal cancer: sources of risk judgments. Cancer Epidemiol Biomarkers Prev. 2007 Apr;16(4):694–702. doi: 10.1158/1055-9965.EPI-06-0151.
- 13. Loewenstein G, Brennan T, Volpp KG. Asymmetric paternalism to improve health behaviors. JAMA. 2007 Nov 28;298(20):2415–2417. doi: 10.1001/jama.298.20.2415.
