A few years ago, British women were informed that the use of the contraceptive pill leads to a 2-fold increase in the risk of thromboembolism. Many stopped taking the pill, which resulted in unwanted pregnancies and abortions. If the official statement had instead been that the pill increases the risk from 1 to 2 in 14,000 women, few women would have been scared. Life and death can depend on how information is framed.
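The arithmetic behind the two framings can be made explicit. Here is a minimal sketch (in Python, using only the figures quoted above) contrasting the relative and the absolute change:

```python
# Risk of thromboembolism, per the figures above: 1 in 14,000 without
# the pill, 2 in 14,000 with it.
baseline_risk = 1 / 14_000
pill_risk = 2 / 14_000

relative_increase = pill_risk / baseline_risk   # 2.0 -> the alarming "2-fold"
absolute_increase = pill_risk - baseline_risk   # ~0.00007 -> 1 extra case in 14,000

print(f"Relative increase: {relative_increase:.0f}-fold")
print(f"Absolute increase: {absolute_increase * 14_000:.0f} extra case per 14,000 women")
```

Both statements are arithmetically correct; only the second conveys how small the underlying risk is.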
Yet framing itself is not the problem—every piece of information communicated requires a form or frame. The problem is a larger, societal one: the lack of education in understanding uncertainties and risks, also known as innumeracy. For instance, Sheridan et al. in this issue of the Journal of General Internal Medicine report that only 2% of patients could correctly answer three simple numeracy questions, and that nonwhites, females, and patients without college education misunderstood treatment benefits most.1 Our schools do not teach children the mathematics that will be most useful in their future lives: statistical thinking. Statistical thinking and the art of framing are also absent from most medical curricula and from continuing education offered to physicians. This omission is costly and irresponsible.
We could easily help patients, medical students, and physicians, turning their collective innumeracy into insight. Programs now exist that can achieve this goal with simple tools and in a short time.2 The know-how rests on two sources: empirical studies that show when framing has an effect, as in the excellent review by Moxey et al.3 (this issue), and, equally important, theoretical knowledge of why framing has an effect. Here are a few examples of frames that can make a difference.
First, the use of single-event probabilities tends to confuse patients. Consider the case of a physician who used to inform his patients of Prozac's side effects by saying, “If you take Prozac, you have a 30% to 50% chance of a sexual problem.” When the physician changed his way of communicating the risk by using the frequency statement “out of every 10 patients to whom I prescribe Prozac, 3 to 5 experience a sexual problem,” his patients were less anxious and more willing to take Prozac. It turned out that many of them had originally understood that “something would go awry in 30% to 50% of my sexual encounters.” For single-event versus frequency statements, the mechanism of framing is clear. A single-event probability, by definition, does not specify the reference class (30% to 50% of what?). The physician thought of his patients, but his patients thought of their own sexual encounters. The confusion can be avoided by consistently using frequency statements.
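The ambiguity can be made concrete with a toy calculation. The encounter count below is an illustrative assumption of mine, not a figure from the case:

```python
# "A 30% to 50% chance of a sexual problem" -- but 30% to 50% of *what*?
# Two readings of the same probability, far apart in their implications.
p_low, p_high = 0.30, 0.50

# The physician's reference class: his patients. Out of every 10,
# 3 to 5 will experience a problem at all.
patients = 10
print(f"{p_low * patients:.0f}-{p_high * patients:.0f} of {patients} patients affected")

# The patients' reference class: their own encounters. (100 encounters
# is a purely illustrative number, not from the case above.)
encounters = 100
print(f"{p_low * encounters:.0f}-{p_high * encounters:.0f} of {encounters} encounters going awry")
```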
Second, there is strong empirical evidence that conditional probabilities (such as sensitivity and specificity) tend to confuse minds, especially when one wants to infer the chance of having a disease after a positive test. The reason for the confusion again lies in the reference classes, which in this case are switched: sensitivity refers to patients with the disease, and specificity to patients without it. This second form of confusion can be avoided by using natural frequencies.4
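A worked example shows how natural frequencies defuse the switched reference classes. The prevalence, sensitivity, and specificity below are illustrative values of my own choosing, not figures from the studies cited:

```python
# Illustrative screening-test figures (assumptions, for demonstration only):
prevalence = 0.01     # 1% of the population has the disease
sensitivity = 0.90    # P(positive | disease)
specificity = 0.91    # P(negative | no disease)

# Conditional-probability route (Bayes' rule): correct, but opaque.
p_positive = sensitivity * prevalence + (1 - specificity) * (1 - prevalence)
ppv = sensitivity * prevalence / p_positive

# Natural-frequency route: the same inference as simple counts.
n = 1000
with_disease = prevalence * n                             # 10 people have the disease
true_positives = sensitivity * with_disease               # 9 of them test positive
false_positives = (1 - specificity) * (n - with_disease)  # ~89 healthy people also test positive
ppv_from_counts = true_positives / (true_positives + false_positives)

print(f"P(disease | positive test) = {ppv:.1%} (Bayes) = {ppv_from_counts:.1%} (counts)")
# About 9%: of roughly 98 positive tests, only 9 come from actual disease.
```

Stated as counts (“of 1,000 people, 10 have the disease; 9 of them, and about 89 of the healthy, test positive”), the answer can be read off directly, with no formula needed.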
Third, relative risk reduction (RRR) tends to mislead patients into overestimating the benefits of therapies and consequently increases their willingness to consent (uninformed consent), compared to absolute risk reduction (ARR) and number needed to treat (NNT). Again, the reason has to do with the reference class. For instance, many health organizations inform women that “mammography screening reduces the risk of dying from breast cancer by 25%” (RRR). But 25% of what? A woman may assume that the percentage refers to women like herself who consider screening, and erroneously conclude that 25 of every 100 women who participate in screening are saved. In contrast, when one frames the benefit as “screening reduces the risk of dying from 4 to 3 in 1,000 women” (an ARR of 1 in 1,000), the reference class is made clear.
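The three formats are simple transformations of the same two numbers. A sketch using only the mammography figures quoted above:

```python
# Deaths from breast cancer per 1,000 women, per the example above.
deaths_without_screening = 4 / 1000
deaths_with_screening = 3 / 1000

rrr = ((deaths_without_screening - deaths_with_screening)
       / deaths_without_screening)                      # 0.25 -> "reduces risk by 25%"
arr = deaths_without_screening - deaths_with_screening  # 0.001 -> "from 4 to 3 in 1,000"
nnt = 1 / arr                                           # 1,000 women screened per death averted

print(f"RRR = {rrr:.0%}, ARR = {arr * 1000:.0f} in 1,000, NNT = {nnt:.0f}")
```

The same benefit sounds large as a 25% RRR and small as an ARR of 1 in 1,000; only the latter reveals the size of the underlying risk.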
Thus, instead of these three frames that tend to confuse patients and physicians (single-event probabilities, conditional probabilities, and RRR), we can teach physicians to use the corresponding frames that foster insight: frequency statements, natural frequencies, and absolute risks.2,4 Good hypotheses about why framing influences minds can assist in planning new studies and ordering the apparent chaos of positive and negative findings.
In contrast, Sheridan et al.1 report that RRR led to more correct answers by patients than did NNT and ARR. This surprising result may well be due to their unusual phrasing of ARR (“treatment A reduces the chance that you will develop disease Y by 10 per 1,000 persons”), which is a hybrid between a single-person and a frequency statement, and their equally awkward phrasing of NNT. I wager that a clearer statement of ARR and NNT will increase the understanding of the size of benefit (e.g., ARR: “participating in treatment A prevents 10 out of every 1,000 persons from getting disease Y”; NNT: “100 patients need to undergo treatment A in order to prevent 1 from getting disease Y”). Transparent wording is the essence of a frame that fosters insight.
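The two transparent phrasings suggested above are mutually consistent, since NNT is simply the reciprocal of ARR. A one-line check with the figures from Sheridan et al.'s example:

```python
# ARR from the example above: treatment A prevents 10 of every 1,000
# persons from getting disease Y.
arr = 10 / 1000
nnt = 1 / arr   # 100 patients must be treated to prevent 1 case
print(f"NNT = {nnt:.0f}")
```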
Finally, there are positive frames (“you have an 80% chance of surviving surgery”) versus negative frames (“you have a 20% chance of dying from surgery”). My hypothesis is that such frames have an effect when patients can reasonably assume that the physician's choice of frame conveys additional information, for instance about the dynamics of the situation: the positive frame can imply that surgery will increase the chance of survival from 0% to 80%, whereas the negative frame suggests that surgery increases the chance of dying from 0% to 20%.
Understanding when and why framing has an effect is essential for informed consent and shared decision making. It is high time this knowledge entered the curricula of medical schools.
REFERENCES
1. Sheridan SL, Pignone MP, Lewis CL. A randomized comparison of patients' understanding of number needed to treat and other common risk reduction formats. J Gen Intern Med. 2003;18:884–92. doi:10.1046/j.1525-1497.2003.21102.x.
2. Gigerenzer G. Calculated Risks: How to Know When Numbers Deceive You. New York: Simon & Schuster; 2002. (UK edition: Reckoning with Risk: Learning to Live with Uncertainty. London: Penguin Books; 2002.)
3. Moxey A, O'Connell D, McGettigan P, Henry D. Describing treatment effects to patients: how they are expressed makes a difference. J Gen Intern Med. 2003;18:948–59. doi:10.1046/j.1525-1497.2003.20928.x.
4. Hoffrage U, Lindsey S, Hertwig R, Gigerenzer G. Communicating statistical information. Science. 2000;290:2261–2. doi:10.1126/science.290.5500.2261.