Abstract
A comment on “COVID‐19 and misinformation”.
Subject Categories: S&S: Economics & Business, S&S: Ethics
We read Emilia Niemiec's article "COVID‐19 and misinformation" with great interest (Niemiec, 2020). While we agree that censorship is an inadequate solution to the "infodemic" of false medical news on social media, we would like to provide additional context on the usefulness of education and censorship as tools for fighting misinformation. We also discuss empirically supported solutions to the problem of online misinformation, including accuracy nudges and crowdsourced ratings.
Our primary disagreement with Niemiec concerns the particular forms of education she recommends to combat online health misinformation. Several of these remedies, such as teaching about social media companies' business models, lack strong empirical support. Others may produce unintended consequences: for example, teaching about researcher bias and flawed peer‐review systems could increase vulnerability to health misinformation by undermining trust in science and scientists (Roozenbeek et al, 2020).
Not all educational approaches to reducing misinformation's impacts are unsupported or potentially misguided. Teaching strategies for spotting misinformation, such as checking authors' sources, improves discernment between real and fake news (Guess et al, 2020). Learning, in a game‐like environment, about the techniques commonly used to peddle misinformation reduces the perceived reliability of fake news items and improves confidence in correct reliability judgments (Basol et al, 2020).
Despite their promise, educational interventions have significant limitations. Chiefly, they require individuals who are motivated to seek them out and engage with them voluntarily. This complicates outreach to populations with lower digital media literacy, such as older individuals, who may be the most likely to share fake news. Furthermore, even effective educational interventions published in prominent journals do not eliminate vulnerability to misinformation: after learning strategies to spot misinformation, more than 20% of people still rated fake news as "somewhat accurate" or "very accurate" (Guess et al, 2020). A final limitation of educational interventions stems from their focus on the perceived accuracy of misinformation. Perceived accuracy has little impact on sharing behavior, likely because social media encourages individuals to focus on other factors, such as whether sharing will attract and please followers and friends (Pennycook et al, 2019). Accordingly, educational interventions that improve detection of online health misinformation may not reduce misinformation sharing. Interventions that fail to reduce sharing are incomplete because sharing begets misinformation exposure, which in turn begets increased perceptions of truth.
Clearly, education alone is an inadequate solution to the problem of medical misinformation on social media. Reducing harms associated with misinformation requires multipronged, empirically validated approaches, which may include forms of censorship, nudges, and crowdsourcing.
Censorship can prevent individuals from being exposed to false and potentially dangerous ideas. Preventing exposure is critical because merely viewing misinformation increases its perceived truth, as demonstrated in experiments on the "illusory truth effect", which extends to fake news and holds even when information is implausible or contradicts pre‐existing knowledge (Fazio et al, 2019).
A significant amount of misinformation promoting COVID‐19 “cures” and “preventative agents” is clearly false and potentially dangerous. In Iran, misinformation about using ethanol to cure and/or prevent infection, in combination with cultural factors (alcohol being illegal), has precipitated fatal methanol poisonings (Hassanian‐Moghaddam et al, 2020). False claims that the COVID‐19 vaccine contains a microchip and will alter DNA may encourage COVID‐19 vaccine hesitancy (Roozenbeek et al, 2020) and thereby interfere with the establishment of herd immunity.
Disabusing individuals of beliefs inspired by this misinformation will be difficult: Individuals often continue to rely upon misinformation even after viewing explicit corrections—a phenomenon known as the “continued influence effect” (Basol et al, 2020). Furthermore, human cognition appears organized to resist belief modification, and humans display cognitive biases, such as confirmation bias, that help to maintain beliefs (Bronstein et al, 2019).
Because censorship prevents exposure to false information, and thus intervenes before beliefs become established and maintained by these biases, it has immense value in the fight against online health misinformation. To be clear, we advocate the deletion of false and dangerous information; other forms of censorship, such as labeling information as disputed, can have unintended consequences, for instance causing unlabeled false information to seem more accurate (the "implied truth effect"; Pennycook & Rand, 2019).
In cases where censorship is less well‐suited, such as when information’s epistemic status is highly ambiguous (Niemiec, 2020), social media platforms could use nudges and crowdsourced judgments to limit users’ exposure. Platforms could, for example, require individuals to rate a non‐health‐related news headline’s accuracy before viewing health‐related news. Research has shown that similar accuracy nudges nearly tripled the magnitude of individuals’ stated preference for “sharing” true over false news headlines (Pennycook et al, 2020). If this translates to reduced misinformation sharing, this intervention could significantly reduce exposure to fake news. Exposure could be limited further by crowdsourcing judgments of health‐related media outlets’ trustworthiness and employing algorithms that disfavor content from outlets considered less trustworthy (Pennycook & Rand, 2019).
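To make the crowdsourcing approach concrete, the minimal Python sketch below illustrates how aggregated layperson trust ratings could down‐weight content from less trusted outlets in a ranked feed. It is a toy illustration of the logic described by Pennycook and Rand (2019), not any platform's actual algorithm; the outlet names, ratings, and weighting scheme are hypothetical assumptions introduced purely for demonstration.

```python
# Purely illustrative sketch; not any platform's actual ranking system.
# Outlet names, ratings, and the weighting scheme are hypothetical assumptions.
from statistics import mean

# Hypothetical crowdsourced trust ratings (0 = untrustworthy, 1 = trustworthy)
crowd_ratings = {
    "outlet_a": [0.9, 0.8, 0.85, 0.95],
    "outlet_b": [0.2, 0.1, 0.3, 0.25],
}

def source_trust(outlet):
    """Aggregate laypeople's ratings into one trust score; unknown outlets get a neutral prior."""
    ratings = crowd_ratings.get(outlet)
    return mean(ratings) if ratings else 0.5

def rank_feed(items):
    """Order feed items so that raw engagement is discounted by the trustworthiness of the source."""
    return sorted(
        items,
        key=lambda item: item["engagement"] * source_trust(item["outlet"]),
        reverse=True,
    )

feed = [
    {"headline": "Unproven COVID-19 'cure' goes viral", "outlet": "outlet_b", "engagement": 950},
    {"headline": "Vaccine trial results published", "outlet": "outlet_a", "engagement": 400},
]

for item in rank_feed(feed):
    print(item["headline"])
```

In this toy example, the post from the less trusted outlet is ranked below the post from the more trusted outlet despite its higher engagement, which is the intended effect of weighting exposure by crowdsourced source quality.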
We are glad that Niemiec (2020) has called attention to the fight against online health misinformation, which the COVID‐19 pandemic has imbued with a new sense of urgency. We posit that creative, empirically supported solutions such as the ones we mentioned, when combined with more traditional tools like censorship and education, can make substantial progress in this fight.
Author contributions
MVB and SV developed the manuscript’s concept. MVB drafted the manuscript. SV provided critical revisions. All authors approved the final manuscript.
Acknowledgements
M.V. Bronstein is supported by the Wells Family Trust, which had no role in writing the manuscript or deciding to submit it for publication.
EMBO Reports (2021) 22: e52282
References
- Basol M, Roozenbeek J, van der Linden S (2020) Good news about bad news: gamified inoculation boosts confidence and cognitive immunity against fake news. J Cogn 3: 1–9
- Bronstein MV, Pennycook G, Joormann J, Corlett PR, Cannon TD (2019) Dual‐process theory, conflict processing, and delusional belief. Clin Psychol Rev 72: 101748
- Fazio LK, Rand DG, Pennycook G (2019) Repetition increases perceived truth equally for plausible and implausible statements. Psychon Bull Rev 26: 1705–1710
- Guess AM, Lerner M, Lyons B, Montgomery JM, Nyhan B, Reifler J, Sircar N (2020) A digital media literacy intervention increases discernment between mainstream and false news in the United States and India. Proc Natl Acad Sci USA 117: 15536–15545
- Hassanian‐Moghaddam H, Zamani N, Kolahi AA, McDonald R, Hovda KE (2020) Double trouble: methanol outbreak in the wake of the COVID‐19 pandemic in Iran – a cross‐sectional assessment. Crit Care 24: 10–12
- Niemiec E (2020) COVID‐19 and misinformation. EMBO Rep 19–22
- Pennycook G, Epstein Z, Mosleh M, Arechar AA, Eckles D, Rand DG (2019) Understanding and reducing the spread of misinformation online
- Pennycook G, McPhetres J, Zhang Y, Lu JG, Rand DG (2020) Fighting COVID‐19 misinformation on social media: experimental evidence for a scalable accuracy‐nudge intervention. Psychol Sci 31: 770–780
- Pennycook G, Rand DG (2019) Fighting misinformation on social media using crowdsourced judgments of news source quality. Proc Natl Acad Sci USA 116: 2521–2526
- Roozenbeek J, Schneider CR, Dryhurst S, Kerr J, Freeman ALJ, Recchia G, van der Bles AM, van der Linden S (2020) Susceptibility to misinformation about COVID‐19 around the world. R Soc Open Sci 7: 201199