Although everyone has the potential to be misled by false information, online misinformation is not an equal opportunity aggressor. Some people are more likely than others to believe misinformation and to serve as vectors by sharing it on social media. To effectively combat misinformation on social media, it is crucial to understand the underlying factors that lead certain people to believe and share false and misleading content online. A growing body of research has tackled this issue by investigating who is susceptible to online misinformation and under what circumstances. This literature can help shape future research and interventions to address health misinformation. We provide a brief overview of what we know about who is susceptible and what we still have to learn.
THEORETICAL PERSPECTIVES
One dominant perspective, sometimes referred to as the deficit hypothesis, is that people who believe misinformation lack sufficient knowledge or literacy to discriminate between true and false information. Although health researchers often focus on health literacy, other types of literacy deficits are relevant, such as digital literacy, media literacy, and science literacy. Brashier and Schacter recently argued that older adults share fake news on social media more frequently than younger adults do not because of cognitive declines but because of lower digital literacy. Older adults may be less savvy at identifying reliable online news sources, advertised (vs editorial) content, and manipulated photographs.1 Accordingly, some interventions have sought to address misinformation susceptibility by improving digital literacy (and related skills). For example, Guess et al. recently reported that a brief digital media literacy intervention improved detection of fake news headlines in both the United States and India.2
Another perspective is that people tend to be susceptible to misinformation that is consistent with their preexisting beliefs or worldview, and considerable research has shown that people preferentially believe information that aligns with what they already hold to be true.3 However, recent research has found that people may not be as influenced by their preexisting attitudes as previously thought. Specifically, in one study, individuals who had a more reflective cognitive style, as measured by the Cognitive Reflection Test, were better able to discern between true and false news content than were people who were more intuitive.4 Importantly, this occurred regardless of whether the news headlines were consistent or inconsistent with the participants’ political ideology. Individuals’ tendency to engage in greater reflective thought is also associated with their ability to detect COVID-19 misinformation.5
Moreover, other work has found that those who are worse at discerning between true and false information tend to overclaim their own knowledge and to be receptive to “pseudoprofound” statements (i.e., they rate random sentences filled with buzzwords but devoid of intended meaning as being profound).6 Evaluating these findings altogether, experts have speculated that receptivity to misinformation is related to being more “reflexively open-minded.”6 That is, people who are susceptible to misinformation fail to even consider that the content is inaccurate, regardless of their underlying political ideology or preexisting beliefs.
Accordingly, a recent study showed that a simple accuracy nudge that primes people to think about whether headlines are true is sufficient to increase the quality of COVID-19–related news content that people indicate they would share on social media.5 A Twitter field experiment employing a similar intervention has also reported promising results.7 These findings support the idea that people fall for misinformation because they fail to think about the accuracy of content that they come across on social media, not because they are exercising politically motivated reasoning or are simply confused about what is and is not true.
To summarize, there are three currently dominant (albeit not entirely mutually exclusive) theoretical perspectives addressing why certain people are susceptible to online misinformation: (1) being confused about what is true versus false, suggesting that knowledge or various literacies are a primary factor; (2) having strong preexisting beliefs or ideological motivations that lead to motivated reasoning and therefore a desire to believe and share misinformation; and (3) neglecting to sufficiently reflect about the truth or accuracy of news content that is encountered on social media.
QUESTIONS FOR FUTURE RESEARCH
There are, of course, other individual characteristics that may be particularly relevant to accepting health-related misinformation that are not as neatly characterized under these perspectives. An important element is trust in health experts and health science. Trust is multifaceted: people can possess varying levels of (dis)trust in doctors, medical science, scientists, and health care systems. Each type of distrust may make an individual more susceptible to health misinformation. More research is needed on different facets of trust and their implications for believing misinformation. Other individual characteristics that have not yet been adequately studied in relation to misinformation susceptibility include traits such as the need for autonomy and one’s orientation toward medicine. For example, a medical-maximizing orientation (i.e., the tendency to want active, aggressive approaches to health care) was recently found to be robustly associated with susceptibility to COVID-19 misinformation, a finding that warrants further explanation and exploration.5
A key unanswered question is whether susceptibility to misinformation is a generalized trait or is context dependent. The people who believe misinformation about politics may be the same people who believe misinformation about health5—however, there may be important differences between people who believe one or the other type of misinformation, and this issue has not been systematically investigated. For that matter, health misinformation spans many different health topics, and it is unclear whether people who believe misinformation about a particular health topic, such as vaccines, also tend to believe misinformation about other health topics (e.g., misinformation about cancer treatments, COVID-19). No research has explicitly addressed this question, but an answer to it could provide insight into the extent to which findings in one content area can inform other areas. Such knowledge would help to streamline the development and testing of interventions. For example, if we knew that similar people believe misinformation about health and politics and science, then we could more confidently extend interventions from one domain to others.
ADDRESSING SUSCEPTIBILITY
Although content moderation on social media platforms is clearly needed, we also need scalable interventions that can efficiently reach and effectively influence the people who are susceptible to believing and sharing health misinformation. These might be interventions to improve digital literacy or misinformation awareness in online environments. We envision a targeted public health campaign, and any such campaign first needs an excellent understanding of its audience: who they are, what motivates their beliefs and behaviors, and what is likely to persuade them. To understand our audience and deliver effective messages, we need to identify the characteristics of people who are particularly susceptible to misinformation. Identifying who is susceptible will also help us understand why they are susceptible. Understanding misinformation susceptibility in this way could help us make great strides in addressing it through targeted public health interventions.
CONFLICTS OF INTEREST
The authors have no conflicts of interest to disclose.
Footnotes
See also Chou and Gaysynsky, p. S270.
REFERENCES
- 1. Brashier NM, Schacter DL. Aging in an era of fake news. Curr Dir Psychol Sci. 2020;29(3):316–323. doi: 10.1177/0963721420915872.
- 2. Guess AM, Lerner M, Lyons B, et al. A digital media literacy intervention increases discernment between mainstream and false news in the United States and India. Proc Natl Acad Sci U S A. 2020;117(27):15536–15545. doi: 10.1073/pnas.1920498117.
- 3. Lewandowsky S, Ecker UKH, Seifert CM, Schwarz N, Cook J. Misinformation and its correction: continued influence and successful debiasing. Psychol Sci Public Interest. 2012;13(3):106–131. doi: 10.1177/1529100612451018.
- 4. Pennycook G, Rand DG. Lazy, not biased: susceptibility to partisan fake news is better explained by lack of reasoning than by motivated reasoning. Cognition. 2019;188:39–50. doi: 10.1016/j.cognition.2018.06.011.
- 5. Pennycook G, McPhetres J, Zhang Y, Lu JG, Rand DG. Fighting COVID-19 misinformation on social media: experimental evidence for a scalable accuracy-nudge intervention. Psychol Sci. 2020;31(7):770–780. doi: 10.1177/0956797620939054.
- 6. Pennycook G, Rand DG. Who falls for fake news? The roles of bullshit receptivity, overclaiming, familiarity, and analytic thinking. J Pers. 2020;88(2):185–200. doi: 10.1111/jopy.12476.
- 7. Pennycook G, Epstein Z, Mosleh M, Arechar A, Eckles D, Rand D. Understanding and reducing the spread of misinformation online. PsyArXiv Preprints. Published November 13, 2019. Available at: https://psyarxiv.com/3n9u8. Accessed August 4, 2020. doi: 10.31234/osf.io/3n9u8.