Published in final edited form as: Journal of Applied Research in Memory and Cognition, 11(4), 471–477 (2022). doi: 10.1037/mac0000090

Misinformation and the Sins of Memory: False-Belief Formation and Limits on Belief Revision

Eryn J. Newman, Briony Swire-Thompson, & Ullrich K. H. Ecker

Misinformation has the potential to negatively affect both individual decision making and the common good. For example, belief in misinformation can have a negative impact on public health, environmental behaviors, and democracy (Cook, 2019; Lewandowsky et al., 2017; Loomba et al., 2021; Nisbet et al., 2021; Swire-Thompson & Lazer, 2022). In both online and offline settings, people’s vulnerability to misinformation arises in part from features of our cognitive system. These cognitive features can lead us to form false beliefs, remember false claims as true, and struggle with memory updating and belief revision when misinformation is corrected. Here, we provide a commentary on the target article Memory Sins in Applied Settings: What Kind of Progress? (Schacter, 2022b), as many of Schacter’s sins of memory are fundamental to understanding the mechanisms behind the cognitive impacts of misinformation. Specifically, the current article discusses these sins in the context of (1) the formation of false beliefs and (2) the continued influence that misinformation can have on cognition after having been corrected.

False-Belief Formation and Inherent Vulnerabilities of Truth Assessment

Whether online or otherwise, we are routinely tasked with assessing the credibility of information we encounter. When sorting true from false, people draw on several criteria to make inferences about veracity (Brashier & Marsh, 2020; Ecker, Lewandowsky, et al., 2022; Schwarz, 2015; Schwarz et al., 2016). For instance, people are wary of inconsistencies (Unkelbach & Rom, 2017; Winkielman et al., 2012); when information is inconsistent or incompatible with what people already believe, they are less likely to endorse it as factual (Jiang et al., 2021; Oyserman & Dawson, 2020; Pennycook & Rand, 2021b). People are also sensitive to the internal coherence of a message, that is, whether it is logical and forms a rational argument (Pennington & Hastie, 1992; Simon, 2004). Through a more social lens, people draw on consensus in forming attitudes and beliefs (Festinger, 1954; van der Linden et al., 2015); when there is a perception that others agree on a given account of reality, we are inclined to concur (Kerr & van der Linden, 2022). The source of the information matters too; people are generally more inclined to believe credible sources (Briñol & Petty, 2009; Mackie et al., 1990; Nadarevic et al., 2020). While a careful analysis of these criteria may lead to a discerning conclusion regarding truth, some features of truth assessment, such as a reliance on intuition and feelings to infer veracity, can lull us into error, and associated sins of human memory can contribute to false-belief formation.

When people seek and encode information, veracity is not always at the front of their mind, and information can appeal for reasons other than its factual truthfulness (Acerbi, 2019). Moreover, built-in biases can introduce error, which can lead us to prioritize some accounts of truth over others. Schacter identified bias as a key sin of human memory, whereby beliefs and knowledge can shape how we encode and recollect information. For instance, bias in favor of a consistent identity can systematically shape how we recall our pasts (Schacter, 1999, 2022a). A similar bias operates on assessments of truth: Identity can color perceptions of what seems right or feels true, with a tendency for us to endorse ideas and people that align with how we see ourselves and our groups (Ecker et al., 2021; Frenda et al., 2013; Hornsey & Fielding, 2017; Murphy et al., 2019; Pennycook & Rand, 2021a; Swire, Berinsky, et al., 2017; Wang et al., 2022). This identity-oriented bias can make us vulnerable to false beliefs, as identity-aligning information may not be scrutinized sufficiently (Oyserman, 2019).

Another sin of memory that may reduce our tendency to scrutinize information is the sin of absentmindedness, where people make cognitive errors due to inattention to a target task (Schacter, 2022b). Schacter highlights that the sin of absentmindedness may be associated with our digital environment, where people are increasingly task-switching across devices and between apps (Schacter, 2022a, 2022b). While Schacter notes that the inattention associated with this sin is a concern for retaining information in educational contexts (Wammes et al., 2019), we suggest it is also a concern for the discernment between high- and low-quality information. Indeed, a lack of analytical, deliberative assessment of truth can contribute to false-belief formation (e.g., Bago et al., 2020). Media multitasking and ongoing absentmindedness may contribute to false-belief formation through at least two key routes: by increasing the chances that people’s attention is drawn away from considering the veracity of information, or by increasing the chances that people take a ‘resource-light’ route in their assessment, relying on intuition or gut feelings to infer veracity.

While intuitive impressions allow a rapid and less cognitively demanding assessment of truth, reliance on intuition may lead to false beliefs through the sin of misattribution—drawing on metacognitive cues that can arise out of prior exposure to infer truth. Schacter (2022b) outlines one well-documented example that is highly relevant for misinformation: People judge repeated claims as more likely to be true, a phenomenon called the illusory truth effect (ITE; Hasher et al., 1977). The ITE is thought to occur because people use feelings such as information familiarity, cognitive fluency, and coherence—all of which can emerge as the result of repetition—as evidence that a claim is valid (Unkelbach et al., 2019; Wang et al., 2016). While feelings of familiarity, cognitive fluency, and coherence can serve as rational cues to truth in some social contexts, these feelings can also arise through repetition driven by disinformation campaigns or algorithms and bots on online platforms, and can thus be a source of distortion (e.g., Unkelbach & Koch, 2019). The reliance on such repetition cues is insidious and, as Schacter alludes, persists even when people have more diagnostic information they could draw on to assess truth. Indeed, people are inclined to believe repeated claims relative to new claims even when they have general knowledge contradicting the claim (Fazio et al., 2015; Fazio, 2020b), and even when the claim is shared by an untrustworthy source or declared false by a highly accurate source (Henkel & Mattson, 2011; Unkelbach & Greifeneder, 2018). While Schacter’s analysis highlights the distorting effects of repetition, other factors entirely unrelated to truth can also produce feelings of ease of processing in the absence of repetition, and people routinely draw on these feelings to infer truth. For example, sources whose names are easy to pronounce are rated as more trustworthy and likable than those with difficult-to-pronounce names (Laham et al., 2012; Newman et al., 2014; Silva et al., 2017). Familiar faces are rated more favorably and their statements as more convincing (Brown et al., 2002; Weisbuch & Mackie, 2009). Speakers with clear audio are rated as more credible, reliable, and trustworthy (Bild et al., 2021; Newman & Schwarz, 2018). While people are sensitive to the experience of easy processing, they are less sensitive to the origin of the fluency (Jacoby et al., 1989; Schwarz, 2015). Thus, people often use fluency as an informative cue to truth even when the real source of fluent processing (e.g., pronunciation ease) is tangential and non-diagnostic.
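To make the fluency account concrete, consider a minimal toy simulation—our own illustrative sketch, not a model taken from the literature reviewed here, and with arbitrary assumed parameter values. If repetition adds even a small, constant fluency boost to otherwise noisy truth impressions, mean truth ratings for repeated claims will reliably exceed those for new claims:

```python
# Toy simulation of the illusory truth effect (illustrative only;
# the 0.4 "fluency boost" and all other parameters are arbitrary assumptions).
import random

random.seed(1)

def truth_rating(repeated: bool) -> float:
    """Simulated truth rating on a 1-6 scale: a noisy baseline impression
    plus a small fluency boost if the claim was encountered before."""
    base = random.gauss(3.5, 1.0)             # noisy baseline impression
    fluency_boost = 0.4 if repeated else 0.0  # assumed effect of prior exposure
    return min(6.0, max(1.0, base + fluency_boost))

new_claims = [truth_rating(repeated=False) for _ in range(10_000)]
old_claims = [truth_rating(repeated=True) for _ in range(10_000)]

print(f"Mean rating, new claims:      {sum(new_claims) / len(new_claims):.2f}")
print(f"Mean rating, repeated claims: {sum(old_claims) / len(old_claims):.2f}")
```

The point of the sketch is that the assumed boost operates regardless of a claim’s actual truth value—which is precisely why repetition-driven fluency is non-diagnostic.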

A lack of skepticism may further contribute to errors in assessments of veracity. Indeed, people default to assuming that incoming information is correct (Gilbert et al., 1993; Grice, 1975; Schwarz et al., 2007). However, this assumption of truth can be disrupted, which can benefit decision making in this domain. Explicit warnings about the presence of false information, or prompts to think about accuracy, can help people engage in more critical analysis, increase truth discernment, and reduce the impact of fluency on judgment (Bago et al., 2020; Brashier et al., 2020; Jalbert et al., 2020). Schacter (2022b) highlights how this might work in subtle ways—by simply presenting claims as questions rather than statements (see Calvillo & Harris, 2022). When people are engaged in more analytical processing or skeptical analysis, they are more likely to identify logical flaws in a claim and are less inclined to apply the default of believing incoming information (Lee et al., 2015; Mayo, 2019). While more analytical, deliberative processing can enhance our ability to discern facts from fiction, the online environment where people often seek information can present significant hurdles that make us vulnerable to false beliefs.

As Schacter (2022b) points out, one can encounter misinformation that has all the features we tend to associate with true information. Even if engaged in optimally skeptical, analytical processing, our internal fact-checks are only helpful to the degree that false information violates the criteria we apply. Deepfakes and fake news may make typical cues to truth less diagnostic, and thus have the potential to lull us into error (Murphy & Flynn, 2022; Nightingale & Farid, 2022). People can also miss or underutilize cues to falsity in digital environments that should suggest cause for hesitation (Dias et al., 2020; Nightingale et al., 2017). Moreover, the contemporary information environment facilitates repeated exposure to false ideas, which can not only lead to the abovementioned illusory truth effects but also give rise to misperceptions of social consensus—for example, minority opinions that are prominent on social media can lead people to assume that their own (majority) opinion is less common, a case of pluralistic ignorance (Lewandowsky et al., 2021; Shamir & Shamir, 1997; see also Weaver et al., 2007). Perhaps even more challenging for discernment is that content is customized for us online, exploiting detailed knowledge about individuals to create appealing messages via microtargeting (Acerbi, 2019; Kozyreva et al., 2020).

Tasked with evaluating a large volume of content in our day-to-day environments, people can acquire false beliefs for many of the reasons outlined above. Once a false belief is acquired, to what extent can it be revised? The sins of memory also bear directly on the extent to which corrections hold and belief revision occurs.

The Continued Influence of Misinformation and Implications for Corrections

Once people have formed false beliefs and consider misinformation to be true, revision becomes a challenging task. The sins of memory may help explain the underlying difficulty. One of the main issues with misinformation is that people often continue to rely on misinformation in their reasoning even after it has been corrected. This is known as the continued influence effect (Ecker, Lewandowsky, et al., 2022; Johnson & Seifert, 1994) and can be observed even when people understand, believe, and remember a correction. For example, the Australian government, segments of the media, and social-media groups repeatedly blamed arson for the catastrophic Australian “Black Summer” wildfires in 2019/2020; although these claims were subsequently debunked, some continued to falsely believe that arson played a significant causal role in those fires, contributing to increased polarization about climate change (Mocatta & Hawley, 2020; Weber et al., 2020).

An information-deficit view of communication would argue that the solution to the problem of a false belief is simple: provide a clear, coherent account of the truth, allowing people to update their beliefs. However, people do not always adjust their knowledge and beliefs even when a correction is compelling—at least not to the extent that the influence of the misinformation is fully mitigated. It seems tempting to attribute this phenomenon to the sin of bias. Indeed, people’s identity and worldview (e.g., their political attitudes) can shape their post-correction reliance on identity- or worldview-related misinformation (Hornsey & Fielding, 2017; Trevors, 2022; Trevors & Duffy, 2020). However, the existing evidence is not as strong as one may think. On the one hand, there have been claims that corrections of worldview-congruent misinformation can be ineffective (Ecker & Ang, 2019) or can even backfire (Nyhan & Reifler, 2010). Likewise, a recent meta-analysis suggested that corrections are more successful when the corrected misinformation is worldview-discordant (Walter & Tukachinsky, 2020). On the other hand, findings of ineffective and backfiring corrections have proven difficult to replicate (Ecker et al., 2021; Guess & Coppock, 2020; Wood & Porter, 2019), and accumulating evidence suggests that people by and large adjust their beliefs at least somewhat when confronted with counterevidence, even if the misinformation is worldview-congruent (Aird et al., 2018; Ecker, Sanderson, et al., 2022; Nyhan et al., 2020; Swire, Berinsky, et al., 2017; Weeks, 2015).

Moreover, continued influence also occurs with worldview-neutral information. As such, it has been linked to memory lapses, namely failures to retrieve corrective information (i.e., selective retrieval of misinformation) or failures to integrate corrective information into one’s mental model or causal account of a situation (Ecker, Lewandowsky, et al., 2022). From this perspective, continued influence is most immediately related to the sin of persistence. Although Schacter (2022b) conceptualizes persistence as “unwanted and emotionally arousing intrusive memories, typically resulting from traumatic events,” we suggest broadening this definition such that false but non-disturbing information and non-intrusive recall are included. As a “treatment” for persistence, Schacter suggests reconsolidation. This refers to a theory that retrieval of a consolidated memory can transfer the memory representation into a labile state that allows it to be updated (Lee et al., 2017). Although Schacter applies this to intrusive memories, “labilization” may also help overcome continued influence. Prior research has found that repeating misinformation directly prior to or within a correction can render the correction more effective (Ecker et al., 2017; Kowalski & Taylor, 2017; Wahlheim et al., 2020). However, it should be noted that this finding does not require the assumption of reconsolidation (see also Allanson & Ecker, 2017; Howe et al., 2020). For example, an alternative model suggests that co-activation of a misconception and related corrective information facilitates conflict detection and integration of the corrective information into the relevant mental model, thus boosting knowledge revision (Kendeou & O’Brien, 2014).

Even if people update their beliefs successfully after misinformation has been corrected, this change is rarely sustained over time (Carey et al., 2022). In other words, belief frequently returns towards pre-correction levels (Kowalski & Taylor, 2017; Paynter et al., 2019; Swire, Ecker, et al., 2017); this is called belief regression. Swire-Thompson et al. (2022) found that memory for the correction explained 66% of the variance in belief regression after correcting for measurement reliability. This can be conceptualized in terms of two distinct memory sins: The sin of transience would result in people forgetting that the misinformation has been corrected, leaving them unsure of whether the claim is true or false. By contrast, the sin of misattribution would be involved when people misremember misinformation as having been presented as true. Swire-Thompson et al. (2022) found evidence for both transience and misattribution occurring within the context of belief regression. The proportion of people who believed in the misinformation but were unsure whether it had been presented as true or false (committing the sin of transience) increased from 0.6% immediately after corrections were presented to 5.6% one month later. The proportion of people who believed in the misinformation and thought it had been presented as true (committing the sin of misattribution) increased from 4.1% to 15.6%.
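For readers unfamiliar with reliability adjustment, the standard Spearman correction for attenuation illustrates the kind of computation involved; we present the generic formula only as an illustration, as the exact procedure used by Swire-Thompson et al. (2022) may differ:

$$r_{\text{corrected}} = \frac{r_{xy}}{\sqrt{r_{xx}\, r_{yy}}},$$

where $r_{xy}$ is the observed correlation between memory for the correction and belief regression, and $r_{xx}$ and $r_{yy}$ are the reliabilities of the two measures. On this reading, 66% of variance explained corresponds to a disattenuated correlation of roughly $\sqrt{.66} \approx .81$.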

Again, the role of the sin of bias seems limited in belief regression, with beliefs regressing at a similar rate whether a correction is pro- or counter-attitudinal. Swire, Berinsky, et al. (2017) found that Democrats, Republican supporters of Donald Trump, and Republican non-supporters of Trump all forgot at a similar rate that corrected Trump misinformation was false. In other words, there was no evidence of motivated forgetting, whereby a person’s worldview would influence the rate at which they re-endorse a corrected claim. However, this finding needs to be replicated and extended.

It is tempting to suggest that continued belief in or reliance on corrected misinformation is an irrational expression of a flawed cognitive system. However, in line with Schacter’s (2022b) perspective, one can also entertain a functional account of continued influence. Viewed through a broader lens, effective belief updating is a cognitive challenge arising in part from a core conundrum of memory—namely, balancing the need for stable representations and the need to allow for some representational flexibility to cope with an ever-changing world (Ecker et al., 2014). In the context of misinformation, maintaining a stable representation of corrected misinformation can be functional to the extent that the claim may end up being true after all. Evidence is seldom absolute, and source credibility can rarely be assessed with full confidence; in some situations it can be entirely rational and adaptive to not fully suppress misinformation belief after a correction (Connor Desai et al., 2020). It can also be useful to retain a mental model of an event or causal relation that is known to be false, for example to aid with counterarguing false claims in a debate.
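The rationality argument can be made concrete with a toy Bayesian sketch—our own illustration, not the specific model of Connor Desai et al. (2020). Treating a correction as evidence $E$ bearing on a misinformation claim $M$, Bayes’ rule gives

$$P(M \mid E) = \frac{P(E \mid M)\,P(M)}{P(E \mid M)\,P(M) + P(E \mid \neg M)\,P(\neg M)}.$$

If the correcting source is imperfectly credible—suppose, with hypothetical numbers, that it would issue the correction with probability .2 even if $M$ were true, and with probability .9 if $M$ were false—then a prior belief of $P(M) = .8$ yields $P(M \mid E) = (.2 \times .8)/(.2 \times .8 + .9 \times .2) \approx .47$: substantially reduced, but rationally far from zero.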

Future Research Directions

In considering the formation of false beliefs, the saturated information environment poses a challenge for the allocation of attentional resources and analytical assessment of truth. But technology can be leveraged—at least online—to alleviate some of the sins of memory that make us vulnerable to false-belief formation (Kozyreva et al., 2020; Lewandowsky et al., 2017). For example, online prompts to consider truth or accuracy may work to reduce absentmindedness and increase analytical assessment (Fazio, 2020a; Pennycook et al., 2021). While promising, more thorough investigation is warranted to better understand how long such prompts remain effective and for which users. Further, emerging research on intellectual humility—one’s willingness to engage in self-reflection and be open to considering disconfirming evidence—may improve our understanding of the sin of bias and may lead to innovations that encourage less identity-oriented assessment of truth (Bowes et al., 2020). Another emerging direction is that digital-literacy and media-literacy education, as well as pre-emptive inoculation techniques specifically designed to boost resilience against misinformation, may equip people with a toolset for more skeptical analysis of content (Cook et al., 2022; Guess et al., 2020; Lewandowsky & van der Linden, 2021; see Modirrousta-Galian et al., 2022).

Considering the continued influence effect through the lens of the sins of memory may also foster new lines of research. For instance, Schacter (2022b) suggests that persistence may be more prevalent in people with lower levels of cognitive control. This meshes well with research suggesting that people’s susceptibility to continued influence relates to their working memory function (Brydges et al., 2018). However, more recent research suggests that people’s episodic memory may be even more important when it comes to continued influence (Sanderson et al., 2021), which highlights the need for future research to further investigate what memory does and does not explain with regard to the continued influence effect. Another important factor in the continued influence effect is whether a person believes the correction to be valid: O’Rear and Radvansky (2020) found that a large proportion of participants who remembered the correction did not think it was accurate or genuine, and consequently relied on the misinformation in their reasoning at a similar rate as individuals who never received a correction at all.

Conclusion

There is a clear association between Schacter’s (2022b) sins of memory and (1) the formation of false beliefs and (2) the continued influence effect after misinformation is corrected. This highlights the importance of considering memory in future research, in particular in the development of interventions and the measurement of their efficacy over time. For instance, interventions that address misinformation may benefit from making memory sins a key focus—reducing bias, absentmindedness, and possible misattributions. One avenue that has been explored to counteract these sins in the continued-influence domain is correction repetition and ensuring that a correction can be processed with undivided attention (Ecker et al., 2011; Sanderson et al., 2022). Future research should explore additional avenues, such as making corrections more distinctive (e.g., via mental imagery). Beyond corrections, interventions will need to be memorable to maximize their long-term efficacy (such as inoculation; Maertens et al., 2021; see also Schwarz et al., 2016). In sum, memory is fundamental to belief formation and the correction of misinformation, and the conceptualization of Schacter’s (2022b) sins can be a helpful paradigm for misinformation researchers going forward.

Acknowledgments

UKHE is supported by Australian Research Council grant FT190100708, and BST is supported by a National Institutes of Health Pathway to Independence Award (1K99CA248720-01A). We thank Krissy Kilgallen for research support.

References

1. Acerbi A. (2019). Cognitive attraction and online misinformation. Palgrave Communications, 5(1), Article 1. 10.1057/s41599-019-0224-y
2. Aird MJ, Ecker UKH, Swire B, Berinsky AJ, & Lewandowsky S. (2018). Does truth matter to voters? The effects of correcting political misinformation in an Australian sample. Royal Society Open Science, 5(12), 180593. 10.1098/rsos.180593
3. Allanson F, & Ecker UKH (2017). No evidence for a role of reconsolidation in updating of paired associates. Journal of Cognitive Psychology, 29(8), 912–919. 10.1080/20445911.2017.1360307
4. Bago B, Rand DG, & Pennycook G. (2020). Fake news, fast and slow: Deliberation reduces belief in false (but not true) news headlines. Journal of Experimental Psychology: General, 149, 1608–1613. 10.1037/xge0000729
5. Bild E, Redman A, Newman EJ, Muir BR, Tait D, & Schwarz N. (2021). Sound and credibility in the virtual court: Low audio quality leads to less favorable evaluations of witnesses and lower weighting of evidence. Law and Human Behavior, 45, 481–495. 10.1037/lhb0000466
6. Bowes SM, Blanchard MC, Costello TH, Abramowitz AI, & Lilienfeld SO (2020). Intellectual humility and between-party animus: Implications for affective polarization in two community samples. Journal of Research in Personality, 88, 103992. 10.1016/j.jrp.2020.103992
7. Brashier NM, Eliseev ED, & Marsh EJ (2020). An initial accuracy focus prevents illusory truth. Cognition, 194, 104054. 10.1016/j.cognition.2019.104054
8. Brashier NM, & Marsh EJ (2020). Judging truth. Annual Review of Psychology, 71(1). 10.1146/annurev-psych-010419-050807
9. Briñol P, & Petty RE (2009). Source factors in persuasion: A self-validation approach. European Review of Social Psychology, 20(1), 49–96. 10.1080/10463280802643640
10. Brown AS, Brown LA, & Zoccoli SL (2002). Repetition-based credibility enhancement of unfamiliar faces. The American Journal of Psychology, 115(2), 199–209.
11. Brydges CR, Gignac GE, & Ecker UKH (2018). Working memory capacity, short-term memory capacity, and the continued influence effect: A latent-variable analysis. Intelligence, 69, 117–122. 10.1016/j.intell.2018.03.009
12. Calvillo DP, & Harris JD (2022). Exposure to headlines as questions reduces illusory truth for subsequent headlines. Journal of Applied Research in Memory and Cognition. 10.1037/mac0000056
13. Carey JM, Guess AM, Loewen PJ, Merkley E, Nyhan B, Phillips JB, & Reifler J. (2022). The ephemeral effects of fact-checks on COVID-19 misperceptions in the United States, Great Britain and Canada. Nature Human Behaviour, 6(2), 236–243. 10.1038/s41562-021-01278-3
14. Connor Desai SA, Pilditch TD, & Madsen JK (2020). The rational continued influence of misinformation. Cognition, 205, 104453. 10.1016/j.cognition.2020.104453
15. Cook J. (2019). Understanding and countering misinformation about climate change. In Handbook of research on deception, fake news, and misinformation online (pp. 281–306). Information Science Reference/IGI Global. 10.4018/978-1-5225-8535-0.ch016
16. Cook J, Ecker UKH, Trecek-King M, Schade G, Jeffers-Tracy K, Fessmann J, Kim SC, Kinkead D, Orr M, Vraga E, Roberts K, & McDowell J. (2022). The cranky uncle game—Combining humor and gamification to build student resilience against climate misinformation. Environmental Education Research, 1–17. 10.1080/13504622.2022.2085671
17. Dias N, Pennycook G, & Rand DG (2020). Emphasizing publishers does not effectively reduce susceptibility to misinformation on social media. Harvard Kennedy School Misinformation Review, 1(1). 10.37016/mr-2020-001
18. Ecker UKH, & Ang LC (2019). Political attitudes and the processing of misinformation corrections. Political Psychology, 40(2), 241–260. 10.1111/pops.12494
19. Ecker UKH, & Antonio LM (2021). Can you believe it? An investigation into the impact of retraction source credibility on the continued influence effect. Memory & Cognition, 49(4), 631–644. 10.3758/s13421-020-01129-y
20. Ecker UKH, Hogan JL, & Lewandowsky S. (2017). Reminders and repetition of misinformation: Helping or hindering its retraction? Journal of Applied Research in Memory and Cognition, 6, 185–192. 10.1016/j.jarmac.2017.01.014
21. Ecker UKH, Lewandowsky S, Cook J, Schmid P, Fazio LK, Brashier N, Kendeou P, Vraga EK, & Amazeen MA (2022). The psychological drivers of misinformation belief and its resistance to correction. Nature Reviews Psychology, 1(1), Article 1. 10.1038/s44159-021-00006-y
22. Ecker UKH, Lewandowsky S, Swire B, & Chang D. (2011). Correcting false information in memory: Manipulating the strength of misinformation encoding and its retraction. Psychonomic Bulletin & Review, 18(3), 570–578. 10.3758/s13423-011-0065-1
23. Ecker UKH, Sanderson JA, McIlhiney P, Rowsell JJ, Quekett HL, Brown GD, & Lewandowsky S. (2022). Combining refutations and social norms increases belief change. Quarterly Journal of Experimental Psychology. Advance online publication. 10.1177/17470218221111750
24. Ecker UKH, Swire B, & Lewandowsky S. (2014). Correcting misinformation—A challenge for education and cognitive science. In Rapp DN & Braasch JLG (Eds.), Processing inaccurate information: Theoretical and applied perspectives from cognitive science and the educational sciences (pp. 13–37). The MIT Press. 10.7551/mitpress/9737.001.0001
25. Ecker UKH, Sze BKN, & Andreotta M. (2021). Corrections of political misinformation: No evidence for an effect of partisan worldview in a US convenience sample. Philosophical Transactions of the Royal Society B: Biological Sciences, 376(1822), 20200145. 10.1098/rstb.2020.0145
26. Fazio LK (2020a). Pausing to consider why a headline is true or false can help reduce the sharing of false news. Harvard Kennedy School Misinformation Review, 1(2). 10.37016/mr-2020-009
27. Fazio LK (2020b). Repetition increases perceived truth even for known falsehoods. Collabra: Psychology, 6(1), 38. 10.1525/collabra.347
28. Fazio LK, Brashier NM, Payne BK, & Marsh EJ (2015). Knowledge does not protect against illusory truth. Journal of Experimental Psychology: General, 144, 993–1002. 10.1037/xge0000098
29. Festinger L. (1954). Motivations leading to social behavior. In Jones MR (Ed.), Nebraska symposium on motivation, 1954 (pp. 191–219). University of Nebraska Press.
30. Frenda SJ, Knowles ED, Saletan W, & Loftus EF (2013). False memories of fabricated political events. Journal of Experimental Social Psychology, 49(2), 280–286. 10.1016/j.jesp.2012.10.013
31. Gilbert DT, Tafarodi RW, & Malone PS (1993). You can’t not believe everything you read. Journal of Personality and Social Psychology, 65(2), 221–233. 10.1037/0022-3514.65.2.221
32. Grice HP (1975). Logic and conversation. In Cole P. & Morgan JL (Eds.), Syntax and semantics, Vol. 3: Speech acts (pp. 41–58). Academic Press.
33. Guess A, & Coppock A. (2020). Does counter-attitudinal information cause backlash? Results from three large survey experiments. British Journal of Political Science, 50(4), 1497–1515. 10.1017/S0007123418000327
34. Guess AM, Lerner M, Lyons B, Montgomery JM, Nyhan B, Reifler J, & Sircar N. (2020). A digital media literacy intervention increases discernment between mainstream and false news in the United States and India. Proceedings of the National Academy of Sciences, 117(27), 15536–15545. 10.1073/pnas.1920498117
35. Henkel LA, & Mattson ME (2011). Reading is believing: The truth effect and source credibility. Consciousness and Cognition, 20(4), 1705–1721. 10.1016/j.concog.2011.08.018
36. Hornsey MJ, & Fielding KS (2017). Attitude roots and Jiu Jitsu persuasion: Understanding and overcoming the motivated rejection of science. American Psychologist, 72(5), 459–473. 10.1037/a0040437
37. Howe ML, Akhtar S, Bland CE, & Hellenthal MV (2020). Reconsolidation or interference? Aging effects and the reactivation of novel and familiar episodic memories. Memory, 28(7), 839–849. 10.1080/09658211.2019.1705489
38. Jacoby LL, Kelley CM, & Dywan J. (1989). Memory attributions. In Roediger HL III & Craik FIM (Eds.), Varieties of memory and consciousness: Essays in honour of Endel Tulving (pp. 391–422). Lawrence Erlbaum Associates.
39. Jalbert M, Schwarz N, & Newman E. (2020). Only half of what I’ll tell you is true: Expecting to encounter falsehoods reduces illusory truth. Journal of Applied Research in Memory and Cognition, 9, 602–613. 10.1016/j.jarmac.2020.08.010
40. Jiang Y, Newman EJ, & Schwarz N. (2021, May 26–27). The role of prior beliefs and attitudes in the illusory truth effect for climate science claims [Poster session]. APS 2021 Virtual Convention.
41. Johnson HM, & Seifert CM (1994). Sources of the continued influence effect: When misinformation in memory affects later inferences. Journal of Experimental Psychology: Learning, Memory, and Cognition, 20, 1420–1436.
42. Kendeou P, & O’Brien EJ (2014). The Knowledge Revision Components (KReC) framework: Processes and mechanisms. In Rapp DN & Braasch JLG (Eds.), Processing inaccurate information: Theoretical and applied perspectives from cognitive science and the educational sciences (pp. 353–377). The MIT Press.
43. Kerr JR, & van der Linden S. (2022). Communicating expert consensus increases personal support for COVID-19 mitigation policies. Journal of Applied Social Psychology, 52(1), 15–29. 10.1111/jasp.12827
44. Kowalski P, & Taylor AK (2017). Reducing students’ misconceptions with refutational teaching: For long-term retention, comprehension matters. Scholarship of Teaching and Learning in Psychology, 3, 90–100. 10.1037/stl0000082
45. Kozyreva A, Lewandowsky S, & Hertwig R. (2020). Citizens versus the internet: Confronting digital challenges with cognitive tools. Psychological Science in the Public Interest, 21(3), 103–156. 10.1177/1529100620946707
46. Laham SM, Koval P, & Alter AL (2012). The name-pronunciation effect: Why people like Mr. Smith more than Mr. Colquhoun. Journal of Experimental Social Psychology, 48(3), 752–756. 10.1016/j.jesp.2011.12.002
47. Lee DS, Kim E, & Schwarz N. (2015). Something smells fishy: Olfactory suspicion cues improve performance on the Moses illusion and Wason rule discovery task. Journal of Experimental Social Psychology, 59, 47–50. 10.1016/j.jesp.2015.03.006
48. Lee J, Nader K, & Schiller D. (2017). An update on memory reconsolidation updating. Trends in Cognitive Sciences, 21(7), 531–545. 10.1016/j.tics.2017.04.006
49. Lewandowsky S, Ecker UKH, & Cook J. (2017). Beyond misinformation: Understanding and coping with the “post-truth” era. Journal of Applied Research in Memory and Cognition, 6(4), 353–369. 10.1016/j.jarmac.2017.07.008
50. Lewandowsky S, Facer K, & Ecker UKH (2021). Losses, hopes, and expectations for sustainable futures after COVID. Humanities and Social Sciences Communications, 8(1), Article 1. 10.1057/s41599-021-00961-0
51. Lewandowsky S, & van der Linden S. (2021). Countering misinformation and fake news through inoculation and prebunking. European Review of Social Psychology, 32(2), 348–384. 10.1080/10463283.2021.1876983
52. Loomba S, de Figueiredo A, Piatek SJ, de Graaf K, & Larson HJ (2021). Measuring the impact of COVID-19 vaccine misinformation on vaccination intent in the UK and USA. Nature Human Behaviour, 5(3), Article 3. 10.1038/s41562-021-01056-1
53. Mackie DM, Worth LT, & Asuncion AG (1990). Processing of persuasive in-group messages. Journal of Personality and Social Psychology, 58(5), 812–822. 10.1037/0022-3514.58.5.812
54. Maertens R, Roozenbeek J, Basol M, & van der Linden S. (2021). Long-term effectiveness of inoculation against misinformation: Three longitudinal experiments. Journal of Experimental Psychology: Applied, 27, 1–16. 10.1037/xap0000315
55. Mayo R. (2019). Knowledge and distrust may go a long way in the battle with disinformation: Mental processes of spontaneous disbelief. Current Directions in Psychological Science, 28, 409–414. 10.1177/0963721419847998
56. Mocatta G, & Hawley E. (2020). Uncovering a climate catastrophe? Media coverage of Australia’s Black Summer bushfires and the revelatory extent of the climate blame frame. M/C Journal, 23(4), Article 4. 10.5204/mcj.1666
57. Modirrousta-Galian A, Higham PA, & Seabrooke T. (2022, July 26). Effects of inductive learning and gamification on news veracity discernment. 10.31234/osf.io/4wfds
58. Murphy G, & Flynn E. (2022). Deepfake false memories. Memory, 30(4), 480–492. 10.1080/09658211.2021.1919715
59. Murphy G, Loftus EF, Grady RH, Levine LJ, & Greene CM (2019). False memories for fake news during Ireland’s abortion referendum. Psychological Science, 30(10), 1449–1459. 10.1177/0956797619864887
60. Nadarevic L, Reber R, Helmecke AJ, & Köse D. (2020). Perceived truth of statements and simulated social media postings: An experimental investigation of source credibility, repeated exposure, and presentation format. Cognitive Research: Principles and Implications, 5(1), 56. 10.1186/s41235-020-00251-4
61. Newman EJ, Sanson M, Miller EK, Quigley-McBride A, Foster JL, Bernstein DM, & Garry M. (2014). People with easier to pronounce names promote truthiness of claims. PLOS ONE, 9(2), e88671. 10.1371/journal.pone.0088671
62. Newman EJ, & Schwarz N. (2018). Good sound, good research: How audio quality influences perceptions of the research and researcher. Science Communication, 40(2), 246–257. 10.1177/1075547018759345
63. Nightingale SJ, & Farid H. (2022). AI-synthesized faces are indistinguishable from real faces and more trustworthy. Proceedings of the National Academy of Sciences, 119(8), e2120481119.
64. Nightingale SJ, Wade KA, & Watson DG (2017). Can people identify original and manipulated photos of real-world scenes? Cognitive Research: Principles and Implications, 2(1), 30. 10.1186/s41235-017-0067-2
65. Nisbet EC, Mortenson C, & Li Q. (2021). The presumed influence of election misinformation on others reduces our own satisfaction with democracy. Harvard Kennedy School Misinformation Review. 10.37016/mr-2020-59
66. Nyhan B, Porter E, Reifler J, & Wood TJ (2020). Taking fact-checks literally but not seriously? The effects of journalistic fact-checking on factual beliefs and candidate favorability. Political Behavior, 42(3), 939–960. 10.1007/s11109-019-09528-x
67. Nyhan B, & Reifler J. (2010). When corrections fail: The persistence of political misperceptions. Political Behavior, 32(2), 303–330. 10.1007/s11109-010-9112-2
68. O’Rear AE, & Radvansky GA (2020). Failure to accept retractions: A contribution to the continued influence effect. Memory & Cognition, 48(1), 127–144. 10.3758/s13421-019-00967-9
69. Oyserman D. (2019). Cultural fluency, mindlessness, and gullibility. In Forgas JP & Baumeister R. (Eds.), The social psychology of gullibility (pp. 255–278). Routledge. 10.4324/9780429203787-14
70. Oyserman D, & Dawson A. (2020). Your fake news, our facts: Identity-based motivation shapes what we believe, share, and accept. In Greifeneder R, Jaffé M, Newman EJ, & Schwarz N. (Eds.), The psychology of fake news (1st ed., pp. 173–195). Routledge. 10.4324/9780429295379-13
71. Paynter J, Luskin-Saxby S, Keen D, Fordyce K, Frost G, Imms C, Miller S, Trembath D, Tucker M, & Ecker U. (2019). Evaluation of a template for countering misinformation—Real-world autism treatment myth debunking. PLOS ONE, 14(1), e0210746. 10.1371/journal.pone.0210746
72. Pennington N, & Hastie R. (1992). Explaining the evidence: Tests of the story model for juror decision making. Journal of Personality and Social Psychology, 62(2), 189.
73. Pennycook G, Epstein Z, Mosleh M, Arechar AA, Eckles D, & Rand DG (2021). Shifting attention to accuracy can reduce misinformation online. Nature, 592(7855), 590–595.
74. Pennycook G, & Rand DG (2021a). Examining false beliefs about voter fraud in the wake of the 2020 Presidential Election. Harvard Kennedy School Misinformation Review. 10.37016/mr-2020-51
75. Pennycook G, & Rand DG (2021b). The psychology of fake news. Trends in Cognitive Sciences, 25(5), 388–402. 10.1016/j.tics.2021.02.007
76. Sanderson JA, Bowden V, Swire-Thompson B, Lewandowsky S, & Ecker UKH (2022). Listening to misinformation while driving: Cognitive load and the effectiveness of (repeated) corrections. Journal of Applied Research in Memory and Cognition. 10.1037/mac0000057
77. Sanderson JA, & Ecker UKH (2020). The challenge of misinformation and ways to reduce its impact. In Van Meter P, List A, Lombardi D, & Kendeou P. (Eds.), Handbook of learning from multiple representations and perspectives. Routledge. https://www.taylorfrancis.com/chapters/edit/10.4324/9780429443961-30/challenge-misinformation-ways-reduce-impact-jasmyne-sanderson-ullrich-ecker
78. Sanderson JA, Gignac GE, & Ecker UKH (2021). Working memory capacity, removal efficiency and event specific memory as predictors of misinformation reliance. Journal of Cognitive Psychology, 33(5), 518–532. 10.1080/20445911.2021.1931243
79. Schacter DL (1999). The seven sins of memory: Insights from psychology and cognitive neuroscience. American Psychologist, 54(3), 182. 10.1037/0003-066X.54.3.182
80. Schacter DL (2022a). The seven sins of memory: An update. Memory, 30(1), 37–42. 10.1080/09658211.2021.1873391
81. Schacter DL (2022b). Memory sins in applied settings: What kind of progress? Journal of Applied Research in Memory and Cognition.
82. Schwarz N. (1994). Judgment in a social context: Biases, shortcomings, and the logic of conversation. Advances in Experimental Social Psychology, 26, 123–162.
83. Schwarz N. (2015). Metacognition. In Mikulincer M, Shaver PR, Borgida E, & Bargh JA (Eds.), APA handbook of personality and social psychology, Volume 1: Attitudes and social cognition (pp. 203–229). American Psychological Association. 10.1037/14341-006
84. Schwarz N, Newman E, & Leach W. (2016). Making the truth stick & the myths fade: Lessons from cognitive psychology. Behavioral Science & Policy, 2(1), 85–95. 10.1353/bsp.2016.0009
85. Shamir J, & Shamir M. (1997). Pluralistic ignorance across issues and over time: Information cues and biases. The Public Opinion Quarterly, 61(2), 227–260.
86. Silva RR, Chrobot N, Newman E, Schwarz N, & Topolinski S. (2017). Make it short and easy: Username complexity determines trustworthiness above and beyond objective reputation. Frontiers in Psychology, 8. 10.3389/fpsyg.2017.02200
87. Simon D. (2004). A third view of the black box: Cognitive coherence in legal decision making. University of Chicago Law Review, 71(2), 511–586.
88. Swire B, Berinsky AJ, Lewandowsky S, & Ecker UKH (2017). Processing political misinformation: Comprehending the Trump phenomenon. Royal Society Open Science, 4, 160802. 10.1098/rsos.160802
89. Swire B, Ecker UKH, & Lewandowsky S. (2017). The role of familiarity in correcting inaccurate information. Journal of Experimental Psychology: Learning, Memory, and Cognition, 43, 1948–1961. 10.1037/xlm0000422
90. Swire-Thompson B, Dobbs M, Thomas A, & DeGutis J. (2022). Memory failure predicts belief regression after the correction of misinformation. Cognition, 230, 105276. 10.1016/j.cognition.2022.105276
91. Swire-Thompson B, & Lazer D. (2022). Reducing health misinformation in science: A call to arms. The ANNALS of the American Academy of Political and Social Science, 700(1), 124–135. 10.1177/00027162221087686
92. Trevors GJ (2022). The roles of identity conflict, emotion, and threat in learning from refutation texts on vaccination and immigration. Discourse Processes, 59(1–2), 36–51. 10.1080/0163853X.2021.1917950
93. Trevors GJ, & Duffy MC (2020). Correcting COVID-19 misconceptions requires caution. Educational Researcher, 49(7), 538–542. 10.3102/0013189X20953825
94. Unkelbach C. (2007). Reversing the truth effect: Learning the interpretation of processing fluency in judgments of truth. Journal of Experimental Psychology: Learning, Memory, and Cognition, 33(1), 219–230. 10.1037/0278-7393.33.1.219
95. Unkelbach C, & Greifeneder R. (2018). Experiential fluency and declarative advice jointly inform judgments of truth. Journal of Experimental Social Psychology, 79, 78–86. 10.1016/j.jesp.2018.06.010
96. Unkelbach C, & Koch A. (2019). Gullible but functional? Information repetition and the formation of beliefs. In Forgas JP & Baumeister R. (Eds.), The social psychology of gullibility (1st ed., pp. 42–60). Routledge. 10.4324/9780429203787-3
97. Unkelbach C, Koch A, Silva RR, & Garcia-Marques T. (2019). Truth by repetition: Explanations and implications. Current Directions in Psychological Science, 28(3), 247–253. 10.1177/0963721419827854
98. Unkelbach C, & Rom SC (2017). A referential theory of the repetition-induced truth effect. Cognition, 160, 110–126. 10.1016/j.cognition.2016.12.016
99. van der Linden SL, Leiserowitz AA, Feinberg GD, & Maibach EW (2015). The scientific consensus on climate change as a gateway belief: Experimental evidence. PLOS ONE, 10(2), e0118489. 10.1371/journal.pone.0118489
100. Wahlheim CN, Alexander TR, & Peske CD (2020). Reminders of everyday misinformation statements can enhance memory for and beliefs in corrections of those statements in the short term. Psychological Science, 31(10), 1325–1339. 10.1177/0956797620952797
101. Walter N, & Tukachinsky R. (2020). A meta-analytic examination of the continued influence of misinformation in the face of correction: How powerful is it, why does it happen, and how to stop it? Communication Research, 47(2), 155–177. 10.1177/0093650219854600
102. Wammes JD, Ralph BCW, Mills C, Bosch N, Duncan TL, & Smilek D. (2019). Disengagement during lectures: Media multitasking and mind wandering in university classrooms. Computers and Education, 132, 76–89. 10.1016/j.compedu.2018.12.007
103. Wang C, Platow M, & Newman E. (2022). There is an “I” in truth: How salient identities shape dynamic perceptions of truth. European Journal of Social Psychology. 10.1002/ejsp.2909
104. Wang WC, Brashier NM, Wing EA, Marsh EJ, & Cabeza R. (2016). On known unknowns: Fluency and the neural mechanisms of illusory truth. Journal of Cognitive Neuroscience, 28(5), 739–746. 10.1162/jocn_a_00923
105. Weaver K, Garcia SM, Schwarz N, & Miller DT (2007). Inferring the popularity of an opinion from its familiarity: A repetitive voice can sound like a chorus. Journal of Personality and Social Psychology, 92(5), 821–833. 10.1037/0022-3514.92.5.821
106. Weber D, Nasim M, Falzon L, & Mitchell L. (2020). #ArsonEmergency and Australia’s “Black Summer”: Polarisation and misinformation on social media. In van Duijn M, Preuss M, Spaiser V, Takes F, & Verberne S. (Eds.), Disinformation in open online media (pp. 159–173). Springer International Publishing. 10.1007/978-3-030-61841-4_11
107. Weeks BE (2015). Emotions, partisanship, and misperceptions: How anger and anxiety moderate the effect of partisan bias on susceptibility to political misinformation. Journal of Communication, 65(4), 699–719.
108. Weisbuch M, & Mackie D. (2009). False fame, perceptual clarity, or persuasion? Flexible fluency attribution in spokesperson familiarity effects. Journal of Consumer Psychology, 19, 62–72. 10.1016/j.jcps.2008.12.009
109. Winkielman P, Huber DE, Kavanagh L, & Schwarz N. (2012). Fluency of consistency: When thoughts fit nicely and flow smoothly. In Gawronski B. & Strack F. (Eds.), Cognitive consistency: A fundamental principle in social cognition (pp. 89–111).
110. Wood T, & Porter E. (2019). The elusive backfire effect: Mass attitudes’ steadfast factual adherence. Political Behavior, 41(1), 135–163. 10.1007/s11109-018-9443-y
