Science Advances. 2023 Mar 3;9(9):eadg8333. doi: 10.1126/sciadv.adg8333

Social media: Why sharing interferes with telling true from false

Valerie F Reyna 1,*
PMCID: PMC9984168  PMID: 36867696

Abstract

Sharing on social media decreases true-false discrimination, but focusing on accuracy helps people recognize what they already know. Process-oriented research offers hope in combating misinformation.


From the invention of the printing press to direct-to-consumer drug advertising, the democratization of information is a long-term trend (1). Today’s technology allows the public to access an abundance of information, and the onus is increasingly on them to use it to make decisions, as in patient-centered medical decisions (2). Social media has amplified this trend, promising to give everyone a voice and a vote (or a “like”), but that very democratic element has removed the gatekeeping of traditional media and peer review (3). As a result, misinformation on social media is now a major problem, although people might disagree about which information is amiss.

Rather than face a dismal dilemma between widespread misinformation and widespread censorship, Epstein et al.’s findings (4), reported in this issue of Science Advances, open the door to ways of helping individuals discern truth from falsehood so that they can benefit from the abundance of information without falling prey to myths and misrepresentations. Many practicalities about how to achieve these goals need to be worked out, but this is always true in the initial stages of the scientific study of a topic, as is currently the case for social media. Speculation about social media’s effects on the human psyche is rampant, but that is no substitute for rigorous research that tests underlying mechanisms, which can serve as a foundation for practical remedies.

Specifically, Epstein et al. show that merely adding the task of considering whether to share social media content decreases the discrimination between true and false information, relative to judging accuracy by itself (Fig. 1). (Note that the underlying ability to discriminate is not affected by this manipulation because that ability manifests itself when the extra task of sharing is removed.) In other words, judging whether to share information followed by judging its accuracy produces less discrimination between true and false information than judging accuracy alone. (Effects were not as strong when accuracy judgments preceded sharing judgments.) This truth-degrading effect applied to COVID-19 as well as other politicized or political information, and it occurred across the partisan divide (though the size of effects varied).
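The discrimination measure at the heart of this design can be made concrete with a small sketch. The code below is a hypothetical illustration, not Epstein et al.’s analysis pipeline: it computes truth discernment as the mean perceived-accuracy rating of true items minus the mean rating of false items (all ratings and item labels here are invented), so higher values indicate better discrimination. The compressed gap in the second condition mimics the pattern described above.

```python
# Hypothetical illustration of a truth-discernment score:
# mean accuracy rating for true items minus mean for false items.
# All ratings and item labels below are invented for the example.

def discernment(ratings, is_true):
    """Difference in mean perceived accuracy: true items minus false items."""
    true_scores = [r for r, t in zip(ratings, is_true) if t]
    false_scores = [r for r, t in zip(ratings, is_true) if not t]
    return (sum(true_scores) / len(true_scores)
            - sum(false_scores) / len(false_scores))

# Accuracy-only condition: ratings track the truth fairly well.
accuracy_only = discernment([0.9, 0.8, 0.2, 0.3], [True, True, False, False])

# Sharing-then-accuracy condition: ratings of true items drop and
# ratings of false items rise, compressing the true-false gap.
after_sharing = discernment([0.7, 0.6, 0.4, 0.5], [True, True, False, False])

print(round(accuracy_only, 2))  # 0.6
print(round(after_sharing, 2))  # 0.2
```

The single number makes the key contrast easy to state: the same underlying items yield a smaller true-false gap once sharing enters the task.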

As Epstein et al. argue, the truth-degrading effect of adding sharing decisions to accuracy decisions is troubling because sharing is an inherent part of the social media experience. Specifically, adding the task of deciding whether to share makes accuracy ratings of true information go down and accuracy ratings of false information go up (when people decided about both sharing and accuracy), essentially throwing sand in the gears of the information-processing engine. Also troubling, prior research demonstrated that when the only task is deciding whether to share information, discrimination between true and false is likewise degraded relative to judging accuracy alone (5).

However, these gloomy results have a silver lining: They show that the “truth” is, to some degree, within people because they distinguish true from false better for the same information when asked to only judge its accuracy. The ability to discriminate is not completely lacking within people but, rather, it is interfered with by having the goal of sharing. This result joins prior findings on “accuracy prompts” in underscoring that improving truth discernment does not always require convincing people about what is true or false but instead can involve sparking their own ability to discern truth. Although political and partisan differences might seem insurmountable, these findings suggest that there is hope. Inserting some attention to accuracy at some point in the process of engaging with social media could make a difference, and like other social media effects, its ramifications could then propagate through social networks.

More generally, intervention research building on explanatory research can make progress in addressing these information challenges. Hence, the most important aspect of this research is not just the observation of findings but why effects were observed. Epstein et al. lay the groundwork for further critical research on the mechanisms of discernment between truth and falsity. They entertain two plausible classes of mechanisms: whether a desire for consistency causes accuracy judgments to be brought into line with sharing intentions or whether sharing distracts from accuracy, thus making accuracy judgments noisier. The latter straightforward mechanism is supported by Epstein et al.’s findings. This is not to say that people do not desire consistency; they do, despite also embracing contradictory beliefs, another fascinating paradox of human nature (6).
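The distraction-as-noise mechanism can also be illustrated with a small simulation. This is only a sketch under assumed parameters, not Epstein et al.’s model: underlying accuracy signals for true and false items (0.8 and 0.2 here) are perturbed by Gaussian judgment noise and clipped to the rating scale, and noisier judgments shrink the true-false gap, which is precisely the signature of degraded discernment.

```python
# Illustrative simulation (assumed parameters): adding judgment noise
# to informative accuracy signals compresses the true-false rating gap.
import random

random.seed(1)

def mean_rating(signal, noise_sd, n=10000):
    """Average of n noisy accuracy judgments around an underlying
    signal, each clipped to the 0-1 rating scale."""
    total = 0.0
    for _ in range(n):
        r = signal + random.gauss(0.0, noise_sd)
        total += min(1.0, max(0.0, r))
    return total / n

def discernment(noise_sd):
    """Mean rating of true items (signal 0.8) minus false items (0.2)."""
    return mean_rating(0.8, noise_sd) - mean_rating(0.2, noise_sd)

focused = discernment(noise_sd=0.1)     # accuracy-only: little noise
distracted = discernment(noise_sd=0.5)  # sharing added: noisier judgments

print(round(focused, 2), round(distracted, 2))
```

Nothing about the simulated judges’ underlying knowledge changes between the two conditions; only the noise level does, yet measured discernment drops, matching the interpretation that the ability itself is intact.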

Thus, research about psychological mechanisms adds another dimension to efforts to promote truth in showing that people do not simply store a binary fact in memory—the truth—that is later retrieved when relevant. Scientific facts are important to communicate but that is only the beginning. Epstein et al.’s work points up the importance of the mindset of human information processors, as contrasted with machine information processors. Humans have goals when they process information that go beyond acquiring and transmitting facts; having multiple goals can interfere with “knowing what one knows” by decreasing true-false discrimination.

Other research complements the approach taken here by emphasizing how different mental representations of the facts can trigger different values, illustrating that values, much like knowledge, are subject to variable retrieval cues (7). As the current results demonstrate, sometimes people bring to mind what they know and value and sometimes they do not. Rather than assuming that laypeople either possess knowledge and certain values or lack them, as reflected in their judgments, decisions, or behaviors, this process-oriented psychological approach implies that knowledge and values can vary in their availability to decision-makers. This insight has broad implications for how the knowledge and values of individuals are implemented in real-world decisions about health, education, and policy, namely, unevenly.

In addition, Epstein et al. focus on the downstream effects of discerning true from false information once beliefs are formed, but not on why people are susceptible to misinformation to begin with. Susceptibility to misinformation and the formation of what may seem to be implausible beliefs are not emphasized in the current research but naturally play a large role in discerning truth. People have implausible beliefs for many reasons, and partisanship and news outlets are not the whole of the explanation (8).

Theories explain vulnerability to misinformation in terms of lack of factual knowledge, less reflective or analytical thinking, unreasoning emotion, and motivational biases (9, 10). Acknowledging that each of these explains part of the vulnerability, fuzzy-trace theory offers an expanded view (11). Mental representations of information and misinformation compete not only head-to-head (captured in truth-discernment scores) but also as alternative construals of a larger universe of related facts called “gist.” Integrating the present account, which emphasizes vigilance about accuracy, with perspectives on the formation and implementation of beliefs about what is accurate (and how that is mentally represented) should be a focus of future research.

Therefore, the solution to the problem of being awash in misinformation on social media is not just science education, just paying attention to accuracy, or just tagging, curating, or convincing people that misinformation is mistaken. It is all of these and more and must be approached with the rigor of any scientific research. Just as curing cancer is not a matter of curing one disease but of treating multiple diseases, remedying misinformation will not come down to one cure-all approach.

In sum, the hopeful notes sounded by Epstein et al.’s results are that it is possible to enhance identification of misinformation by varying whether sharing information is a goal, that people know more about what is true than is sometimes apparent, that interventions need not be costly and complicated, and that success ultimately depends on knowing why effects occur, not just that they occur. In short, we need not give up on truth in the tumultuous, divisive, and truth-challenged context of social media but must pursue scientific understanding with even greater resolve.

Fig. 1. Discerning true and false information on social media. Judging whether to share information followed by judging its accuracy produces less discrimination between true and false information than judging accuracy alone. Illustration credit: Austin Fisher, Science Advances.

Acknowledgments

Funding: Preparation of this manuscript was supported in part by grants to V.F.R. from the National Science Foundation (SES-2029420) and from the National Institute of Standards and Technology (60NANB22D052).

REFERENCES AND NOTES

1. D. A. Scheufele, N. M. Krause, Science audiences, misinformation, and fake news. Proc. Natl. Acad. Sci. U.S.A. 116, 7662–7669 (2019).
2. V. F. Reyna, S. M. Edelson, B. Hayes, D. Garavito, Supporting health and medical decision making: Findings and insights from fuzzy-trace theory. Med. Decis. Making 42, 741–754 (2022).
3. C. Betsch, N. T. Brewer, P. Brocard, P. Davies, W. Gaissmaier, N. Haase, J. Leask, F. Renkewitz, B. Renner, V. F. Reyna, C. Rossmann, K. Sachse, A. Schachinger, M. Siegrist, M. Stryk, Opportunities and challenges of Web 2.0 for vaccination decisions. Vaccine 30, 3727–3733 (2012).
4. Z. Epstein, N. Sirlin, A. Arechar, G. Pennycook, D. G. Rand, The social media context interferes with truth discernment. Sci. Adv., eabo6169 (2023).
5. G. Pennycook, J. McPhetres, Y. Zhang, J. G. Lu, D. G. Rand, Fighting COVID-19 misinformation on social media: Experimental evidence for a scalable accuracy-nudge intervention. Psychol. Sci. 31, 770–780 (2020).
6. A. S. Chaxel, J. E. Russo, Cognitive consistency: Cognitive and motivational perspectives, in Neuroeconomics, Judgment, and Decision Making, E. A. Wilhelms, V. F. Reyna, Eds. (Psychology Press, 2015), pp. 29–48.
7. V. F. Reyna, A scientific theory of gist communication and misinformation resistance, with implications for health, education, and policy. Proc. Natl. Acad. Sci. U.S.A. 118, e1912441117 (2021).
8. K. H. Jamieson, D. Romer, P. E. Jamieson, K. M. Winneg, The role of non–COVID-specific and COVID-specific factors in predicting a shift in willingness to vaccinate: A panel study. Proc. Natl. Acad. Sci. U.S.A. 118, e2112266118 (2021).
9. U. K. H. Ecker, S. Lewandowsky, J. Cook, P. Schmid, L. K. Fazio, N. Brashier, P. Kendeou, E. K. Vraga, M. A. Amazeen, The psychological drivers of misinformation belief and its resistance to correction. Nat. Rev. Psychol. 1, 13–29 (2022).
10. J. Roozenbeek, R. Maertens, S. M. Herzog, M. Geers, R. Kurvers, M. Sultan, S. van der Linden, Susceptibility to misinformation is consistent across question framings and response modes and better explained by myside bias and partisanship than analytical thinking. Judgm. Decis. Making 17, 547–573 (2022).
11. V. F. Reyna, D. A. Broniatowski, S. M. Edelson, Viruses, vaccines, and COVID-19: Explaining and improving risky decision-making. J. Appl. Res. Mem. Cogn. 10, 491–509 (2021).
