Abstract
Many visible public debates over scientific issues are clouded in accusations of falsehood, which place increasing demands on citizens to distinguish fact from fiction. Yet, constraints on our ability to detect misinformation coupled with our inadvertent motivations to believe false science result in a high likelihood that we will form misperceptions. As science falsehoods are often presented with emotional appeals, we focus our perspective on the roles of emotion and humor in the formation of science attitudes, perceptions, and behaviors. Recent research sheds light on how funny science and emotions can help explain and potentially overcome our inability or lack of motivation to recognize and challenge misinformation. We identify some lessons learned from these related and growing areas of research and conclude with a brief discussion of the ethical considerations of using persuasive strategies, calling for more dialogue among members of the science communication community.
Keywords: science communication, misinformation, emotion, humor
Many visible public debates over scientific issues are clouded in accusations of falsehood, which place increasing demands on citizens to distinguish fact from fiction. Doing so is challenging, as facts, half-truths, and falsehoods can seem indistinguishable (1). This is especially the case in a media environment where a consumer can pick and choose from numerous channels, formats, and types of information (2). Others in PNAS (3, 4) have dissected and quantified the scope of the misinformation problem, and we will not repeat those arguments here. It is sufficient, for this perspective, to assert that fake news is a salient issue in media today (e.g., refs. 5 and 6), and the challenge presented by science misinformation and misperceptions deserves attention.
In this essay, we first examine the constraints on our ability to detect misinformation that, when coupled with our inadvertent motivations to believe false science, result in a high likelihood that we will form misperceptions. We briefly engage with ability and motivation, drawing primarily from research in cognitive psychology. Then, extending prior research and discussions from previous colloquia (e.g., ref. 3), we focus our perspective on an important but relatively understudied area of research in science communication: emotion and humor.
As antiscience claims often appeal to emotions (e.g., ref. 7), a better understanding of the role of emotions in science communication can advance not only how we communicate and engage public audiences with science but also how we address misinformation. To this end, recent research on humor sheds light on how funny science can potentially combat misinformation and misperceptions through various mechanisms. We conclude with some lessons learned from this growing body of research and briefly touch on the ethical considerations, calling for more discussion about this area of science communication.
Our Ability to Recognize Misinformation Is Limited
In a 2016 survey, nearly 25% of adults in the United States said they shared inaccurate information on social media (8), a figure that likely understates the true rate, given social desirability bias in self-reporting.* Our ability to recognize and avoid misinformation is curtailed not only by overwhelming amounts of information and the nature of scientific content that we encounter online (2) but also by individual characteristics (e.g., science knowledge, media literacy) and structural constraints (e.g., local and regional news deserts).
Knowledge about basic science facts plays a role in our ability to parse accurate information from falsehoods and half-truths (e.g., ref. 9). But the level of science knowledge among US adults has been fairly stagnant for at least a decade; in 2018, American adults correctly answered about 5.5 of 9 true-or-false questions about basic science facts (10). Moreover, factual knowledge is not the only predictor of people’s perceptions about science (11), and merely filling the deficit in public knowledge is unlikely to remedy (mis)perceptions (12).
While knowledge about basic science facts is one aspect of science literacy, the term also encompasses an understanding of the practice of science and its role as a social process (13). As others have pointed out (3), knowledge about the practice and process of science, not just basic facts, is likely to be more relevant in the context of science misinformation. In the United States, these types of literacy have also been relatively unchanging. In 2018, only about 43% of adults correctly responded to several questions that measure understanding of the process of scientific inquiry (10).
Media literacy also augments our ability to evaluate information. Like science literacy, media literacy is a complex concept that generally refers to the ability to analyze and evaluate information (14, 15), much of which we now access online. Evaluating online information includes considering strategies used to create content; identifying a media producer’s purpose and perspective; recognizing the social, political, and historical contexts in which information is created and consumed (14); and determining credibility (16). Media literacy equips us with the ability to negotiate meaning and engage with information that is available in a variety of media formats. Yet, media literacy education in the United States has lagged behind that of other developed nations (17–19). Moreover, some argue that current media literacy education efforts focus on little more than familiarity with online and digital media tools that emphasize creativity and social connectivity (20). Education efforts that hone so-called “tool competence,” while useful, are unlikely to improve our ability to detect and avoid misinformation.
Media and science literacy are inextricably linked; media coverage of science issues is part of the social process of science (13). We need greater attention to media literacy education that focuses on critical engagement with media and its impacts on society. Better media literacy education coupled with an understanding of science literacy that includes considerations of media presentations of science (e.g., ref. 21) has the potential to improve our ability to discern science facts from falsehoods.
There are also structural constraints to our ability to detect misinformation. Two structural factors that limit our ability to discern credible science information from falsehoods include the shrinking of traditional science journalism and the prevalence of news deserts across the United States. Journalists play a key role in public understanding, legitimization, and support of science, through their coverage of scientific topics in media (22), yet this practice faces increasing challenges. Over the last decade, the traditional news industry has lost readers, influence, and advertising revenue, leading the industry to decline in both size and average salary (23).
Science news is typically less popular than other topics in media (e.g., politics, sports), and this low priority has intensified in a digital environment, resulting in fewer science journalist careers (22). In addition, there are increasing demands placed on science journalists, with constant, tighter deadlines across multiple media platforms (24) and an emphasis on simplifying material for audience consumption (25). Rather than the watchdog or gatekeeper roles that journalists traditionally held, many science journalists today find their role in science news coverage evolving into one necessitating public relations skills to navigate politicized issues, polarized debates, and interest-driven coverage (26). Fewer journalists, working faster and for less money to produce bite-sized news stories in an oversaturated media market, impair the quality of science information that public audiences receive.
Moreover, since 2004, more than one in five newspapers in the United States has closed its doors permanently, and others have switched to a completely digital format, leaving many communities without a local newspaper (27). The departure of local newspapers produces news deserts, or communities with no coverage of local news (28). For many, this information absence is filled by the Internet. Today, most individuals obtain scientific information online (10), a trend even more prominent among so-called digital natives (12). While the Internet allows for public communication about science in novel ways, the information provided in this participatory media environment does not always face the same scrutiny upheld by established journalistic norms. As such, individuals other than scientists, such as politicians or religious leaders who may have contrary opinions, may challenge facts established by scientific consensus (29). The declining presence of traditional science journalism and the growth of physical news deserts underscore that citizens’ ability to identify credible information does not operate in a vacuum and can be threatened by structural, as well as individual, constraints.
We Often Lack Motivation to Parse Misinformation
Scholarship in the basic sciences of human cognition offers abundant evidence that the ways in which we seek and process information are not conducive to discerning misinformation. Many scientific issues that society faces are complex and novel to the average information consumer. Thus, it requires significant effort by citizens to make sense of the information necessary to thoroughly understand a single scientific issue on the public agenda (30).
To manage the deluge of information, we rely on mental shortcuts, or heuristics, that reduce complex cognitive tasks into simple operations that enable us to make judgments and form opinions about scientific issues society faces. Abundant empirical evidence shows that we use heuristics in the context of science. Predispositions such as political ideology and religious values are employed when we seek information (31) and form opinions about issues ranging from nuclear energy (32) to nanotechnology (33). Online, heuristics become helpful tools that allow us to efficiently make sense of information in an often overwhelming environment; we are constantly inundated with complex science messages from a myriad of sources and in diverse formats.
In addition to helping us sift through large amounts of information, we use these mental shortcuts in our evaluation of information (30, 34, 35). We are motivated reasoners; we process information in unconsciously biased ways (36). As others have pointed out (4), this mechanism can explain both the difficulty of detecting misinformation and the challenge of correcting misperceptions.
Misinformation is often packaged in simplistic and emotional formats (37). Stories containing misinformation are often framed as clickbait, with captivating titles that capture attention with scandalous information. Indeed, extant scholarship indicates that emotions such as anger tend to favor biased processing of misinformation, resulting in attitude-consistent misperceptions (38). Such mechanisms encourage our acceptance of misinformation without much cognitive effort. Yet, appeals to emotional reactions are often used in the framing of false information (7). Perhaps because the scientific endeavor is traditionally viewed as cold, rational, dispassionate, and objective, we have overlooked the role of emotion in the formation of science opinions and attitudes. This “cold” view of science is seemingly at odds with “hot” topics like emotion, but the constructed dichotomy fails to account for decades of research in the social sciences; emotion is a fundamental part of almost all human actions and decisions (39).
The Role of Emotion in Science Communication
Emotions are subjective feeling states that result from appraisal of a situation and give rise to approach or avoidance motivational processes. Functional emotion theorists argue that emotions arise from meaningful interpretation of an object (e.g., a scientific message). In other words, emotions result from meaning making and give rise to action tendencies [i.e., approach or avoidance responses (40)]; each emotion has a core relational theme that guides our responses (41). Approach motivation is typically connected with incentive and reward, while avoidance is associated with aversion and threat (42). For example, fear is experienced in response to a physically or psychologically threatening object, resulting in an avoidance motivation (43, 44). In contrast, anger that results from appraisal of an object or message is associated with approach action tendencies, as individuals are motivated to defend themselves or rectify a perceived wrong (45). These appraisal tendencies are implicit predispositions used to evaluate future stimuli (46) and can affect depth of information processing (47–49) and thought content (50, 51). Emotions, therefore, are likely to influence people’s attitudes toward science and their risk judgments.
Although emotional appeals and affect have a long history of study in the context of health communication, emotions can also influence our attitudes toward scientific issues and how we process scientific information (e.g., refs. 52–55). For example, disgust elicited by a message about fecal microbiome transplants can increase people’s risk perceptions (55) and influence their attitudes toward policy and regulation (54). Fear and anger toward videos from the Discovery Channel’s Shark Week have also been found to drive shark conservation behaviors (53). Emotions have likewise been examined as potentially strengthening cognitive strategies such as gain-versus-loss framing, in which information is presented in terms of gains or losses that result from engaging in a behavior (56, 57). Using the context of sea star wasting disease, Lu (52) found that gain-framed messages containing a sadness appeal (relative to loss-framed messages and hope appeals) increased proenvironmental behaviors, policy support, and information seeking among individuals. Others have found that gain-framed messages that evoke hope can influence people’s attitudes toward climate advocacy and policy (58). These studies draw on theoretical approaches to the study of emotion in science communication, including the cognitive functional model (59), which highlights mechanisms that may explain why rectifying misperceptions remains a challenge.
Strong emotions can impair our ability to process science information rationally (49). If processing ability is impaired, we generally resort to using mental shortcuts, or heuristic processing, to make sense of new information. Then, if a science falsehood aligns with our priors, heuristic processing impairs our ability to detect misinformation, while increasing the possibility of acceptance. If processing ability is not impaired, whether we adopt systematic or heuristic processing depends on the availability of mental shortcuts. If these shortcuts are present, and we are motivated to engage with the information and expect it to satisfy an emotion-induced goal, then we are more likely to process information heuristically. In the same state of motivation and goal expectation, the absence of mental shortcuts makes it more likely that we will process the information systematically. In the latter case, priors and predispositions can serve as moderators of the resulting attitude, judgment, or (mis)perception. If our priors lead us to accept misinformation, the misperceptions that result are likely to be long lasting and relatively stable.
A more recent theoretical framework is the emotional flow hypothesis (60, 61). While most communication research on emotion has focused on how a primary emotion affects downstream attitudinal and behavioral outcomes, message content can induce a series of emotional responses (62). This so-called emotional flow is defined as “the evolution of the emotional experience during exposure” to a message (60). Although primarily proposed and examined in the context of health messaging (e.g., refs. 63 and 64), an initial empirical test of emotional flow has been conducted in the context of climate change (58). Using gain-versus-loss framing coupled with threat and efficacy messages presented in succession, Nabi et al. (58) found that climate change messages designed to first elicit fear, then hope, were more effective in encouraging advocacy behavior when compared to messages that lacked emotional sequencing structure.
Although the evidence is yet sparse, the emotional flow hypothesis might offer a means of correcting misinformation. One study found that a narrative containing corrective information that had an emotional ending was more effective at rectifying attitudes compared to a corrective narrative without an emotional ending (65). Even though this study did not test the emotional flow hypothesis specifically, these findings are promising, as shifts in emotion are central to narratives and storytelling (61).
The effect of emotions on the detection and acceptance of misinformation, the formation of misperceptions, and their correction is not straightforward. Indeed, the mechanisms reviewed and proposed require further empirical testing. Additional research in this area can shed much-needed light on how emotional appeals and affective reactions to science information might limit or enhance our ability and motivation to address misinformation. Advances in this area will complement existing research on the cognitive mechanisms associated with misinformation and the correction of misperceptions.
Funny Science: How Humor Influences Science Attitudes
Related to our understanding of the role of emotions and emotional shifts is the use of humor in science messaging. Humor is derived primarily from surprise, as incongruity often plays a role in the elicitation of humor (66, 67), and can, if one gets the joke, result in amusement or mirth. Humor and emotion have a long history; emotional events are often retold or framed in humorous ways (68), and humor is regularly used in interpersonal emotion management (69). Today, we often see humorous content about current scientific issues that are emotionally charged (e.g., memes about mask wearing to prevent the spread of the coronavirus). Establishing a better understanding of humor, including its relationship to discrete emotions and the mechanisms that underlie shifts in emotion when we encounter funny, yet emotional, science content, is necessary to improve our understanding of the effects of humor and how it can be used in the practice of communicating complex scientific issues.
Humor is ubiquitous and constant in daily life. We see funny messages in television advertisements (70, 71), and almost 30% of Americans say they learned something about politics from satirical programs such as The Daily Show, The Colbert Report, and Saturday Night Live (72). Humor is also prevalent in science. A recent content analysis of science humor on Twitter and Instagram found that satire, wordplay, and anthropomorphism were relatively commonplace (73). The ubiquity of humor makes it an ideal subject of inquiry, as it allows researchers to examine theories of science communication in real-world settings, a research agenda that has been emphasized in a recent report of the National Academies (74).
In an era of (mis)information, humor has the potential to be implemented as a defense against falsehoods, but a better understanding of how humor influences public attitudes and decision-making is necessary. So far, research examining the use of humor to correct misinformation is inconclusive, though promising. Vraga et al. (75) compared the effectiveness of humor- vs. logic-based corrections of misinformation on Twitter and found that, of the three issues examined (climate change, HPV vaccinations, and gun control), only corrections about HPV vaccinations reduced misperceptions; for that issue, both humor- and logic-based corrections were effective. In a related study using eye tracking, researchers found humor directed audiences’ attention to both the misinformation and the visual designed to correct it (76). Attending to the corrective image reduced people’s perceptions of credibility of the misinformation and, indirectly, reduced misperceptions. Other research on Facebook has shown that fake news from a source that self-identifies as a satirical outlet can potentially reduce misperceptions by reducing perceptions of credibility (77).
While the evidence is far from conclusive, these studies highlight why humor can be a valuable tool. First, it can serve as a means of drawing attention to issues to which audiences might not otherwise attend (e.g., refs. 78 and 79). Humorous messages also direct a viewer’s attention to information embedded within their content, which may be a result of the viewer marshaling cognitive resources to “get the joke” (67). In particular, visual forms of science humor (e.g., memes, comics) have the potential to capture attention (80), and some studies show that humor can also improve problem-solving skills and learning (81), although more research in this area is necessary. More importantly, humor impacts how we process information (e.g., ref. 82) and form attitudes and behavioral intentions (e.g., refs. 83 and 84).
Clearly, humor is already used to communicate science; scholars even recommend using humor for this purpose (85, 86). Yet, humor’s effects on people’s attitudes toward science and scientists remain largely an open empirical question. Studying humor in the context of science is integral to applying it effectively in practice. However, this is an emerging area of scholarship in science communication, and we look to fields with a longer history of humor research (e.g., education, advertising, political communication) for applicable insights.
Humor’s Effect on Source Evaluations
Audience perceptions of a communication source have long been recognized as important factors that impact the effectiveness of communication (87). Among the desirable attributes of a source, trustworthiness and likability play decisive roles in the persuasive impacts of messages. Trust has long been shown to affect people’s attitudes toward science (e.g., refs. 88–91). Although trust is a broad concept that can be measured in a variety of ways, source credibility is a common feature of numerous conceptualizations (e.g., refs. 92–95). To improve detection of misinformation and guard against misperceptions, then, we must consider the credibility of sources of scientific messages.
Related to credibility, source likability is typically conceptualized as an affective evaluation linked to an object (96). It is associated with traits that make a person likable in a general sense but are not necessarily relevant to the person’s expertise or credibility (97). Research has consistently shown that more-likable communicators are better able to influence audiences’ views, even when their intention to persuade is explicit (98, 99). Taken together, source likability and credibility have potentially impactful roles as preventative and corrective measures against misinformation.
Humor has long been linked to source evaluations (100, 101). In advertising, its effects on source evaluations often depend on factors such as humor type (102). In education research, the relationship between humor and source evaluation is more consistent; humor is linked to more-positive evaluations of teachers (103, 104). In interpersonal communication research, inoffensive humor has been associated with attraction and building of rapport between individuals (105). When someone makes another person laugh, the recipient associates the source of humor with the pleasure of laughing. As a result, they view the source as more likable (106). In general, funny people are rated more favorably than others, a finding that has been replicated across diverse contexts (107).
Recent research has found supporting evidence in the context of science communication. Using a science joke on Twitter, we (108) found that people who found the content amusing also perceived the scientist who posted the joke as more likable. In another experiment manipulating the presence of a laugh track in a video clip featuring a scientist performing a standup comedy routine, Yeo et al. (109) found that laughter increased audiences’ perceptions of likability and expertise of the scientist. These findings are encouraging—scientists who use humor to engage audiences appear to be more likable, and, importantly, their credibility as a scientist is not undermined.
In addition to affecting perceptions of likability and expertise, funny content can indirectly impact downstream attitudes and behavioral intentions. Not only is a scientist performing a standup comedy routine perceived as more likable and credible, but greater perceptions of expertise are subsequently associated with perceptions of comedy as a valid source of science information (109). Experiencing humor as a result of funny science content also increases people’s motivations to follow more science on social media and their intentions to share and engage with such content (84, 108). Notably, these recent works on science humor are, for the most part, conducted with jokes that tend to be benign and inoffensive. However, satire and sarcasm are prevalent in online science humor (73), and it is to this type of biting, other-directed humor we now turn.
Regarding Satire and Sarcasm
Satire is commonly found in online science content (73) and exemplified by Twitter hashtags such as #overlyhonestmethods and #fieldworkfail. These hashtags are often used by researchers to express methodological frustrations that would not be considered appropriate for scholarly publication (110). The humor expressed in this content, instead of being self-deprecating, is other directed, poking fun at the scientific process (111). Some research on humor in science and health communication has probed the effects of satire on attitudes and information processing. For example, a satirical message about the importance of the measles, mumps, and rubella (MMR) vaccine led to less psychological reactance [a motivational state in which individuals feel their freedom is threatened (112)] and reduced defensive information processing for those who held misinformed beliefs about the MMR vaccine (113). In the context of climate change, viewers of a one-sided, sarcastic message mocking people who believe climate change is a hoax reported increased risk perceptions (114) and were encouraged to engage in more elaborative information processing (115). These findings and others (e.g., ref. 116) offer promising answers to the question of using humor to accomplish strategic science communication goals, including detecting and countering misinformation.
Although these few studies offer some understanding of satire’s role in science communication, this is still an under-studied area. However, we can look to its treatment in other contexts to gain insight into its role in communicating complex topics. On the one hand, research in political communication demonstrates the promise of satirical content to foster learning, engagement, and message elaboration. For example, following the 2012 election, exposure to The Colbert Report was found to increase people’s perceptions of their knowledge about super PACs, while also increasing factual knowledge of campaign finance regulation (117). Others have found that political humor can increase knowledge (118, 119), message elaboration (120), and political participation (120, 121). Applied to communicating complex science issues, finding satirical ways to present novel and intricate topics might facilitate learning and engagement among broad audiences.
On the other hand, it is easy to imagine that satire could potentially perpetuate misperceptions in science and negatively influence people’s perceptions of information sources and scientific actors. An analysis of climate change reporting on The Colbert Report found that, even though the issue was covered ironically, some audiences (primarily conservatives) took Colbert’s message about climate change being a “hoax” at face value (122, 123). These backfire effects can thus perpetuate the misperception that climate change is a myth and would also be a concern for other scientific issues (e.g., vaccines). Others have found parodies of political candidates to increase the salience of caricatured traits (124) and affect perceptions of a joke’s target (125–128). Much of the extant research on satirical impersonations in political communication (e.g., Tina Fey’s portrayal of Sarah Palin on Saturday Night Live) shows that satire negatively influences people’s evaluations of political candidates. If satirical political content can negatively impact evaluations of political actors, it seems reasonable to consider that the same might occur in the context of satirical science: Does satirical science humor potentially undermine trust in scientists and other scientific professionals? This and other questions about the effect of satire in science communication are open empirical questions that should be addressed.
Looking Ahead
There is no simple remedy to the problem of science misinformation. Our best and most realistic course is to deploy multiple strategies in concert with one another. To this end, a better understanding of the roles of emotion and humor in accepting misinformation and forming misperceptions, as well as correcting them, serves as one more resource for science communicators’ efforts against misinformation. It is crucial that members of the science communication community (e.g., trainers, practitioners, researchers) form mutual collaborations to facilitate the conduct of translational research to address the misinformation challenge we face today. We believe science communication research needs more translation—empirical observations from research can and should be turned into best practices, strategies, and interventions that improve the health of science and its role in society.
Yet, in doing this work, critiques arise about the potential ethical implications of the recommendations and best practices that result from theoretical work. One perspective asserts that it is manipulative for science communicators to use persuasive strategies to achieve better scientific citizenship (e.g., higher science literacy, proscience attitudes). Although this consideration of ethics is not new to science communication (129, 130), even a recent report on the science of science communication from the National Academies (74) remained agnostic on this issue. We need to discuss the ethics of using communication strategies when engaging publics, and a recent edited volume initiates this conversation (131). Yet, the issue of whether science communication informs or persuades has not been adequately addressed (132). Not engaging with the goals and related ethics of science communication and continuing to rely on “just the facts” of scientific issues risks allowing misinformation and misperceptions to become more pervasive in our information ecosystem.
Communication strategies are not inherently deceitful or malicious—it is how we deploy these strategies that matters. Of course, how practitioners adopt communication practices translated from research depends on many factors, including goals and intentions. If the goals for current science communication are to correct misperceptions and inoculate ourselves against misinformation, we must engage with the moral complexities and make ethically grounded decisions about whether and how to implement persuasive communication tools to meet our goals.
Although empirically driven communication techniques may be advantageous for combating ephemeral misinformation wars, another concern is the potential hazard of undermining public trust in science and scientists. For now, opinion surveys show that the public’s trust in science is relatively high. In 2018, 44% of Americans said they have a “great deal of confidence” in the scientific community (10), second only to confidence in the US military. Given the importance of trusted sources in communication, it is integral that this confidence in the scientific community is not eroded in the effort to diminish misinformation and reduce public misperceptions. Science communication researchers continue to advance our understanding of how emotion and humor as strategic communication techniques can be used in service to practice; collaborative projects between practitioners and researchers will only strengthen this knowledge base. But, although empirical discoveries may yield effective tools and techniques, the recommendations and best practices that result must be employed conscientiously. To this end, it is essential that we engage in discussions and dialogue about the ethical considerations and challenges that face science communication today.
Footnotes
The authors declare no competing interest.
This paper results from the Arthur M. Sackler Colloquium of the National Academy of Sciences, "Advancing the Science and Practice of Science Communication: Misinformation About Science in the Public Sphere," held April 3−4, 2019, at the Arnold and Mabel Beckman Center of the National Academies of Sciences and Engineering in Irvine, CA. NAS colloquia began in 1991 and have been published in PNAS since 1995. From February 2001 through May 2019, colloquia were supported by a generous gift from The Dame Jillian and Dr. Arthur M. Sackler Foundation for the Arts, Sciences, & Humanities, in memory of Dame Sackler's husband, Arthur M. Sackler. The complete program and video recordings of most presentations are available on the NAS website at http://www.nasonline.org/misinformation_about_science.
This article is a PNAS Direct Submission. C.M.R. is a guest editor invited by the Editorial Board.
*Social desirability bias is the tendency for survey respondents to answer questions in ways that they perceive to be socially acceptable (133).
Data Availability
There are no data underlying this work.
References
- 1. Pew Research Center, The future of truth and misinformation online (2017).
- 2. Brossard D., New media landscapes and the science information consumer. Proc. Natl. Acad. Sci. U.S.A. 110, 14096–14101 (2013).
- 3. Scheufele D. A., Krause N. M., Science audiences, misinformation, and fake news. Proc. Natl. Acad. Sci. U.S.A. 116, 7662–7669 (2019).
- 4. Cacciatore M. A., Misinformation and public opinion of science and health: Approaches, findings, and future directions. Proc. Natl. Acad. Sci. U.S.A. 118, e1912437117 (2021).
- 5. Baker S. A., Wade M., Walsh M. J., Misinformation: tech companies are removing 'harmful' coronavirus content – but who decides what that means? The Conversation (2020). https://theconversation.com/misinformation-tech-companies-are-removing-harmful-coronavirus-content-but-who-decides-what-that-means-144534. Accessed 10 September 2020.
- 6. Garcia R. T., Brazil’s “fake news” bill won’t solve its misinformation problem. MIT Technology Review (2020). https://www.technologyreview.com/2020/09/10/1008254/brazil-fake-news-bill-misinformation-opinion. Accessed 10 September 2020.
- 7. Bean S. J., Emerging and continuing trends in vaccine opposition website content. Vaccine 29, 1874–1880 (2011).
- 8. Barthel M., Mitchell A., Holcomb J., Many Americans Believe Fake News Is Sowing Confusion (Pew Research Center, 2016).
- 9. Pennycook G., McPhetres J., Zhang Y., Lu J. G., Rand D. G., Fighting COVID-19 misinformation on social media: Experimental evidence for a scalable accuracy-nudge intervention. Psychol. Sci. 31, 770–780 (2020).
- 10. National Science Board, Science and Engineering Indicators 2020 (National Science Foundation, 2020).
- 11. Simis M. J., Madden H., Cacciatore M. A., Yeo S. K., The lure of rationality: Why does the deficit model persist in science communication? Public Underst. Sci. 25, 400–414 (2016).
- 12. Scheufele D. A., Communicating science in social settings. Proc. Natl. Acad. Sci. U.S.A. 110, 14040–14047 (2013).
- 13. National Academies of Sciences, Engineering, and Medicine, Science Literacy: Concepts, Contexts, and Consequences (The National Academies Press, 2016).
- 14. Hobbs R., “Expanding the concept of literacy” in Media Literacy in the Information Age: Current Perspectives, Kubey R., Ed. (Transaction, 1997), pp. 163–183.
- 15. Bulger M., Davison P., The Promises, Challenges, and Futures of Media Literacy (Data & Society Research Institute, 2018).
- 16. Callister T. A. Jr, Media literacy: On-ramp to the literacy of the 21st century or cul-de-sac on the information superhighway. Adv. Reading Lang. Res. 7, 403–420 (2000).
- 17. Kellner D., Share J., Media literacy in the U.S. MedienPädagogik 11, 1–21 (2005).
- 18. Kubey R., Baker F., Has media literacy found a curricular foothold? Educ. Week 19, 56–58 (1999).
- 19. Mihailidis P., Media literacy in journalism/mass communication education: Can the United States learn from Sweden? J. Mass Commun. Educat. 60, 415–428 (2005).
- 20. Hobbs R., Jensen A., The past, present, and future of media literacy education. J. Media Lit. Educ. 1, 1 (2013).
- 21. Brossard D., Shanahan J., Do they know what they read? Building a scientific literacy measurement instrument based on science media coverage. Sci. Commun. 28, 47–63 (2006).
- 22. Schäfer M. S., “How changing media structures are affecting science news coverage” in The Oxford Handbook of the Science of Science Communication, Jamieson K. H., Kahan D. M., Scheufele D. A., Eds. (Oxford University Press, 2017), pp. 51–57.
- 23. Dunwoody S., “Science journalism: Prospects in the digital age” in Routledge Handbook of Public Communication of Science and Technology, Bucchi M., Trench B., Eds. (Routledge, ed. 2, 2014), pp. 27–39.
- 24. Brumfiel G., Science journalism: Supplanting the old media? Nature 458, 274–277 (2009).
- 25. Goldacre B., Bad Science: Quacks, Hacks, and Big Pharma Flacks (McClelland & Stewart, 2010).
- 26. Ashwell D. J., The challenges of science journalism: The perspectives of scientists, science communication advisors and journalists from New Zealand. Public Underst. Sci. 25, 379–393 (2016).
- 27. Abernathy P. M., The Expanding News Desert (University of North Carolina Press, 2018).
- 28. Miller J., News Deserts: No News is Bad News (Manhattan Institute for Policy Research, 2018).
- 29. Bubela T., et al., Science communication reconsidered. Nat. Biotechnol. 27, 514–518 (2009).
- 30. Fiske S. T., Taylor S. E., Social Cognition (McGraw-Hill, ed. 2, 1991).
- 31. Yeo S. K., Xenos M. A., Brossard D., Scheufele D. A., Selecting our own science: How communication contexts and individual traits shape information seeking. Ann. Am. Acad. Pol. Soc. Sci. 658, 172–191 (2015).
- 32. Yeo S. K., et al., Partisan amplification of risk: American perceptions of nuclear energy risk in the wake of the Fukushima Daiichi disaster. Energy Policy 67, 727–736 (2014).
- 33. Cacciatore M. A., Scheufele D. A., Corley E. A., From enabling technology to applications: The evolution of risk perceptions about nanotechnology. Public Underst. Sci. 20, 385–404 (2011).
- 34. Popkin S. L., The Reasoning Voter: Communication and Persuasion in Presidential Campaigns (University of Chicago Press, 1991).
- 35. Kuklinski J. H., Quirk P. J., “Reconsidering the rational public: Cognition, heuristics and mass opinion” in Elements of Reason: Cognition, Choice, and the Bounds of Rationality, Lupia A., McCubbins M. D., Popkin S. L., Eds. (Cambridge University Press, 2000), pp. 153–182.
- 36. Taber C. S., Lodge M., Motivated skepticism in the evaluation of political beliefs. Am. J. Pol. Sci. 50, 755–769 (2006).
- 37. Lewandowsky S., Ecker U. K. H., Seifert C. M., Schwarz N., Cook J., Misinformation and its correction: Continued influence and successful debiasing. Psychol. Sci. Public Interest 13, 106–131 (2012).
- 38. Weeks B. E., Emotions, partisanship, and misperceptions: How anger and anxiety moderate the effect of partisan bias on susceptibility to political misinformation. J. Commun. 65, 699–719 (2015).
- 39. Damasio A., Descartes’ Error: Emotion, Reason, and the Human Brain (Penguin, reprint ed., 2005).
- 40. Newhagen J. E., TV news images that induce anger, fear, and disgust: Effects on approach-avoidance and memory. J. Broadcast. Electron. Media 42, 265–276 (1998).
- 41. Lazarus R. S., Progress on a cognitive-motivational-relational theory of emotion. Am. Psychol. 46, 819–834 (1991).
- 42. Elliot A. J., Eder A. B., Harmon-Jones E., Approach–avoidance motivation and emotion: Convergence and divergence. Emot. Rev. 5, 308–311 (2013).
- 43. Frijda N. H., The Emotions (Cambridge University Press, 1986).
- 44. Lazarus R. S., Emotion and Adaptation (Oxford University Press, 1991).
- 45. Izard C. E., Human Emotions (Springer, 1977).
- 46. Lerner J. S., Keltner D., Beyond valence: Toward a model of emotion-specific influences on judgement and choice. Cogn. Emotion 14, 473–493 (2000).
- 47. Lerner J. S., Tiedens L. Z., Portrait of the angry decision maker: How appraisal tendencies shape anger’s influence on cognition. J. Behav. Decis. Making 19, 115–137 (2006).
- 48. Small D. A., Lerner J. S., Emotional policy: Personal sadness and anger shape judgments about a welfare case. Polit. Psychol. 29, 149–168 (2008).
- 49. Tiedens L. Z., Linton S., Judgment under emotional certainty and uncertainty: The effects of specific emotions on information processing. J. Pers. Soc. Psychol. 81, 973–988 (2001).
- 50. Keltner D., Ellsworth P. C., Edwards K., Beyond simple pessimism: Effects of sadness and anger on social perception. J. Pers. Soc. Psychol. 64, 740–752 (1993).
- 51. Raghunathan R., Pham M. T., All negative moods are not equal: Motivational influences of anxiety and sadness on decision making. Organ. Behav. Hum. Decis. Process. 79, 56–77 (1999).
- 52. Lu H., The effects of emotional appeals and gain versus loss framing in communicating sea star wasting disease. Sci. Commun. 38, 143–169 (2016).
- 53. Myrick J. G., Evans S. D., Do PSAs take a bite out of Shark Week? The effects of juxtaposing environmental messages with violent images of shark attacks. Sci. Commun. 36, 544–569 (2014).
- 54. Sun Y., Yeo S. K., McKasy M., Shugart E. C., Disgust, need for affect, and responses to microbiome research. Mass Commun. Soc. 22, 508–534 (2019).
- 55. Yeo S. K., Sun Y., McKasy M., Shugart E. C., Disgusting microbes: The effect of disgust on perceptions of risks related to modifying microbiomes. Public Underst. Sci. 28, 433–448 (2019).
- 56. Kahneman D., Tversky A., Prospect theory: An analysis of decision under risk. Econometrica 47, 263–291 (1979).
- 57. Rothman A. J., Salovey P., Shaping perceptions to motivate healthy behavior: The role of message framing. Psychol. Bull. 121, 3–19 (1997).
- 58. Nabi R. L., Gustafson A., Jensen R., Framing climate change: Exploring the role of emotion in generating advocacy behavior. Sci. Commun. 40, 442–468 (2018).
- 59. Nabi R. L., A cognitive-functional model for the effects of discrete negative emotions on information processing, attitude change, and recall. Commun. Theory 9, 292–320 (1999).
- 60. Nabi R. L., Emotional flow in persuasive health messages. Health Commun. 30, 114–124 (2015).
- 61. Nabi R. L., Green M. C., The role of a narrative’s emotional flow in promoting persuasive outcomes. Media Psychol. 18, 137–162 (2015).
- 62. Carrera P., Muñoz D., Caballero A., Mixed emotional appeals in emotional and danger control processes. Health Commun. 25, 726–736 (2010).
- 63. Alam N., So J., Contributions of emotional flow in narrative persuasion: An empirical test of the emotional flow framework. Commun. Q. 68, 161–182 (2020).
- 64. Nabi R. L., Myrick J. G., Uplifting fear appeals: Considering the role of hope in fear-based persuasive messages. Health Commun. 34, 463–474 (2019).
- 65. Sangalang A., Ophir Y., Cappella J. N., The potential for narrative correctives to combat misinformation. J. Commun. 69, 298–319 (2019).
- 66. Gruner C. R., The Game of Humor: A Comprehensive Theory of Why We Laugh (Routledge, 1997).
- 67. Suls J., “Cognitive processes in humor appreciation” in Handbook of Humor Research, McGhee P. E., Goldstein J. H., Eds. (Springer, New York, NY, 1983), pp. 39–57.
- 68. Meisiek S., Yao X., Hartel C. E. J., Zerbe W. J., Ashkanasy N. M., “Nonsense makes sense: Humor in social sharing of emotion at the workplace” in Emotions in Organizational Behavior, Härtel C. E., Zerbe W. J., Ashkanasy N. M., Eds. (Lawrence Erlbaum Associates, Inc., 2005), pp. 143–165.
- 69. Francis L. E., Laughter, the best mediation: Humor as emotion management in interaction. Symbolic Interact. 17, 147–163 (1994).
- 70. Gulas C. S., McKeage K. K., Weinberger M. G., It’s just a joke: Violence against males in humorous advertising. J. Advert. 39, 109–120 (2010).
- 71. Beard F. K., One hundred years of humor in American advertising. J. Macromark. 25, 54–65 (2005).
- 72. Baumgartner J. C., Morris J. S., Laughing Matters: Humor and American Politics in the Media Age (Routledge, 2008).
- 73. McKasy M., Yeo S. K., Cacciatore M. A., Su L. Y.-F., Oldroyd Z., Poster 24727, American Association for the Advancement of Science (AAAS) Annual Meeting, February 14–17, 2019, Washington, DC.
- 74. National Academies of Sciences, Engineering, and Medicine, Communicating Science Effectively: A Research Agenda (The National Academies Press, 2017).
- 75. Vraga E. K., Kim S. C., Cook J., Testing logic-based and humor-based corrections for science, health, and political misinformation on social media. J. Broadcast. Electron. Media 63, 393–414 (2019).
- 76. Kim S. C., Vraga E. K., Cook J., An eye tracking approach to understanding misinformation and correction strategies on social media: The mediating role of attention and credibility to reduce HPV vaccine misperceptions. Health Commun., 10.1080/10410236.2020.1787933 (2020).
- 77. Garrett R. K., Poulsen S., Flagging Facebook falsehoods: Self-identified humor warnings outperform fact checker and peer warnings. J. Comput. Mediat. Commun. 24, 240–258 (2019).
- 78. Becker A. B., Waisanen D. J., From funny features to entertaining effects: Connecting approaches to communication research on political comedy. Rev. Comm. 13, 161–183 (2013).
- 79. Brewer P. R., McKnight J., Climate as comedy: The effects of satirical television news on climate change perceptions. Sci. Commun. 37, 635–657 (2015).
- 80. Lin S.-F., Lin H., Lee L., Yore L. D., Are science comics a good medium for science communication? The case for public learning of nanotechnology. Int. J. Sci. Educ. Part B 5, 276–294 (2015).
- 81. Farinella M., The potential of comics in science communication. JCOM 17, Y01 (2018).
- 82. Vraga E. K., Johnson C. N., Carr D. J., Bode L., Bard M. T., Filmed in front of a live studio audience: Laughter and aggression in political entertainment programming. J. Broadcast. Electron. Media 58, 131–150 (2014).
- 83. Cacciatore M. A., Becker A. B., Anderson A. A., Yeo S. K., Laughing with science: The influence of audience approval on engagement. Sci. Commun. 42, 195–217 (2020).
- 84. Yeo S. K., Su L. Y.-F., Cacciatore M. A., McKasy M., Qian S., Predicting intentions to engage with scientific messages on Twitter: The roles of mirth and need for humor. Sci. Commun. 42, 481–507 (2020).
- 85. Goodwin J., Dahlstrom M. F., Communication strategies for earning trust in climate change debates. Wiley Interdiscip. Rev. Clim. Change 5, 151–160 (2014).
- 86. Baram-Tsabari A., Lewenstein B. V., An instrument for assessing scientists’ written skills in public communication of science. Sci. Commun. 35, 56–85 (2013).
- 87. Hovland C. I., Weiss W., The influence of source credibility on communication effectiveness. Public Opin. Q. 15, 635–650 (1951).
- 88. Berdahl L., Bourassa M., Bell S., Fried J., Exploring perceptions of credible science among policy stakeholder groups: Results of focus group discussions about nuclear energy. Sci. Commun. 38, 382–406 (2016).
- 89. Siegrist M., The influence of trust and perceptions of risks and benefits on the acceptance of gene technology. Risk Anal. 20, 195–203 (2000).
- 90. Siegrist M., Connor M., Keller C., Trust, confidence, procedural fairness, outcome fairness, moral conviction, and the acceptance of GM field experiments. Risk Anal. 32, 1394–1403 (2012).
- 91. Sleeth-Keppler D., Perkowitz R., Speiser M., It’s a matter of trust: American judgments of the credibility of informal communicators on solutions to climate change. Environ. Commun. 11, 17–40 (2017).
- 92. Hardy B. W., Tallapragada M., Besley J. C., Yuan S., The effects of the “war on science” frame on scientists’ credibility. Sci. Commun. 41, 90–112 (2019).
- 93. Malka A., Krosnick J. A., Langer G., The association of knowledge with concern about global warming: Trusted information sources shape public thinking. Risk Anal. 29, 633–647 (2009).
- 94. McComas K. A., Trumbo C. W., Source credibility in environmental health-risk controversies: Application of Meyer’s credibility index. Risk Anal. 21, 467–480 (2001).
- 95. Renn O., Levine D., “Credibility and trust in risk communication” in Communicating Risks to the Public, Kasperson R. E., Stallen P. J. M., Eds. (Springer, 1991), pp. 175–217.
- 96. Roskos-Ewoldsen D. R., Fazio R. H., The accessibility of source likability as a determinant of persuasion. Pers. Soc. Psychol. Bull. 18, 19–25 (1992).
- 97. Stone V. A., Eswara H. S., The likability and self-interest of the source in attitude change. Journalism Mass Commun. Q. 46, 61–68 (1969).
- 98. Mills J., Aronson E., Opinion change as a function of the communicator’s attractiveness and desire to influence. J. Pers. Soc. Psychol. 1, 173–177 (1965).
- 99. Reinhard M.-A., Messner M., Sporer S. L., Explicit persuasive intent and its impact on success at persuasion: The determining roles of attractiveness and likeableness. J. Consum. Psychol. 16, 249–259 (2006).
- 100. Markiewicz D., Effects of humor on persuasion. Sociometry 37, 407–422 (1974).
- 101. Sternthal B., Craig C. S., Humor in advertising. J. Mark. 37, 12–18 (1973).
- 102. Weinberger M. G., Gulas C. S., The impact of humor in advertising: A review. J. Advert. 21, 35–59 (1992).
- 103. Bryant J., Comisky P. W., Crane J. S., Zillmann D., Relationship between college teachers’ use of humor in the classroom and students’ evaluations of their teachers. J. Educ. Psychol. 72, 511–519 (1980).
- 104. Zillmann D., Bryant J., Guidelines for the effective use of humor in children’s educational television programs. J. Child. Contemp. Soc. 20, 201–221 (1989).
- 105. Wilson C. P., Jokes: Form, Content, Use, and Function (Academic, 1979).
- 106. Graham E. E., Papa M. J., Brooks G. P., Functions of humor in conversation: Conceptualization and measurement. West. J. Commun. 56, 161–183 (1992).
- 107. Wanzer M. B., Booth-Butterfield M., Booth-Butterfield S., Are funny people popular? An examination of humor orientation, loneliness, and social attraction. Commun. Q. 44, 42–52 (1996).
- 108. Yeo S. K., Cacciatore M. A., Su L. Y.-F., McKasy M., O’Neill L., Following science on social media: The effects of humor and source likability. Public Underst. Sci., 10.1177/0963662520986942 (2021).
- 109. Yeo S. K., Anderson A. A., Becker A. B., Cacciatore M. A., Scientists as comedians: The effects of humor on perceptions of scientists and scientific messages. Public Underst. Sci. 29, 408–418 (2020).
- 110. Stemwedel J. D., “#overlyhonestmethods: Ethical implications when scientists joke with each other on public social media” in Ethical Issues in Science Communication: A Theory-Based Approach, Goodwin J., Dahlstrom M. F., Priest S., Eds. (Iowa State University Digital Press, 2013).
- 111. Simis-Wilkinson M., et al., Scientists joking on social media: An empirical analysis of #overlyhonestmethods. Sci. Commun. 40, 314–339 (2018).
- 112. Brehm S. S., Brehm J. W., Psychological Reactance: A Theory of Freedom and Control (Academic, 1981).
- 113. Moyer-Gusé E., Robinson M. J., Mcknight J., The role of humor in messaging about the MMR vaccine. J. Health Commun. 23, 514–522 (2018).
- 114. Anderson A. A., Becker A. B., Not just funny after all: Sarcasm as a catalyst for public engagement with climate change. Sci. Commun. 40, 524–540 (2018).
- 115. Becker A. B., Anderson A. A., Using humor to engage the public on climate change: The effect of exposure to one-sided vs. two-sided satire on message discounting, elaboration and counterarguing. JCOM 18, A07 (2019).
- 116. Skurka C., Niederdeppe J., Romero-Canyas R., Acup D., Pathways of influence in emotional appeals: Benefits and tradeoffs of using fear or humor to promote climate change-related intentions and risk perceptions. J. Commun. 68, 169–193 (2018).
- 117. Hardy B. W., Gottfried J. A., Winneg K. M., Jamieson K. H., Stephen Colbert’s civics lesson: How Colbert super PAC taught viewers about campaign finance. Mass Commun. Soc. 17, 329–353 (2014).
- 118. Baek Y. M., Wojcieszak M. E., Don’t expect too much! Learning from late-night comedy and knowledge item difficulty. Communic. Res. 36, 783–809 (2009).
- 119. Cao X., Political comedy shows and knowledge about primary campaigns: The moderating effects of age and education. Mass Commun. Soc. 11, 43–61 (2008).
- 120. Matthes J., Heiss R., Funny cats and politics: Do humorous context posts impede or foster the elaboration of news posts on social media? Communic. Res. 48, 100–124 (2019).
- 121. Hoffman L. H., Young D. G., Satire, punch lines, and the nightly news: Untangling media effects on political participation. Commun. Res. Rep. 28, 159–168 (2011).
- 122. Baumgartner J. C., Morris J. S., One “nation” under Stephen? The effects of The Colbert Report on American youth. J. Broadcast. Electron. Media 52, 622–643 (2008).
- 123. LaMarre H. L., Landreville K. D., Beam M. A., The irony of satire: Political ideology and the motivation to see what you want to see in The Colbert Report. Int. J. Press/Polit. 14, 212–231 (2009).
- 124. Young D. G., Late-night comedy and the salience of the candidates’ caricatured traits in the 2000 election. Mass Commun. Soc. 9, 339–366 (2006).
- 125. Baumgartner J. C., Morris J. S., Walth N. L., The Fey effect: Young adults, political humor, and perceptions of Sarah Palin in the 2008 Presidential election campaign. Public Opin. Q. 76, 95–104 (2012).
- 126. Esralew S., Young D. G., The influence of parodies on mental models: Exploring the Tina Fey–Sarah Palin phenomenon. Commun. Q. 60, 338–352 (2012).
- 127. Baumgartner J. C., Morris J. S., Coleman J. M., Did the “road to the White House run through” Letterman? Chris Christie, Letterman, and other-disparaging versus self-deprecating humor. J. Polit. Mark. 17, 282–300 (2018).
- 128. Becker A. B., Haller B. A., When political comedy turns personal: Humor types, audience evaluations, and attitudes. Howard J. Commun. 25, 34–55 (2014).
- 129. Dahlstrom M. F., Using narratives and storytelling to communicate science with nonexpert audiences. Proc. Natl. Acad. Sci. U.S.A. 111, 13614–13620 (2014).
- 130. Dahlstrom M. F., Ho S. S., Ethical considerations of using narrative to communicate science. Sci. Commun. 34, 592–617 (2012).
- 131. Priest S., Goodwin J., Dahlstrom M. F., Eds., Ethics and Practice in Science Communication (University of Chicago Press, 2018).
- 132. Wilkinson C., Ethics and practice in science communication. JCOM 17, R02 (2018).
- 133. Fisher R. J., Social desirability bias and the validity of indirect questioning. J. Consum. Res. 20, 303–315 (1993).