Abstract
From vaccination refusal to climate change denial, antiscience views are threatening humanity. When different individuals are provided with the same piece of scientific evidence, why do some accept whereas others dismiss it? Building on various emerging data and models that have explored the psychology of being antiscience, we specify four core bases of key principles driving antiscience attitudes. These principles are grounded in decades of research on attitudes, persuasion, social influence, social identity, and information processing. They apply across diverse domains of antiscience phenomena. Specifically, antiscience attitudes are more likely to emerge when a scientific message comes from sources perceived as lacking credibility; when the recipients embrace the social membership or identity of groups with antiscience attitudes; when the scientific message itself contradicts what recipients consider true, favorable, valuable, or moral; or when there is a mismatch between the delivery of the scientific message and the epistemic style of the recipient. Politics triggers or amplifies many principles across all four bases, making it a particularly potent force in antiscience attitudes. Guided by the key principles, we describe evidence-based counteractive strategies for increasing public acceptance of science.
Keywords: antiscience, attitudes, social identity, politics, science communication
From refusing to get vaccinated against COVID-19 (1) to ignoring worsening climate change (2), rejection of scientific information is costing lives now and will continue to do so in the future. One need only look at recent polling data to find concerning cases of public rejection of scientific evidence or denial of solutions with high levels of consensus among scientists. For example, a September 2021 poll found that only 61% of Americans saw COVID-19 as a major public health threat (3). Another recent poll found that 40% of Americans do not think climate change is a major threat (4). Additional examples abound around the world (5).
Dismissal of scientific evidence is not a new phenomenon, however. When germ theory was proposed in the 19th century, an anticontagionist movement rejected the notion that disease could be spread through minuscule germs. Earlier scientific discoveries, such as the heliocentric nature of the solar system, were met with heavy opposition. But why? From early to contemporary examples, what are the psychological principles that account for people’s antiscience views? That is, when different individuals are provided with the same piece of scientific evidence, why do some go on to accept and integrate it as a fact, whereas others dismiss it as invalid or irrelevant?
Numerous scholars have pondered the antecedents of antiscience views. Existing models have identified factors that predict wariness of specific scientific innovations or theories (6, 7) or antiscience attitudes overall [e.g., the attitude roots and jiu jitsu models (8)]. These and other models noted throughout our article offer important insights. But one theoretical paradigm that has been largely ignored in the antiscience literature, despite its substantive relevance, is the classic perspective on attitudes and persuasion (9). This is surprising, because antiscience views represent a crisis of attitudes due to both effective persuasion by antiscience sources and ineffective persuasion by scientific or “proscience” sources. This is also a missed opportunity, because classic work on persuasion has highlighted a number of explanatory processes and remediative strategies, many of which are highly applicable to the problem of antiscience attitudes. The goal of our article is to make these connections explicit and constructive. We do so by connecting contemporary findings and models in the antiscience literature to key principles from decades of research on attitudes, persuasion, social influence, social identity, and acceptance versus rejection of information writ large. Drawing these connections confers the dual scientific benefits of organizing our understanding of antiscience phenomena and delineating how classic components of persuasive processes formulated in the 20th century are impacted by new forms of social dynamics in the 21st century (e.g., vast and fast social network effects in the spreading of misinformation on social media).
Why Are People Antiscience? An Inclusive Framework
Distinct clusters of basic mental processes can explain when and why people ignore, trivialize, deny, reject, or even hate scientific information—a variety of responses that might collectively be labeled as “being antiscience.” To organize these processes, we offer an inclusive framework that specifies four core bases of antiscience attitudes (Table 1, first column). In essence, people are prone to rejecting a scientific message when it comes from a source they do not find credible (basis 1), when they, as the recipient of the scientific message, identify with social groups that hold antiscience attitudes (basis 2), when the scientific message itself contradicts their related beliefs or attitudes (basis 3), or when it is delivered in ways that mismatch their motivational and cognitive approaches to information processing (basis 4). Each of these bases involves specific antecedents or predictors and elicits different nuances of psychological reaction (Table 1, second column); they also point to different counteractive strategies (Table 1, third column). Despite their differences in focus, the four bases are unified in revealing ways in which scientific information conflicts with people’s existing content or style of thought. Such conflicts are hard to swallow and easy to disavow, rendering effective communication of scientific information a thorny problem—but one that becomes more surmountable once its underlying bases are elucidated.
Table 1.
Key principles driving antiscience attitudes and counteractive strategies for addressing them
| Basis of key principles | Key principles driving antiscience attitudes | Counteractive strategies for addressing the key principles |
| --- | --- | --- |
| Basis 1. Source of the scientific message | When sources of scientific information (e.g., scientists) are perceived as 1) inexpert, 2) untrustworthy, or 3) biased, they lack credibility, and their messages are ignored or rejected. | 1, i) Improving perceived and actual validity of scientists’ work; 1, ii) Legitimizing substantive scientific debate; 2) Conveying warmth and prosocial goals in science communication and using accessible language; 3) Conveying that the source is not antagonistic to the recipient, such as by providing two-sided messages that clearly state the side for which there is stronger evidence |
| Basis 2. Recipient of the scientific message | When scientific information activates one’s social identity as a member of a group 1) that holds antiscience attitudes or 2) that has been underrepresented in science or exploited in scientific work, it triggers ingroup favoritism and outgroup antipathy. | 1) Activation of shared or superordinate identity; 2) Engaging and collaborating with marginalized communities |
| Basis 3. The scientific message itself | When scientific information contradicts what people 1) believe to be true, 2) evaluate as favorable, or 3) moralize, they experience cognitive dissonance, which is more easily resolved by rejecting the scientific information than by changing existing beliefs, attitudes, or values. | 1, i) Training in scientific reasoning; 1, ii) Prebunking; 2, i) Strong arguments; 2, ii) Self-affirmation; 3, i) Moral reframing; 3, ii) Increasing the perceived naturalness and moral purity of scientific innovations |
| Basis 4. Mismatch between the delivery of the scientific message and the recipient’s epistemic style | When scientific information is delivered in ways that mismatch one’s 1) construal level, 2) regulatory focus, 3) need for closure, or 4) need for cognition, it tends to be rejected. | 1–4) Matching the delivery of scientific information with the recipient’s epistemic style (e.g., framing messages as approaching gains for promotion-focused recipients but as avoiding losses for prevention-focused recipients) |
In the following sections, we introduce each basis of antiscience attitudes by highlighting key principles, identifying relevant models, and reviewing illustrative findings and real-world examples, from heavily studied domains like vaccination to less studied ones like nanotechnology. Next, through the conceptual lens of the four bases, we explain why politics has particularly potent effects on antiscience attitudes. Afterward, we present a variety of counteractive strategies for increasing acceptance of science by targeting the four bases. Finally, we conclude with theoretical contributions of our framework.
Basis 1: Source of the Scientific Message.
Lay people do not discover facts about reality in isolation, devoid of external inputs. Instead, they rely on sources of scientific information—scientists, or, more frequently for most people, journalists, health officials, politicians, or key opinion leaders—to construct their understanding of the world. In general, the more credible a source is perceived to be, the more likely people are to accept its information and be persuaded by it. Unfortunately, many people perceive scientists, who are supposed to be the original source of scientific information, as lacking credibility (10). Why?
Source credibility is composed of three pillars: expertise (i.e., possessing specialized skills and knowledge), trustworthiness (i.e., being honest), and objectivity (i.e., having unbiased perspectives on reality) (11). All three are necessary. When scientists (or anyone conveying scientific information) are perceived as inexpert, untrustworthy, or biased, their credibility is tainted, and they lose effectiveness at conveying scientific information and changing opinions.
Although scientists are generally perceived as high in competence and expertise (12), this perception is facing mounting challenges. Concerns about the truth value and robustness of scientific findings in multiple fields, from medical to social sciences (13, 14), have received media attention (15). Lay perception of scientists’ credibility can even be undermined by features central to the very mission of science: Legitimate debates happen within scientific fields, with different scientists championing different, sometimes contradictory, perspectives, theories, hypotheses, findings, and recommendations. [As a current example at the time of our writing, scientists differ in their recommendations about whether and when to roll out the second booster shot for COVID-19 (16).] In principle, these can be signs of a healthy scientific ecosystem. In practice, contradictions between scientists, especially against the backdrop of replicability concerns, threaten lay perceptions of scientists’ credibility (17).
Scientists’ trustworthiness is also threatened by multiple social forces. Distrust of elites (i.e., those with societal influence) is on the rise (18), and scientists whose voices are broadcast in the public sphere are often employed by elite media and institutions. Distrust of government organizations is on the rise too (19), which predicts distrust in scientists who recommend innovations that would require greater governmental regulation (20). Furthermore, scientists have been stereotyped as cold and unfeeling in character (12, 21), which undermines the public’s willingness to trust them (21).
Scientists’ objectivity has also been called into question. Scientists in certain fields are portrayed and perceived as exhibiting biased perspectives against Christian (22) and conservative (23) values. Indeed, many religious individuals reject science, in part, due to the perception that scientists are atheistic (24). More generally, when scientists are thought to have a vested interest (e.g., monetary incentives) in persuading their audience, they are perceived as both biased and untrustworthy (25). During the COVID-19 pandemic, widespread misinformation characterized prominent public health officials as promoting the vaccine because of their financial investment in various pharmaceutical companies (26). In short, scientists can be perceived as inexpert, untrustworthy, or biased, which threatens their credibility in the public eye.
Basis 2: Recipient of the Scientific Message.
People vary in how interested and willing they are to listen to different types of information (27, 28). A powerful force that shapes the types of information individuals expose themselves to or actively seek out is their social identities. Substantial research on social identity theory has found that the social groups to which individuals belong or feel a connection exert strong influences on their response to information perceived to be identity relevant (29). For example, young adults are more likely to seek out positive (vs. negative) information about young adults (their ingroup), and older individuals are more likely to seek out negative information about young adults (their outgroup) (30).
Social identities play a role in antiscience attitudes and behaviors. Those who have been underrepresented in science or who have historically been exploited in scientific experiments [e.g., Black and Indigenous individuals (31)] are more skeptical of science (32). In addition to demographic groups, people can identify with interest groups that shape antiscience attitudes. For example, those who strongly identify as video gamers are more likely to reject scientific evidence regarding the harms of playing video games (33). These findings are broadly consistent with research and models in science communication that describe how people tend to reject scientific information incompatible with their identities. Work on cultural cognition has highlighted how people contort scientific findings to fit with values that matter to their cultural identities (34, 35). Relatedly, work on identity-protective cognition shows that people selectively dismiss scientifically determined risk assessments that threaten their identity (36), as when White men are more likely than other racial and gender groups to dismiss data regarding the riskiness of guns, because guns are a more integral part of their cultural identity (37).
Beyond the effects of identifying with specific demographic or cultural groups that can conflict with specific scientific findings, some individuals identify with groups that altogether ignore and shut down scientific thought, recommendations, and evidence, in general (38, 39). This sort of identity is often tied to other personally meaningful identities, particularly, political ones [and religious ones (39)], a theme we elaborate on shortly. An important nuance and caveat, however, is that, although scientists might characterize some social groups as antiscience, the individuals who identify with these groups might not think of themselves as explicitly or consciously disavowing science. They might even think of themselves as proscience, in that they believe their own views are more scientifically sound than those of mainstream scientists (40). In what sense, then, are they antiscience? In the sense that, if they reject the preponderance of scientific evidence and instead favor positions with scant or pseudoscientific support, then they are de facto acting in opposition to how science works—they are against the scientific approach to knowledge creation and the knowledge created by it.
In addition to being against scientific information, individuals can be against the people providing or promoting the scientific information. This is, unfortunately, a common aspect of social identity, namely, antipathy toward those who do not share that identity and are thus part of the outgroup (41). For example, those who identify as climate change skeptics harbor hostile feelings toward climate change believers (42). For individuals who embrace an identity associated with antiscience attitudes, scientists are members of the outgroup. People tend to reject what outgroup members have to say, sometimes to the point of violence, which can arise even in the absence of substantive reasons for rejecting the outgroup member’s message other than that it comes from the outgroup (43). These forces of social identity reflect why many individuals who strongly identify with antiscience groups seem to vehemently reject scientific messages and frequently approach scientists with hostility, even threatening their lives (44).
Similar dynamics are evident in the marked rise in conspiracy theories related to COVID-19 (e.g., the pandemic was a hoax, or the vaccines contained microchips). These conspiracy theories often coalesce around particular social groups and are most vehemently promoted by those who feel highly identified with their pseudoscientific community (45). In recent years, conspiracy theories have led to highly visible behavior such as antimask and antivaccine protests. Due to social media, antiscience groups can now mobilize activists and followers more swiftly than in previous eras. Beyond the context of COVID-19, social groups that reject mainstream science have emerged surrounding unvalidated treatments for Lyme disease (46) and opposition to getting oneself or one’s children immunized in general (47).
Basis 3: The Scientific Message Itself.
People do not always think and behave in line with what science suggests. One reason is that they are unaware of the scientific evidence [i.e., the deficit model (48)]. Sometimes, when people simply learn about the scientific consensus, their thoughts and feelings follow suit [i.e., the gateway belief model (49)]. Other times, however, when scientific information contradicts people’s existing beliefs about what is factually true, they can reject even the strongest scientific evidence, because harboring conflicting cognitions is aversive. This phenomenon is known as cognitive dissonance (50), which arises when a person is exposed to information that conflicts with their existing beliefs, attitudes, or behaviors. Dissonance elicits discomfort. Given this aversive feeling, people are motivated to resolve the contradiction and eliminate the discomfort in a number of ways, such as rejecting the new information, trivializing the topic, rationalizing that there is no contradiction, or revising their existing thought (51).
Critically, people tend to resolve dissonance using the path of least resistance. To a person who has been smoking their entire life, it is far easier to reject or trivialize scientific evidence about the health risks of smoking than to alter their ingrained habit. With dissonance, the intransigence of existing beliefs resembles the stickiness of existing behaviors: It is easier to reject a piece of scientific information than to revise an entire system of existing beliefs one has accumulated and integrated into a worldview over the years, often reinforced by social consensus. One’s existing beliefs can be based on valid scientific information, previously accepted but now outdated scientific information, or scientific misinformation. As an example of dissonance arising from believing outdated scientific information, for thousands of years, it was a widespread belief that Earth was the center of the universe and that the sun orbited Earth (52). To a person who had always believed the sun revolved around Earth, it was far easier to reject the notion of Copernican heliocentrism than to overhaul the geocentric model of the universe, which was previously accepted and felt subjectively coherent enough, and thus in no obvious need for revision.
In addition to rejecting new information from scientific progress and updates, individuals might possess beliefs that contradict scientific evidence due to the spread of misinformation. The last few years have witnessed a proliferation of fake news (53), catalyzed by social media, which facilitates the rapid spread of information regardless of whether it is true. Sadly, fake news spreads “significantly farther, faster, deeper, and more broadly” than true news on social media platforms, because fake news stories often evoke stronger emotional reactions and come across as more novel than true ones, which are attributes that increase sharing behavior (54). Although some individuals might be sharing misinformation merely because of inattention to veracity (not because of endorsement of content) (55), extensive sharing of fake news among one’s ingroup makes it likely to be accepted, due to the dynamics of social identity outlined earlier, which can result in rapid acceptance of pseudoscientific or antiscientific beliefs.
Once misinformation has spread, it is difficult to correct (56), and there is often a continued influence of the misinformation even after it has been retracted. Corrections issued by media sources are typically ineffective at reducing belief in the misinformation. In fact, corrections can sometimes reinforce the belief by making it more salient (56). Unfortunately, misinformation on many scientific topics has been widely disseminated, such as exaggerated and unfounded risks of vaccines (including pre-COVID times), denial of climate change, and dismissal of evidence for evolution (57).
Scientific misinformation is especially difficult to correct when it provides a causal explanation for a phenomenon. Correcting the misinformation would leave a gap in people’s mental model of why an event or a situation has occurred (58) and would cause discomfort (59). People often refill that gap with misinformation to make sense of the issue at hand. Circling back to the example of heliocentrism, telling a geocentricist that Earth is actually not the center of the universe would leave a gap in their mental model of why the sun clearly appears to be revolving around Earth, a gap that is easy to refill by reaffirming their existing causal belief. Similar cognitive dynamics have long been observed in pseudoscience (60) and continue to result in rejection of scientific information today.
Not only do people possess beliefs about whether things are true or false, they also evaluate things as desirable or undesirable (attitudes) (9), important or unimportant (values) (61), and right or wrong (morals) (62). Some moral views are at odds with particular kinds of scientific information, resulting in morally fueled rejection. For example, people who endorse the moral significance of naturalness and purity are prone to resisting scientific technologies and innovations seen as tampering with nature. Vaccines (63) and genetically modified food (64), despite their documented benefits, are often rejected due to perceptions that they are unnatural. This cluster of moral intuitions about naturalness and purity is highly related to individual differences in aversion to “playing God,” an aversion that predicts lower willingness to fund the NSF and less monetary donation to organizations supporting novel scientific procedures (65).
Attitudes rooted in one’s notions of right and wrong (e.g., not eating meat as a moral issue rather than as a taste preference) are particularly strong (66) and tend to be more extreme, persistent over time, resistant to change, and predictive of behavior (67). For example, people with moralized attitudes toward recycling are more resistant to counterattitudinal information regarding the efficacy of recycling (68). To resolve dissonance from conflicting information, rejecting the novel scientific information is often the path of lesser resistance than revising one’s existing moralized attitudes. Likewise, when misinformation is consistent with one’s existing attitudes, it is difficult to correct (69). To people who love driving high-horsepower but gas-guzzling vehicles, misinformation such as “climate change is a hoax” would be attitude consistent, whereas scientific correction of this misinformation would be attitude inconsistent and thus prone to rejection.
Basis 4: Mismatch between the Delivery of the Scientific Message and the Recipient’s Epistemic Style.
Even when scientific information does not conflict with an individual’s beliefs or attitudes, it can still be rejected for reasons beyond the content of the message. In particular, when scientific information is delivered in ways that are at odds with a person’s style of thinking about the topic at hand or their general approach to information processing, it is less likely to be processed and more likely to be rejected (70).
For example, when people construe an issue in abstract/high-level (vs. concrete/low-level) terms, concrete (vs. abstract) scientific information about the issue mismatches their construal level and tends to be rejected. People typically construe the issue of climate change in abstract/high-level terms (e.g., global environmental degradation), because the consequences of climate change are seen as psychologically distant (71), and distance promotes abstract construal (72). Thus, when ecofriendly products are described in concrete/low-level terms (e.g., fine details about the product’s carbon savings), despite making a compelling case, they tend to be rejected (71). Evaluation and choice of sustainable products are also undermined when the products are described in concrete terms of self-interested economic savings to consumers who think abstractly about sustainability (73).
Even holding the level of abstractness/concreteness constant, scientific information can be presented in a gain frame or a loss frame. Describing a vaccine as 90% effective (gain frame) is technically equivalent to describing it as 10% ineffective (loss frame), but with dissimilar psychological effects, because the frame can be at odds with people’s regulatory focus (74). Promotion focus orients people to eagerly attaining gains; prevention focus orients people to cautiously preventing losses. When scientific information is framed as promoting gains (vs. preventing losses), it tends to be rejected by people who are prevention focused (vs. promotion focused) (74). Such mismatch effects have been found to result in rejection of climate change (75) and health messages (e.g., vaccination and smoking cessation) (76).
Framing of scientific information also varies in how certain and decisive it seems. Even when there is a high degree of scientific consensus, scientific information is often disseminated in terms that signal uncertainty. Such terminology, while technically accurate, leads people with high need for closure (i.e., low tolerance of epistemic uncertainty) (77) to reject it. For example, when people receive mixed scientific information about vaccines, those with high need for closure are particularly likely to become entrenched in their existing views and reject the mixed information (78). More generally, people with high need for closure are more likely to reject novel information that challenges their currently held conclusions or assumptions (77). This poses a challenge for scientists, who are trained to hedge their findings and avoid overclaiming certainty, as they try to communicate the preliminary, inconclusive, nuanced, or evolving nature of scientific evidence.
Finally, scientific information varies in its quality. Intuitively, high-quality arguments are more persuasive than low-quality ones (79). But this is often not true for people with low need for cognition (i.e., people who do not enjoy thinking), for whom low-quality arguments can be just as persuasive as high-quality ones if positive peripheral cues (e.g., a likable source) are present (80). Therefore, while good-quality scientific evidence is, overall, more likely to be accepted than bad-quality evidence (81), people who do not enjoy thinking are less likely to appreciate such quality distinctions. They are less likely to process complex information, as comprehending it requires active thinking (79). They are also less likely to choose to read nuanced science blog posts (82) and less likely to accept evidence for climate change and evolution (83).
Construal level, regulatory focus, need for closure, and need for cognition are different dimensions of epistemic style. On any of these dimensions, a mismatch between how scientific information is delivered and how the recipient thinks will increase the probability of rejection. More generally, mismatches between message delivery and epistemic style (basis 4), content conflicts (basis 3), social identity (basis 2), and sources lacking in credibility (basis 1) all contribute to antiscience attitudes. They also point to why politics is a particularly potent driver of these attitudes.
How Politics Drives Antiscience Attitudes.
Acceptance of scientific information is now sharply divided along political lines, with individuals in different camps holding, even enshrining, vastly different views (84). Conservatives are more likely than liberals to reject scientific evidence supporting evolution (85) and the existence of anthropogenic climate change (86), and have lower intentions to get vaccinated against COVID-19 (87). Although liberals, overall, are more inclined to accept scientific evidence (86–88), there are specific topics about which they are more likely to be skeptical, such as the risk of nanotechnology (35). How do we make sense of these political divides?
The literature on antiscience attitudes has found that rejection of scientific information by members of different political camps is often based on motivational factors (89). Building on these insights, we argue that politics can trigger or amplify basic mental processes across all four bases of antiscience attitudes, thereby making it a particularly potent force. Because the mental processes are not mutually exclusive, many of the political influences described below are likely to occur in conjunction with each other.
Politics impacts people’s perception of scientists’ credibility (the source) via perceived expertise and trustworthiness (90). In general, people see others with similar political views as more expert and knowledgeable. Both liberals and conservatives are less trusting of scientists whose work contradicts their ideological viewpoint (91), and recent exposure to such contradictory information reduces trust in the entire scientific community (92). Because liberals and conservatives find different sources credible (e.g., CNN vs. Fox News), they expose themselves to different scientific information (93) and misinformation (94), often reinforced by cues from trusted political elites (95), further entrenching them in siloed networks. In the era of social media and algorithmically customized news feeds, even what appears to be the same source (e.g., Facebook) can provide highly varied information to different users (96), exacerbating the division of communities along political lines.
For many, politics is more than just a set of beliefs or ideals; it is a core part of their identity (97), which can have a large impact on how they, as recipients, react to different pieces of scientific evidence, policy proposals, and legislation. Those who identify strongly as a Democrat or a Republican tend to show different responses to various pieces of scientific information, with each group rejecting proposals purportedly made by the outgroup, even when doing so goes against their own best interests. For example, when carbon taxes are framed as being a predominantly Republican (vs. Democrat) policy, those who identify as Democrat (vs. Republican) are more likely to oppose the policy (96). This opposition to anything proposed by the outgroup is mediated by the perception that the outgroup is a threat to society (99), and threats reliably trigger outgroup antipathy (100). Such antipathy is prevalent in the political sectarianism of our time (101), which leads many individuals to selectively expose themselves to congenial scientific information (28).
Indeed, people have a strong tendency to seek out information (the message) that reinforces their existing beliefs (93), a phenomenon intensified by online platforms, which heighten the speed and scope of exposure to information and misinformation in homogeneous and polarized echo chambers (102). Much of the misinformation online is politically charged, covering diverse topics from elections to climate change (57). Research on values-based messaging has found that, when a political message evokes values discordant with people’s existing values, it tends to be rejected (103). Accordingly, when scientific information contradicts people’s beliefs shaped by political forces, it tends to be rejected outright as simply untrue, a tendency exhibited by both liberals and conservatives (104). Worse still, the more extreme or morally charged people’s political views, the stronger their sense of belief superiority, regardless of accuracy (105), further amplifying the rejection of belief-contradictory scientific information.
Alongside content differences (the types of messages liberals and conservatives seek out and accept), liberals and conservatives also differ in how they approach information (epistemic styles). Conservatives are, on average, more prevention focused, and liberals are more promotion focused (106). According to this logic, conservatives would be more likely to reject scientific information framed as approaching gains, and liberals would be more likely to reject scientific information framed as avoiding losses. Conservatives also have a stronger need for closure (107), which is linked to stronger beliefs in a variety of conspiracy theories with no scientific basis (108).
Altogether, politics is a particularly potent force in rejection of scientific information because it strikes all four bases of antiscience attitudes, at times amplifying them. Acute increases in political partisanship and sectarianism (101) in recent years have only accentuated the potency and toxicity of such political influences.
What Can We Do About Antiscience Attitudes?
By specifying the key principles underlying antiscience attitudes, our framework suggests counteractive strategies for increasing acceptance of scientific information by targeting each of the four bases (Table 1, third column). Obviously, no single strategy is perfect or universal, and the current era is replete with unique challenges, such as the spread of misinformation on social media, but specific strategies can still be effective in their respective contexts, for specific goals. We outline a number of these strategies briefly.
Targeting Basis 1: Increasing Perception of Scientific Information Sources as Credible.
Scientists lack credibility when they are perceived as inexpert, untrustworthy, or biased. To tackle emerging concerns about the quality of scientists’ work and their perceived expertise, trustworthiness, and objectivity, scientists need to improve the validity of their research (109) and establish the replicability and reproducibility of their findings. Scientists also need to communicate to the public that substantive debate and disagreement are inherent to the scientific process and signal a healthy scientific landscape, a point often missed by lay people who expect a given scientific finding to be absolute (17). To maximize effectiveness, scientists and science organizations need to recruit journalists, health officials, politicians, or key opinion leaders to join these communicative efforts, as they are often the sources conveying scientific information directly to the public or the sources that the public already trusts.
To reduce distrust in scientists due to their perceived coldness (12), when scientists communicate their findings and recommendations, they should ameliorate the unfavorable impressions by intentionally conveying interpersonal warmth and highlighting the communal nature of science, a tactic that has proven effective for a different but related goal—recruiting girls and women into STEM training programs and careers (12). Another strategy that is related to but distinct from conveying warmth is for scientists to communicate that they are pursuing prosocial goals in their work. When people perceive scientists as prosocial, they have greater trust in science (110).
Scientists also often use excessively complex language when communicating science to the general public (111). To mitigate the negative perception from jargon-laden wording that conceals the meaning of the science from lay people, scientists should use language that conveys their message clearly and precisely while still being accessible to a general audience. One specific suggestion in this vein, which most journals have yet to adopt, is for published articles to include “lay summaries” along with the more jargon-laden abstracts, so that interested lay people can better glean the information in terms that they can understand (112).
To reduce perceived bias, scientists should attempt to communicate in a balanced manner whenever possible. When communicators offer a nuanced, multifaceted perspective, especially if they change positions in the face of new evidence, they are perceived as less biased and more persuasive (113). When a communicator, especially one of high status, expresses openness to alternative views, this can increase openness among committed recipients (114). For example, those who saw the issue of wearing masks in the COVID-19 pandemic as a moral impingement on their rights were more open to wearing masks when a communicator acknowledged the recipient’s view but explained why the promask position was preferable (115). Importantly, we are not suggesting that communicators adopt a position of false neutrality or “both sidesism.” Instead, we are suggesting that they honestly acknowledge any drawbacks of their position while ultimately explaining in clear and compelling terms why their position is still the most supported or more justifiable one.
Targeting Basis 2: Decreasing Recipients’ Identification with Antiscience Groups.
To reduce the salience or strength of recipients’ identification with groups that embrace antiscience views, science communicators should invoke meaningful and important shared social identities between themselves and the recipients of scientific messages (116). For groups in conflict, finding a common or superordinate identity often helps the two groups minimize their conflict and approach intergroup harmony (117). If those viewing scientists as outgroup members can see themselves as sharing a common identity with scientists, antiscience sentiment and the derogation of scientists can be reduced. For example, when scientists offer their recycled water policy suggestions to a hostile audience, finding common ground via a superordinate identity successfully increases audience receptivity (118). One way to legitimately claim a shared identity between scientists and antiscience community members is by bringing together different stakeholders to form one group (e.g., a committee) that is working toward shared goals, while still preserving the original subgroups within the superordinate identity (98).
Science communicators should also seek to earn the trust of groups that have been historically exploited or excluded by the science community (119, 120). This can be done by directly engaging with the target groups in the process of conducting the research (121). For example, rather than treating racialized or historically underrepresented groups as the objects of study, scientists can collaborate with members of these communities and build cultural competencies (122). Scientific funding agencies’ requirement of active Indigenous participation in any research that might impact or involve Indigenous communities (123) offers another step toward reconciliation. Programs that train marginalized individuals to be the scientists working within their own communities also help to earn trust from racialized communities, as when a program that trains Indigenous genome researchers increases trust in science (124). Many of these efforts are still rather nascent, however, and, unlike the other counteractive strategies outlined in our article, their efficacy has not yet been rigorously assessed. We encourage proper quantitative assessment of these efforts’ effectiveness. If useful, they can be scaled up to help rebuild or strengthen the rapport between scientists and diverse communities.
Targeting Basis 3: Increasing Acceptance of Scientific Information Even When It Contradicts One’s Beliefs and Attitudes.
To tackle rejection of scientific information that contradicts an audience’s beliefs, prevention is better than cure: Whenever possible, minimize the formation of ill-informed beliefs in the first place. One preventive strategy is to train people in scientific reasoning (i.e., the ability to evaluate the quality of scientific information). People equipped with scientific reasoning skills are more likely to accept high-quality scientific evidence (84). This strategy is especially apt for combatting the rise of fake news [which is another major problem that requires societal-level changes in digital infrastructure (125)]. Arming media consumers with the skills to differentiate between true and false scientific information leads them to become more discerning regarding which beliefs to adopt (125). Critically, this strategy pertains to conveying the correct scientific information prior to any misinformation being adopted. An additional caveat is that, although encouraging critical reasoning decreases belief in scientific misinformation, simply telling people that they should trust science more can actually increase belief in and dissemination of misinformation framed as being scientific (compared with misinformation not framed as being scientific) (126).
Related to the broader notion of training in scientific reasoning, a specific strategy is called prebunking. Derived from the logic of disease inoculation (127), it involves forewarning people that they will be receiving misinformation, then giving them a small dose of misinformation (the “vaccine”) and refuting it so that they will be better able to resist misinformation when they encounter it in the wild (the “disease”). A field experiment among older adults found this strategy effective for minimizing the impact of disinformation on people’s intention to receive a COVID-19 vaccine (128).
Another preventive strategy, which sounds intuitive but turns out to be ineffective for enhancing acceptance of scientific information, is increasing a population’s general scientific literacy. Unlike specialized scientific knowledge, general scientific literacy does not involve a deep dive into why a scientific phenomenon occurs (89). Unlike scientific reasoning skills, general scientific literacy does not teach people how to parse scientific information (84). Instead, it merely entails imparting an unelaborated list of scientific information (89). Why is it ineffective for enhancing acceptance of scientific information? Because people with more scientific literacy are simply more sophisticated at bolstering their existing beliefs by cherry-picking ideas and information to defend their worldview (84). Higher levels of scientific literacy, instead of leading people to coalesce around scientific truths, can increase polarization of beliefs (84). Similarly, greater cognitive sophistication (e.g., stronger analytic thinking) does not necessarily reduce antiscience views, as the most cognitively sophisticated and educated people can also be the most polarized (129), although the evidence for and interpretation of this pattern have been subject to debate (130).
When preventive strategies are implausible, curative ones are necessary. Simply learning information is often uncorrelated with attitude change (48, 131). What matters more than whether people learn or remember the information they have been told is how they react to that information. If people have positive reactions to a message, they are more likely to change their attitudes to be in line with that message (132). By implication, merely informing the public of scientific information is insufficient; one must also persuade them. Strong, well-reasoned, and well-substantiated arguments, implemented by skilled science communicators, have been found effective for altering even entrenched attitudes, such as toward climate change (133) and the safety of electronic health records (134).
But, for the particularly intransigent, additional strategies should be utilized to supplement persuasive arguments. As noted earlier, a fundamental mechanism that leads people to reject scientific information contradictory to their beliefs is cognitive dissonance. This aversive state has been found to be reduced by a procedure called self-affirmation, which involves prompting people to conjure and affirm values that matter to them (e.g., caring for one’s family) in ways unrelated to the cognitive conflict at hand (135). Why does self-affirmation reduce dissonance? Because it increases one’s sense of self-integrity and security, which reduces the threatening effect of dissonance to the self. Self-affirmation interventions have been used successfully to reduce defensiveness and increase acceptance of scientific information regarding health behaviors (136) and climate change (137).
Sometimes, scientific messages conflict not only with a person’s beliefs and attitudes but also with their particular moral concerns. To manage this, an effective strategy is to identify the specific morals the recipient endorses and reframe the scientific message to accord with them. Conservatives, who endorse the moral foundation of ingroup loyalty, are more persuaded by messages about climate change framed as a matter of loyalty to one’s country. Liberals, who endorse the moral foundation of intentional care, are more persuaded by messages about climate change framed as a matter of care for innocent creatures (138). Moral reframing has also been found effective for minimizing morally based opposition to vaccines and stem cell technology (138). Similarly, for recipients who think about public health in more (vs. less) moral terms, messages that use moral arguments such as engaging in physical distancing during the COVID-19 pandemic to benefit others (vs. oneself) are more persuasive (139).
To increase acceptance of scientific evidence among those who have strong moral intuitions about naturalness/purity, science communicators can specifically reframe scientific innovations as being in harmony with nature. For example, increasing the perceived naturalness of geoengineering has been found to increase people’s acceptance of it as a strategy to combat climate change (140). Overall, these findings suggest that science communicators can create multiple moral frames when communicating their scientific information to distinct audiences (e.g., liberals vs. conservatives, religious vs. nonreligious) who are likely to have different moral intuitions or views.
Targeting Basis 4: Matching the Delivery of the Scientific Message with the Recipient’s Epistemic Style.
People tend to reject scientific information when it is delivered in ways that mismatch their epistemic styles. This basic principle has theoretically straightforward implications for what counteractive strategies to use: Identify the recipient’s style, and match it. To implement a matching strategy, regional demographic data (e.g., on political leanings) can aid in developing psychographically targeted communications at the aggregate level. Given the vast amounts of fine-grained, person-specific data that various technology companies collect on people’s online activity (if they have not opted out), targeting may even be done at the individual level, which has been found effective for changing behavior (141). Consumer researchers have long been segmenting and targeting consumers based on rich psychographic and behavioral data. Other public interest groups could adopt similar strategies and use the logic of targeted advertising to more precisely position their scientific communications with different audiences in mind. The essence of this strategy is to craft different messages or different delivery approaches for different audiences. For recipients who think abstractly (vs. concretely), scientific messages delivered in an abstract (vs. concrete) manner increase their acceptance of the scientific information as true (142). For recipients who are promotion focused (vs. prevention focused), messages about health behavior framed as approaching gains (vs. avoiding losses) are better accepted (76), and so forth, as explained earlier.
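To make the matching logic concrete, message selection can be sketched as a simple lookup from a recipient’s epistemic profile to an appropriately framed variant of the same scientific content. The sketch below is a hypothetical illustration only: the recipient attributes, message wordings, and function names are assumptions introduced here to show the decision rule, not materials drawn from any of the studies cited above.

```python
from dataclasses import dataclass


@dataclass
class Recipient:
    """Hypothetical epistemic profile; values are illustrative, not validated measures."""
    construal: str         # "abstract" or "concrete"
    regulatory_focus: str  # "promotion" or "prevention"


# Illustrative variants of one scientific claim (vaccine efficacy), crossed by
# construal level (abstract vs. concrete) and framing (gain vs. loss avoidance).
MESSAGES = {
    ("abstract", "promotion"): "Vaccination protects community health and helps society gain a faster return to normal life.",
    ("abstract", "prevention"): "Vaccination protects community health and helps society avoid further waves of illness.",
    ("concrete", "promotion"): "This vaccine is about 90% effective: roughly 9 in 10 vaccinated people gain protection.",
    ("concrete", "prevention"): "This vaccine prevents illness in roughly 9 of 10 vaccinated people, avoiding most severe cases.",
}


def select_message(recipient: Recipient) -> str:
    """Return the variant whose construal level and gain/loss framing match the recipient."""
    return MESSAGES[(recipient.construal, recipient.regulatory_focus)]


# Example: a prevention-focused recipient who construes the issue concretely
# receives the concrete, loss-avoidance framing.
print(select_message(Recipient(construal="concrete", regulatory_focus="prevention")))
```

In practice, the profile would be estimated from the kinds of aggregate or person-specific psychographic data described above, and each message variant would still need to be accurate and pretested for argument quality.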
Concluding Remarks
By offering an inclusive framework of key principles underlying antiscience attitudes, we aim to advance theory and research on several fronts: Our framework highlights basic principles applicable to antiscience phenomena across multiple domains of science. It predicts situational and personal variables (e.g., moralization, attitude strength, and need for closure) that amplify people’s likelihood and intensity of being antiscience. It unpacks why politics is such a potent force with multiple aspects of influence on antiscience attitudes. And it suggests a range of counteractive strategies that target each of the four bases. Beyond explaining, predicting, and addressing antiscience views, our framework raises unresolved questions for future research (SI Appendix).
With the prevalence of antiscience attitudes, scientists and science communicators face strong headwinds in gaining and sustaining public trust and in conveying scientific information in ways that will be accepted and integrated into public understanding. It is a multifaceted problem that ranges from erosions in the credibility of scientists to conflicts with the identities, beliefs, attitudes, values, morals, and epistemic styles of different portions of the population, exacerbated by the toxic ecosystem of the politics of our time. Scientific information can be difficult to swallow, and many individuals would sooner reject the evidence than accept information that suggests they might have been wrong. This inclination is wholly understandable, and scientists should be poised to empathize. After all, we are in the business of being proven wrong, but that must not stop us from helping people get things right.
Supplementary Material
Acknowledgments
We thank Rebecca Walker Reczek, Laura Wallace, Tim Broom, Javier Granados Samoyoa, the Attitudes and Persuasion Lab, and the Mind and Body Lab for feedback.
Footnotes
The authors declare no competing interest.
This article is a PNAS Direct Submission.
This article contains supporting information online at https://www.pnas.org/lookup/suppl/doi:10.1073/pnas.2120755119/-/DCSupplemental.
Data Availability
There are no data underlying this work.
References
- 1.Thangaraj J. W. V., et al. , Predominance of delta variant among the COVID-19 vaccinated and unvaccinated individuals, India, May 2021. J. Infect. 84, 94–118 (2022). [DOI] [PMC free article] [PubMed] [Google Scholar]
- 2.World Health Organization, Climate change and health. (Fact Sheet, World Health Organization, 2021). https://www.who.int/news-room/fact-sheets/detail/climate-change-and-health/. Accessed 6 July 2022. [Google Scholar]
- 3.Tyson A., Funk C., Kennedy B., Johnson C., Majority in U.S. says public health benefits of COVID-19 restrictions worth the costs, even as large shares also see downsides. Pew Research Center, (2021). https://www.pewresearch.org/science/2021/09/15/majority-in-u-s-says-publich-health-benefits-of-covid-19-restrictions-worth-the-costs-even-as-large-shares-also-see-downsides/. Accessed 30 March 2022. [Google Scholar]
- 4.Kennedy B., U.S. concern about climate change is rising, but mainly among Democrats. Pew Research Center, (2020). https://www.pewresearch.org/fact-tank/2020/04/16/u-s-concern-about-climate-change-is-rising-but-mainly-among-democrats/. Accessed 28 February 2021. [Google Scholar]
- 5.Rutjens B. T., et al. , Science skepticism across 24 countries. Soc. Psychol. Personal. Sci. 13, 102–117 (2022). [Google Scholar]
- 6.Hornsey M. J., Why facts are not enough: Understanding and managing the motivated rejection of science. Curr. Dir. Psychol. Sci. 29, 583–591 (2020). [Google Scholar]
- 7.Rutjens B. T., Heine S. J., Sutton R. M., van Harreveld F., “Attitudes towards science” in Advances in Experimental Social Psychology, Olson J. M., Ed. (Academic, 2018), pp. 125–165. [Google Scholar]
- 8.Hornsey M. J., Fielding K. S., Attitude roots and Jiu Jitsu persuasion: Understanding and overcoming the motivated rejection of science. Am. Psychol. 72, 459–473 (2017). [DOI] [PubMed] [Google Scholar]
- 9.McGuire W. J., “The nature of attitudes and attitude change” in The Handbook of Social Psychology, Lindzey G., Aronson E., Eds. (Addison-Wesley, ed. 2, 1969), pp. 136–314. [Google Scholar]
- 10.Funk C., Hefferon M., Kennedy B., Johnson C., Trust and mistrust in Americans’ views of scientific experts. Pew Research Center, (2019). https://pewresearch.org/science/2019/08/02/trust-and-mistrust-in-americans-views-of-scientific-experts/. Accessed 17 March 2022. [Google Scholar]
- 11.Wallace L. E., Wegener D. T., Petty R. E., When sources honestly provide their biased opinion: Bias as a distinct source perception with independent effects on credibility and persuasion. Pers. Soc. Psychol. Bull. 46, 439–453 (2020). [DOI] [PubMed] [Google Scholar]
- 12.Diekman A. B., Clark E. K., Johnston A. M., Brown E. R., Steinberg M., Malleability in communal goals and beliefs influences attraction to stem careers: Evidence for a goal congruity perspective. J. Pers. Soc. Psychol. 101, 902–918 (2011). [DOI] [PubMed] [Google Scholar]
- 13.Errington T. M., et al. , Investigating the replicability of preclinical cancer biology. eLife 10, e71601 (2021). [DOI] [PMC free article] [PubMed] [Google Scholar]
- 14.Nosek B. A., et al. , Replicability, robustness, and reproducibility in psychological science. Annu. Rev. Psychol. 73, 719–748 (2022). [DOI] [PubMed] [Google Scholar]
- 15.Yong E., Psychology’s replication crisis is running out of excuses. The Atlantic, 19 November 2018. https://www.theatlantic.com/science/archive/2018/11/psychologys-replication-crisis-real/576223/. Accessed 5 April 2022. [Google Scholar]
- 16.Heid M., Opinion | Why experts can’t seem to agree on boosters. N. Y. Times, 13 April (2022). https://www.nytimes.com/2022/04/13/opinion/covid-booster-shot.html. Accessed 20 April 2022. [Google Scholar]
- 17.Flemming D., Feinkohl I., Cress U., Kimmerle J., Individual uncertainty and the uncertainty of science: The impact of perceived conflict and general self-efficacy on the perception of tentativeness and credibility of scientific information. Front. Psychol. 6, 1859 (2015). [DOI] [PMC free article] [PubMed] [Google Scholar]
- 18.Kennedy J., Populist politics and vaccine hesitancy in Western Europe: An analysis of national-level data. Eur. J. Public Health 29, 512–516 (2019). [DOI] [PubMed] [Google Scholar]
- 19.Lee C., Whetten K., Omer S., Pan W., Salmon D., Hurdles to herd immunity: Distrust of government and vaccine refusal in the US, 2002-2003. Vaccine 34, 3972–3978 (2016). [DOI] [PubMed] [Google Scholar]
- 20.Pechar E., Bernauer T., Mayer F., Beyond political ideology: The impact of attitudes towards government and corporations on trust in science. Sci. Commun. 40, 291–313 (2018). [Google Scholar]
- 21.Fiske S. T., Dupree C., Gaining trust as well as respect in communicating to motivated audiences about science topics. Proc. Natl. Acad. Sci. U.S.A. 111, 13593–13597 (2014). [DOI] [PMC free article] [PubMed] [Google Scholar]
- 22.Barnes M. E., Truong J. M., Grunspan D. Z., Brownell S. E., Are scientists biased against Christians? Exploring real and perceived bias against Christians in academic biology. PLoS One 15, e0226826 (2020). [DOI] [PMC free article] [PubMed] [Google Scholar]
- 23.Duarte J. L., et al. , Political diversity will improve social psychological science. Behav. Brain Sci. 38, e130 (2015). [DOI] [PubMed] [Google Scholar]
- 24.Simpson A., Rios K., Is science for atheists? Perceived threat to religious cultural authority explains U.S. Christians’ distrust in secularized science. Public Underst. Sci. 28, 740–758 (2019). [DOI] [PubMed] [Google Scholar]
- 25.Hilton S., Petticrew M., Hunt K., Parents’ champions vs. vested interests: Who do parents believe about MMR? A qualitative study. BMC Public Health 7, 42 (2007). [DOI] [PMC free article] [PubMed] [Google Scholar]
- 26.Funke D., Fact-check: Does Anthony Fauci have millions invested in a coronavirus vaccine? Austin American-Statesman, (2020). https://www.statesman.com/story/news/politics/elections/2020/04/16/fact-check-does-anthony-fauci-have-millions-invested-in-coronavirus-vaccine/984125007/. Accessed 28 February 2021.
- 27. Handley I. M., Brown E. R., Moss-Racusin C. A., Smith J. L., Quality of evidence revealing subtle gender biases in science is in the eye of the beholder. Proc. Natl. Acad. Sci. U.S.A. 112, 13201–13206 (2015).
- 28. Hart W., Richardson K., Tortoriello G. K., Earl A., ‘You Are What You Read:’ Is selective exposure a way people tell us who they are? Br. J. Psychol. 111, 417–442 (2020).
- 29. Hogg M. A., Williams K. D., From I to we: Social identity and the collective self. Group Dyn. Theory Res. Pract. 4, 81–97 (2000).
- 30. Knobloch-Westerwick S., Hastall M. R., Please your self: Social identity effects on selective exposure to news about in- and out-groups. J. Commun. 60, 515–535 (2010).
- 31. Washington H. A., Medical Apartheid: The Dark History of Medical Experimentation on Black Americans from Colonial Times to the Present (Doubleday, 2006).
- 32. Warren R. C., Forrow L., Hodge D. A. Sr., Truog R. D., Trustworthiness before trust—Covid-19 vaccine trials and the Black community. N. Engl. J. Med. 383, e121 (2020).
- 33. Nauroth P., Gollwitzer M., Bender J., Rothmund T., Gamers against science: The case of the violent video games debate. Eur. J. Soc. Psychol. 44, 104–116 (2014).
- 34. Kahan D. M., Jenkins-Smith H., Braman D., Cultural cognition of scientific consensus. J. Risk Res. 14, 147–174 (2011).
- 35. Kahan D. M., Braman D., Slovic P., Gastil J., Cohen G., Cultural cognition of the risks and benefits of nanotechnology. Nat. Nanotechnol. 4, 87–90 (2009).
- 36. Kahan D. M., Misconceptions, misinformation, and the logic of identity-protective cognition. SSRN [Preprint] (2017). 10.2139/ssrn.2973067. Accessed 17 March 2022.
- 37. Kahan D. M., Braman D., Gastil J., Slovic P., Mertz C. K., Culture and identity-protective cognition: Explaining the white-male effect in risk perception. J. Empir. Leg. Stud. 4, 465–505 (2007).
- 38. Fasce A., Adrián-Ventura J., Lewandowsky S., van der Linden S., Science through a tribal lens: A group-based account of polarization over scientific facts. Group Process. Intergroup Relat., 10.1177/13684302211050323 (2021).
- 39. Rutjens B. T., van der Linden S., van der Lee R., Zarzeczna N., A group processes approach to antiscience beliefs and endorsement of “alternative facts.” Group Process. Intergroup Relat. 24, 513–517 (2021).
- 40. Offit P. A., Moser C. A., The problem with Dr Bob’s alternative vaccine schedule. Pediatrics 123, e164–e169 (2009).
- 41. McGregor I., Haji R., Kang S.-J., Can ingroup affirmation relieve outgroup derogation? J. Exp. Soc. Psychol. 44, 1395–1401 (2008).
- 42. Bliuc A.-M., et al., Public division about climate change rooted in conflicting socio-political identities. Nat. Clim. Chang. 5, 226–229 (2015).
- 43. Branscombe N. R., Wann D. L., Collective self-esteem consequences of outgroup derogation when a valued social identity is on trial. Eur. J. Soc. Psychol. 24, 641–657 (1994).
- 44. Hornsey M. J., Imani A., Criticizing groups from the inside and the outside: An identity perspective on the intergroup sensitivity effect. Pers. Soc. Psychol. Bull. 30, 365–383 (2004).
- 45. Nogrady B., ‘I hope you die’: How the COVID pandemic unleashed attacks on scientists. Nature 598, 250–253 (2021).
- 46. Douglas K. M., COVID-19 conspiracy theories. Group Process. Intergroup Relat. 24, 270–275 (2021).
- 47. Auwaerter P. G., et al., Antiscience and ethical concerns associated with advocacy of Lyme disease. Lancet Infect. Dis. 11, 713–719 (2011).
- 48. Motta M., Callaghan T., Sylvester S., Lunz-Trujillo K., Identifying the prevalence, correlates, and policy consequences of anti-vaccine social identity. Polit. Groups Identities, 10.1080/21565503.2021.1932528 (2021).
- 49. Sturgis P., Allum N., Science in society: Re-evaluating the deficit model of public attitudes. Public Underst. Sci. 13, 55–74 (2004).
- 50. van der Linden S., The Gateway Belief Model (GBM): A review and research agenda for communicating the scientific consensus on climate change. Curr. Opin. Psychol. 42, 7–12 (2021).
- 51. Festinger L., A Theory of Cognitive Dissonance (Stanford University Press, 1957).
- 52. Cooper J., Cognitive Dissonance: Fifty Years of a Classic Theory (SAGE, 2007).
- 53. Hannam J., God’s Philosophers: How the Medieval World Laid the Foundations of Modern Science (Icon, 2009).
- 54. Lee T., The global rise of “fake news” and the threat to democratic elections in the USA. Public Adm. Policy 22, 15–24 (2019).
- 55. Vosoughi S., Roy D., Aral S., The spread of true and false news online. Science 359, 1146–1151 (2018).
- 56. Pennycook G., Rand D. G., The psychology of fake news. Trends Cogn. Sci. 25, 388–402 (2021).
- 57. Greifeneder R., Jaffé M. E., Newman E. J., Schwarz N., The Psychology of Fake News: Accepting, Sharing, and Correcting Misinformation (Routledge, 2021).
- 58. Scheufele D. A., Krause N. M., Science audiences, misinformation, and fake news. Proc. Natl. Acad. Sci. U.S.A. 116, 7662–7669 (2019).
- 59. Ecker U. K., Lewandowsky S., Fenton O., Martin K., Do people keep believing because they want to? Preexisting attitudes and the continued influence of misinformation. Mem. Cognit. 42, 292–304 (2014).
- 60. Susmann M. W., Wegener D. T., The role of discomfort in the continued influence effect of misinformation. Mem. Cognit. 50, 435–448 (2022).
- 61. Thagard P. R., “Why astrology is a pseudoscience” in PSA: Proceedings of the Biennial Meeting of the Philosophy of Science Association, Hull D., Forbes M., Burian R. M., Eds. (Philosophy of Science Association, 1978), vol. 1978, pp. 223–234.
- 62. Schwartz S. H., “Universals in the content and structure of values: Theoretical advances and empirical tests in 20 countries” in Advances in Experimental Social Psychology, Zanna M. P., Ed. (Academic, 1992), pp. 1–65.
- 63. Skitka L. J., Hanson B. E., Morgan G. S., Wisneski D. C., The psychology of moral conviction. Annu. Rev. Psychol. 72, 347–366 (2021).
- 64. Dibonaventura M. d., Chapman G. B., Do decision biases predict bad decisions? Omission bias, naturalness bias, and influenza vaccination. Med. Decis. Making 28, 532–539 (2008).
- 65. Scott S. E., Inbar Y., Rozin P., Evidence for absolute moral opposition to genetically modified food in the United States. Perspect. Psychol. Sci. 11, 315–324 (2016).
- 66. Waytz A., Young L., Aversion to playing God and moral condemnation of technology and science. Philos. Trans. R. Soc. Lond. B Biol. Sci. 374, 20180041 (2019).
- 67. Petty R. E., Krosnick J. A., Eds., Attitude Strength: Antecedents and Consequences (Lawrence Erlbaum Associates, 1995).
- 68. Luttrell A., Sawicki V., Attitude strength: Distinguishing predictors versus defining features. Soc. Personal. Psychol. Compass 14, e12555 (2020).
- 69. Luttrell A., Petty R. E., Briñol P., Wagner B. C., Making it moral: Merely labeling an attitude as moral increases its strength. J. Exp. Soc. Psychol. 65, 82–93 (2016).
- 70. Lewandowsky S., Ecker U. K. H., Seifert C. M., Schwarz N., Cook J., Misinformation and its correction: Continued influence and successful debiasing. Psychol. Sci. Public Interest 13, 106–131 (2012).
- 71. Teeny J. D., Siev J. J., Briñol P., Petty R. E., A review and conceptual framework for understanding personalized matching effects in persuasion. J. Consum. Psychol. 31, 382–414 (2021).
- 72. Reczek R. W., Trudel R., White K., Focusing on the forest or the trees: How abstract versus concrete construal level predicts responses to eco-friendly products. J. Environ. Psychol. 57, 87–98 (2018).
- 73. Trope Y., Liberman N., Construal-level theory of psychological distance. Psychol. Rev. 117, 440–463 (2010).
- 74. Goldsmith K., Newman G. E., Dhar R., Mental representation changes the evaluation of green product benefits. Nat. Clim. Chang. 6, 847–850 (2016).
- 75. Cesario J., Grant H., Higgins E. T., Regulatory fit and persuasion: Transfer from “feeling right.” J. Pers. Soc. Psychol. 86, 388–404 (2004).
- 76. Bertolotti M., Catellani P., Effects of message framing in policy communication on climate change. Eur. J. Soc. Psychol. 44, 474–486 (2014).
- 77. Ludolph R., Schulz P. J., Does regulatory fit lead to more effective health communication? A systematic review. Soc. Sci. Med. 128, 142–150 (2015).
- 78. Webster D. M., Kruglanski A. W., Individual differences in need for cognitive closure. J. Pers. Soc. Psychol. 67, 1049–1062 (1994).
- 79. Nan X., Daily K., Biased assimilation and need for closure: Examining the effects of mixed blogs on vaccine-related beliefs. J. Health Commun. 20, 462–471 (2015).
- 80. Cacioppo J. T., Petty R. E., Morris K. J., Effects of need for cognition on message evaluation, recall, and persuasion. J. Pers. Soc. Psychol. 45, 805–818 (1983).
- 81. Petty R. E., Cacioppo J. T., “The elaboration likelihood model of persuasion” in Advances in Experimental Social Psychology, Zanna M. P., Ed. (Academic, 1996), pp. 123–205.
- 82. Bhattacherjee A., Sanford C., Influence processes for information technology acceptance: An elaboration likelihood model. Manage. Inf. Syst. Q. 30, 805–825 (2006).
- 83. Winter S., Krämer N. C., Selecting science information in Web 2.0: How source cues, message sidedness, and need for cognition influence users’ exposure to blog posts. J. Comput. Mediat. Commun. 18, 80–96 (2012).
- 84. Kudrna J., Shore M., Wassenberg D., Considering the role of “need for cognition” in students’ acceptance of climate change & evolution. Am. Biol. Teach. 77, 250–257 (2015).
- 85. Drummond C., Fischhoff B., Individuals with greater science literacy and education have more polarized beliefs on controversial science topics. Proc. Natl. Acad. Sci. U.S.A. 114, 9587–9592 (2017).
- 86. Lewandowsky S., Woike J. K., Oberauer K., Genesis or evolution of gender differences? Worldview-based dilemmas in the processing of scientific information. J. Cogn. 3, 9 (2020).
- 87. Kirby D. B., The impact of abstinence and comprehensive sex and STD/HIV education programs on adolescent sexual behavior. Sex. Res. Soc. Policy 5, 18–27 (2008).
- 88. Hamilton L. C., Hartter J., Lemcke-Stampone M., Moore D. W., Safford T. G., Tracking public beliefs about anthropogenic climate change. PLoS One 10, e0138208 (2015).
- 89. Baker S., Axios-Ipsos poll: More Americans want the vaccine. Axios, 12 January 2021. https://www.axios.com/2021/01/12/axios-ipsos-coronavirus-index-americans-want-vaccine. Accessed 28 February 2021.
- 90. Lewandowsky S., Oberauer K., Motivated rejection of science. Curr. Dir. Psychol. Sci. 25, 217–222 (2016).
- 91. Landreville K. D., Niles C., “And that’s a fact!”: The roles of political ideology, PSRs, and perceived source credibility in estimating factual content in partisan news. J. Broadcast. Electron. Media 63, 177–194 (2019).
- 92. McCright A. M., Dentzman K., Charters M., Dietz T., The influence of political ideology on trust in science. Environ. Res. Lett. 8, 044029 (2013).
- 93. Nisbet E. C., Cooper K. E., Garrett R. K., The partisan brain: How dissonant science messages lead conservatives and liberals to (dis)trust science. Ann. Am. Acad. Pol. Soc. Sci. 658, 36–66 (2015).
- 94. Stroud N. J., Polarization and partisan selective exposure. J. Commun. 60, 556–576 (2010).
- 95. Traberg C. S., van der Linden S., Birds of a feather are persuaded together: Perceived source credibility mediates the effect of political bias on misinformation susceptibility. Pers. Individ. Dif. 185, 111269 (2022).
- 96. Fielding K. S., Hornsey M. J., Thai H. A., Toh L. L., Using ingroup messengers and ingroup values to promote climate change policy. Clim. Change 158, 181–199 (2020).
- 97. Bakshy E., Messing S., Adamic L. A., Exposure to ideologically diverse news and opinion on Facebook. Science 348, 1130–1132 (2015).
- 98. West E. A., Iyengar S., Partisanship as a social identity: Implications for polarization. Polit. Behav. 44, 807–838 (2020).
- 99. Fielding K. S., Hornsey M. J., A social identity analysis of climate change and environmental attitudes and behaviors: Insights and opportunities. Front. Psychol. 7, 121 (2016).
- 100. Hoffarth M. R., Hodson G., Green on the outside, red on the inside: Perceived environmentalist threat as a factor explaining political polarization of climate change. J. Environ. Psychol. 45, 40–49 (2016).
- 101. Finkel E. J., et al., Political sectarianism in America. Science 370, 533–536 (2020).
- 102. Del Vicario M., et al., The spreading of misinformation online. Proc. Natl. Acad. Sci. U.S.A. 113, 554–559 (2016).
- 103. Nelson T. E., Garst J., Values-based political messages and persuasion: Relationships among speaker, recipient, and evoked values. Polit. Psychol. 26, 489–516 (2005).
- 104. Washburn A. N., Skitka L. J., Science denial across the political divide: Liberals and conservatives are similarly motivated to deny attitude-inconsistent science. Soc. Psychol. Personal. Sci. 9, 972–980 (2018).
- 105. Toner K., Leary M. R., Asher M. W., Jongman-Sereno K. P., Feeling superior is a bipartisan issue: Extremity (not direction) of political views predicts perceived belief superiority. Psychol. Sci. 24, 2454–2462 (2013).
- 106. Janoff-Bulman R., To provide or protect: Motivational bases of political liberalism and conservatism. Psychol. Inq. 20, 120–128 (2009).
- 107. Chirumbolo A., The relationship between need for cognitive closure and political orientation: The mediating role of authoritarianism. Pers. Individ. Dif. 32, 603–610 (2002).
- 108. Douglas K. M., Sutton R. M., Cichocka A., The psychology of conspiracy theories. Curr. Dir. Psychol. Sci. 26, 538–542 (2017).
- 109. Fabrigar L. R., Wegener D. T., Petty R. E., A validity-based framework for understanding replication in psychology. Pers. Soc. Psychol. Rev. 24, 316–344 (2020).
- 110. Benson-Greenwald T. M., Trujillo A., White A. D., Diekman A. B., Science for others or the self? Presumed motives for science shape public trust in science. Pers. Soc. Psychol. Bull., 10.1177/01461672211064456 (2021).
- 111. Blue C., Precision is the enemy of public understanding. APS Obs. 34, 73 (2021).
- 112. Kuehne L. M., Olden J. D., Lay summaries needed to enhance science communication. Proc. Natl. Acad. Sci. U.S.A. 112, 3585–3586 (2015).
- 113. Wallace L. E., Wegener D. T., Petty R. E., Influences of source bias that differ from source untrustworthiness: When flip-flopping is more and less surprising. J. Pers. Soc. Psychol. 118, 603–616 (2020).
- 114. Hussein M. A., Tormala Z. L., Undermining your case to enhance your impact: A framework for understanding the effects of acts of receptiveness in persuasion. Pers. Soc. Psychol. Rev. 25, 229–250 (2021).
- 115. Xu M., Petty R. E., Two-sided messages promote openness for morally based attitudes. Pers. Soc. Psychol. Bull., 10.1177/0146167220988371 (2021).
- 116. Van Bavel J. J., Packer D. J., The Power of Us: Harnessing Our Shared Identities to Improve Performance, Increase Cooperation, and Promote Social Harmony (Little, Brown Spark, ed. 1, 2021).
- 117. Gaertner S. L., Dovidio J. F., Anastasio P. A., Bachman B. A., Rust M. C., The common ingroup identity model: Recategorization and the reduction of intergroup bias. Eur. Rev. Soc. Psychol. 4, 1–26 (1993).
- 118. Schultz T., Fielding K., The common in-group identity model enhances communication about recycled water. J. Environ. Psychol. 40, 296–305 (2014).
- 119. Corbie-Smith G., Thomas S. B., Williams M. V., Moody-Ayers S., Attitudes and beliefs of African Americans toward participation in medical research. J. Gen. Intern. Med. 14, 537–546 (1999).
- 120. Portacolone E., et al., Earning the trust of African American communities to increase representation in dementia research. Ethn. Dis. 30, 719–734 (2020).
- 121. Wilkinson A., Parker M., Martineau F., Leach M., Engaging ‘communities’: Anthropological insights from the West African Ebola epidemic. Philos. Trans. R. Soc. Lond. B Biol. Sci. 372, 20160305 (2017).
- 122. Claw K. G., et al.; Summer internship for INdigenous peoples in Genomics (SING) Consortium, A framework for enhancing ethical genomic research with Indigenous communities. Nat. Commun. 9, 2957 (2018).
- 123. Sidik S. M., Weaving Indigenous knowledge into the scientific method. Nature 601, 285–287 (2022).
- 124. Wade L., To overcome decades of mistrust, a workshop aims to train Indigenous researchers to be their own genome experts. Science (2018). 10.1126/science.aav5286.
- 125. Lazer D. M. J., et al., The science of fake news. Science 359, 1094–1096 (2018).
- 126. O’Brien T. C., Palmer R., Albarracin D., Misplaced trust: When trust in science fosters belief in pseudoscience and the benefits of critical evaluation. J. Exp. Soc. Psychol. 96, 104184 (2021).
- 127. McGuire W. J., Papageorgis D., The relative efficacy of various types of prior belief-defense in producing immunity against persuasion. J. Abnorm. Soc. Psychol. 62, 327–337 (1961).
- 128. Vivion M., et al., Prebunking messaging to inoculate against COVID-19 vaccine misinformation: An effective strategy for public health. J. Commun. Healthc., 10.1080/17538068.2022.2044606 (2022).
- 129. Bolsen T., Druckman J. N., Cook F. L., Citizens’, scientists’, and policy advisors’ beliefs about global warming. Ann. Am. Acad. Pol. Soc. Sci. 658, 271–295 (2015).
- 130. Tappin B. M., Pennycook G., Rand D. G., Rethinking the link between cognitive sophistication and politically motivated reasoning. J. Exp. Psychol. Gen. 150, 1095–1114 (2021).
- 131. Miller N., Campbell D. T., Recency and primacy in persuasion as a function of the timing of speeches and measurements. J. Abnorm. Psychol. 59, 1–9 (1959).
- 132. Petty R. E., Schumann D. W., Richman S. A., Strathman A. J., Positive mood and persuasion: Different roles for affect under high- and low-elaboration conditions. J. Pers. Soc. Psychol. 64, 5–20 (1993).
- 133. Nerlich B., Koteyko N., Brown B., Theory and language of climate change communication. Wiley Interdiscip. Rev. Clim. Change 1, 97–110 (2010).
- 134. Angst C. M., Agarwal R., Adoption of electronic health records in the presence of privacy concerns: The elaboration likelihood model and individual persuasion. Manage. Inf. Syst. Q. 33, 339–370 (2009).
- 135. Steele C. M., Liu T. J., Dissonance processes as self-affirmation. J. Pers. Soc. Psychol. 45, 5–19 (1983).
- 136. Epton T., Harris P. R., Self-affirmation promotes health behavior change. Health Psychol. 27, 746–752 (2008).
- 137. Sparks P., Jessop D. C., Chapman J., Holmes K., Pro-environmental actions, climate change, and defensiveness: Do self-affirmations make a difference to people’s motives and beliefs about making a difference? Br. J. Soc. Psychol. 49, 553–568 (2010).
- 138. Feinberg M., Willer R., Moral reframing: A technique for effective and persuasive communication across political divides. Soc. Personal. Psychol. Compass 13, e12501 (2019).
- 139. Luttrell A., Petty R. E., Evaluations of self-focused versus other-focused arguments for social distancing: An extension of moral matching effects. Soc. Psychol. Personal. Sci. 12, 946–954 (2021).
- 140. Corner A., Pidgeon N., Like artificial trees? The effect of framing by natural analogy on public perceptions of geoengineering. Clim. Change 130, 425–438 (2015).
- 141. Summers C. A., Smith R. W., Reczek R. W., An audience of one: Behaviorally targeted ads as implied social labels. J. Consum. Res. 43, 156–178 (2016).
- 142. Hansen J., Wänke M., Truth from language and truth from fit: The impact of linguistic concreteness and level of construal on subjective truth. Pers. Soc. Psychol. Bull. 36, 1576–1588 (2010).
Data Availability Statement
There are no data underlying this work.