Author manuscript; available in PMC: 2022 Nov 5.
Published in final edited form as: Ann Am Acad Pol Soc Sci. 2022 May 5;700(1):26–40. doi: 10.1177/00027162221084663

When Science Becomes Embroiled in Conflict: Recognizing the Public’s Need for Debate while Combating Conspiracies and Misinformation

Stephan Lewandowsky 1,2, Konstantinos Armaos 3, Hendrik Bruns 4, Philipp Schmid 5, Dawn Liu Holford 6,1, Ulrike Hahn 7, Ahmed Al-Rawi 8, Sunita Sah 9, John Cook 10
PMCID: PMC7613792  EMSID: EMS156146  PMID: 36338265

Abstract

Most democracies seek input from scientists to inform policies. This can put scientists in a position of intense scrutiny. Here we focus on situations in which scientific evidence conflicts with people’s worldviews, preferences, or vested interests. These conflicts frequently play out through systematic dissemination of disinformation or the spreading of conspiracy theories, which may undermine the public’s trust in the work of scientists, muddy the waters of what constitutes truth, and may prevent policy from being informed by the best available evidence. However, there are also instances in which public opposition arises from legitimate value judgments and lived experiences. In this article, we analyze the differences between politically-motivated science denial on the one hand, and justifiable public opposition on the other. We conclude with a set of recommendations on tackling misinformation and understanding the public’s lived experiences to preserve legitimate democratic debate of policy.

Keywords: COVID-19, misinformation, conspiracy theories, climate change, science denial, public health, scientific evidence, vaccine hesitancy

When scientists discover a planet in our Milky Way that is made of diamonds (Bailes et al. 2011), public fascination and admiration are virtually assured. Who would not revel in the idea that we might spot a particularly bright sparkle in the night sky? However, when scientists discover that the burning of fossil fuels causes climate change, or that a lethal airborne virus can only be controlled through mask wearing and social distancing, then the public and political response can be anything but favorable, with scientists being verbally assaulted or their reputations being impugned (Lewandowsky et al. 2016; Mann, 2012).

It is nearly impossible for scientists to escape those politically-motivated conflicts. Although Daniel Kahneman has recommended that scientists should scrupulously avoid the political, and that if science involves a matter “that anybody in Congress is going to be offended by, then it’s political” (cited in Basken 2016), adherence to this recommendation would render entire scientific fields, such as evolutionary biology and climate science, off limits. Moreover, even if scientists abstain from providing policy advice, they can become targets of conspiracy theorists who frequently “blame the messenger” for inconvenient information, as has been apparent during the COVID-19 pandemic. If political conflict cannot be avoided, scientists must learn to manage such conflicts, and the public must learn to understand that such conflicts can be inevitable. Fortunately, both surveys (Pew Research Center 2009) and experimental studies (Kotcher et al. 2017) have shown that scientists can, in some circumstances, advocate policies without necessarily losing credibility or the public’s trust.

In this article, we explore the common attributes of political conflicts in which scientific findings take center stage, using the COVID-19 pandemic as a case study, but also drawing on knowledge of long-standing conflicts surrounding climate change and vaccinations. A core attribute that is common to all those conflicts is the abundance of disinformation, mainly crafted by politically-motivated actors, that distorts public perception of the scientific evidence. Another core attribute of such conflicts in democratic societies is the public’s legitimate need to be involved in the surrounding policy debates and for dissenting voices to be heard. The fundamental question to be resolved, therefore, is how to differentiate between legitimate democratic critique of scientifically informed policies on the one hand, and motivated science denial on the other.

The COVID-19 “infodemic”

The COVID-19 pandemic that upended the world in early 2020 also triggered an “infodemic” (Zarocostas 2020); that is, an overabundance of low-quality information including misinformation (i.e., information that turns out to be false), disinformation (i.e., false information that is intentionally spread to mislead people), and conspiracy theories (Enders et al. 2020; Roozenbeek et al. 2020). This infodemic, while predominantly spreading online, has had adverse real-world consequences—including for innocent bystanders. For example, in the United Kingdom, the baseless claim that 5G broadband installations were responsible for the virus-borne disease led to vandalism against numerous telecommunications installations (Jolley & Paterson 2020). Exposure to misinformation has also been shown to reduce people’s intention to get vaccinated (Loomba et al. 2021). The involvement of politicians and politically motivated actors in the dissemination of disinformation cannot be overlooked. For example, in a text analysis of 38 million online documents, Evanega et al. (2020) identified then-President Trump as a major vector of misinformation on COVID-19. Trump’s dissemination of misinformation has been linked to reduced compliance with pandemic control measures, which eventually translated into higher COVID-19 infection and fatality growth rates in U.S. counties that predominantly voted for Trump in 2016 than those that voted for Clinton (Gollwitzer et al. 2020). Similarly, far-right parties in Europe vocally opposed public-health measures such as “lockdowns” and mandatory mask wearing when in opposition (Wondreys & Mudde 2020), based on “anti-elite” arguments buttressed by misleading statistics. There is a growing body of evidence that vaccine hesitancy is mainly associated with the political right rather than left (for a summary, see Lewandowsky & Oberauer 2021), although political extremism may be another factor that transcends the conventional left-right dimension. 
For example, there are anecdotal reports from the UK that far-right conspiracy theories relating to COVID-19 also find traction among the radical left (Monbiot 2021). Similarly, in France opposition to COVID-19 vaccines has been found to be greatest among people who are aligned with extreme parties on both right and left (J. K. Ward et al. 2020). Support for the generality of this pattern is provided by pre-pandemic research which suggests that extreme ideology, irrespective of left or right orientation, is a predictor of conspiratorial thinking (van Prooijen, Krouwel, & Pollet 2015).

Turning to the role of institutional actors, there is evidence that COVID-19 misinformation has been amplified by right-leaning media. Those media outlets have given prominence to misinformation and conspiracy theories from the early stages of the COVID-19 pandemic (Cinelli et al. 2020; Motta, Stecula, & Farhart 2020). Two recent instrumental-variable analyses have focused on the effects of Fox News (Simonov et al. 2020), and in particular on the channel’s show hosted by Sean Hannity, who until recently consistently downplayed the risks from the pandemic (Bursztyn et al. 2020). The studies showed that a 10% increase in viewing Fox News reduced the propensity to stay at home by 1.3%, with obvious downstream consequences for public health (cf. Hegland et al. in this volume; Simonov et al. 2020). Furthermore, a one standard deviation increase in the viewership of the show that downplayed the pandemic (hosted by Sean Hannity) relative to another Fox show that did not (hosted by Tucker Carlson) was associated with a temporary increase in cases and deaths of roughly a third (Bursztyn et al. 2020).

COVID-19 related disinformation is also spread by other highly organized actors. For example, a recent analysis has suggested that the “anti-vax” online industry accumulates annual revenues of $35 million, and that their audience of 62 million followers may be worth upward of a billion dollars a year for the big social media platforms (Center for Countering Digital Hate, 2021). Some COVID-19 disinformers also have organizational links to similar political operatives who deny the existence or the human causes of climate change. To illustrate, one such connection involves the American Institute for Economic Research (AIER), a libertarian free-market think-tank that has a history of bogus argumentation about climate change (e.g., by denying the scientific consensus), and which has recently engaged in similarly misleading argumentation about COVID-19 (B. Ward, 2020). A central component of the AIER’s anti-scientific activities relating to COVID-19 is their sponsorship of the “Great Barrington Declaration”, whose signatories (many of whom have no relevant scientific credentials) advocate a “herd immunity” strategy by letting the pandemic spread through the population—by avoiding lockdowns—while (ostensibly) seeking to protect those who are most vulnerable. This position has been vociferously opposed by the majority of experts (McKee & Stuckler, 2020) and has been labeled “simply unethical” by the World Health Organization.1

Attempting to undermine a scientific consensus through dubious declarations or petitions is a long-standing strategy used by vested interests; it was pioneered by the tobacco industry in the 1950s (Cook, Lewandowsky, & Ecker, 2017) before being adopted by the fossil fuel industry (Cook et al. 2018) and now by COVID-19 disinformers. Such attempts to undermine a scientific consensus have been found to be the most persuasive disinformation strategy in a comparison of six different climate-denial messages (van der Linden et al. 2017). This is perhaps unsurprising because any appearance of scientific dissent invites the public to exploit this “false balance” to choose a more congenial (but scientifically unsupported) position (Koehler, 2016).

COVID-19 and democracy

COVID-19 has led governments to enforce policies that impaired economic activity and that limited individual liberties to an extent that is unprecedented in western democracies. Although some of these harsh measures were likely necessary—and demonstrably successful (Flaxman et al. 2020; Haug et al. 2020)—concerns that these policies were used by some governments to trigger a slow “authoritarianization” and that they are harbingers of an “authoritarian pandemic” (Thomson & Ip, 2020) should not be dismissed. Any infringement of civil liberties must be thoroughly examined before it can be justified as an unfortunate exception in the interest of public health. Scrutiny of restrictive measures is particularly difficult because, while democratic norms and practices are well-established during normal times, very little guidance and few conceptualizations exist of what democratic standards are acceptable during times of crisis. Codifications, such as the U.S. Constitution, can only provide broad guardrails but cannot substitute for conventions, practices, and norms of conduct. The recent refusal by Donald Trump to concede that he lost the election of 2020, a clear departure from democratic norms, starkly highlighted the importance of convention and practice in a functioning democracy (Cuéllar, 2018).

A recent empirical analysis that related pandemic severity (measured in terms of deaths from COVID-19) to infringement of democratic rights across 144 countries found no association between the two variables, which argues against a simplistic “lives-vs.-democracy” trade-off (Edgell et al. 2021). Social restrictions have also negatively impacted mental health at scale (Every-Palmer et al., 2020; Serafini et al., 2020) and have disproportionately impacted women, single parents, young people, minority groups, refugees and migrants, and poor people who cannot afford to buy basic personal protective equipment (PPE) (Greenaway et al. 2020; van Barneveld et al. 2020).

Frustration with, and opposition to, social restrictions are therefore potentially legitimate grievances that deserve to be heard in democratic public discourse. Pandemics deprive people of their feelings of control and security, factors that are known to enhance the attractiveness of conspiracy theories (Lewandowsky & Cook 2020). Some people may therefore be driven towards conspiratorial rhetoric out of psychological or rhetorical needs rather than out of an intrinsic disposition (Lewandowsky 2021). Although the epistemic status of argumentation is independent of the proponent’s circumstances, those circumstances or grievances may be relevant to determining the appropriate response. The need to recognize and empathize with these grievances is amplified by the fact that the pandemic has had the most severe impact on low-wage and low-skill employees. These employees were hit in multiple ways, from wage insecurity for hourly workers to dense living conditions and the inability to escape crowded and unsafe workplaces (Kramer & Kramer 2020).

But where do we draw the boundary between politically-motivated disinformation on the one hand, and legitimate expressions of grievances or criticism of government policy on the other? And assuming that we can identify that boundary, what are the appropriate responses by scientists and communicators?

Denial versus legitimate critique

In the remainder of this article, we present a sketch of this boundary viewed through two different lenses.

Scientific argumentation versus denial

The first lens relies on the notion of science denial, which arises when people reject well-established scientific propositions that are no longer debated by the relevant scientific community. The term “science denial” is well defined and established in the literature (Diethelm & McKee 2009; Hansson 2017; Lewandowsky et al. 2015; McKee & Diethelm 2010), and is frequently used in connection with the link between AIDS and HIV, climate change, evolution, and other clearly established scientific facts. In all those cases, absent new evidence, dissent from the scientifically accepted position cannot be supported by legitimate evidence and theorizing, but must necessarily—i.e., in virtually all instances—involve misleading or flawed argumentation. There simply is no legitimate and coherent scientific position on climate change that does not invoke carbon emissions as a causal variable (Benestad et al., 2016). There is no legitimate scientific position that attributes AIDS to a cause other than the HIV virus (Kalichman, 2015). The definition of denial, and the misleading techniques it employs for its (usually political) ends, transcend domains. They are summarized by the acronym “FLICC” (Cook, 2020; Diethelm & McKee, 2009; Schmid & Betsch, 2019):

  • Fake experts: Relying on questionable, discredited, or fake experts. The technique was pioneered by the tobacco industry in the 20th century, which presented people in lab coats who assured the public that smoking caused no harm (Cook et al., 2017; Oreskes & Conway, 2010).

  • Logical fallacies: Patterns of reasoning that are invalid because of their logical structure, ranging from “straw man” arguments that misrepresent opponents’ positions to false dichotomies that demand a choice between two options when more options may be available or both may be viable (on fallacies more generally, see Hahn, 2020).

  • Impossible expectations: Demanding undeniable proof beyond what is scientifically feasible, such as demanding “proof” of the existence of global warming. Science, of course, rests on evidence, not proof, and no scientific result could ever meet absolute standards of proof.

  • Cherry-picking: Selectively attending to evidence so as to advance one’s point, for example by highlighting outlying “convenient” observations (Hansson, 2017; Lewandowsky, Ballard, Oberauer, & Benestad, 2016) while ignoring an abundance of evidence to the contrary.

  • Conspiracy theories: Attributing inconvenient evidence to a sinister conspiracy, while continually expanding the theory to defend against challenging evidence. Conspiracy theories are an almost inevitable component of denial, serving to explain away overwhelming scientific evidence. Examples include the accusation that the pharmaceutical industry is trying to kill people through vaccinations, or that left-wing politicians pursue the “Great Reset” agenda to change Western societies (Schmid & Betsch 2019).

Claims that rely on one or more of these techniques contribute little to a debate and should not play a direct role in determining policy. The rules of scientific evidence formation and argumentation are inescapable and cannot be discarded or side-stepped for political expediency. A cherry-picked argument against climate change or COVID-19 vaccinations is unscientific and should not prevent climate mitigation or a vaccine rollout. (The converse, however, does not follow: argumentation that survives the FLICC criteria is not guaranteed to lead to accurate conclusions. For example, logically correct arguments can lead us astray when a premise is incorrect.)

At first glance, this insistence on quality of argumentation may seemingly curtail the public’s involvement in any scientifically-informed debate. After all, members of the public are often non-experts on topics and issues whose outcomes affect their lives. Consequently, people form attitudes on these topics mainly by relying on narratives, stories, feelings, or pictures—rather than data and scientific analysis. People can therefore easily fall prey to the misleading techniques outlined above. The public at large is also typically unskilled in argumentation and the evaluation and weighting of scientific evidence, which creates a further apparent barrier to involvement. We suggest that these barriers are not insurmountable.

First, scientific issues can be legitimately communicated by stories or pictures (Lewandowsky & Whitmarsh 2018). For example, whereas a picture of snowfall in New York presents no legitimate evidence against global warming (because a cherry-picked normal weather event falls within the envelope of events expected with climate change), a photo of retreating glaciers is a legitimate tool to communicate climate change (because glaciers integrate snowfall over centuries and hence their retreat captures climate change rather than weather). Insistence on quality of argumentation thus does not prevent the use of engaging and accessible communication.

Second, the insistence on sound argumentation does not prevent people’s lived experience from being relevant to debate. On the contrary, as we show next, people’s lived and reported experiences and their value judgments are crucial to developing scientifically-informed public policy. People’s lived experiences, and the narratives emerging from them, provide us with a form of data that can constrain the design of policies.

Different lived experiences

Beliefs are context dependent. Denying facts that are commonly accepted by the scientific community therefore does not necessarily reflect “irrationality” or bad faith. The rationality of a belief is established relative to an evidence base, so actors may rationally disagree in situations where they have different evidential histories. For example, if people differ in how much trust they place in various information sources (e.g., scientists vs. their neighbor on social media), polarization may ensue even on the basis of identical information that is processed completely rationally (Cook & Lewandowsky 2016; Druckman & McGrath 2019; Jern, Chang, & Kemp 2014).

Ethnic minorities, for example, have historically been discriminated against in the healthcare system. Western countries, especially those with colonial histories, have also damaged people’s trust in medical treatments through their previous mistreatment of indigenous populations (Lowes & Montero 2018) and misuse of vaccination centers, for example, by the CIA in its hunt for Osama bin Laden (Reardon 2011). It is unsurprising that people would question scientific evidence communicated by the same institutions that caused them harm or deceived them in the past (Jamison, Quinn, & Freimuth 2019).

Opposing vaccinations because they have previously been misused by the authorities does not make the argument scientifically sustainable. However, appreciation of why evidence is mistrusted in these communities is essential to interpret beliefs and to design culturally sensitive interventions. Locally designed, multicomponent interventions that are sensitive to lived experiences have been shown to increase vaccine uptake considerably (Crocker-Buque, Edelstein, & Mounier-Jack 2017)—reflecting the need to regain trust rather than dismiss beliefs based on lived experiences as simple denialism.

Similarly, denying the severity or even the existence of COVID-19, or denying the effectiveness of social distancing, may represent an adaptive strategy irrespective of the poor quality of argumentation (though it is less clear that denial along with other forms of “not wanting to know” could be construed as “rational”; see Krueger et al. 2020). For example, denial can enhance people’s self-efficacy by maintaining optimism about the current situation (Bénabou & Tirole 2016). Avoiding or selectively relying on information about the pandemic can be a particularly effective strategy to maintain such self-efficacy (Golman, Hagmann, & Loewenstein 2017; Hertwig & Engel 2016). In addition, research into the psychology of conspiracy beliefs has identified beneficial consequences of such beliefs for the individual, such as creating a sense of belonging through group identification. However, these potentially beneficial effects should not blind us to the fact that conspiracy beliefs are ultimately based in distorted reasoning and that their positive results for an individual can lead to gravely adverse social outcomes, including racism (Golec de Zavala & Cichocka 2012; van Prooijen 2018). Such consequences have been observed with respect to the COVID-19 pandemic as well (Coates 2020; Pei & Mehta 2020).

Denial can also be a protective mechanism to manage fear (Schimmenti, Billieux, & Starcevic 2020). Someone who lives alone, with a pre-existing health condition and a limited budget, does not have many options. She would either have to isolate herself completely to avoid contracting COVID-19 or neglect the danger and go out to serve her everyday needs, in which case denial may be necessary to manage fear. Similar considerations apply to the millions of people who are forced to go to work every day in crowded buses or trains and operate in workplaces with little provision for their safety and health. Combating denial or non-compliance with social distancing measures under those circumstances requires supportive policies rather than persuasive or coercive measures. It is only through supportive policies that life conditions that force people into a denialist or conspiracist mindset will lose their power.

In summary, flawed argumentation does not become scientifically more valid because of a person’s cultural background or lived experience. However, understanding a lived experience can serve at least two purposes. First, it can provide pointers as to why a particular person engages in (or falls for) FLICC-based flawed argumentation. Second, even if flawed, it can reveal shortcomings in the scientific process or evidence base. For example, given that non-Hispanic whites of European ancestry comprise more than 90% of participants in clinical trials, compared with their 61% share of the population (… et al. 2021; Mak et al. 2007), much medical knowledge may not apply to the full diversity of the population. For instance, side effects of 5-Fluorouracil, a common cancer drug, occur at higher rates in under-represented populations. Because the clinical trials involved predominantly white participants, these side effects in racial/ethnic minority groups were initially overlooked (Yates et al. 2020). Arguments based on a person’s identification with a cultural or ethnic minority may therefore not reflect “irrationality” or bad faith. On the contrary, analysis of those arguments can provide valuable pointers to underlying issues—such as a lack of representation in medical research—that can be addressed by suitable policies or remedial research.

Recommendations

When science has an impact on policy and on people’s daily lives, two fundamental rights of the public collide: the right to be heard, and the right not to be misled. We propose that this tension can be resolved, and legitimate democratic debate be facilitated, in at least two ways.

First, misleading and inappropriate argumentation must be identified (e.g., Cook, Ellerton, & Kinkead 2018; Lewandowsky, Cook, & Lloyd 2016; Lewandowsky, Lloyd, & Brophy 2018). Rapidly evolving crises can overwhelm the scientific process, which then cannot provide firm answers at the speed at which the public and policymakers demand them. However, a lack of scientific knowledge or scientific uncertainty does not legitimize misleading and inappropriate argumentation. Cherry-picking, for example, remains inappropriate irrespective of the strength of the available scientific evidence. Once identified, those misleading arguments can be used to “inoculate” the public against them. Inoculation theory posits that people can be protected against misleading information when they are (a) warned that they may be misled, and (b) exposed to a preemptive rebuttal of the misleading argumentation (Lewandowsky & van der Linden 2021). Inoculation has been shown to be effective in many situations, including against COVID-19 misinformation (Basol et al. 2021) and vaccine-related conspiracy theories (Jolley & Douglas 2017). The two main inoculation approaches are (a) fact-based, where misinformation is demonstrated to be false through factual explanations, and (b) logic-based, which involves identifying misleading techniques rather than content. The logic-based approach can convey immunity across topics (Cook et al. 2017) and is applicable without requiring detailed knowledge of the entire inventory of misleading talking points.

Second, the functional role of inappropriate argumentation must be interrogated. Do people believe and voice those arguments to express a relevant aspect of their circumstances? If people voice conspiratorial rhetoric, do they express a deep-seated belief, or does the rhetoric serve other functions (Lewandowsky 2021), such as coping with a perceived loss of control? What policies might mitigate those circumstances, so that people no longer have to rely on counter-productive arguments? Policies that take into account the reasons underlying misleading arguments can be more effective than those that are agnostic about these reasons. For example, in the context of overcoming vaccine hesitancy, approaches such as “motivational interviewing”, which are based on listening, empathy, and understanding of circumstances rather than on “winning” an argument, have been shown to be particularly successful (Gagneur 2020).

It is only when misleading arguments can be identified and rejected, or can be interrogated for their underlying causes, that the holy grail of “deliberative decision making that is inclusive, transparent and accountable” (Norheim et al. 2021, p. 10) can be achieved.

Acknowledgments

SL received funding from the European Union’s Horizon 2020 research and innovation programme under grant agreement No 964728 (JITSUVAX) during completion of this paper. SL was also supported by funding from the Humboldt Foundation in Germany through a research award, and by an ERC Advanced Grant (PRODEMINFO).

DLH was supported by funding from the UK Economic and Social Research Council (grant reference ES/V011901/1) during completion of this paper.

References

  1. Bailes Matthew, Bates Stuart D, Varun Bhalerao, Bhat NDRamesh, Burgay Marta, Sarah Burke-Spolaor, Nichi D’Amico, Simon Johnston, Keith Michael J, Michael Kramer, Kulkarni Shrinivas R, et al. “Transformation of a Star into a Planet in a Millisecond Pulsar Binary”. Science. 2011;333(6050):1717–20. doi: 10.1126/SCIENCE.1208890. [DOI] [PubMed] [Google Scholar]
  2. Basken Paul. On Climate Change, Are University Researchers Making a Difference. Chronicle of Higher Education. 2016 Available from www.chronicle.com. [Google Scholar]
  3. Basol Melisa, Roozenbeek Jon, Berriche Manon, Uenal Fatih, McClanahan William P, van der Linden Sander. Towards Psychological Herd Immunity: Cross-Cultural Evidence for Two Prebunking Interventions against COVID-19 Misinformation. Big Data Society. 2021;8(1) doi: 10.1177/20539517211013868. [DOI] [Google Scholar]
  4. Bénabou Roland, Tirole Jean. Mindful Economics: The Production, Consumption, and Value of Beliefs. Journal of Economic Perspectives. 2016;30(3):141–64. doi: 10.1257/JEP.30.3.141. [DOI] [Google Scholar]
  5. Benestad Rasmus E, Nuccitelli Dana, Lewandowsky Stephan, Hayhoe Katharine, Hygen Hans Olav, van Dorland Rob, Cook John. Learning from Mistakes in Climate Research. Theoretical and Applied Climatology. 2015;126(3):699–703. doi: 10.1007/S00704-015-1597-5. [DOI] [Google Scholar]
  6. Bruns Axel, Harrington Stephen, Hurcombe Edward. ‘Corona? 5G? Or Both?’: The Dynamics of COVID-19/5G Conspiracy Theories on Facebook. Media International Australia. 2020;177 doi: 10.1177/1329878X20946113. [DOI] [Google Scholar]
  7. Bursztyn Leonardo, Rao Aakaash, Roth Christopher P, Yanagizawa-Drott David H, Roth Christopher, Yanagizawa-Drott David, Alesina Alberto, Candelaria Luis, Cantoni Davide, Caprettini Bruno, Dix Rebekah, et al. Misinformation During a Pandemic National Bureau of Economic Research Working Paper Series. 2020 doi: 10.3386/W27417. [DOI]
  8. Center for Countering Digital Hate. Pandemic Profiteers: The business of anti-vaxx. 2021. Retrieved from https://www.counterhate.com/pandemicprofiteers.
  9. Cinelli Matteo, Quattrociocchi Walter, Galeazzi Alessandro, Valensise Carlo Michele, Brugnoli Emanuele, Schmidt Ana Lucia, Zola Paola, Zollo Fabiana, Scala Antonio. The COVID-19 Social Media Infodemic. Scientific Reports. 2020;10(1):1–10. doi: 10.1038/s41598-020-73510-5. 2020 10:1. [DOI] [PMC free article] [PubMed] [Google Scholar]
  10. Coates Melanie. Covid-19 and the Rise of Racism. BMJ. 2020;369 doi: 10.1136/BMJ.M1384. [DOI] [PubMed] [Google Scholar]
  11. Cook John. In: Research handbook on communicating climate change. Holmes D, Richardson LM, editors. Edward Elgar Publishing; 2020. Deconstructing climate science denial. [DOI] [Google Scholar]
  12. Cook John, Ellerton Peter, Kinkead David. Deconstructing Climate Misinformation to Identify Reasoning Errors. Environmental Research Letters. 2018;13(2):024018. doi: 10.1088/1748-9326/AAA49F. [DOI] [Google Scholar]
  13. Cook John, Lewandowsky Stephan. Rational Irrationality: Modeling Climate Change Belief Polarization Using Bayesian Networks. Topics in Cognitive Science. 2016;8(1):160–79. doi: 10.1111/TOPS.12186. [DOI] [PubMed] [Google Scholar]
  14. Cook John, Lewandowsky Stephan, Ullrich K, Ecker H. Neutralizing Misinformation through Inoculation: Exposing Misleading Argumentation Techniques Reduces Their Influence. PLOS ONE. 2017;12(5) doi: 10.1371/journal.pone.0175799. [DOI] [PMC free article] [PubMed] [Google Scholar]
  15. Cook John, van der Linden Sander, Maibach Edward, Lewandowsky Stephan. The Consensus Handbook Why the Scientific Consensus on Climate Change Is Important. Available at www.climatechangecommunication.org/all/consensus-handbook/ [DOI]
  16. Crocker-Buque Tim, Edelstein Michael, Mounier-Jack Sandra. Interventions to Reduce Inequalities in Vaccine Uptake in Children and Adolescents Aged <19 Years: A Systematic Review. Journal of Epidemiology and Community Health. 2017;71(1):87–97. doi: 10.1136/JECH-2016-207572. [DOI] [PMC free article] [PubMed] [Google Scholar]
  17. Cuéllar Mariano-Florentino. From Doctrine to Safeguards in American Constitutional Democracy. UCLA Law Review. 2018;65:1398. [Google Scholar]
  18. Diethelm Pascal, McKee Martin. Denialism: What Is It and How Should Scientists Respond? European Journal of Public Health. 2009;19(1):2–4. doi: 10.1093/EURPUB/CKN139. [DOI] [PubMed] [Google Scholar]
  19. Druckman James N, McGrath Mary C. The Evidence for Motivated Reasoning in Climate Change Preference Formation. Nature Climate Change. 2019;9(2):111–19. doi: 10.1038/s41558-018-0360-1. [DOI] [Google Scholar]
  20. Edgell Amanda B, Lachapelle Jean, Lührmann Anna, Maerz Seraphine F. Pandemic Backsliding: Violations of Democratic Standards during Covid-19. Social Science & Medicine. 2021;285:114244. doi: 10.1016/J.SOCSCIMED.2021.114244. [DOI] [PMC free article] [PubMed] [Google Scholar]
  21. Enders Adam M, Uscinski Joseph E, Klofstad Casey, Stoler Justin. The Different Forms of COVID-19 Misinformation and Their Consequences. Harvard Kennedy School Misinformation Review. 2020;1(8):1. doi: 10.37016/MR-2020-48. [DOI] [Google Scholar]
  22. Evanega Sarah. Coronavirus Misinformation: Quantifying Sources and Themes in the COVID-19 ‘Infodemic’. 2020. Retrieved from https://allianceforscience.cornell.edu/wp-content/uploads/2020/10/Evanega-et-al-Coronavirus-misinformation-submitteD_07_23_20-1.pdf.
  23. Every-Palmer Susanna, Jenkins Matthew, Gendall Philip, Hoek Janet, Beaglehole Ben, Bell Caroline, Williman Jonathan, Rapsey Charlene, Stanley James. Psychological Distress, Anxiety, Family Violence, Suicidality, and Wellbeing in New Zealand during the COVID-19 Lockdown: A Cross-Sectional Study. PLOS ONE. 2020;15(11) doi: 10.1371/JOURNAL.PONE.0241658. [DOI] [PMC free article] [PubMed] [Google Scholar]
  24. Flaxman Seth, Mishra Swapnil, Scott James, Ferguson Neil, Gandy Axel, Bhatt Samir. Reply to: The Effect of Interventions on COVID-19. Nature. 2020;588(7839):E29–32. doi: 10.1038/s41586-020-3026-x. [DOI] [PubMed] [Google Scholar]
  25. Gagneur Arnaud. Motivational Interviewing: A Powerful Tool to Address Vaccine Hesitancy. Canada Communicable Disease Report = Releve Des Maladies Transmissibles Au Canada. 2020;46(4):93–97. doi: 10.14745/CCDR.V46I04A06. [DOI] [PMC free article] [PubMed] [Google Scholar]
  26. Golec de Zavala Agnieszka, Cichocka Aleksandra. Collective Narcissism and Anti-Semitism in Poland. Group Processes & Intergroup Relations. 2011;15(2):213–29. doi: 10.1177/1368430211420891. [DOI] [Google Scholar]
  27. Gollwitzer Anton, Martel Cameron, Brady William J, Pärnamets Philip, Freedman Isaac G, Knowles Eric D, van Bavel Jay J. Partisan Differences in Physical Distancing Are Linked to Health Outcomes during the COVID-19 Pandemic. Nature Human Behaviour. 2020;4(11):1186–97. doi: 10.1038/s41562-020-00977-7. [DOI] [PubMed] [Google Scholar]
  28. Golman Russell, Hagmann David, Loewenstein George. Information Avoidance. Journal of Economic Literature. 2017;55(1):96–135. doi: 10.1257/JEL.20151245. [DOI] [Google Scholar]
  29. Greenaway Christina, Hargreaves Sally, Barkati Sapha, Coyle Christina M, Gobbi Federico, Veizis Apostolos, Douglas Paul. COVID-19: Exposing and Addressing Health Disparities among Ethnic Minorities and Migrants. Journal of Travel Medicine. 2020;27(7):1–3. doi: 10.1093/JTM/TAAA113. [DOI] [PMC free article] [PubMed] [Google Scholar]
  30. Hahn Ulrike. Argument Quality in Real World Argumentation. Trends in Cognitive Sciences. 2020;24(5):363–74. doi: 10.1016/J.TICS.2020.01.004. [DOI] [PubMed] [Google Scholar]
  31. Hansson Sven Ove. Science Denial as a Form of Pseudoscience. Studies in History and Philosophy of Science Part A. 2017;63:39–47. doi: 10.1016/J.SHPSA.2017.05.002. [DOI] [PubMed] [Google Scholar]
  32. Haug Nils, Geyrhofer Lukas, Londei Alessandro, Dervic Elma, Desvars-Larrive Amelie, Loreto Vittorio, Pinior Beate, Thurner Stefan, Klimek Peter. Ranking the Effectiveness of Worldwide COVID-19 Government Interventions. Nature Human Behaviour. 2020;4:1303–1312. doi: 10.1038/s41562-020-01009-0. [DOI] [PubMed] [Google Scholar]
  33. Hertwig Ralph, Engel Christoph. Homo Ignorans: Deliberately Choosing Not to Know. Perspectives on Psychological Science. 2016;11(3):359–72. doi: 10.1177/1745691616635594. [DOI] [PubMed] [Google Scholar]
  34. Jamison Amelia M, Quinn Sandra Crouse, Freimuth Vicki S. ‘You Don’t Trust a Government Vaccine’: Narratives of Institutional Trust and Influenza Vaccination among African American and White Adults. Social Science & Medicine. 2019;221:87–94. doi: 10.1016/J.SOCSCIMED.2018.12.020. [DOI] [PMC free article] [PubMed] [Google Scholar]
  35. Jern Alan, Chang Kai Min K, Kemp Charles. Belief Polarization Is Not Always Irrational. Psychological Review. 2014;121(2):206–24. doi: 10.1037/A0035941. [DOI] [PubMed] [Google Scholar]
  36. Jolley Daniel, Douglas Karen M. Prevention Is Better than Cure: Addressing Anti-Vaccine Conspiracy Theories. Journal of Applied Social Psychology. 2017;47(8):459–69. doi: 10.1111/JASP.12453. [DOI] [Google Scholar]
  37. Jolley Daniel, Paterson Jenny L. Pylons Ablaze: Examining the Role of 5G COVID-19 Conspiracy Beliefs and Support for Violence. British Journal of Social Psychology. 2020;59(3):628–40. doi: 10.1111/BJSO.12394. [DOI] [PMC free article] [PubMed] [Google Scholar]
  38. Kalichman Seth C. Commentary on “Questioning the HIV-AIDS hypothesis: 30 years of dissent”. Frontiers in Public Health. 2015;3 doi: 10.3389/fpubh.2015.00030. [DOI] [PMC free article] [PubMed] [Google Scholar]
  39. Koehler Derek J. Can Journalistic ‘False Balance’ Distort Public Perception of Consensus in Expert Opinion? Journal of Experimental Psychology: Applied. 2016;22(1):24–38. doi: 10.1037/XAP0000073. [DOI] [PubMed] [Google Scholar]
  40. Kotcher John E, Myers Teresa A, Vraga Emily K, Stenhouse Neil, Maibach Edward W. Does Engagement in Advocacy Hurt the Credibility of Scientists? Results from a Randomized National Survey Experiment. Environmental Communication. 2017;11(3):415–29. doi: 10.1080/17524032.2016.1275736. [DOI] [Google Scholar]
  41. Kramer Amit, Kramer Karen Z. The Potential Impact of the Covid-19 Pandemic on Occupational Status, Work from Home, and Occupational Mobility. Journal of Vocational Behavior. 2020;119:103442. doi: 10.1016/J.JVB.2020.103442. [DOI] [PMC free article] [PubMed] [Google Scholar]
  42. Krueger Joachim I, Hahn Ulrike, Ellerbrock Dagmar, Gächter Simon, Hertwig Ralph, Kornhauser Lewis A, Leuker Christina, Szech Nora, Waldman Michael R. In: Deliberate Ignorance. Hertwig Ralph, Engel Christoph., editors. MIT Press; Cambridge, MA: 2021. Normative Implications of Deliberate Ignorance; pp. 241–72. [DOI] [Google Scholar]
  43. Lewandowsky Stephan. Conspiracist Cognition: Chaos, Convenience, and Cause for Concern. Journal for Cultural Research. 2021;25(1):12–35. doi: 10.1080/14797585.2021.1886423. [DOI] [Google Scholar]
  44. Lewandowsky Stephan, Ballard Timothy, Oberauer Klaus, Benestad Rasmus. A Blind Expert Test of Contrarian Claims about Climate Data. Global Environmental Change. 2016;39:91–97. doi: 10.1016/J.GLOENVCHA.2016.04.013. [DOI] [Google Scholar]
  45. Lewandowsky Stephan, Cook John. The Conspiracy Theory Handbook. 2020. Retrieved from www.climatechangecommunication.org/conspiracy-theory-handbook.
  46. Lewandowsky Stephan, Cook John, Lloyd Elisabeth. The ‘Alice in Wonderland’ Mechanics of the Rejection of (Climate) Science: Simulating Coherence by Conspiracism. Synthese. 2018;195(1):175–96. doi: 10.1007/S11229-016-1198-6. [DOI] [Google Scholar]
  47. Lewandowsky Stephan, Cook John, Oberauer Klaus, Brophy Scott, Lloyd Elisabeth A, Marriott Michael. Recurrent Fury: Conspiratorial Discourse in the Blogosphere Triggered by Research on the Role of Conspiracist Ideation in Climate Denial. Journal of Social and Political Psychology. 2015;3(1):142–78. doi: 10.5964/JSPP.V3I1.443. [DOI] [Google Scholar]
  48. Lewandowsky Stephan, van der Linden Sander. Countering Misinformation and Fake News Through Inoculation and Prebunking. European Review of Social Psychology. 2021;32(2):348–84. doi: 10.1080/10463283.2021.1876983. [DOI] [Google Scholar]
  49. Lewandowsky Stephan, Lloyd Elisabeth A, Brophy Scott. When THUNCing Trumps Thinking: What Distant Alternative Worlds Can Tell Us About the Real World. Argumenta. 2017;3:217–231. doi: 10.23811/52.arg2017.lew.llo.bro. [DOI] [Google Scholar]
  50. Lewandowsky Stephan, Mann Michael E, Brown Nicholas JL, Friedman Harris. Science and the Public: Debate, Denial, and Skepticism. Journal of Social and Political Psychology. 2016;4(2):537–53. doi: 10.5964/JSPP.V4I2.604. [DOI] [Google Scholar]
  51. Lewandowsky Stephan, Oberauer Klaus. Worldview-Motivated Rejection of Science and the Norms of Science. Cognition. 2021;215:104820. doi: 10.1016/J.COGNITION.2021.104820. [DOI] [PubMed] [Google Scholar]
  52. Lewandowsky Stephan, Whitmarsh Lorraine. Climate Communication for Biologists: When a Picture Can Tell a Thousand Words. PLOS Biology. 2018;16(10):e2006004. doi: 10.1371/JOURNAL.PBIO.2006004. [DOI] [PMC free article] [PubMed] [Google Scholar]
  53. Loomba Sahil, de Figueiredo Alexandre, Piatek Simon J, de Graaf Kristen, Larson Heidi J. Measuring the Impact of COVID-19 Vaccine Misinformation on Vaccination Intent in the UK and USA. Nature Human Behaviour. 2021;5(3):337–48. doi: 10.1038/s41562-021-01056-1. 2021 5:3. [DOI] [PubMed] [Google Scholar]
  54. Lowes Sara, Montero Eduardo. The Legacy of Colonial Medicine in Central Africa. American Economic Review. 2021;111(4):1284–1314. [Google Scholar]
  55. Ma Manuel A, Gutiérrez Dora E, Frausto Joanna M, Al-Delaimy Wael K. Minority Representation in Clinical Trials in the United States: Trends Over the Past 25 Years. Mayo Clinic Proceedings. 2021;96(1):264–66. doi: 10.1016/J.MAYOCP.2020.10.027. [DOI] [PubMed] [Google Scholar]
  56. Mak Winnie WS, Law Rita W, Alvidrez Jennifer, Pérez-Stable Eliseo J. Gender and Ethnic Diversity in NIMH-Funded Clinical Trials: Review of a Decade of Published Research. Administration and Policy in Mental Health. 2007;34(6):497–503. doi: 10.1007/S10488-007-0133-Z. [DOI] [PubMed] [Google Scholar]
  57. Mann Michael E. The Hockey Stick and the Climate Wars: Dispatches from the Front Lines. Columbia University Press; New York: 2012. [Google Scholar]
  58. McKee Martin, Diethelm Pascal. How the Growth of Denialism Undermines Public Health. BMJ. 2010;341(7786) doi: 10.1136/BMJ.C6950. [DOI] [PubMed] [Google Scholar]
  59. McKee Martin, Stuckler David. Scientific Divisions on Covid-19: Not What They Might Seem. BMJ. 2020;371 doi: 10.1136/BMJ.M4024. [DOI] [PubMed] [Google Scholar]
  60. Monbiot George. It’s Shocking to See so Many Leftwingers Lured to the Far Right by Conspiracy Theories. The Guardian. 2021 Available from www.theguardian.com. [Google Scholar]
  61. Motta Matt, Stecula Dominik, Farhart Christina. How Right-Leaning Media Coverage of COVID-19 Facilitated the Spread of Misinformation in the Early Stages of the Pandemic in the U.S. Canadian Journal of Political Science/Revue Canadienne de Science Politique. 2020;53(2):335–42. doi: 10.1017/S0008423920000396. [DOI] [Google Scholar]
  62. Norheim Ole F, Abi-Rached Joelle M, Bright Liam Kofi, Bærøe Kristine, Ferraz Octávio LM, Gloppen Siri, Voorhoeve Alex. Difficult Trade-Offs in Response to COVID-19: The Case for Open and Inclusive Decision Making. Nature Medicine. 2020;27(1):10–13. doi: 10.1038/s41591-020-01204-6. [DOI] [PubMed] [Google Scholar]
  63. Oreskes Naomi, Conway Erik M. Merchants of Doubt: How a Handful of Scientists Obscured the Truth on Issues from Tobacco Smoke to Climate Change. Bloomsbury Publishing; London, UK: 2010. [Google Scholar]
  64. Pei Xin, Mehta Deval. #Coronavirus or #Chinesevirus?!: Understanding the Negative Sentiment Reflected in Tweets with Racist Hashtags across the Development of COVID-19. 2020. Retrieved from https://arxiv.org/abs/2005.08224.
  65. Pew Research Center. Public Praises Science; Scientists Fault Public, Media. 2009. Available from https://www.pewresearch.org/politics/2009/07/09/public-praises-science-scientists-fault-public-media/
  66. Reardon Sara. Decrying CIA Vaccination Sham, Health Workers Brace for Backlash. Science. 2011;333(6041):395. doi: 10.1126/SCIENCE.333.6041.395. [DOI] [PubMed] [Google Scholar]
  67. Roozenbeek Jon, Schneider Claudia R, Dryhurst Sarah, Kerr John, Freeman Alexandra LJ, Recchia Gabriel, van der Bles Anne Marthe, van der Linden Sander. Susceptibility to Misinformation about COVID-19 around the World. Royal Society Open Science. 2020;7(10) doi: 10.1098/RSOS.201199. [DOI] [PMC free article] [PubMed] [Google Scholar]
  68. Schimmenti Adriano, Billieux Joёl, Starcevic Vladan. The Four Horsemen of Fear: An Integrated Model of Understanding Fear Experiences during the COVID-19 Pandemic. Clinical Neuropsychiatry. 2020;17(2):41–45. doi: 10.36131/CN20200202. [DOI] [PMC free article] [PubMed] [Google Scholar]
  69. Schmid Philipp, Betsch Cornelia. Effective Strategies for Rebutting Science Denialism in Public Discussions. Nature Human Behaviour. 2019;3(9):931–39. doi: 10.1038/s41562-019-0632-4. [DOI] [PubMed] [Google Scholar]
  70. Serafini G, Parmigiani B, Amerio A, Aguglia A, Sher L, Amore M. The Psychological Impact of COVID-19 on the Mental Health in the General Population. QJM: Monthly Journal of the Association of Physicians. 2020;113(8):229–35. doi: 10.1093/QJMED/HCAA201. [DOI] [PMC free article] [PubMed] [Google Scholar]
  71. Simonov Andrey, Sacher Szymon K, Dubé Jean-Pierre H, Biswas Shirsho. The Persuasive Effect of Fox News: Non-Compliance with Social Distancing During the Covid-19 Pandemic. National Bureau of Economic Research; 2020. [DOI] [Google Scholar]
  72. Thomson Stephen, Ip Eric C. COVID-19 Emergency Measures and the Impending Authoritarian Pandemic. Journal of Law and the Biosciences. 2020;7(1):1–33. doi: 10.1093/JLB/LSAA064. [DOI] [PMC free article] [PubMed] [Google Scholar]
  73. van Barneveld Kristin, Quinlan Michael, Kriesler Peter, Junor Anne, Baum Fran, Chowdhury Anis, Junankar PN, Clibborn Stephen, Flanagan Frances, Wright Chris F, Friel Sharon, et al. The COVID-19 Pandemic: Lessons on Building More Equal and Sustainable Societies. The Economic and Labour Relations Review. 2020;31:133–157. doi: 10.1177/1035304620927107. [DOI] [Google Scholar]
  74. van der Linden Sander, Leiserowitz Anthony, Rosenthal Seth, Maibach Edward. Inoculating the Public against Misinformation about Climate Change. Global Challenges. 2017;1(2):1600008. doi: 10.1002/GCH2.201600008. [DOI] [PMC free article] [PubMed] [Google Scholar]
  75. van Prooijen Jan-Willem. The Psychology of Conspiracy Theories. Routledge; London, UK: 2018. [Google Scholar]
  76. van Prooijen Jan-Willem, Krouwel André PM, Pollet Thomas V. Political Extremism Predicts Belief in Conspiracy Theories. Social Psychological and Personality Science. 2015;6(5):570–78. doi: 10.1177/1948550614567356. [DOI] [Google Scholar]
  77. Ward Bob. Organisers of Anti-Lockdown Declaration Have Track Record of Promoting Denial of Health and Environmental Risks. Grantham Research Institute on Climate Change and the Environment; 2020. Available from www.lse.ac.uk/granthaminstitute/ [Google Scholar]
  78. Ward Jeremy K, Alleaume Caroline, Peretti-Watel Patrick, Seror Valérie, Cortaredona Sébastien, Launay Odile, Raude Jocelyn, Verger Pierre, Beck François, Legleye Stéphane, L’Haridon Olivier, et al. The French Public’s Attitudes to a Future COVID-19 Vaccine: The Politicization of a Public Health Issue. Social Science & Medicine. 2020;265:113414. doi: 10.1016/J.SOCSCIMED.2020.113414. [DOI] [PMC free article] [PubMed] [Google Scholar]
  79. Wondreys Jakub, Mudde Cas. Victims of the Pandemic? European Far-Right Parties and COVID-19. Nationalities Papers. 2020:1–18. doi: 10.1017/NPS.2020.93. [DOI] [Google Scholar]
  80. Yates Isabelle, Byrne Jennifer, Donahue Susan, McCarty Linda, Mathews Allison. Representation in Clinical Trials: A Review on Reaching Underrepresented Populations in Research. Clinical Researcher. 2020;34(7) Available at https://acrpnet.org/ [Google Scholar]
  81. Zarocostas John. How to Fight an Infodemic. The Lancet. 2020;395(10225):676. doi: 10.1016/S0140-6736(20)30461-X. [DOI] [PMC free article] [PubMed] [Google Scholar]