Abstract
Science denial has a long history of causing harm, and it continues to do so in contemporary society when left unaddressed. Recent discussions of science denial suggest that correcting people’s false beliefs rarely eliminates adherence to those beliefs and can even strengthen them, a phenomenon called the backfire effect. This paper brings the backfire effect within the context of science denial to the attention of science education researchers and practitioners and discusses the potential role(s) of epistemic understanding of knowledge production in science in dealing with the rejection of scientific evidence and claims in science classrooms. Using epistemic understanding of knowledge production in science with a focus on avoiding the backfire effect may increase the potential for science education research to produce fruitful strategies which advance students’ attitudes toward science and deepen students’ understanding of how science works through divergent perspectives. Several areas deserve focused investigation for their potential to combat science denial and the backfire effect while foregrounding the role(s) of epistemic understanding of knowledge production in science instruction. These areas include expanding ways of knowing while marking the boundary between the scientific way of knowing and other ways of knowing, comparing claims and arguments that derive from different frameworks, teaching about the power and limitations of science, and bringing different and similar ways science is done to students’ attention.
Introduction
There has been increased attention paid to science denial in both educational and social contexts (Hansson 2017b; Liu 2012; Rosenau 2012). Science denial is defined as “the systematic rejection of empirical evidence to avoid [personally and subjectively] undesirable facts or conclusions” (Liu 2012, p. 129). Typical examples of science denial include denial of climate change, relativity theory, evolution, the origin of life, AIDS, vaccination, and tobacco-related disease. Science denial is a social phenomenon, and it is one form of pseudo-science (Bardon 2020). Another form is called pseudo-theory promotion. While science denial is coloured by a growing antipathy towards particular scientific theories and the rejection of some parts of science (e.g., denial of climate change, evolution, continental drift, the origin of life, or relativity theory), pseudo-theory promotion is based on attempts to construct personal theories or claims (e.g., transcendental meditation, astrology, herbal medicine, or iridology) (Hansson 2017b). Hansson (2017b, pp. 43–44) outlined ten sociological characteristics shared by science denialists and pseudo-theory promoters, as listed in Table 1.
Table 1. Sociological characteristics of science denial and pseudo-theory promotion (Hansson 2017b, pp. 43–44)

| Characteristic | Prominence |
|---|---|
| Considering the target theory as a threat (e.g., evolution theory is considered a threat to traditional religion) | Primarily prominent in science denial |
| Finding the target theory complex and difficult to understand (pedagogical difficulty in understanding evidence built on interdisciplinary data, e.g., climate science) | Primarily prominent in science denial |
| Engaging in personal attacks on legitimate scientists (e.g., the anti-relativists of the 1920s and 1930s who prevented Einstein from visiting Germany) | Primarily prominent in science denial |
| Lacking competence in conducting scientific research or teaching science (among the opponents of climate science and evolution theory, the participation of competent scientists has been small) | Prominent in both science denial and pseudo-theory promotion |
| Failing to publish in peer-reviewed scientific journals | Prominent in both science denial and pseudo-theory promotion |
| Blaming conspiracy theories for failing to publish in scientific journals and gain recognition (e.g., seeing relativity theory as part of a larger Jewish conspiracy and believing that the prestigious physics journals are under Jewish control) | Prominent in both science denial and pseudo-theory promotion |
| Targeting the public (denialists tend to disseminate their views through outlets intended for the public) | Prominent in both science denial and pseudo-theory promotion |
| Giving a false impression of having support in the scientific community (denialists create institutes, conferences, and journals to impress the public, such as The Academy of Nations and The Creation Research Society) | Prominent in both science denial and pseudo-theory promotion |
| Having a denialist literature dominated by males (women are less likely to take part in the activities of evolution and climate change denial) | More prominent in science denial, less prominent in pseudo-theory promotion |
| Having strong political connections (e.g., Nazi newspaper attacks against relativity theory, evolution denial dominated by the Christian right wing, and climate change denial dominated by more business-oriented right-wing politics) | More prominent in science denial, less prominent in pseudo-theory promotion |
Science denial is slightly different from pseudo-theory promotion (Hansson 2017b). The most important difference between the two is that while the fabrication of false controversies is a standard practice in science denial, most cases of pseudo-theory promotion do not engage in producing fake controversies (Hansson 2017a). In contrast, pseudo-theory promotion tends to avoid controversies with science and describes its claims as compatible with and conformable to science (Hansson 2017a, b). In this paper, distinguishing and comparing science denial and pseudo-theory promotion is key for two main reasons. First, this paper focuses only on science denial, owing to the ongoing discussions around bringing science denial into classrooms (e.g., Boyle 2017) and the massive spread and acceptance of conspiracy theories about scientific phenomena (e.g., climate change, the origin of life, COVID-19) in both the public and schools. Second, the discussion in this paper takes the characteristics of science denial into account to identify areas for both educators and researchers to focus on in responding to science denial in educational settings.
The purpose of this paper is to bring the backfire effect within the context of science denial to the attention of science education researchers and practitioners and to discuss the potential role(s) of epistemic understanding of knowledge production in science in dealing with the rejection of scientific evidence and claims in science classrooms. Rather than providing a road map or a list of tips and strategies to combat science denial and the backfire effect, I wish to take the reader beyond what I present and discuss here and to point to some areas open for further exploration.
Correcting Misbeliefs?
Many people resist evaluating and accepting reliable scientific evidence. One of the reasons for denying scientific evidence is that scientific ideas may threaten people’s beliefs, ideologies, and background assumptions, which are often wrong and misleading. For instance, “what predicts the denial of human-made climate change is not scientific illiteracy but political ideology” (Pinker 2018, p. 357). Adherence to personal beliefs and background assumptions, what Sandoval (2005) called personal epistemology, interferes with the acceptance of scientific facts and conclusions (Sinatra et al. 2014). One may ask whether we can change or correct people’s false beliefs. Ideally, people would adjust their assumptions when they evaluate scientific evidence that challenges their beliefs. But is this what actually happens? The answer is no. In their review of the literature on correcting misinformation, Lewandowsky et al. (2012) showed that correcting people’s false beliefs rarely eliminates adherence to those beliefs and assumptions. They also argued that even when people understand a retraction, correcting false beliefs remains ineffective (Lewandowsky et al. 2012).
One of the reasons why people fail to revise personal beliefs and assumptions is explained by the backfire effect (Ecker et al. 2017; Swire et al. 2017). The backfire effect is a cognitive bias that causes people’s background assumptions to grow stronger when they encounter contradictory evidence (Nyhan and Reifler 2010, 2015). In other words, showing people scientific claims and evidence which prove that they are wrong is often ineffective because it causes them to support their original assumptions more strongly than they previously did (Nyhan and Reifler 2010; Trevors et al. 2016). It is an important phenomenon because it derails critical thinking. The backfire effect lies at the very heart of how people negotiate between scientific ideas and their background assumptions (Sinatra et al. 2014).
In 2010, Nyhan and Reifler designed a study to test the backfire effect. The researchers created an article that included a very common misconception about certain issues in politics. Participants were first asked to read this fake article and then a second article that corrected it. Participants with a certain ideological belief strongly disagreed with the corrective article and articulated even stronger belief in the misconception from the fake article. In that study, corrections failed to reduce misconceptions among the targeted ideological group. The same researchers repeated the experiment with other controversial topics such as tax cuts and stem cell research. They concluded that corrections that contradicted participants’ beliefs caused background assumptions to grow stronger (Nyhan and Reifler 2010).
The same researchers also conducted a study that examined people’s beliefs about vaccination against the flu. They showed that when people who believe the vaccine is unsafe are provided with correct information challenging their beliefs, misconceptions about vaccination among the group increased (Nyhan and Reifler 2015). Another study examined parents’ intent to vaccinate their children (Nyhan et al. 2014). The researchers found that corrective information (pro-vaccination messages) decreased intent to vaccinate among parents who had the most negative attitudes toward vaccines. Nyhan et al. (2014) concluded that “respondents brought to mind other concerns about vaccines to defend their anti-vaccine attitudes, a response that is broadly consistent with the literature on motivated reasoning about politics and vaccines” (p. 840).
Supporting the findings of Nyhan and Reifler (2010, 2015) and Nyhan and colleagues (2014), other researchers have concluded that even though people understand the rationale for a retraction, corrections are still ineffective (Lewandowsky et al. 2012). Correcting widespread misinformation has little effect on the ways people act and think (Sides and Citrin 2007), and arguments that reinforce people’s background beliefs are favoured while those that contradict their views are disparaged (Taber and Lodge 2006). Additionally, a review of research on refutation texts in science education (Tippett 2010) showed that reading a refutation text that explicitly challenges and refutes students’ naïve conceptions seems useful for improving students’ conceptual understanding, but it also pointed out that a refutation text alone is not enough to correct students’ misconceptions.
On the other hand, some researchers (e.g., Crozier and Strange 2019; Haglin 2017; Wood and Porter 2017) have argued that the backfire effect is not as strong as had been claimed in the literature (e.g., Lewandowsky et al. 2012; Nyhan and Reifler 2015). Crozier and Strange (2019) found no evidence for a backfire effect in their study, in which they evaluated the effects of corrections on reliance on misinformation. They found that corrections can decrease individuals’ reliance on misinformation (Crozier and Strange 2019). The researchers also argued that the format of corrections (the frequency of exposure to the corrections, the simultaneous activation of the misinformation and its correction, etc.) plays a key role in their effectiveness (Crozier and Strange 2019). Replicating the Nyhan and Reifler (2015) corrective information experiment with a different population, Haglin (2017) also found no support for a backfire effect from corrections of misinformation and highlighted the importance of investigating the specific conditions and individuals affected when a suspected backfire effect occurs. Given the literature discussed, we still need more evidence to determine whether corrections are a successful strategy for combating misinformation and misbeliefs. To be clear, whether or not the backfire effect exists is not the focus of this paper. With the actual purpose of this piece in mind, I now turn to different forms of the backfire effect.
The Backfire Effect and Reasoning
Two forms of the backfire effect cause the denial of scientific knowledge: the familiarity backfire effect (Swire et al. 2017) and the overkill backfire effect (Ecker et al. 2019). The familiarity backfire effect occurs when people remember misinformation rather than its inaccuracy as a result of frequent exposure to it (Swire et al. 2017). This effect can influence the way people respond to pseudo-scientific arguments (Hansson 2017b). The overkill backfire effect occurs when people reject multiple complex scientific explanations for phenomena that are difficult to understand and process (Ecker et al. 2019). People tend to prefer simple, easy explanations; when presented with a complicated scientific explanation, the overkill backfire effect may cause them to reject that explanation and stick to their simple misconceptions (Chater 1999; Lombrozo 2007).
The backfire effect explains why people confirm their own biases even though they have heard about scientific facts and observed scientific phenomena and why they reject scientific information and create counterarguments against empirical evidence. Additionally, the backfire effect can help us understand and explain why the way science is traditionally taught is not successful at eliminating science denial. In a traditional classroom setting, students who deny scientific facts and conclusions are usually provided with complex explanations that aim to convince students and correct their false beliefs and assumptions. Science instruction should encourage students, citizens of the future, to differentiate selective use of evidence, what Hansson (2017b) called “cherry-picking” or what Sinatra et al. (2014) called “motivated reasoning”, from accuracy-oriented scientific reasoning. It does not mean that there is no motivated reasoning in science. For instance, Mizrahi (2015) discussed some examples of confirmation bias from the history of science. Rather, it means that science instruction should emphasize the differences between deliberate thoughts and intuitive thoughts as students learn about methods of reasoning (Short et al. 2019).
The understanding of scientific reasoning is one of the three dimensions of scientific literacy (Fasce and Picó 2019). It refers to a public understanding of the way(s) scientific knowledge is developed in terms of the sociological, philosophical, and historical aspects of science (Fasce and Picó 2019). Students should understand scientific reasoning and separate it from motivated reasoning. Scientific reasoning is logical in nature and proceeds according to identifiable principles. There are three main ways to decide how much confidence we should place in scientific explanations: deduction, induction, and abduction (inference to the best explanation) (Okasha 2002). These three forms of logical inference are important for understanding how we, human beings, think and how we make meaning out of the world around us. While reasoning, we look at premises and draw conclusions from them through deduction, induction, and abduction.
The first form of logical inference is deductive reasoning. With deduction, our conclusions must be true as long as the premises are true (Okasha 2002). Deductive inferences move from the general to the specific (Jaipal 2009). An example of deductive reasoning, or inference, in Okasha (2002, p. 18) is the following:
All Frenchmen like red wine.
Pierre is a Frenchman.
Therefore, Pierre likes red wine.
If the first two premises are true, then the conclusion must be true. A characteristic feature of deductive inferences of this kind is that their premises are general and their conclusions are more specific.
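In standard first-order notation (my own rendering, not Okasha’s), the syllogism has the following form, where the turnstile marks entailment: if both premises hold, the conclusion cannot fail.

```latex
% Deductive inference: a universal premise plus a particular premise entail the conclusion.
% F(x): "x is a Frenchman"; W(x): "x likes red wine".
\[
\forall x \, \big( F(x) \rightarrow W(x) \big), \qquad F(\mathrm{Pierre}) \;\vdash\; W(\mathrm{Pierre})
\]
```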
The second form of inference is inductive reasoning. In induction, the premises do not entail the conclusion (Okasha 2002). Here is an example of inductive reasoning from Okasha (2002, p. 19):
The first five eggs in the box were rotten.
All the eggs have the same best-before date stamped on them.
Therefore, the sixth egg will be rotten too.
It is possible that even if the premises of this inference are true, the conclusion can be false. The reason is that we move from specific observations about objects or events we have examined (i.e., the first five eggs) to generalizations about objects or events that we have not examined (i.e., the rest of the eggs in the box).
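Schematically (again in notation of my own rather than Okasha’s), enumerative induction projects a property from observed cases onto an unobserved case; the triple-dot "therefore" signals support rather than entailment.

```latex
% Enumerative induction: observed instances support, but do not entail, the next case.
% R(e_i): "the i-th egg is rotten".
\[
R(e_1) \wedge R(e_2) \wedge \dots \wedge R(e_5) \;\therefore\; R(e_6) \quad \text{(probable, not entailed)}
\]
```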
With deduction, we can be certain that if we begin with true premises, we will arrive at a true conclusion. With induction, we cannot be so confident because inductive inferences can take us from true premises to a false conclusion (Okasha 2002). Even though inductive reasoning is logically weaker than deductive reasoning, much scientific research and much everyday reasoning is carried out inductively. Consider the following examples from Okasha (2002). An example of inductive reasoning in everyday life is as follows.
… when you turn on your computer in the morning, you are confident it will not explode in your face. Why? Because you turn on your computer every morning, and it has never exploded in your face up to now. The premises of this inference do not entail the conclusion. (Okasha 2002, p. 20)
So how do scientists use inductive reasoning? Consider this example.
… geneticists tell us that Down’s syndrome (DS) sufferers have an additional chromosome. How do they know this? The answer, of course, is that they examined a large number of DS sufferers and found that each had an additional chromosome. They then reasoned inductively to the conclusion that all DS sufferers, including ones they had not examined, have an additional chromosome. (Okasha 2002, pp. 20–22)
Some philosophers, such as David Hume and Karl Popper, questioned the justification and importance of inductive reasoning in science, arguing that inductive inferences cannot be justified because we cannot be sure that phenomena we have not experienced will resemble those we have experienced in the past (Okasha 2002). Nevertheless, inductive reasoning is a perfectly sensible way of forming beliefs about the world around us because it can make our inferences highly probable.
The third form of logical inference is called abduction (inference to the best explanation). Abductive inference makes a jump beyond the premises similar to that of induction and, like induction, it is fallible. Consider the following example that Okasha (2002, p. 29) offers:
The cheese in the larder has disappeared, apart from a few crumbs.
Scratching noises were heard coming from the larder last night.
Therefore, the cheese was eaten by a mouse.
In this case, the premises do not entail the conclusion. However, given the available data, the inference is reasonable, and if we obtain more data, we can make the reasoning stronger. Scientists (as well as doctors and detectives) use abduction, drawing the conclusion that best explains a state of events from a set of possible scenarios rather than one based solely on the evidence provided in the premises. Within this context, the explanatory power of scientists’ theories provides strong support for their claims. In addition to these inferences, many scientific laws and theories are expressed in terms of probability (probabilistic reasoning); Mendelian genetics, for example, holds that any given gene in your mother (and father) has a 50% chance of being passed on to you. “Probability provides a continuous scale from poor theories with low probability to good theories with high probability” (Lakatos 1998, p. 22). The importance of probabilistic reasoning in understanding and accepting polarizing scientific ideas (e.g., evolution) is also highlighted in the literature (e.g., Fiedler et al. 2019; Lenormand et al. 2009).
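One common formalization of this interplay between abduction and probability, though not one that Okasha (2002) or Lakatos (1998) spells out, is Bayes’ theorem, which scores each candidate explanation H by how well it accounts for the evidence E:

```latex
% Bayes' theorem: the posterior probability of hypothesis H given evidence E.
\[
P(H \mid E) = \frac{P(E \mid H)\, P(H)}{P(E)}
\]
% Mendelian example: each parental allele is transmitted with probability 1/2,
% so the chance that a given maternal gene is also in you is 0.5.
```

On this reading, “a mouse ate the cheese” wins because it gives the crumbs and the scratching noises a far higher likelihood P(E | H) than rival explanations do.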
Learning about the three forms of logical inference discussed above is important for distinguishing between motivated reasoning and scientific reasoning and for addressing science denial. As Hand et al. (1999) suggested, logical reasoning matters because “science distinguishes itself from other ways of knowing and from other bodies of knowledge through the use of empirical standards, logical arguments, and scepticism to generate the best temporal explanations possible about the natural world” (p. 1023). The way we make inferences through deduction, induction, and abduction shows that even though scientific knowledge is tentative and uncertain, it is highly probable and subject to change as we collect more evidence (Hand et al. 1999; Okasha 2002). In contrast, motivated reasoning relies on selectively interpreting evidence and leads to preferred inferences.
Making logical inferences while evaluating claims and evidence is one of the critical thinking abilities (Paul 1995). As one might infer from the nature of science literature, students have limited ability to evaluate scientific claims and evidence. One reason is that K-12 science instruction rarely engages students in the aspects of scientific inquiry and practice concerned with evaluating the strengths and limitations of evidence and developing scientific arguments (Banilower 2019). Banilower (2019) reports an illustrative finding from the 2018 NSSME+ as follows:
Fewer than a quarter of secondary science classes have students, at least once a week, pose questions about scientific arguments, evaluate the credibility of scientific information, identify strengths and limitations of a scientific model, evaluate the strengths and weaknesses of competing scientific explanations, determine what details about an investigation might persuade a targeted audience about a scientific claim, or construct a persuasive case. (Banilower 2019, p. 204)
The absence of logical inference may add strength to the backfire effect by leading to the retrieval of thoughts that support one’s background beliefs and assumptions. It means that “when we think we are reasoning, we may instead be rationalizing” (Mooney 2011, para. 11). Rationalization involves deciding what evidence to accept based on the preferred conclusion, that is, motivated reasoning (Bardon 2020). In contrast, scientific reasoning requires using critical thinking skills to determine which explanation(s) represents the best answer to our question based on evidence (Lawson 1999).
As discussed earlier, when we encourage students to engage in evaluating evidence that has the potential to threaten their background assumptions and beliefs, science denial might become more entrenched. One reason is that people tend to look for evidence that confirms their beliefs and background assumptions (Druckman and McGrath 2019). Given this point, one may ask whether we should avoid discussing scientific evidence that may conflict with students’ worldviews while teaching controversial topics in science, so as not to enable science denial. How can science educators address science denial in the classroom? How can they make scientific claims and evidence sticky so that students remember what they read or observe and try to evaluate their background assumptions? The answers to these questions are complicated. Regarding these questions, the following paragraphs discuss the intersections between the ways science should be taught and the suggestions for addressing science denial and the backfire effect.
Science Denial, the Backfire Effect and Science Teaching
It seems that pedagogical suggestions for avoiding the backfire effect and dealing with science denial are inconclusive and contradictory. Given the strong relationship between background assumptions and the denial or acceptance of science (Mazur 2004), Nyhan and Reifler (2010) and Cook and Lewandowsky (2011) suggested that when educators present counter-evidence, they should acknowledge students’ background assumptions (e.g., political ideologies, religious beliefs). On the other hand, there are suggestions for discussing controversial issues that avoid engaging students’ background assumptions. Consider the following excerpt on the need for care when teaching about climate change:
… in a polarized political landscape, talking about politicians and the decisions they make is counterproductive. Students may put their guard up, thinking that I’m partisan, and tune me out when I’m lecturing about other things, such as climate modeling. So, I made a conscious decision to change my approach to teaching the subject. As part of my modified strategy, I joined a local bipartisan group that aims to bring people together by emphasizing the potential consequences, rather than causes, of climate change. (Kannan 2019, p. 1042)
This example suggests that leaving politics out of the classroom while discussing polarizing issues in science is considered an important way to prevent science denial and to avoid threatening students’ worldviews. So, should we acknowledge students’ background assumptions or not? It is not clear how educators should go about reconciling this conflicting advice in their classrooms.
Another example of contradictory advice to educators can be seen in Cook and Lewandowsky (2011). The authors suggested that if teachers aim to debunk misbeliefs about scientific phenomena, they should begin by emphasizing the scientific facts, not the misbeliefs. The goal should be to increase students’ familiarity with scientific facts (Cook and Lewandowsky 2011). Even though this advice seems well suited to combating the familiarity backfire effect discussed earlier, it may still invite the general backfire effect described by Nyhan and Reifler (2010, 2015) and Nyhan and colleagues (2014).
Moreover, when we compare what the literature says about how to teach science and what to teach about science with the suggested ways of avoiding the backfire effect and science denial, we see conflicting ideas. Duschl and Osborne (2002), for instance, argued that science instruction should focus on “how we know what we know and why we believe the beliefs of science to be superior or more fruitful than competing viewpoints” (Duschl and Osborne 2002, p. 43). Even though this statement points to the importance of the epistemic aspect of understanding scientific practices, it seems to neglect what might happen when students are told that the scientific way of knowing is superior to other ways of knowing, thereby possibly triggering a backfire effect.
Emphasizing the role(s) of an epistemic understanding of knowledge production in science might be a fruitful way to avoid the backfire effect while learning and teaching polarizing scientific issues. Using Duschl’s (2008) framing of the epistemic and conceptual aspects of science learning, I define the epistemic understanding of knowledge production in science as the consideration of multiple perspectives and contexts (social, cultural, historical, linguistic, etc.) while evaluating or challenging evidence and claims. The integration of the epistemic understanding of how to develop and evaluate scientific knowledge into scientific practices is one of the more important goals for science learning defined by Duschl (2008). This goal can be accomplished by facilitating a dialogical discourse through which learners have a chance to evaluate claims and evidence to make inferences about the natural world (Duschl 2020). Even though the literature on the importance of epistemic understanding in science classrooms is well established, its potential role in preventing or fostering science denial and the backfire effect has not been adequately discussed in the field of science education. There are some areas that need to be focused on and investigated for their potential to combat science denial and the backfire effect while foregrounding the role(s) of the epistemic understanding of knowledge production in science instruction. These areas include expanding ways of knowing while marking the boundary between the scientific way of knowing and other ways of knowing, comparing claims and arguments that derive from different frameworks, teaching about the power and limitations of science, and bringing different and similar ways science is done to students’ attention.
First, educators can encourage expanding ways of knowing while, at the same time, marking the boundary between the scientific way of knowing and other ways of knowing. Expanding ways of knowing involves acknowledging knowledge that is not only empirical but also value-based and cultural. The scientific way of knowing produces knowledge (I will call this type of knowledge scientific knowledge) through specific practices (observation, experimentation, logical inference, etc.). Scientific knowledge tries to explain the natural world by focusing on individual parts. On the other hand, traditional knowledge, indigenous knowledge, or local knowledge (I use these terms interchangeably here) refers to other ways of knowing embedded in the cultural traditions, beliefs, and attitudes of specific communities. The production of this type of knowledge also includes observations, predictions, and problem-solving (Snively and Corsiglia 2001). However, the way traditional knowledge is produced is not always systematic, and traditional ways of knowing try to understand the natural world more holistically by observing the interactions between all of the parts of a phenomenon. Consider this example. Cobern and Loving (2001) shared the following conversation between a researcher working at a scientific station on a South Pacific island and an indigenous islander:
The islander commented that Westerners only think they know why the ocean rises and falls on a regular basis. They think it has to do with the moon. They are wrong. The ocean rises and falls as the great sea turtles leave and return to their homes in the sand. The ocean falls as the water rushes into the empty nest. The ocean rises as the water is forced out by the returning turtles. (Cobern and Loving 2001, p. 51)
As another example of other ways of knowing, Foucault (1970) mentioned a Chinese encyclopaedia in which animals are divided into groups: “(a) belonging to the Emperor, (b) embalmed, (c) tame, (d) sucking pigs, (e) sirens, (f) fabulous, (g) stray dogs, (h) included in the present classification, (i) frenzied, (j) innumerable, (k) drawn with a very fine camelhair brush, (l) etcetera, (m) having just broken the water pitcher, and (n) that from a long way off look like flies” (p. 16). For another example, the Tao (or Yami) people, an indigenous group living on Orchid Island (Lanyu) near south-east Taiwan, have a different taxonomy in which fish are grouped into two main classes: edible and inedible fish (Wang 2012). The inedible fish include fish without scales, such as eels. The edible fish are further divided into different groups: old people fish (only to be consumed by elders), men fish (prohibited to women), and women fish (for all to consume). This kind of classification is based on the different purposes fish serve in the community. The indigenous classification method is motivated by the protection of natural diversity and the ecosystem, while scientific classification aims to inform the user as to what the relatives of the taxon are hypothesized to be (M.-Y. Lin, personal communication, September 14, 2020). For instance, the reason Tao people do not eat eels (and classify them as inedible) is that eels dredge the headwaters of the taro fields and hunt pests (Wang 2012). These three examples of other ways of knowing show that knowledge is produced within specific contexts, with specific purposes, and with specific methods.
The literature in the sciences and science education has emphasized and valued expanding ways of knowing and marking the boundary between the scientific way of knowing and other ways of knowing, but without focusing on science denial and the backfire effect. As an example of acknowledging other ways of knowing, Behrens (1989) examined the correspondence between the soil categories of the Shipibo, an indigenous group in the Peruvian Amazon, and Western pedology (a branch of soil science) to understand soil-plant associations and agricultural productivity. There are also many studies about how educators can acknowledge different ways of knowing in their science teaching practices (see Barba 1995; Loving 1991; Ogawa 1995). Ogawa (1995), for instance, argued that bringing a multiscience perspective into science classrooms helps students understand more than one view simultaneously and discuss how and why some natural phenomena can be interpreted similarly or differently in different contexts. For another example, Loving (1991) proposed a model called the Scientific Theory Profile to help science teachers develop an understanding of the nature of science and evaluate scientific explanations and theories within cultural contexts. Even though these studies provide insights into what expanding ways of knowing might look like in practice and how it might facilitate the epistemic understanding of knowledge production in science, they do not discuss the potential of such approaches to foster, rather than avoid, science denial and the backfire effect.
The proponents of diverse perspectives in explaining natural phenomena argue that the scientific way of knowing and other ways of knowing should be viewed as co-existing or parallel rather than competing viewpoints (e.g., Cobern and Loving 2001; Snively and Corsiglia 2001). This is true. One reason is that different ways of knowing might be useful in different social or cultural contexts and lead to different consequences and decision-making processes (Feinstein and Waddington 2020). It is also important to note that these different ways of knowing are not equivalent: knowledge-building encompasses multiple origins, practices, logics, rationales, and methods. The intent of this paper is not to settle whether other ways of knowing should be classified as scientific knowledge or science; the answers to this question in the science education literature are not in agreement with one another (for detailed discussions see Cobern and Loving 2001; Snively and Corsiglia 2001; Southerland 2000; Stanley and Brickhouse 1994).
Potential Impact on Students’ Learning
What we educators can do by expanding ways of knowing is to embrace epistemological pluralism together with the ability to wisely differentiate scientific knowledge from other ways of knowing in light of logical inference, use of evidence, systematic observation, etc. (Cobern and Loving 2001). By doing so, educators provide a way of distinguishing reliable knowledge claims from unreliable ones (Laudan 1996). Different ways of knowing can contribute to our explanations about the world (Snively and Corsiglia 2001) and work in concert because different ways of knowing may be important in different situations. Expanding ways of knowing provides students with a chance to see how the practice of science may utilize the insights of another domain of knowledge (Cobern and Loving 2001). Science instruction should “value knowledge in its many forms and from its many sources” (Cobern and Loving 2001, p. 63) so that students feel free to bring different perspectives and ways of knowing to their classroom and discuss them.
Second, students should be able to compare claims and arguments that derive from different frameworks or domains of knowledge. To do so, it is important to know how to engage in scientific practices such as making inferences, generating and evaluating explanations, and making observations. Teaching students about “methods for posing questions about science, scientific models for serious thinking about science, understandings about aspects of scientific inquiry, and a sceptical orientation regarding ways that science is characterized in curriculum materials and instruction” might be a good way to guide them to develop and evaluate arguments and counterarguments (Kelly 2014, p. 1368).
Constructing a counterargument that successfully weakens the force of others’ arguments is a challenging task for students (Kuhn 2010). In her study, Kuhn highlighted two important implications for learning and teaching about scientific argumentation: (a) students should be encouraged to develop alternative arguments based on evidence against an opponent’s argument rather than merely critiquing the opponent’s arguments and threatening their beliefs and assumptions; and (b) there are two main ways of making use of evidence in argumentation: the support strategy—using the evidence to support one’s claim, and the challenge strategy—using the evidence to challenge the other’s claim. Educators tend to avoid using the term argument in the classroom for fear that it may carry negative connotations in students’ minds. However, developing arguments and counterarguments is a key component of critical thinking, and it creates an opportunity for students to make use of their skills of analysis, synthesis, and evaluation (Osborne and Patterson 2011). An example that fits this argument is the curriculum introduced in 2016 in Finland, which requires students to think critically and to interpret and evaluate all the information they encounter across all subjects. Henley (2020) reports on how the national curriculum aims to accomplish this goal in Finland as follows:
In maths lessons, … pupils learn how easy it is to lie with statistics. In art, they see how an image’s meaning can be manipulated. In history, they analyse notable propaganda campaigns, while Finnish language teachers work with them on the many ways in which words can be used to confuse, mislead, and deceive. (Henley 2020, para. 4)
This is one way of providing students with the necessary skills and methods to evaluate claims and evidence without leading to any conflicts and threats. As reported by Henley from his personal communication with Mikko Salo, a member of the European Union’s independent high-level expert group on fake news, “It’s about trying to vaccinate against problems, rather than telling people what’s right and wrong. That can easily lead to polarisation” (Henley 2020, para. 23).
Third, students should learn about both the power and the limitations of science to engage with the epistemic aspect of knowledge production in science. Even though the programme of study for 14–16-year-old students in England acknowledges that students should be taught about the “power and limitations” of science (Department of Education 2014, p. 5), it is argued in the literature that school science does not explicitly and efficiently teach that argumentation is associated with uncertainty—being unsure and lacking knowledge or evidence (Chen et al. 2019). Researchers have shown that an individual’s political attitudes, beliefs, and worldviews are strongly related to their level of tolerance of uncertainty (Jost et al. 2003; Pennycook et al. 2012). For instance, conservatives are less likely to tolerate uncertainty (Deppe et al. 2015). (A caveat should be noted: denial is not a problem for conservatives alone. Kahan et al. (2011) found that liberals are less likely to accept a hypothetical expert consensus on nuclear waste disposal and handgun regulations.) Uncertainty is one of the factors that trigger the science denial educators encounter while teaching and learning about hot-button issues. Chen et al. (2019) proposed a way of productively managing uncertainty in the classroom: raising uncertainty—expressing confusion and seeking other ideas to problematize a phenomenon; maintaining uncertainty—facilitating a discussion through which students can deepen their scientific reasoning with evidence; and reducing uncertainty—synthesizing alternative ideas, looking for inconsistencies among them, and connecting them to each other. This approach helps teachers facilitate students’ epistemic understanding of knowledge production to manage uncertainty and prevents students from resorting to motivated reasoning.
Lastly, science educators can bring the different and similar ways science is done to their students’ attention to emphasize epistemic understanding. For instance, the historical sciences (e.g., palaeontology, historical geology, archaeology) and the experimental sciences (e.g., physics, chemistry, astronomy) use distinct ways of producing scientific knowledge and reasoning. Historical sciences focus on explaining observable phenomena in terms of unobservable causes by using retrodiction, abduction, reasoning from analogy, and multiple working hypotheses (Gray 2014). In contrast, experimental sciences make predictions and test them in controlled laboratory settings, focusing on hypotheses, experiments, controls, and variables. In addition to these differences, it is also important to highlight that even though historical hypotheses and methods are usually associated with fields such as palaeontology and archaeology, they also appear in geology, planetary science, astronomy, and astrophysics—for example, continental drift, the meteorite-impact extinction of the dinosaurs, and the big bang origin of the universe (Cleland 2001). The epistemological and methodological differences and similarities between the historical and experimental sciences are important, since background assumptions and beliefs about historical science claims can have important consequences (e.g., creationist critiques of evolution) (Gray 2014). The fact that the historical sciences cannot replicate unobservable causes in laboratory settings does not mean that the way historical scientists do science is inferior to the way experimental sciences produce knowledge and make inferences (Cleland 2001), nor that the historical sciences are more subject to denial.
For another example of different ways of doing science, scientists working on the same problem and with the same data can arrive at different conclusions. In a recent study (Silberzahn et al. 2018), 29 research teams (a total of 61 researchers) from 13 countries, with research backgrounds including psychology, statistics, research methods, economics, sociology, linguistics, and management, were given the same data set and asked to answer the same question: whether soccer referees are more likely to give red cards to dark-skin-toned players than to light-skin-toned players. Twenty of the teams found a statistically significant relationship between a player’s skin color and the likelihood of receiving a red card; nine teams found no significant relationship at all. The researchers came to different conclusions because they used different statistical models and took different variables from the data set into account. Their analyses involved somewhat subjective decisions about the best statistical model to use and which variables to include. Silberzahn et al. (2018) concluded that “many subjective decisions are part of the research process and can affect the outcomes” (p. 354). As an important consequence, this variability in analytic approaches and conclusions is likely to affect decision-making processes. With this illustrative example in mind, it is important for teachers to consider the different analytical tools and methodologies used in science, and how these differences lead to diverse viewpoints, while they engage students in using and interpreting scientific evidence and making inferences in classrooms.
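A toy simulation can make this specification-dependence concrete. The sketch below is illustrative only: it does not use the Silberzahn et al. (2018) data, and all variable names and numbers are invented. It fits two defensible logistic-regression specifications to the same synthetic referee data; whether the “skin-tone effect” looks significant depends on which covariates the analyst chooses to include.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(42)
n = 5000

# Invented data: skin tone has NO direct effect on red cards here,
# but it correlates with playing position, which does affect red cards.
skin_tone = rng.binomial(1, 0.3, n)              # 1 = darker skin tone
p_defender = np.where(skin_tone == 1, 0.7, 0.3)  # confounded covariate
defender = rng.binomial(1, p_defender)
true_logit = -3.0 + 1.0 * defender               # only position drives red cards
red_card = rng.binomial(1, 1.0 / (1.0 + np.exp(-true_logit)))

# Analyst A regresses red cards on skin tone alone.
X_a = sm.add_constant(skin_tone.astype(float))
fit_a = sm.Logit(red_card, X_a).fit(disp=0)

# Analyst B also adjusts for playing position.
X_b = sm.add_constant(np.column_stack([skin_tone, defender]).astype(float))
fit_b = sm.Logit(red_card, X_b).fit(disp=0)

print(f"Analyst A: skin-tone coef = {fit_a.params[1]:+.2f}, p = {fit_a.pvalues[1]:.3f}")
print(f"Analyst B: skin-tone coef = {fit_b.params[1]:+.2f}, p = {fit_b.pvalues[1]:.3f}")
```

With the same data, Analyst A typically reports a significant positive association while Analyst B does not; neither has made an arithmetic error, they have simply made different defensible modelling choices.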
The four areas discussed above are promising and are open to further investigation to evaluate their potential to combat science denial and the backfire effect while facilitating the epistemic understanding of how we know and what we know about the natural world around us. These areas are important because they can address the sociological characteristics of science denial(ists), such as considering scientific theories as threats, finding scientific ideas difficult to understand, and disseminating false beliefs, assumptions, and ideologies to the public (see Table 1), and they provide some insights into how to deal with science denial and the backfire effect. For instance, expanding ways of knowing can take the familiarity backfire effect into account while providing students with diverse perspectives on the same phenomenon. In encountering different ways of knowing, students have a chance to access and discuss a vast array of ideas instead of being exposed to the same (mis)beliefs repeatedly. Moreover, if students would like to challenge some ideas, they need to learn how to develop counterarguments based on evidence rather than solely targeting other ideas because those ideas contradict their background assumptions. Additionally, teaching students how knowledge is produced (different ways of logical reasoning, different methodologies, etc.) before teaching them the scientific ideas themselves may prevent the overkill backfire effect. To do so, educators can explain why there are multiple explanations of the same phenomenon and why the ways science is done can seem to be complicated processes that may lead to uncertainty or inconclusive evidence. Most importantly, zooming in on these four areas can potentially provide learners, as scientifically literate citizens, with opportunities to reflect on their background assumptions, beliefs, ideologies, and cultural resources while negotiating and distinguishing between different ways of knowing and evaluating the credibility of claims and evidence.
Conclusions and Discussion
With a focus on science denial, this paper brings the backfire effect to the attention of science educators and science education researchers and discusses the potential role(s) of epistemic understanding of knowledge production in science in dealing with the rejection of scientific evidence and claims in science classrooms. In order to investigate the potential role(s) of epistemic understanding of knowledge production in confronting the denial of scientific ideas and mitigating the influence of the backfire effect, the current paper suggests taking a close look at expanding ways of knowing while marking the boundary between the scientific way of knowing and other ways of knowing, comparing claims and arguments that derive from different domains of knowledge, recognizing the power and limitations of science, and learning about the different ways science is done.
Given these four areas for seeking effective ways of dealing with science denial in science classrooms, it may seem that the suggested areas for further exploration are based on the nature of science rather than on specific ways of combating the backfire effect. There are two main reasons for that. First, the literature on debunking misinformation and avoiding the backfire effect has offered contradictory advice (e.g., emphasizing scientific facts rather than (mis)beliefs vs. acknowledging students’ beliefs). This literature also falls short of providing educators with practical ways of implementing these strategies. For example, how can educators acknowledge students’ beliefs and values while presenting a counterargument or scientific fact? How can educators balance a discussion of different ways of knowing without opening the door to science denial? What forms of knowing or knowledge production should be admitted to science classrooms? Should educators care about the correctness of different ways of knowing at all? Or should they focus on how different ways of knowing are useful in different contexts?
Second, even though cognition-oriented research findings in the field of science education (e.g., conceptual change pedagogies such as cognitive conflict pedagogies) have provided insights into the processes by which students reconstruct their knowledge and understanding (Chinn and Malhotra 2002; diSessa 1993; Vosniadou 2002), we still do not know what steps students follow to achieve a meaningful conflict while they reconstruct their prior knowledge, beliefs, and values (Limón 2001). As an example, despite the fact that cognitive conflict—confronting learners with contradictory information—has a long history as a suggested strategy for supporting learning and teaching in science education, it has had less success in classroom implementations than expected and has also produced conflicting results (e.g., Limón and Carretero 1997). One reason is that many educators do not know how to facilitate a meaningful cognitive conflict in classrooms (Limón 2001). Several models and theories of conceptual change focus only on the cognitive processes of individuals and underestimate the importance of epistemological beliefs, values, attitudes, and reasoning strategies (Limón 2001). Moreover, these models and theories seem to neglect the consequences of inducing conflict by providing anomalous and contradictory information, precisely the situations that ignite the backfire effect. The perspectives from these two areas, the literature on debunking misinformation and the literature on how students reconstruct their knowledge through meaningful conflict, might be complementary, but neither is sufficient alone to provide fruitful strategies for avoiding the backfire effect and science denial and for promoting meaningful conflict while learning and teaching about controversial issues in science.
With regard to the potentially fruitful areas discussed earlier, the epistemic understanding of knowledge production in science is not a panacea or a one-size-fits-all solution. However, it does seem well placed to lead students to consider different perspectives and sources of knowledge and knowing on polarizing scientific issues rather than dismissing ideas that contradict their knowledge, beliefs, and values. Limitations exist in terms of the role of researchers and educators in addressing science denial and the backfire effect while facilitating epistemic understanding of knowledge production. There are some important questions that we need to ask and to seek answers to. Do educators consider the importance of presenting relevant information to explain scientific phenomena in classrooms? Teachers who depend heavily on textbooks to teach science, for instance, might encounter issues related to the epistemic aspect of knowledge production in science. As Kuhn (1970) pointed out, textbooks are “persuasive” (p. 1), and what is described as science in textbooks does not fit the way science is done. One may also ask whether we teach students both scientific knowledge and the way that knowledge is produced. Teaching scientific knowledge before explaining how it is produced puts the cart before the horse. There is a need, then, for educators and researchers to be conscious of the backfire effect and the nature of scientific knowledge and to formulate a comprehensive approach to science denial. Moreover, educators and researchers should pay attention to students’ background assumptions in their specific contexts: the strategies for dealing with students’ assumptions and beliefs about electrons should differ from those for their beliefs about hot-button issues such as vaccination and global warming (Hodgin and Kahne 2018). It is important to consider different pedagogical approaches based on whether students’ misbeliefs are caused by the absence of knowledge, by pseudo-theory promotion, or by antipathy towards scientific facts. Regarding the challenges of post-truth and science denial, it would be wise to develop well-focused and empirically grounded strategies to combat different types of unwarranted beliefs in order to produce satisfactory instructional outcomes (Fasce and Picó 2019).
Only a handful of studies, mostly in political science, have analysed the effects of attempts to correct misbeliefs and background assumptions, and they have produced contradictory findings. These studies also lack evidence on effective strategies for pedagogical implementation. Little is known about how science educators and researchers approach the backfire effect with polarizing issues and science denial within the field of science education. Using epistemic understanding of knowledge production in science with a focus on avoiding the backfire effect may increase the potential for science education research to produce fruitful strategies and democratic environments which promote divergent perspectives to deepen students’ understanding of how science works. There is a need for science education research to consider the consequences of the backfire effect and to develop a program of research or a supplemental curriculum that helps students use critical and reflective thinking skills within a multidisciplinary context (e.g., natural sciences, political science, media and communication studies).
Declarations
Conflict of Interest
The author declares no conflict of interest.
References
- Banilower, E. R. (2019). Understanding the big picture for science teacher education: The 2018 NSSME+. Journal of Science Teacher Education, 30(3), 201–208
- Barba, R. H. (1995). Science in the multicultural classroom: A guide to teaching and learning. Needham Heights, MA: Allyn and Bacon
- Bardon, A. (2020). The truth about denial: Bias and self-deception in science, politics, and religion. New York, NY: Oxford University Press
- Behrens, C. A. (1989). The scientific basis for Shipibo soil classification and land use: Changes in soil-plant associations with cash cropping. American Anthropologist, 91, 83–100
- Boyle, R. (2017). States are trying to bring science denial to the classroom
- Chater, N. (1999). The search for simplicity: A fundamental cognitive principle? The Quarterly Journal of Experimental Psychology Section A, 52(2), 273–302
- Chen, Y. C., Benus, M. J., & Hernandez, J. (2019). Managing uncertainty in scientific argumentation. Science Education, 103, 1235–1276
- Chinn, C. A., & Malhotra, B. A. (2002). Children’s responses to anomalous scientific data: How is conceptual change impeded? Journal of Educational Psychology, 94(2), 327–343
- Cleland, C. E. (2001). Historical science, experimental science, and the scientific method. Geology, 29(11), 987–990
- Cobern, W. W., & Loving, C. C. (2001). Defining “science” in a multicultural world: Implications for science education. Science Education, 85(1), 50–67
- Cook, J., & Lewandowsky, S. (2011). The debunking handbook. St. Lucia, Australia: University of Queensland
- Crozier, W. E., & Strange, D. (2019). Correcting the misinformation effect. Applied Cognitive Psychology, 33(4), 585–595
- Department of Education. (2014). Science programmes of study: key stage 4. National Curriculum in England. https://www.gov.uk/government/uploads/system/uploads/attachment_data/file/381380/Science_KS4_PoS_7_November_2014.pdf
- Deppe, K. D., Gonzalez, F. J., Neiman, J. L., Jacobs, C., Pahlke, J., Smith, K. B., & Hibbing, J. R. (2015). Reflective liberals and intuitive conservatives: a look at the cognitive reflection test and ideology. Judgment & Decision Making, 10(4), 314–331
- diSessa, A. A. (1993). Toward an epistemology of physics. Cognition and Instruction, 10(2-3), 105–225
- Druckman, J. N., & McGrath, M. C. (2019). The evidence for motivated reasoning in climate change preference formation. Nature Climate Change, 9(2), 111–119
- Duschl, R. (2008). Science education in three-part harmony: Balancing conceptual, epistemic, and social learning goals. Review of Research in Education, 32, 268–291
- Duschl, R. A. (2020). Practical reasoning and decision making in science: Struggles for truth. Educational Psychologist, 55(3), 187–192
- Duschl RA, Osborne J. Supporting and promoting argumentation discourse in science education. Studies in Science Education. 2002;38(1):39–72. doi: 10.1080/03057260208560187. [DOI] [Google Scholar]
- Ecker UK, Hogan JL, Lewandowsky S. Reminders and repetition of misinformation: helping or hindering its retraction? Journal of Applied Research in Memory and Cognition. 2017;6(2):185–192. doi: 10.1037/h0101809. [DOI] [Google Scholar]
- Ecker, U. K., Lewandowsky, S., Jayawardana, K., & Mladenovic, A. (2019). Refutations of equivocal claims: No evidence for an ironic effect of counterargument number. Journal of Applied Research in Memory and Cognition, 8(1), 98–107
- Fasce A, Picó A. Science as a vaccine. Science & Education. 2019;28(1-2):109–125. doi: 10.1007/s11191-018-00022-0. [DOI] [Google Scholar]
- Feinstein NW, Waddington DI. Individual truth judgments or purposeful, collective sensemaking? Rethinking science education’s response to the post-truth era. Educational Psychologist. 2020;55(3):155–166. doi: 10.1080/00461520.2020.1780130. [DOI] [Google Scholar]
- Fiedler D, Sbeglia GC, Nehm RH, Harms U. How strongly does statistical reasoning influence knowledge and acceptance of evolution? Journal of Research in Science Teaching. 2019;56(9):1183–1206. doi: 10.1002/tea.21547. [DOI] [Google Scholar]
- Foucault, M. (1970). The order of things: An archaeology of the human sciences. (A. M. Sheridan Smith, Trans.). New York, NY: Vintage Books
- Gray RON. The distinction between experimental and historical sciences as a framework for improving classroom inquiry. Science Education. 2014;98(2):327–341. doi: 10.1002/sce.21098. [DOI] [Google Scholar]
- Hand, B., Lawrence, C., & Yore, L. D. (1999). A writing in science framework designed to enhance science literacy. International Journal of Science Education, 21(10), 1021–1035
- Haglin K. The limitations of the backfire effect. Research & Politics. 2017;4(3):1–5. doi: 10.1177/2053168017716547. [DOI] [Google Scholar]
- Hansson, S. O. (2017a). Science and pseudo-science. In E. N. Zalta (Ed.). The Stanford encyclopedia of philosophy (Summer 2017 ed.). Retrieved from https://plato.stanford.edu/entries/pseudo-science/#ScD
- Hansson SO. Science denial as a form of pseudoscience. Studies in History and Philosophy of Science. 2017;63:39–47. doi: 10.1016/j.shpsa.2017.05.002. [DOI] [PubMed] [Google Scholar]
- Henley, J. (2020). How Finland starts its fight against fake news in primary schools. The Guardian.
- Hodgin, E., & Kahne, J. (2018). Misinformation in the information age: What teachers can do to support students. Social Education, 82(4), 208–212.
- Jaipal, K. (2009). Meaning making through multiple modalities in a biology classroom: A multimodal semiotics discourse analysis. Science Education, 94(1), 48–72.
- Jost, J. T., Glaser, J., Kruglanski, A. W., & Sulloway, F. J. (2003). Political conservatism as motivated social cognition. Psychological Bulletin, 129(3), 339–375. https://doi.org/10.1037/0033-2909.129.3.339
- Kahan, D. M., Jenkins-Smith, H., & Braman, D. (2011). Cultural cognition of scientific consensus. Journal of Risk Research, 14(2), 147–174. https://doi.org/10.1080/13669877.2010.511246
- Kannan, R. (2019). Sidestepping politics to teach climate. Science, 366(6468), 1042. https://doi.org/10.1126/science.366.6468.1042
- Kelly, G. J. (2014). Inquiry teaching and learning: Philosophical considerations. In International handbook of research in history, philosophy and science teaching (pp. 1363–1380). Dordrecht, The Netherlands: Springer.
- Kuhn, D. (2010). Teaching and learning science as argument. Science Education, 94(5), 810–824. https://doi.org/10.1002/sce.20395
- Kuhn, T. S. (1970). The structure of scientific revolutions. Chicago, IL: University of Chicago Press.
- Lakatos, I. (1998). Science and pseudoscience. In M. Curd & J. A. Cover (Eds.), Philosophy of science: The central issues (pp. 20–26). New York, NY: W. W. Norton & Company.
- Laudan, L. (1996). Beyond positivism and relativism. Boulder, CO: Westview Press.
- Lawson, A. E. (1999). A scientific approach to teaching about evolution and special creation. American Biology Teacher, 61(4), 266–274. https://doi.org/10.2307/4450669
- Lenormand, T., Roze, D., & Rousset, F. (2009). Stochasticity in evolution. Trends in Ecology & Evolution, 24, 157–165. https://doi.org/10.1016/j.tree.2008.09.014
- Lewandowsky, S., Ecker, U. K. H., Seifert, C., Schwarz, N., & Cook, J. (2012). Misinformation and its correction: Continued influence and successful debiasing. Psychological Science in the Public Interest, 13, 106–131. https://doi.org/10.1177/1529100612451018
- Limón, M. (2001). On the cognitive conflict as an instructional strategy for conceptual change: A critical appraisal. Learning and Instruction, 11(4), 357–380.
- Limón, M., & Carretero, M. (1997). Conceptual change and anomalous data: A case study in the domain of natural sciences. European Journal of Psychology of Education, 12(2), 213–230.
- Liu, D. W. C. (2012). Science denial and the science classroom. CBE-Life Sciences Education, 11, 129–134. https://doi.org/10.1187/cbe.12-03-0029
- Lombrozo, T. (2007). Simplicity and probability in causal explanation. Cognitive Psychology, 55(3), 232–257. https://doi.org/10.1016/j.cogpsych.2006.09.006
- Loving, C. C. (1991). The scientific theory profile: A philosophy of science model for science teachers. Journal of Research in Science Teaching, 28, 823–838.
- Mazur, A. (2004). Believers and disbelievers in evolution. Politics and the Life Sciences, 23(2), 55–61. https://doi.org/10.2990/1471-5457(2004)23[55:BADIE]2.0.CO;2
- Mizrahi, M. (2015). Historical inductions: New cherries, same old cherry-picking. International Studies in the Philosophy of Science, 29(2), 129–148.
- Mooney, C. (2011). The science of why we don’t believe science. Mother Jones. https://www.motherjones.com/politics/2011/04/denial-science-chris-mooney/
- Nyhan, B., & Reifler, J. (2010). When corrections fail: The persistence of political misperceptions. Political Behavior, 32(2), 303–330.
- Nyhan, B., & Reifler, J. (2015). Does correcting myths about the flu vaccine work? An experimental evaluation of the effects of corrective information. Vaccine, 33(3), 459–464. https://doi.org/10.1016/j.vaccine.2014.11.017
- Nyhan, B., Reifler, J., Richey, S., & Freed, G. L. (2014). Effective messages in vaccine promotion: A randomized trial. Pediatrics, 133(4), 835–842.
- Ogawa, M. (1995). Science education in a multiscience perspective. Science Education, 79, 583–593. https://doi.org/10.1002/sce.3730790507
- Okasha, S. (2002). Philosophy of science: A very short introduction. New York, NY: Oxford University Press.
- Osborne, J. F., & Patterson, A. (2011). Scientific argument and explanation: A necessary distinction? Science Education, 95(4), 627–638.
- Paul, R. W. (1995). Critical thinking: How to prepare students for a rapidly changing world. Santa Rosa, CA: Foundation for Critical Thinking.
- Pennycook, G., Cheyne, J. A., Seli, P., Koehler, D. J., & Fugelsang, J. A. (2012). Analytic cognitive style predicts religious and paranormal belief. Cognition, 123(3), 335–346. https://doi.org/10.1016/j.cognition.2012.03.003
- Pinker, S. (2018). Enlightenment now: The case for reason, science, humanism, and progress. New York, NY: Viking.
- Rosenau, J. (2012). Science denial: A guide for scientists. Trends in Microbiology, 20(12), 567–569.
- Sandoval, W. A. (2005). Understanding students’ practical epistemologies and their influence on learning through inquiry. Science Education, 89(4), 634–656. https://doi.org/10.1002/sce.20065
- Short, S. D., Lastrapes, K. A., Natale, N. E., & McBrady, E. E. (2019). Rational engagement buffers the effect of conservatism on one’s reported relevance of the theory of evolution. Journal of Research in Science Teaching, 56, 1384–1405. https://doi.org/10.1002/tea.21559
- Sides, J., & Citrin, J. (2007). How large the huddled masses? The causes and consequences of public misperceptions about immigrant populations. Paper presented at the annual meeting of the Midwest Political Science Association, Chicago, IL.
- Silberzahn, R., Uhlmann, E. L., Martin, D. P., Anselmi, P., Aust, F., Awtrey, E. C., et al. (2018). Many analysts, one dataset: Making transparent how variations in analytical choices affect results. Advances in Methods and Practices in Psychological Science, 1(3), 337–356.
- Sinatra, G. M., Kienhues, D., & Hofer, B. K. (2014). Addressing challenges to public understanding of science: Epistemic cognition, motivated reasoning, and conceptual change. Educational Psychologist, 49(2), 123–138.
- Snively, G., & Corsiglia, J. (2001). Discovering indigenous science: Implications for science education. Science Education, 85(1), 6–34.
- Southerland, S. A. (2000). Epistemic universalism and the shortcomings of curricular multicultural science education. Science & Education, 9(3), 289–307. https://doi.org/10.1023/A:1008676109903
- Stanley, W. B., & Brickhouse, N. W. (1994). Multiculturalism, universalism, and science education. Science Education, 78(4), 387–398. https://doi.org/10.1002/sce.3730780405
- Swire, B., Ecker, U. K., & Lewandowsky, S. (2017). The role of familiarity in correcting inaccurate information. Journal of Experimental Psychology: Learning, Memory, and Cognition, 43, 1948–1961. https://doi.org/10.1037/xlm0000422
- Taber, C. S., & Lodge, M. (2006). Motivated skepticism in the evaluation of political beliefs. American Journal of Political Science, 50, 755–769. https://doi.org/10.1111/j.1540-5907.2006.00214.x
- Tippett, C. D. (2010). Refutation text in science education: A review of two decades of research. International Journal of Science and Mathematics Education, 8, 951–970.
- Trevors, G. J., Muis, K. R., Pekrun, R., Sinatra, G. M., & Winne, P. H. (2016). Identity and epistemic emotions during knowledge revision: A potential account for the backfire effect. Discourse Processes, 53(5–6), 339–370.
- Vosniadou, S. (2002). On the nature of naïve physics. In M. Limón & L. Mason (Eds.), Reconsidering the processes of conceptual change (pp. 61–76). Dordrecht: Kluwer Academic Publishers.
- Wang, K. C. (2012). Animals in Tao’s eco-cultural meanings (蘭嶼動物生態文化). Taiwan: National Chiao Tung University Press.
- Wood, T., & Porter, E. (2017). The elusive backfire effect: Mass attitudes’ steadfast factual adherence. Political Behavior, forthcoming. https://doi.org/10.2139/ssrn.2819073