Frontiers in Psychology
2021 Oct 5;12:739070. doi: 10.3389/fpsyg.2021.739070

Counterintuitive Pseudoscience Propagates by Exploiting the Mind’s Communication Evaluation Mechanisms

Spencer Mermelstein 1,*, Tamsin C German 1
PMCID: PMC8523830  PMID: 34675845

Abstract

Epidemiological models of culture posit that the prevalence of a belief depends in part on the fit between that belief and intuitions generated by the mind’s reliably developing architecture. Application of such models to pseudoscience suggests that one route via which these beliefs gain widespread appeal stems from their compatibility with these intuitions. For example, anti-vaccination beliefs are readily adopted because they cohere with intuitions about the threat of contagion. However, other varieties of popular pseudoscience such as astrology and parapsychology contain content that violates intuitions held about objects and people. Here, we propose a pathway by which “counterintuitive pseudoscience” may spread and receive endorsement. Drawing on recent empirical evidence, we suggest that counterintuitive pseudoscience triggers the mind’s communication evaluation mechanisms. These mechanisms are hypothesized to quarantine epistemically-suspect information including counterintuitive pseudoscientific concepts. As a consequence, these beliefs may not immediately update conflicting intuitions and may be largely restricted from influencing behavior. Nonetheless, counterintuitive pseudoscientific concepts, when in combination with intuitively appealing content, may differentially draw attention and memory. People may also be motivated to seek further information about these concepts, including by asking others, in an attempt to reconcile them with prior beliefs. This in turn promotes the re-transmission of these ideas. We discuss how, during this information-search, support for counterintuitive pseudoscience may come from deference to apparently authoritative sources, reasoned arguments, and the functional outcomes of these beliefs. Ultimately, these factors promote the cultural success of counterintuitive pseudoscience but explicit endorsement of these concepts may not entail tacit commitment.

Keywords: epistemic vigilance, pseudoscience, counterintuitive concepts, memory, social transmission, astrology, parapsychology, epidemiology of representations

Introduction

Pseudoscience—claims that take on the guise of scientific knowledge but lack evidentiary support or theoretical plausibility—is pervasive. At least 40% of Americans, for example, believe in extra-sensory perception and 25% believe that the position of the stars affects life on Earth (Moore, 2005). Pseudoscience can be harmful. The proliferation of anti-vaccination sentiments undermines public health campaigns (Larson et al., 2011) and misinformation about global climate change reduces support for mitigation efforts (van der Linden et al., 2017). Understanding the psychological appeal and social transmission of pseudoscience is therefore critical for informing attempts to reduce the impact and spread of these beliefs.

Blancke and De Smedt (2013), Boudry et al. (2015), and Blancke et al. (2017, 2019) recently advanced a model accounting for the ubiquity of pseudoscience. Drawing on Sperber’s (1994, 1996) epidemiological theory of cultural representations, these authors have suggested that many forms of pseudoscience are widespread because they cohere with intuitive ways of thinking. For example, those opposed to vaccination often point to pseudoscientific claims that vaccines might cause autism spectrum disorders or other harm (Poland and Spier, 2010). Miton and Mercier (2015) suggest that vaccines, because they entail injecting (inert) pathogens into the body, tap into disgust intuitions that evolved to protect against exposure to contaminants. Vaccines may then be intuitively viewed as a source of contagion, making anti-vaccination claims centered on harm inherently believable, appealing, and transmissible from mind to mind. Other pseudoscientific beliefs may gain traction by exploiting a variety of cognitive predispositions: Creationism/Intelligent Design is grounded in intuitive teleological reasoning (Kelemen, 2004; Blancke et al., 2017); anti-GMO attitudes are based in essentialist intuitions (Blancke et al., 2015); flat-earth beliefs are rooted in naive mental models of a geocentric solar system (Vosniadou, 1994).

Along with pseudoscientific beliefs that might exploit a fit with intuitions, however, are a range of such beliefs that manage to spread despite content that is decidedly counterintuitive. Specifically, these “counterintuitive pseudoscientific” beliefs violate evolved and reliably developing core knowledge intuitions. Documented as early as infancy (Spelke and Kinzler, 2007), core knowledge intuitions structure our basic expectations of physical objects and their mechanics (e.g., Spelke, 1990) and of intentional agents and their mental states (Baillargeon et al., 2016), among other ontological domains. Thus, counterintuitive concepts are not merely unusual but rather are defined by their incompatibility with the foundational distinctions the mind makes in parsing the world. People may nonetheless acquire counterintuitive concepts; indeed, they are widespread throughout religious, scientific, and pseudoscientific belief systems (Boyer, 2001; Baumard and Boyer, 2013; Shtulman, 2017).

Astrology is one example of counterintuitive pseudoscience. Cultures as diverse as the Babylonians, Han Dynasty China, and the Maya each developed sophisticated belief systems and mathematics to divine the purported influence of the planets and stars on people’s personalities and events on Earth (Boxer, 2020). Moreover, astrology remains widespread today despite its contemporary status as a pseudoscience, and despite the fact that one of its central tenets, that celestial objects can influence people or events on Earth, violates core “folk physics” intuitions that objects cannot act on each other at a distance (Leslie and Keeble, 1987; Spelke, 1990).

Parapsychology, or psi, is a second example of counterintuitive pseudoscience. The belief that psychics, mediums, and clairvoyants have a preternatural ability to read minds, manipulate or view distant objects, or tell the future has ancient roots in cultures around the world (Singh, 2018) and has been the subject of research for over 150 years despite its fundamental disconnect from the sciences (Reber and Alcock, 2020). Again, this is despite the fact that these beliefs violate core “folk psychological” intuitions that a person’s beliefs are constrained by their perceptual capacities: that people are ignorant of events they haven’t seen or heard (Onishi and Baillargeon, 2005; Baillargeon et al., 2016).

The ubiquity of pseudoscience that contains such drastically counterintuitive elements is potentially surprising from a cultural epidemiology perspective. One reason follows from the suggestion that the prevalence of a belief in a population may depend in part on its fit with intuitive ways of thinking (Sperber, 1994, 1996). On this account, information that is consistent with intuitions is generally more likely to persist across repeated retellings and become more widespread than counterintuitive information (Kalish et al., 2007; Griffiths et al., 2008; Morin, 2013; Miton et al., 2015).

A second potential obstacle to the spread of counterintuitive content stems from the suggestion that the mind contains a host of mechanisms designed to evaluate and filter communicated information (Sperber et al., 2010; see also Mercier, 2017). One function of these “epistemic vigilance” mechanisms is to assess the plausibility of a message by checking its consistency with prior beliefs. The rudiments of these consistency-checking mechanisms have been documented as early as infancy (Koenig and Echols, 2003), and by age 4, children have been found to reject the claims of others that conflict with their firsthand experiences (Clément et al., 2004) or background knowledge about objects and animals (Lane and Harris, 2015). Counterintuitive information, then, appears to be at a social transmission and believability disadvantage relative to information consistent with cognitive predispositions (Mercier et al., 2019). What then accounts for the cultural success of pseudosciences like astrology and parapsychology?

In the current article, we propose a pathway by which counterintuitive pseudoscience may spread and receive broad endorsement. First we suggest that these beliefs engage the mind’s communication evaluation mechanisms, which largely restrict their influence on behavior. Nonetheless, counterintuitive pseudoscience, as it cannot be fully reconciled with past beliefs, recruits our attention and memory, and triggers a search for more information that may result in the preferential re-transmission of these ideas. During information-search, endorsement of counterintuitive pseudoscience may be bolstered by support from apparently authoritative sources, reasoned arguments, or the functional outcomes of holding such beliefs. Counterintuitive pseudoscience thus achieves cultural prominence by exploiting the mind’s communication evaluation mechanisms but explicit belief in such content may not entail tacit commitment.

The Representational Format of Counterintuitive Pseudoscience

While communication that is consistent with prior beliefs may be readily accepted, counterintuitive pseudoscience is a class of content that should be flagged by epistemic vigilance mechanisms as requiring further monitoring. By hypothesis, inconsistencies between counterintuitive content and pre-existing beliefs trigger epistemic vigilance mechanisms to quarantine that content from those beliefs via a “metarepresentational” formatting (Sperber, 1997, 2000; see also Mercier, 2017).

A metarepresentation is a mental data structure that links a proposition to a set of tags that limit the scope of applicability of the information (Cosmides and Tooby, 2000). These tags may take the form of a link to a particular source (Mermelstein et al., 2020), a propositional attitude like certainty or doubt (Leslie, 1987), or a supporting argument (Mercier and Sperber, 2011). For example, the proposition “the stars influence events on earth” may be embedded in the metarepresentation “my friends believe that [the stars influence events on earth].” Encapsulated within contextualizing tags, counterintuitive concepts are prevented from spontaneously updating or interacting with existing beliefs or influencing behavior. Nonetheless, one may still come to explicitly profess belief in counterintuitive concepts, deliberately derive inferences from them, and articulate them to others—but only upon reflection as they cannot be reconciled with conflicting core intuitions (Sperber, 1997).

Epistemic vigilance mechanisms tend to quarantine, rather than outright reject, counterintuitive pseudoscientific beliefs like astrology and parapsychology for two reasons. First, such messages may often be communicated by friends, family, or other influential people. Epistemic vigilance mechanisms are therefore likely to retain these messages (albeit as metarepresentations), given underlying trust in these sources (Sperber, 1997; Sperber et al., 2010; Harris et al., 2018) and social learning biases that motivate people to adopt the beliefs of the successful or prestigious (Henrich and Gil-White, 2001). Relatedly, should a particular counterintuitive concept be widespread in a community, people might at least outwardly endorse such beliefs given that the social cost of rejecting a belief held by their peers may be greater than epistemic costs of harboring them (Hong and Henrich, 2021). Second, epistemic vigilance mechanisms might retain these concepts to aid in the further evaluation of their source and content over time (Mermelstein et al., 2020). Should we later come across information that supports or challenges a given claim, we can then update our judgment of the veracity of the message and the trustworthiness and/or competence of its speaker. Until corroborating evidence is found, we would expect counterintuitive pseudoscientific concepts to remain quarantined as reflectively-held metarepresentations, with consequences for their stability and capacity to influence behavior.

As reflectively-held beliefs, the counterintuitive concepts found in some varieties of pseudoscience may be variable in their specific content (Baumard and Boyer, 2013). Whereas intuition-consistent pseudoscience might coalesce around a small set of cognitively appealing claims (e.g., “vaccine ingredients cause harm”), counterintuitive beliefs such as “psychics know the future” may be subject to differing and possibly idiosyncratic interpretations. Compatible with this suggestion, proponents of psi have put forward a wide range of different accounts for the underlying mechanisms through which these abilities work (Reber and Alcock, 2020). Some accounts, for instance, reference paranormal forces (e.g., a connection to a spirit world), while others may (erroneously) implicate scientific explanations (e.g., quantum mechanics). Without grounding in intuition, the exact content of counterintuitive pseudoscience may be ad hoc; moreover, these beliefs may be inconsistent or contradictory even within the same mind, as has been documented among adherents of conspiracy theories (Wood et al., 2012) and religious beliefs (Slone, 2007).

Another proposed signature of reflectively-held beliefs is that they may coexist alongside the intuitions with which they conflict rather than update or replace them (Sperber, 1997). Indeed, representational co-existence has been documented for counterintuitive concepts found in science (Kelemen and Rosset, 2009; Shtulman and Valcarcel, 2012; Shtulman and Harrington, 2016) and religion (Barrett and Keil, 1996; Barrett, 1998; Barlev et al., 2017, 2018, 2019). Research on the God concept, for example, finds that religious believers accurately describe God’s counterintuitive properties (e.g., omnipresence, omniscience) when explicitly asked, but nonetheless reason as though God possessed human-like psychology and physicality when indexed by implicit measures (Barrett and Keil, 1996; Barrett, 1998). Co-existence also raises the possibility of interference between mutually incompatible beliefs. Barlev et al. (2017, 2018, 2019) asked religious believers to evaluate a series of statements that were consistent or inconsistent in truth-value between intuitions about persons and later-acquired counterintuitive beliefs about God. Participants were slower and less accurate at evaluating inconsistent vs. consistent statements, suggesting that intuitions not only co-exist alongside incompatible beliefs, but also conflict with them. The ongoing tension between core intuitions and counterintuitive concepts suggests that these beliefs, including those found in pseudoscience, may not regularly inform behavior.

An implication of this idea is that counterintuitive pseudoscientific concepts might only be deployed in narrow contexts, giving rise to discrepancies between stated beliefs and everyday behavior (Sperber, 1985; Barrett, 1999; Slone, 2007). While one might state their belief that a psychic can tell the future or even follow their horoscope’s recommendations when making decisions, they might do so only upon reflection or when prompted. Commitment to counterintuitive pseudoscientific beliefs might generally be at a reflective and not an intuitive level. Indeed, such beliefs may be largely decoupled from behavior as a function of epistemic vigilance mechanisms. A typical believer in psi, for instance, would likely make quite different decisions in their life should they implicitly believe that someone could be watching them at any time; the position of the stars and planets may not be one’s initial explanation for another’s behavior but a post hoc rationalization. In contrast, intuition-consistent pseudoscience may have a more direct influence on behavior. Unencumbered by a metarepresentational formatting, anti-vaccination beliefs, for instance, might fluidly translate to vaccine refusal (Miton and Mercier, 2015).

Memory for Counterintuitive Pseudoscience

The memorability of a concept is one predictor of its cultural success: memorable content, all things equal, is more likely to be reproducible and retain fidelity across retellings. Past research suggests that a subset of counterintuitive pseudoscientific beliefs may be mnemonically optimal. Boyer (1994, 2001, 2003) has argued that concepts which are largely consistent with the expectations afforded to ontological categories such as “person” or “object” but for a minimal set of violations of those expectations are particularly attention-grabbing, memorable, and inferentially rich. The concept of a ghost fits this “minimally counterintuitive” template: despite being deceased and capable of passing through solid objects, ghosts are otherwise conceptualized as persons with beliefs and desires. Such striking violations of expectations draw attention as they cannot be fully incorporated into existing beliefs, yet we may still easily imagine and make inferences about ghosts using our knowledge about people. Together, these features make for a differentially memorable combination compared to fully ordinary concepts. Counterintuitive concepts with many violations of expectation (e.g., “a ghost that knows nothing and could never interact with the world”), however, lose their memorability advantage as they cease to hook into existing knowledge and fail to yield many meaningful inferences.

Boyer’s (2001) account has received empirical support from laboratory experiments with adults from across cultures (e.g., Boyer and Ramble, 2001; Nyhof and Barrett, 2001) and with children (Banerjee et al., 2013). Participants in these studies were asked to recall or retell narratives to others, with results demonstrating a memory advantage for minimally counterintuitive (e.g., “a chair that can float in midair”) compared to ordinary (e.g., “a table that can hold a lot of weight”) or very counterintuitive concepts (e.g., “a rock that could give birth to a singing teapot”). The memory advantage for minimally counterintuitive concepts has also been found to extend to the contextual details associated with them, such as their speaker (Mermelstein et al., 2020). Furthermore, analyses of cultural materials such as folktales from around the world reveal that narratives containing minimally counterintuitive concepts tend to be more common than other concept types (Norenzayan et al., 2006; Burdett et al., 2009). Thus, the mind’s attention and memory mechanisms constrain the range of counterintuitive concepts that are likely to be remembered and suitable for cultural success.

We find it likely that popular counterintuitive pseudoscientific beliefs are composed of intuitive content alongside compelling, but limited violations of expectation. The wide range of psi abilities, for example, seem to be relatively narrow modifications of the capacities typically assumed of persons: supernatural mind-reading may be an overextension of everyday mentalizing, telekinesis an overextension of the expectation that mental states can have effects on the world by directing behavior. Psychics and the like, however, are otherwise conceptualized as ordinary people. The famous psychic Uri Geller could ostensibly bend spoons with his mind, but he nonetheless possessed a physical body that needed to eat, sleep, and breathe. Astrological belief systems may similarly package together counterintuitive and intuitive elements. While the claimed linkage between people and the position of the stars may violate intuitions of cause and effect, astrology does seem to feed off the human tendencies to perceive patterns in noise (Whitson and Galinsky, 2008), intuit purpose behind complex natural phenomena (Kelemen et al., 2012), and stereotype others (Lu et al., 2020).

Future empirical work may investigate whether counterintuitive pseudoscientific content strikes a mnemonic optimum for cultural transmission. Laboratory studies employing serial re-transmission methods could demonstrate that minimally counterintuitive pseudoscientific concepts tend to survive repeated retellings compared to other content. Analyses of cultural materials could map out the degree to which astrologers or psychics draw upon counterintuitive vs. intuitively-appealing content in making their claims.

Social Re-Transmission of Counterintuitive Pseudoscience

The prevalence of a belief in a population, however, depends not only on its memorability but also on individuals being willing to re-transmit it to others. Recent research suggests that people may share counterintuitive concepts with others in an attempt to gather more information about them. Indeed, as early as infancy, violations of expectation have been shown to trigger not only surprise but also information-seeking behavior: Stahl and Feigenson (2015) found that 11-month-old infants who saw an object involved in a counterintuitive event (e.g., a toy appeared to float in midair) preferentially explored that object and manipulated it in an attempt to learn more about its unusual properties in comparison to an ordinary object (e.g., one that fell when unsupported).

The early developing tendency to seek new information in response to a violation of core knowledge may extend across the lifespan, such that one may be motivated to learn more about counterintuitive concepts, including those found in pseudoscience, in an attempt to reconcile them with prior beliefs. One mode of information-search is to ask others for their opinion, thereby re-transmitting the concept. Compatible with this account, Mermelstein et al. (2019) found that novel counterintuitive statements (e.g., “a cactus that liked to sing”) were judged by adults to be less believable than ordinary statements (e.g., “a cat that liked to play with toys”), but also as more interesting, more desirable to learn about, and more likely to be passed along to others, and these variables were all strongly correlated. Thus, as with other epistemically suspect information (e.g., “fake news,” see Pennycook and Rand, 2021), one’s (lack of) belief in counterintuitive content seems to be orthogonal to a willingness to share it with others. People may repeat counterintuitive pseudoscience to others, regardless of their commitment to these beliefs, to scope out what others think about them.1

Relatedly, Mercier et al. (2018) have put forward a complementary account suggesting that people may choose to re-transmit pseudoscientific beliefs so as to appear competent to others. Participants in this study rated a series of pseudoscientific (e.g., “people can learn information, like new languages, while asleep”) and factual (e.g., “handwriting doesn’t reveal personality traits”) statements on their believability, on participants’ willingness to re-transmit them, and on how knowledgeable someone who said each statement would seem. A key analysis found that the extent to which participants believed that holding a given claim (pseudoscientific or factual) made them appear knowledgeable was an important predictor of their willingness to re-transmit it.

Mercier et al. (2018), however, did not differentiate intuition-consistent from counterintuitive pseudoscience. It may be the case that the motive for and method of re-transmission differs depending on the consistency of a claim with core intuitions. Thus, one may be willing to share, and desire to be associated with, intuition-consistent pseudoscience given that others might find that information intuitively compelling (Altay et al., 2020a). On the other hand, when re-sharing counterintuitive pseudoscientific beliefs one might tend to attribute them to a source other than the self while gauging others’ reactions to that content (Altay et al., 2020b). For example, disclaimers such as “I read somewhere that…” or “many other people have said…” allow one to discuss counterintuitive ideas with others without asserting ownership of them—all while promoting the circulation of counterintuitive pseudoscience from mind to mind.

Discussion

In this article we have distinguished intuition-consistent from counterintuitive pseudoscience, described how the mind’s communication evaluation mechanisms might shape the representational characteristics of these counterintuitive concepts, and suggested how such beliefs may become memorable and socially transmissible. We speculate that, as people attempt to reconcile counterintuitive content with their prior beliefs, they may come across different lines of support that lead them to (at least explicitly) accept and endorse counterintuitive pseudoscience.

Mercier and Sperber (2011) have identified two ways by which communication that violates prior beliefs may overcome epistemic vigilance. First, one may suspend their disbelief in such information should they find its source(s) sufficiently trustworthy or reliable. Indeed, scientists have often come to counterintuitive conclusions (e.g., the sun is at the center of the solar system) and laypersons typically trust such claims based on the past reliability and esteem of science in general (Shtulman, 2013). Blancke et al. (2019) note that pseudoscience may become believable as it adopts the appearance of science and consequently its privileged epistemic status. Thus, when researchers publish apparent evidence of psi in peer reviewed journals, the public may be inclined to believe these claims have a degree of credibility given the source.

Second, acceptance of counterintuitive pseudoscience may come about by encountering supporting argumentation or reasons that justify holding these beliefs (Mercier and Sperber, 2011). Effective arguments in support of counterintuitive pseudoscience might emphasize links between that content and an audience’s cognitive predispositions or prior beliefs (Blancke et al., 2019), including existing commitments to other supernatural or paranormal beliefs (Lindeman and Aarnio, 2007). An astrologer’s predictions about the future might seem sensible in reference to intuitive pattern-seeking and teleological reasoning tendencies; a psychic might appeal to the widespread and cherished belief in spirits that survive the death of the body in explaining how they communicate with the deceased.

On that note, some strands of counterintuitive pseudoscience may be especially appealing, despite their inconsistency with core intuitions, as they function to alleviate stress or anxiety by providing a compensatory sense of control. Past research finds that many belief systems, from religious beliefs (Inzlicht and Tullett, 2010) and superstitious or magical thinking (Keinan, 2002) to belief in the efficacy of ritual behavior (Lang et al., 2015), may serve as a buffer against stressful or unpredictable circumstances by offering explanations and actions to take to reduce uncertainty or regain a sense of control (Kay et al., 2009). Interestingly, among pseudosciences, astrology and parapsychology have elements that might serve as anxiolytics. Indeed, experimental work has shown that participants induced to feel that outcomes were out of their control increasingly endorsed the existence of precognition (Greenaway et al., 2013) and followed a psychic’s recommendations (Case et al., 2004). Thus, certain counterintuitive pseudoscientific concepts may be particularly likely to gain acceptance, not because their content is intrinsically believable, but because of their functional role in reducing stress or anxiety.

Conclusion

In conclusion, we have argued that counterintuitive pseudoscience has features that exploit the mind’s communication evaluation mechanisms to become attention-grabbing, memorable, and likely to be passed on to others. People may even come to explicitly endorse these beliefs through deference to an apparently authoritative source or from a reasoned argument. In this way, counterintuitive pseudoscience achieves cultural prominence. Nonetheless, we hypothesize that these beliefs are held reflectively as they cannot be reconciled with core intuitions (Sperber, 1997). As with other counterintuitive concepts in science (Shtulman and Valcarcel, 2012; Shtulman and Harrington, 2016) and religion (Barlev et al., 2017, 2018, 2019), such pseudoscientific beliefs may coexist alongside incompatible prior beliefs and may be to some extent suspended from guiding behavior. A stated belief in these concepts thus does not necessitate an implicit commitment to them in all contexts. Pseudoscience is ubiquitous but it is not unitary. Recognizing that these beliefs may propagate through different means may be key to undermining their spread and impact.

Data Availability Statement

The original contributions presented in the study are included in the article/supplementary material; further inquiries can be directed to the corresponding author/s.

Author Contributions

SM and TCG conceptualized the manuscript. SM wrote the first draft, to which TCG provided critical edits. Both authors contributed to its final version and approved its submission.

Conflict of Interest

The authors declare that the research was conducted in the absence of any commercial or financial relationships that could be construed as a potential conflict of interest.

Publisher’s Note

All claims expressed in this article are solely those of the authors and do not necessarily represent those of their affiliated organizations, or those of the publisher, the editors and the reviewers. Any product that may be evaluated in this article, or claim that may be made by its manufacturer, is not guaranteed or endorsed by the publisher.

Footnotes

1

Interestingly, the philosopher David Hume (1748/2000) suggested that one may repeat a counterintuitive claim (e.g., of a miracle) that they do not necessarily believe in for the purpose of eliciting “surprise and wonder” in others so as to gain their attention and respect. This account is compatible with that of the current paper: a motivation for re-transmitting counterintuitive claims may be to provoke others’ reactions to that content. Doing so may reveal whether such claims tend to be endorsed by peers. Moreover, should the counterintuitive claim be favorably received by others (perhaps as it signals a shared group membership), one may be encouraged to continually re-share it to earn their esteem.

Funding

This work was supported by the UCSB Open Access Publishing Fund.

References

  1. Altay S., Claidière N., Mercier H. (2020a). It happened to a friend of a friend: inaccurate source reporting in rumour diffusion. Evol. Hum. Sci. 2 1–19. 10.1017/ehs.2020.53
  2. Altay S., Majima Y., Mercier H. (2020b). It’s my idea! Reputation management and idea appropriation. Evol. Hum. Behav. 41 235–243. 10.1016/j.evolhumbehav.2020.03.004
  3. Baillargeon R., Scott R. M., Bian L. (2016). Psychological reasoning in infancy. Annu. Rev. Psychol. 67 159–186.
  4. Banerjee K., Haque O. S., Spelke E. S. (2013). Melting lizards and crying mailboxes: children’s preferential recall of minimally counterintuitive concepts. Cogn. Sci. 37 1251–1289.
  5. Barlev M., Mermelstein S., Cohen A. S., German T. C. (2019). The embodied God: core intuitions about person physicality coexist and interfere with acquired Christian beliefs about God, the Holy Spirit, and Jesus. Cogn. Sci. 43:e12784. 10.1111/cogs.12784
  6. Barlev M., Mermelstein S., German T. C. (2017). Core intuitions about persons coexist and interfere with acquired Christian beliefs about God. Cogn. Sci. 41 425–454. 10.1111/cogs.12435
  7. Barlev M., Mermelstein S., German T. C. (2018). Representational co-existence in the God concept: core knowledge intuitions of God as a person are not revised by Christian theology despite lifelong experience. Psychonomic Bull. Rev. 25 2330–2338. 10.3758/s13423-017-1421-6
  8. Barrett J. L. (1998). Cognitive constraints on Hindu concepts of the divine. J. Sci. Study Religion 37 608–619. 10.2307/1388144
  9. Barrett J. L. (1999). Theological correctness: cognitive constraint and the study of religion. Method Theory Study Religion 11 325–339. 10.1163/157006899X00078
  10. Barrett J. L., Keil F. C. (1996). Conceptualizing a nonnatural entity: anthropomorphism in God concepts. Cogn. Psychol. 31 219–247. 10.1006/cogp.1996.0017
  11. Baumard N., Boyer P. (2013). Religious beliefs as reflective elaborations on intuitions: a modified dual-process model. Curr. Direct. Psychol. Sci. 22 295–300. 10.1177/0963721413478610
  12. Blancke S., Boudry M., Braeckman J. (2019). Reasonable irrationality: the role of reasons in the diffusion of pseudoscience. J. Cogn. Culture 19 432–449. 10.1163/15685373-12340068
  13. Blancke S., Boudry M., Pigliucci M. (2017). Why do irrational beliefs mimic science? The cultural evolution of pseudoscience. Theoria 83 78–97. 10.1111/theo.12109
  14. Blancke S., De Smedt J. (2013). “Evolved to be irrational? Evolutionary and cognitive foundations of pseudosciences,” in The Philosophy of Pseudoscience, eds Pigliucci M., Boudry M. (Chicago, IL: The University of Chicago Press).
  15. Blancke S., Van Breusegem F., De Jaeger G., Braeckman J., Van Montagu M. (2015). Fatal attraction: the intuitive appeal of GMO opposition. Trends Plant Sci. 20 414–418. 10.1016/j.tplants.2015.03.011
  16. Boudry M., Blancke S., Pigliucci M. (2015). What makes weird beliefs thrive? The epidemiology of pseudoscience. Philos. Psychol. 28 1177–1198. 10.1080/09515089.2014.971946
  17. Boxer A. (2020). A Scheme of Heaven: The History of Astrology and the Search for our Destiny in Data. New York, NY: W. W. Norton & Company.
  18. Boyer P. (1994). The Naturalness of Religious Ideas: A Cognitive Theory of Religion. Berkeley, CA: University of California Press.
  19. Boyer P. (2001). Religion Explained: The Evolutionary Origins of Religious Thought. New York, NY: Basic Books.
  20. Boyer P. (2003). Religious thought and behaviour as by-products of brain function. Trends Cogn. Sci. 7 119–124. 10.1016/S1364-6613(03)00031-7
  21. Boyer P., Ramble C. (2001). Cognitive templates for religious concepts: cross-cultural evidence for recall of counter-intuitive representations. Cogn. Sci. 25 535–564.
  22. Burdett E. R., Porter T., Barrett J. (2009). Counterintuitiveness in folktales: finding the cognitive optimum. J. Cogn. Culture 9 271–287. 10.1163/156770909X12489459066345
  23. Case T. I., Fitness J., Cairns D. R., Stevenson R. J. (2004). Coping with uncertainty: superstitious strategies and secondary control. J. Appl. Soc. Psychol. 34 848–871. 10.1111/j.1559-1816.2004.tb02574.x
  24. Clément F., Koenig M., Harris P. (2004). The ontogenesis of trust. Mind Lang. 19 360–379. 10.1111/j.0268-1064.2004.00263.x
  25. Cosmides L., Tooby J. (2000). “Consider the source: the evolution of adaptations for decoupling and metarepresentation,” in Metarepresentations: A Multidisciplinary Perspective, ed. Sperber D. (New York, NY: Oxford University Press), 53–115.
  26. Greenaway K. H., Louis W. R., Hornsey M. J. (2013). Loss of control increases belief in precognition and belief in precognition increases control. PLoS One 8:e71327. 10.1371/journal.pone.0071327
  27. Griffiths T. L., Kalish M. L., Lewandowsky S. (2008). Theoretical and empirical evidence for the impact of inductive biases on cultural evolution. Philos. Trans. R. Soc. London Series B Biol. Sci. 363 3503–3514. 10.1098/rstb.2008.0146
  28. Harris P. L., Koenig M. A., Corriveau K. H., Jaswal V. K. (2018). Cognitive foundations of learning from testimony. Annu. Rev. Psychol. 69 251–273. 10.1146/annurev-psych-122216-011710
  29. Henrich J., Gil-White F. J. (2001). The evolution of prestige: freely conferred deference as a mechanism for enhancing the benefits of cultural transmission. Evol. Hum. Behav. 22 165–196. 10.1016/S1090-5138(00)00071-4
  30. Hong Z., Henrich J. (2021). The cultural evolution of epistemic practices. Hum. Nat. Online ahead of print. 10.1007/s12110-021-09408-6
  31. Hume D. (1748/2000). An Enquiry Concerning Human Understanding: A Critical Edition. New York, NY: Oxford University Press.
  32. Inzlicht M., Tullett A. M. (2010). Reflecting on God: religious primes can reduce neurophysiological response to errors. Psychol. Sci. 21 1184–1190. 10.1177/0956797610375451
  33. Kalish M. L., Griffiths T. L., Lewandowsky S. (2007). Iterated learning: intergenerational knowledge transmission reveals inductive biases. Psychonomic Bull. Rev. 14 288–294.
  34. Kay A. C., Whitson J. A., Gaucher D., Galinsky A. D. (2009). Compensatory control: achieving order through the mind, our institutions, and the heavens. Curr. Direct. Psychol. Sci. 18 264–268. 10.1111/j.1467-8721.2009.01649.x
  35. Keinan G. (2002). The effects of stress and desire for control on superstitious behavior. Personal. Soc. Psychol. Bull. 28 102–108. 10.1177/0146167202281009
  36. Kelemen D. (2004). Are children “intuitive theists”? Reasoning about purpose and design in nature. Psychol. Sci. 15 295–301. 10.1111/j.0956-7976.2004.00672.x
  37. Kelemen D., Rosset E. (2009). The human function compunction: teleological explanation in adults. Cognition 111 138–143. 10.1016/j.cognition.2009.01.001
  38. Kelemen D., Rottman J., Seston R. (2012). Professional physical scientists display tenacious teleological tendencies: purpose-based reasoning as a cognitive default. J. Exp. Psychol. General 142 1074–1083. 10.1037/a0030399
  39. Koenig M. A., Echols C. H. (2003). Infants’ understanding of false labeling events: the referential roles of words and the speakers who use them. Cognition 87 179–208. 10.1016/S0010-0277(03)00002-7
  40. Lane J. D., Harris P. L. (2015). The roles of intuition and informants’ expertise in children’s epistemic trust. Child Dev. 86 919–926. 10.1111/cdev.12324
  41. Lang M., Krátký J., Shaver J. H., Jerotijević D., Xygalatas D. (2015). Effects of anxiety on spontaneous ritualized behavior. Curr. Biol. 25 1892–1897. 10.1016/j.cub.2015.05.049
  42. Larson H. J., Cooper L. Z., Eskola J., Katz S. L., Ratzan S. (2011). Addressing the vaccine confidence gap. Lancet 378 526–535.
  43. Leslie A. M. (1987). Pretense and representation: the origins of “theory of mind.” Psychol. Rev. 94 412–426.
  44. Leslie A. M., Keeble S. (1987). Do six-month-old infants perceive causality? Cognition 25 265–288. 10.1016/S0010-0277(87)80006-9
  45. Lindeman M., Aarnio K. (2007). Superstitious, magical, and paranormal beliefs: an integrative model. J. Res. Personal. 41 731–744. 10.1016/j.jrp.2006.06.009
  46. Lu J. G., Liu X. L., Liao H., Wang L. (2020). Disentangling stereotypes from social reality: astrological stereotypes and discrimination in China. J. Pers. Soc. Psychol. 119 1359–1379. 10.1037/pspi0000237
  47. Mercier H. (2017). How gullible are we? A review of the evidence from psychology and social science. Rev. General Psychol. 21 103–122.
  48. Mercier H., Majima Y., Claidière N., Léone J. (2019). Obstacles to the spread of unintuitive beliefs. Evol. Hum. Sci. 1:E10. 10.1017/ehs.2019.10
  49. Mercier H., Majima Y., Miton H. (2018). Willingness to transmit and the spread of pseudoscientific beliefs. Appl. Cogn. Psychol. 32 499–505. 10.1002/acp.3413
  50. Mercier H., Sperber D. (2011). Why do humans reason? Arguments for an argumentative theory. Behav. Brain Sci. 34 57–74; discussion 74–111. 10.1017/s0140525x10000968
  51. Mermelstein S., Barlev M., German T. C. (2020). She told me about a singing cactus: counterintuitive concepts are more accurately attributed to their speakers than ordinary concepts. J. Exp. Psychol. General 150 972–982. 10.1037/xge0000987
  52. Mermelstein S., Barlev M., German T. C. (2019). “Tell me more about that melting lizard! Counterintuitive concepts trigger information search,” in Paper Presented at the Evolutionary Psychology Pre-Conference at the Annual Convention of the Society for Personality and Social Psychology (Portland, OR).
  53. Miton H., Claidière N., Mercier H. (2015). Universal cognitive mechanisms explain the cultural success of bloodletting. Evol. Hum. Behav. 36 303–312.
  54. Miton H., Mercier H. (2015). Cognitive obstacles to pro-vaccination beliefs. Trends Cogn. Sci. 19 633–636. 10.1016/j.tics.2015.08.007
  55. Moore D. W. (2005). Three in Four Americans Believe in Paranormal. Washington, DC: Gallup.
  56. Morin O. (2013). How portraits turned their eyes upon us: visual preferences and demographic change in cultural evolution. Evol. Hum. Behav. 34 222–229.
  57. Norenzayan A., Atran S., Faulkner J., Schaller M. (2006). Memory and mystery: the cultural selection of minimally counterintuitive narratives. Cogn. Sci. 30 531–553. 10.1207/s15516709cog0000_68
  58. Nyhof M., Barrett J. (2001). Spreading non-natural concepts: the role of intuitive conceptual structures in memory and transmission of cultural materials. J. Cogn. Culture 1 69–100. 10.1163/156853701300063589
  59. Onishi K. H., Baillargeon R. (2005). Do 15-month-old infants understand false beliefs? Science 308 255–258.
  60. Pennycook G., Rand D. G. (2021). The psychology of fake news. Trends Cogn. Sci. 25 388–402. 10.1016/j.tics.2021.02.007
  61. Poland G. A., Spier R. (2010). Fear, misinformation, and innumerates: how the Wakefield paper, the press, and advocacy groups damaged the public health. Vaccine 28 2361–2362. 10.1016/j.vaccine.2010.02.052
  62. Reber A. S., Alcock J. E. (2020). Searching for the impossible: parapsychology’s elusive quest. Am. Psychol. 75:391. 10.1037/amp0000486
  63. Shtulman A. (2013). Epistemic similarities between students’ scientific and supernatural beliefs. J. Educ. Psychol. 105 199–212. 10.1037/a0030282
  64. Shtulman A. (2017). Scienceblind: Why Our Intuitive Theories About the World Are So Often Wrong. New York, NY: Basic Books.
  65. Shtulman A., Harrington K. (2016). Tensions between science and intuition across the lifespan. Top. Cogn. Sci. 8 118–137. 10.1111/tops.12174
  66. Shtulman A., Valcarcel J. (2012). Scientific knowledge suppresses but does not supplant earlier intuitions. Cognition 124 209–215.
  67. Singh M. (2018). The cultural evolution of shamanism. Behav. Brain Sci. 41:e66. 10.1017/S0140525X17001893
  68. Slone J. (2007). Theological Incorrectness: Why Religious People Believe What They Shouldn’t. New York, NY: Oxford University Press.
  69. Spelke E. S. (1990). Principles of object perception. Cogn. Sci. 14 29–56.
  70. Spelke E. S., Kinzler K. D. (2007). Core knowledge. Dev. Sci. 10 89–96.
  71. Sperber D. (1985). On Anthropological Knowledge: Three Essays. New York, NY: Cambridge University Press.
  72. Sperber D. (1994). “The modularity of thought and the epidemiology of representations,” in Mapping the Mind: Domain Specificity in Cognition and Culture, eds Hirschfeld L. A., Gelman S. A. (Cambridge: Cambridge University Press), 39–67.
  73. Sperber D. (1996). Explaining Culture: A Naturalistic Approach. Oxford: Blackwell.
  74. Sperber D. (1997). Intuitive and reflective beliefs. Mind Lang. 12 67–83. 10.1111/j.1468-0017.1997.tb00062.x
  75. Sperber D. (2000). Metarepresentations: A Multidisciplinary Perspective. New York, NY: Oxford University Press.
  76. Sperber D., Clément F., Heintz C., Mascaro O., Mercier H., Origgi G., et al. (2010). Epistemic vigilance. Mind Lang. 25 359–393.
  77. Stahl A. E., Feigenson L. (2015). Observing the unexpected enhances infants’ learning and exploration. Science 348 91–94. 10.1126/science.aaa3799
  78. van der Linden S., Leiserowitz A., Rosenthal S., Maibach E. (2017). Inoculating the public against misinformation about climate change. Global Challenges 1:1600008. 10.1002/gch2.201600008
  79. Vosniadou S. (1994). Capturing and modeling the process of conceptual change. Learn. Instruct. 4 45–69. 10.1016/0959-4752(94)90018-3
  80. Whitson J. A., Galinsky A. D. (2008). Lacking control increases illusory pattern perception. Science 322 115–117. 10.1126/science.1159845
  81. Wood M. J., Douglas K. M., Sutton R. M. (2012). Dead and alive: beliefs in contradictory conspiracy theories. Soc. Psychol. Personal. Sci. 3 767–773.

Data Availability Statement

The original contributions presented in the study are included in the article/supplementary material; further inquiries can be directed to the corresponding author(s).


Articles from Frontiers in Psychology are provided here courtesy of Frontiers Media SA
