2020 Apr 16;23(3):505–518. doi: 10.1007/s11019-020-09951-6

Medical conspiracy theories: cognitive science and implications for ethics

Gabriel Andrade
PMCID: PMC7161434  PMID: 32301040

Abstract

Although recent trends in politics and media make it appear that conspiracy theories are on the rise, in fact they have always been present, probably because they are sustained by natural dispositions of the human brain. This is also the case with medical conspiracy theories. This article reviews some of the most notorious health-related conspiracy theories. It then examines, using concepts from cognitive science, the reasons why people come to believe these theories. On the basis of that knowledge, the article makes normative proposals for public health officials and health workers as a whole on how to deal with conspiracy theories, in order to preserve some of the fundamental principles of medical ethics.

Keywords: Medical conspiracy theories, Cognitive science, Ethics, Human brain, Public policy

Introduction

Conspiracy theories are narratives about events or situations that allege there are secret plans to carry out sinister deeds. Although such narratives have existed for a long time, scientists are only beginning to understand why people come to believe them in the first place. The medical world is not spared these dynamics. Conspiracy theories can do a lot of harm, and that is why there is an urgent need to study them. By better understanding how they arise and spread, we can begin to propose concrete measures to educate the public and prevent people from being captivated by these narratives.

In recent times, celebrities in Western media have shown interest in conspiracy theories. It is not entirely clear whether celebrities’ beliefs in conspiracy theories are genuine or simply publicity stunts. When it comes to medical conspiracy theories, some celebrities do seem to honestly believe them. Jenny McCarthy (Gottlieb 2016), Jim Carrey (Bearman 2010), Robert De Niro (Sharfstein 2017) and Bill Maher (Parker-Pope 2009) have been very vocal in their opposition to vaccines, citing their alleged links to autism.

Although celebrity culture has been nourishing conspiracy theories for some decades, it appears that more recent political events have increased the popularity of conspiracy theories in the Western world. As early as 1964, Richard Hofstadter (2012) studied the so-called “paranoid style” in American politics. But many observers converge on the idea that Donald Trump’s political ascendancy has marked a new era of conspiratorial thinking in the United States and other countries under its sphere of influence (Hellinger 2019). Before being a politician, Trump was a celebrity, and both roles have helped him make conspiracy theories more popular among his followers. He is on record giving credibility to numerous conspiracy theories: global warming is a hoax, Barack Obama was not born in the United States, Rafael Cruz participated in J.F. Kennedy’s assassination, Bill Clinton ordered the assassination of Vince Foster, Antonin Scalia was murdered, vaccines cause autism.

But even if celebrity influence is more of a modern phenomenon, it is nevertheless true that conspiracy theories have been present throughout history. In fact, as documented by Uscinski and Parent (2014), conspiracy themes have been persistent in American opinion for more than a century. Going further back, Joseph Roisman (2006) has documented how the rhetoric of conspiracy was already prominent in Ancient Greece, and Suetonius’ telling of the lives and times of Rome’s first twelve Caesars is also filled with all sorts of conspiracy theories and rumors. Anthropologists have documented conspiracy theories in peoples as diverse as the Yanomami (Chagnon 1983) and the Azande (Evans-Pritchard 1963). The fact that even hunter-gatherers (Von Rueden and van Vugt 2015) have conspiracy theories seems to indicate that this is indeed a universal phenomenon (West and Sanders 2003).

Therefore, even though particular social contexts may magnify the prevalence of conspiracy theories, it is well established that conspiracy theories have deep psychological bases that are present in all human beings. In this article, I shall rely on principles of cognitive science to attempt to understand why people believe conspiracy theories. This is an important endeavor in relation to medical ethics, because some of the most prominent conspiracy theories pertain to medicine. As documented by Oliver and Woods, the percentage of Americans accepting medical conspiracy theories is alarmingly high: for example, only 44% disagree with the claim that doctors vaccinate children even though they know vaccines are harmful; 37% agree that the FDA refuses to release the cure for cancer; only 46% disagree that fluoridation is a secret plot to poison people (Oliver and Woods 2014). This has important implications, as exposure to medical conspiracy theories influences health behaviors (Jolley and Douglas 2014).

Consequently, medical ethicists and public health officials must find a way to overturn medical conspiracy theories. In order to do this, we must come to an understanding of why people believe these theories in the first place. This will allow us to design public policies so that acceptance of conspiracy theories remains limited and the public is better informed, enabling decisions made on the basis of informed consent. Such policies would satisfy the principle of autonomy in medical ethics, as well as other ethical principles specific to public health, such as public engagement and communication (Heldman et al. 2013).

Therefore, the aim of this article is to review the existing literature regarding the psychological and sociological reasons why people believe in conspiracy theories, on the basis of the findings of cognitive science. By “belief”, we shall understand the affirmation that something is true, regardless of its rationale; the concept may also encompass personal attitudes toward particular claims about the world. With that information, another aim of this article is to introduce some ethical implications and provide an exploratory framework for the ethical design of public policies aimed at eradicating some of the most prominent conspiracy theories in healthcare. This is a particularly innovative approach, since there is much theoretical material on the workings of conspiracy theories, but very little on how these theoretical approaches help us understand conspiracy theories specific to the medical realm.

Medical conspiracy theories: a brief review

Health-related conspiracy theories are not necessarily a new phenomenon. To understand how they have come to be, we first need to settle on a working definition. We can provisionally define them as attempts to explain particular events or situations as the result of the actions of a small, powerful group with perverse intentions. That does not imply that conspiracy theories are necessarily false, because in some cases small evil groups have indeed conspired to bring about unfortunate situations. But, for the most part, conspiracy theories rely on sloppy thinking, and they present scenarios that are not accurate. In the medical world, this has been a constant.

Edward Jenner’s discovery of the vaccine against smallpox is a major milestone in the history of medicine, but this event marked the beginning of a new wave of conspiracy mongering (Dube et al. 2015). Public opinion did not properly understand how vaccines work, and soon enough there were rumors that taking vaccines would make people grow horns (it must be remembered that the vaccine originated with cows), or would actually kill people.

Ever since, conspiracy theories regarding vaccines have remained popular in public opinion. In the 1980s, Dr. John Wilson made a great fuss about the DPT vaccine allegedly causing convulsions and cerebral damage (Dyer 1987). In 1998, Andrew Wakefield published an article claiming that the MMR vaccine is linked to autism. Although this paper was thoroughly refuted, retracted by the journal that published it, and disavowed by most of Wakefield’s co-authors, it ultimately unleashed a new wave of moral panic against vaccination (Goldacre 2008). In recent years there have been occasional outbreaks of measles in affluent areas, and the main factor seems to be that parents choose not to vaccinate their children, out of fear that they may develop autism. In this conspiracy theory, pharmaceutical companies know that vaccines are not safe, but because they still make big profits from them, they deliberately keep this information hidden.

Conspiracy theories about vaccines have become even more popular in non-Western countries. For example, Pakistan is one of only three remaining countries where polio has not been eradicated. In the early 1990s, the annual incidence of polio in Pakistan was about 20,000 cases, largely due to failures in vaccination campaigns. In Pakistan, there is a persistent conspiracy theory that the polio vaccine is a ploy designed by the CIA to make Muslim men sterile (Andrade and Hussain 2018).

Narratives about the origins of viruses are very popular in conspiracy theories. AIDS, in particular, has been interpreted as an invention by the US government to reduce black populations. Consequently, this theory is notoriously believed by African American men, and as a result, they tend to use condoms less frequently (Bogart and Bird 2003). In fact, in this population, the belief that birth control methods are a plan for genocide is also prevalent, thus further reducing the use of condoms for safer sex practices (Thorburn and Bogart 2005).

Apart from the narrative about the origins of AIDS, there is also the conspiracy theory according to which HIV does not cause AIDS, and antiretroviral medication is actually the culprit for most deaths among AIDS patients. This conspiracy theory is particularly popular in sub-Saharan Africa, with various prominent politicians giving it credence and promoting it (Fourie and Meyer 2010). This form of AIDS denialism has caused considerable damage in Africa, and it is of urgent epidemiological concern.

Another virus that frequently draws the attention of conspiracy theorists is Ebola. Conspiracy theorist Leonard Horowitz has been very active in promoting the idea that Ebola was manufactured by the US government, and as a result, some of his followers have recommended not vaccinating children against any disease whatsoever (Knight 2013). SARS and COVID-19 have also been discussed in conspiracy theory circles, either as biological weapons against the Chinese, or as inventions of the Chinese government.

The trope that big pharmaceutical companies have the cure for cancer or other deadly diseases, yet do not release it (either to make profits or simply as population control), is also persistent in conspiracy theories. Likewise, some alternative therapies for cancer have been proposed, and despite the lack of evidence in their support, many conspiracy theorists claim that they are effective but that the scientific establishment conspires against them. This has been especially the case with Laetrile, a synthetic form of amygdalin that has been defended as a cure for cancer by many conspiracy theorists, who advocate it as a replacement for more effective treatments (Ernst 2019).

Issues of substance abuse have also been the subject of various conspiracy theories. It is frequently alleged that marijuana is a safe drug and was only outlawed under pressure from the paper industry, as the hemp plant was a competitor. By contrast, conspiracy theorists typically accept that cocaine is a dangerous drug, but many believe that the crack cocaine epidemic across the United States in the 1980s was actually due to a US government plan to specifically target African Americans and keep them addicted, while at the same time profiting from the illegal trade to finance paramilitary groups in Nicaragua (Webb 2019).

Cognitive science of medical conspiracy theories

It should be noted that, to date, there is no single explanation for conspiracy theories. There are multiple correlations and explanations of particular aspects of conspiracy beliefs, but not necessarily a coherent whole that theoretically encapsulates all conspiracy theories. In this section, I shall approach some important findings of cognitive science pertaining to conspiracy theories, but it is important to keep in mind that these do not necessarily constitute a unified theoretical approach, because this is still a developing field.

Nevertheless, there is a unifying thread in the approaches that will be addressed. That thread is an explanatory framework as to why people believe in conspiracy theories, on a neuroscientific, psychological, and sociological level. Even though these approaches may come from different theoretical perspectives, they complement each other, to the extent that they establish correlations amongst variables and offer some measure of predictive factors regarding the proclivity to believe conspiracy theories. Inasmuch as the disposition to believe conspiracy theories has some firm biological grounding, I shall examine how evolutionary theory accounts for specific psychological mechanisms that primed humans to believe things like conspiracy theories. But, given that conspiracy theories are further developed on account of environmental factors, I shall also examine studies that relate to environmental variables (both psychological and sociological) that facilitate the rise of conspiracy theories.

Conspiracy theories spread very easily. Although the technological advance of social media plays a significant role in their dissemination, it is still true that, even in a preindustrial world without media technologies, conspiracy theories spread easily by word of mouth. Conspiracy theories rely on rumor, and cognitive science has produced significant research documenting how gossiping is hardwired in human brains (Rosnow 1991).

When scientists say that something is “hardwired” into the brain, they mean that particular beliefs or behaviors are constant in the human species because they arise from predetermined arrangements of physical connections between nerve cells (Ottersen and Helm 2002). Now, it is important to keep in mind that rumoring by itself is not the same as a conspiracy theory. Rumoring is a daily affair in human behavior. Humans are hardwired to rumor, but not necessarily to form conspiracy theories. For conspiracy theories to arise, there probably need to be additional psychological and sociological environmental conditions that facilitate their development. But these developments are ultimately built on the biological bases of the brain that prime humans for activities such as gossiping.

Gossiping was an important adaptation in human evolution (McAndrew and Milenkovic 2002). Robin Dunbar defends the view that the main factor in the origin of language is gossip itself (Dunbar 1996). In fact, it is estimated that 80% of conversations are about other human beings. Hominids likely needed to form bands as a way to ensure survival, and their greatest threats came not only from predators, but also from other bands. Thus, in order to ensure group alliance as a defense against others, constant gossip served as a way to cement bonds and exclude individuals perceived as dangerous.

It is thus expected that, when considering the causes of particular health problems, human beings will always have the inclination to talk about other human beings in relation to these problems. Consequently, the conversation becomes more interesting if the culprits behind diseases are not just microorganisms, cancer cells or unhealthy foods, but rather other human beings. And, since these are initially rumors about other people, they will ultimately spread rather quickly.

The rise of electronic technology, and most especially the internet, is also a considerable factor in more recent conspiracy theories (Clarke 2007). In natural conditions, rumor is prevalent but still limited, because communication relies on proximity. However, with the rise of the internet, the effects of rumor have been further amplified, because conspiracy theorists are now able to connect via forums with people in more distant locations, thus reinforcing their worldviews (Wood 2013).

Although the discipline of memetics has come under sustained criticism, research into how ideas spread more easily has made important advances (Blackmore 1999). One particularly useful tool in the study of how ideas stick comes from the cognitive science of religion: minimally counterintuitive effects. It has been established that belief in conspiracy theories is associated more with intuitive than with analytic thinking (Swami et al. 2014). But conspiracy theories are more popular if they retain an element of minimal counterintuitiveness. As Boyer explains this notion, concepts that violate a few ontological expectations of a category are more memorable than fully intuitive or maximally counterintuitive concepts (Boyer 1994). Religious concepts such as fairies, demigods, healing powers, miracles, and so on, are more easily remembered because they step out of the ordinary and defy the way things conventionally happen. In the same manner, conspiracy theories stick relatively easily, because they involve concepts that are not so common: there is no everyday expectation that evil scientists in labs manufacture deadly viruses, or that dentists advance water fluoridation in order to make people more stupid. This appeal to counterintuitive concepts ultimately leads conspiracy theorists to frequently make contradictory claims. For example, research shows that many people who believe Princess Diana staged her death also believe she was killed by the royal family (Wood et al. 2011). Likewise, medical conspiracy theories frequently claim that pharmaceutical companies profit from making people buy ineffective cures for cancer (and thus letting people die), yet at the same time keep them alive so that they continue to be clients.

Yet, concepts that are too strange (maximally counterintuitive) do not stick either. That is how Slone (2004) explains why people frequently defend “theologically incorrect” views that, although not approved by official doctrinal teachings of religions, make more sense on an intuitive level. In one particularly useful study, Norenzayan et al. (2006) document how minimally counterintuitive narratives are more easily remembered by subjects.

This also applies to conspiracy theories. Medical conspiracy theories are most likely false, but they are not outrageously bizarre. On the surface, they do have some level of plausibility, especially taking into account that some medical conspiracies have turned out to be true. A greater proportion of African Americans falsely believe that AIDS was designed by the US government to reduce their population, but it is not false that the US Public Health Service did engage in human experimentation with African American males in the infamous Tuskegee Study of Untreated Syphilis from the 1930s to the 1970s. It is false that vaccines cause autism, but it is true that vaccinations in St Louis caused the death of 13 children in 1901. It is likely false that the US government conspired to get African Americans addicted to crack cocaine, but it is true that the CIA carried out experiments with LSD to test mind control in the MK-Ultra project. And the list of real medical conspiracies does not end there. Moreno (2013) provides an extensive list of secret State experiments on humans throughout history, and there have been plenty of documented cases of unethical human experimentation and reckless medical procedures (McNeill 1993). These serve as foundations of factual information on which conspiracy theorists elaborate, ultimately making their claims more intuitive.

Mark Fenster even believes that conspiracy theories may serve an ethical purpose: in democratic societies where public opinion is a force to be reckoned with, they hold potential conspirators in check (Fenster 1999). However, the evidence more strongly suggests that conspiracy theories have numerous detrimental effects, both social and psychological. On a social level, conspiracy theories are empirically associated with populism (Silva et al. 2017), political extremism (Van Prooijen et al. 2015), and radicalization of fringe groups (Bartlett and Miller 2010). On a psychological level, research shows that belief in conspiracy theories is associated with paranoia (Darwin et al. 2011), schizotypy, narcissism (Cichocka et al. 2016a, b) and insecure attachment (Green and Douglas 2018a, b). In fact, ever since Hofstadter’s seminal The Paranoid Style in American Politics, the assumption has been that conspiracy theorists suffer from a form of psychopathology associated with paranoia.

It is true that some influential conspiracy theorists have been markedly paranoid. For example, Nesta Webster notoriously never answered her door without carrying a gun. However, the consensus now is that most conspiracy theorists are not pathological, precisely because their beliefs ultimately rely on cognitive tendencies that are neurologically hardwired and probably have deep evolutionary origins. Paranoia works on a personal level (the individual feels personally attacked), whereas conspiracy theories are about threat perception as a group (Van Prooijen and Van Lange 2014). And even if conspiracy theorists do feel paranoid on a personal level, this is not necessarily pathological, as paranoid traits exist on a continuum in the general population (Bebbington et al. 2013).

Conspiracy theories are not so much explained by paranoia, but rather by natural inclinations towards agency detection. Stewart Guthrie’s (1995) cognitive science of religion is relevant in this regard: according to his theory, religious beliefs come mostly as a result of the human brain’s tendency to attribute agency and detect patterns, usually in the form of anthropomorphism. The same principle applies to conspiracy theories. It has been empirically established that the tendency to detect agency in inanimate stimuli predicts belief in conspiracy theories (Imhoff and Bruder 2014). Evolutionarily, agency detection was an important advantage, as error management theory would predict, under the principle of “better safe than sorry” (Haselton 2000). As Gray and Wegner (2010) explain this principle, under the threat of predators, “the high cost of failing to detect agents and the low cost of wrongly detecting them… [suggests] that people possess a Hyperactive Agent Detection Device, a cognitive module that readily ascribes events in the environment to the behavior of agents”. In medical conspiracy theories, unfortunate things (such as, say, the outbreak of some virus) cannot just happen without a purpose. Some agent must be behind it. And thus, instead of accepting that AIDS spread because of contact with chimpanzees in Africa, conspiracy theorists are more satisfied attributing agency to the whole phenomenon, preferring to believe that some cabal actually designed the deadly virus.

Heider and Simmel’s (1944) famous experiment with purposeless movements of geometric shapes demonstrated that most subjects tend to attribute intentions and agency to those shapes. Developmental psychologists have long asserted that teleological thinking is deeply ingrained in preschool children (Kelemen 1999), and understanding randomness requires more mature cognitive functions that not all human beings develop to the same extent. It has been documented that believers in conspiracy theories are even more likely to detect nonexistent patterns in random data (Van Prooijen et al. 2018). In fact, many conspiracy theorists acknowledge that their work is mostly about “connecting the dots”, as in David Icke’s Dot Connector video series. Indeed, conspiracy theories frequently fall under the category of “monological belief systems” (Hagen 2018), i.e., a set of interconnected ideas that are mutually reinforcing. Thus, medical conspiracy theories are frequently not just about health issues. They are embedded in a grander scheme of things, and usually involve the typical suspects: Masons, Illuminati, etc., as well as greater conspiratorial themes, such as the New World Order and population control. This has been very typical of Nancy Turner Banks (2010), a prominent writer about medical conspiracy theories, who frequently brings Jews into her explanations. Conspiracy theorists try to make sense of the world by providing an overly simplistic explanation of phenomena. This is usually done by focusing exclusively on one single idea, and explaining everything else on the basis of that idea.

The monological aspect of conspiracy theories has two important implications. First, inasmuch as everything is connected in a grand conspiracy, the single best predictor of belief in one conspiracy theory is belief in a different conspiracy theory (Goertzel 1994). And second, inasmuch as conspiracy theories reinforce each other, they ultimately become incorrigible: evidence against them is interpreted as evidence of a conspiratorial effort to suppress them (Grimes 2016), thus confirming the original conspiracy theory. This ultimately becomes a form of cognitive dissonance. As documented by Festinger (1957) in a famous study, whenever individuals strongly adhere to beliefs that turn out not to be true, this causes discomfort. But only rarely will individuals acknowledge they are in error. More frequently, individuals will accommodate that discomfort by adjusting the original belief, so that the evidence against it can now be reinterpreted as confirming the original belief.

It is also true that, inasmuch as conspiracies are embedded in a grand scheme of things, there is also a tendency to explain big events with big causes. This is the so-called proportionality bias. Conspiracy theorists cannot accept that something as big and deadly as the AIDS epidemic came out of something as trivial as casual contact between chimpanzees and humans. In their mind, such a big phenomenon needs to have bigger causes, such as evil scientists designing HIV to wipe out specific populations. Rob Brotherton (2013) neatly observes that there are countless conspiracy theories about JFK’s assassination, but very few about the attempted assassination of Ronald Reagan. The difference between the two cases reflects this proportionality bias: in the first case the president was assassinated, so big explanations are sought; in the second case, the president survived, so conspiracy theories about that event were soon forgotten.

This is also the case with explanations for medical phenomena. In one particular study, Ebel-Lam et al. (2010) found that when subjects read about a disease outbreak that does not lead to deaths, they are less likely to believe that the outbreak was intended; by contrast, when another group of subjects read a story in which the outbreak does result in deaths, they are more likely to attribute it to a conspiracy.

Piaget and Inhelder (2008) documented how children in preoperational stages rarely believe that accidents just happen. This suggests that, intuitively, we are intention seekers. In fact, Evelyn Rosset’s (2008) empirical studies demonstrate that subjects pressed for time are more likely to explain things with a greater intentionality bias, since under time pressure intuition overtakes analytical thinking. Likewise, for conspiracy theorists, there are no accidents. In their mindset, bad things always come from bad agents. Many medical procedures may have minor unfortunate side effects, but conspiracy theorists have trouble understanding that these side effects are not necessarily intended.

Therefore, conspiracy theorists have more difficulty accepting coincidences, and may struggle with the idea that events that superficially appear connected in fact are not. For example, most symptoms of autism are first observed when the child turns three years old, slightly after children usually receive the MMR vaccine. Consequently, by “connecting the dots”, conspiracy theorists fall prey to the post hoc ergo propter hoc fallacy (“after this, therefore because of this”), and erroneously come to believe that, simply because the MMR vaccine precedes the first symptoms of autism, the former causes the latter.

Humans are natural intention seekers, but this tendency is especially enhanced under conditions of anxiety. In one particular study, subjects under anxiety were more prone to perceive patterns in random sequences of dots (Brotherton 2013, p. 7). As Malinowski (1992) theorized, magical thinking becomes especially prominent in the face of uncertainty. Superstitious behavior, which also “connects the dots” by establishing causal relationships amongst unrelated phenomena, becomes more prominent in times of difficulty. Thus, it is expected that conspiracy theories abound more in times of crisis, and in marginalized populations that face greater challenges.

This has been empirically confirmed. Conspiracy theories appear more frequently in the contexts of fires, floods, epidemics and wars (McCauley and Jacques 1979). Feelings of powerlessness also predict conspiracy beliefs (Abalakina-Paap et al. 1999). People who make a connection between vaccines and autism are frequently parents of autistic children themselves. There is no known cause or cure for autism, so in those cases feelings of powerlessness are considerable, and this feeds further into the theory that there is a conspiracy at play. By contrast, diabetes has well-established causes and better prospects of treatment; consequently, little conspiracy mongering surrounds this disease.

Likewise, empirical studies assert that conspiracy beliefs are particularly high among members of stigmatized minority groups (Davis et al. 2018). White Christian Americans are not likely to argue that the US government is out to make them sterile, presumably because they are not stigmatized; this sort of claim is more likely to be made by African Americans or Muslims, who feel the heat of discrimination more closely.

Anxiety-provoking situations more easily elicit so-called “illusions of control”, and that partly explains how magical thinking arises in difficult situations, as a way to attempt to control the world. In this regard, conspiracy theories also operate similarly to religions, as explanations for incomprehensible phenomena. It is relatively hard to understand how fluoride helps prevent cavities; in the face of the anxiety evoked by this lack of knowledge, an easier explanation is simply to say that fluoridation is actually an evil Communist plot to destroy America. Conspiracy theories therefore provide an “illusion of explanatory depth” (Rozenblit and Keil 2002), and the best way for conspiracy theorists to assure themselves that they are on the right explanatory track is by constantly engaging in confirmation bias (Klayman 1987).

Additionally, Dan Sperber argues that even when religious believers (or conspiracy theorists) are aware that their theories do not sufficiently explain the phenomena they address, they are guided by “meta-representations”: they delegate the filling in of the details to experts (Sperber 2000), and continue to hold their beliefs.

Evolutionarily, anxiety was an important adaptation, and consequently it is no surprise that human beings are hardwired for constant anxious feelings and behaviors. The sympathetic nervous system activates the fight-or-flight reaction, and this was surely an adaptive mechanism in the face of predators and other threats. Neuberg et al. theorize that human brains are equipped with “threat management systems” that, much as error management theory would predict, condition humans to constantly focus on and react to things that may pose dangers (Neuberg et al. 2010). Unsurprisingly, we react more quickly to snakes than to flowers, as has also been empirically documented (Ohman et al. 2001). This particular mind module has facilitated the avoidance of diseases; we recognize danger in germs (only intuitively, of course, as a formal theory of microorganisms only came to be in the 19th century), so consequently we avoid excrement, even in cases when we know it is just fudge, as has been empirically tested in studies (Rozin and Fallon 1987). However, this threat management system frequently backfires by interpreting as dangerous situations that, in fact, are not. That is how we come to believe that fluoridation, antiretrovirals, vaccines, etc., are dangerous.

Perhaps even more so than predators and germs, other human beings also represented significant dangers in human evolution. Human beings naturally form coalitions against other human beings, and tribal feelings easily arise (McDonald et al. 2012). Similar patterns have been observed in chimpanzees (Wrangham 1999). Thus, the capacity to detect alliances and figure out how outsiders band together against one’s own group was a very important adaptation. As Tooby and Cosmides (2015) posit, human brains are equipped with an “alliance detection” system. Conspiracy theories put this system into play: in their mental patterns, conspiracy theorists bring together unrelated people and conclude that they are forming an alliance behind closed doors, planning to harm a particular collective. For the most part, physicians are unrelated to politicians, but given that physicians are frequently perceived as a distinct group in its own right (and indeed they are, given that they are professionally organized as such), conspiracy theorists align them with the rest of the outsiders, and imagine that they form coalitions to plot against patients.

Alliance detection systems enhance an “us-against-them” mentality (Cikara et al. 2011). This is of course very typical in nationalism, and unsurprisingly, it has been empirically established that conspiracy theories are related to “collective narcissism” (Cichocka et al. 2016a, b). In European and Middle Eastern history, Jews have frequently been suspected of being disloyal to the countries in which they live (“rootless cosmopolitans”), and that is presumably one additional factor why they are frequently included in medical conspiracy theories. Scapegoating also plays a significant role in conspiracy mongering (Girard 1986). Inner divisions and difficulties can be channeled towards an outsider, who takes the blame for the community’s problems. African leaders have failed to control the AIDS epidemic, but in order to divert blame and cement group unity, they opt to engage in conspiracy mongering by attributing the origin of the epidemic to outside conspirators, whoever they may be.

Ethics and implications for policy

Although some philosophers have attempted an ethical defense of conspiracy theories (Dentith 2014), mostly on the basis that they keep a healthy democratic check on powerful elites and that some conspiracy theories have turned out to be true, it is safe to argue that conspiracy theories do more harm than good. As previously mentioned, conspiracy theories have deleterious social and psychological effects, and especially in the medical realm, they lead to poor health behaviors. So, it can be assumed that there is an ethical duty for physicians and public health officials to attempt to mitigate medical conspiracy theories. But how? The answer is not so clear, although the preceding information and arguments may provide some guidance.

First, it is important to acknowledge that conspiracy theories are not necessarily pathological, and that they rely on evolved mental mechanisms that are hardwired in human brains. Consequently, public health officials can never hope to entirely eradicate medical conspiracy theories, and when they encounter them, they must patiently attempt to refute them, but never disrespect those who defend them, because conspiratorial thinking is, alas, quite natural.

As argued above, given their adherence to monological belief systems, conspiracy theories are frequently incorrigible, and attempts at refutation with convincing evidence would presumably be interpreted as confirmation of the original conspiracy theory. This is known as the “backfire effect” (Nyhan and Reifler 2010). For example, one particular study found that showing vaccine skeptics a story about a baby who is hospitalized because of measles nearly doubled the proportion of skeptics who thought it very likely that vaccines have serious side effects (Nyhan et al. 2014).

It would then appear that greater levels of education are useless in countering conspiracy theories. On one level, this appears to be true. Bogart and Thorburn document that higher levels of education do not necessarily protect against acceptance of conspiracy theories. In fact, especially in a medical context, greater education may increase adherence to conspiracy theories, because individuals can reaffirm their suspicion by learning about real conspiracies, as appears to be the case with African Americans who learn about the Tuskegee syphilis experiment (Nelson et al. 2010).

However, as a whole, education does predict decreased belief in conspiracy theories, and this has been empirically examined with larger sets of data (Van Prooijen 2016). Recall that conspiracy theories rely more on intuitive (and also minimally counterintuitive) approaches. So, as thinking becomes more analytical and less intuitive, conspiracy theories make less sense. In fact, more powerful than the “backfire effect” is the “elusive backfire effect”, i.e., people do abandon conspiracy thinking once they encounter its inconsistencies and lack of evidence (Wood and Porter 2019). This has been especially true in health-related contexts. Health information campaigns do turn out to be successful, and they are effective in correcting the distortions of conspiracy theories (Bode and Vraga 2018).

So, one important implication of this analysis is that health literacy, critical thinking, and general education as a whole can reduce belief in conspiracy theories. Public health officials need to keep this in mind when designing public policy, and physicians need to be prepared to act as educators as a complement to their clinical role.

Cognitive science has established some concrete parameters as to how to make communication campaigns more effective, especially if they pertain to medical conspiracy theories. One important feature of this approach is the emphasis on rhetorical tools that rely less on the emotional centers of the brain. Scare tactics have long been discouraged in public health campaigns, although occasionally they have been tried, with mixed results. For example, a 1997 anti-smoking campaign in Australia used frightening images on a massive scale, with seemingly positive results (Hill et al. 1998). But more extensive research has proven otherwise. Backer et al. (1992) have done extensive studies showing that techniques such as showing the effects of tobacco on dentition are generally not effective.

Cognitive science tells us that information processed in the amygdala (such as in the fear response) is received differently, without due rational consideration (Dolan and Vuilleumier 2003). Consequently, when the dangers of, say, not vaccinating children are presented with stark images of children suffering from measles, subjects typically fail to process the message that the public health campaign may be trying to convey. Paradoxically, subjects may continue to engage in the behavior that public health officials aspire to eradicate.

Given the “elusive backfire effect”, it is more useful for policy makers to design campaigns that engage the rational aspect of information processing when attempting to address medical conspiracy theories. The excessive use of catastrophic scenarios (say, a measles epidemic as a result of not vaccinating children) may push individuals toward anxious attachment. Magnetic resonance imaging studies suggest that individuals with higher levels of anxious attachment show significantly increased amygdala activity (Riem et al. 2012). One important study found that anxious attachment predicts the general tendency to believe conspiracy theories (Green and Douglas 2018a). Therefore, the use of disturbing material to address conspiracy theories (even in the attempt to refute them) may further contribute to people accepting such theories.

One relevant contribution of cognitive science to the design of public health campaigns is research on frame, appeal type, and outcome extremity, and their relationship to the way public information is processed by the brain. In the case of information campaigns addressing medical conspiracy theories, these three elements must be considered, so as to get a clearer picture of what is to be achieved, and to what effect. Various studies have shown that loss-framed messages with more extreme outcomes are more likely to be remembered (Leshner and Cheng 2009); this implies that information campaigns addressing medical conspiracy theories should include more extreme outcomes. In concrete terms, if a campaign is to address, say, a conspiracy theory regarding HIV denialism, the message should sufficiently emphasize the details that the theory fails to explain well.

In their educational efforts, public health officials also need to clarify things and make themselves understood. Recall that conspiracy theories frequently fill explanatory gaps, and they serve as heuristics to reduce anxiety in the face of the unknown. In this regard, educational campaigns addressing medical conspiracy theories must make sure to include the rationale for addressing the conspiracy theories in the first place, as well as the use of communication-persuasion matrices (McGuire 1984). In particular, the use of visual aids has proven to be crucial in health campaigns, as confirmed in various studies (Garcia-Retamero and Cokley 2013), and they should prove especially apt in refuting conspiracy theories. Visual aids rely on intuition (Weitlaner et al. 2013), and recall that conspiracy theories also arise out of intuitive thinking. So the message that runs counter to conspiracy theories must be presented in a similarly intuitive manner, or else it will not be able to compete in grabbing the public’s attention.

In order to ensure that the public is getting the right message, public health officials must consider lobbying for more advertising campaigns in the media. Some might fear that talking about a conspiracy theory raises the issue among people who had never thought about it in the first place. But, as the principles of cognitive science discussed above suggest, if the theory is properly addressed with sufficient persuasive power, bringing up the topic may even put people on guard, so that they are better cognitively prepared when they encounter conspiracy theories for the first time.

Furthermore, apart from advertising campaigns, mandatory screenings of short films whenever citizens have to comply with State requirements (school registration, acquisition of a driver’s license) can also prove effective in raising wider awareness of the need to disavow medical conspiracy theories. This approach has proven to work in vaccination campaigns, as well as in sign-ups for organ donation (Evers et al. 1988).

Another important aspect of any health literacy campaign addressing conspiracy theories is a more thorough understanding of what people believe, and the reasons they offer for doing so. The use of focus groups is very important in this regard. For example, prior to targeting African Americans in a public health campaign explaining why it is important for them to seek preventive medical care, it is important to form focus groups so as to hear from them what they know and think about the Tuskegee syphilis experiment. In fact, research of this kind has been done with focus groups (Freimuth et al. 2001), and it has been found that, although subjects are aware of the incident, they do not understand the full details. Hearing from subjects themselves puts public health officials in a better position to address the particular concerns that members of disadvantaged communities may have, and to specifically target aspects that may lend themselves to misinterpretation and, consequently, distortion in conspiracy theories.

Cognitive science suggests that focus groups are particularly important, given the powerful effect of information transmission within communities (Acocella 2012). Recall that conspiracy theories are related to gossiping, for the same evolutionary reasons. Therefore, reliance on group dynamics facilitates the uninhibited expression of opinions regarding conspiracy theories (Kitzinger 1995), and researchers can thereby get a better grasp of which ideas are more likely to spread. On the basis of this information, public health officials can target particular ideas in their health literacy campaigns, placing educational efforts on those aspects that most frequently arise in focus group discussions.

Likewise, conspiracy theories are typically defended by the dispossessed and by individuals who feel powerless. It has long been established that large social and economic inequalities lead to suspiciousness and collective paranoia (Swami and Coles 2010). Groups that find themselves at the lower end of the socio-economic scale begin to wonder how they got there in the first place, and they inevitably conclude that they have been cheated in a conspiracy.

One particularly influential study is informative in this regard. Foster (1974) studied how rural communities in Mexico become resentful whenever someone acquires a greater share of land. In these communities’ worldview, land is a “limited good”, and therefore whoever increases their share must have done so on the basis of some conspiracy. Eventually, the more prosperous landowners are accused of using witchcraft. This case clearly expresses how dispossession and powerlessness may lead to conspiracy mongering.

Given that inequality and powerlessness are significant causes of conspiracy mongering, policy designers must address this problem. One particularly effective approach is wealth redistribution through universal service policies (Mueller 1999). However, political attempts to increase universal accessibility to health care may in turn further feed conspiracy theories. For example, the Affordable Care Act (Obamacare) in the United States played into the hands of conspiracy theorists who were already suspicious of Obama’s background and intentions (Quadagno 2014). In fact, it has been empirically shown that in the American public there are significant misperceptions of Obamacare (Pasek et al. 2015), which makes it all the more necessary to address these misconceptions before they turn into conspiracy theories.

Increased access to health care does predict lower adherence to conspiracy theories, and for that reason, any attempt to eradicate conspiratorial thinking regarding medical issues must rely on an attempt to make universal healthcare more expansive. Expanding a safety net is an ambitious goal, and may be more of a political talking point than a concrete proposal by public health officials. But one important aspect is communicating to the public that, ultimately, public health is in the interest of the common good. Recall that, as individuals preserve a sense of community, they rely more on communal links, and therefore become less suspicious of each other. If proper political steps are taken, so that citizens strengthen links to each other by universally receiving health care, the levels of paranoia that typically give rise to conspiracy theories may be significantly reduced.

The building of a sustainable safety net is also of great importance in this regard. The presence of a safety net would help prevent the dispossessed from engaging in conspiracy mongering, because even if they come to feel that they do not have a great say in the running of society, they at least preserve the satisfaction of being secure in case of extreme hardship.

Nevertheless, as Calomiris (1999) advocates, this safety net must be incentive compatible, so that it remains sustainable. One important feature of this safety net, which specifically pertains to medical conspiracy theories, is universal health care. Most industrialized countries have robust systems of universal healthcare, but the United States lacks one (Lasser et al. 2006). Not coincidentally, it has been empirically established that the perception that Big Pharma is just a business whose sole motivation is profit actually induces conspiracy mongering (Blaskiewicz 2013). A system of universal healthcare would decrease that perception and, in turn, reduce the proliferation of medical conspiracy theories.

John Rawls’ arguments in favor of a welfare state, on the basis of a “veil of ignorance”, are very relevant here (Korobkin 1998). If individuals design a society in which they envision themselves to be in the lowest position, they may be better able to understand what the society as a whole needs in order to keep its citizens healthy. Cognitive science has provided a thorough understanding of how imagination is crucial for forming moral opinions (Johnson 1994). One particularly important recommendation in this regard is to appeal, not necessarily to excessive emotions or scare tactics, but at least to plausible imagined scenarios in awareness campaigns, so that people may come to understand why particular policies, such as more expansive healthcare, are needed.

The internet has a big role to play in public health campaigns targeting medical conspiracy theories. Officials have realized that, of all social media, Twitter in particular plays a huge role in the shaping of opinions and the transmission of information related to health issues (Denecke et al. 2013). Twitter has the particular advantage of conveying messages in a limited number of words. From a cognitive science perspective, this proves very useful, because studies show that shorter messages can have more powerful effects in brain processing (Saharia 2015), especially if they pertain to emotional issues, as medical conspiracy theories tend to. Consequently, one important ethical implication from the cognitive science of medical conspiracy theories is that, inasmuch as these theories rely on simplistic sound bites, one efficient way of combating misinformation might be to rely on similarly short information that debunks the false narratives in circulation. For this endeavor, Twitter is ideal. One study has shown that misinformation and conspiracy mongering regarding the Zika virus was effectively countered on Twitter (Wood 2018).

Given the power of Twitter, public health officials must also encourage physicians, nurses, and other healthcare workers to embrace Twitter more proficiently, so as to push back whenever medical conspiracy theories arise. Hospitals may very well organize training sessions in which healthcare workers are taught to synthesize relevant information in the short space provided by Twitter. So far, it is unclear to what extent doctors use Twitter and other social media for medical purposes (Hawn 2009), but as a bulwark against medical conspiracy theories, their use should become more widespread among health professionals.

Yet Twitter, social media, and the internet as a whole also play a big role in the spread of medical conspiracy theories. In that sense, one important aspect of public health campaigns to address medical conspiracies is the regulation of the internet. It is important to remember that, apart from the natural disposition towards gossiping, the internet has amplified the effects of rumor. This facilitates the spread of lies, but as Lidsky (2008) explains, the deliberate spread of false information cannot be protected as free speech. Although the internet has been an immensely valuable resource, communication ethicists now understand that more regulation is needed (Weiser 2009), and this is an important aspect of addressing conspiracy mongering, particularly in the healthcare sector. Experts still debate whether the internet has made conspiracy theories more prevalent; for now, there is no definite consensus, and it may still be too early to tell (Wood 2013). Yet the internet is here to stay, and given that reality, public health officials must give more consideration to lobbying lawmakers and politicians to call for greater control of the information divulged in cyberspace.

Furthermore, patient empowerment is also a useful resource in the eradication of conspiracy theories, for the reasons already discussed. Medical ethics in the past did not place much emphasis on the principle of autonomy, and paternalism was the rule. Things have changed over the last few decades, but physicians need to further ensure that patients retain the power of decision through informed consent. On a concrete level, this implies that public health officials emphasize to health workers the utmost importance of not imposing decisions on patients, and of conveying all the relevant information to them. Yet we should keep in mind that, in the current discussion, ethical imperatives can go beyond informed consent, given the different forms of public engagement. For example, O’Neill (2003) argues that “since the point of consent procedures is to limit deception and coercion, they should be designed to give patients and others control over the amount of information they receive and opportunity to rescind consent already given.”

In this manner, patients will feel that they do have the power to decide over their own bodies, and thus, will not easily come to believe the conspiracy theories that are more common amongst persons who do not have the privilege to decide on their own.

Physicians and public health officials also need to take a more activist political role. This may seem counterintuitive, since doctors who engage in public discourse may easily be perceived to be in alliance with politicians, thus giving rise to all sorts of conspiracy theories. But in fact, by participating more actively in public discourse, physicians and public health officials can take steps to ensure that marginalized populations receive proper healthcare and become better integrated into society. By doing this, powerlessness can again be reduced, and thus one factor fueling medical conspiracy theories can be mitigated.

One concrete way of expanding the political participation of physicians is by encouraging the formation of guilds and local chapters of medical associations, in which health workers may gather to discuss not just technical issues, but also how health relates to society. Hospitals also need to encourage community life among their staff (sports tournaments, cultural events), so that staff can form a greater sense of commitment to social issues (and thus alienation is prevented), and consequently come up with more effective ways of approaching policymakers about how best to empower dispossessed communities in their access to healthcare. Furthermore, hospitals can arrange weekly seminar series open to the wider public, in which particular social and political problems related to conspiracy theory claims are discussed (e.g., the price of medications, government healthcare plans, race representation in particular diseases), and seize the opportunity to hear from attendees who may be sympathetic to medical conspiracy theories and engage them in dialogue.

Finally, recall also that conspiracy theories feed into the evolved “us-versus-them” mentality, along with scapegoating. If doctors are perceived as outsiders, then they are more likely to be the object of conspiracy speculations. Health workers need to ensure that they find common links with their patients. This implies respect for (although not necessarily agreement with) patients’ local cultures and even their ways of understanding disease and medicine (Flores 2000).

Some empirical data suggest that when patients and doctors have different ethnicities, compliance rates are lower (McQuaid and Landier 2018), although this is not an insurmountable obstacle. Patients may not fully trust doctors of a different ethnicity, and that may contribute to conspiracy theories about their procedures. One potential way of dealing with this problem is to ensure that all ethnicities are represented in the medical profession, through a program of affirmative action (Magnus and Mick 2000). Once again, public health officials can take lobbying action, so as to motivate lawmakers to take decisive steps in that direction.

However, affirmative action in medicine can also become very divisive (Sowell 2005), thus deepening the “us-versus-them” mentality that sustains conspiracy theories in the first place. Explicit racial and ethnic preferences can contribute to stereotypes in healthcare services, and ultimately these stereotypes nourish conspiracy mongering. One possible countermeasure is to develop strategies for the advancement of cosmopolitanism and of supraethnic, supraracial, and supranational identities, in order to bridge groups that are suspicious of each other. On a concrete level, medical associations can endorse civic messages calling for national unity, broadcast on TV, radio, and other media.

To achieve this purpose, health workers need to strike a balance between engaging with local cultures, so as not to appear as outsiders, and not becoming too parochial, so as to encourage the cosmopolitanism that guards against conspiracy mongering. Concretely, this balance can be struck by including more cultural diversity and sensitivity training in hospitals and medical schools, as part of professional development plans.

Conclusion

Recent developments in both the United States and Europe have given occasion to the rise of post-truth politics, i.e., massive misinformation for pure electoral gain. This in turn has given rise to a flourishing of conspiracy theories, which feeds a paranoid style not only in political activities, but in society as a whole.

Although suspicions regarding medical procedures have always existed, this sudden rise of conspiracy mongering has also had important implications for medical information. Unethical medical procedures have been carried out in the past, and on the basis of these, new conspiracy theories have arisen.

Although it offers no coherent, unified view to explain why people believe in conspiracy theories, the emerging field of cognitive science has offered some guidance in the attempt to understand how these ideas are transmitted and why they stick. Pattern recognition, powerlessness, and anxiety-induced illusions of control are some of the most important mechanisms underlying the prevalence of conspiracy theories.

This information can better sustain some of the policies that can be designed in order to counter the spread of medical conspiracy theories. Concrete measures such as the avoidance of scare tactics, improved communication skills, increased use of Twitter amongst doctors, the use of focus groups, greater respect for patients’ autonomy, lobbying for affirmative action, and cultural diversity and sensitivity training could theoretically be useful means of pushing back against the prevalence of medical conspiracy theories. All these measures ultimately have a connection with the understanding that cognitive science offers of conspiracy theories in general. Unfortunately, given the current political climate of Europe and the United States, medical conspiracy theories are likely to either stay or morph into new ones. Precisely for that reason, a deeper understanding of why people believe them is necessary (and for this, cognitive science offers a relevant approach), and further consideration of effective policies to counter them is also needed.


References

  1. Abalakina-Paap M, Stephan W, Craig T, Gregory WL. Beliefs in conspiracies. Political Psychology. 1999;20:637–647. [Google Scholar]
  2. Acocella Ivana. The focus groups in social research: advantages and disadvantages. Quality & Quantity. 2012;46(4):1125–1136. [Google Scholar]
  3. Andrade Gabriel, Hussain Azhar. Polio in Pakistan: Political, Sociological, and Epidemiological Factors. Cureus. 2018;10(10):e3502. doi: 10.7759/cureus.3502. [DOI] [PMC free article] [PubMed] [Google Scholar]
  4. Backer Thomas E, Rogers Everett, Sopory Pradeep. Designing health communication campaigns: What works? Newbury Park, CA: Sage; 1992. [Google Scholar]
  5. Bandura Albert. Social Learning Theory. New York: Prentice Hall; 1977. [Google Scholar]
  6. Banks Nancy. AIDS, Opium, Diamonds, and Empire: The Deadly Virus of International Greed. New York: I Universe; 2010. [Google Scholar]
  7. Bartlett J, Miller C. The power of unreason: Conspiracy theories, extremism and counter-terrorism. London, UK: Demos; 2010. [Google Scholar]
  8. Bearman P. Just-so stories: Vaccines, autism, and the single-bullet disorder. Social Psychology Quarterly. 2010;73(2):112–115. doi: 10.1177/0190272510371672. [DOI] [PMC free article] [PubMed] [Google Scholar]
  9. Bebbington P, McBride O, Steel C, Kuipers E, Radovanic M, Brugha T, Jenkins R, Meltzer H, Freeman D. The structure of paranoia in the general population. The British Journal of Psychiatry. 2013;202:419–427. doi: 10.1192/bjp.bp.112.119032. [DOI] [PubMed] [Google Scholar]
  10. Blackmore Susan. The Meme Machine. Oxford: Oxford University Press; 1999. [Google Scholar]
  11. Blaskiewicz R. The Big Pharma conspiracy theory. Medical Writing. 2013;22(4):259–261. [Google Scholar]
  12. Bode L, Vraga E. See Something, Say Something: Correction of Global Health Misinformation on Social Media. Health Communication. 2018;33(9):1131–1140. doi: 10.1080/10410236.2017.1331312. [DOI] [PubMed] [Google Scholar]
  13. Bogart LM, Bird ST. Exploring the relationship of conspiracy beliefs about HIV/AIDS to sexual behaviors and attitudes among African-American adults. Journal of the National Medical Association. 2003;95(11):1057. [PMC free article] [PubMed] [Google Scholar]
  14. Boyer Pascal. The Naturalness of Religious Ideas. Los Angeles: University of California Press; 1994. [Google Scholar]
  15. Brotherton Rob. Suspicious Minds: Why We Believe Conspiracy Theories. New York: Bloomsbury; 2013. [Google Scholar]
  16. Calomiris CW. Building an incentive-compatible safety net. Journal of Banking & Finance. 1999;23(10):1499–1519. [Google Scholar]
  17. Carstairs C, Elder R. Expertise, health, and popular opinion: Debating water fluoridation, 1945–80. Canadian Historical Review. 2008;89(3):345–371. [Google Scholar]
  18. Chagnon Napoleon. Yanomamo: The Fierce People. New York: Holt, Rinehart and Winston; 1983. [Google Scholar]
  19. Cichocka A, Marchlewska M, Golec de Zavala A. Does self-love or self-hate predict conspiracy beliefs? Narcissism, self-esteem, and the endorsement of conspiracy theories. Social Psychological and Personality Science. 2016;7:157–166. [Google Scholar]
  20. Cichocka A, Marchlewska M, Golec de Zavala A, Olechowski M. “They will not control us”: In-group positivity and belief in intergroup conspiracies. British Journal of Psychology. 2016;107:556–576. doi: 10.1111/bjop.12158. [DOI] [PubMed] [Google Scholar]
  21. Cikara Mina, Botvinick Matthew, Fiske Susan. Us versus Them: Social Identity Shapes Neural Responses to Intergroup Competition and Harm. Psychological Science. 2011;22:3. doi: 10.1177/0956797610397667. [DOI] [PMC free article] [PubMed] [Google Scholar]
  22. Clarke S. Conspiracy theories and the Internet: Controlled demolition and arrested development. Episteme. 2007;4(2):167–180. [Google Scholar]
  23. Darwin H, Neave N, Holmes J. Belief in conspiracy theories: The role of paranormal belief, paranoid ideation and schizotypy. Personality and Individual Differences. 2011;50:1289–1293. [Google Scholar]
  24. Davis J, Wetherell G, Henry PJ. Social devaluation of African Americans and race-related conspiracy theories. European Journal of Social Psychology. 2018 doi: 10.1002/ejsp.2531. [DOI] [Google Scholar]
  25. Denecke K, Krieck M, Otrusina L, Smrz P, Dolog P, Nejdl W, Velasco E. How to exploit twitter for public health monitoring? Methods of information in medicine. 2013;52(04):326–339. doi: 10.3414/ME12-02-0010. [DOI] [PubMed] [Google Scholar]
  26. Dentith Matthew. The Philosophy of Conspiracy Theories. New York: Palgrave; 2014. [Google Scholar]
  27. Dolan Raymond J, Vuilleumier Patrick. Amygdala automaticity in emotional processing. Annals of the New York Academy of Sciences. 2003;985(1):348–355. doi: 10.1111/j.1749-6632.2003.tb07093.x. [DOI] [PubMed] [Google Scholar]
  28. Dube E, Vivion M, MacDonald NE. Vaccine hesitancy, vaccine refusal and the anti-vaccine movement: Influence, impact and implications. Expert Review of Vaccines. 2015;14(1):99–117. doi: 10.1586/14760584.2015.964212. [DOI] [PubMed] [Google Scholar]
  29. Dunbar Robin. Grooming, Gossip and the Evolution of Language. Salem: Harvard University Press; 1996. [Google Scholar]
  30. Dyer Clare. Whooping cough vaccine on trial again. Medicolegal. 1987;295:1053–1054. doi: 10.1136/bmj.295.6605.1053. [DOI] [PMC free article] [PubMed] [Google Scholar]
  31. Ebel-Lam AP, Fabrigar LR, MacDonald TK, Jones S. Balancing causes and consequences: The magnitude-matching principle in explanations for complex social events. Basic & Applied Social Psychology. 2010;32:348–359. [Google Scholar]
  32. Ernst Edzard. Alternative Medicine. A Critical Assessment of 150 Modalities. New York: Springer; 2019. [Google Scholar]
  33. Evans-Pritchard EE. Witchcraft, Oracles and Magic Among the Azande. London: Clarendon Press; 1963. [Google Scholar]
  34. Evers S, Farewell VT, Halloran PF. Public awareness of organ donation. CMAJ Canadian Medical Association Journal. 1988;138(3):237. [PMC free article] [PubMed] [Google Scholar]
  35. Fenster Mark. Conspiracy Theories: Secrecy and Power in American Culture. Minneapolis: University of Minnesota Press; 1999. [Google Scholar]
  36. Festinger Leon. A Theory of Cognitive Dissonance. Stanford: Stanford University Press; 1957. [Google Scholar]
  37. Flores Glenn. Culture and the patient-physician relationship: Achieving cultural competency in health care. The Journal of Pediatrics. 2000;136:1. doi: 10.1016/s0022-3476(00)90043-x. [DOI] [PubMed] [Google Scholar]
  38. Foster GM. Limited good or limited goods: Observations on Acheson. American Anthropologist. 1974;76(1):53–57. [Google Scholar]
  39. Fourie Pieter, Meyer Melissa. The Politics of AIDS Denialism. New York: Routledge; 2010. [Google Scholar]
  40. Freimuth Vicki S, et al. African Americans’ views on research and the Tuskegee Syphilis Study. Social Science & Medicine. 2001;52(5):797–808. doi: 10.1016/s0277-9536(00)00178-7. [DOI] [PubMed] [Google Scholar]
  41. Garcia-Retamero R, Cokely ET. Communicating health risks with visual aids. Current Directions in Psychological Science. 2013;22(5):392–399. [Google Scholar]
  42. Girard Rene. The Scapegoat. Baltimore: Johns Hopkins University Press; 1986. [Google Scholar]
  43. Goertzel T. Belief in conspiracy theories. Political Psychology. 1994;15:733–744. [Google Scholar]
  44. Goldacre Ben. Bad Science. London: Harper Collins; 2008. [Google Scholar]
  45. Gottlieb SD. Vaccine resistances reconsidered: Vaccine skeptics and the Jenny McCarthy effect. Biosocieties. 2016;11(2):152–174. [Google Scholar]
  46. Gray K, Wegner D. Blaming God for our pain: human suffering and the divine mind. Personality and Social Psychology Review. 2010;14(1):7–16. doi: 10.1177/1088868309350299. [DOI] [PubMed] [Google Scholar]
  47. Green R, Douglas KM. Anxious attachment and belief in conspiracy theories. Personality and Individual Differences. 2018;125:30–37. [Google Scholar]
  48. Grimes David. On the Viability of Conspiratorial Beliefs. PLoS ONE. 2016 doi: 10.1371/journal.pone.0147905. [DOI] [PMC free article] [PubMed] [Google Scholar]
  49. Guthrie Steven. Faces in the Clouds: A New Theory of Religion. Oxford: Oxford University Press; 1995. [Google Scholar]
  50. Hagen Kurtis. Conspiracy theorists and monological belief systems. Argumenta. 2018;3:2. [Google Scholar]
  51. Hagen Kurtis. Conspiracy theories and the paranoid style: do conspiracy theories posit implausibly vast and evil conspiracies? Social Epistemology. 2018;32(1):24–24. [Google Scholar]
  52. Haselton Martie. Error management theory: A new perspective on biases in cross-sex mind reading. Journal of Personality and Social Psychology. 2000;78(1):81–91. doi: 10.1037//0022-3514.78.1.81. [DOI] [PubMed] [Google Scholar]
  53. Hawn C. Take two aspirin and tweet me in the morning: How Twitter, Facebook, and other social media are reshaping health care. Health affairs. 2009;28(2):361–368. doi: 10.1377/hlthaff.28.2.361. [DOI] [PubMed] [Google Scholar]
  54. Heider F, Simmel M. An experimental study of apparent behavior. American Journal of Psychology. 1944;57:243–259. [Google Scholar]
  55. Heldman AB, Schindelar J, Weaver JB. Social media engagement and public health communication: Implications for public health organizations being truly “social”. Public Health Reviews. 2013;35(1):13. [Google Scholar]
  56. Hellinger Daniel. Conspiracy and Conspiracy Theories in the Age of Trump. New York: Palgrave; 2019. [Google Scholar]
  57. Hill David, Chapman Simon, Donovan Robert. The return of scare tactics. Tobacco Control. 1998;7(1):5–8. doi: 10.1136/tc.7.1.5. [DOI] [PMC free article] [PubMed] [Google Scholar]
  58. Hoffman Steven, Mansoor Yasmeen, Natt Navneet, Sritharan Lathika, Belluz Julia, Caulfield Timothy, Freedhoff Yoni, Lavis John, Sharma Arya. Celebrities’ impact on health-related knowledge, attitudes, behaviors, and status outcomes: Protocol for a systematic review, meta-analysis, and meta-regression analysis. Systematic Reviews. 2017;6:13. doi: 10.1186/s13643-016-0395-1. [DOI] [PMC free article] [PubMed] [Google Scholar]
  59. Hofstadter R. The Paranoid Style in American Politics. New York: Vintage; 2012. [Google Scholar]
  60. Icke David. The David Icke Guide to the Global Conspiracy (and How to End It) London: David Icke Books; 2007. [Google Scholar]
  61. Imhoff R, Bruder M. Speaking (un-)truth to power: Conspiracy mentality as a generalized political attitude. European Journal of Personality. 2014;28:25–43. [Google Scholar]
  62. Johnson M. Moral Imagination: Implications of Cognitive Science for Ethics. Chicago: University of Chicago Press; 1994. [Google Scholar]
  63. Jolley D, Douglas K. The effects of anti-vaccine conspiracy theories on vaccination intentions. PLoS ONE. 2014;9:e89177. doi: 10.1371/journal.pone.0089177. [DOI] [PMC free article] [PubMed] [Google Scholar]
  64. Kelemen Deborah. The scope of teleological thinking in schoolchildren. Cognition. 1999;70:241–272. doi: 10.1016/s0010-0277(99)00010-4. [DOI] [PubMed] [Google Scholar]
  65. Kitzinger Jenny. Qualitative research: Introducing focus groups. BMJ. 1995;311(7000):299–302. doi: 10.1136/bmj.311.7000.299. [DOI] [PMC free article] [PubMed] [Google Scholar]
  66. Klayman J, Ha Y-W. Confirmation, disconfirmation and information in hypothesis testing. Psychological Review. 1987;94:211–228. [Google Scholar]
  67. Knight Peter. Conspiracy Culture: From the Kennedy Assassination to the X-Files. New York: Routledge; 2013. [Google Scholar]
  68. Korobkin R. Determining health care rights from behind a veil of ignorance. U. Ill. L. Rev. 1998;1998(3):801–836. doi: 10.2139/ssrn.85189. [DOI] [PubMed] [Google Scholar]
  69. Lasser KE, Himmelstein DU, Woolhandler S. Access to care, health status, and health disparities in the United States and Canada: Results of a cross-national population-based survey. American Journal of Public Health. 2006;96(7):1300–1307. doi: 10.2105/AJPH.2004.059402. [DOI] [PMC free article] [PubMed] [Google Scholar]
  70. Leshner G, Cheng IH. The effects of frame, appeal, and outcome extremity of antismoking messages on cognitive processing. Health Communication. 2009;24(3):219–227. doi: 10.1080/10410230902804117. [DOI] [PubMed] [Google Scholar]
  71. Lidsky LB. Where's the Harm: Free Speech and the Regulation of Lies. Wash. & Lee L. Rev. 2008;65:1091. [Google Scholar]
  72. Magnus S, Mick S. Medical schools, affirmative action, and the neglected role of social class. American Journal of Public Health. 2000;90(8):1197–1201. doi: 10.2105/ajph.90.8.1197. [DOI] [PMC free article] [PubMed] [Google Scholar]
  73. Malinowski Bronislaw. Magic, Science and Religion. New York: Waveland; 1992. [Google Scholar]
  74. McAndrew FT, Milenkovic MA. Of tabloids and family secrets: The evolutionary psychology of gossip 1. Journal of Applied Social Psychology. 2002;32(5):1064–1082. [Google Scholar]
  75. McCauley C, Jacques S. The popularity of conspiracy theories of presidential assassination: A Bayesian analysis. Journal of Personality and Social Psychology. 1979;37:637–644. [Google Scholar]
  76. McDonald Melissa, Navarrete Carlos, Van Vugt Mark. Evolution and the psychology of intergroup conflict: The male warrior hypothesis. Philosophical Transactions of the Royal Society B. 2012;367(1589):670–679. doi: 10.1098/rstb.2011.0301. [DOI] [PMC free article] [PubMed] [Google Scholar]
  77. McGuire W. Public communication as a strategy for inducing health-promoting behaviorial change. Preventive Medicine. 1984;14:3. doi: 10.1016/0091-7435(84)90086-0. [DOI] [PubMed] [Google Scholar]
  78. McNeill Paul. The Ethics and Politics of Human Experimentation. Cambridge: Cambridge University Press; 1993. [Google Scholar]
  79. McQuaid Elizabeth, Landier Wendy. Cultural Issues in Medication Adherence: Disparities and Directions. Journal of General Internal Medicine. 2018;33(2):200–206. doi: 10.1007/s11606-017-4199-3. [DOI] [PMC free article] [PubMed] [Google Scholar]
  80. Moreno JD. Undue Risk: Secret State Experiments on Humans. New York: Routledge; 2013. [Google Scholar]
  81. Mueller M. Universal service policies as wealth redistribution. Government Information Quarterly. 1999;16(4):353–358. [Google Scholar]
  82. Nelson Jessica C, Adams Glenn, Branscombe Nyla R, Schmitt Michael. The role of historical knowledge in perception of race-based conspiracies. Race and Social Problems. 2010;2(2):69–80. [Google Scholar]
  83. Neuberg Steven, Kenrick Douglas, Schaller Mark. Human Threat Management Systems: Self-Protection and Disease Avoidance. Neuroscience Biobehavioral Review. 2010;35(4):1042–1051. doi: 10.1016/j.neubiorev.2010.08.011. [DOI] [PMC free article] [PubMed] [Google Scholar]
  84. Norenzayan Ara, Atran Scott, Faulkner Jason, Schaller Mark. Memory and Mystery: The Cultural Selection of Minimally Counterintuitive Narratives. Cognitive Science. 2006;30:531–553. doi: 10.1207/s15516709cog0000_68. [DOI] [PubMed] [Google Scholar]
  85. Nyhan Brendan, Reifler Jason. When Corrections Fail: The Persistence of Political Misperceptions. Political Behavior. 2010;32:2. [Google Scholar]
  86. Nyhan Brendan, et al. Effective messages in vaccine promotion: A randomized trial. Pediatrics. 2014;133(4):e835–e842. doi: 10.1542/peds.2013-2365. [DOI] [PubMed] [Google Scholar]
  87. Öhman A, Flykt A, Esteves F. Emotion drives attention: Detecting the snake in the grass. Journal of Experimental Psychology. 2001;130:466–478. doi: 10.1037//0096-3445.130.3.466. [DOI] [PubMed] [Google Scholar]
  88. Oliver Eric, Woods Thomas. Medical Conspiracies and Health Behaviors in the United States. Jama Internal Medicine. 2014;174(5):817–818. doi: 10.1001/jamainternmed.2014.190. [DOI] [PubMed] [Google Scholar]
  89. O'Neill O. Some limits of informed consent. Journal of Medical Ethics. 2003;29(1):4–7. doi: 10.1136/jme.29.1.4. [DOI] [PMC free article] [PubMed] [Google Scholar]
  90. Ottersen OP, Helm PJ. How hardwired is the brain? Nature. 2002;420(6917):751–752. doi: 10.1038/420751a. [DOI] [PubMed] [Google Scholar]
  91. Parker-Pope, T. 2009. Bill Maher vs. the Flu Vaccine. Well Blog, New York Times.
  92. Pasek J, Sood G, Krosnick JA. Misinformed about the affordable care act? Leveraging certainty to assess the prevalence of misperceptions. Journal of Communication. 2015;65(4):660–673. [Google Scholar]
  93. Piaget Jean, Inhelder Barbara. The Psychology of the Child. New York: Basic; 2008. [Google Scholar]
  94. Quadagno J. Right-wing conspiracy? Socialist plot? The origins of the patient protection and affordable care act. Journal of Health Politics, Policy and Law. 2014;39(1):35–56. doi: 10.1215/03616878-2395172. [DOI] [PubMed] [Google Scholar]
  95. Riem Madelon ME, et al. Attachment in the brain: Adult attachment representations predict amygdala and behavioral responses to infant crying. Attachment & Human Development. 2012;14(6):533–551. doi: 10.1080/14616734.2012.727252. [DOI] [PubMed] [Google Scholar]
  96. Roisman Joseph. The Rhetoric of Conspiracy in Ancient Athens. Los Angeles: University of California Press; 2006. [Google Scholar]
  97. Rosnow RL. Inside rumor: A personal journey. American Psychologist. 1991;46:484–496. [Google Scholar]
  98. Rosset Evelyn. It's no accident: Our bias for intentional explanations. Cognition. 2008;108(3):771–780. doi: 10.1016/j.cognition.2008.07.001. [DOI] [PubMed] [Google Scholar]
  99. Rozenblit Leonid, Keil Frank. The misunderstood limits of folk science: An illusion of explanatory depth. Cognitive Science. 2002;26(5):521–562. doi: 10.1207/s15516709cog2605_1. [DOI] [PMC free article] [PubMed] [Google Scholar]
  100. Rozin Paul, Fallon April. A perspective on disgust. Psychological Review. 1987;94(1):23–41. [PubMed] [Google Scholar]
  101. Saharia, N. (2015). Detecting emotion from short messages on Nepal earthquake. In 2015 International Conference on Speech Technology and Human-Computer Dialogue (SpeD) (pp. 1–5). IEEE.
  102. Sharfstein JM. Vaccines and the trump administration. JAMA. 2017;317(13):1305–1306. doi: 10.1001/jama.2017.2311. [DOI] [PubMed] [Google Scholar]
  103. Silva BC, Vegetti F, Littvay L. The elite is up to something: Exploring the relationship between populism and belief in conspiracy theories. Swiss Political Science Review. 2017;23:423–443. [Google Scholar]
  104. Slone Jason. Theological Incorrectness: Why Religious People Believe What They Shouldn't. Oxford: Oxford University Press; 2004. [Google Scholar]
  105. Sowell Thomas. Affirmative Action Around the World. New Haven: Yale University Press; 2005. [Google Scholar]
  106. Sperber Dan. Introduction. In: Sperber Dan., editor. Metarepresentations: A Multidisciplinary Perspective. Oxford: Oxford University Press; 2000. [Google Scholar]
  107. Swami V, Voracek M, Stieger S, Tran US, Furnham A. Analytic thinking reduces belief in conspiracy theories. Cognition. 2014;133:572–585. doi: 10.1016/j.cognition.2014.08.006. [DOI] [PubMed] [Google Scholar]
  108. Swami V, Coles R. The truth is out there: Belief in conspiracy theories. The Psychologist. 2010;23(7):560–563. [Google Scholar]
  109. Thorburn S, Bogart L. Conspiracy beliefs about birth control: Barriers to pregnancy prevention among African Americans of reproductive age. Health, Education & Behavior. 2005;32(4):474–487. doi: 10.1177/1090198105276220. [DOI] [PubMed] [Google Scholar]
  110. Tooby J, Cosmides L. Conceptual foundations of evolutionary psychology. In: Buss D, editor. Handbook of Evolutionary Psychology. London: Wiley; 2015. [Google Scholar]
  111. Uscinski Joseph, Parent Joseph. American Conspiracy Theories. Oxford: Oxford University Press; 2014. [Google Scholar]
  112. Van Prooijen J-W, Van Lange PAM. The social dimension of belief in conspiracy theories. In: van Prooijen J-W, van Lange PAM, editors. Power, politics, and paranoia: Why people are suspicious of their leaders. Cambridge: Cambridge University Press; 2014. [Google Scholar]
  113. Van Prooijen J-W, Douglas K, De Inocencio C. Connecting the dots: Illusory pattern perception predicts beliefs in conspiracies and the supernatural. European Journal of Social Psychology. 2018;48:320–335. doi: 10.1002/ejsp.2331. [DOI] [PMC free article] [PubMed] [Google Scholar]
  114. Van Prooijen J-W, Krouwel APM, Pollet T. Political extremism predicts belief in conspiracy theories. Social Psychological and Personality Science. 2015;6:570–578. doi: 10.1177/1948550614567356. [DOI] [Google Scholar]
  115. Van Prooijen J. Why Education Predicts Decreased Belief in Conspiracy Theories. Applied Cognitive Psychology. 2016 doi: 10.1002/acp.3301. [DOI] [PMC free article] [PubMed] [Google Scholar]
  116. Von Rueden C, van Vugt M. Leadership in small-scale societies: Some implications for theory, research, and practice. The Leadership Quarterly. 2015;26:978–990. [Google Scholar]
  117. Webb Gary. Dark Alliance: The CIA, the Contras, and the Cocaine Explosion. New York: Seven Stories Press; 2019. [Google Scholar]
  118. Weiser PJ. The future of Internet regulation. UC Davis L. Rev. 2009;43:529. [Google Scholar]
  119. Weitlaner, D., Guettinger, A., & Kohlbacher, M. (2013). Intuitive comprehensibility of process models. In International Conference on Subject-Oriented Business Process Management (pp. 52–71). Berlin, Heidelberg: Springer.
  120. West HG, Sanders T. Transparency and conspiracy: Ethnographies of suspicion in the New World Order. Durham, NC: Duke University Press; 2003. [Google Scholar]
  121. Wood M. Has the internet been good for conspiracy theorising. PsyPAG Quarterly. 2013;88:31–34. [Google Scholar]
  122. Wood MJ. Propagating and debunking conspiracy theories on Twitter during the 2015–2016 Zika virus outbreak. Cyberpsychology, Behavior, and Social Networking. 2018;21(8):485–490. doi: 10.1089/cyber.2017.0669. [DOI] [PMC free article] [PubMed] [Google Scholar]
  123. Wood Michael, Douglas Karen, Sutton Robbie. Dead and Alive: Contradictory Conspiracy Theories. Social Psychology and Personality Science. 2011;00:1–7. [Google Scholar]
  124. Wood Thomas, Porter Ethan. The Elusive Backfire Effect: Mass Attitudes' Steadfast Factual Adherence. Political Behavior. 2019;41:1. [Google Scholar]
  125. Wrangham RW. Evolution of coalitionary killing. Yearbook of Physical Anthropology. 1999;42:1–30. doi: 10.1002/(sici)1096-8644(1999)110:29+<1::aid-ajpa2>3.3.co;2-5. [DOI] [PubMed] [Google Scholar]
