Abstract
Public health is usually enacted through public policies, necessitating that the public engage in debates that, ideally, are grounded in solid scientific findings.
Mistrust in science, however, has compromised the possibility of deriving sound policy from such debates, partially owing to justified concerns regarding undue interference and even outright manipulation by commercial interests. This situation has generated problematic impasses, one of which is the emergence of an anti-vaccination movement that is already affecting public health, with a resurgence in the United States of preventable diseases thought to have been eradicated.
Drawing on British sociologist Harry Collins’ work on expertise, we propose a theoretical framework in which the paralyzing, undue public distrust of science can be analyzed and, it is hoped, overcome.
“As the nation’s leading public health organization, APHA strengthens the impact of public health professionals and provides a science-based voice in policy debates too often driven by emotion, ideology or financial interests. APHA is at the forefront of efforts to advance prevention, reduce health disparities and promote wellness.”
—American Public Health Association1
“Science, if it can deliver truth, cannot deliver it at the speed of politics.”
—Collins and Evans2(p1)
By definition, public health interventions are enacted on human collectives, and as Geoffrey Rose (among many others) pointed out decades ago, preventive strategies are the most effective way of implementing such interventions.3 This situation can, and often does, produce ethical conundrums in which individual and collective rights clash.4 The key to democratic public policy is the availability of scientific evidence, its effective communication, and the competence of those involved in policy formulation to evaluate the evidence presented, as made clear by the first epigraph of this article, taken from the section of the American Public Health Association (APHA) website in which the association’s mission is defined. The same passage also points to forces that operate against the necessary rational debate.
This passage, however, places a great deal of trust, arguably too much, in science itself and in the credibility and strength of science-based arguments in public debates. It is not difficult to find examples of scientific blunders that have compromised public health, for instance the approval for commercialization of medicines that proved to pose unjustifiable health risks. This was the case of thalidomide in the past, which turned out to be associated with birth defects, and of rofecoxib more recently, which was found to be associated with cardiovascular events, some of them lethal. It is also not difficult to find situations in which science itself was distorted, often as a result of commercial interests, with negative implications for public health.5 And, by the same token, it is not difficult to find examples of public mistrust in science even where such mistrust is entirely unjustified.6,7
At the same time, contemporary life in most of the world depends on complex technologies that are scarcely, if at all, understood even in superficial terms by most people. These technologies, labeled by British sociologist Anthony Giddens as “expert systems,” are nevertheless trusted, in Giddens’ view, through a kind of pragmatic faith.8(p27)
How the general population comes to trust such technologies is partially a matter of their effectiveness, but it also involves the way in which the public discourse about them is constructed. Considering the diversity of scientific and technological domains involved in understanding how any given expert system functions, public opinion at some point has to rely on the word of experts. This is particularly true for most public health issues. Most health risks, for instance, are not self-evident, and identifying them requires somewhat sophisticated epidemiological machinery that is fully understood by relatively few, even among health professionals. This means that the question of what to trust usually becomes a question of whom to trust.
This state of affairs brings to the spotlight the role of the expert and how problematic this role can be in a democratic society.9 Harry Collins, a sociologist of science who has consistently studied expertise, points to 2 opposite risks in how society at large interacts with experts: at one extreme, there is what he calls technological fascism, a technocratic view that grants to experts a monopoly of opinion on their specific subjects, effectively excluding proper political negotiations from decisions; at the other extreme (and equally unwanted) is technological populism, which effectively denies any role scientific experts might have in a public policy debate.2
Still, according to Collins, a general mistrust of science and scientists has produced a paralyzing form of skepticism that empowers technological populism.10,11 This mistrust has had disastrous results, as exemplified by the misguided influence of anti-vaccine activism, which has led to a resurgence in the United States and Europe of infectious diseases that had practically been eradicated (discussed in detail subsequently).10 Belief in bizarre conspiracy theories is not uncommon in the United States5 and elsewhere, constituting yet another obstacle to overcome in communicating scientific findings about health issues.
These themes come together in the issue of the potential association between childhood vaccinations and autism. Objections to vaccination have a long history based on skepticism of the underlying science. Questions have been raised about efficacy (whether vaccines protect against disease) and safety (whether they can harm patients).12 Parents have objected to mandatory vaccination requirements for school entry, considering them unwarranted government intrusions on personal freedom.13 Much of this disinformation spreads virally through the Internet,14 reaching ever larger audiences and seriously undermining public confidence in vaccines in general.
The most recent impetus for anti-vaccine positions is concern that the number of vaccinations in the current mandatory childhood immunization schedule has a harmful effect on children and that the measles–mumps–rubella (MMR) vaccine is a cause of autism. The key publication fueling these concerns was the 1998 report by Wakefield et al. published in the British journal The Lancet. The authors reviewed medical records, supplemented by parent interviews, for 12 patients diagnosed with autism. Eight of the 12 participating parents associated the onset of their child’s autism with the child having received the MMR vaccine.15
There were, however, fatal methodological flaws in Wakefield and colleagues’ article. A sample of 12 children is far too small to support generalization to all children and lacks the statistical power to detect anything but an enormous effect. Autism is a spectrum disorder with a wide range of symptoms and severity, and the small sample was not representative of the population of children with autism. The investigators did not have prospective developmental data, so the age at which symptoms were identified was probably used to represent the age of onset. Although severe symptoms of autism may be manifested in infancy,16 they are more typically first noted at the age of 18 to 24 months,17 roughly coincident with the recommended age for MMR immunization. This temporal coincidence may have been mistaken for a causal connection.
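To see why the temporal coincidence alone has little evidential value, consider a minimal simulation sketch (ours, not the authors’; every parameter below is an assumption chosen purely for illustration): even with no causal link whatsoever, a large share of children would have their first symptoms noted shortly after MMR vaccination, simply because the 2 events cluster at similar ages.

```python
# A minimal, purely illustrative simulation (not from the study under discussion).
# Assumed parameters, chosen only for illustration:
#   - MMR given uniformly between 12 and 15 months of age;
#   - first autism symptoms noted at an age drawn from a normal distribution
#     centered at 21 months (SD 4 months);
#   - a parent attributes autism to the vaccine if symptoms appear within
#     6 months after vaccination.
import random

random.seed(42)

def fraction_onset_shortly_after_mmr(n_children=100_000, window_months=6.0):
    """Fraction of simulated children whose first symptoms are noted within
    `window_months` after MMR, assuming no causal link at all."""
    hits = 0
    for _ in range(n_children):
        mmr_age = random.uniform(12.0, 15.0)   # assumed vaccination age (months)
        onset_age = random.gauss(21.0, 4.0)    # assumed symptom-recognition age (months)
        if mmr_age <= onset_age <= mmr_age + window_months:
            hits += 1
    return hits / n_children

print(f"Onset within 6 months after MMR by chance alone: "
      f"{fraction_onset_shortly_after_mmr():.0%}")
```

Under these assumed distributions, roughly one third of children fall within the attribution window by chance alone, which is why parental recollections that symptoms began after the vaccine cannot, on their own, distinguish coincidence from causation.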
The Wakefield et al. article was criticized by, among others, investigative reporter Brian Deer, who found that some of the parents were parties in a lawsuit against a vaccine manufacturer and that Wakefield had a financial connection to that lawsuit. It was also found that Wakefield had engaged in several ethical violations, such as subjecting his participants to tests without proper consent. The possibility of conflict of interest was raised.18,19 The journal’s editors retracted the article, concluding that it should not have been published. As a consequence of the multiple ethical problems, the medical authorities in the United Kingdom opened an investigation that, in the end, resulted in Wakefield losing his license to practice medicine there.
It has been hypothesized that thimerosal, the mercury-based preservative added to increase vaccine shelf life, is the mechanism through which vaccines might cause autism. This hypothesis has been tested and rejected in multiple studies, including one in Canada that compared the incidence of autism before and after the removal of thimerosal from vaccines in a population-based sample of 27 749 participants.20 The Institute of Medicine, among others, has published literature reviews reaching the same conclusion: there is no evidence to support the vaccine–autism hypothesis.21,22
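The logic of such a before-and-after comparison can be sketched as follows (a hedged illustration with hypothetical counts, not the data from reference 20): if thimerosal exposure caused autism, the diagnosis rate among children born after its removal should drop well below the rate among those born before, that is, the rate ratio should fall clearly below 1. The cited studies found no such decline.

```python
# A hedged sketch with hypothetical counts (not the data from reference 20),
# illustrating the logic of comparing autism rates in cohorts born before and
# after thimerosal was removed from vaccines.
import math

def rate_ratio_with_ci(cases_after, n_after, cases_before, n_before, z=1.96):
    """Rate ratio (after removal vs. before) with an approximate 95% CI,
    treating case counts as Poisson and denominators as fixed."""
    rr = (cases_after / n_after) / (cases_before / n_before)
    se_log_rr = math.sqrt(1 / cases_after + 1 / cases_before)
    lower = math.exp(math.log(rr) - z * se_log_rr)
    upper = math.exp(math.log(rr) + z * se_log_rr)
    return rr, lower, upper

# Hypothetical counts: if thimerosal drove autism risk, the rate ratio should
# fall clearly below 1 after removal; a CI straddling 1 is consistent with no effect.
rr, lo, hi = rate_ratio_with_ci(cases_after=82, n_after=14_000,
                                cases_before=80, n_before=13_749)
print(f"Rate ratio (after/before): {rr:.2f} (95% CI {lo:.2f}-{hi:.2f})")
```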
The anti-vaccine movement continued unabated, gaining official credibility in 2000 when Republican congressman Dan Burton of Indiana held hearings in the House of Representatives based on his conclusion that the MMR vaccine causes autism (Burton’s grandson had recently been diagnosed with autism).13 More than a decade later, Jenny McCarthy, the leading celebrity voice for the notion that vaccines cause autism, wrote a Huffington Post column titled “In the Vaccine–Autism Debate, What Can Parents Believe?”23 McCarthy correctly pointed out that, according to Wakefield et al. themselves, their article had not proved an association between autism and vaccines and that further investigation was needed. She went on, however, to note that 8 of the 12 parents associated the onset of behavioral symptoms of autism with the MMR vaccine, adding that the vaccine had caused autism in her son as well. The implicit conclusion is that parents can believe other parents, scientific evidence to the contrary notwithstanding. The public health impact of the anti-vaccine point of view has been declining rates of children who are up to date on their immunizations and outbreaks of vaccine-preventable diseases such as measles.24,25
The anti-vaccine position is one of certainty, in stark contrast to the way in which scientific evidence is often presented. The committee responsible for the Institute of Medicine literature reviews concluded that “the body of epidemiological evidence favors rejection of a causal relationship between thimerosal-containing vaccines and autism” but recommended “surveillance and epidemiological research, clinical studies, and communication related to these vaccine safety concerns.”21(p151) This position can be understood as an avoidance of certainty: although there is no current evidence to support the hypothesis, further study is needed. The anti-vaccine position became a political movement, fueled in equal parts by discredited hypotheses and paranoid ideology, to such an extent that it has created “true believers” impervious to revising their ideas.7,26,27
Returning to Collins’ paradigm: in the matter of the cause of autism, a diagnosis that can be devastating for parents and is often delivered with certainty but with no corresponding authority regarding etiology, effective treatment, or prognosis, technological populism provided certainty in the absence of technological fascism. Anti-vaccination campaigners not only lack the necessary expertise to evaluate the information presented to them; they also lack the meta-expertise needed to adequately acknowledge the expertise of others (or the lack thereof), a point made by Collins himself.11
The inability of anti-vaccine enthusiasts to correctly gauge their own skills can also be seen as a demonstration of the Dunning–Kruger effect, the tendency of unskilled people to overestimate their own competence in a given area.28,29 When we look at other public health achievements, such as the continuing reduction in population-wide tobacco smoking in the United States, the messages about the association of tobacco with diseases such as cancer have been clear and unambiguous, despite the tobacco industry’s continued efforts to sow doubt about such associations.30 The association between soda consumption and obesity and diabetes is less clear, and efforts to regulate soda consumption very publicly failed in New York City in 2013.
To overcome situations such as those just described, it is necessary to consider an apparent conundrum: if experts cannot be blindly trusted, on one hand, but public opinion can be tremendously misguided, on the other, how do we conduct a proper debate on scientific issues of importance to public policy? Once again, Collins’ ideas provide an interesting approach. He has described a number of different ways in which a person can be considered an expert, among which we focus on 2: contributory expertise and interactional expertise.2,11 The first is solely the province of the expert, the kind of expertise necessary to perform an activity with competence; the second refers to people who have mastered the language of the experts and, although lacking practical competence, are capable of engaging in meaningful conversation with them. Collins has pointed out that, to become even an interactional expert, one must devote time to and be practically involved with the problem at hand. Merely reading the scientific literature (and, even less so, collecting information on the Internet), which would in any case require prior competence to sort the wheat from the chaff, is not enough.11
Thus, one key component in solving the expertise dilemma is having enough interactional experts in the community of interest (or the general public) who can conduct sensible debates with experts and help shape public policies that adequately consider the necessary scientific knowledge without being dictated by its spokespersons (i.e., scientists).11 As Turner has pointed out, there is no inherent contradiction between expertise and democracy as long as a critical (rather than cynical) stance is sustained; in his words, “to grant a role to expert knowledge does not require us to accept the immaculate conception of expertise,”9(p146) an idea that converges once again with Collins’ approach. If such an ideal seems unattainable, it should be pointed out that this is precisely what happened in the HIV/AIDS arena, in which knowledgeable activists became capable interactional experts and thus have been a fundamental part of the response to the epidemic, helping shape sensible and effective policies in many countries.11,31
If, however, the solution is apparently so simple, why is the current state of affairs so distant from this model, exceptions notwithstanding? First, as noted, there are unfortunately good, rational reasons to mistrust health information presented to the public. One need not resort to conspiracy theories to recognize that there are documented instances in which medical knowledge was deliberately manipulated for commercial gain,32 and medical journals are not exempt from responsibility in those cases.33
Second, the media in general, for reasons that are not amenable to discussion in this space, have neglected their role in reporting facts, reducing everything to a “he said/she said” model in which 2 opposing views must always be presented as if on equal footing, even when one side clearly has the entire scientific community behind it and the other is driven by misinformation, deliberate or not.34 With uncomfortable frequency, public relations strategies are mobilized to push deliberately distorted information to the media, which pass it on, uncritically, to the general public.5,35
Finally, we academics in public health must accept part of the blame. For a number of reasons, chief among them the “publish or perish” mentality, we have concentrated our efforts on communicating with one another, delegating the role of reaching out to the general public to other actors such as the media (whose shortcomings were just discussed), although we certainly need help from communication experts as well. The “brave new world” of the Internet, in particular, calls for the voice of public health experts to be heard, lest we allow misinformed activists or commercial interests to dictate the debate, to the detriment of the public’s health.
Acknowledgments
Kenneth Camargo Jr holds research grants from Conselho Nacional de Desenvolvimento Científico e Tecnológico, Fundação Carlos Chagas de Amparo à Pesquisa, and Universidade do Estado do Rio de Janeiro.
References
- 1. American Public Health Association. Our mission. Available at: http://www.apha.org/about-apha/our-vision/our-mission. Accessed October 12, 2014.
- 2. Collins H, Evans R. Rethinking Expertise. Chicago, IL: University of Chicago Press; 2008.
- 3. Rose G. The Strategy of Preventive Medicine. Oxford, England: Oxford University Press; 1992.
- 4. Nuffield Council on Bioethics. Public Health: Ethical Issues. London, England: Cambridge Publishers Ltd; 2007.
- 5. McGarity TO, Wagner WE. Bending Science: How Special Interests Corrupt Public Health Research. Cambridge, MA: Harvard University Press; 2008.
- 6. Oliver JE, Wood T. Medical conspiracy theories and health behaviors in the United States. JAMA Intern Med. 2014;174(5):817–818. doi: 10.1001/jamainternmed.2014.190.
- 7. Goertzel T. Conspiracy theories in science. EMBO Rep. 2010;11(7):493–499. doi: 10.1038/embor.2010.84.
- 8. Giddens A. The Consequences of Modernity. Stanford, CA: Stanford University Press; 1990.
- 9. Turner S. What is the problem with experts? Soc Stud Sci. 2001;31(1):123–149.
- 10. Collins H. We cannot live by scepticism alone. Nature. 2009;458(7234):30. doi: 10.1038/458030a.
- 11. Collins H. Are We All Scientific Experts Now? Cambridge, England: Polity Press; 2014.
- 12. Poland GA, Jacobson RM. The age-old struggle against the antivaccinationists. N Engl J Med. 2011;364(2):97–99. doi: 10.1056/NEJMp1010594.
- 13. Hodge JG Jr, Gostin LO. School vaccination requirements: historical, social, and legal perspectives. Available at: http://www.publichealthlaw.net/Research/PDF/vaccine.pdf. Accessed October 12, 2014.
- 14. Zimmerman RK, Wolfe RM, Fox DE, et al. Vaccine criticism on the World Wide Web. J Med Internet Res. 2005;7(2):e17. doi: 10.2196/jmir.7.2.e17.
- 15. Wakefield AJ, Murch SH, Anthony A, et al. Ileal-lymphoid-nodular hyperplasia, non-specific colitis, and pervasive developmental disorder in children. Lancet. 1998;351(9103):637–641. doi: 10.1016/s0140-6736(97)11096-0. [Retracted in: Lancet. 2010;375(9713):445.]
- 16. Samango-Sprouse CA, Stapleton EJ, Aliabadi F, et al. Identification of infants at risk for autism spectrum disorder and developmental language delay prior to 12 months. Autism. 2014; Epub ahead of print. doi: 10.1177/1362361314521329.
- 17. American Academy of Pediatrics, Council on Children With Disabilities. Identifying infants and young children with developmental disorders in the medical home: an algorithm for developmental surveillance and screening. Pediatrics. 2006;118(1):405–420. doi: 10.1542/peds.2006-1231.
- 18. Deer B. Revealed: MMR research scandal. Available at: http://briandeer.com/mmr/lancet-deer-1.htm. Accessed October 12, 2014.
- 19. Andrew Jeremy Wakefield: Determination of Serious Professional Misconduct (SPM) and Sanction. London, England: General Medical Council; 2010.
- 20. Fombonne E, Zakarian R, Bennett A, Meng L, McLean-Heywood D. Pervasive developmental disorders in Montreal, Quebec, Canada: prevalence and links with immunizations. Pediatrics. 2006;118(1):e139–e150. doi: 10.1542/peds.2005-2993.
- 21. Institute of Medicine, Immunization Safety Review Committee. Immunization Safety Review: Vaccines and Autism. Washington, DC: National Academies Press; 2004.
- 22. Nelson KB, Bauman ML. Thimerosal and autism? Pediatrics. 2003;111(3):674–679. doi: 10.1542/peds.111.3.674.
- 23. McCarthy J. In the vaccine-autism debate, what can parents believe? Available at: http://www.huffingtonpost.com/jenny-mccarthy/vaccine-autism-debate_b_806857.html. Accessed October 12, 2014.
- 24. Omer SB, Richards JL, Ward M, Bednarczyk RA. Vaccination policies and rates of exemption from immunization, 2005–2011. N Engl J Med. 2012;367(12):1170–1171. doi: 10.1056/NEJMc1209037.
- 25. Centers for Disease Control and Prevention. Measles cases and outbreaks. Available at: http://www.cdc.gov/measles/cases-outbreaks.html. Accessed October 12, 2014.
- 26. Nyhan B, Reifler J, Richey S, Freed GL. Effective messages in vaccine promotion: a randomized trial. Pediatrics. 2014;133(4):e835–e842. doi: 10.1542/peds.2013-2365.
- 27. Kata A. A postmodern Pandora’s box: anti-vaccination misinformation on the Internet. Vaccine. 2010;28(7):1709–1716. doi: 10.1016/j.vaccine.2009.12.022.
- 28. Kruger J, Dunning D. Unskilled and unaware of it: how difficulties in recognizing one’s own incompetence lead to inflated self-assessments. J Pers Soc Psychol. 1999;77(6):1121–1134. doi: 10.1037//0022-3514.77.6.1121.
- 29. Ehrlinger J, Johnson K, Banner M, Dunning D, Kruger J. Why the unskilled are unaware: further explorations of (absent) self-insight among the incompetent. Organ Behav Hum Decis Process. 2008;105(1):98–121. doi: 10.1016/j.obhdp.2007.05.002.
- 30. Camargo KR Jr. How to identify science being bent: the tobacco industry’s fight to deny second-hand smoking health hazards as an example. Soc Sci Med. 2012;75(7):1230–1235. doi: 10.1016/j.socscimed.2012.03.057.
- 31. Collins H, Pinch T. Dr. Golem: How to Think About Medicine. Chicago, IL: University of Chicago Press; 2008.
- 32. Goldacre B. Bad Pharma: How Drug Companies Mislead Doctors and Harm Patients. Toronto, Ontario: Random House LLC; 2013.
- 33. Smith R. The trouble with medical journals. J R Soc Med. 2006;99(3):115–119. doi: 10.1258/jrsm.99.3.115.
- 34. Rampton S, Stauber J. Trust Us, We’re Experts: How Industry Manipulates Science and Gambles With Your Future. New York, NY: Penguin Books; 2002.
- 35. Ewen S. PR!: A Social History of Spin. New York, NY: Basic Books; 2008.