Abstract
Many scientists operate under a mental model that I label the “supply side model of science.” It assumes that the job of scientists is to supply information that governments and citizens can use to make good decisions, and that governments and citizens will use that information once they have it in hand. Therefore, scientists need only do their job—which is to supply accurate, high quality, well vetted information—and all will be well. Events of the past few decades have challenged this model severely. Across the globe, governments and citizens have rejected established scientific findings on climate change, on evolutionary biology, on the safety and efficacy of vaccines, and other issues. Typically, this rejection is ‘implicatory rejection.’ That is to say, people reject or deny science not because the science is weak, unsettled or too uncertain to inform decision-making, but because they don’t like the actual or perceived implications of that science. In some cases, for example evolutionary biology, the perceived implications are erroneous; in these cases, scientists can help to clear up misunderstandings by engaging seriously (and not dismissively) with people’s concerns. In other cases, for example climate change, the perceived implications may be partly true. In these cases, scientists may help by suggesting ways in which the negative implications might be mitigated or redressed. Often, this will require collaborating with other experts, such as experts in communication, religion, or public health. But whatever the details of the particular case, our overall situation suggests that it does not suffice for scientists simply to supply factual information, and leave it at that. Scientists need as well to engage actively with the recipients of that information.
Keywords: Science, Society, Disinformation
Introduction
Many scientists operate under a mental model that I label the “supply side model of science.” It is a model of the relationship between the scientific community and the publics that we serve. I believe this model has played a significant role in contributing to our present difficulties.
The “supply side model of science” has two parts. The first part is the premise that the job of scientists is to supply information that governments and citizens can use to make good decisions. The second part is that governments and citizens will, in fact, use that information once they have it. On this model, scientists’ role is to supply scientific information, which policy makers, business leaders, and others will then accept and use. Therefore, scientists—whether working as individuals or as part of a scientific community—need only do their job—which is to supply accurate, high quality information—and all will be well.
Whether or not that model worked well in the past is debatable, but it is clear that it is not working now. Over the past four decades, there have been many areas where weighty public policy decisions—from climate change to the Covid-19 pandemic response—have required scientific understanding to protect people from harm. Yet, these scientific understandings have often been poorly reflected in public policy and public behaviour, and sometimes disregarded entirely. There are many domains where scientists have supplied abundant, well-supported scientific conclusions that have been resisted or rejected by the people we expected to accept them. This tells us that it is not enough for the scientific community merely to supply scientific information. If we want society to use that information effectively, there is additional work that needs to be done.
Three areas that have been very well studied are evolutionary theory, climate change, and vaccines. My arguments draw on the findings of this work. Most of these studies focus on the United States, but it is likely that the principles will apply elsewhere, even if the particulars vary with social, cultural, and educational context.
Faulty assumptions in the supply side model
We all know that if the assumptions built into a model are wrong, then the model may not accurately predict what will actually happen in the world. I think that's a good part of what has happened to us in science in the last 30 years or so: we have been working with a mental model of the scientific project that has embedded assumptions that were often not true, or at least not entirely true.
The supply side model assumes at least three things that may not be true.
First, it assumes that people understand what we are saying. Second, it assumes that people want to hear what we are saying. Third, it assumes that we're working on a level playing field. Often, at least one of these assumptions is not true and sometimes none is true.
The assumption that people understand what we are saying
Let’s look first at the first assumption: that people understand what we are saying. Science is complex, often involves mathematics, and nearly always includes specialized terminology. If we are speaking with anyone other than expert colleagues in our own field, or a closely aligned one, we can be fairly confident that our audience will have little or no prior knowledge of the science, will not possess the skills to understand the relevant mathematics, and will not be familiar with our terminology.
The obvious solution to this challenge is better, clearer, more capacious communication. We have to do more to educate people. We have to accept that our audiences are starting with far less knowledge and information than we would ideally like. (Most Americans stopped taking science classes after high school chemistry and mathematics after high school trigonometry. This is equivalent to the traditional O-levels in the United Kingdom, or where students in most countries are at 16 years of age.) If we consider what American adults actually remember of their scientific education, one study suggests that the answer is not much past primary school math and science.
Here I think the scientific community has made huge strides in the last 30 years. In the past, scientists often bemoaned the sorry state of American scientific and mathematical education, while doing little to remedy it. I think the situation is better now. But education is a long-term project. In the short term, we have to adjust how we communicate, to translate our information into terms that our audiences can understand, using words with which they are familiar. This is a matter not so much of knowledge as of communication.
Personally, I’ve witnessed enormous changes in how we communicate, and how seriously we take communication, particularly in the last decade. In this regard, I think that we've done fairly well. Recent U.S. National Academy initiatives in this domain are extremely welcome and they seem to be very well thought through. Other scientific societies and organizations are also paying more attention to, and putting more resources into, scientific communication.
But, as welcome as these initiatives are, they often fall short, because they still assume too much background on the part of our audiences. If we wish to communicate broadly, we have to use language, terminology, and metaphors that make sense to a person who has not learned science past high school chemistry—and may not even remember that. Most of us do not do this. Most of us speak as if, at minimum, we are speaking with people who have attended college, and taken at least some college-level science courses. If we are university professors, we typically speak as if we are talking to our own students, rather than to groups of people with very little science education or knowledge.
When I try to explain this to scientific colleagues, many of them get upset. They feel as if I am asking them to “dumb it down.” Some will reply that they don't know how to talk about science to people whose education stopped at high school chemistry. If this is the case, then we need to learn how to talk to these people—which (in America, and likely elsewhere) is most people. I like to say: we’re not dumbing it down, we are cleaning it up.
A related problem involves accepting the fact that communication is as much about style as it is about substance. Anyone who's worked in Hollywood or advertising or marketing will tell you this. But for scientists, this is hard to swallow.
As scientists, we have been trained, we have been educated, and we work our hearts out to learn substance, and we want to convey that substance to others. Most of us feel that style is a distraction; sometimes we are even suspicious of colleagues who seem too stylish, whether it be in their clothing, their manner of speech, or the quality of their slides. We suspect that style is covering for a lack of substance.
The reality of communication is that substance by itself is not enough. How we appear—how we come across—is extremely important to our audiences. Social science research confirms that people judge messages by their messenger. If we don’t seem likeable, many people simply won’t listen, no matter how important our message or how much data we have to support it.
Social scientific research
Social scientific research also tells us that people are more likely to be willing to listen and accept a message from a person to whom they feel they can “relate.” This means someone they already know, or trust because of their relationship to them, such as a doctor, nurse, or religious leader, or who they feel as if they know, such as a celebrity. Social scientists have labelled this the “trusted messenger effect.”
Relatability is an intensely personal sensibility, but as scientists we have been taught to eschew the personal. We have been taught that it is the essence of science to be impersonal—that the laws of science don’t care who you are or where you come from. Science is universal, not individual. The great sociologist of science, Robert Merton, identified universalism as one of the four key norms of science. (The other three were communalism, disinterestedness, and organized skepticism.) This puts the norms of science into direct conflict with the demands of effective communication with non-scientists.
When it comes to communication, wholly impersonal approaches are rarely effective. Telling our personal stories, revealing something of ourselves, helps in communication, but we have trouble accepting this, because it stands in contradistinction to our internalized norms, which tell us that good scientists are dispassionate and disinterested, and that good science is universal, not personal.
The assumption that people want to hear what we are saying
The second assumption is that people want to hear what we're saying. Here we move from difficult to worse—because very often this is simply not true. Often, we are carrying news that many people receive as bad news.
The reality that many people do not want to hear what we have to say is even more difficult for us to accept than the need for appropriate and personal communication, because it is tied up with politics, with ideology, with economics, and with religion, all of which are delicate and difficult topics, especially for scientists. We don’t want science to be ideological or political, and we especially don’t want it to be politicized.
Many of us went into science because we didn’t like politics, or viewed it as messy and hard to understand, or were just not interested in it. Moreover, if we are natural (as opposed to social) scientists, we aren’t trained to think through political questions or even to analyse political information.
Politics is also risky territory, because if we do engage with it, we can be accused of politicizing the science, even if what we’re doing is a response to other people who have politicized it, even if we are trying to depoliticize a question by more clearly explaining the scientific evidence.
Religion is an even more difficult domain for scientists, because many of us have been trained (wrongly, in my view) to see religion as oppositional (as opposed to complementary) to science.
However, like it or not, scientific work often does have political, philosophical, or economic implications—or at least perceived implications—and those implications have large consequences for how people receive our work. Religion is a bit more complicated. Following Stephen Jay Gould, Kenneth Miller, and many others, I view science and religion as complementary—what Gould called “non-overlapping magisteria.” Personally, I do not believe that science has theological implications. I do not believe, for example, that evolutionary theory necessarily leads us to disbelieve the existence of G-d. But many religious people think that it does—that is to say, they think that evolutionary theory asserts the non-existence of G-d—and some scientists think this as well. This can lead religious believers to view science as threatening. And once they view one area of science (such as evolutionary theory) as threatening their world view, this can spill over into a generalized view of science as threatening. This is what I refer to as “implicatory denial” (or implicatory rejection). It is when people deny science because they dislike or fear its (real or perceived) implications.
Implicatory denial underlies a good deal of science rejection. Many creationists, for example, reject evolutionary theory because they think it asserts the non-existence of G-d, or that because selection operates on random mutations, life itself is therefore random and meaningless, or even that it reduces humans to a “mistake.” Many climate change deniers, likewise, reject the reality of anthropogenic climate change because they view it as challenging free market capitalism. Much of my work has explored this ideologically motivated resistance to accepting the realities of anthropogenic climate change, a resistance tied to the politics of “free market” economics, what George Soros has labelled “market fundamentalism.”
Ideological resistance is further reinforced by the tendency of most people to fear and resist change, even change that would in the long run benefit them.
Implicatory denial is a huge challenge for scientists. Politics, religion, and ideology are far afield from the training and interest of most natural scientists. But, like it or not, as scientists we need to understand and address these domains because they influence how people view our findings. They can play a controlling role in whether people are open to scientific evidence, or resist and reject it.
The assumption that we are operating on a level playing field
The third assumption is that we are operating on a level playing field. By this I mean that the social, cultural, and economic resources available to support the understanding and uptake of scientific findings are at least roughly comparable to the resources that have gone into resisting or blocking those findings. Of the three assumptions considered here, this is probably the most problematic, because it is the one that is furthest from the facts. In her paper, Dr. McNutt usefully distinguishes between misinformation and disinformation. Misinformation is a mistake. People make honest mistakes all the time, and honest mistakes can be corrected by well-crafted messages delivered in a warm and friendly way by trusted messengers. Most people are willing to correct their mistakes, if they are pointed out in a non-threatening way.
Disinformation is different. Disinformation is deliberate. It is designed and intended to mislead, confuse, and steer people in a direction other than the one they might otherwise follow.
The most well-documented example from modern times (Brandt 2009; Proctor 2012) involves the tobacco industry, which spent decades attempting to challenge, refute, and undermine public trust in the scientific evidence that their product was deadly. There is a similar story to be told about deliberate disinformation aimed at undermining climate science (Oreskes and Conway 2020), vaccine science (Mnookin 2012; Offit 2015), and a number of other areas of science. What is crucial about these campaigns is that they sought not only to generate doubt about the specific sciences involved (epidemiology, climate modelling) but also about the integrity of science as an enterprise and the trustworthiness of scientists.
As a community, scientists face both misinformation and disinformation. But the latter is the harder challenge, because it’s much more difficult to counter. In climate change, we have faced more than three decades of documented deliberate disinformation, and it has been extremely well funded. In fact, we have some evidence that climate disinformation has been better funded than the science it has worked to undermine.
Consider a brief example. About 10 or 12 years ago, I was part of a panel discussion at the American Geophysical Union with American oceanographer Jane Lubchenco. Jane gave an inspiring “feel-good” talk about a wonderful program with which she was involved called the Aldo Leopold program. It was specifically designed to address my assumption number one: that people understand what we are saying. Its goal was to teach scientists how to be better communicators to the general public. It's a wonderful program; it has done a lot of good.
After Jane finished, it was my turn to present. I got up and said: “That was the feel-good story. Now I'm going to give you the feel-bad story. It begins by asking: What is the budget for the Leopold program?” Jane told me the number. I went on to discuss a disinformation program funded by the tobacco industry, whose budget was 10 times the budget that Jane had. And it was just one program, funded by one particular company, in a large landscape of tobacco industry disinformation.
How can scientists address these challenges?
I have outlined some difficult problems. But we are smart people, and as scientists we have been trained to embrace difficult challenges.
One course of action is obvious: we continue to educate. Education is absolutely essential, and my arguments about these other components are by no means meant to imply otherwise. I do not agree with people who say “facts don’t matter.” They do. But facts alone are not enough.
We need to be mindful of who our audiences are and tailor our messages appropriately. Some scientists are uncomfortable with this recommendation, because “tailoring” a message can feel false, as if we are telling the truth but not the whole truth. That is not how I look at it. Rather, I look at it this way: we would not present the same material to advanced graduate students as to first-year university students. That does not offend us. We understand that education involves acknowledging the background of our students and starting at an appropriate level. We need to apply that same flexibility when speaking to public audiences who may have little scientific background.
We also need to acknowledge that there often are political and ideological implications to our work. But this does not mean we are helpless. Often people’s political concerns can be addressed. In the case of climate change, for example, if we're speaking with audiences who have concerns about the economic implications, we can discuss the ways in which climate change can be addressed through market mechanisms, or the economic opportunities that arise in a renewable energy economy. Right now, green energy is one of the fastest growing sectors of the American economy. Yet, very few Americans know this.
Sometimes the implications are more perceived than real; this is particularly true in the area of evolutionary biology (Miller 2007, 2009). In these cases, we can help our audiences to understand this by acknowledging and addressing their concerns in a non-condescending manner. As I’ve already noted, social survey data and public opinion polls suggest that, among people who reject evolutionary theory, many do so because they think that evolutionary theory implies the non-existence of God. Or they think that because evolutionary theory holds that life comes about through random processes, life itself must be random and have no meaning. Neither of these conclusions is correct. Evolutionary theory does not tell us whether or not there is a God, and the meaning of life is a philosophical and theological question that cannot be answered by science. If we acknowledge this, we can in some cases allay people’s concerns. Social science research shows that if we talk to our students about this, and particularly if we assign readings by scientists who are themselves people of faith, we can counter that wrong perceived implication, and reduce students’ resistance to learning evolutionary theory.
Countering deliberate disinformation is the hardest challenge, but there are some solutions. One thing we know can help is to call out deliberate disinformation for what it is. John Cook and colleagues have shown the power of intellectual “inoculation.” Often this involves what researchers have labelled a “two-sided message” that raises conventional objections to the position you would like people to accept (e.g. anthropogenic climate change is real, vaccines are safe and effective) and then refutes those objections. Research (Compton et al. 2021; Cook et al. 2017, 2019; Cook 2019) also shows that when you expose people to disinformation and explain that it is disinformation, this can help to insulate them against its effects. Such approaches can help to protect against disinformation that was not even part of the “treatment,” suggesting that once people become aware of disinformation as a general phenomenon, they may be more resistant to it.
All of this points to the need for natural scientists to take more seriously the insights and findings of social science, as well as to be supportive of robust research budgets for the social sciences.
It’s not the role of natural science to fix our broken political systems and cultural difficulties, but it may be (at least in part) the role of the social sciences. Social scientists can help to answer many of our questions about how to communicate effectively, how to communicate risk, how to build bonds of trust, and why people draw erroneous implications from true theories. They can also help us, especially, to reject faulty mental models and to understand and explain how both science and society actually operate. I often hear scientific colleagues make claims about how science operates that are not in fact true. We are actually conveying misinformation about our own work! That needs to be fixed, too.
Conclusion
Both communications experts and common sense tell us that we should not leave our audiences depressed. Recently, I saw a call for proposals from the U.S. National Science Foundation inviting research to incorporate human behaviour into epidemiological models. This is an important step in the right direction: integrating the social and natural sciences to address a crucial public health issue. We have all seen clearly during the last two years that if we don’t consider human behaviour in public health, we will not get the outcome we want. We can have the best, safest, most effective vaccines, and they will do no good if people refuse to get them. And that conclusion can be generalized: we can have the best science in the world, but it will do us little good if people ignore or even actively reject it.
References
- Brandt A. The cigarette century: the rise, fall, and deadly persistence of the product that defined modern America. New York: Basic Books; 2009.
- Compton J, van der Linden S, Cook J, Basol M. Inoculation theory in the post-truth era: extant findings and new frontiers for contested science, misinformation, and conspiracy theories. Social and Personality Psychology Compass. 2021. doi:10.1111/spc3.12602.
- Cook J, Lewandowsky S, Ecker UKH. Neutralizing misinformation through inoculation: exposing misleading argumentation techniques reduces their influence. PLoS ONE. 2017. doi:10.1371/journal.pone.0175799.
- Cook J, Supran G, Lewandowsky S, Oreskes N, Maibach E. How Americans were misled about climate change. 2019. http://www.climatechangecommunication.org/all/america-misled/
- Cook J. Understanding and countering misinformation about climate change. In: Chiluwa IE, Samoilenko SA, editors. Handbook of research on deception, fake news, and misinformation online. Information Science Reference/IGI Global; 2019. p. 281–306. doi:10.4018/978-1-5225-8535-0.ch016.
- Miller KR. Finding Darwin’s God: a scientist’s search for common ground between God and evolution. New York: Harper Perennial; 2007.
- Miller KR. Only a theory: evolution and the battle for America’s soul. New York: Penguin Books; 2009.
- Mnookin S. The panic virus: the true story behind the vaccine-autism controversy. New York: Simon and Schuster; 2012.
- Offit PA. Deadly choices: how the anti-vaccine movement threatens us all. New York: Basic Books; 2015.
- Oreskes N, Conway EM. Merchants of doubt: how a handful of scientists obscured the truth on issues from tobacco smoke to global warming. 2nd ed., with a new introduction by Al Gore and a new postscript by the authors. New York: Bloomsbury Press; 2020.
- Proctor RN. Golden holocaust: origins of the cigarette catastrophe and the case for abolition. Los Angeles: University of California Press; 2012.