Abstract
Public trust is being lamented as the central victim of our new, digital information environment, a notion reflected in labels for our society such as “post‐truth” or “post‐trust.” Within this article, we aim to call this deficit view of public trust into question and to kindle a more positive outlook in future research. For this, we utilize the Social Amplification of Risk Framework to discuss trust as an inherent aspect of social interactions and to question the framework's normative approach to public trust and risk perception. Drawing on a literature review of prior studies that investigated trust within the structure of SARF and on a case study on the impacts of Fukushima on public trust in nuclear energy, we argue that the current normative “trust deficit model” should be overcome and that future risk research should increasingly focus on the opportunities of the digital information environment for risk communication.
Keywords: Risk communication, risk perception, social amplification of risk framework, trust
1. SOCIAL AMPLIFICATION OF RISK FRAMEWORK—A FRAMEWORK FOR TRUST RESEARCH?
Researchers who study the phenomena of public risk and benefit perceptions and technology acceptance aim at providing insights to other sciences, regulators, and industry. Thus, the following complaints of exasperated stakeholders likely sound familiar:
Nuclear advocates asking, “In light of its potential to mitigate the impact of energy generation on climate change, why do people demand the withdrawal from nuclear energy?”
Toxicologists lamenting the public's inability to grasp that “it is the dose that makes the poison” and their opposition to consumer products containing trace chemicals or to synthetic pesticides used in agriculture.
Public health regulators wondering why people are skeptical about wearing a facemask to protect themselves from an infection with Covid‐19.
A prerequisite to address these grievances of industry representatives, natural or technical scientists, and regulators is to dismantle their notion that the public's risk perception stems from a lack of knowledge about the likelihood and severity of risks in direct comparison to the risk judgments of experts (Hansen, Holm, Frewer, Robinson, & Sandøe, 2003; Irwin & Wynne, 1996). While it is not disputed that domain‐specific knowledge is related to risk perception (see Siegrist & Árvai, 2020 for an in‐depth discussion), the normative conceptualization of lay‐people's “ignorance” as the sole cause of their “flawed” risk judgments must be questioned. Risk research, management, and communication have turned away from the dated conceptualization of lay‐people's risk perception as a pure deficit in knowledge (suitably termed “knowledge deficit model” cf. Hansen et al., 2003; Slovic, 1987). Morgan, Fischhoff, Bostrom, and Atman (2002) provided a methodological framework to systematically study lay‐people's mental models of a particular risk, which allows the incorporation of both domain‐specific knowledge and other factors impacting judgment and decision making (e.g., affect, attitudes). This leads to a better understanding of “the things that people need to know but do not already” about a specific risk (Morgan et al., 2002, p. 19). Once it is established that various factors, above and beyond knowledge, impact people's risk perception and decision making regarding a particular risk, another important question inevitably arises for stakeholders in risk regulation, management, and communication: “Do people trust us to communicate honestly, to be capable of managing the risk and to act in their best interest?”
For this reason, trust is a frequently examined phenomenon in risk research (Earle, 2010b; Eiser, Miles, & Frewer, 2002; Siegrist, 2019). Efforts were made to characterize different trust and trust‐related concepts, such as confidence, general trust, and interpersonal or social trust (Earle, 2010a; Siegrist, 2010; Siegrist, Earle, & Gutscher, 2003) and to identify factors that influence people's trust in stakeholders, such as value similarity or the perception of competence (Earle & Cvetkovich, 1997; Siegrist, Cvetkovich, & Roth, 2000). Theoretical models aim at explaining trust and its relationship with other factors, such as risk and benefit perception and acceptance (e.g., Earle, Siegrist, & Gutscher, 2007). For a more comprehensive overview of the current literature on trust and related concepts, we point readers to a recent review by Siegrist (2019), as well as to older literature reviews (Chryssochoidis, Strada, & Krystallis, 2009; Earle, 2010b).
Within this article, we apply the Social Amplification of Risk Framework (SARF) (Kasperson et al., 1988) to highlight where trust plays an important role in reactions to a risk or risk event. SARF attempts to incorporate all factors relevant in public reactions to a particular risk or risk event. Due to its use of metaphors, numerous interdependent factors, and dynamic nature, it is not a model that can be tested empirically. However, it offers an intuitive structure to ponder how a risk or risk event leads to a particular impact, such as the public's risk perception or implications for regulation. For individual risks, it highlights the factors that should be taken into account in research and practice. SARF is suitable for the purpose of this article for two reasons. First, SARF's core focus on social interactions between individuals, communicators, and organizations is useful for a discussion about trust as an inherent aspect of all social interactions (Kasperson et al., 1988; Pidgeon, Kasperson, & Slovic, 2002). Second, on a more critical note, the potentially problematic notion of risk amplification and attenuation illustrates issues in risk research that we would like to address in this article: (1) its normative implication that someone knows what the right level of risk perception is for a given issue and (2) that this someone is the right person to place public trust in (Rayner, 1988; Rip, 1988).
SARF's strength lies in its application to a specific risk or risk event. Depending on the risk, some components of SARF might be highly important, while others matter less or not at all. Moreover, the metaphor of ripples, underlying SARF, might not fit all risks in the same way. Focusing on the relevant components might contribute to a better understanding of societal responses to risks or risk events. Thus, we first present a case study on nuclear energy generation in Europe, which illustrates how trust might be impacted by and might impact risk regulation, management, and communication (Section 2). Second, we review prior literature on the role of trust in information sources, regulators, and risk management, applying SARF as the structural backbone of this literature review (Section 3). Several articles, focusing on a variety of risks, have explicitly investigated trust within SARF, identifying two main points where trust plays a role in public reactions to risks or risk events: the trust in information sources (Section 3.1) and the trust in the regulation and management of a risk (Section 3.2). Last, we propose the “trust deficit model” and hypothesize that defining public trust as a finite resource might negatively impact the way risks or risk events are regulated, managed, or communicated. We further discuss connecting points for future research into people's trust in risk regulators, managers, and communicators (Section 4).
2. CASE STUDY: FUKUSHIMA AND THE TRUST IN NUCLEAR ENERGY GENERATION IN EUROPE
On March 11th 2011, a large earthquake and ensuing tsunami off the coast of Japan set off an event sequence that ultimately led to the meltdown of three reactors at the Fukushima Daiichi power plant (American Nuclear Society, 2012). We would like to discuss the role of trust after a risk event within a case study of the impact of Fukushima on the trust in nuclear energy generation in Europe. Within it, we focus specifically on the way we believe a “trust deficit model” might play a role. Within this case study, we will exclusively focus on trust, while not claiming comprehensiveness in covering all influential factors that impact the acceptance of nuclear energy in Europe.
2.1. Background Information on the Disaster and its Potential Impact on Energy Policy in Europe
The most drastic impact of the disaster outside Japan could be observed in Germany, where the nuclear phase‐out was expedited and the last German nuclear reactor is scheduled to shut down by the end of 2022. Furthermore, in the wake of this decision in Germany, nuclear power capacities in Switzerland, France, and Belgium were substantially reduced, and it was decided not to replace old nuclear power plants (Kormann, 2019; NZZ, 2011). In line with the high public interest in climate change and its mitigation, the rushed nuclear phase‐out in some European countries has been discussed critically as irrational and based on public pressure instead of scientific evidence and policy recommendations (e.g., Carpenter, 2020; Goldstein, Qvist, & Pinker, 2019; Kormann, 2019). As part of this discussion, the increasing power consumption, the CO2 emissions that could be prevented by reducing Germany's reliance on coal instead of nuclear power, and the public's reluctance to tolerate renewable energy sources in their backyard are debated (e.g., Kharecha & Sato, 2019; Rehner & McCauley, 2016; Weber, Jenal, Rossmeier, & Kühne, 2017). A study by Rudolf, Seidl, Moser, Krütli, and Stauffacher (2014) concluded that the transition from nuclear to renewable energy might be challenging, as the preferences for other energy sources (e.g., gas, photovoltaics, wind power, hydropower) did not increase as a result of the disaster. This policy change has also made it impossible to benefit from future technological developments in nuclear energy that might overcome some of the challenges of current nuclear power (Gen IV nuclear energy systems, for example, recycling of nuclear waste for energy generation) (Johnson, 2018).
From previous research, it is known that nuclear energy was connected to negative affect and dread from the public even prior to the accident (Fischhoff, Slovic, Lichtenstein, Read, & Combs, 1978; Siegrist & Visschers, 2013; Slovic, 1993; Visschers & Siegrist, 2013). Nuclear power combines aspects that spark higher risk perception in people, such as the high availability of dread‐inducing pictures of the contamination of the natural environment or the potential of nuclear accidents to affect a large number of people (Slovic, 1993; Visschers & Siegrist, 2013). European countries, such as Germany, have a long history of political opposition and public protests against nuclear energy (Renn & Marshall, 2016). Nevertheless, in the years before the disaster, the term “nuclear renaissance” was used to describe people's reluctant acceptance of nuclear power as a means to generate energy (Bolsen & Cook, 2008; Teräväinen, Lehtonen, & Martiskainen, 2011). In Britain, concerns about climate change and energy security were related to support for nuclear power, but only if acceptance was framed as “reluctant” (Corner et al., 2011; Pidgeon, Lorenzoni, & Poortinga, 2008). There exists a large knowledge base on general and site‐specific acceptance of nuclear energy for the United States (Greenberg, 2009a, 2009b, 2009c; Greenberg & Truelove, 2011).
Thus, fears over potentially devastating accidents were and are always present but might have been partly overshadowed by the perception of the significant benefits of nuclear energy for sustainable energy generation. The Fukushima disaster increased the risk aversion toward nuclear energy and triggered changes regarding political preferences in Germany and other European countries (e.g., Goebel, Krekel, Tiefenbach, & Ziebarth, 2015; Renn & Marshall, 2016). Even more importantly, the perception of the benefits did not outweigh the perceived risks of nuclear power anymore, as other energy generating methods were presented as a solution and “easy way out.” The media coverage as information source and as mirror of the public discourse about the Fukushima disaster might have contributed to this, as will be discussed subsequently.
2.2. Risk Perception, the Media Coverage, Public and Political Discourses
Park, Wang, and Pinto (2016) uncovered discrepancies between the volume and content of the journalistic coverage of the disaster in German and U.S. news media. U.S. newspapers focused on the earthquake and tsunami and framed Fukushima as an isolated natural disaster, whereas German newspapers discussed broader issues, including energy alternatives and policy implications (Park et al., 2016). Similarly, a study investigating the news coverage in four European countries found an emphasis on the implications for domestic nuclear plants in Germany and Switzerland, while French and British media focused on the tsunami as a natural hazard (Kepplinger & Lemke, 2016). Similar findings were made by other authors, who also stressed the impact of different narratives on risk amplification and on long‐term changes to the sociopolitical environment (Arlt & Wolling, 2016; Hermwille, 2016). While it is not possible to unravel the directionality, as journalistic coverage originates from the national sociocultural backdrop (Amend & Secko, 2012; Hodgetts, Chamberlain, Scammell, Karapu, & Waimarie Nikora, 2008), the studies presented above suggest that the different narratives impacted the public's responses in different countries to some degree.
Of course, nobody can prevent natural hazards, such as earthquakes or tsunamis, due to the underlying randomness of such “perfect storms” (Park et al., 2016; Paté‐Cornell, 2012). The public discourse of Fukushima as a natural disaster should, therefore, not have direct implications for people's trust in the operators and regulators of nuclear energy stations. However, the fact that the nuclear power station was sited in a location with an increased seismic risk placed the focus on the human involvement, instead of natural hazards. Furthermore, focusing the public dialogue on the epistemic uncertainty of “black swans,” and on the risks and likelihoods of accidents, contributed to a stronger focus on the risks of nuclear energy (Arlt & Wolling, 2016; Park et al., 2016; Paté‐Cornell, 2012). The fact that there was a nuclear accident in a highly developed country such as Japan might have reinforced the perception that similar accidents were likely in Germany (i.e., increasing the risk perception of nuclear energy, reducing the trust in regulators and providers). The ethics committee that was hastily assembled after Fukushima recommended the phase‐out of nuclear energy, while simultaneously promoting renewable energy sources (Ethik‐Kommission Sichere Energieversorgung [Ethics Committee on Secure Energy Supply], 2011; Renn & Marshall, 2016). This also led to the statement by German Chancellor Angela Merkel, previously a strong advocate of nuclear power, that she was not willing to accept the residual risk of nuclear energy, despite it being small and despite the benefits for carbon‐free energy generation (Spiegel International, 2011). What led to this change of heart in Germany's political landscape?
The discussion of the risks of nuclear energy might have further challenged people's trust in nuclear energy operators and their ability to minimize risk (Goebel et al., 2015; Visschers & Siegrist, 2013). It is plausible that this decline in trust originates from the lower perception of the competence to manage nuclear power stations, but also from the perception of diverging values (Cvetkovich, Siegrist, Murray, & Tragesser, 2002; Greenberg, 2014; Visschers & Siegrist, 2013). Consequently, the shift in support for nuclear energy from Germany's politicians stemmed partly from a fear of a similar loss of trust in them, particularly right before the general elections (Spiegel International, 2011). This fear led to an adjustment of the politicians' expressed values and attitudes toward nuclear energy. The interaction between risk and benefit perception and its implications for the acceptance of a risk is a well‐studied phenomenon in risk research (Finucane, Alhakami, Slovic, & Johnson, 2000; Siegrist et al., 2000). People are unwilling to accept technologies that offer too little individual and societal benefit and are perceived to be too high in risk in comparison. Thus, such a statement from an opinion leader in a traditionally trustworthy position, in combination with the promise of more safety, contributed to the amplification of risk of nuclear energy.
2.3. The Role of Trust and the Belief in Beneficial Alternatives
A study on the acceptance of nuclear power before and after a nuclear disaster stressed the important role of the perception of benefits (Visschers & Siegrist, 2013; Visschers, Keller, & Siegrist, 2011). More importantly, the acknowledgment of the risks of nuclear power was accompanied by the promise that an “Energiewende,” namely the fast replacement of nuclear energy with other energy sources, would be easily feasible, clean, and inexpensive (Ethik‐Kommission Sichere Energieversorgung [Ethics Committee on Secure Energy Supply], 2011; Renn & Marshall, 2016). Thus, promises were made that sustainable and cheap alternatives exist (i.e., reducing the benefit perception of nuclear energy), and that lower energy consumption was within easy reach (i.e., increasing confidence that an immediate phase‐out was feasible). This dramatically decreased the perceived benefits of nuclear power, which was more strongly linked to decreases in the acceptance of nuclear power than the increases in risk perception (Siegrist & Visschers, 2013; Visschers & Siegrist, 2013). To put it simply: People did not see the point in keeping a risky energy source (i.e., nuclear energy) if other, less risky energy sources were available.
The public's trust in the regulators and operators of nuclear energy might have been eroded by the perceived inevitability of such an accident, despite their best efforts to manage those risks. Simultaneously, other involved stakeholders (i.e., politicians, regulators, journalists) might have adjusted their communication patterns and decisions out of fear of losing the public's trust too. Thus, the case of the Fukushima disaster suggests that the mere perception of a potential loss in trust might lead to important implications for decision making and communication regarding a particular technology. It serves to illustrate the need to focus on the causes and mechanisms of shifts in trust in risk research, while moving away from the normative deficit conceptualization of public trust.
3. TRUST AND THE SOCIAL AMPLIFICATION OF RISK FRAMEWORK
Subsequently, we would like to broaden the perspective to other risks and risk events and review the available literature on the role that trust plays within SARF. At its core, SARF describes how society and societal systems process information in reaction to a risk or risk event and how the interactions between institutions shape behavior and the risk itself (Kasperson, 2014; Kasperson et al., 1988). Trust plays a key role in this reaction, something that has previously been labeled the “trust heuristic.” This trust heuristic supports judgment and decision making in uncertain environments without demanding too many resources from the decisionmaker (Siegrist, 2019; Terpstra, 2011). It allows people to judge information sources, as well as regulators and risk managers.
3.1. Trust in the Information Source
The trust that we have in a piece of information and its messenger influences whether we pay attention to it, how believable the information is, and what conclusions we draw from it (Frewer, Howard, Hedderley, & Shepherd, 1996; Frewer, Scholderer, & Bredahl, 2003; Siegrist et al., 2000). For example, if there exists contradictory information on ways that people can protect themselves from a foodborne outbreak (e.g., avoid cucumbers vs. avoid sprouts), consumers tend to utilize their level of trust in the messengers to decide what to do (e.g., advice of the more trusted source will be focused on, remembered, and followed; De Vocht, Cauberghe, Sas, & Uyttendaele, 2013; Jungermann, Pfister, & Fischer, 1996). This link between people's trust in an information source and their attitudes and decision making can be found in most areas of risk research. Mase, Cho, and Prokopy (2015) questioned agricultural advisors on the topic of climate change within SARF. Advisors with higher trust in scientific information sources about climate change were more likely to believe in climate change than advisors with lower trust. Advisors with high levels of trust had a positive attitude toward agricultural adaption. People's knowledge, attitudes, and values in turn influence which information sources about climate change they trust and thus rely on (Malka, Krosnick, & Langer, 2009). Petts and Niemeyer (2004) investigated parents’ beliefs about the immunization of their children and found that the social networks among the parents (on‐ and offline) strongly shape and reinforce parental beliefs about risks. Increases in risk perception of combined immunization of children happen mostly after media communication about unsubstantiated risks (e.g., immunization causes autism) and the perception that “even experts do not know” (Petts & Niemeyer, 2004).
Lin and Bautista (2016) uncovered that while the trust in traditional media (e.g., television, newspapers) did not influence people's affect regarding haze pollution, the trust in new media did (e.g., Twitter, Facebook).
Three important properties of today's information environment can be drawn from the latter study (Lin & Bautista, 2016): First, there exists a shift in the type of information transfer (i.e., from mass media to social media); second, the importance of particular social stations has changed (i.e., increasing importance of online social groups vs. traditional news media); and third, social media and the proximity it generates have made it easier for people to judge certain determinants of trust in information sources (e.g., shared values and intentions, familiarity). This explains in part why, in the case of haze pollution in Singapore, social media was more influential than traditional media: People exhibited lower levels of trust in the traditional media and higher levels of trust in social media. During the 2013 Southeast Asian Haze crisis, a lot of emotional information was shared on social media (e.g., by sharing pictures of the crisis or by sharing humorous content), which might have implications for the trust people place in these messengers (Lin & Bautista, 2016).
Science communication research suggests that the emphasis in news articles is frequently not on the main aspect of an issue but is colored by the journalists’ values and perceptions of the target audience's expectations (Amend & Secko, 2012; Hodgetts et al., 2008). For example, depending on a journalist's personal views of biotechnology and a news outlet's target group, a news article about a new gene‐edited potato might either focus on the scientific uncertainty regarding potential risks (risk focus) or on the fact that the potato is not affected by blight (benefit focus). The media depend largely on people's trust in them and thus need to deliver information that corresponds to the viewers’ attitudes and values. Social media platforms, such as Facebook or Twitter, have made it easier for social networks to share and discuss information with trusted groups and peers. Research shows that we are more likely to trust information that is shared and liked by friends and family, compared to information by organizations or impersonal experts (Comrie, Burns, Coulson, Quigley, & Quigley, 2019; Lin & Bautista, 2016; Petts & Niemeyer, 2004). These effects can be explained by the fact that trust in an information source is based on the perception of similar values, accountability, competency, and transparency (Frewer et al., 1996; Frewer et al., 2003; Siegrist et al., 2000). This does not automatically imply that trust is lost in more traditional information sources and social stations (e.g., newspapers, government, science). Rather, other, less regulated and more bidirectional information sources and social stations have appeared that people place their trust in too (e.g., social media, blogs). Also, information transfers have become more efficient and effective in shaping risk perceptions due to the rise of social media.
However, the perception of similar values does not necessarily imply that the information sources that we trust are the ones that provide us with factual information leading to informed decision making in our best interest. The fast spread of interactive information across social media is an important aspect of today's information environment. Social media platforms are largely unregulated. Thus, anyone—with or without the relevant qualifications and transparent interests—can spread information (for example, about immunization; Brown et al., 2010). Conventionally, the trust heuristic is a useful strategy to make decisions in uncertain environments without requiring excessive resources (e.g., time, attention). However, it can also be faulty and lead to suboptimal decision making for an individual or society when people trust an information source with vested interests or without the necessary competences.
3.2. Trust in Regulators and Risk Management
A risk event has the potential to substantially reduce people's trust in an institution or regulator. A distinct, visible risk event (e.g., disaster, misinformation, nontransparency in information provision) might reduce people's trust temporarily or in the longer term. Even communicating about a nonacute risk or providing information about scientific uncertainty can negatively impact trust in those responsible. For instance, hearing about the risks of nanotechnology lowered people's trust in industry leaders, which in turn was related to lower acceptance and benefit perceptions (Cobb, 2005; Retzbach & Maier, 2015). Conversely, a longitudinal study by Frewer, Miles, and Marsh (2002) found that trust in regulators was not affected by the media's reporting of the risks of genetically modified foods and that risk amplification occurred independently of trust. These conflicting findings seem puzzling in light of the previously established importance of trust, but might arise from methodological issues (e.g., people wish to respond in a consistent manner, unclear directionality of mechanism in cross‐sectional data) or, more importantly, from the fact that trust is not relevant for all risks in every situation (Eiser et al., 2002; Siegrist, 2019).
First, different concepts of trust (e.g., confidence, general trust, social trust, interpersonal trust) impact risk perception. Thus, a prerequisite is to determine whether trust is relevant for the risk at all or whether other concepts might be more applicable. In situations where people believe they have sufficient knowledge to judge the information for themselves, or where it is unclear who is responsible for a particular hazard, trust plays only a minor role in their risk perception (Siegrist & Cvetkovich, 2000). For example, in toxicological risk assessments, consumers do not have a clear picture of the involved stakeholders and thus, factors other than trust are more relevant (Bearth, Saleh, & Siegrist, 2019; Saleh, Bearth, & Siegrist, 2019). In such cases, the validity of measuring people's social or interpersonal trust is low, and it might be more interesting to measure general trust and confidence (Siegrist, Gutscher, & Earle, 2005). A recent study on the Covid‐19 pandemic showed that general trust and social trust had opposing effects on people's risk perceptions. High general trust was associated with lower risk perception, while high social trust was associated with a higher risk perception of an infection with the SARS‐CoV‐2 virus (Siegrist, Luchsinger, & Bearth, 2021).
Second, stakeholders could simultaneously function as risk generators, regulators, and communicators, which implicates different levels of trust (Frewer, 2003). People might be willing to trust in the ability of employees of a pharmaceutical company to develop effective and safe medication, yet they might be less willing to accept them as regulators or communicators. Conversely, people might reject a journalist's ability to determine the health risk of a particular chemical substance but would still trust in the information provided in a news article (Bearth, Kwon, & Siegrist, 2020). This challenges the common operationalization of trust as an individually stable construct, such as the trust people have in regulatory offices to ensure food safety (Breakwell, 2007; Mase et al., 2015).
Third, depending on the risk and the type of trust under investigation, paradoxical results were uncovered. Thus, the risk paradox is another aspect that should be mentioned in the context of trust, particularly regarding the behavior that stakeholders exhibit in reaction to a risk or risk event. A systematic literature review showed that higher trust in public authorities coincided with a reduced willingness to take precautionary actions regarding natural hazards (e.g., floods, earthquakes; Wachinger, Renn, Begg, & Kuhlicke, 2013). Confidence in the protective abilities of the authorities, rather than a lack of risk awareness, reduced the perceived need to take action. Similarly, another literature review recommended differentiating between people's trust and confidence in acute risk events, such as an outbreak (Siegrist & Zingg, 2014). Before or during an outbreak, such as the current Covid‐19 pandemic beginning in Wuhan (2019‐nCoV), the responsible authorities, such as national and local governments or the World Health Organization (WHO), face a challenging dilemma linked to trust and confidence. Inaction, a lack of transparent communication, or delayed action might result in high death rates and reduce confidence in the health system, as well as trust in the stakeholders (Hsu, Chen, Wei, Yang, & Chen, 2017; Siegrist & Zingg, 2014). Alerting the public to the outbreak with urgency and enforcing strict precautionary actions (e.g., hygienic measures, quarantine of patients, vaccinations, travel bans) is an important tool in stopping the outbreak (Hsu et al., 2017). The trust that the potentially affected people have in the stakeholder might influence whether the recommended actions, such as frequently washing and disinfecting hands, are taken (Prati, Pietrantoni, & Zani, 2011; Siegrist & Zingg, 2014).
However, if outbreak responses are successful and infection numbers decrease, people's trust in the stakeholders might be reduced based on the false notion that the stakeholders overreacted. In accordance with the risk paradox, this might also increase people's confidence regarding future waves and outbreaks and reduce their willingness to take precautionary actions (Prati et al., 2011; Rossmann, Meyer, & Schulz, 2018).
4. MOVING BEYOND THE “TRUST DEFICIT MODEL”
“The confidence in governmental agencies is declining” and “people do not trust scientists anymore” are commonly heard complaints in the risk management and communication community (Kasperson, 2014; Pidgeon et al., 2002). Frequently, these complaints are accompanied by the fear of a “dread spiral” that ultimately leads to more and more public distrust (Pidgeon et al., 2002). While the complaints ring true, the claim of a continuously decreasing global and general trust in science and official organizations lacks empirical evidence. For example, a Eurobarometer report from 2017 showed that in a majority of countries, trust in the national government increased or remained stable (European Commission, 2017). Similarly, a recent report (US National Science Board, 2018) showed stability in the public's confidence in the scientific community and only minor decreases in public trust in the media. Another large‐scale longitudinal study from Germany also did not confirm an erosion of the public's trust in the media (Jackob et al., 2019). A study on trust and distrust in America by the Pew Research Center (Rainie, Keeter, & Page, 2019) did find a continuous decline in trust in political institutions, but not in other groups (e.g., scientists, business leaders). Furthermore, the actual decline in trust was weaker than people expected it to be. In other words, if trust is seen as a limited resource, many researchers may view the decline in public trust as much more dramatic than it actually is (Rainie et al., 2019). Van de Walle, Van Roosbroek, and Bouckaert (2008) reviewed international survey data on trust in governments and came to the conclusion that trust fluctuates, while a steady decline is not supported by the data. More importantly, they argue that there exists insufficient data in many countries to support claims of a continuous decline in trust (Van de Walle et al., 2008).
People are concerned about a decline in trust because they implicitly assume that it may result in an amplification of some risks or an attenuation of other risks (Kasperson, Golding, & Tuler, 1992). The underlying fear is that, as a consequence, society may not focus on the hazards it should be most concerned about. SARF is based on the normative precondition that risks deemed less urgent by experts are amplified and risks deemed urgent by experts are attenuated. While intuitive in its description of public responses to risks, the application of SARF obscures two relevant questions (Busby & Duckett, 2012; Busby & Onggo, 2013; Rayner, 1988; Rip, 1988). First, is it possible to define an objective cut‐off point for the appropriate public reaction to a particular risk? Second, is it possible to say which sources of information, channels, and stations are the most trustworthy in terms of competence and shared values? Although a loss of public trust is a valid fear, given its importance for public responses to emerging risks and risk events and in light of global challenges such as climate change or pandemics, the situation is more complicated than a simple deficit.
For emerging risks and risk events, public risk perceptions frequently lag behind experts’ risk judgments. The public has to rely on affect and on their trust in the involved social stations (affect and trust heuristics) to judge the relevance of a particular risk for themselves. In a recent perspective, “understanding the role of trust on risk perception and behavior” was named one of the 10 most important accomplishments of risk research (Greenberg et al., 2020). This theoretical accomplishment conceptualizes trust as an initial response that “lingers even in the face of deliberative assessment” (Greenberg et al., 2020, p. 2114). The goal of risk communication or, more generally, science communication is to provide the public with the tools to make more informed and deliberative decisions. However, focusing on an expected loss of trust and adjusting communication accordingly might do more harm than good, as it puts the communicator in a defensive stance. Complaints about vanishing public trust, and fears that this will result in the amplification and attenuation of the “wrong” hazards, could be described as a “trust deficit model.” Within this conceptualization of public trust, the risk researcher or manager normatively defines who is the “right” information source that should be trusted, while another information source is defined as the “wrong” one that needs to be met with distrust. Along with other scholars (Hansen et al., 2003; Rayner, 2004), we would like to make the case that it is detrimental to inform risk management and communication with a deficit model in mind. Rather, research should focus on the new information environment created by digitalization, with the following issues to be studied.
First, there have been shifts in the feedback and iteration between information sources and stations along the SARF, as these processes have become more bidirectional due to social media. People participate actively, and with increased reach, in public discourses about risks and their communication. This offers an interesting “playground” for risk researchers to investigate people's reactions to risks and risk events (e.g., by examining tweets or other social media posts). However, there is a lively debate, summarized under the rule of thumb “90‐9‐1” (or the 1% rule), about how many people indeed participate actively in digital debates over specific issues (Hargittai & Walejko, 2008; Horowitz, 2006; van Mierlo, 2014). According to this rule of thumb, only 1% of people create content and 9% edit and modify it, while a staggering 90% merely passively watch the digital debate. Thus, a conclusion that is potentially relevant for risk research is that a small number of people are “quite loud,” while the majority quietly observes these debates. Focusing on these loud 10% (potentially with extreme views) might fuel the “trust deficit model” and distort our view of what the majority of the public thinks about risks and risk events.
Second, people's communication on social media might be associated with higher levels of trust than the communication efforts of official sources, because of the increased visibility of the determinants of trust (e.g., shared values, familiarity). This informational environment also supports information sources that lack the necessary competences or have vested interests. This is an important challenge, because due to the inherent uncertainty of risk and the inter‐ and intrapersonal variance in the acceptability of risk, it is not trivial to determine which sources do indeed have vested interests or distribute false information. Research needs to investigate further how current issues, such as echo chambers and false or misleading information about risk, can be tackled. First studies are already being published that look at strategies to counteract so‐called “fake news” and its spread on social media. For example, providing people with guidelines to uncover fake news (simple questions, such as “Do I recognize the news organization that posted the news story?”) reduced their trust in fake news and also lowered the likelihood of them sharing it (Lutzke, Drummond, Slovic, & Árvai, 2019). This stresses the importance of previously voiced ideas that skepticism, rather than simple trust in the “right” information source, might be an important ability to promote in the public (Pidgeon, Poortinga, & Walls, 2007). This is a step away from the normative “trust deficit model” and would allow the public to make their own decisions about which sources are trustworthy and which are not.
Third, the health professionals interviewed in a study by Comrie et al. (2019) stressed the central role of quickly available and trustworthy information during a risk event (e.g., a disease outbreak). According to these health professionals, organizations that do not communicate via social media are perceived as “out of touch,” and they saw a need for official organizations to establish a presence on social media (Comrie et al., 2019). The literature offers a number of conclusions about what attributes communicators should have to increase their trustworthiness (e.g., diversity of expertise, modeling behavior; Siegrist & Zingg, 2014). A prerequisite to shaping communicators’ roles is to determine who the involved stakeholders are and to differentiate between their roles as receivers and transmitters within the flow of information (Mase et al., 2015). Thus, the relationships and trust between stakeholders should be seen as a dynamic system, in which a stakeholder trusts other stakeholders and, at the same time, might be trusted or distrusted by them. Trust in governance and trust in information sources are two different types of trust, which should be reflected in the operationalization of trust; ideally, both types should be measured. The role of transparency for trust should also be investigated further. There is a consensus among risk researchers that a lack of transparency in risk management and communication leads to a loss of trust (Pidgeon et al., 2002; Siegrist & Zingg, 2014). One form of transparency in risk management is to openly communicate scientific uncertainties about likelihoods and outcomes (European Food Safety Authority et al., 2019). However, there is a scarcity of evidence that this form of transparency actually increases trust in experts. Thus, it would be desirable to gather more systematic evidence on the effects of uncertainty communication on people's trust and the amplification of risk.
Last but definitely not least, we believe that there is a need to rethink terminology such as “posttrust society” (Bouder, 2015; Löfstedt, 2005). This terminology contributes to the conceptualization of trust in a deficit light, while there is hardly any evidence to back a global and general loss of public trust in science and regulation (European Commission, 2017; US National Science Board, 2018). Moreover, there is emerging evidence that the current Covid‐19 pandemic has increased the public's trust in science, particularly based on the fast development of vaccines (Funk, Tyson, Kennedy, & Johnson, 2020; Science Barometer, 2020; Science Barometer Switzerland, 2020). The new information environment created by digitalization is undoubtedly a challenge for risk communication. However, rather than focusing on the negative impacts of social media (loss of trust, trust in fake news, echo chambers, etc.), research should focus on its potential to sustain and foster trust, as an inherent mechanism of any social interaction and an important guide for people's judgment and decision making (Siegrist, 2019).
ACKNOWLEDGMENTS
We would like to express our gratitude to Roger Kasperson for initiating this Special Issue and inviting us to write a paper for it. We are thankful for his invaluable input on the first version of this manuscript. We would also like to thank the reviewers for their time and helpful feedback, which improved our article in various ways.
Open Access Funding provided by Eidgenössische Technische Hochschule Zürich.
[Correction added on 8th April, 2022, after first online publication: CSAL funding statement has been added.]
REFERENCES
- Amend, E., & Secko, D. M. (2012). In the face of critique: A metasynthesis of the experiences of journalists covering health and science. Science Communication, 34(2), 241–282. 10.1177/1075547011409952
- American Nuclear Society. (2012). Fukushima Daiichi: ANS committee report. Retrieved from http://fukushima.ans.org/report/Fukushima_report.pdf
- Arlt, D., & Wolling, J. (2016). Fukushima effects in Germany? Changes in media coverage and public opinion on nuclear power. Public Understanding of Science, 25(7), 842–857. 10.1177/0963662515589276
- Bearth, A., Kwon, S., & Siegrist, M. (2020). Chemophobia and knowledge of toxicological principles in South‐Korea: Perceptions of trace chemicals in consumer products. Journal of Toxicology and Environmental Health, Part A, 89, 104144. 10.1080/15287394.2020.1851834
- Bearth, A., Saleh, R., & Siegrist, M. (2019). Lay‐people's knowledge about toxicology and its principles in eight European countries. Food and Chemical Toxicology, 131, 110560. 10.1016/j.fct.2019.06.007
- Bolsen, T., & Cook, F. L. (2008). Public opinion on energy policy: 1974–2006. Public Opinion Quarterly, 72, 364–388.
- Bouder, F. (2015). Risk communication of vaccines: Challenges in the post‐trust environment. Current Drug Safety, 10(1), 9–15. Retrieved from https://www.ingentaconnect.com/content/ben/cds/2015/00000010/00000001/art00005
- Breakwell, G. M. (2007). The psychology of risk. Cambridge, UK: Cambridge University Press.
- Brown, K. F., Kroll, J. S., Hudson, M. J., Ramsay, M., Green, J., Long, S. J., … Sevdalis, N. (2010). Factors underlying parental decisions about combination childhood vaccinations including MMR: A systematic review. Vaccine, 28(26), 4235–4248. 10.1016/j.vaccine.2010.04.052
- Busby, J. S., & Duckett, D. (2012). Social risk amplification as an attribution: The case of zoonotic disease outbreaks. Journal of Risk Research, 15(9), 1049–1074. 10.1080/13669877.2012.670130
- Busby, J. S., & Onggo, S. (2013). Managing the social amplification of risk: A simulation of interacting actors. Journal of the Operational Research Society, 64(5), 638–653. 10.1057/jors.2012.80
- Carpenter, S. (2020, January 11). As the costs of Germany's nuclear phase out mount, little appetite for a rethink. Forbes. Retrieved from https://www.forbes.com/sites/scottcarpenter/2020/01/11/costs-of-germanys-nuclear-phase-out-are-substantial-new-paper-finds-but-there-is-little-appetite-for-a-rethink/#5ab7be213ef7
- Chryssochoidis, G., Strada, A., & Krystallis, A. (2009). Public trust in institutions and information sources regarding risk management and communication: Towards integrating extant knowledge. Journal of Risk Research, 12(2), 137–185. 10.1080/13669870802637000
- Cobb, M. D. (2005). Framing effects on public opinion about nanotechnology. Science Communication, 27(2), 221–239. 10.1177/1075547005281473
- Comrie, E. L., Burns, C., Coulson, A. B., Quigley, J., & Quigley, K. F. (2019). Rationalising the use of Twitter by official organisations during risk events: Operationalising the social amplification of risk framework through causal loop diagrams. European Journal of Operational Research, 272(2), 792–801. 10.1016/j.ejor.2018.07.034
- Corner, A., Venables, D., Spence, A., Poortinga, W., Demski, C., & Pidgeon, N. (2011). Nuclear power, climate change and energy security: Exploring British public attitudes. Energy Policy, 39(9), 4823–4833. 10.1016/j.enpol.2011.06.037
- Cvetkovich, G., Siegrist, M., Murray, R., & Tragesser, S. (2002). New information and social trust: Asymmetry and perseverance of attributions about hazard managers. Risk Analysis, 22(2), 359–367. 10.1111/0272-4332.00030
- De Vocht, M., Cauberghe, V., Sas, B., & Uyttendaele, M. (2013). Analyzing consumers' reactions to news coverage of the 2011 Escherichia coli O104:H4 outbreak, using the extended parallel processing model. Journal of Food Protection, 76(3), 473–481. 10.4315/0362-028X.JFP-12-339
- Earle, T. C. (2010a). Distinguishing trust from confidence: Manageable difficulties, worth the effort. Risk Analysis, 30(7), 1025–1027. 10.1111/j.1539-6924.2010.01456.x
- Earle, T. C. (2010b). Trust in risk management: A model‐based review of empirical research. Risk Analysis, 30(4), 541–574. 10.1111/j.1539-6924.2010.01398.x
- Earle, T. C., & Cvetkovich, G. (1997). Culture, cosmopolitanism, and risk management. Risk Analysis, 17(1), 55–65. 10.1111/j.1539-6924.1997.tb00843.x
- Earle, T. C., Siegrist, M., & Gutscher, H. (2007). Trust, risk perception and the TCC model of cooperation. In H. Gutscher & T. C. Earle (Eds.), Trust in cooperative risk management: Uncertainty and scepticism in the public mind (pp. 1–50). London, UK: Earthscan.
- Eiser, J. R., Miles, S., & Frewer, L. J. (2002). Trust, perceived risk, and attitudes toward food technologies. Journal of Applied Social Psychology, 32(11), 2423–2433. 10.1111/j.1559-1816.2002.tb01871.x
- Ethik‐Kommission Sichere Energieversorgung [Ethics Committee on Secure Energy Supply]. (2011). Deutschlands Energiewende ‐ Ein Gemeinschaftswerk für die Zukunft [Germany's energy transition ‐ A joint effort for the future]. Berlin, Germany: Deutsche Bundesregierung.
- European Commission. (2017). Special Eurobarometer 461 “Designing Europe's future”. Luxembourg City, Luxembourg: Eurobarometer.
- European Food Safety Authority, Hart, A., Maxim, L., Siegrist, M., Von Goetz, N., da Cruz, C., … Hardy, A. (2019). Guidance on communication of uncertainty in scientific assessments. EFSA Journal, 17(1), 1–73. 10.2903/j.efsa.2019.5520
- Finucane, M. L., Alhakami, A., Slovic, P., & Johnson, S. M. (2000). The affect heuristic in judgments of risks and benefits. Journal of Behavioral Decision Making, 13(1), 1–17. 10.1002/(SICI)1099-0771(200001/03)13:1<1::AID-BDM333>3.0.CO;2-S
- Fischhoff, B., Slovic, P., Lichtenstein, S., Read, S., & Combs, B. (1978). How safe is safe enough? A psychometric study of attitudes towards technological risks and benefits. Policy Sciences, 9(2), 127–152. 10.1007/BF00143739
- Frewer, L. J. (2003). Trust, transparency, and social context: Implications for social amplification of risk. In N. Pidgeon, R. E. Kasperson, & P. Slovic (Eds.), The social amplification of risk (pp. 123–137). Cambridge, UK: Cambridge University Press.
- Frewer, L. J., Howard, C., Hedderley, D., & Shepherd, R. (1996). What determines trust in information about food‐related risks? Underlying psychological constructs. Risk Analysis, 16(4), 473–486. 10.1111/j.1539-6924.1996.tb01094.x
- Frewer, L. J., Miles, S., & Marsh, R. (2002). The media and genetically modified foods: Evidence in support of social amplification of risk. Risk Analysis, 22(4), 701–711. 10.1111/0272-4332.00062
- Frewer, L. J., Scholderer, J., & Bredahl, L. (2003). Communicating about the risks and benefits of genetically modified foods: The mediating role of trust. Risk Analysis, 23(6), 1117–1133. 10.1111/j.0272-4332.2003.00385.x
- Funk, C., Tyson, A., Kennedy, B., & Johnson, C. (2020). Science and scientists held in high esteem across global publics. Washington, DC: Pew Research Center.
- Goebel, J., Krekel, C., Tiefenbach, T., & Ziebarth, N. R. (2015). How natural disasters can affect environmental concerns, risk aversion, and even politics: Evidence from Fukushima and three European countries. Journal of Population Economics, 28(4), 1137–1180. 10.1007/s00148-015-0558-8
- Goldstein, J. S., Qvist, S. A., & Pinker, S. (2019, April 6). Nuclear power can save the world. The New York Times. Retrieved from https://www.nytimes.com/2019/04/06/opinion/sunday/climate‐change‐nuclear‐power.html
- Greenberg, M. (2009a). Energy sources, public policy, and public preferences: Analysis of US national and site‐specific data. Energy Policy, 37(8), 3242–3249. 10.1016/j.enpol.2009.04.020
- Greenberg, M. (2009b). How much do people who live near major nuclear facilities worry about those facilities? Analysis of national and site‐specific data. Journal of Environmental Planning and Management, 52(7), 919–937. 10.1080/09640560903181063
- Greenberg, M. (2009c). NIMBY, CLAMP, and the location of new nuclear‐related facilities: U.S. national and 11 site‐specific surveys. Risk Analysis, 29(9), 1242–1254. 10.1111/j.1539-6924.2009.01262.x
- Greenberg, M. (2014). Energy policy and research: The underappreciation of trust. Energy Research and Social Science, 1, 152–160. 10.1016/j.erss.2014.02.004
- Greenberg, M., Cox, A., Bier, V., Lambert, J., Lowrie, K., North, W., … Wu, F. (2020). Risk analysis: Celebrating the accomplishments and embracing ongoing challenges. Risk Analysis, 40(S1), 2113–2127. 10.1111/risa.13487
- Greenberg, M., & Truelove, H. B. (2011). Energy choices and risk beliefs: Is it just global warming and fear of a nuclear power plant accident? Risk Analysis, 31(5), 819–831. 10.1111/j.1539-6924.2010.01535.x
- Hansen, J., Holm, L., Frewer, L. J., Robinson, P., & Sandøe, P. (2003). Beyond the knowledge deficit: Recent research into lay and expert attitudes to food risks. Appetite, 41(2), 111–121. 10.1016/S0195-6663(03)00079-5
- Hargittai, E., & Walejko, G. (2008). The participation divide: Content creation and sharing in the digital age. Information, Communication & Society, 11(2), 239–256. 10.1080/13691180801946150
- Hermwille, L. (2016). The role of narratives in socio‐technical transitions‐Fukushima and the energy regimes of Japan, Germany, and the United Kingdom. Energy Research & Social Science, 11, 237–246. 10.1016/j.erss.2015.11.001
- Hodgetts, D., Chamberlain, K., Scammell, M., Karapu, R., & Waimarie Nikora, L. (2008). Constructing health news: Possibilities for a civic‐oriented journalism. Health, 12(1), 43–66. 10.1177/1363459307083697
- Horowitz, B. (2006). Creators, synthesizers, and consumers. Retrieved from https://blog.elatable.com/2006/02/creators‐synthesizers‐and‐consumers.html
- Hsu, Y. C., Chen, Y. L., Wei, H. N., Yang, Y. W., & Chen, Y. H. (2017). Risk and outbreak communication: Lessons from Taiwan's experiences in the post‐SARS era. Health Security, 15(2), 165–169. 10.1089/hs.2016.0111
- Irwin, A., & Wynne, B. (1996). Misunderstanding science? The public reconstruction of science and technology. Cambridge, UK: Cambridge University Press.
- Jackob, N., Schultz, T., Jakobs, I., Ziegele, M., Quiring, O., & Schemer, C. (2019). Medienvertrauen im Zeitalter der Polarisierung [Media trust in the age of polarization]. Media Perspektiven, 5, 210–220.
- Johnson, N. (2018, July 24). Next‐gen nuclear is coming–if society wants it. WIRED. Retrieved from https://www.wired.com/story/next‐gen‐nuclear/
- Jungermann, H., Pfister, H. R., & Fischer, K. (1996). Credibility, information preferences, and information interests. Risk Analysis, 16(2), 251–261.
- Kasperson, R. E. (2014). Four questions for risk communication. Journal of Risk Research, 17(10), 1233–1239. 10.1080/13669877.2014.900207
- Kasperson, R. E., Golding, D., & Tuler, S. (1992). Social distrust as a factor in siting hazardous facilities and communicating risks. Journal of Social Issues, 48(4), 161–187. 10.1111/j.1540-4560.1992.tb01950.x
- Kasperson, R. E., Renn, O., Slovic, P., Brown, H. S., Emel, J., Goble, R., … Ratick, S. J. (1988). The social amplification of risk: A conceptual framework. Risk Analysis, 8(2), 177–187.
- Kepplinger, H. M., & Lemke, R. (2016). Instrumentalizing Fukushima: Comparing media coverage of Fukushima in Germany, France, the United Kingdom, and Switzerland. Political Communication, 33(3), 351–373. 10.1080/10584609.2015.1022240
- Kharecha, P. A., & Sato, M. (2019). Implications of energy and CO2 emission changes in Japan and Germany after the Fukushima accident. Energy Policy, 132, 647–653. 10.1016/j.enpol.2019.05.057
- Kormann, C. (2019, December 22). Is nuclear power worth the risk? The New Yorker. Retrieved from https://www.newyorker.com/news/dispatch/is‐nuclear‐power‐worth‐the‐risk
- Lin, T. T. C., & Bautista, J. R. (2016). Predicting intention to take protective measures during haze: The roles of efficacy, threat, media trust, and affective attitude. Journal of Health Communication, 21(7), 790–799. 10.1080/10810730.2016.1157657
- Löfstedt, R. (2005). Risk management in post‐trust societies. London, UK: Earthscan.
- Lutzke, L., Drummond, C., Slovic, P., & Árvai, J. (2019). Priming critical thinking: Simple interventions limit the influence of fake news about climate change on Facebook. Global Environmental Change, 58, 101964. 10.1016/j.gloenvcha.2019.101964
- Malka, A., Krosnick, J. A., & Langer, G. (2009). The association of knowledge with concern about global warming: Trusted information sources shape public thinking. Risk Analysis, 29(5), 633–647. 10.1111/j.1539-6924.2009.01220.x
- Mase, A. S., Cho, H., & Prokopy, L. S. (2015). Enhancing the Social Amplification of Risk Framework (SARF) by exploring trust, the availability heuristic, and agricultural advisors' belief in climate change. Journal of Environmental Psychology, 41, 166–176. 10.1016/j.jenvp.2014.12.004
- Morgan, M. G., Fischhoff, B., Bostrom, A., & Atman, C. J. (2002). Risk communication: A mental models approach. Cambridge, UK: Cambridge University Press.
- NZZ. (2011, May 25). Die Schweiz steigt aus der Atomenergie aus [Switzerland withdraws from nuclear energy]. NZZ. Retrieved from https://www.nzz.ch/die_schweiz_baut_keine_atomkraftwerke_mehr‐1.10699185
- Park, D. J., Wang, W. R., & Pinto, J. (2016). Beyond disaster and risk: Post‐Fukushima nuclear news in US and German press. Communication Culture & Critique, 9(3), 417–437. 10.1111/cccr.12119
- Paté‐Cornell, E. (2012). On “black swans” and “perfect storms”: Risk analysis and management when statistics are not enough. Risk Analysis, 32(11), 1823–1833. 10.1111/j.1539-6924.2011.01787.x
- Petts, J., & Niemeyer, S. (2004). Health risk communication and amplification: Learning from the MMR vaccination controversy. Health, Risk & Society, 6(1), 7–23. 10.1080/13698570410001678284
- Pidgeon, N., Kasperson, R. E., & Slovic, P. (2002). The social amplification of risk. Cambridge, UK: Cambridge University Press.
- Pidgeon, N., Lorenzoni, I., & Poortinga, W. (2008). Climate change or nuclear power—No thanks! A quantitative study of public perceptions and risk framing in Britain. Global Environmental Change, 18(1), 69–85. 10.1016/j.gloenvcha.2007.09.005
- Pidgeon, N., Poortinga, W., & Walls, J. (2007). Scepticism, reliance and risk managing institutions: Towards a conceptual model of ‘critical trust’. In T. C. Earle, M. Siegrist, & H. Gutscher (Eds.), Trust in risk management: Uncertainty and scepticism in the public mind (pp. 117–142). London, UK: Earthscan.
- Prati, G., Pietrantoni, L., & Zani, B. (2011). A social‐cognitive model of pandemic influenza H1N1 risk perception and recommended behaviors in Italy. Risk Analysis, 31(4), 645–656. 10.1111/j.1539-6924.2010.01529.x
- Rainie, L., Keeter, S., & Page, D. (2019). Trust and distrust in America. Washington, DC: Pew Research Center.
- Rayner, S. (1988). Muddling through metaphors to maturity: A commentary on Kasperson et al., The social amplification of risk. Risk Analysis, 8(2), 201–204. 10.1111/j.1539-6924.1988.tb01172.x
- Rayner, S. (2004). The novelty trap: Why does institutional learning about new technologies seem so difficult? Industry and Higher Education, 18(6), 349–355. 10.5367/0000000042683601
- Rehner, R., & McCauley, D. (2016). Security, justice and the energy crossroads: Assessing the implications of the nuclear phase‐out in Germany. Energy Policy, 88, 289–298. 10.1016/j.enpol.2015.10.038
- Renn, O., & Marshall, J. P. (2016). Coal, nuclear and renewable energy policies in Germany: From the 1950s to the “Energiewende”. Energy Policy, 99, 224–232. 10.1016/j.enpol.2016.05.004
- Retzbach, A., & Maier, M. (2015). Communicating scientific uncertainty: Media effects on public engagement with science. Communication Research, 42(3), 429–456. 10.1177/0093650214534967
- Rip, A. (1988). Should social amplification of risk be counteracted? Risk Analysis, 8(2), 193–197.
- Rossmann, C., Meyer, L., & Schulz, P. J. (2018). The mediated amplification of a crisis: Communicating the A/H1N1 pandemic in press releases and press coverage in Europe. Risk Analysis, 38(2), 357–375. 10.1111/risa.12841
- Rudolf, M., Seidl, R., Moser, C., Krütli, P., & Stauffacher, M. (2014). Public preference of electricity options before and after Fukushima. Journal of Integrative Environmental Sciences, 11(1), 1–15. 10.1080/1943815x.2014.881887
- Saleh, R., Bearth, A., & Siegrist, M. (2019). Chemophobia today: Consumers’ knowledge and perceptions of chemicals. Risk Analysis, 39(12), 2668–2682. 10.1111/risa.13375
- Science Barometer. (2020). Science barometer 2020. Retrieved from https://www.wissenschaft‐im‐dialog.de
- Science Barometer Switzerland. (2020). WissensCHaftsbarometer Schweiz COVID‐19 edition [Science Barometer Switzerland COVID‐19 edition]. Retrieved from https://wissenschaftsbarometer.ch
- Siegrist, M. (2010). Trust and confidence: The difficulties in distinguishing the two concepts in research. Risk Analysis, 30(7), 1022–1024. 10.1111/j.1539-6924.2010.01454.x
- Siegrist, M. (2019). Trust and risk perception: A critical review of the literature. Risk Analysis, 41(3). 10.1111/risa.13325
- Siegrist, M., & Árvai, J. (2020). Risk perception: Reflections on 40 years of research. Risk Analysis, 40(S1), 2191–2206. 10.1111/risa.13599
- Siegrist, M., & Cvetkovich, G. (2000). Perception of hazards: The role of social trust and knowledge. Risk Analysis, 20(5), 713–720. 10.1111/0272-4332.205064
- Siegrist, M., Cvetkovich, G., & Roth, C. (2000). Salient value similarity, social trust, and risk/benefit perception. Risk Analysis, 20(3), 353–362. 10.1111/0272-4332.203034
- Siegrist, M., Earle, T. C., & Gutscher, H. (2003). Test of a trust and confidence model in the applied context of electromagnetic field (EMF) risks. Risk Analysis, 23(4), 705–716. 10.1111/1539-6924.00349
- Siegrist, M., Gutscher, H., & Earle, T. C. (2005). Perception of risk: The influence of general trust, and general confidence. Journal of Risk Research, 8(2), 145–156. 10.1080/1366987032000105315
- Siegrist, M., Luchsinger, L., & Bearth, A. (2021). The impact of trust and risk perception on the acceptance of measures to reduce COVID‐19 cases. Risk Analysis. 10.1111/risa.13675
- Siegrist, M., & Visschers, V. H. M. (2013). Acceptance of nuclear power: The Fukushima effect. Energy Policy, 59, 112–119. 10.1016/j.enpol.2012.07.051
- Siegrist, M., & Zingg, A. (2014). The role of public trust during pandemics: Implications for crisis communication. European Psychologist, 19(1), 23–32. 10.1027/1016-9040/a000169
- Slovic, P. (1987). Perception of risk. Science, 236(4799), 280–285. 10.1126/science.3563507
- Slovic, P. (1993). Perceived risk, trust, and democracy. Risk Analysis, 13(6), 675–682. 10.1111/j.1539-6924.1993.tb01329.x
- Spiegel International. (2011). Merkel gambles credibility with nuclear U‐turn. Der Spiegel. Retrieved from https://www.spiegel.de/international/germany/out-of-control-merkel-gambles-credibility-with-nuclear-u-turn-a-752163.html
- Teräväinen, T., Lehtonen, M., & Martiskainen, M. (2011). Climate change, energy security, and risk—debating nuclear new build in Finland, France and the UK. Energy Policy, 39(6), 3434–3442. 10.1016/j.enpol.2011.03.041
- Terpstra, T. (2011). Emotions, trust, and perceived risk: Affective and cognitive routes to flood preparedness behavior. Risk Analysis, 31(10), 1658–1675. 10.1111/j.1539-6924.2011.01616.x
- US National Science Board. (2018). Science & engineering indicators. Alexandria, VA: US National Science Board.
- Van de Walle, S., Van Roosbroek, S., & Bouckaert, G. (2008). Trust in the public sector: Is there any evidence for a long‐term decline? International Review of Administrative Sciences, 74(1), 47–64. 10.1177/0020852307085733
- van Mierlo, T. (2014). The 1% rule in four digital health social networks: An observational study. Journal of Medical Internet Research, 16(2), e33. 10.2196/jmir.2966
- Visschers, V. H. M., Keller, C., & Siegrist, M. (2011). Climate change benefits and energy supply benefits as determinants of acceptance of nuclear power stations: Investigating an explanatory model. Energy Policy, 39(6), 3621–3629. 10.1016/j.enpol.2011.03.064
- Visschers, V. H. M., & Siegrist, M. (2013). How a nuclear power plant accident influences acceptance of nuclear power: Results of a longitudinal study before and after the Fukushima disaster. Risk Analysis, 33(2), 333–347. 10.1111/j.1539-6924.2012.01861.x
- Wachinger, G., Renn, O., Begg, C., & Kuhlicke, C. (2013). The risk perception paradox—Implications for governance and communication of natural hazards. Risk Analysis, 33(6), 1049–1065. 10.1111/j.1539-6924.2012.01942.x
- Weber, F., Jenal, C., Rossmeier, A., & Kühne, O. (2017). Conflicts around Germany's Energiewende: Discourse patterns of citizens' initiatives. Quaestiones Geographicae, 36(4), 117–130. 10.1515/quageo-2017-0040
