2023 Sep 1;33(2):241–259. doi: 10.1177/09636625231191348

Does knowledge make a difference? Understanding how the lay public and experts assess the credibility of information on novel foods

Mengxue Ou, Shirley S Ho
PMCID: PMC11892078  PMID: 37655614

Abstract

Drawing on Metzger’s dual-processing model of credibility assessment, this study examines how individuals with varying topical knowledge (laypersons vs experts) assess the credibility of information on novel foods. Online focus group discussions reveal that both groups share similar motivations for assessing the credibility of information on novel foods (e.g. personal relevance and concerns about the impact of unverified information on others). However, they differ in the barriers they encounter during the assessment of information credibility. Both groups employ analytical (e.g. evaluating content quality) and intuitive methods (e.g. looking at source credibility) to assess the credibility of novel food-related information. However, they differ in the cues used for credibility assessment. Laypersons tend to rely on superficial heuristics (e.g. social endorsement cues or surface features), whereas experts rely more on content features and scientific knowledge to evaluate information credibility. Theoretical and practical implications are discussed.

Keywords: information credibility assessment, lay-expert comparison, novel foods


Novel foods, which are foods or ingredients with a limited history of consumption, have become increasingly prevalent in people’s diets (Hendrich, 2016). Processed foods containing ingredients produced using new technologies and production processes, such as genetically modified (GM) soybeans, GM sugar beets, and alternative proteins, have been available for years (US Food and Drug Administration, 2022). Despite their widespread consumption, people continue to exhibit pervasive skepticism toward novel foods (Pew Research Center, 2020b). Previous studies have identified numerous psychological and sociocultural factors that are related to the public’s acceptance or rejection of novel foods, such as individuals’ science or health literacy (e.g. Rodríguez-Entrena and Salazar-Ordóñez, 2013), moral concerns (Scott et al., 2018), social norms (Menozzi et al., 2017), public culture (Bates, 2005), and trust in relevant stakeholders such as governments, scientists, and corporations (Cook et al., 2006; Ho et al., 2023). However, the prevalence of social media, accompanied by the proliferation of misinformation, has presented new challenges to the public’s understanding of novel foods.

Misinformation about novel foods is particularly concerning as it can lead to people’s misconceptions and polarized attitudes toward novel food technologies. A recent report by Lynas et al. (2022) found that 9% of the 535 online articles about GM food contained misinformation, potentially reaching up to 256 million readers and distorting public perceptions of novel food technologies. Therefore, it is increasingly important to understand what factors contribute to people’s belief or disbelief in such (mis)information. As false beliefs often arise through the same mechanisms that establish credibility beliefs (Ecker et al., 2022), this study aims to investigate how and when individuals assess a piece of information related to novel foods as truthful or credible. By doing so, it intends to provide insights that can guide efforts to mitigate misinformation about novel foods.

Numerous studies have investigated how laypersons judge the credibility of information on various scientific issues (e.g. COVID-19, climate change, and GM technology), reflecting a list of source (e.g. source expertise), channel (e.g. webpage appearance), content (e.g. content quality), and receiver factors (e.g. personal knowledge) that influence laypersons’ credibility judgments of scientific information (e.g. Aharon et al., 2021; Scharrer et al., 2021). However, few studies have investigated how the credibility judgments of laypersons differ from those of experts, who possess greater knowledge in the science domain. In addition, limited research has explored the differences between these two groups in judging the credibility of information on novel food technologies, a complex field accompanied by controversies and misconceptions. Previous studies have found that nonexperts tend to rely on heuristic cues (e.g. affect, trust, and the natural-is-better heuristic) rather than engaging in elaborative processing when making judgments about novel food-related issues (Siegrist and Hartmann, 2020), while experts exhibit a higher tendency to inhibit the use of heuristics in their decision-making (Thoma et al., 2015). However, it remains unclear whether these judgmental differences persist between experts and nonexperts when evaluating the credibility of novel food-related information.

Given the existing research gaps, this study has two aims. First, it explores individuals’ motivations and strategies when assessing the credibility of information on novel food. We draw from Metzger’s (2007) dual-processing model of credibility assessment, which proposes that various motivations (e.g. accuracy motivations) and personal traits (e.g. topical knowledge) influence how people evaluate information credibility (e.g. through systematic or heuristic evaluation). Second, this study aims to compare novel food experts and laypersons in terms of their motivations and strategies for assessing the credibility of novel food information. Theoretically, this study will contribute to the refinement of Metzger’s model by providing a nuanced delineation of multiple motivations and strategies for information credibility assessment. Practically, this study will inform the development of science communication strategies that address misinformation by offering insights into how experts and laypersons evaluate information credibility. This will enable the development of targeted misinformation correction strategies tailored to audiences with different levels of scientific knowledge.

Literature review

Study context

This study focuses on the context of Singapore, a Southeast Asian country that relies heavily on food imports due to limited land and water resources for agricultural and pastoral development. To strengthen food security, Singapore is at the forefront of introducing novel food technologies and promoting public acceptance of them. In December 2020, the Singapore Food Agency approved the sale of lab-grown chicken products, making Singapore the first country to permit the sale of laboratory-grown meat (Tan and Yi, 2021). The Singapore government has also made significant investments in novel food technologies and innovations, with the aim of producing 30% of Singaporeans’ nutritional needs locally by 2030 (Singapore Food Agency, 2022).

Despite the growing innovations in novel food technologies in Singapore, some psychological and sociocultural elements, such as socioeconomic status (Barrena and Sánchez, 2013), cultural worldviews (Kahan et al., 2009), and moral concerns (e.g. Scott et al., 2018), continue to hinder the public’s understanding of novel foods and judgment about the credibility of related information (Chong et al., 2022). Furthermore, the complexity of novel food technology, combined with a lack of understanding among the general public, renders it susceptible to misinformation, resulting in false beliefs and misconceptions (Siegrist and Hartmann, 2020). Given that misinformation related to novel foods can contribute to public misunderstandings and rejection of these foods (e.g. Bode et al., 2021), it is crucial to explore how and when Singaporeans assess the credibility of information on novel foods.

Information credibility assessment

Recent work conceptualizes information credibility assessment as “an individual’s judgment of the veracity of the content of communication,” which emanates from individuals’ synthetic assessment of the source, message, and media of information (Appelman and Sundar, 2016: 5). Research on how people assess information credibility has provided numerous insights from various theoretical perspectives. Some scholars suggest that due to the cognitive resources required to assess the truthfulness of information, people may accept information rather than scrutinize it (Schwarz and Jalbert, 2021). Similarly, the prominence-interpretation theory of web credibility (Fogg, 2003) proposes that most people are likely to evaluate only parts of the information they notice, in line with Lang’s (2000) limited capacity model of information processing, which suggests that people selectively process information due to limited cognitive abilities. These theories acknowledge people’s selective and heuristic way of processing information and its impact on credibility judgments but do not explore situations in which people engage in systematic credibility assessment or provide a detailed understanding of the motivations underlying heuristic or systematic credibility assessment. In addition, there is limited research on whether the way people assess credibility (systematic vs heuristic) varies based on their knowledge. Thus, it remains unclear whether individuals with different levels of knowledge have varying motivations or strategies for assessing information credibility.

This study employs Metzger’s (2007) dual-processing model of credibility assessment to explore how and why individuals (i.e. experts and nonexperts) assess credibility when encountering information on novel foods. There are two reasons for using Metzger’s model. First, unlike other frameworks (e.g. the prominence-interpretation theory of web credibility) that assume individuals rely on superficial cues rather than systematic processing to evaluate information credibility, Metzger’s model argues that people can engage in either heuristic or systematic credibility assessment depending on their motivations and abilities. Thus, this model takes situational backgrounds and personal traits into account, making it suitable for understanding how individuals form credibility judgments based on various cues embedded in information (i.e. strategies) and when they will engage in different modes of credibility evaluation (i.e. motivations). Second, Metzger’s model views information credibility judgment as a conjoint function of individuals’ motivations, abilities, and evaluation modes, making it ideal for addressing our study objectives, i.e., comparing how people with different personal traits (i.e. varying knowledge levels) differ in terms of their motivations and strategies for assessing the credibility of information on novel foods.

Dual-processing model of information credibility assessment

Drawing on dual-process models such as Eagly and Chaiken’s (1993) heuristic-systematic model and Petty and Cacioppo’s (1986) elaboration likelihood model, Metzger (2007) developed the dual-processing model of credibility assessment. This model proposes that situational and individual factors, such as the need for accurate information (i.e. accuracy motivation) or relevant topical knowledge, influence the way individuals evaluate the credibility of information. In particular, the model proposes three phases of information credibility evaluation.

The first phase is the exposure phase. In this phase, individuals’ motivations and abilities determine the strategies they use to evaluate information credibility. Abilities include factors such as topical knowledge and information skills, while motivation was originally conceptualized as an “accuracy goal,” that is, the individual’s need to find accurate information. However, recent research on the multi-motive model of information processing (Winter et al., 2016) has identified multiple types of motivations for information processing, distinguishing between accuracy motivation, impression motivation (i.e. the desire to make a positive impression on others), and defense motivation (i.e. the need to defend personal values). As different motivations can influence which cues individuals use to make credibility judgments, this study revisits Metzger’s model and explores the specific motivations underlying individuals’ credibility assessments of novel food information, thereby expanding the model’s account of motivations for credibility evaluation.

The second phase proposed by Metzger’s model is the evaluation phase. During this phase, individuals with different motivations and abilities engage in either heuristic or systematic evaluation of credibility, or avoid evaluation altogether. When individuals desire accurate information and possess sufficient knowledge to evaluate it, they are more likely to attend to the information and exert cognitive effort to evaluate its content systematically (i.e. systematic evaluation). Conversely, when individuals have low motivation, they tend to evaluate the information more heuristically and intuitively by focusing on superficial cues, such as the design of the interface, or to avoid evaluation altogether. Moreover, if individuals are highly motivated but lack sufficient ability to assess the truthfulness of information, they are more likely to engage in heuristic rather than systematic evaluation. In the third phase (i.e. the judgment phase), individuals form credibility judgments based on the evaluation strategies and cues they used.
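The conditional structure of the evaluation phase described above can be sketched as a small decision function. This is an illustrative simplification only, not part of Metzger’s formal model; the function name and mode labels are ours:

```python
def select_evaluation_mode(motivated: bool, able: bool) -> str:
    """Illustrative reading of the evaluation phase in Metzger's (2007)
    dual-processing model of credibility assessment.

    `motivated`: whether the individual wants accurate information.
    `able`: whether the individual has sufficient knowledge/skills
    to evaluate the information's content.
    """
    if not motivated:
        # Low motivation: rely on superficial cues, or skip evaluation.
        return "heuristic_or_none"
    if able:
        # High motivation and sufficient ability: effortful,
        # systematic scrutiny of the content.
        return "systematic"
    # High motivation but insufficient ability: fall back on heuristics.
    return "heuristic"
```

The sketch makes explicit that, in this model, ability only matters once motivation is present: an unmotivated individual defaults to heuristic processing or no evaluation regardless of expertise.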

Although Metzger’s model distinguishes between heuristic and systematic approaches to evaluating information credibility, it does not identify the specific cues individuals tend to use when assessing information credibility systematically or heuristically. Existing research has identified several informational elements individuals rely on when assessing credibility, such as source cues (e.g. source expertise or congruency; Chung et al., 2012), content cues (e.g. content quality and message valence; Henke et al., 2020), and channel cues (e.g. media interactivity and modality; Greussing, 2020). However, Metzger’s model lacks a nuanced delineation of the specific types of cues that individuals with different motivations tend to use for credibility judgment. Therefore, this study will refine this model by providing clarity on the specific cues or strategies employed by individuals (i.e. laypersons and experts) when assessing the credibility of information on novel foods.

The expert-lay discrepancies in information credibility assessment

Extensive literature suggests that people with different knowledge levels (e.g. experts and laypersons) perceive and evaluate novel food technologies, including nano-enabled foods (e.g. Ho et al., 2011) and GM foods (Sjöberg, 2008), differently. Laypersons without specific knowledge about novel foods often rely on affect, trust, or naturalness heuristics to assess novel food-related information (Siegrist and Hartmann, 2020). In contrast, scientists evaluate scientific information in a more systematic manner, considering criteria such as content, presentation, technical aspects, and context (Fähnrich et al., 2023). Moreover, studies have revealed that individuals with varying topical knowledge exhibit different socioscientific argumentation patterns (e.g. Sadler and Fowler, 2006; Yang and Anderson, 2003): those with more knowledge of genetic engineering tend to provide higher-quality arguments on the topic than those with less knowledge.

The disparities in laypersons’ and experts’ responses to novel food can also be attributed to variations in their scientific literacy (i.e. individuals’ actual knowledge about science; Thomas and Durant, 1987) or health literacy (i.e. individuals’ ability to access, comprehend, and utilize information for personal health-related choices; Centers for Disease Control and Prevention, 2020; Health Literacy Tool Shed, 2023). Research indicates that individuals with a higher level of scientific literacy tend to display greater acceptance of GM foods (Ceccoli and Hixon, 2012). Furthermore, individuals with higher health literacy are more inclined to consume novel foods derived from insects (Lorini et al., 2021). In addition, the literature on information credibility judgment has also documented the role of knowledge in explaining expert-lay discrepancies in credibility assessment. For example, Lucassen et al. (2013) found that the level of domain knowledge influences the types of cues people use for credibility judgment. Those with high domain knowledge focus more on semantic features (e.g. neutrality), while those with low domain knowledge tend to rely more on surface features (e.g. website appearance) for credibility judgments.

Understanding the discrepancies between laypersons and experts in how and when they assess novel food-related information is crucial, as it would help in developing effective strategies to reduce people’s credulity toward relevant misinformation. However, current studies have not yet fully explored the disparities or similarities in credibility judgments between laypersons and experts. In addition, while Metzger’s model offers a valuable theoretical framework for understanding when and how individuals make credibility judgments, it does not account for whether individuals with varying levels of topical knowledge (i.e. laypersons and experts) possess different motivations and use different cues for assessing information credibility. Hence, this study investigates expert-lay discrepancies in motivations and strategies for assessing the credibility of novel food information. We pose the following research questions:

  • RQ1: How do the lay public and experts concur or differ in terms of their motivations for assessing the credibility of novel food-related information?

  • RQ2: How do the lay public and experts concur or differ in terms of their strategies used for assessing the credibility of novel food-related information?

Method

Data collection

To obtain insights from experts and the lay public regarding their motivations and strategies for assessing novel food-related information, this study conducted four online focus group discussions involving 40 participants (three groups with the lay public and one with novel food experts) between February and April 2022. We adopted the focus group approach for two reasons. First, this method can generate insights from group interactions, encouraging deeper and more natural self-disclosure than one-to-one interviews (Tracy, 2019). Second, our study aims to investigate how different groups (i.e. laypersons and experts) respond to the same task of assessing the credibility of information on novel foods, and how they share their experiences and thoughts on this topic; focus groups, which facilitate information exchange and collective ideation, were therefore an appropriate method for our research objectives. Due to social gathering restrictions during the COVID-19 pandemic in Singapore, the discussions were conducted online via the videoconferencing platform Zoom. Online focus group discussions are also more flexible and feasible for participants with geographical constraints, time restrictions, or anonymity concerns (Tracy, 2019).

Ethics statement

This study was approved by the Institutional Review Board (IRB) of Nanyang Technological University in Singapore (IRB number: IRB-2021-998). Prior to participating in the study, all participants had to read through the consent form and provide their consent.

Sample and recruitment

We recruited 40 participants for the focus group discussions. All participants were Singapore citizens or Permanent Residents aged 21 years and above who were comfortable speaking English and capable of using Zoom. Each online focus group discussion lasted approximately 2 hours and comprised 9 to 11 participants. Upon completion, we compensated each lay public participant with S$50 and each novel food expert with S$100.

Three focus group discussions were conducted with members of the lay public (N = 29) who lacked expertise in novel food technology. Lay participants were recruited through convenience and snowball sampling, which helped us reach specific populations (e.g. the elderly) willing to participate in our study (Goodman, 1961). To capture diverse insights across age and gender groups, participants were recruited from three age groups, and each focus group had a balanced gender composition: one focus group was conducted with Millennials, another with Generation X, and a third with Baby Boomers (Pew Research Center, 2020a). We classified participants by generation because people of different generations tend to differ in how they assess information credibility (Choi, 2020; Liao and Fu, 2014); for instance, older adults are more likely than younger adults to accept web information without assessing its credibility (Liao and Fu, 2014). Grouping participants by age also helped ensure that they could express their opinions freely and actively (Ho et al., 2018). In addition, one focus group discussion was conducted with 11 experts in novel food technologies, recruited through publicly available contact information (e.g. institutional email and telephone). The expert participants comprised 4 scientists from local research institutes and 7 faculty members at the rank of assistant professor or above from universities in Singapore. Table 1 shows the demographic characteristics of the focus group participants.

Table 1.

Details of online focus group participants.

Focus group | Participants (no. and gender) | Age | Generation
Laypersons, FGD1 | 10 (5 males, 5 females) | 21–38 (M = 25.90; SD = 4.95) | Millennial
Laypersons, FGD2 | 10 (5 males, 5 females) | 43–56 (M = 50.50; SD = 5.25) | Generation X
Laypersons, FGD3 | 9 (4 males, 5 females) | 58–73 (M = 63.67; SD = 5.12) | Baby Boomer
Experts, FGD4 | 11 (7 males, 4 females) | 29–73 (M = 47.64; SD = 12.03) |

FGD: focus group discussions; SD: standard deviation.

Moderation and guide

To ensure participants could comfortably express their opinions, all the online sessions were conducted in English, the lingua franca of Singapore. A moderator from the research team led each focus group session with the help of assistant moderators. The three sessions with the lay public were moderated by a doctoral student well trained in social science research, and the session with the experts was moderated by a social science faculty member with rich experience in moderating focus group discussions. A semistructured moderator’s guide consisting of a list of questions and prompts was developed based on the aims of this study. The moderator first introduced participants to the definition and examples of novel foods by sharing a slide; prior to the focus groups, this definition had been reviewed and revised several times based on suggestions from professors in the field of novel foods. The moderator then asked participants about their experiences of encountering novel food information whose accuracy they were unsure of, the cues they used to determine the accuracy of such information, and what prompted them to evaluate or verify it. To maximize discussion and interaction among participants, the moderators used several strategies, including (a) prompting participants to elaborate on taken-for-granted ideas, (b) encouraging them to discuss inconsistencies between their ideas, and (c) following up on nonverbal reactions (e.g. nodding or head shaking).

Coding and analysis

The online focus group sessions were video- and audio-recorded and transcribed verbatim. All identifiable information was removed from the final transcripts to ensure participants’ confidentiality. Nonverbal features and reactions, such as nods, head shakes, group smiles, and laughter, were also transcribed and documented. Each participant’s name was replaced with an alphanumeric code; for example, L1 P1 denotes participant 1 from the first lay public focus group session, while E1 P1 denotes participant 1 from the expert focus group session. Two coders underwent rigorous training to ensure accurate and consistent coding and analysis of the data.

To ensure rigor and trustworthiness, the data analysis followed the interpretive paradigm of qualitative research and was conducted to meet the trustworthiness criteria of qualitative data analysis, that is, credibility, transferability, dependability, and confirmability (Chen and Haley, 2014; Lincoln and Guba, 1985). A hybrid coding approach combining inductive and deductive coding was used to analyze the data (Tracy, 2019). First, inductive coding was employed to mitigate potential preconceptions about the data. Two coders engaged in an open-coding process, in which they independently read and coded all the transcripts line by line and generated initial codes (e.g. “most others’ reviews” or “reputable scientists”). In this process, the two coders made constant comparisons to generate codes grounded in the data and modified code definitions to reflect the data (Glaser et al., 1968). Second, the two coders engaged in an axial-coding stage (i.e. deductive categorization), in which they categorized the initial codes into second-level hierarchical codes (e.g. “evaluating content quality” or “looking at endorsement cues”); the extant literature on information credibility judgment guided this grouping and the definition of the second-level codes. Finally, the researchers and coders discussed the second-level hierarchical codes and identified larger themes that answered the study’s research questions. Codes were identified based on the frequency, specificity, extensiveness, and similarity of participants’ mentions (Krueger and Casey, 2000), as well as their relevance to the literature and research questions.

Results

The first section of the results discusses the similarities and differences in the motivations underlying laypersons’ and experts’ assessment of the credibility of information on novel foods, answering RQ1. The second section discusses major convergences and differences in the approaches employed by laypersons and experts to assess the credibility of novel food-related information, answering RQ2.

Motivations for information credibility assessment

The data analysis revealed four shared reasons that motivated both laypersons and experts to assess the credibility of information on novel foods: the presumed negative influence of unverified information on others, personal relevance, personal interest, and suspicious feelings. In addition, the results unveiled the different barriers that hinder the assessment of information credibility for experts and laypersons (see Table 2).

Table 2.

The similarities and differences in participants’ motivations for assessing the credibility of novel food information.

RQ1: How do the lay public and experts concur or differ in terms of their motivations for assessing the credibility of novel food information?

Similarities (laypersons, N = 29, and experts, N = 11):

  • Presumed negative influence of unverified information on others: when unverified information on novel foods causes harm to others
  • Personal relevance: when there is a desire to try novel foods, or when novel foods have negative health implications
  • Personal interest: when there is an interest in novel foods or novel food-related information
  • Suspicious feelings: when a claim about novel foods is suspicious

Differences (barriers):

  • Laypersons: difficulty understanding the technical jargon in novel food information
  • Experts: lack of time to evaluate information credibility due to research obligations

RQ: research question.

Similarities between laypersons and experts

Presumed negative influence of unverified information on others

Both laypersons and expert participants expressed concerns regarding the potential threat that unverified information could pose to others. They acknowledged the importance of assessing the credibility of novel food-related information to address the harmful effects of misinformation on others. L3 P4 emphasized the need to evaluate the truthfulness of information before sharing it with others. He believed that the spread of unverified novel food information causes problems for others, stating:

“There is fake news floating around, and I always tell my friends, ‘Please verify. Don’t just forward it, and check the source.’ As a lot of fake news is circulating, whether it is about novel foods or anything around us, it will cause many problems. I think we all should practice verifying information.”

L3 P9 also agreed that sharing unverified information can create significant harm. Therefore, he argued that it should be a self-obligation to verify information before sharing it with others, saying:

“Basically, we have to be responsible for our actions. Whether it’s sharing novel food-related misinformation or anything else with others, it’s just very harmful. It’s like gossip, you know. Some information, like gossip, can create a lot of problems.”

Similarly, expert participants, such as E1 P8 and E1 P11, also expressed concerns about the dissemination of unverified novel food information. They agreed that if a claim had the potential to harm others, they would conduct further investigation. For instance, E1 P8 stated:

“If I’m interested and if it could harm others, even if I’m not the one affected, then I’ll probably investigate further. If the claim is harmful but relatively benign, and it seems like a marketing ploy, I probably won’t delve that much deeper.”

In general, comments from both the laypersons and experts emphasized the importance of evaluating the truthfulness of novel food-related information before sharing it, as unverified information could potentially cause harm to others.

Personal relevance

Both laypersons and novel food experts expressed that they would assess the credibility of novel food-related information if they perceived it to be personally relevant. Such relevance could arise when there is a need (e.g. a job requirement) to consume novel foods, or when consuming novel foods could affect their own health. For instance, L1 P2 from the lay public group stated, “Besides the fact that if it [i.e. novel food] affects my own direct consumption, I won’t have any other reason to verify it.” L1 P6 agreed and added that he would likely evaluate novel food-related information to ensure the safety of the foods before consuming them. Similarly, E1 P11 from the expert group shared the same view, stating that she would further evaluate the truthfulness of novel food information if it affected her. Overall, both the lay public and experts described common situations in which they weighed the personal relevance of information when deciding whether to evaluate its credibility.

Personal interest

The third common motivation for evaluating the credibility of novel food-related information among both laypersons and experts is personal interest. Both groups mentioned that their evaluation of novel food-related information depends on their level of interest in the food. For instance, L1 P3 from the lay public group stated that he would evaluate the credibility of novel food-related information based on how interesting the food is to him:

“I think it depends on how interesting the food is. If it’s about a super delicious burger or insects that are known to provide unorthodox vitamin benefits, I would verify it. Due to the interest factor, I would delve deeper and verify whether it’s true or not.”

In addition, participant E1 P11, an expert in the field of food science and technology from academia, also emphasized this point, mentioning:

“I would probably do a little bit of extra work to find out [the truthfulness of the information], if [I am] interested.”

Feeling suspicious

Both laypersons and expert participants mentioned that they typically do not evaluate the truthfulness of information unless it contains doubtful or exaggerated claims. L1 P4 mentioned that if the information contains suspicious and exaggerated claims about the special benefits of a novel food, she would verify it because of her skepticism. She stated, “For example, if they are saying ‘Hey, these types of food can cure cancer,’ which have significant health impacts, either positive or negative, it is something that I would doubt and pay more attention to.” Expert participants, such as E1 P11, shared a similar sentiment, stating that if some claims made in the information sound “too good to be true” and “suspicious” based on their knowledge, they are more likely to spend more time checking it out.

Differences between laypersons and experts

Barriers to information credibility assessment

While laypersons and experts shared similar motivations for assessing the credibility of information on novel foods, they differed substantially in the barriers they faced. Although some experts recognized the need to evaluate information because of its potential harm to others, others were less willing to do so because of time constraints. For instance, E1 P9, a researcher in the field of alternative proteins and cultured meat, explained that he would not dedicate time to verifying novel food-related information:

“If I’m a journalist who has a stake in uncovering all these big stories, then I will definitely spend time trying to gather more information to verify if it’s true because it is my reputation and job. But as a scientist, I think I’ll have to spend more time on my own work and experiments instead of pursuing all this.”

The lay public had different reasons for not assessing the truthfulness of information on novel foods. For example, some participants (e.g. L2 P5 and L2 P7) stated that they had difficulty understanding the technical jargon used in novel food-related information. Verifying such information would require a substantial amount of time and effort that they were unwilling to invest, so they would give up on assessing its credibility. For instance, L2 P5 explained:

“I wouldn’t verify that much because sometimes when you read information about GMO or genetically modified food, what does the ‘O’ stand for? I cannot understand. So, all of this technical jargon makes it difficult for me to evaluate the information.”

He further added:

“If I wanted to purposely understand the technical jargon and verify it across sources, it would require a lot of work.”

Strategies for assessing information on novel food

Both laypersons and experts mentioned that they would use content cues (e.g. content quality) and source cues (e.g. source credibility) to assess the credibility of novel food-related information, blending heuristic and systematic evaluation. However, the two groups differed in the specific cues they used. Laypersons mentioned more heuristic cues (e.g. endorsement cues or surface features), resulting in a judgment process that is largely heuristic in nature, whereas experts tended to engage in a more systematic and analytical assessment, including evaluating content objectivity and its consistency with scientific consensus (see Table 3).

Table 3.

The similarities and differences in participants’ strategies used for assessing the credibility of novel food information (RQ2: How do the lay public and experts concur or differ in terms of their strategies used for assessing the credibility of novel food information?).

Similarities (laypersons, N = 29, and experts, N = 11)

Source credibility
• Checking whether the information is from professional individual sources and reliable institutional sources

Content quality
• Evaluating the plausibility or logical coherence of the argument
• Checking the presence of supporting evidence for the argument

Differences: cues used by laypersons (N = 29)

Endorsement cues
• Assessing whether the information has been widely reported across multiple channels
• Considering the opinions or reviews of the majority

Surface features/appearance of information
• Evaluating the appearance of information channels (e.g. the legitimacy of websites)
• Looking for logos or labels indicating endorsement by authorities

Navigability
• Checking if the websites include clickable links within the information to facilitate cross-checking of information

Differences: cues used by experts (N = 11)

Content objectivity
• Evaluating whether the information contains bluffing or exaggerated claims

Consistency with scientific consensus
• Assessing whether the content aligns with current scientific understanding and knowledge

The presence of commercial content
• Checking the existence of marketing claims in the information

RQ: research question.

Similarities between laypersons and experts

Source credibility

Source credibility was one of the most prominent cues for assessing the credibility of novel food-related information, according to both laypersons and experts. Both groups mentioned that information from two types of sources would be more trustworthy: professional individual sources and reliable institutional sources. Specifically, both laypersons and experts concurred that if an institution “has a big name,” they would consider information from that institution credible. For instance, both expert and lay participants mentioned that they would evaluate the credibility of novel food-related information by checking whether it comes from agencies with a good reputation, such as the Singapore government (e.g. Singapore Food Agency; L1 P2 and L1 P4), reputable media agencies (e.g. The Straits Times and the New York Times; L3 P9), or well-regarded academic institutions (e.g. reliable journals; E1 P2). In addition, they would also trust information from individuals with recognized expertise and reputation in the field of novel foods, such as medical professionals (e.g. L2 P2) or renowned scientists (e.g. L2 P5 and E1 P2).

Content quality

In addition to relying on source heuristics, results from the focus group discussions revealed that both laypersons and experts would also engage in analytical evaluation of content quality when assessing the credibility of information on novel foods. Their evaluation of content quality revolved around two key aspects: the plausibility of the arguments and the presence of evidence or data supporting them. For example, L1 P4 stated that information that is logically unreasonable or implausible would not be deemed trustworthy, adding that “If it sounds incredulous or hard to believe, it would probably raise some doubts.” Similarly, E1 P11 concurred that she would not trust information containing suspicious statements. In addition to evaluating the reasonableness and logic of the argument, other participants (e.g. E1 P10 and L2 P5) emphasized the importance of evidence or data as an indicator of information credibility. They would also consider who provided the data as well as its accuracy. L2 P5 mentioned, “Some of us prefer to listen to friends or people we trust, but others may look at the data. I mean, people like me will definitely look at the data.”

Differences between laypersons and experts

Cues used by the lay public

The results of the focus group discussions reflected three types of heuristic cues that the majority of the lay public would rely on when assessing the credibility of information on novel foods: the number of sources (endorsement cues), surface features or appearance of information, and navigability. However, these three heuristics were not mentioned by the experts.

Number of sources (endorsement cues)

Many laypersons expressed that they would rely on endorsement heuristics to assess the credibility of information on novel foods, whereas the experts did not mention this type of heuristic. Endorsement heuristics suggest that the more people are talking about a piece of information, the more likely it is to be true (Lee and Shin, 2021). For instance, L1 P4 stated that she would trust information on novel foods if it is “widely reported across various channels.” She referred to this criterion for information credibility judgment as “cross-referencing,” indicating that the more channels refer to the same information, the more trustworthy it would be. Similarly, another participant (L2 P5) mentioned that he assesses the truthfulness of novel food information and makes purchasing decisions based on reviews from others, stating that “Normally, I will check the reviews. It’s the most effective way. For example, if I want to find a product on Lazada and I see hundreds of people’s reviews, then the product is good.”

Surface features of information

Most lay participants mentioned that they also rely on surface features or the appearance of information to assess the credibility of novel food-related information. For example, L1 P3 from the millennials group stated that he would consider the legitimacy of a website’s appearance when judging the credibility of its content, arguing that the more legitimate the website looks, the more trustworthy the information is likely to be. Similarly, L2 P3 from the Generation X group mentioned that he would trust information that includes an authority tag. He explained, “If there’s a logo or some kind of tag below the content, I’ll trust it because it shows that there is authority behind it.” L3 P3 from the baby boomers group also highlighted the importance of labels, such as food labels indicating the origin of novel food products, in assessing the trustworthiness of novel foods and related information.

Navigability

Participants (e.g. L2 P1 and L3 P1) from the lay public groups also mentioned that they would rely on the presence of hyperlinks on websites to determine the legitimacy of information about novel foods. Websites that allow users to click on embedded links and navigate to other websites were seen as more reliable. For instance, participant L2 P1 mentioned that if a website introduces novel foods, such as cultured meat, and claims that such foods are approved by the government, it should provide relevant links that allow consumers to cross-check and verify the government’s approval.

Strategies used by novel food experts

The findings of the focus group discussions revealed three strategies that experts utilize to evaluate the credibility of information on novel foods, which were not mentioned by the lay public. These strategies include evaluating the objectivity of the content, validating whether the information is consistent with scientific consensus, and identifying the presence of commercial content.

Content objectivity

The expert participants reached a general consensus that they would distrust information on novel foods that makes exaggerated, bluffing statements, as such information is deemed “too good to be true.” For instance, E1 P3, an assistant professor in the area of novel foods from a local university in Singapore, stated, “I think the main ‘red flag’ is when they make too many claims; [it often indicates that] it’s too good to be true.” Another participant, E1 P11, a senior lecturer in the field of food science and technology from a local university, also agreed that if she encountered information that sounded like “bluffing” and “too good to be true,” she would feel suspicious and not trust it.

Consistency with scientific consensus

Unlike the lay public, who lack the relevant knowledge to aid them in assessing the credibility of novel food information, experts commonly mentioned that they would rely on scientific consensus to judge the credibility of novel food-related information. For instance, E1 P11, a senior lecturer in the field of food science and technology, indicated that she would use her background knowledge to determine the truthfulness of information related to novel foods, as she could easily identify suspicious content as a scientist in this field. Another participant, E1 P7, a researcher in the field of alternative proteins and cultured meat, added that he would evaluate information to ensure its consistency with scientific understanding. While the use of scientific knowledge or consensus was an important strategy for experts to ascertain the truthfulness of information on novel foods, it was not mentioned in the focus group discussions conducted with the lay public.

Presence of commercial content

The experts generally regarded the presence of commercial content (e.g. advertising) in novel food information as an important indicator of bias, and such marketing claims would decrease their trust in the information. Participant E1 P2 mentioned, “So now, with all the online e-commerce, many of these novel food claims are actually very misleading, and some of them are not true. That’s where I have trust issues.” Another participant, E1 P4, also expressed concerns about marketing claims and stated that he does not trust commercial information related to novel foods. He commented, “When I look at the source of information, one of my first criteria is to ask myself, ‘Is this another marketing ploy?’ If it is, then I seriously doubt the validity of the information.” Overall, the presence of commercial information, such as marketing claims or advertising, served as an important cue for experts to judge information credibility. However, reliance on the presence of commercial content was not observed in the sessions with the lay public.

Discussion

Despite the increasing consumption of novel foods over the past two decades, research suggests that people are not adequately prepared to assess the credibility of information related to these “disruptive food technologies.” This lack of preparedness increases the likelihood of the public falling for misinformation concerning novel foods (Siegrist and Hartmann, 2020). By conducting focus group discussions with both laypersons and experts, this study contributes to the literature on information credibility assessment. It not only uncovers the motivations underlying laypersons’ and experts’ information credibility assessment but also provides insights into how these two groups evaluate controversial scientific issues by taking into account various informational cues, such as content, source, and channel cues.

RQ1 compared laypersons’ and experts’ motivations for assessing the credibility of novel food information. Results revealed that their motivations did not differ substantially: both groups mentioned that they would assess the credibility of novel food-related information if they perceived it to be detrimental to others, relevant to themselves, interesting, or suspicious. Interestingly, the reasons for not assessing credibility differed between the two groups. The lay public, lacking sufficient knowledge of novel foods, would give up assessing credibility to save time and effort, while some experts did not consider credibility evaluation relevant to their main obligations as scientists. This study contributes to Metzger’s model by specifying that a lack of relevant knowledge may lead individuals to avoid credibility evaluation. Given that previous studies have primarily focused on the motivations eliciting credibility assessment behavior (e.g. Waruwu et al., 2021), while few have shed light on why individuals avoid evaluating information credibility, future studies could examine the psychological or sociocultural mechanisms underlying such avoidance behavior.

This study also contributes to the body of knowledge on information credibility assessment by elucidating the cues that individuals rely on when making credibility judgments about scientific information. Prior research suggests that individuals typically rely on heuristic cues, such as superficial characteristics, to make credibility judgments (e.g. Metzger et al., 2010). In contrast, our findings demonstrate that both laypersons and experts adopt a combined systematic and heuristic approach to evaluating information credibility, relying both on superficial cues (e.g. source credibility) and on analytical assessment of content quality. However, it should be noted that the results from the focus group discussions may not entirely capture individuals’ actual cognitive processes and behaviors because of the social-desirability bias inherent in self-report research. To overcome this limitation, future research could integrate alternative methods, such as eye-tracking, to obtain more objective documentation of individuals’ information evaluation processes.

This study highlights that laypersons and experts rely on a combination of superficial cues (e.g. source credibility) and content quality to make credibility judgments but differ in the specific cues used. In particular, the lay public tends to rely more on simple decision rules/heuristics (e.g. majority opinion, surface features, and navigability), while experts employ a more systematic approach (i.e. assessing content objectivity and comparing claims with scientific consensus). These differences may be attributed to the two groups’ varying knowledge levels, which aligns with previous research indicating that individuals with different knowledge levels rely on different cues to evaluate credibility (e.g. Aharon et al., 2021). The differences also align with the semantic-surface-source model (Lucassen et al., 2013), which suggests that people with more domain expertise are more likely to leverage semantic features (e.g. content completeness and neutrality) to evaluate information analytically, while those without topical knowledge tend to rely more on superficial features than semantic ones to form credibility perceptions.

Theoretical and practical implications

The present study represents one of the first attempts to apply Metzger’s model to explore how and why laypersons and experts assess the credibility of information on novel foods. While existing models of information credibility assessment, including Metzger’s model, primarily focus on accuracy motivations (i.e. the need for precise and adequate information), this exploratory study used focus group discussions to uncover additional motivations, such as the perceived negative influence of unverified information on others and personal relevance or interest, that underlie people’s evaluation of information credibility. By revealing multiple motivations for credibility assessment, this study refines Metzger’s model by expanding the range of motivations considered. In addition, this study identified the multifaceted yet distinct cues used by the lay public and experts in assessing the credibility of novel food information, contributing to an inventory of cues and strategies that people employ during credibility assessment and supplementing Metzger’s model. Finally, this study also revealed situations in which both laypersons and experts refrain from making credibility evaluations, highlighting the need for further investigation to explain this phenomenon.

Practically, given the prevalence of misinformation on novel food technologies, it is crucial to identify when and how people evaluate information related to these foods. This can help inform the development of effective corrective messages combating novel food-related misinformation and reduce people’s susceptibility to it. As many participants in our study expressed concern about the potential negative impact of novel food misinformation on others, strategies for mitigating the spread of such misinformation could highlight the detrimental effects of fake news on society, thus motivating people to assess information credibility before sharing it. Moreover, as most lay people face knowledge barriers when evaluating the credibility of information on novel foods, scientific communication should reduce the use of technical language to eliminate these barriers and promote a greater understanding of novel food technologies. In addition, our study found that the lay public relies more on heuristic cues than systematic processing when assessing the credibility of information on novel foods, making them more susceptible to misinformation (Ecker et al., 2022). Thus, media literacy training programs should focus on cultivating analytic thinking skills to help people better discern misinformation on novel food technologies.

Limitations and future research directions

This study acknowledges several limitations. First, our conclusions are based solely on qualitative data obtained from individuals in a Southeast Asian country. It is challenging to generalize these findings beyond the particular context of Singapore. Future studies should replicate this study in diverse settings outside of Asia. Second, the qualitative approach we employed is inherently subjective, as it relies on self-reported data and the interpretations of researchers and coders. Therefore, future research should aim to integrate both qualitative and quantitative methods to achieve a more comprehensive, objective, and nuanced understanding of individuals’ information credibility assessment processes. By incorporating other methodologies such as eye-tracking, Q methodology (McKeown and Thomas, 2013), and experiments, future studies can mitigate the influence of social-desirability bias inherent in self-reported data collection. This mixed-methods approach would yield valuable insights into how individuals actually interact with novel food information, contributing to a deeper understanding of credibility assessment. Third, while our study explored how individuals with varying levels of knowledge assess the credibility of information about novel foods by relying on various informational cues, it neglects other personal (e.g. health or science literacy) or sociocultural factors (e.g. cultural worldviews and norms) that may influence people’s evaluation of scientific information. Future research should broaden its scope to investigate how additional personal and sociocultural elements may affect the evaluation of scientific information credibility.

Acknowledgments

We thank Dr. Benjamin Smith and Dr. Kelvin Ng who provided insightful suggestions and comments that greatly improved the moderator’s guide for the focus group discussion.

Author biographies

Mengxue Ou (M.A., Nanyang Technological University, Singapore) is currently a Ph.D. candidate in the Wee Kim Wee School of Communication and Information at Nanyang Technological University, Singapore. Her research focuses on science and health communication, with a particular interest in investigating how individuals process misinformation in the context of emerging controversial scientific issues.

Shirley S. Ho (PhD, University of Wisconsin–Madison, USA) is President’s Chair Professor in Communication Studies in the Wee Kim Wee School of Communication and Information at Nanyang Technological University (NTU), Singapore. She is concurrently the Associate Vice President (Humanities, Social Sciences, & Research Communication) in the President’s Office at NTU. Her research focuses on science communication, in which she investigates cross-cultural public opinion dynamics related to science and technology, with potential health or environmental impacts. She is an elected fellow of the International Communication Association.

Footnotes

Funding: The author(s) disclosed receipt of the following financial support for the research, authorship, and/or publication of this article: This study was supported by the Singapore Ministry of Education Academic Research Fund Tier 1 Grant [Grant number: RT16/20].

References

  1. Aharon AA, Ruban A, Dubovi I. (2021) Knowledge and information credibility evaluation strategies regarding COVID-19: A cross-sectional study. Nursing Outlook 69(1): 22–31. [DOI] [PMC free article] [PubMed] [Google Scholar]
  2. Appelman A, Sundar SS. (2016) Measuring message credibility: Construction and validation of an exclusive scale. Journalism & Mass Communication Quarterly 93(1): 59–79. [Google Scholar]
  3. Barrena R, Sánchez M. (2013) Neophobia, personal consumer values and novel food acceptance. Food Quality and Preference 27(1): 72–84. [Google Scholar]
  4. Bates BR. (2005) Public culture and public understanding of genetics: A focus group study. Public Understanding of Science 14(1): 47–65. [DOI] [PubMed] [Google Scholar]
  5. Bode L, Vraga EK, Tully M. (2021) Correcting misperceptions about genetically modified food on social media: Examining the impact of experts, social media heuristics, and the gateway belief model. Science Communication 43(2): 225–251. [Google Scholar]
  6. Ceccoli S, Hixon W. (2012) Explaining attitudes toward genetically modified foods in the European Union. International Political Science Review 33(3): 301–319. [Google Scholar]
  7. Chen H, Haley E. (2014) Product placement in social games: Consumer experiences in China. Journal of Advertising 43(3): 286–295. [Google Scholar]
  8. Centers for Disease Control and Prevention (2020) What is health literacy? Available at: https://www.cdc.gov/healthliteracy/learn/index.html (accessed 7 August 2023).
  9. Choi W. (2020) Older adults’ credibility assessment of online health information: An exploratory study using an extended typology of web credibility. Journal of the Association for Information Science and Technology 71(11): 1295–1307. [Google Scholar]
  10. Chong M, Leung AKY, Lua V. (2022) A cross-country investigation of social image motivation and acceptance of lab-grown meat in Singapore and the United States. Appetite 173: 105990. [DOI] [PubMed] [Google Scholar]
  11. Chung CJ, Nam Y, Stefanone MA. (2012) Exploring online news credibility: The relative influence of traditional and technological factors. Journal of Computer-Mediated Communication 17(2): 171–186. [Google Scholar]
  12. Cook G, Robbins PT, Pieri E. (2006) “Words of mass destruction”: British newspaper coverage of the genetically modified food debate, expert and non-expert reactions. Public Understanding of Science 15(1): 5–29. [Google Scholar]
  13. Eagly AH, Chaiken S. (1993) The Psychology of Attitudes. New York, NY: Harcourt brace Jovanovich College Publishers. [Google Scholar]
  14. Ecker UK, Lewandowsky S, Cook J, Schmid P, Fazio LK, Brashier N, et al. (2022) The psychological drivers of misinformation belief and its resistance to correction. Nature Reviews Psychology 1(1): 13–29. [Google Scholar]
  15. Fähnrich B, Weitkamp E, Kupper JF. (2023) Exploring “quality” in science communication online: Expert thoughts on how to assess and promote science communication quality in digital media contexts. Public Understanding of Science 32(5): 605–621. [DOI] [PMC free article] [PubMed] [Google Scholar]
  16. Fogg BJ. (2003) Prominence-interpretation theory: Explaining how people assess credibility online. In: Proceedings of CHI’03, extended abstracts on human factors in computing systems, Fort Lauderdale, FL, 5–10 April, pp. 712–723. New York, NY: ACM. [Google Scholar]
  17. Glaser BG, Strauss AL, Strutzel E. (1968) The discovery of grounded theory; strategies for qualitative research. Nursing Research 17(4): 364. [Google Scholar]
  18. Goodman LA. (1961) Snowball sampling. Annals of Mathematical Statistics 32: 148–70. [Google Scholar]
  19. Greussing E. (2020) Powered by immersion? Examining effects of 360-degree photography on knowledge acquisition and perceived message credibility of climate change news. Environmental Communication 14(3): 316–331. [Google Scholar]
  20. Health Literacy Tool Shed (2023) About health literacy. Available at: https://healthliteracy.bu.edu/about (assessed 5 July 2023).
  21. Hendrich S. (2016) Novel foods. Available at: https://www.sciencedirect.com/topics/food-science/novel-food (assessed 7 August 2022).
  22. Henke J, Leissner L, Möhring W. (2020) How can journalists promote news credibility? Effects of evidences on trust and credibility. Journalism Practice 14(3): 299–318. [Google Scholar]
  23. Ho SS, Looi J, Chuah AS, Leong AD, Pang N. (2018) “I can live with nuclear energy if. . .”: Exploring public perceptions of nuclear energy in Singapore. Energy Policy 120: 436–447. [Google Scholar]
  24. Ho SS, Scheufele DA, Corley EA. (2011) Value predispositions, mass media, and attitudes toward nanotechnology: The interplay of public and experts. Science Communication 33(2): 167–200.
  25. Ho SS, Ou M, Vijayan AV. (2023) Halal or not? Exploring Muslim perceptions of cultured meat in Singapore. Frontiers in Sustainable Food Systems 7: 1127164.
  26. Kahan DM, Braman D, Slovic P, Gastil J, Cohen G. (2009) Cultural cognition of the risks and benefits of nanotechnology. Nature Nanotechnology 4(2): 87–90.
  27. Krueger RA, Casey MA. (2000) Focus Groups: A Practical Guide for Applied Research. London: Sage.
  28. Lang A. (2000) The limited capacity model of mediated message processing. Journal of Communication 50(1): 46–70.
  29. Lee EJ, Shin SY. (2021) Mediated misinformation: Questions answered, more questions to ask. American Behavioral Scientist 65(2): 259–276.
  30. Liao QV, Fu WT. (2014) Age differences in credibility judgments of online health information. ACM Transactions on Computer-Human Interaction (TOCHI) 21(1): 1–23.
  31. Lincoln YS, Guba EG. (1985) Naturalistic Inquiry. Thousand Oaks, CA: Sage.
  32. Lorini C, Ricotta L, Vettori V, Del Riccio M, Biamonte MA, Bonaccorsi G. (2021) Insights into the predictors of attitude toward entomophagy: The potential role of health literacy: A cross-sectional study conducted in a sample of students of the University of Florence. International Journal of Environmental Research and Public Health 18(10): 5306.
  33. Lucassen T, Muilwijk R, Noordzij ML, Schraagen JM. (2013) Topic familiarity and information skills in online credibility evaluation. Journal of the American Society for Information Science and Technology 64(2): 254–264.
  34. Lynas M, Adams J, Conrow J. (2022) Misinformation in the media: Global coverage of GMOs 2019–2021. GM Crops & Food. Epub ahead of print 17 November. DOI: 10.1080/21645698.2022.2140568.
  35. McKeown B, Thomas DB. (2013) Q Methodology. Thousand Oaks, CA: Sage.
  36. Menozzi D, Sogari G, Veneziani M, Simoni E, Mora C. (2017) Eating novel foods: An application of the theory of planned behaviour to predict the consumption of an insect-based product. Food Quality and Preference 59: 27–34.
  37. Metzger MJ. (2007) Making sense of credibility on the web: Models for evaluating online information and recommendations for future research. Journal of the American Society for Information Science and Technology 58(13): 2078–2091.
  38. Metzger MJ, Flanagin AJ, Medders RB. (2010) Social and heuristic approaches to credibility evaluation online. Journal of Communication 60(3): 413–439.
  39. Petty RE, Cacioppo JT. (1986) Communication and Persuasion. London: Springer.
  40. Pew Research Center (2020a) On the cusp of adulthood and facing an uncertain future: What we know about Gen Z so far. Available at: https://www.pewresearch.org/social-trends/2020/05/14/on-the-cusp-of-adulthood-and-facing-an-uncertain-future-what-we-know-about-gen-z-so-far-2/ (accessed 16 March 2023).
  41. Pew Research Center (2020b) Many publics around world doubt safety of genetically modified foods. Available at: https://www.pewresearch.org/fact-tank/2020/11/11/many-publics-around-world-doubt-safety-of-genetically-modified-foods/ (accessed 23 July 2022).
  42. Rodríguez-Entrena M, Salazar-Ordóñez M. (2013) Influence of scientific–technical literacy on consumers’ behavioural intentions regarding new food. Appetite 60: 193–202.
  43. Sadler TD, Fowler SR. (2006) A threshold model of content knowledge transfer for socioscientific argumentation. Science Education 90(6): 986–1004.
  44. Scharrer L, Bromme R, Stadtler M. (2021) Information easiness affects non-experts’ evaluation of scientific claims about which they hold prior beliefs. Frontiers in Psychology 12: 678313.
  45. Schwarz N, Jalbert M. (2021) The Psychology of Fake News: Accepting, Sharing, and Correcting Misinformation. New York, NY: Routledge.
  46. Scott SE, Inbar Y, Wirz CD, Brossard D, Rozin P. (2018) An overview of attitudes toward genetically engineered food. Annual Review of Nutrition 38: 459–479.
  47. Siegrist M, Hartmann C. (2020) Consumer acceptance of novel food technologies. Nature Food 1(6): 343–350.
  48. Singapore Food Agency (2022) A sustainable food system for Singapore and beyond. Available at: https://www.sfa.gov.sg/food-for-thought/article/detail/a-sustainable-food-system-for-singapore-and-beyond#:~:text=For%20greater%20food%20resilience%2C%20we,%2C%20and%20resource%2Defficient%20way (accessed 8 July 2022).
  49. Sjöberg L. (2008) Genetically modified food in the eyes of the public and experts. Risk Management 10(3): 168–193.
  50. Tan C, Yi TH. (2021) More cell-cultured chicken products approved for sale in Singapore. The Straits Times. Available at: https://www.straitstimes.com/singapore/more-cultured-chicken-products-approved-for-sale-in-singapore (accessed 7 August 2022).
  51. Thoma V, White E, Panigrahi A, Strowger V, Anderson I. (2015) Good thinking or gut feeling? Cognitive reflection and intuition in traders, bankers and financial non-experts. PLoS ONE 10(4): e0123202.
  52. Thomas G, Durant J. (1987) Why should we promote the public understanding of science? Scientific Literacy Papers 1: 1–14.
  53. Tracy SJ. (2019) Qualitative Research Methods: Collecting Evidence, Crafting Analysis, Communicating Impact. London: John Wiley & Sons.
  54. US Food and Drug Administration (2022) Agricultural biotechnology. Available at: https://www.fda.gov/food/consumers/agricultural-biotechnology (accessed 7 August 2023).
  55. Waruwu BK, Tandoc EC Jr, Duffy A, Kim N, Ling R. (2021) Telling lies together? Sharing news as a form of social authentication. New Media & Society 23(9): 2516–2533.
  56. Winter S, Metzger MJ, Flanagin AJ. (2016) Selective use of news cues: A multiple-motive perspective on information selection in social media environments. Journal of Communication 66(4): 669–693.
  57. Yang FY, Anderson OR. (2003) Senior high school students’ preference and reasoning modes about nuclear energy use. International Journal of Science Education 25: 221–244.

Articles from Public Understanding of Science (Bristol, England) are provided here courtesy of SAGE Publications