PLOS One. 2024 Jul 25;19(7):e0307771. doi: 10.1371/journal.pone.0307771

Navigating persuasive strategies in online health misinformation: An interview study with older adults on misinformation management

Wei Peng 1,*, Jingbo Meng 2, Barikisu Issaka 3
Editor: Nicola Diviani
PMCID: PMC11271879  PMID: 39052635

Abstract

Online health misinformation commonly includes persuasive strategies that can easily deceive lay people. Yet, it is not well understood how individuals respond to misinformation with persuasive strategies at the moment of exposure. This study aims to address the research gap by exploring how and why older adults fall into the persuasive trap of online health misinformation and how they manage their encounters with it. Using a think-aloud protocol, semi-structured interviews were conducted with twenty-nine older adults who were exposed to articles employing twelve groups of common persuasive strategies found in online health misinformation. Thematic analysis of the transcripts revealed that some participants fell for the persuasive strategies, yet the same strategies served others as cues to identify misinformation. Based on the participants’ own words, informational and individual factors, as well as the interplay of these factors, were identified as contributors to susceptibility to misinformation. Participants’ strategies for managing misinformation for themselves and others were categorized. Implications of the findings are discussed.

Introduction

Online health misinformation is health-related information disseminated on the Internet that is false, inaccurate, misleading, biased, or incomplete and that contradicts the consensus of the scientific community based on the best available evidence [1]. Some researchers distinguish misinformation from disinformation by intentionality: misinformation is unintentionally false information, whereas disinformation is false information spread deliberately to achieve a certain goal [2]. However, many researchers use misinformation as an umbrella term for all false and inaccurate information lacking scientific evidence, because it is hard to know whether such information was created and spread intentionally [3]. We adopt the second approach to define misinformation in the current study. Online health misinformation carries serious social and public health implications, including links to vaccine hesitancy, hesitancy toward cancer treatment and screening, and distrust in science, medicine, and the medical and research communities [4, 5].

Although health misinformation can be traced back to the time of hunting and gathering societies [6], online health misinformation, especially on social media, poses unique challenges because its use of persuasive strategies makes it more difficult to identify and address [1]. It also garners more attention, in terms of views and shares, than accurate public health information [7], making it vital to understand how individuals process these strategies, how and why they succumb to them, and how they react to and manage such misinformation. Such an in-depth understanding can inform future misinformation mitigation and debunking efforts.

In the past decade, older adults have grown significantly more reliant on technology and the Internet for health information, according to the US National Health Interview Survey [8]. However, older adults have generally been found to be more vulnerable to online misinformation, with users over 50 responsible for 80% of fake news shares [9]. A review identifies three characteristics underlying older adults’ vulnerability to online misinformation: cognitive decline, social changes, and lack of digital literacy [10]. These characteristics may put older adults at risk of falling for the persuasive strategies of online misinformation. For example, due to cognitive decline, older adults may rely more on heuristic cues than on careful reasoning when evaluating information. Regarding social changes, given that interpersonal trust grows with age [11, 12], older adults may tend to believe vivid stories told by another individual. Therefore, this study focuses on the older adult population.

Existing research on persuasive strategies in online misinformation primarily uses content analysis [13–16] or experiments that examine the effects of these strategies [17, 18]. In-depth qualitative research, especially at the moment of engagement with misinformation, is limited [19]. Through participants’ narratives of their own thought processes, we can gain a better understanding of their reactions as well as how and why they succumb to misinformation. Additionally, information management insights from individuals who recognize these persuasive strategies can inform future digital health literacy programs.

This study aims to bridge this gap by conducting interviews to answer the following research questions (RQs): How do older adults respond to misinformation crafted with persuasive strategies (RQ1)? How and why do they fall for such misinformation (RQ2)? And how do they engage in effective misinformation management (RQ3)?

Literature review

Persuasive strategies in misinformation

Using language to influence others’ beliefs, attitudes, or behaviors is referred to as a persuasive strategy [20]. Creators of health-related misinformation employ persuasive techniques akin to those used in prosocial health efforts [21]. A systematic review summarized 12 groups of persuasive strategies in online health misinformation [1].

The first group of persuasive strategies in online health misinformation is fabricating narrative with details: presenting vivid stories that highlight particulars to create online health misinformation [1]. Second, misinformation uses anecdotes and personal experiences to spread false information [22]. The third group is distrusting government sources or pharmaceutical companies: implying that certain health issues and regulations are being promoted by the government and pharmaceutical companies with the intention of making money [23]. The fourth group is politicizing health issues. This includes invoking values and ideologies such as freedom, religion, and politicians, and capitalizing on people’s propensity to favor members of their own group and be prejudiced against those of other groups [13]. The fifth group highlights the uncertainty and risk associated with medical actions and uses ambiguous and tentative wording to lead audiences to cast doubt on medical actions [15]. The sixth group undermines science by attacking its innate limitations, such as overthrowing early discoveries with more recent ones [1]. The seventh group inappropriately uses scientific evidence or cites evidence out of context [14]. The eighth group comprises rhetorical tricks that generate a false balance, exaggerate some information, and selectively omit other information [18]. The ninth group employs biased reasoning to draw conclusions; users of this tactic invoke logical fallacies, misleading linkages, or combinations of unconnected pieces of knowledge, among other strategies, to advance and circulate misinformation [24]. The tenth group appeals to people’s emotions rather than factual information to persuade them [25]. The eleventh group includes distinctive linguistic markers, such as uppercase words and first-person pronouns, among others [15]. Lastly, online health misinformation attempts to establish legitimacy through a number of well-established tactics, including citing seemingly credible sources or using surface credibility markers to take advantage of users’ propensity to process information heuristically [26].

These persuasive strategies were systematically identified from 58 studies. Although prior research has examined the effects of particular persuasive strategies on people’s intentions to trust or share misinformation [17, 18], these studies were experimental and lacked an in-depth account of how people process the strategies. Thus, RQ1 explores how older adults respond to and process the twelve groups of persuasive strategies embedded in online health misinformation.

Why people fall for misinformation

The persuasive strategies above are summaries of message characteristics that complicate individuals’ ability to identify misinformation, making them more susceptible to it [1]. While susceptibility to misinformation is present in all individuals, some are more prone to accept it, share it on social media, and fall victim to it [27, 28]. Psychological traits also predict individuals’ susceptibility to health misinformation, including knowledge/skills, cognitive styles, and predispositions [27]. One widely held belief, sometimes referred to as the “deficit theory,” holds that people who accept false information may lack the necessary knowledge or literacy to discern the truth [28]. For instance, prior research [29] found that higher education levels improved recognition of online misinformation among older adults. Similarly, individuals trained in spotting false information are more adept at spotting false health claims online [30].

Compatibility with prior beliefs is another factor that makes individuals susceptible to online misinformation. More specifically, confirmation bias refers to the tendency to believe evidence that confirms rather than disconfirms people’s existing knowledge, beliefs, and experiences [31]. People tend to accept information aligning with their views. When a new piece of information consistent with existing knowledge is accepted, it is very difficult to change, and this resistance to change increases with the size of the congruent knowledge base [32]. Identity-motivated reasoning is a special case of belief compatibility that supports false information [33]. Information that threatens one’s identity is evaluated less favorably, which makes people more susceptible to misinformation [34].

The “illusory truth effect” also explains why people believe false information [35]. Fluency, or the ease of processing information [36], often resulting from repeated exposure, may contribute to the illusory truth effect. This effect may explain the acceptance of COVID-19 conspiracy theories despite the lack of any convincing proof, as claims presented multiple times seem more credible [37].

These prior studies identify psychological drivers of susceptibility to health misinformation. However, most of them are quantitative and lack a qualitative account of how individuals succumb to misinformation at the moment of encountering it [19, 29], especially in their own words explaining why they believe the misinformation and how they assess its credibility. Thus, RQ2 explores why and how older adults fall for online health misinformation crafted with persuasive strategies.

Misinformation management in the age of social media

In the age of rampant online misinformation, recognizing and managing false information is challenging for people of all ages [38]. Typically, individuals engage in a two-step authentication process: internal and external [39]. In the first stage, people rely on the self (their own wisdom, instinct, and insight), the source, and the message (the intrinsic tone and characteristics of the information) as their three primary bases of authentication. If, after the first stage, the person is still not persuaded that the information is accurate, he or she moves on to the second stage, which entails external acts of authentication, such as deliberately or incidentally drawing on interpersonal (friends and family) and institutional resources.

Responses to misinformation vary. Some people proactively counteract it by commenting to let others know it is false, messaging the poster to caution them, or reporting false posts so they are taken down, while others hesitate due to social concerns [38]. Fear of being attacked and the desire to maintain social relationships hinder proactive management of misinformation [40]. For example, people have to consider face-saving and choose appropriate communication channels to discuss misinformation so that others do not feel their self-images are threatened [41]. In addition, they weigh factors such as issue importance, relationship closeness [42], the other person’s personality [43], and the impact the misinformation has on others to decide whether and how to manage the situation [40]. Specifically for older adults, while some research found that older adults are less likely to correct COVID-19 misinformation from others [44], recent research found that older adults are more likely than their younger counterparts to challenge misinformation on social media [40]. In the current study, RQ3 explores best practices for managing online health misinformation among older adults.

Method

Participants

The results reported here were part of a Zoom-based semi-structured interview study conducted during the COVID-19 pandemic in 2021. Recruitment started on August 20, 2021 and ended on October 20, 2021. Only results relevant to the research questions are reported in this paper; other findings were reported elsewhere [45]. Participants were recruited via a local community subject pool and an online participant-sourcing platform, CloudResearch. A screening questionnaire was distributed to identify eligible participants who were at least 50 years old, English-speaking, and able to use Zoom. Twenty-nine older adults participated: 20 (69%) were female and 9 (31%) were male. We intentionally oversampled participants from racial minority groups and with lower education levels to obtain a diverse sample. The screening questionnaire also measured general health, how frequently participants think about health issues (e.g., diet, vaccines, food safety, memory loss, physical activity, cancer prevention), need for cognition, and self-efficacy in online information search. An invitation to sign up for the interview study was sent to the 58 people who completed the screening questionnaire, and 29 completed the interview. Table 1 presents the participants’ detailed demographic information.

Table 1. Description of participants.

Participant Sex Year of Birth Race Education Income
1 Male 1960 White Master’s degree $100,000–$149,999
2 Female 1971 White 4 year bachelor’s degree $80,000–$89,999
3 Female 1957 White Master’s degree $80,000–$89,999
4 Male 1965 White Some college but no degree $90,000–$99,999
5 Male 1951 White 4 year bachelor’s degree $100,000–$149,999
6 Male 1962 White High school diploma or equivalent including GED $80,000–$89,999
7 Female 1955 White High school diploma or equivalent including GED $80,000–$89,999
8 Female 1958 White Master’s degree $100,000–$149,999
9 Female 1953 White Some college but no degree $10,000–$19,999
10 Female 1963 White Some college but no degree $60,000–$69,999
11 Female 1960 White High school diploma or equivalent including GED $20,000–$29,999
12 Female 1961 White Some college but no degree
13 Female 1945 White Some college but no degree Not provided
14 Female 1962 White Master’s degree $90,000–$99,999
15 Female 1948 Black/African American Master’s degree $50,000–$59,999
16 Female 1966 Black/African American Some college but no degree $20,000–$29,999
17* Male 1964 White Doctoral degree More than $150,000
18 Female 1956 White Master’s degree $20,000–$29,999
19 Male 1958 Black/African American 2 year associate degree $20,000–$29,999
20 Female 1948 White 2 year associate degree $10,000–$19,999
21 Female 1965 White 4 year bachelor’s degree $90,000–$99,999
22 Male 1955 White Master’s degree $30,000–$39,999
23 Female 1962 Black/African American 4 year bachelor’s degree $100,000–$149,999
24 Male 1960 White 4 year bachelor’s degree $90,000–$99,999
25 Male 1962 White Professional degree (JD, MD) More than $150,000
26 Female 1966 White Master’s degree $100,000–$149,999
27 Female 1950 White 4 year bachelor’s degree $80,000–$89,999
28 Female 1959 White 2 year associate degree $20,000–$29,999
29 Female 1970 Black/African American 2 year associate degree $20,000–$29,999

Note.

* Participant identified as Spanish, Hispanic, or Latino.

Data collection and procedure

The study was approved by the Institutional Review Board of Michigan State University. Before the interview, participants completed a short online survey to schedule their interview session. A consent form was provided before the online survey started, and participants clicked the “next” button to indicate their consent to participate in this study and to start the survey. At the beginning of the interview session, the interviewer went over the consent form with the participants and obtained verbal consent once more; no written consent was obtained, and recording started only after verbal consent was given and documented by the interviewer. Each interview lasted an average of 62 minutes. Participants received $15 in cash or an Amazon gift card. Participants were first asked about their perception of health-related misinformation. Then, they were exposed to six online articles displayed in random order. These articles were written based on online content and incorporated the 12 groups of common persuasive strategies identified in online health misinformation. Four articles were misinformation (i.e., cashews can treat depression, sunscreen causes cancer, radiation-contaminated pet food, and masking is not necessary because COVID-19 is just a flu), one was factual information (i.e., the long-term effectiveness of the COVID-19 vaccine), and one was uncertain information (i.e., the association between caffeine intake and Alzheimer’s disease). For example, in the article about radiation-contaminated pet food, the persuasive strategies of emotional appeals and distinctive linguistics were used in the sentence “BEWARE that the food you purchase for your fur friends can kill them!” S1 Appendix includes annotations of the persuasive strategies in the four misinformation pieces. The uncertain and factual pieces were mixed in to ensure that participants, knowing the study was about misinformation perception, would not mindlessly treat all presented information as misinformation. The results of this study are based on the participants’ discussion of the four misinformation articles. S2 Appendix includes the six pieces of information presented to the participants. Except for the COVID-19-related messages, participants mainly indicated that they had little prior knowledge of the message topics.

The participants were asked to think aloud and vocalize their thoughts as they read [46]. After they read each piece of information and verbalized their thoughts, the interviewer asked them a set of questions based on the interview guide, e.g., “what specific elements of the information made you think that it is believable?”, “what actions would you take after reading this article/post?”.

Data analysis

The audio recordings from the interviews were transcribed verbatim. Transcripts were inductively analyzed using NVivo 12 software. The study employed a five-stage process of thematic analysis [47]: becoming familiar with the transcribed interview data, coding, outlining initial themes, reviewing and revising themes, and defining final themes. The first and second authors and two graduate students coded the transcripts. Each transcript was coded independently by a primary and a secondary coder, and any discrepancies were resolved through discussion. The nodes were iteratively compared and revised during the coding process. Only nodes and themes relevant to the research questions of this study are reported, to focus on the most relevant results while ensuring conciseness; findings related to information processing and evaluation were reported elsewhere [45]. Finally, the first author reviewed, revised, and finalized the themes in NVivo 12.

Results

How older adults react to persuasive strategies in online health misinformation (RQ1)

This section illustrates how older adults reacted to the twelve groups of persuasive strategies within the misinformation pieces. Participants’ own words are used to show how the same persuasive strategies could alert some individuals to potential misinformation while tricking others into believing false information.

Narrative with details

Some individuals were swayed by the realistic stories presented in the misinformation, while others used its vivid details to test its accuracy. For instance, in reaction to the misinformation about using a Geiger counter inherited from a physics professor to check radioactive pet food imported from East Asia after the Fukushima nuclear disaster, P29 found the article credible due to the ‘fact’ that “they put the meter next to the cat food, and it was three times fold”. In contrast, the same detail triggered suspicion for P9, who explained: “the Geiger counter thing, and putting in their can of cat food, I mean there was the reliability of an old device, like that sounds like it’s quite old.” P8 said: “the inherited Geiger counter is just not, you know, doesn’t get it with me……well yeah, I think that there are people who would think that “oh my gosh this person measured it with a Geiger counter,” you know, “that must be you know it’s a machine and it works and it said this.”

Anecdotes and personal evidence

Anecdotes and personal evidence were red flags for those adept at spotting misinformation. When reading a Facebook post by a woman blaming sunscreen containing paraben for causing her breast cancer, P13 responded: “making a really strong statement about the skincare industry based on her personal experience.” P14 had a similar reaction: “it’s this person’s talking about her personal experience.” Conversely, P29 fully believed this misinformation, justifying it by saying “it’s all personal experience and I identify with her.”

Distrusting government and the medical establishment

Some older adults used the common persuasive strategies of distrusting government and the medical establishment as heuristic cues to spot misinformation, while others fell for them. Reacting to the story that cashews are equivalent to Prozac, a secret that “big pharma” has been trying to hide for monetary reasons, P5 admitted that the article made the strongest impression on him and convinced him because “the lie parts stuck out to me. Big pharma, and big lie, that hit home.” The same phrase, on the other hand, triggered skepticism for P1 while thinking aloud in the middle of reading the article: “‘Yet another secret that big pharma will never admit’, now, my skepticism has just gone up probably threefold……When I got to the part about that so this is a big secret–secret that big pharma is trying to hide, that’s a trope I’ve heard before. And quite often proves to be just a way to reach people’s emotions.”

Politicizing health issues

Politicizing health issues exploits individuals’ existing moral values, beliefs, and identity by feeding misinformation that aligns with these values to trick them. For instance, regarding the tweet that mandatory masking is a violation of individuals’ freedom and rights, P21 identified with this view: “I am concerned about freedom……so I guess that kind of resonates with me about personal freedoms.” P27 refuted: “It’s using your love of church and freedom to try and promote being reckless with other people’s health, in my opinion.” For the radioactive pet food imported from Asia, P26 pointed out: “I think there’s so many biases and stuff. I mean people who might have some bias against the different Asian countries might be like, ‘Oh, that makes sense.’ You know, I could see that mindset kind of playing a little bit.” Some participants were particularly cognizant of the use of politics in health: “As soon as I read democratic governor, I thought, oh God, here we go. It was going to be political and they were politicizing it. So I mean the whole thing just rubbed me the wrong way.” [P10]

Highlighting uncertainty and risk

Highlighting uncertainty and risk exploits people’s fear of the unknown to create an emotional state of anxiety. Some participants chose to err on the side of caution, trusting the misinformation rather than dismissing it. Regarding the radioactive pet food, P15 stated: “if they are sending this type of food, providing food makers are using this kind of food, how dangerous it is for pets as well as for humans. That’s what I’m afraid of and then it talks about contaminating the entire Pacific Ocean. I think that’s really dangerous, you know.” P26, on the other hand, considered this highlighted risk ‘alarmist’ and correctly identified the piece as misinformation.

Emotional appeal

Misinformation can evoke emotions like anxiety, fear, anger, and sadness. While some older adults were emotionally affected by these persuasive strategies, others rationally assessed the emotional appeals. For instance, regarding the misinformation that sunscreen causes breast cancer, P21 expressed sympathy: “I feel bad for that person because they, you know, it’s an unfortunate situation for them.” On the other hand, P18 was alert to the emotional appeal strategy: “It seems like an emotional play and somebody that needs an answer to why something happened to them specifically.” Similarly, P1 quickly classified the post about radiation-contaminated pet food as misinformation: “the food you purchased for your ‘fur friends’ can kill them, that seems like a very emotional appeal so that is a red flag.”

Exploiting science’s limitations

Misinformation often exploits science’s inherent limitations, portraying science as unstable, lacking longitudinal data, or limited by small sample sizes. For instance, the misinformation claiming that paraben in sunscreen causes breast cancer clicked with P12: “We all know scientific data is delayed and who knows when they may release new data, one day, retracting the previous finding. Where you hear something and then maybe six months down the road, they’re saying, ‘that’s not necessarily true anymore’”.

Inappropriate use of scientific evidence

Health misinformation typically incorporates scientific evidence but applies it inappropriately to support false claims. Many participants were tricked by the sheer presence of scientific evidence, which established legitimacy, and only very few were able to see through it. For example, P24 identified the inappropriate use of scientific evidence while reading the misinformation that eating cashews is equivalent to taking Prozac: “Even though cashews probably have that compound in them, I don’t think there would be enough in it to affect your depression.”

Rhetorical tricks

Rhetorical tricks in health misinformation include exaggeration and selective omission. Some participants were easily tricked because they did not consider the omitted facts. For instance, P6 agreed with the misinformation that COVID-19 is just like a flu, stating, “it was basically saying that you know it’s a 99.99% curable situation, just like the flu”, without realizing that people over the age of 50, the most vulnerable group, were not included in the data. For other participants, the rhetorical trick of exaggeration put them off. Reacting to the exaggeration about the contamination of radioactive substances, P3 stated: “It seems much more of an alarmist piece than something that’s really done with research.”

Biased reasoning

Misinformation may include arguments that sound reasonable but contain logical fallacies. Some fell directly for the seemingly reasonable arguments, while others were able to spot the biased reasoning. For instance, P25 stated: “Chart showing the height of the tsunami that hit this location, that doesn’t give any way to what they’re trying to say about the pet food.” Similarly, reacting to the post claiming the sun cream could cause breast cancer, P2 stated: “she’s making a very wide sweeping argument that that particular ingredient caused her breast cancer and you can’t really say that because it could be anything environmental, or possibly genetic even though she doesn’t know that anybody in her family has had that type of cancer or a cancer similar to it that could turn into breast cancer.”

Distinctive linguistic features

Linguistic features, including excessive use of uppercase words, extreme words, and other linguistic intensifiers, can grab attention and instigate involvement, especially emotional involvement, from the audience. These features effectively tricked some participants. For example, P21 stated: “Well, for me it [uppercase word] just caught my attention that I just feel bad for that person because they’re just making a strong statement.” Yet P28 constructively used these features to pick out misinformation: “Okay, so I’m seeing like in bold letters ‘lying,’ and that “this industry has been lying.” So that makes me suspicious that this is not true.”

Establishing legitimacy

Misinformation can deceive people by using persuasive strategies to establish legitimacy, such as citing a source even if it is fake or lacks relevant credentials, or using medical jargon and health-associated words to appear credible. P16 believed the misinformation that sunscreen causes breast cancer because “she named the chemicals that were in the sunscreen.” However, others (e.g., P10) questioned these legitimacy cues and indicated that further research was needed: “These are words that a regular person won’t understand. If I were reading something like this, I would be looking up those words and seeing what they mean.” Reacting to the post that cashews can treat depression, P27 pointed exactly to the persuasive strategy: “Natural, that’s another key word that, you know, might get people in, hook them in.”

Why and how older adults fall for misinformation (RQ2)

The previous section demonstrated that responses to the same persuasive strategies in misinformation varied: some participants were tricked by them, while others spotted the strategies and used them to conclude that misinformation was at play. So why and how were some participants tricked? Our thematic analysis organizes the contributing factors into three categories: information factors, individual factors, and the interplay of the two.

Information factors

Three characteristics of misinformation make it hard for people to discern. First, there is an abundance of information. Easy access, lack of regulation, and content shared by acquaintances, family, and friends further complicate older adults’ cognitive challenge of distinguishing misinformation on social media. P14 said: “There’s so much information you have to kind of triage it, and what’s worth your time to invest in investigating it, and then need to be? Also really really careful, cuz I’ve gotten burned and I know lots of people where I’ve passed on information that turns out not to be true.” With such information overload, people usually rely on heuristic cues to ease information processing, which gives persuasive strategies such as establishing legitimacy leeway to exploit.

Second, misinformation thrives with uncertainty, especially in areas with conflicting evidence and many unknowns. P8 discussed this problem: “I have a sister who’s a nurse and a niece who is in health care as well and they’re two different opposites. You want to trust your healthcare professionals. But, you know, again this is my hard time sorting it out. My sister will believe one thing, and her daughter, who was my niece, will say ‘oh no, you should do it this way and you should do it this way.’”

Lastly, misinformation contains some truth but not the complete truth. Over time, the partial truth becomes dominant and is perceived as the whole truth due to easy access and sharing of information. P10 offered her explanation of why the radioactive pet food misinformation may be believable: “I have heard, and I think the news has said that there has been, you know, some stuff from Japan that has floated made it over to the United States now from that tsunami. So I mean, parts of it are just believable enough that I would, that people would believe it.” Similarly, regarding the post about sunscreen causing breast cancer, P9 explained: “When she talks about the paraben issue, it’s a lengthy paragraph and she cites that it’s a known cause of cancer and we’re getting it through our largest organ. So, some of it seems to be planted with some truth or facts as far as that person believes to be.”

Individual factors

Individual susceptibility to misinformation is influenced by specific traits or the absence thereof. We identified three categories of such individual factors. The first and primary factor was a lack of media literacy: individuals lacking the knowledge and skills to access, critically analyze, and evaluate information, especially across diverse online platforms. P2 highlighted this vulnerability: “Sometimes it can be very difficult and sometimes you have to really dig deeper, and you don’t have the time or the resources or, you know, the energies or follow it through, then you may accept it for what it is at face value, a little bit sooner than normal.” Additionally, individuals who overestimate their media literacy skills are also prone to misinformation. A few participants believed that they had never encountered misinformation (e.g., P7, P16, P21) or were confident in their critical thinking and analytical skills (e.g., P6 [“I am more like an analytical type person, I try to get all the facts, and then be logical on the facts”]), yet struggled to discern misinformation during the interview.

A second individual factor was confirmation bias, which affects everyone to some extent but poses a higher risk for those with a strong predisposition toward it. This can lead to echo chambers where “individuals seek out sources that reinforce their existing beliefs”, as stated by P13. This is particularly true for identity-based motivated reasoning. If people already believe that pharmaceutical companies are evil, the government is corrupt, or natural-ingredient-based medicine is better, they will remain in the bubble and “choose to believe what they want to believe”. For instance, P7 agreed that cashews were equivalent to Prozac because “I’m always looking for natural things. Like for instance, I have osteoporosis and I’m not taking anything for that either. But I do what I can out there naturally, so I think this article is right up with that.”

The third factor contributing to vulnerability to misinformation was risk aversion and short-term orientation. For instance, participants were afraid of vaccines and medicines due to side effects and would rather “be safe than sorry”. Because the risks and side effects are usually imminent, they overlooked the long-term benefits. P16 discussed her reluctance to take the booster shot: “It sounds good, the protection, everything sounds good, but I just don’t want to go through that side effects and go to the sickness.”

Interplay of individual and information factors

The interplay of information and individual factors makes people more susceptible to misinformation. First, when misinformation aligns with individuals’ emotional needs, it can become highly convincing. P18 commented that “very vulnerable people that are looking for answers, sometimes you have to rationalize and blame somebody for something that just happens without a reason that we know. It seems like an emotional play and somebody that needs an answer to why something happened to them specifically.”

Second, individuals may be drawn to misinformation that justifies their existing behaviors. Regarding the misinformation that cashews are equivalent to Prozac, P8 offered her explanation: “If you like cashews, you’re going to believe it because you know, this is good for me so yeah that must be.”

Third, misinformation can cater to individuals’ wishful thinking that there is a simple cure or solution to complex problems. In P11’s words: “A lot of seniors are not very good. You know, as we get older, you know, oh my back’s hurting and also you see something……we just, ‘Oh my God that’s gonna help me.’” P5 would accept that cashews could be equivalent to Prozac because: “it’s always worth a try. You know, again, it is not endangering your health in any way……whether it works or not, who knows but nothing gained, nothing ventured, nothing lost or something like that”. P3 commented: “who wouldn’t want to eat a handful of cashews, to solve a problem.”

Manage misinformation (RQ3)

The previous sections illustrated how and why some older adults fell for the misinformation. Nevertheless, other older adults were able to identify these persuasive strategies and also discussed how they successfully managed (mis)information online. Our analysis identified three sub-themes of managing misinformation for the self, along with strategies for managing misinformation for others.

Manage misinformation for self: Further research when in doubt

The primary strategy the older adults mentioned for managing (mis)information was to conduct further research. Many initially had suspicions about the misinformation in this study but maintained a “truth default” mentality, leaving room for uncertainty. As a result, many could not definitively conclude that they disagreed with an article because it was misinformation. When unsure, or when the information they received contradicted their existing knowledge, they looked for additional information: “I doubt it but let’s see if you can prove it to me.” (P1). They verified and authenticated information through the following eight methods: 1) checking the source (e.g., “use this person, that researcher’s name, and so you can do your own dig” [P26]), 2) checking cited references (e.g., “But it’s nice that they link to an article from Huffington Post. You have to see what links that has within that to see if they actually have the research that they are talking about.” [P17]), 3) using fact-checking websites (e.g., “if it’s really something—like another one that she told me was that Moderna, was owned by Bill Gates, so like I just went to Snopes.com.” [P28]), 4) looking up unfamiliar terms (e.g., “So I mean that my first thought is, I would be looking up those words and seeing what they mean.” [P10]), 5) googling key words or other searchable content mentioned in the article (e.g., “I’m not aware that paraben is a carcinogen. I would want to validate that.” [P1]), 6) going to legitimate and trustworthy sources for further research (e.g., “So I may look into articles on like PubMed, in terms of if cashews have been looked at for treating depression or alleviating symptoms of depression” [P2]), 7) consulting trusted individuals (e.g., “I would ask someone that’s smarter than me, that’s specific to this, that has more information.” [P4]), and 8) using multiple online sources to verify (e.g., “Most people would say, ‘I’ll go to Google’. Well, Google is one. I’m going to Bing. I’m gonna go to duckduckgo.” [P4]). However, participants indicated that doing further research is not an easy task. As P2 described: “I think sometimes it can be very difficult and sometimes you have to really dig deeper, and you don’t have the time or the resources.”

Manage misinformation for self: Critical thinking

Participants emphasized the importance of critical thinking in managing misinformation. They highlighted questioning techniques for determining the credibility of information, such as asking themselves whether they got emotional and why, what the author’s motives are, whether the source is credible, what specific (especially technical) words mean, and what the scope and applicability of the information are. For instance, while reading the article about radioactive pet food imported from East Asia, P25 thought aloud through a series of questions: “I would point to things like where they are getting their information about importing seafood from waters near Fukushima. And then I would question the data from, okay, ‘my friend has a Geiger counter that was inherited from her dad’, so this device is years old, likely and is it still working correctly? Is it still calibrated correctly? Are they using it correctly? ‘Putting it within one inch of the wet cat food.’ Is a reading something else other than food? And you know what limit is set in a Geiger counter to go off?”

Second, deliberate counterarguments were also found to be effective in identifying fallacies in misinformation. For example, while reading the post about masking and COVID-19, P17 counterargued: “And I don’t see how wearing a mask is taking away your freedom and controlling you. It seems like most churches require people to wear other type of clothing and nobody’s complaining if they have to wear pants in church that their freedom and control has been taken away.” However, there was a caveat: individuals who already believed certain types of misinformation might use their existing misbeliefs to counter-argue true information. For instance, P5 somewhat agreed with the tweet that masking was useless and offered counterarguments against the usefulness of masking: “I’ve always said if the mask is so good and why does the virus keep spreading? If the mask is that good, the virus wouldn’t spread like this.”

The other two types of critical thinking were: 1) acknowledging individual differences (e.g., “I guess everybody is different, cause they get, they are affected by it differently” [P19]), and 2) not jumping to absolute conclusions (e.g., “I’m in between disagree and agree with the article that they are still working on it. And they just haven’t come to a complete conclusion, still working on it” [P16]).

Manage misinformation for self: Actions to take upon encountering misinformation

When faced with misinformation, participants discussed various actions to take. These included: 1) ignoring the misinformation (e.g., “So that’s why sometimes like I just see something and I just don’t give it the time of day.” [P14]), 2) taking steps to avoid future encounters with similar misinformation (e.g., “you could block these people, stop them from entering your feed or your, you know, dropping from your Instagram or whatever……you know, I don’t need that sort of stupidity in my life.” [P22]), and 3) reporting the misinformation (e.g., “I would judge whether I thought it was presenting anything dangerous or causing people to react in a way that might inflame them to take action against other people. If I saw anything that I felt was leaning toward that, I would report the post.” [P1]).

Manage misinformation for others

Besides managing misinformation for themselves, participants also discussed the need to manage misinformation for their family and friends. It was not an easy task, and not all family and friends welcomed their involvement. P10 stated: “I can try to counter them and tell them where I’m getting my information, but then they hang up on me. So, you know, so I can only do so much.” They offered the following insights on when and how to effectively assist others.

First, participants suggested considering relationship closeness when deciding whether and how to engage with others, especially on social media. P10 commented: “If they were simply a Facebook friend, I would probably just keep scrolling. If they were someone I knew well, I would probably either post it right in there, if I could find a believable source that said otherwise, I would post it right in their feed and just say ‘please read this article’. If it was someone in between I would probably message them, not in the feed. I have a cousin who’s a Trumper, and the posts, the things in his feed, the responses to the things in his feed are just so argumentative and so against what I believe, so I wouldn’t comment on that one at all.”

Second, participants attempted to demonstrate an understanding of where the person sharing the misinformation came from. Telling people they were wrong or engaging in argument was never helpful. P3 shared: “I don’t want to create conflict by, you know, telling them that they’re wrong, but trying to just gently guide them to look at information in a different way. And also to look from individuals’ point of view, you know, because we all have our own perceptions, and we all have different experiences in our lives.”

Third, participants mobilized their own critical thinking skills to gently guide family and friends to look at information differently. Four strategies were identified. First, they shared accurate information (e.g., “After I read to him some of the reviews and showed him a little bit more about what it was all about, he said ‘no I guess you’re right’” [P27]). Second, they encouraged scrutiny of misinformation (e.g., “My brother sent me something. Again, off his social media and the numbers he sent me or the quantity he sent me was so extraordinary that I wrote back to him and I said, ‘Really, Think again. You really think this many, and it was a matter that the numbers were so extraordinary or the item that they stated was really kind of off the mainstream’” [P13]). Third, they encouraged others to ask questions (e.g., “I don’t want to stand blames or anything like that. I want to have them at least, just question. You know I would say, ‘where did you get that information along that line’ or confront” [P22]). Lastly, they offered alternative explanations (e.g., “my son and his wife were double vaccinated, and they still got COVID-19, and my daughter-in-law got it very bad. So, [Son’s name] is saying ‘now well maybe the vaccines are useless’. And I said, ‘well, you don’t know because you might have died if [you didn’t have a vaccine], you know.’ So I try to try to explain to them” [P10]).

Discussion

The current study conducted semi-structured interviews and provided qualitative evidence from the perspective of older adults using their own words to describe how they process persuasive strategies in misinformation. Our thematic analyses revealed themes explaining why and how older adults fall for misinformation that employs persuasive strategies. The discussion will begin with informational and individual factors as well as their interplay to explain older adults’ susceptibility to misinformation. Next, connections will be elaborated between the findings and media literacy suggestions that may mitigate the factors contributing to believing misinformation among older adults. Our thematic analyses also revealed major strategies that older adults engage in to manage misinformation for themselves and others. The discussion will continue with the best practices learned from them regarding misinformation management in everyday life.

Why and how older adults fall for misinformation with persuasive strategies

Among the twelve groups of persuasive strategies in misinformation, our findings showed that some older adults detected the persuasive strategies as red flags, while others were tricked by them. Persuasive strategies that grab attention, such as narratives, personal stories, emotional appeals, and distinctive linguistic features, not only make misinformation stand out in a sea of information but also draw the audience in through cognitive ease and emotional engagement [22]. These strategies tend to work on those who do not critically analyze information or engage in systematic processing [48]. Additionally, in areas where unknowns abound and evidence conflicts, misinformation easily gains ground. For instance, persuasive strategies such as exploiting science’s limitations and highlighting risk and uncertainty may increase audiences’ doubts about science and draw them into conspiracy theories [14, 24]. The last information factor is the partial truth in misinformation. Among the twelve groups of persuasive strategies, inappropriate use of scientific evidence, rhetorical tricks, and biased reasoning all contain some truth to mislead the audience. For instance, general scientific principles may be referenced even though no evidence exists for the specific domain [16], or factual pieces of information may be connected without valid reasoning [24]. These strategies are tough to tackle because, while the information is factual, the reasoning behind it is flawed. Individuals with certain traits, such as low cognitive abilities or a tendency to believe in truth by default, are more susceptible to these persuasive strategies [49, 50].

Improving media literacy to mitigate susceptibility to persuasive strategies

Our findings have shown evidence of older adults falling for the common persuasive strategies while processing online health misinformation. To mitigate their susceptibility to the common persuasive strategies, older adults should be equipped with the knowledge of what they are and be able to identify the common tricks and pitfalls. Media literacy encompasses the knowledge and skills that people need to become mindful media content consumers [51, 52]. The knowledge and skills to identify common persuasive strategies in online health misinformation should be a critical part of media literacy education for older adults. Research has shown that older adults are more susceptible to online misinformation not due to their declining cognitive abilities but rather because they have less experience or knowledge with online information [10], which may lead older adults to apply truth default while consuming online information [45].

Prior research has attempted to include tips for spotting misinformation in media literacy programs, primarily focusing on formatting cues (e.g., misspellings, awkward layouts) and direct behavioral suggestions (e.g., inspecting dates and sources) [52]. However, given the increasing prevalence of high-quality misinformation, it is more crucial than ever to understand how persuasive arguments are created. For example, evidence-based misinformation is seen as more accurate than fact-free misinformation [53]. Recent AI-generated misinformation makes use of sophisticated persuasive strategies such as enhancing narrative details, simulating emotional appeals, and drawing conclusions through biased reasoning [54]. Simple behavioral suggestions like checking evidence or sources are insufficient for navigating today’s complex media environment [55]. Learning the common persuasive strategies underpinning online health misinformation is pivotal in helping older adults critically analyze content and arguments.

One caveat is that the knowledge and skills to identify persuasive strategies in online misinformation are not necessarily easy to learn and may require opportunities for practice in everyday information consumption. However, once learned, the benefits can be sustained in the long term. Learning and practicing the identification of persuasive strategies require older adults’ motivation to spend time and cognitive resources. According to dual-process theories such as the Elaboration Likelihood Model (ELM), people are likely to scrutinize message arguments when they find the issue personally important and relevant [56]. Similarly, the literature has revealed that people with analytic thinking styles show more acceptance of corrective messages explaining why a piece of misinformation is incorrect [57]. A media literacy program teaching persuasive strategies may thus be more acceptable and successful among older adults who recognize the importance of discerning online misinformation and who have a greater need for cognition. However, it is worth pointing out that some persuasive strategies can be identified as cues requiring only peripheral information processing. For example, people can apply limited cognitive processing to spot personal anecdotes and emotional appeals such as emotionally intensive words [58]. A media literacy program may begin by teaching simple persuasive strategies that require only peripheral information processing and then deepen the education with more complex persuasive strategies (e.g., biased reasoning). Additionally, as prior research found that media literacy training for older adults is better learned when taught in a social context involving family and friends [59], media literacy programs to help older adults manage misinformation can also benefit from social facilitation.

In addition, it is important to create awareness of how media content interacts with individual vulnerabilities, particularly in relation to misinformation. Emotional and wishful thinking were identified as common vulnerabilities that can make people more susceptible to misinformation. When misinformation triggers emotions such as anger and anxiety, people’s information processing is clouded by those emotions. A recent review concluded that anger and a greater cortisol response to stress increase people’s vulnerability to misinformation [60]. Media literacy education should focus on critical engagement with media content by encouraging people to reflect on their emotions and detach them from credibility assessment while processing information. It is helpful for people to recognize their emotions at the moment of consuming media content, reflect on how the content may fill their emotional needs, and assess information credibility based on its accuracy rather than its ability to make them feel better [61]. With respect to health misinformation, this media literacy education may be particularly important for health issues, such as vaccines, that tend to be politicized and may trigger affective polarization [62]. An understanding of how strong emotion may impair our ability to rationally process science information provides a foundation for people to manage their emotions when they are motivated to discern misinformation.

Managing misinformation for self

Similar to prior research [39], during the initial encounter with a piece of information, our older adult participants engaged in internal authentication, assessing information credibility based on the source, the message, and their prior experience and knowledge. Researching information further was a common strategy for managing online health misinformation, driven by suspicions about the legitimacy of the source and of the original research cited as evidence. However, not everyone pursued further research, whether because they believed the misinformation was accurate or perceived it as unimportant. For instance, prior research found that older adults sometimes resorted to the strategy of avoiding social media content [63, 64]. Although prior research found that people verify information when doubting its accuracy or when confirming existing beliefs [39], recent research shows that the key predictor of people’s motivation to verify information is how much they care about it or how important it is to them [65].

For people who are willing to verify, further information search is not always easy and does not guarantee an accurate assessment of misinformation. Online health information requires a high reading level to understand complex health issues involving conflicting evidence [66]. Some older adults may turn to trusted family members for assistance [67]. Our older adult participants shared the following strategies for developing the competence to navigate the complex information environment. The first concerns how to conduct further research. Our findings suggested effective practices, including checking the source and references cited in the information, using search engines to verify details, using fact-checking websites to verify facts, looking up unfamiliar terms, going to legitimate and trustworthy sources for further research, and verifying with trusted individuals. These practices are effective but may introduce additional layers of complexity and challenge. For example, using search engines to verify details requires the capability to select, analyze, and integrate information from various sources [53]. Additionally, choices of trustworthy information sources may be shaped by individuals’ political ideology [68].

Another competence to develop is critical thinking, such as practicing generating alternative arguments and/or counterarguments to claims. When examining evidence, reasoning, and conclusions, people can be encouraged to construct counterarguments at each step: Does the evidence support an alternative line of reasoning, and does that reasoning lead to a different or opposing conclusion? If so, what additional evidence or information is needed to make a better judgment? These insights may be incorporated into future media literacy interventions that teach older adults how to react to media content, and into future information literacy interventions that strengthen older adults’ ability to find, evaluate, and use information [69]. Moreover, when examining personal experiences and stories, individuals should consider individual differences and the extent to which a personal experience can be generalized. The experience described by a person may be true for that person, but background and other contextual factors may have shaped it. Readers can ask themselves: Does the storyteller draw a correct conclusion from his or her experience? Can the person’s case be generalized to others? These are the questions to ask while processing information.

Managing misinformation for others

Relationship closeness plays an important moderating role in how people manage misinformation for others. Our findings are consistent with the existing literature that relationship closeness [38] and the availability of channel affordances [37] shape individuals’ decisions about whether to correct or challenge misinformation from others. A pattern found in the current study is that participants would first consider how close they were to the target person and then choose a suitable communication channel (e.g., a private text message vs. a face-to-face confrontation) to discuss sensitive issues. Participants also mentioned that they would respond to misinformation from strangers online by posting a counterarguing comment. However, when misinformation came from friends or family members, they would choose a less face-threatening channel (e.g., a private text) that does not damage interpersonal relationships [38]. Moreover, when misinformation comes from a close tie, people may avoid discussing it at all or gently suggest alternative information sources rather than confronting the person with counterarguments [43].

In terms of how to discuss misinformation with others, our findings reveal the importance of showing understanding and empathy before challenging or correcting misinformation, as doing so facilitates persuasion and reduces the psychological reactance that such discussions are likely to provoke [70]. Feeling understood and recognized can reduce the perceived threat of someone’s challenges or corrections. We therefore recommend practicing empathetic communication when managing misinformation for others. Future research may also examine this correction approach beyond the interpersonal communication setting, in other health contexts [55].

Lastly, our findings contribute to the literature by identifying nudging strategies used to manage misinformation for family and friends. Four such strategies were identified: providing true information and sources, prompting scrutiny of the misinformation, encouraging questions, and providing alternative explanations. Which strategy is used may depend on the other person’s characteristics, such as personality, and on relationship closeness [40].

Limitations

The study’s findings should be interpreted in light of the following limitations. First, we included older adults aged 50 and above (participants’ ages ranged from 51 to 76, with an average of 62), most of whom had higher education and used Zoom to participate in the study. This group was relatively technologically savvy and may have had higher media and digital literacy than average older adults. Given the study’s focus on online health misinformation, these participants represent the type of older adults most likely to use the Internet for health information, for whom the implications of online health misinformation are most relevant. Nevertheless, the findings may not be transferable to older adults not represented by our sample. Second, participants may have been more alert and cognizant when processing the information encountered in this study than they usually would be in everyday life. Even so, our findings demonstrated that these persuasive strategies were still quite effective, underscoring the importance of educating the public to see through them. Third, although we used a systematic approach to creating the misinformation messages (embedding persuasive strategies identified in a systematic review into misinformation content drawn from fact-checking websites, and presenting the information on various popular online platforms), the messages may not be representative of all types of online health misinformation, and future research is needed to expand the current study.

Conclusion

This study provides qualitative evidence on why and how older adults fall for misinformation crafted with persuasive strategies at the moment of encounter. Thematic analysis identified factors related to the information content, the individual, and their interplay that may contribute to misinformation susceptibility. Misinformation management tactics were extracted from those individuals who could see through the persuasive strategies embedded in the misinformation content. These insights can inform the design of future digital literacy programs to help individuals, especially older adults, combat misinformation.

Supporting information

S1 Appendix. Four health misinformation articles with annotated persuasive strategies.

(DOCX)

S2 Appendix. Six messages participants were exposed to.

(PDF)

S3 Appendix. Interview guide.

(DOCX)


Acknowledgments

The authors would like to thank Sue Lim and Huiyi Liu for assisting in conducting interviews and coding the transcripts.

Data Availability

The authors are not able to provide any data beyond what is presented in the manuscript due to restrictions that study participants agreed to when they signed the consent form, which was approved by the Michigan State University IRB.

Funding Statement

The study was partially funded by the Brandt Fellowship awarded to Wei Peng. The funders had no role in study design, data collection and analysis, decision to publish, or preparation of the manuscript.

References

1. Peng W, Lim S, Meng J. Persuasive strategies in online health misinformation: a systematic review. Inf Commun Soc. 2023;26(11):2131–48.
2. Al Khaja KAJ, AlKhaja AK, Sequeira RP. Drug information, misinformation, and disinformation on social media: a content analysis study. J Public Health Pol. 2018;39(3):343–57. doi: 10.1057/s41271-018-0131-2
3. Swire-Thompson B, Lazer D. Public health and online misinformation: challenges and recommendations. Annu Rev Public Health. 2020;41(1):433–51. doi: 10.1146/annurev-publhealth-040119-094127
4. Romer D, Jamieson KH. Conspiracy theories as barriers to controlling the spread of COVID-19 in the U.S. Soc Sci Med. 2020;263:113356. doi: 10.1016/j.socscimed.2020.113356
5. Roozenbeek J, Schneider CR, Dryhurst S, Kerr J, Freeman ALJ, Recchia G. Susceptibility to misinformation about COVID-19 around the world. R Soc Open Sci. 2020;7(10):201199. doi: 10.1098/rsos.201199
6. Ackerknecht EH. A short history of medicine. JHU Press; 2016. 273 p.
7. Sharma M, Yadav K, Yadav N, Ferdinand KC. Zika virus pandemic—analysis of Facebook as a social media health information platform. Am J Infect Control. 2017;45(3):301–2. doi: 10.1016/j.ajic.2016.08.022
8. Hung L-Y, Lyons JG, Wu C-H. Health information technology use among older adults in the United States, 2009–2018. Curr Med Res Opin. 2020;36:789–797. doi: 10.1080/03007995.2020.1734782
9. Grinberg N, Joseph K, Friedland L, Swire-Thompson B, Lazer D. Fake news on Twitter during the 2016 U.S. presidential election. Science. 2019;363:374–378. doi: 10.1126/science.aau2706
10. Brashier NM, Schacter DL. Aging in an era of fake news. Curr Dir Psychol Sci. 2020;29:316–323. doi: 10.1177/0963721420915872
11. Poulin MJ, Haase CM. Growing to trust: evidence that trust increases and sustains well-being across the life span. Soc Psychol Personal Sci. 2015;6:614–621. doi: 10.1177/1948550615574301
12. Li T, Fung HH. Age differences in trust: an investigation across 38 countries. J Gerontol B Psychol Sci Soc Sci. 2013;68:347–355. doi: 10.1093/geronb/gbs072
13. DeDominicis K, Buttenheim AM, Howa AC, Delamater PL, Salmon D, Omer SB, et al. Shouting at each other into the void: a linguistic network analysis of vaccine hesitance and support in online discourse regarding California law SB277. Soc Sci Med. 2020;266:113216. doi: 10.1016/j.socscimed.2020.113216
14. Gallagher J, Lawrence HY. Rhetorical appeals and tactics in New York Times comments about vaccines: qualitative analysis. J Med Internet Res. 2020;22(12):e19504. doi: 10.2196/19504
15. Ghenai A, Mejova Y. Fake cures: user-centric modeling of health misinformation in social media. Proc ACM Hum-Comput Interact. 2018;2(CSCW):58:1–58:20.
16. Moran MB, Lucas M, Everhart K, Morgan A, Prickett E. What makes anti-vaccine websites persuasive? A content analysis of techniques used by anti-vaccine websites to engender anti-vaccine sentiment. J Commun Health. 2016;9(3):151–63.
17. Chua AYK, Banerjee S. Intentions to trust and share online health rumors: an experiment with medical professionals. Comput Hum Behav. 2018;87:1–9.
18. Salvador Casara BG, Suitner C, Bettinsoli ML. Viral suspicions: vaccine hesitancy in the Web 2.0. J Exp Psychol Appl. 2019;25(3):354–71. doi: 10.1037/xap0000211
19. Choudrie J, Banerjee S, Kotecha K, Walambe R, Karende H, Ameta J. Machine learning techniques and older adults processing of online information and misinformation: a COVID-19 study. Comput Hum Behav. 2021;119:106716. doi: 10.1016/j.chb.2021.106716
20. Dillard JP, Pfau M. The persuasion handbook: developments in theory and practice. SAGE Publications; 2002. 897 p.
21. Chen S, Xiao L, Mao J. Persuasion strategies of misinformation-containing posts in social media. Inf Process Manag. 2021;58(5):102665.
22. Kearney MD, Selvan P, Hauer MK, Leader AE, Massey PM. Characterizing HPV vaccine sentiments and content on Instagram. Health Educ Behav. 2019;46(2_suppl):37S–48S. doi: 10.1177/1090198119859412
23. Prasad A. Anti-science misinformation and conspiracies: COVID-19, post-truth, and science & technology studies (STS). Sci Technol Soc. 2022;27(1):88–112.
24. Kou Y, Gui X, Chen Y, Pine K. Conspiracy talk on social media: collective sensemaking during a public health crisis. Proc ACM Hum-Comput Interact. 2017;1(CSCW):61:1–61:21.
25. Buts J. Memes of Gandhi and mercury in anti-vaccination discourse. Media Commun. 2020;8(2):353–63.
26. Haupt MR, Li J, Mackey TK. Identifying and characterizing scientific authority-related misinformation discourse about hydroxychloroquine on Twitter using unsupervised machine learning. Big Data Soc. 2021;8(1):20539517211013843.
27. Bryanov K, Vziatysheva V. Determinants of individuals’ belief in fake news: a scoping review. PLOS ONE. 2021;16(6):e0253717. doi: 10.1371/journal.pone.0253717
28. Scherer LD, McPhetres J, Pennycook G, Kempe A, Allen LA, Knoepke CE, et al. Who is susceptible to online health misinformation? A test of four psychosocial hypotheses. Health Psychol. 2021;40(4):274–84. doi: 10.1037/hea0000978
29. Seo H, Blomberg M, Altschwager D, Vu HT. Vulnerable populations and misinformation: a mixed-methods approach to underserved older adults’ online information assessment. New Media Soc. 2021;23(7):2012–33.
30. van der Linden S, Roozenbeek J, Compton J. Inoculating against fake news about COVID-19. Front Psychol. 2020:2928.
31. Nickerson RS. Confirmation bias: a ubiquitous phenomenon in many guises. Rev Gen Psychol. 1998;2(2):175–220.
32. Lewandowsky S, Ecker UKH, Seifert CM, Schwarz N, Cook J. Misinformation and its correction: continued influence and successful debiasing. Psychol Sci Public Interest. 2012;13(3):106–31. doi: 10.1177/1529100612451018
33. Onorato RS, Turner JC. Fluidity in the self-concept: the shift from personal to social identity. Eur J Soc Psychol. 2004;34(3):257–78.
34. Wischnewski M, Krämer N. I reason who I am? Identity salience manipulation to reduce motivated reasoning in news consumption. In: International Conference on Social Media and Society; 2020. p. 148–55.
35. Hasher L, Goldstein D, Toppino T. Frequency and the conference of referential validity. J Verbal Learn Verbal Behav. 1977;16(1):107–12.
36. Schwarz N, Jalbert M, Noah T, Zhang L. Metacognitive experiences as information: processing fluency in consumer judgment and decision making. Consum Psychol Rev. 2021;4(1):4–25.
37. Ecker UKH, Lewandowsky S, Cook J, Schmid P, Fazio LK, Brashier N. The psychological drivers of misinformation belief and its resistance to correction. Nat Rev Psychol. 2022;1(1):13–29.
38. Tandoc EC, Lim D, Ling R. Diffusion of disinformation: how social media users respond to fake news and why. Journalism. 2020;21(3):381–98.
39. Tandoc EC, Ling R, Westlund O, Duffy A, Goh D, Zheng Wei L. Audiences’ acts of authentication in the age of fake news: a conceptual framework. New Media Soc. 2018;20(8):2745–63.
40. Gurgun S, Arden-Close E, Phalp K, Ali R. Online silence: why do people not challenge others when posting misinformation? Internet Res. 2022;33(5):1928–48.
41. Pearce KE, Malhotra P. Inaccuracies and Izzat: channel affordances for the consideration of face in misinformation correction. J Comput-Mediat Commun. 2022;27(2):zmac004.
42. Rohman A. Counteracting misinformation in quotidian settings. In: Toeppe K, Yan H, Chu SKW, editors. Diversity, divergence, dialogue. Cham: Springer International Publishing; 2021. p. 141–55. (Lecture Notes in Computer Science).
43. Scott L, Coventry L, Cecchinato ME, Warner M. “I figured her feeling a little bit bad was worth it to not spread that kind of hate”: exploring how UK families discuss and challenge misinformation. In: Proceedings of the 2023 CHI Conference on Human Factors in Computing Systems; 2023. p. 1–15.
44. Boyd D. Why youth (heart) social network sites: the role of networked publics in teenage social life. In: Buckingham D, editor. Youth, identity, and digital media. The John D. and Catherine T. MacArthur Foundation Series on Digital Media and Learning. Cambridge, MA: The MIT Press; 2008.
45. Peng W, Meng J, Lim S, Liu H. Health [mis]information processing among older adults: an in-depth interview study comparing “misinformation detectives” and “misinformation victims.” 2023 May.
46. Van Someren M, Barnard YF, Sandberg J. The think aloud method: a practical guide to modelling cognitive processes. London: Academic Press; 1994.
47. Braun V, Clarke V. Thematic analysis. In: APA handbook of research methods in psychology, Vol 2: Research designs: quantitative, qualitative, neuropsychological, and biological. American Psychological Association; 2012. p. 57–71.
48. Chaiken S. Heuristic versus systematic information processing and the use of source versus message cues in persuasion. J Pers Soc Psychol. 1980;39(5):752.
49. Levine TR. Truth-Default Theory (TDT): a theory of human deception and deception detection. J Lang Soc Psychol. 2014;33(4):378–92.
50. Nan X, Wang Y, Thier K. Why do people believe health misinformation and who is at risk? A systematic review of individual differences in susceptibility to health misinformation. Soc Sci Med. 2022;314:115398. doi: 10.1016/j.socscimed.2022.115398
51. Chan M. News literacy, fake news recognition, and authentication behaviors after exposure to fake news on social media. New Media Soc. 2022;14614448221127675.
52. Vraga EK, Bode L, Tully M. Creating news literacy messages to enhance expert corrections of misinformation on Twitter. Commun Res. 2022;49(2):245–67.
53. Hameleers M. Separating truth from lies: comparing the effects of news media literacy interventions and fact-checkers in response to political misinformation in the US and Netherlands. Inf Commun Soc. 2022;25(1):110–26.
54. Zhou J, Zhang Y, Luo Q, Parker AG, De Choudhury M. Synthetic lies: understanding AI-generated misinformation and evaluating algorithmic and human solutions. In: Proceedings of the 2023 CHI Conference on Human Factors in Computing Systems; 2023. p. 1–20.
55. Malhotra P. A relationship-centered and culturally informed approach to studying misinformation on COVID-19. Soc Media Soc. 2020;6(3):2056305120948224. doi: 10.1177/2056305120948224
56. Petty RE, Cacioppo JT. The elaboration likelihood model of persuasion. New York: Springer; 1986.
57. Martel C, Mosleh M, Rand DG. You’re definitely wrong, maybe: correction style has minimal effect on corrections of misinformation online. Media Commun. 2021;9(1):120–33.
58. Yeo SK, McKasy M. Emotion and humor as misinformation antidotes. Proc Natl Acad Sci. 2021;118(15):e2002484118. doi: 10.1073/pnas.2002484118
59. Barrie H, La Rose T, Detlor B, Julien H, Serenko A. “Because I’m old”: the role of ageism in older adults’ experiences of digital literacy training in public libraries. J Technol Hum Serv. 2021;39(4):379–404.
60. Sharma PR, Wade KA, Jobson L. A systematic review of the relationship between emotion and susceptibility to misinformation. Memory. 2023;31(1):1–21. doi: 10.1080/09658211.2022.2120623
61. Pennycook G, McPhetres J, Zhang Y, Lu JG, Rand DG. Fighting COVID-19 misinformation on social media: experimental evidence for a scalable accuracy-nudge intervention. Psychol Sci. 2020;31(7):770–80. doi: 10.1177/0956797620939054
62. Dolman AJ, Fraser T, Panagopoulos C, Aldrich DP, Kim D. Opposing views: associations of political polarization, political party affiliation, and social trust with COVID-19 vaccination intent and receipt. J Public Health. 2023;45(1):36–9. doi: 10.1093/pubmed/fdab401
63. Banerjee S, Kapetanaki AB, Dempsey L. Older people’s online information search during the pandemic. In: 2022 16th International Conference on Ubiquitous Information Management and Communication (IMCOM); 2022. p. 1–6.
64. Vivion M, Reid V, Dubé E, Coutant A, Benoit A, Tourigny A. How older adults manage misinformation and information overload—a qualitative study. BMC Public Health. 2024;24:871. doi: 10.1186/s12889-024-18335-x
65. Mourão RR, Thorson E, Kryston K, Tunney C. Predicting audience verification intention: the impact of partisanship, source, importance, and information familiarity on willingness to verify headlines. Journal Mass Commun Q. 2023;100(2):308–31.
66. Berland GK, Elliott MN, Morales LS, Algazy JI, Kravitz RL, Broder MS. Health information on the Internet: accessibility, quality, and readability in English and Spanish. JAMA. 2001;285(20):2612–21. doi: 10.1001/jama.285.20.2612
67. Vargas-Bianchi L, Mateus J-C, Pecho-Ninapaytan A, Zambrano-Zuta S. “No, auntie, that’s false”: challenges and resources of female baby boomers dealing with fake news on Facebook. First Monday. 2023 [cited 11 Jun 2024].
68. Featherstone JD, Bell RA, Ruiz JB. Relationship of people’s sources of health information and political ideology with acceptance of conspiratorial beliefs about vaccines. Vaccine. 2019;37(23):2993–7. doi: 10.1016/j.vaccine.2019.04.063
69. Jones-Jang SM, Mortensen T. Does media literacy help identification of fake news? Information literacy helps, but other literacies don’t. Am Behav Sci. 2021;65(2):371–88.
70. Shen L. Mitigating psychological reactance: the role of message-induced empathy in persuasion. Hum Commun Res. 2010;36(3):397–422.

Decision Letter 0

Nicola Diviani

16 Jan 2024

PONE-D-23-40641
Navigating persuasive strategies in online health misinformation: An interview study with older adults
PLOS ONE

Dear Dr. Peng,

Thank you for submitting your manuscript to PLOS ONE. After careful consideration, we feel that it has merit but does not fully meet PLOS ONE’s publication criteria as it currently stands. Therefore, we invite you to submit a revised version of the manuscript that addresses the points raised during the review process.

==============================

ACADEMIC EDITOR:

Dear Authors,

Thank you for submitting your manuscript to PLOS ONE. After careful consideration of two independent reviews, it is clear that while your study on older adults' navigation of persuasive online health misinformation is both timely and significant, major revisions are necessary before it can be considered for publication.

The reviewers have identified several key areas that require substantial improvement. These include, but are not limited to, the clarity of your research questions, methodological details, and the depth of your discussion section. Your findings offer valuable insights; however, the current structure and presentation of your manuscript do not fully convey the potential impact of your work.

To progress towards publication, it is essential that you address all the points raised by the reviewers in a comprehensive and detailed manner. This revision is not just a matter of making minor adjustments, but rather it involves a significant restructuring of your manuscript to meet the journal's standards.

We recognize the effort and dedication you have put into your research and believe that with thorough revisions, your study could make a meaningful contribution to the field. We look forward to receiving your revised manuscript.

==============================

Please submit your revised manuscript by Mar 01 2024 11:59PM. If you will need more time than this to complete your revisions, please reply to this message or contact the journal office at plosone@plos.org. When you're ready to submit your revision, log on to https://www.editorialmanager.com/pone/ and select the 'Submissions Needing Revision' folder to locate your manuscript file.

Please include the following items when submitting your revised manuscript:

  • A rebuttal letter that responds to each point raised by the academic editor and reviewer(s). You should upload this letter as a separate file labeled 'Response to Reviewers'.

  • A marked-up copy of your manuscript that highlights changes made to the original version. You should upload this as a separate file labeled 'Revised Manuscript with Track Changes'.

  • An unmarked version of your revised paper without tracked changes. You should upload this as a separate file labeled 'Manuscript'.

If you would like to make changes to your financial disclosure, please include your updated statement in your cover letter. Guidelines for resubmitting your figure files are available below the reviewer comments at the end of this letter.

If applicable, we recommend that you deposit your laboratory protocols in protocols.io to enhance the reproducibility of your results. Protocols.io assigns your protocol its own identifier (DOI) so that it can be cited independently in the future. For instructions see: https://journals.plos.org/plosone/s/submission-guidelines#loc-laboratory-protocols. Additionally, PLOS ONE offers an option for publishing peer-reviewed Lab Protocol articles, which describe protocols hosted on protocols.io. Read more information on sharing protocols at https://plos.org/protocols?utm_medium=editorial-email&utm_source=authorletters&utm_campaign=protocols.

We look forward to receiving your revised manuscript.

Kind regards,

Nicola Diviani

Academic Editor

PLOS ONE

Journal requirements:

1. When submitting your revision, we need you to address these additional requirements.

Please ensure that your manuscript meets PLOS ONE's style requirements, including those for file naming. The PLOS ONE style templates can be found at https://journals.plos.org/plosone/s/file?id=wjVg/PLOSOne_formatting_sample_main_body.pdf and https://journals.plos.org/plosone/s/file?id=ba62/PLOSOne_formatting_sample_title_authors_affiliations.pdf.

2. In the ethics statement in the Methods, you have specified that verbal consent was obtained. Please provide additional details regarding how this consent was documented and witnessed, and state whether this was approved by the IRB.

3. Thank you for stating the following financial disclosure: 

 [Brandt Fellowship].  

Please state what role the funders took in the study. If the funders had no role, please state: "The funders had no role in study design, data collection and analysis, decision to publish, or preparation of the manuscript."

If this statement is not correct you must amend it as needed. 

Please include this amended Role of Funder statement in your cover letter; we will change the online submission form on your behalf.

4. We note that you have indicated that there are restrictions to data sharing for this study. PLOS only allows data to be available upon request if there are legal or ethical restrictions on sharing data publicly. For more information on unacceptable data access restrictions, please see http://journals.plos.org/plosone/s/data-availability#loc-unacceptable-data-access-restrictions. 

Before we proceed with your manuscript, please address the following prompts:

a) If there are ethical or legal restrictions on sharing a de-identified data set, please explain them in detail (e.g., data contain potentially identifying or sensitive patient information, data are owned by a third-party organization, etc.) and who has imposed them (e.g., a Research Ethics Committee or Institutional Review Board, etc.). Please also provide contact information for a data access committee, ethics committee, or other institutional body to which data requests may be sent.

b) If there are no restrictions, please upload the minimal anonymized data set necessary to replicate your study findings to a stable, public repository and provide us with the relevant URLs, DOIs, or accession numbers. For a list of recommended repositories, please see

https://journals.plos.org/plosone/s/recommended-repositories. You also have the option of uploading the data as Supporting Information files, but we would recommend depositing data directly to a data repository if possible.

We will update your Data Availability statement on your behalf to reflect the information you provide.

5. Please include your full ethics statement in the ‘Methods’ section of your manuscript file. In your statement, please include the full name of the IRB or ethics committee who approved or waived your study, as well as whether or not you obtained informed written or verbal consent. If consent was waived for your study, please include this information in your statement as well. 


Reviewers' comments:

Reviewer's Responses to Questions

Comments to the Author

1. Is the manuscript technically sound, and do the data support the conclusions?

The manuscript must describe a technically sound piece of scientific research with data that supports the conclusions. Experiments must have been conducted rigorously, with appropriate controls, replication, and sample sizes. The conclusions must be drawn appropriately based on the data presented.

Reviewer #1: Partly

Reviewer #2: Partly

**********

2. Has the statistical analysis been performed appropriately and rigorously?

Reviewer #1: N/A

Reviewer #2: N/A

**********

3. Have the authors made all data underlying the findings in their manuscript fully available?

The PLOS Data policy requires authors to make all data underlying the findings described in their manuscript fully available without restriction, with rare exception (please refer to the Data Availability Statement in the manuscript PDF file). The data should be provided as part of the manuscript or its supporting information, or deposited to a public repository. For example, in addition to summary statistics, the data points behind means, medians and variance measures should be available. If there are restrictions on publicly sharing data—e.g. participant privacy or use of data from a third party—those must be specified.

Reviewer #1: No

Reviewer #2: No

**********

4. Is the manuscript presented in an intelligible fashion and written in standard English?

PLOS ONE does not copyedit accepted manuscripts, so the language in submitted articles must be clear, correct, and unambiguous. Any typographical or grammatical errors should be corrected at revision, so please note any specific errors here.

Reviewer #1: No

Reviewer #2: Yes

**********

5. Review Comments to the Author

Please use the space provided to explain your answers to the questions above. You may also include additional comments for the author, including concerns about dual publication, research ethics, or publication ethics. (Please upload your review as an attachment if it exceeds 20,000 characters)

Reviewer #1: Thank you for the opportunity to read the article entitled ‘Navigating persuasive strategies in online health misinformation: An interview study with older adults’. The manuscript describes a qualitative interview study about how older adults identify and cope with online health-related misinformation. The manuscript addresses an important topic from a qualitative angle, and provides rich data and potentially in-depth insights into how older people attend to and cope with online misinformation. It is clear from the manuscript that the authors have put a lot of effort into the data collection, data analysis and report of the study. I find the topic interesting and important and believe that the richness of the qualitative data can add something to the literature on online health-related misinformation attention, awareness and effects. At the same time I have some questions and concerns which I will mention point by point below, and I hope that they are helpful to the authors in the strengthening of their manuscript.

Major points

- There is a lot of information in this manuscript – which is common for a qualitative study. Therefore, a clear structure is vital to guide the reader through the manuscript and convey the focus of the study. The easiest way to do this, in my view, is to formulate clear aims and research questions and use those to structure the manuscript. The current manuscript formulates research questions and aims, but their relation to how the rest of the manuscript is structured (e.g., the theory section and results section) was not always clear to me. As a result, I often felt lost while reading as so many topics are discussed.

- The title mentions ‘persuasive strategies’ which is a central theme to the study. These strategies have been identified in a prior literature review conducted by the authors. I find this classification interesting, but unfortunately due to anonymity I cannot check the article they refer to. The twelve strategies are central to the first part of the literature review. In my view, the 12 strategies would better fit the method section (e.g., in the form of a text box) as they are clearly part of the operationalization of the study. I assume that the strategies have been used to develop the messages that the participants were presented with, or at least that the strategies can be identified in the messages. I would expect the literature review to discuss broader themes/categories of misinformation characteristics (in contrast to the individual strategies that were identified in a single study), for example in terms of layout/format, content, and the role of scientific evidence, to place the strategies in a broader context of the literature.

- The literature review could perhaps focus more on older adults in particular and what is known about this group in terms of encountering, attending, and responding to (e.g., reasons to check or share) health-related misinformation online, as well as a discussion of the distinction between misinformation (false information spread unintentionally, or outdated information) and disinformation (intentionally false information).

- I was surprised that some theoretical mechanisms that are important to misinformation effects, such as confirmation bias, are only mentioned in the results section and not in the literature review section (as a reason why people believe misinformation).

- At some points, rather strong conclusions are drawn from this qualitative study, and I think that the authors could be a bit more careful in their formulation. For example, the conclusion on page 21 mentions that “based on the findings media literacy is a solution to mitigate misinformation susceptibility”. As media literacy is not measured or assessed in some way in this study, this should perhaps be formulated more carefully. Another example is the statement on page 208 (misinformation usually incorporates the trope of big pharma being evil). A reference would be helpful here. I also wonder whether this statement refers to misinformation or mainly disinformation.

- Some critical information about the study is missing in the methods section. These are for example, information about how and where the participants of this study were recruited, when participants qualified as ‘older adults’ – what was the minimum age? Did other inclusion/exclusion criteria apply? In which country was the study conducted? I assume it was in the US but would be good to mention. Who conducted the interviews?

- It would be informative to see the articles that were discussed with the participants added as an appendix to the manuscript. This would allow the reader to see how the various strategies were incorporated in each article. Am I correct that all 12 persuasive strategies were incorporated in each article? How was this done? What was the length of the articles? What role did images play (if there were any)? Were they all designed in the form of a Facebook post? (line 203).

- I also wonder whether the authors tested the knowledge of the participants on the topics of the messages prior to the study. I can imagine that people’s prior knowledge might also (or mainly) impact their ability to spot misinformation, perhaps more than the persuasive strategies within the articles.

- There is no limitation section in the article.

Minor points

- Throughout the manuscript, it says “12 groups of persuasive strategies”. This is a bit confusing, because 12 is already a lot and the word ‘groups’ implies that each of the 12 consists of multiple strategies. In the discussion section on page 12 (line 475) the word ‘group’ is left out, which I find more logical and clear.

- It is mentioned that the interviews were part of a larger study and that other parts of the findings are reported elsewhere. For reasons of transparency I would advise the authors to provide references to related studies here.

- The example questions from the interview guide mentioned on page 7/8 are helpful. The authors might consider adding the entire interview guide as an appendix.

- Who did the coding of the interviews? Did the first and second coder code the interviews independently?

Reviewer #2: Thanks for the opportunity to review this manuscript which delves into the perspective of older adults while navigating persuasive online health misinformation. The attention on the specific population subgroup is needed, and the methodological approach is adequate. I believe the manuscript could be improved by addressing the comments listed below.

Title: I would encourage the authors to try finding a more informative title.

Abstract Revision:

It would be beneficial to consider explicitly stating the research gap that the study addresses at the beginning of the abstract, as this would provide readers with a clearer understanding of the study's significance.

Methods Section:

Provide a more detailed explanation of the analysis process, including the coding procedures and criteria.

Clarify the reasons behind the reporting decision by stating that it was made to focus on the most relevant results while ensuring the paper's conciseness.

Include a table summarizing participant characteristics, and expand on population, sampling, and recruitment procedures.

Empirical Study Design:

Offer more information about the design and selection of the misinformation used in the empirical study. Explain how this misinformation was chosen and its relevance to the research questions.

Results Section:

Consider referencing more specific raw data or examples from the study to support the results and enhance their clarity.

Discussion Section:

Add an introductory section to the discussion to provide context and set the stage for the subsequent points.

Elaborate on how the findings relate to the idea of "improving media literacy," discussing its sustainability and feasibility in more depth.

Explore the significance of individuals being able to recognize their emotions, considering its implications for health misinformation discernment.

Incorporate discussions about relevant dual-process theories like the Elaboration Likelihood Model (ELM) to better explain cognitive processing in the context of misinformation.

Delve into the complexities and challenges associated with developing the competences necessary for discerning conflicting health evidence.

Connect the influence of personal relationships as a moderator of misinformation's impact with the respective results to provide a clearer link.

Minor Edits:

Follow the WHO naming conventions by using "COVID-19" instead of "Covid-19" throughout the paper.

**********

6. PLOS authors have the option to publish the peer review history of their article (what does this mean?). If published, this will include your full peer review and any attached files.

If you choose “no”, your identity will remain anonymous but your review may still be made public.

Do you want your identity to be public for this peer review? For information about this choice, including consent withdrawal, please see our Privacy Policy.

Reviewer #1: No

Reviewer #2: No

**********

[NOTE: If reviewer comments were submitted as an attachment file, they will be attached to this email and accessible via the submission site. Please log into your account, locate the manuscript record, and check for the action link "View Attachments". If this link does not appear, there are no attachment files.]

While revising your submission, please upload your figure files to the Preflight Analysis and Conversion Engine (PACE) digital diagnostic tool, https://pacev2.apexcovantage.com/. PACE helps ensure that figures meet PLOS requirements. To use PACE, you must first register as a user. Registration is free. Then, login and navigate to the UPLOAD tab, where you will find detailed instructions on how to use the tool. If you encounter any issues or have any questions when using PACE, please email PLOS at figures@plos.org. Please note that Supporting Information files do not need this step.

PLoS One. 2024 Jul 25;19(7):e0307771. doi: 10.1371/journal.pone.0307771.r002

Author response to Decision Letter 0


10 Mar 2024

The response letter to reviewers is attached in the submission.

Attachment

Submitted filename: Response to Reviewers .docx


Decision Letter 1

Nicola Diviani

28 May 2024

PONE-D-23-40641R1
Navigating persuasive strategies in online health misinformation: An interview study with older adults on misinformation management
PLOS ONE

Dear Dr. Peng,

Thank you for submitting your manuscript to PLOS ONE. After careful consideration, we feel that it has merit but does not fully meet PLOS ONE’s publication criteria as it currently stands. Therefore, we invite you to submit a revised version of the manuscript that addresses the points raised during the review process.

==============================

Dear Authors,

Thank you for your detailed revision of the manuscript. The paper has significantly improved as a result. However, one of the original reviewers (Reviewer #2) has a few outstanding concerns that need to be addressed before we can proceed with publication. I look forward to receiving a revised version that addresses these issues.

Best regards,

Nicola Diviani

==============================

Please submit your revised manuscript by Jul 12 2024 11:59PM. If you will need more time than this to complete your revisions, please reply to this message or contact the journal office at plosone@plos.org. When you're ready to submit your revision, log on to https://www.editorialmanager.com/pone/ and select the 'Submissions Needing Revision' folder to locate your manuscript file.

Please include the following items when submitting your revised manuscript:

  • A rebuttal letter that responds to each point raised by the academic editor and reviewer(s). You should upload this letter as a separate file labeled 'Response to Reviewers'.

  • A marked-up copy of your manuscript that highlights changes made to the original version. You should upload this as a separate file labeled 'Revised Manuscript with Track Changes'.

  • An unmarked version of your revised paper without tracked changes. You should upload this as a separate file labeled 'Manuscript'.

If you would like to make changes to your financial disclosure, please include your updated statement in your cover letter. Guidelines for resubmitting your figure files are available below the reviewer comments at the end of this letter.

If applicable, we recommend that you deposit your laboratory protocols in protocols.io to enhance the reproducibility of your results. Protocols.io assigns your protocol its own identifier (DOI) so that it can be cited independently in the future. For instructions see: https://journals.plos.org/plosone/s/submission-guidelines#loc-laboratory-protocols. Additionally, PLOS ONE offers an option for publishing peer-reviewed Lab Protocol articles, which describe protocols hosted on protocols.io. Read more information on sharing protocols at https://plos.org/protocols?utm_medium=editorial-email&utm_source=authorletters&utm_campaign=protocols.

We look forward to receiving your revised manuscript.

Kind regards,

Nicola Diviani

Academic Editor

PLOS ONE

Journal Requirements:

Please review your reference list to ensure that it is complete and correct. If you have cited papers that have been retracted, please include the rationale for doing so in the manuscript text, or remove these references and replace them with relevant current references. Any changes to the reference list should be mentioned in the rebuttal letter that accompanies your revised manuscript. If you need to cite a retracted article, indicate the article’s retracted status in the References list and also include a citation and full reference for the retraction notice.


Reviewers' comments:

Reviewer's Responses to Questions

Comments to the Author

1. If the authors have adequately addressed your comments raised in a previous round of review and you feel that this manuscript is now acceptable for publication, you may indicate that here to bypass the “Comments to the Author” section, enter your conflict of interest statement in the “Confidential to Editor” section, and submit your "Accept" recommendation.

Reviewer #2: All comments have been addressed

Reviewer #3: All comments have been addressed

**********

2. Is the manuscript technically sound, and do the data support the conclusions?

The manuscript must describe a technically sound piece of scientific research with data that supports the conclusions. Experiments must have been conducted rigorously, with appropriate controls, replication, and sample sizes. The conclusions must be drawn appropriately based on the data presented.

Reviewer #2: Yes

Reviewer #3: Yes

**********

3. Has the statistical analysis been performed appropriately and rigorously?

Reviewer #2: N/A

Reviewer #3: Yes

**********

4. Have the authors made all data underlying the findings in their manuscript fully available?

The PLOS Data policy requires authors to make all data underlying the findings described in their manuscript fully available without restriction, with rare exception (please refer to the Data Availability Statement in the manuscript PDF file). The data should be provided as part of the manuscript or its supporting information, or deposited to a public repository. For example, in addition to summary statistics, the data points behind means, medians and variance measures should be available. If there are restrictions on publicly sharing data—e.g. participant privacy or use of data from a third party—those must be specified.

Reviewer #2: Yes

Reviewer #3: Yes

**********

5. Is the manuscript presented in an intelligible fashion and written in standard English?

PLOS ONE does not copyedit accepted manuscripts, so the language in submitted articles must be clear, correct, and unambiguous. Any typographical or grammatical errors should be corrected at revision, so please note any specific errors here.

Reviewer #2: Yes

Reviewer #3: Yes

**********

6. Review Comments to the Author

Please use the space provided to explain your answers to the questions above. You may also include additional comments for the author, including concerns about dual publication, research ethics, or publication ethics. (Please upload your review as an attachment if it exceeds 20,000 characters)

Reviewer #2: The manuscript in its present form has improved and most of the comments from reviewers have been adequately addressed.

However, I still believe that even if the introduction section is quite extensive, the reasons justifying the interest in the specific subpopulation (older adults) are not adequately presented. I would suggest dedicating a section to this and also reflecting specific considerations in the discussion.

I noticed a typo on page 2 line 60.

Reviewer #3: (No Response)

**********

7. PLOS authors have the option to publish the peer review history of their article (what does this mean?). If published, this will include your full peer review and any attached files.

If you choose “no”, your identity will remain anonymous but your review may still be made public.

Do you want your identity to be public for this peer review? For information about this choice, including consent withdrawal, please see our Privacy Policy.

Reviewer #2: No

Reviewer #3: No

**********

[NOTE: If reviewer comments were submitted as an attachment file, they will be attached to this email and accessible via the submission site. Please log into your account, locate the manuscript record, and check for the action link "View Attachments". If this link does not appear, there are no attachment files.]

While revising your submission, please upload your figure files to the Preflight Analysis and Conversion Engine (PACE) digital diagnostic tool, https://pacev2.apexcovantage.com/. PACE helps ensure that figures meet PLOS requirements. To use PACE, you must first register as a user. Registration is free. Then, login and navigate to the UPLOAD tab, where you will find detailed instructions on how to use the tool. If you encounter any issues or have any questions when using PACE, please email PLOS at figures@plos.org. Please note that Supporting Information files do not need this step.

PLoS One. 2024 Jul 25;19(7):e0307771. doi: 10.1371/journal.pone.0307771.r004

Author response to Decision Letter 1


6 Jul 2024

Response: We have now dedicated a section in the introduction to justifying the focus on this specific subpopulation of older adults (the third paragraph of the Introduction).

We have also added specific considerations of older adults to the discussion section (p. 28 and p. 30).

We also carefully proofread the manuscript for grammar and typos. Thank you!

Attachment

Submitted filename: 2nd Response to reviewer.docx


Decision Letter 2

Nicola Diviani

11 Jul 2024

Navigating persuasive strategies in online health misinformation: An interview study with older adults on misinformation management

PONE-D-23-40641R2

Dear Dr. Peng,

We’re pleased to inform you that your manuscript has been judged scientifically suitable for publication and will be formally accepted for publication once it meets all outstanding technical requirements.

Within one week, you’ll receive an e-mail detailing the required amendments. When these have been addressed, you’ll receive a formal acceptance letter and your manuscript will be scheduled for publication.

An invoice will be generated when your article is formally accepted. Please note, if your institution has a publishing partnership with PLOS and your article meets the relevant criteria, all or part of your publication costs will be covered. Please make sure your user information is up-to-date by logging into Editorial Manager at Editorial Manager® and clicking the ‘Update My Information' link at the top of the page. If you have any questions relating to publication charges, please contact our Author Billing department directly at authorbilling@plos.org.

If your institution or institutions have a press office, please notify them about your upcoming paper to help maximize its impact. If they’ll be preparing press materials, please inform our press team as soon as possible -- no later than 48 hours after receiving the formal acceptance. Your manuscript will remain under strict press embargo until 2 pm Eastern Time on the date of publication. For more information, please contact onepress@plos.org.

Kind regards,

Nicola Diviani

Academic Editor

PLOS ONE


Acceptance letter

Nicola Diviani

15 Jul 2024

PONE-D-23-40641R2

PLOS ONE

Dear Dr. Peng,

I'm pleased to inform you that your manuscript has been deemed suitable for publication in PLOS ONE. Congratulations! Your manuscript is now being handed over to our production team.

At this stage, our production department will prepare your paper for publication. This includes ensuring the following:

* All references, tables, and figures are properly cited

* All relevant supporting information is included in the manuscript submission

* There are no issues that prevent the paper from being properly typeset

If revisions are needed, the production department will contact you directly to resolve them. If no revisions are needed, you will receive an email when the publication date has been set. At this time, we do not offer pre-publication proofs to authors during production of the accepted work. Please keep in mind that we are working through a large volume of accepted articles, so please give us a few weeks to review your paper and let you know the next and final steps.

Lastly, if your institution or institutions have a press office, please let them know about your upcoming paper now to help maximize its impact. If they'll be preparing press materials, please inform our press team within the next 48 hours. Your manuscript will remain under strict press embargo until 2 pm Eastern Time on the date of publication. For more information, please contact onepress@plos.org.

If we can help with anything else, please email us at customercare@plos.org.

Thank you for submitting your work to PLOS ONE and supporting open access.

Kind regards,

PLOS ONE Editorial Office Staff

on behalf of

Dr. Nicola Diviani

Academic Editor

PLOS ONE
