Editorial
Acta Psychiatrica Scandinavica. 2021 Apr 16;143(5):377‐379. doi: 10.1111/acps.13290

Managing the infodemic about COVID‐19: Strategies for clinicians and researchers

Jan Scott
PMCID: PMC8251356; PMID: 33861872

‘We’re not just fighting an epidemic; we’re fighting an infodemic.’ Tedros Adhanom Ghebreyesus, Director‐General of the World Health Organization, Munich Security Conference, 15 February 2020.

The exponential increase in demand for, and dissemination of, information about COVID‐19 means the pandemic has been accompanied by an ‘infodemic’. 1 This overabundance of accurate and inaccurate information is not limited to scientific or policy publications but threatens to overwhelm news and social media outlets. 1, 2 As Eysenbach 1 notes, the price of freedom of speech and expanding information technology has been the unfiltered, uncontrolled, rapid and widespread broadcasting of rumours, misinformation and disinformation. Furthermore, the echo chamber effect of social media means the public have, willingly or unwillingly, generated, amplified and proliferated potentially harmful myths that can contribute to poor decision‐making about health‐related behaviours. 1, 2 The World Health Organization has responded by publishing a detailed international report on managing the infodemic, and similar publications highlight the key role that social media companies can, and arguably should, play in limiting the spread or legitimization of misinformation (eg by presenting it ‘for balance’) and in flagging disinformation. 2 Considerably fewer publications summarize the role that individuals or communities might play. 3, 4 This article briefly reviews strategies for counteracting misinformation that clinicians and researchers might usefully employ. As with ongoing research on the pandemic itself, not every suggestion is an established fact: some represent consensus about so‐called ‘BETs’ (best evidence at the time) and/or are the subject of ongoing research. 1, 2, 3, 4, 5

The core dimensions of rumour, misinformation and disinformation are the degree of ‘facticity’ (ie the accuracy of the statement), the intention of the author or source, and the role of the audience. 5 For instance, misinformation may arise when a misleading statement is unintentionally spread by an individual to other members of their social network (some of whom may not recognize it as erroneous). In contrast, disinformation is typically a deliberately fabricated story disseminated by individuals or groups trying to discredit a scientific paper, research collaboration or government policy. Statements within misinformation and disinformation are characterized by selective amplification of some facts and downplaying of others (eg exaggerating negative effects of vaccination while understating the clinical severity of COVID‐19). 1, 5, 6 Sharing is facilitated if the statement is congruent with the worldview of the audience, because the message validates individual preconceptions. 5, 6 Emotiveness in the source material (positive or negative) can enhance memorability, and repetition of the communication strengthens belief in the statements. 5, 6 Reliance on misinformation differs from ignorance about a topic, in that the latter indicates an absence of relevant knowledge (and missing information is probably less hazardous than misinformation). 5, 6

Individuals incorporate misleading statements into a mental model that combines new (mis)information with pre‐existing assumptions and beliefs, thus creating an integrated scaffolding of ideas. Modifying this requires reconstruction of the mental model, so labelling the statement as inaccurate is a starting point but is insufficient on its own. It is necessary to draw on some or all of the following: debunking, empowering individuals to evaluate information, pre‐bunking, infoveillance, and ‘nudge’ techniques. 1, 5, 6, 7, 8 Sociologically, debunking aims to disprove what is commonly thought to be ‘reality’ (eg the individual's model or understanding of an element of the pandemic) by helping unmask the factual truth. 6 So, after labelling the misinformation as a myth, optimal debunking involves a detailed rebuttal: incorrect elements of the statement must be dissected, gaps in logic highlighted, and detailed explanations given as to why the statement is wrong. 8 The erroneous elements within the framework of the current mental model need to be replaced. 6, 7, 8 It is critically important to provide sufficient detail to prevent causal gaps in the new, accurate model; otherwise, the original mental model will prevail. 4, 5 To make the new model memorable, it must engage the audience, and simpler models are cognitively more attractive than complex ones (the latter may backfire and again result in persistence of the inaccurate model). Models that are coherent and straightforward are more likely to be shared. Tone and style of debunking are important: corrections and rebuttals delivered empathically are perceived as helpful and are more likely to be effective. 4, 5, 6 Aggression is counterproductive, as the messenger is regarded as less credible or trustworthy. Likewise, ridiculing the audience does not change minds and often builds resistance. It is important to consider who the target audience is and how best to engage with them. 3, 4, 6 For example, a ‘Zoom Q&A’ session may allow time to encourage individuals to scrutinize misinformation and discover inaccuracies for themselves, and for the group to develop plausible counterarguments. Two other elements of debunking are essential. First, the facts must be the memorable hook in a message, so the original myth should not feature in the headline. 1, 5 Second, it is important to minimize repeating or spreading the myth during any communications. 8 If you must repeat some or all of the misinformation, warn the audience before mentioning it (this strategy puts them cognitively on their guard). 6 Some evidence suggests that the use of narrative can be a valuable debunking strategy. 9 Many clinicians and researchers are reluctant to use anecdotes or case histories, but studies indicate that narratives can be helpful in delivering messages, particularly with older adults.

Debunking is associated with immediate gains and a diminution in the strength of belief in the original mental model. 6 However, the gains quickly fade, so repeated, novel and varied presentations of the preferred causal model are required. Debunking efforts are also undermined by delays between the misinformation becoming pervasive and the rebuttal being delivered, and by congruence between the misleading ideas and the preconceptions of the audience. 1 It is unrealistic to expect an individual clinician or researcher to convince committed anti‐vaxxers to adopt the most accurate or factual model. 3, 4, 6 However, individuals who express healthy scepticism or are ‘vaccine hesitant’ often find debunking helpful, especially if it is delivered by an expert who comes across as honest and authentic and if the sceptic can check the source material for themselves. Lastly, debunking may be more acceptable to individuals if provided alongside general guidance on spotting misinformation or fake news.

Empowering individuals to discriminate facts from myths can take several forms, including online games such as Go Viral (https://www.goviralgame.com/en), which offer a gentle and entertaining introduction to developing skills for spotting fake news. 10 Other programmes, such as those targeted at educational settings, help individuals work through a checklist that involves reviewing the credibility of the source (including the URL of the website), fact‐checking (including via websites dedicated to this purpose), considering whether the claim is reported by multiple trusted mainstream sources, speculating on the motivation of the source (and who benefits), through to checking research references and the accuracy of source materials. 2 This ‘filtering, vetting and verifying’ approach may have long‐term benefits, but it is heavily reliant on an individual's desire to examine the reliability of information. Fact‐checking is effortful, so most individuals only do it if the misinformation violates their own preconceptions. 11 Myths that are concordant with other beliefs and ideas are frequently accepted without challenge. 5

Pre‐bunking is based on ‘inoculation theory’ and represents the ideal approach, as it enables warnings about the reliability and authority of information to be given at the time of exposure. 8 Although pre‐warning, that is, flagging misinformation before it is stated, is better than warnings delivered immediately post‐exposure, the latter still have some effect. 8 The likely rate‐limiting step for pre‐bunking is delivering the corrective message at the right moment. One way to optimize the impact of pre‐bunking is via ‘infoveillance’, that is, monitoring social messaging content to identify when misinformation reaches a tipping point in terms of attention and traction. 1 This moment represents the best opportunity to deliver a detailed, coherent alternative model. Additionally, research demonstrates that individuals rarely consider the accuracy of messages before sharing them. 11 Encouraging individuals to pause, reflect and consider the source of any news before sharing it has been shown to reduce dissemination of fake news and increase the likelihood of sharing accurate information. As a minimum, promoting cognitive reflection can prevent unintentional dissemination of misinformation (even if the first recipient recognizes that a statement is untrue, there is a risk they forward it to someone who accepts it as factually accurate).
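To make the infoveillance idea above concrete, the following is a minimal, hypothetical sketch of tipping‐point detection: it counts daily mentions of a tracked claim and flags the first day the volume jumps well above its recent baseline. Neither this editorial nor the cited literature prescribes an algorithm; the function name, window size, growth threshold and data below are invented purely for illustration.

```python
# Hypothetical infoveillance sketch: flag when mentions of a tracked claim
# suddenly exceed their trailing-window baseline (an assumed, illustrative
# definition of a "tipping point"; not drawn from the cited literature).
from collections import deque

def tipping_point_days(daily_mentions, window=7, growth_threshold=2.0):
    """Yield the index of each day on which mention volume is at least
    `growth_threshold` times the average of the preceding `window` days."""
    history = deque(maxlen=window)
    for day, count in enumerate(daily_mentions):
        if len(history) == window:
            baseline = sum(history) / window
            if baseline > 0 and count / baseline >= growth_threshold:
                yield day
        history.append(count)

# Example: a myth simmers at low volume, then suddenly gains traction.
mentions = [3, 4, 2, 5, 3, 4, 3, 4, 5, 18, 40, 90]
for day in tipping_point_days(mentions):
    print(f"Day {day}: possible tipping point; deliver the rebuttal now")
    break  # act on the first alert
```

Under this (assumed) operationalization, the moment such an alert fires would be the window in which a detailed, coherent alternative model is most likely to gain traction.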

If misinformation is compatible with other beliefs or ideas held by an individual, then the above strategies may have restricted impact. 6 Other techniques have not been tested in a pandemic, but research on nudge theory suggests that, in different scenarios, it is possible to modify specific behaviours in large segments of the general population without directly challenging the underlying beliefs of some of those people. 7 The underlying principle of ‘nudging’ is that, rather than trying to force individuals to adopt a particular behaviour when they are unwilling to do so, the social context or environment is changed in such a way that it becomes easier for everyone to follow a preferred pathway or pattern of behaviour. Existing examples of nudges include changing organ donation policy so that the default position is that individuals must opt out rather than opt in. Whilst some argue that nudges may raise ethical concerns or undermine civil liberties, nudges are usually aimed at creating changes in social behaviours in a predictable but non‐mandatory way. An obvious nudge for behaviours related to COVID‐19 is the ‘vaccination passport’ that several countries are considering requiring of tourists in the future. Given that proof of vaccination against other diseases is often required, for example for travel to the Indian subcontinent, it has been argued that this is a potentially helpful way to increase uptake of COVID‐19 vaccinations.

In conclusion, in a time of uncertainty, the widespread use of electronic media makes it hard to keep up with the rapid spread of misinformation. Social media companies are unlikely to resolve this problem in the short term, but individuals, especially those in the health professions, can play a role in counteracting unhelpful myths. Empirical evidence suggests that, even when an audience is motivated to access accurate information, some techniques are more useful than others in ensuring that facts rather than myths are remembered. Debunking can be effective, but only when enacted in a non‐judgemental manner. This technique, like nudges, is reactive, so it is mainly instituted once misinformation is already in circulation. Pre‐bunking and infoveillance are proactive interventions but require systemic responses rather than individual efforts alone. All approaches, including educating the public in how to assess the credibility of messages received during the infodemic, are more likely to be effective if delivered by broader multi‐disciplinary and inter‐professional collaborations at international, national and local levels.

DECLARATION OF INTEREST

None declared.

REFERENCES

1. Eysenbach G. How to fight an infodemic: the four pillars of infodemic management. J Med Internet Res. 2020;22(6):e21820.
2. World Health Organization. An ad hoc WHO technical consultation managing the COVID‐19 infodemic: call for action, 7‐8 April 2020. Geneva: World Health Organization; 2020. Licence: CC BY‐NC‐SA 3.0 IGO.
3. Scheufele D, Krause N. Science audiences, misinformation, and fake news. PNAS. 2019;116(16):7662.
4. Schmid P, Betsch C. Effective strategies for rebutting science denialism in public discussions. Nat Hum Behav. 2019;3(9):931‐939.
5. Lewandowsky S, Ecker UKH, Seifert CM, Schwarz N, Cook J. Misinformation and its correction: continued influence and successful debiasing. Psychol Sci Public Interest. 2012;13(3):106‐131.
6. Paynter J, Luskin‐Saxby S, Keen D, et al. Evaluation of a template for countering misinformation: real‐world autism treatment myth debunking. PLoS One. 2019;14(1):e0210746.
7. Pennycook G, McPhetres J, Zhang Y, Lu JG, Rand D. Fighting COVID‐19 misinformation on social media: experimental evidence for a scalable accuracy nudge intervention. Psychol Sci. 2020;31(7):770‐780.
8. van der Linden S, Roozenbeek J, Compton J. Inoculating against fake news about COVID‐19. Front Psychol. 2020;11:566790.
9. Dahlstrom M. Using narratives and storytelling to communicate science with nonexpert audiences. PNAS. 2014;111(4):13614.
10. Basol M, Roozenbeek J, van der Linden S. Good news about bad news: gamified inoculation boosts confidence and cognitive immunity against fake news. J Cogn. 2020;3(1):2.
11. Greenspan R, Loftus E. Pandemics and infodemics: research on the effects of misinformation on memory. Hum Behav Emerg Technol. 2021;3(1):8‐12.
