Abstract
Although the use of social media to spread misinformation and disinformation is not a new concept, the severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2) pandemic has further highlighted the dangers that misinformation can pose to public health. More than two-thirds of Americans receive their news from at least 1 social media outlet, most of which do not undergo the same review process as academic journals and some professional news organizations. Unfortunately, this can lead to inaccurate health information being conveyed as truth. The purpose of this article is to inform the infectious diseases community of the history and dangers of health misinformation and disinformation in social media, present tools for identifying and responding to misinformation, and propose other ethical considerations for social media.
Keywords: misinformation, social media, infectious disease
Social media has changed the way that people buy and sell goods, communicate with friends and strangers, and receive and digest information [1, 2]. It has the power to shape human behavior and, consequently, health. Despite its many advantages and ubiquitous use, the proliferation of social media has also led to unintended consequences. As the coronavirus disease 2019 (COVID-19) pandemic has illustrated, misinformation and, in some instances, disinformation surrounding public health and communicable diseases have spread over social media channels to reach a large and diverse audience [3, 4]. Part of the challenge in considering false information on social media lies in naming the problem. For the purposes of this commentary, misinformation refers to misleading or sometimes false statements that run contrary to the epistemic consensus of the scientific community [5]. Disinformation, on the other hand, can be defined as the deliberate spread of false information for secondary gain, be it financial, political, or both [5]. Within these categories lies a wide spectrum of conflicting or misleading information. Both are of concern in the era of social media as it relates to the dissemination of infectious disease–related knowledge. In this viewpoint, we discuss the disadvantages associated with social media and suggest strategies for combating them.
HISTORY OF HEALTH INFORMATION ON SOCIAL MEDIA
False information has been propagated throughout human history. Contemporary models of contagion and epidemiological studies of misinformation suggest that false information diffuses faster and farther than true information, particularly in the context of social media [6]. Although the underpinnings of misinformation diffusion remain complex, the concept is not new. The term “fake news,” for example, was lamented as early as 1925 in a Harper’s Magazine article that discussed the role of the news media in disseminating misinformation to the public [2, 7]. The 1998 study published in The Lancet linking autism with the measles, mumps, and rubella (MMR) vaccine is perhaps one of the most infamous contemporary examples of misinformation that, despite its eventual retraction, continues to be associated with vaccine hesitancy [5, 8]. Since the advent of the internet, however, misinformation, and disinformation in particular, has propagated more rapidly and with greater ease across nearly every discipline and on a variety of subjects [9].
Health-related information has been particularly vulnerable. Although the internet offers a diverse array of modalities for searching for health information (eg, Google, online news, YouTube, Twitter), it is clear that social media has propelled the spread of misinformation and disinformation by allowing content to be shared among users more easily. A systematic review examining health-related misinformation on social media and the way in which it disseminates found that the number of studies focusing on health-related misinformation increased from 2012 to 2018, with a particular rise in and after 2017, postulated to reflect concomitant political events at that time [2]. Most studies were related to communicable diseases and vaccination; misconceptions linking the MMR vaccine with autism were prevalent even in the early days of social media, and vaccine-related misinformation was relatively common overall. Emerging infectious diseases such as Zika and Ebola viruses were also notable in that misleading information was often propagated in settings where experts or public health officials expressed uncertainty as those outbreaks were unfolding [2].
Health-related disinformation is another phenomenon that has developed rapidly on social media, although its origins are less well understood. One early study of Twitter concluded that approximately 53% of users were human; the rest were classified as cyborgs or bots, automated accounts created on social media platforms to imitate human activity [10]. Trolls, or users who misrepresent their identities to promote discord, have also proliferated on social media since its advent [11]. Early studies suggest that these entities may have deliberately and negatively impacted online discourse about vaccination for political or financial gain [11].
The COVID-19 pandemic is the latest public health issue to have been affected by misinformation and disinformation and among the first to have occurred in the era of widespread social media use. This in part has led the World Health Organization (WHO) to label the rapid spread of false or misleading information during public health crises an “infodemic” [4]. Although the term infodemiology, or the study of health information and misinformation, was initially coined in 2002, it later found traction in the context of COVID-19 through its designation by the WHO [12].
DANGERS OF SOCIAL MEDIA AND MISINFORMATION RELATED TO HEALTH
Historically, mass media, governmental bodies, and other leading organizations produced or disseminated information for public consumption. However, decreasing trust in the institutions and public health or political figures that typically produce and deliver health information, coupled with the increasing availability of social media outlets, has provided fertile ground for misinformation to develop [13].
In the context of social media, the general public now has the option to choose what to read and can contribute to this content through real-time commentary. As a result, social media has empowered those who previously may not have had the opportunity to engage with their peers on a public stage [1]. Although this power has been transformative by providing a platform for traditionally underrepresented voices, in some settings it has also contributed to the erosion of traditional health communication strategies.
The Pew Research Center reports that 7 in 10 Americans use social media to connect with others, engage with news content, and share information [14]. This level of utilization, and the dissemination of unvetted sources it enables, may render more of the population susceptible to misinformation. Through widespread use of social media, misinformation may subsequently spread at a global level. For example, one recent study identified more than 2000 COVID-19–related rumors, instances of stigma, and conspiracy theories in 25 languages from 87 countries [3]. The authors concluded that 82% of these reports were false, highlighting the need for interventions aimed at combating misinformation. More recently, a Kaiser Family Foundation COVID-19 Vaccine Monitor report demonstrated that approximately 78% of Americans believe, or are unsure about, at least one inaccurate statement about COVID-19 vaccines or the pandemic, suggesting the ubiquitous nature and integration of misinformation into our daily lives [15].
Misinformation and disinformation may be amplified by personal, negative, and opinionated voices that induce fear, anxiety, and mistrust in credible institutions [2]. This is of particular concern when the misinformation relates to health and health-related outcomes, where it has the potential for far-reaching implications. For example, vaccine misinformation and disinformation have contributed to increased vaccine hesitancy worldwide, resulting in measles outbreaks in the United States, the Philippines, Ukraine, Venezuela, Brazil, Italy, France, and Japan and in the resurgence of other vaccine-preventable diseases [16]. The consequences of misinformation have also propagated through social media during the COVID-19 pandemic, leading to hospitalizations from self-medication with inappropriate and potentially toxic regimens, such as ivermectin formulations intended for livestock, and to global panic-buying driven by rumors of complete shutdowns [17]. Such misinformation affects not only health-related outcomes but also the economy, as supply hoarding has resulted in price inflation and extreme shortages of essential goods [3].
TOOLS USED TO PROPAGATE NEGATIVE NARRATIVES
A variety of tools are used to misrepresent medical information and portray inappropriate health narratives to the public.
Misinformation proponents take advantage of the overwhelming volume of health-related articles available on the internet. This inexhaustible resource inevitably leads to information overload of both accurate and inaccurate messaging [4]. Many available articles were not written or reviewed by field-specific experts and therefore may contain errors introduced either knowingly or unintentionally. The average consumer is thus strained by excess information and may be unsure of which websites or authors to trust. This uncertainty is further fueled by general anxiety in the setting of a pandemic, making it difficult to distill pertinent and factual information [18]. To overcome this uncertainty, individuals may rely on friends and family to determine their sources of news and health information. However, because people tend to interact with like-minded individuals, this may lead to confirmation bias and an inability to change thoughts or behaviors even when presented with a well-reasoned, evidence-based argument [19]. Moreover, social media feeds themselves are often curated to support an individual’s beliefs based on prior online activity. These “information silos” create closed networks that, in turn, allow misinformation to circulate more easily [20]. Many social media platforms inherently support the formation of “echo chambers,” environments in which a user’s existing opinions are reinforced through interactions with like-minded sources or individuals [21]. Repeated, selective exposure to information adhering to a specific worldview may in turn contribute to polarization and, in some cases, allow misinformation to proliferate.
In some situations, misinformation propagators design posts that confirm their desired narrative, focusing on visibility and engagement rather than on medical or public health accuracy. These stories may include personal anecdotes, unproven media reports, or scientific preprints not yet peer reviewed by experts. Because many social media platforms’ algorithms promote content according to its engagement (eg, likes, reposts, comments, clicks on hyperlinks) rather than its chronological order of posting, such posts are often written to induce a strong, frequently negative, emotional reaction in the reader, thereby increasing the likelihood of social interaction [2, 5]. Additional engagement occurs when these posts are promoted by individuals with large social media followings, or “influencers,” who can include celebrities and political leaders.
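To make this mechanism concrete, the sketch below contrasts engagement-based ranking with simple chronological ordering. It is an illustrative toy model, not the algorithm of any particular platform; the post attributes, interaction weights, and account names are hypothetical.

```python
from dataclasses import dataclass
from datetime import datetime, timezone

@dataclass
class Post:
    author: str
    posted_at: datetime
    likes: int
    reposts: int
    comments: int
    link_clicks: int

def engagement_score(post: Post) -> float:
    """Score a post purely by user interactions, the signal many feeds optimize for."""
    return post.likes + 2 * post.reposts + 1.5 * post.comments + post.link_clicks

def rank_feed(posts: list[Post], by_engagement: bool = True) -> list[Post]:
    """Order a feed by engagement (a common default) or by recency."""
    if by_engagement:
        return sorted(posts, key=engagement_score, reverse=True)
    return sorted(posts, key=lambda p: p.posted_at, reverse=True)

# Hypothetical feed: an emotionally charged anecdote with many reposts
# versus a newer, carefully sourced public health update.
posts = [
    Post("local_health_dept", datetime(2021, 9, 2, tzinfo=timezone.utc), 40, 5, 3, 60),
    Post("anonymous_account", datetime(2021, 9, 1, tzinfo=timezone.utc), 900, 400, 250, 1200),
]
print([p.author for p in rank_feed(posts, by_engagement=True)])   # anecdote ranks first
print([p.author for p in rank_feed(posts, by_engagement=False)])  # newest ranks first
```

Under the engagement-only ordering, the heavily reposted anecdote outranks the newer, sourced update, which is the dynamic described above.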
An additional tool used to push health disinformation, particularly on social media, is the bot. These artificial accounts propagate stories that align with a desired narrative in order to increase engagement or to serve political or financial purposes [10]. They can also attack legitimate accounts that express a differing viewpoint in an effort to diminish the perceived legitimacy of the author and discourage future posting. This has become progressively apparent during the current pandemic: one report estimates that up to 66% of current bots focus on COVID-19 [22].
FRAMEWORK FOR IDENTIFYING AND RESPONDING TO MISINFORMATION
Understanding the common tools used to promote medical misinformation allows for more targeted and effective techniques to combat its spread. With this in mind, we propose 3 distinct levels at which misinformation and disinformation on social media must be combated: social media platforms, trusted institutions, and individuals (Table 1).
Table 1.
| Type of Misinformation | Strategy to Address Misinformation |
|---|---|
| Misinformation spread through social media feeds and “echo chambers” | • Modification of machine learning algorithms to take into account other factors, such as the source and accuracy of information, rather than promoting posts solely on the basis of user engagement |
| Challenges with misinformation identification | • Social media platforms to flag and remove misinformation • Acknowledge the impact of the evolution of science on misinformation • Initiatives supporting misinformation identification in languages apart from English |
| Unclear validity of information, mixed messaging | • Consistently disseminate reliable information from trusted institutions and community leaders in near real time • Promotion of domain experts • Respectful peer review of social media content • Careful consideration of information source and content prior to sharing on social media platforms |
First, social media platforms should play a role in addressing health misinformation and disinformation by modifying their automated machine learning algorithms. As previously discussed, content with the most views or shares is pushed to the top of users’ feeds, promoting the facile promulgation of misinformation [23]. Although some platforms have added information labels on posts that reference important issues like COVID-19, social media platforms must go further in their efforts. If ranking algorithms also considered how trustworthy a source is, users would be less likely to fall into a trap of false information. Social media platforms can also create editorial teams of experts to track and remove identified misinformation. If users can report misinformation, then these teams may better respond to misleading posts by removing them or publicly labeling them as potential misinformation. Some platforms have already begun these efforts informally; Reddit, for example, allows community members to document and report misinformation [24]. Social media platforms should also work to promote accurate alternative information in settings where misinformation is reported. This type of work should extend to non-English-language domains as well, as misleading information has been reported in multiple languages and countries in the context of COVID-19 and elsewhere [3].
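As a rough illustration of what such a modification could look like, the sketch below blends an engagement score with an estimated source-trust factor. The formula, weights, and trust values are hypothetical, and deriving a trust score in practice (eg, from fact-checker ratings or verified institutional status) is a separate and harder problem.

```python
# Toy extension of engagement-only ranking: discount each post's score by an
# estimated source-trust factor in [0, 1]. These weights and values are
# hypothetical illustrations, not any platform's actual model.

def engagement_score(likes: int, reposts: int, comments: int) -> float:
    """Engagement-only score, as in the earlier ranking sketch."""
    return likes + 2 * reposts + 1.5 * comments

def trust_weighted_score(likes: int, reposts: int, comments: int,
                         source_trust: float) -> float:
    """Blend engagement with an estimate of how reliable the source is."""
    return engagement_score(likes, reposts, comments) * (0.2 + 0.8 * source_trust)

# A highly engaging post from a low-trust source no longer automatically
# outranks a moderately engaging post from a vetted public health account.
viral_rumor = trust_weighted_score(likes=900, reposts=400, comments=250, source_trust=0.1)
agency_update = trust_weighted_score(likes=300, reposts=120, comments=80, source_trust=0.95)
print(f"viral rumor: {viral_rumor:.0f}, agency update: {agency_update:.0f}")
```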
Second, supporting clear and reliable information at the institutional level should be prioritized. World Health Organization member states passed Resolution WHA73.1 in May 2020, recognizing the importance of managing the consequences of the “infodemic” associated with COVID-19 [25]. At the onset of the COVID-19 pandemic, the Infectious Diseases Society of America (IDSA) took an active role in creating a podcast featuring experts sharing their outlook on the spread of the virus. The IDSA has also updated its COVID-19 website frequently so that members always have access to the most up-to-date information reviewed by field experts [26]. Another example of increased institutional responsibility is the series of US Centers for Disease Control and Prevention (CDC) roundtable discussions with content experts created during the pandemic [27]. National and international organizations should continue to provide high-quality, vetted, and real-time information surrounding public health crises, which can then be disseminated by social media users.
Third, individuals should play a more active role against misinformation and disinformation. Social media users, particularly domain experts, should be encouraged to consider each post they consume, research its authors, and note its origin of publication. Infectious diseases health professionals, practitioners, and public health figures who engage on social media should also strive to provide reliable information when available and to offer thoughtful feedback, buttressed by scientific inquiry, to address misinformation on the platform. Another potential approach is to partner with community leaders in the battle against misinformation. Partnerships between infectious diseases or public health professionals and pillars of the community, such as religious leaders or local officials, may be more successful in highlighting misinformation and spreading accurate health-related information to the public. After trust is established at the local level, these relationships have the potential to grow over time.
Finally, apart from sharing accurate information, institutions and individuals should diligently counter and debunk misinformation and disinformation. Although there is concern about the “backfire effect,” the idea that correcting a misperception can cause individuals to become more entrenched in their beliefs, recent data suggest that this phenomenon occurs more rarely than previously thought [28]. Although cognitive biases likely still play a role, varying levels of scientific knowledge may also be associated with subsequent belief, and concise scientific evidence should still be promoted by individuals and institutions to correct misinformation [28]. One example of an effective debunking strategy was a tweet from the official Twitter account of the US Food and Drug Administration (FDA) warning against off-label use of ivermectin to treat COVID-19: “You are not a horse. You are not a cow. Seriously, y’all. Stop it.” The tweet linked to a post entitled “Why You Should Not Use Ivermectin to Treat or Prevent COVID-19” [29].
As has become apparent over the course of the COVID-19 pandemic, misinformation is a complex issue and will likely require a multi-pronged approach to stem its tide.
OTHER ETHICAL CONSIDERATIONS OF SOCIAL MEDIA
Although it is critical to combat misinformation, it also remains important to demonstrate openness to discussion of evolving recommendations as more scientific data are collected and reviewed. The COVID-19 pandemic has been a humbling case study. For example, initial US recommendations, which did not strongly advocate universal face mask use early in the pandemic, later transitioned to recommendations for a variety of masks in various settings. Even then, there was misunderstanding of the types of masks that offer protection to the wearer, as opposed to those intended to decrease transmission of COVID-19 to others [30]. The initial promise of convalescent plasma given early in disease has, after further review of the data, given way to minimal use in a very select patient group [31, 32]. Debates about the origin of the pandemic continue today, with important data unavailable to international teams seeking to determine its origin. Previously anticipated benefits of vaccination in reducing transmission of COVID-19 have been challenged by follow-up studies [33]. This pandemic has demonstrated that humility and a sense of historical perspective are necessary as the medical community continues to understand COVID-19 and other emerging infectious diseases.
Misinformation and disinformation are not new; only the means by which they can be so efficiently propagated are. Transparency and prompt communication of new information are essential to maintaining the credibility of the medical profession, medical societies, and public health authorities. The Office of the US Surgeon General recently released a toolkit for addressing misinformation, noting that misinformation is sometimes shared with the intention of protecting others or of trying to learn more in a state of uncertainty [34]. Respect for the autonomy of the patient and provider remains central to the practice of medicine, as does avoiding harm (“non-maleficence”) to the patient and the community. Although the damage from misinformation and disinformation is significant, a calm, reasoned approach is important when debating the data and ideas of those conveying such misinformation or disinformation and, more importantly, when engaging their followers. Respectful discussion that avoids ad hominem attacks or deference to authorities who have revised recommendations as new data have become available may change the perspectives of followers and others who are observing the discussion.
The Federation of State Medical Boards, followed by the American Board of Family Medicine, the American Board of Internal Medicine, and the American Board of Pediatrics, recently issued a statement noting that “Physicians who generate and spread COVID-19 vaccine misinformation or disinformation are risking disciplinary action by state medical boards, including the suspension or revocation of their medical license...an ethical and professional responsibility to practice medicine in the best interests of their patients and must share information that is factual, scientifically grounded and consensus-driven for the betterment of public health” [35, 36]. This statement requires very thoughtful consideration to avoid hampering open discussion of available data. For example, a commentary published prior to the Omicron-associated surge questioned the need for vaccine boosters, arguing instead for a focus on improving vaccination rates in the general population [37]. One week later, however, an FDA panel voted 16-2 in favor of a booster dose of the Pfizer vaccine for patients over 65 years of age or otherwise at high risk [38]. Would the 2 panel members who voted against the booster be characterized as generating misinformation through debate? As data are presented in this context, what is perceived as misinformation one week may not be so the following week.
Social media is a tool that should remain openly accessible, with careful consideration of how and when to label certain activity as containing misinformation. Acknowledging scientific uncertainty is one way to balance the potential harms of misinformation with the importance of transparency and open scientific debate.
CONCLUSION
Although social media can be a tool for profound social change, it can also serve as a platform for the widespread dissemination of misleading or false information. As the COVID-19 pandemic has demonstrated, both of these outcomes can occur amid a public health crisis. Rapidly disseminating accurate information to the public is an important task in this setting; however, a nuanced approach to presenting scientific uncertainty is necessary to prevent misinterpretation. Partnerships between social media platforms, national and international public health organizations, and domain experts must be forged to collectively combat the widespread dissemination of health misinformation, particularly during a public health crisis. Infectious diseases and public health practitioners in particular should be aware of the potential for misinformation and disinformation on social media as it pertains to communicable diseases and should support the dissemination of robust data in the public sphere.
Notes
Supplement sponsorship. This supplement is supported by the Infectious Diseases Society of America.
Potential conflicts of interest. All of the authors are former or current members of the IDSA Digital Strategy Advisory Board. J. S. has received a CareDx Fellowship grant from the American Society of Transplantation (2020–2022) and a National Institutes of Health–National Institute of Allergy and Infectious Diseases (NIH-NIAID) T32 Training Grant number AI100851 (2018–2021). J. S. also has a pending patent for gene expression-based classifiers for fungal infections. A. D. has received a grant from the International Society for Infectious Diseases. All other authors report no potential conflicts. All authors have submitted the ICMJE Form for Disclosure of Potential Conflicts of Interest. Conflicts that the editors consider relevant to the content of the manuscript have been disclosed.
References
- 1. Harper RA. The social media revolution: exploring the impact on journalism and news media organizations. Inquiries Journal/Student Pulse 2010. Available at: http://www.inquiriesjournal.com/a?id=202. Accessed 16 September 2021.
- 2. Wang Y, McKee M, Torbica A, Stuckler D. Systematic literature review on the spread of health-related misinformation on social media. Soc Sci Med 2019; 240:112552.
- 3. Islam MS, Sarkar T, Hossain S, et al. COVID-19–related infodemic and its impact on public health: a global social media analysis. Am J Trop Med Hyg 2020; 103:1621–9.
- 4. World Health Organization. Infodemic. Available at: https://www.who.int/health-topics/infodemic. Accessed 16 September 2021.
- 5. Swire-Thompson B, Lazer D. Public health and online misinformation: challenges and recommendations. Annu Rev Public Health 2020; 41:433–51.
- 6. Vosoughi S, Roy D, Aral S. The spread of true and false news online. Science 2018; 359:1146–51.
- 7. McKernon E. Fake news and the public. Harper’s Magazine 1925. Available at: https://harpers.org/archive/1925/10/fake-news-and-the-public/. Accessed 16 September 2021.
- 8. Wakefield AJ, Murch SH, Anthony A, et al. Retracted: Ileal-lymphoid-nodular hyperplasia, non-specific colitis, and pervasive developmental disorder in children. Lancet 1998; 351:637–41.
- 9. World Economic Forum. Global risks 2013, eighth edition. Available at: http://wef.ch/GJKqei. Accessed 16 September 2021.
- 10. Chu Z, Gianvecchio S, Wang H, Jajodia S. Detecting automation of Twitter accounts: are you a human, bot, or cyborg? IEEE Trans Dependable Secure Comput 2012; 9:811–24.
- 11. Broniatowski DA, Jamison AM, Qi S, et al. Weaponized health communication: Twitter bots and Russian trolls amplify the vaccine debate. Am J Public Health 2018; 108:1378–84.
- 12. Zielinski C. Infodemics and infodemiology: a short history, a long future. Rev Panam Salud Publica 2021; 45:e40. doi:10.26633/RPSP.2021.40.
- 13. Pew Research Center. Key findings about Americans’ declining trust in government and each other. 2019. Available at: https://www.pewresearch.org/fact-tank/2019/07/22/key-findings-about-americans-declining-trust-in-government-and-each-other/. Accessed 16 September 2021.
- 14. Pew Research Center. Social media use in 2021. 2021. Available at: https://www.pewresearch.org/internet/wp-content/uploads/sites/9/2021/04/PI_2021.04.07_Social-Media-Use_FINAL.pdf. Accessed 13 September 2021.
- 15. Kaiser Family Foundation. COVID-19 misinformation is ubiquitous: 78% of the public believes or is unsure about at least one false statement, and nearly a third believe at least four of eight false statements tested. 2021. Available at: https://www.kff.org/coronavirus-covid-19/press-release/covid-19-misinformation-is-ubiquitous-78-of-the-public-believes-or-is-unsure-about-at-least-one-false-statement-and-nearly-at-third-believe-at-least-four-of-eight-false-statements-tested/. Accessed 10 November 2021.
- 16. Benecke O, DeYoung SE. Anti-vaccine decision-making and measles resurgence in the United States. Glob Pediatr Health 2019; 6:2333794X19862949.
- 17. US Food and Drug Administration. Consumer updates: why you should not use ivermectin to treat or prevent COVID-19. 2021. Available at: https://www.fda.gov/consumers/consumer-updates/why-you-should-not-use-ivermectin-treat-or-prevent-covid-19. Accessed 13 September 2021.
- 18. Chen K, Luo Y, Hu A, Zhao J, Zhang L. Characteristics of misinformation spreading on social media during the COVID-19 outbreak in China: a descriptive analysis. Risk Manag Healthc Policy 2021; 14:1869–79.
- 19. Alvarez-Galvez J, Suarez-Lledo V, Rojas-Garcia A. Determinants of infodemics during disease outbreaks: a systematic review. Front Public Health 2021; 9:603603.
- 20. Chou W-YS, Oh A, Klein WMP. Addressing health-related misinformation on social media. JAMA 2018; 320:2417–8.
- 21. Cinelli M, De Francisci Morales G, Galeazzi A, Quattrociocchi W, Starnini M. The echo chamber effect on social media. Proc Natl Acad Sci U S A 2021; 118. doi:10.1073/pnas.2023301118.
- 22. Himelein-Wachowiak M, Giorgi S, Devoto A, et al. Bots and misinformation spread on social media: implications for COVID-19. J Med Internet Res 2021; 23:e26933.
- 23. Zimmer F, Scheibe K, Stock M, Stock WG. Fake news in social media: bad algorithms or biased users? J Inf Sci Theory Pract 2019; 7:40–53.
- 24. Tiffany K. The Al Capone approach to anti-vaxxers. The Atlantic. Available at: https://www.theatlantic.com/technology/archive/2021/09/reddits-workaround-vaccine-misinformation/620165/. Accessed 6 January 2022.
- 25. World Health Organization. Managing the COVID-19 infodemic: promoting healthy behaviours and mitigating the harm from misinformation and disinformation. Joint statement by WHO, UN, UNICEF, UNDP, UNESCO, UNAIDS, ITU, UN Global Pulse, and IFRC. Available at: https://www.who.int/news/item/23-09-2020-managing-the-covid-19-infodemic-promoting-healthy-behaviours-and-mitigating-the-harm-from-misinformation-and-disinformation. Accessed 26 September 2021.
- 26. Infectious Diseases Society of America. COVID-19 real-time learning network. Available at: https://www.idsociety.org/covid-19-real-time-learning-network/about/. Accessed 25 September 2021.
- 27. CDC/IDSA COVID-19 clinician calls. Available at: https://www.idsociety.org/multimedia/#/+/0/score,date_na_dt/desc/?channels_na_str=CDC%2FIDSA%20Clinician%20Call. Accessed 25 September 2021.
- 28. Caulfield T. Does debunking work? Correcting COVID-19 misinformation on social media. 2020. doi:10.31219/osf.io/5uy2f.
- 29. Scientists must speak up against misinformation. NIH Record 2021. Available at: https://nihrecord.nih.gov/2021/11/12/scientists-must-speak-against-misinformation. Accessed 26 January 2022.
- 30. Brooks JT, Beezhold DH, Noti JD, et al. Maximizing fit for cloth and medical procedure masks to improve performance and reduce SARS-CoV-2 transmission and exposure, 2021. MMWR Morb Mortal Wkly Rep 2021; 70:254–7.
- 31. Janiaud P, Axfors C, Schmitt AM, et al. Association of convalescent plasma treatment with clinical outcomes in patients with COVID-19: a systematic review and meta-analysis. JAMA 2021; 325:1185–95.
- 32. Delgado-Fernandez M, Garcia-Gemar GM, Fuentes-Lopez A, et al. Treatment of COVID-19 with convalescent plasma in patients with humoral immunodeficiency: three consecutive cases and review of the literature. Enferm Infecc Microbiol Clin 2021. Available at: https://www.sciencedirect.com/science/article/pii/S0213005X21000355.
- 33. Brown CM, Vostok J, Johnson H, et al. Outbreak of SARS-CoV-2 infections, including COVID-19 vaccine breakthrough infections, associated with large public gatherings—Barnstable County, Massachusetts, July 2021. MMWR Morb Mortal Wkly Rep 2021; 70:1059–62.
- 34. Office of the US Surgeon General. A community toolkit for addressing health misinformation. 2021. Available at: https://www.hhs.gov/sites/default/files/health-misinformation-toolkit-english.pdf. Accessed 11 November 2021.
- 35. Joint statement from ABFM, ABIM, and ABP on dissemination of misinformation by board-certified physicians about COVID-19. Available at: https://www.abp.org/news/statement-about-dissemination-covid-19-misinformation. Accessed 25 September 2021.
- 36. Federation of State Medical Boards. FSMB: spreading COVID-19 vaccine misinformation may put medical license at risk. Available at: https://www.fsmb.org/advocacy/news-releases/fsmb-spreading-covid-19-vaccine-misinformation-may-put-medical-license-at-risk/. Accessed 3 February 2022.
- 37. Krause PR, Fleming TR, Peto R, et al. Considerations in boosting COVID-19 vaccine immune responses. Lancet 2021; 398:1377–80.
- 38. US Food and Drug Administration. Vaccines and Related Biological Products Advisory Committee. Available at: https://www.fda.gov/advisory-committees/advisory-committee-calendar/vaccines-and-related-biological-products-advisory-committee-september. Accessed 25 September 2021.