Editorial. Am J Public Health. 2020 May;110(5):617–618. doi: 10.2105/AJPH.2020.305616

Vaccine Communication as Weaponized Identity Politics

David A. Broniatowski, Sandra C. Quinn, Mark Dredze, Amelia M. Jamison
PMCID: PMC7144449  PMID: 32267750

The World Health Organization declared vaccine misinformation—and consequent declines in vaccination rates—a top health threat of 2019 (http://bit.ly/37G2NWP). In 2018, there were measles outbreaks in 98 countries worldwide (including 1717 cases in Russia from January to June 2018—a 13-fold increase from the previous year; http://bit.ly/2S1upit), resulting in more than 140 000 deaths (http://bit.ly/2S0Pbin). Deaths from vaccine-preventable illnesses may be attributed primarily to the spreading phenomenon of vaccine refusal.

“ASTROTURFING” VACCINE REFUSAL

Typically framed as grassroots opposition, vaccine refusal has been increasingly linked with populist political rhetoric and attempts to undermine scientific authority.1 Concurrently, recent evidence has linked actors in the “vaccine debate” to state-sponsored interests—especially those associated with Russian interference in the 2016 US elections.2 We found that one set of Russian trolls was more than 22 times more likely to tweet about vaccines than was the average Twitter user. An in-depth analysis of hundreds of these troll-generated tweets indicated that the trolls were “playing both sides” of the vaccine debate, seemingly to promote political discord on the topic. Consequently, what appears to be popular support is at least partially “Astroturf” (i.e., artificial tweets, masquerading as grassroots advocacy).

Since our article appeared, Twitter has released several data sets pertaining to election integrity (http://bit.ly/318bdno). In this issue of AJPH, the work by Walter et al. (p. 718) draws on these new data to shed further light on the rationales underlying Russian troll activity, providing welcome new insights. Russian trolls used tweets about vaccination to construct “thematic personas” that enabled them to masquerade as US citizens taking specific political and other controversial stances, including presenting themselves as African American and promoting Black Lives Matter. This inclusion of an African American persona seems to have explicitly targeted underlying racial tensions in US society, in part by playing on stereotypes about African Americans. Moreover, the creation of that persona, at a time of significant racialized division, seems designed to fuel animosity between it and other stereotyped personas, such as the “pro-Trump” persona.

Evidence that vaccine content may be used to signal credibility and to create more “believable” personas reflects a racialization of the vaccine debate that may or may not be accurate. Although lower levels of trust in government pertaining to vaccination and health care in general are long standing,3 these attitudes are not spread uniformly.4 Thus, the success of these trolling operations depends on how realistic these African American personas appear within the diverse African American community.

WEAPONIZED RACE RELATIONS

Unfortunately, stereotyped trolling campaigns may constitute a self-fulfilling prophecy. Decades of Kremlin-backed disinformation campaigns have targeted racial cleavages to promote internal strife and undermine Western values, both domestically and overseas, reinforcing the stereotypes that they sought to exploit. For example, on July 17, 1983, the Soviet KGB launched a disinformation campaign alleging that a mysterious illness—AIDS—was the result of US bioweapons experiments.5 Although initially created to discredit US influence in the Third World, this conspiracy theory ultimately morphed into an allegation of the US government using HIV for racial genocide of African Americans and Africans.

Public beliefs about the Ebola outbreak in 2014, and today’s outbreak in the Democratic Republic of the Congo, have shaped more recent iterations of conspiracy theories, including recent Russian propaganda that accused Ebola treatment workers in the Democratic Republic of the Congo of spreading, rather than treating, the disease in a bid to depopulate the continent. This disinformation has led to direct attacks on these workers, increasing the likelihood of a deadly outbreak.

Thus, Russian attempts to weaponize complex racial attitudes and link them to vaccination are simply the latest in a series of operations designed to increase existing tensions in the United States and in the West more broadly. History has shown that simply debunking these conspiracy theories has not been effective in stopping their spread. Rather, as our previous work has shown, culturally sensitive communication using trusted intermediaries may be effective in increasing vaccination rates among this and other vulnerable populations.

WEAPONIZED POLITICAL DISCOURSE

Russian troll activity has demonstrated the extent to which mainstream political discourse has infused vaccine policy. Until recently, childhood vaccination had been a relatively nonpartisan issue, with political polarization focused on very specific cases (e.g., the introduction of the Gardasil human papillomavirus vaccine).6 However, recent years have seen an uptick in political polarization around generalized vaccine opposition. For example, we found that vaccine opponents aligned themselves with candidate Donald J. Trump in the leadup to the 2016 presidential election and continued to express support for him on Twitter when rumors broke that he was considering appointing Robert F. Kennedy Jr. to lead a vaccine safety commission.7 Russian trolls seem to have exploited these changes opportunistically, aligning their discourse with the followers of different candidates in the 2016 US presidential elections.

Walter et al. found that, in their attempts to promote discord, Russian trolls used tweets about vaccination to build convincing personas both supporting and opposing specific candidates who had made public statements about vaccines. Although presidential elections are somewhat adversarial by nature, the 2016 election was considered to be especially partisan, and interference in this election is widely considered to have been the primary aim of the Internet Research Agency’s trolling campaign. The presence of tweets expressing consistent vaccine opposition by “pro-Trump” personas and tweets expressing support for vaccination by “anti-Trump” personas therefore highlights the extent to which Russian trolls perceived vaccination as an increasingly partisan issue and, indeed, sought to foster that change. Vaccine opposition may become entrenched in the political platform of one of the major US political parties, as it has in populist movements worldwide. This would move a fringe position into the political mainstream and polarize a public health challenge, as has occurred with gun violence reduction and climate policies.

The sophistication of this Russian information operation sheds light on important aspects of vaccine communications: targeting and tailoring communication. Best practice for health communicators is to use messages designed to speak to specific communities’ needs, and it appears that foreign governments have become increasingly adept at identifying and targeting messages to vulnerable communities. This suggests a degree of cultural awareness that could be enabled only by significant financial and educational resources.

By contrast, many domestic public health agencies’ communication operations are woefully underresourced and understaffed. These findings therefore point to an urgent need to invest in public health communication. There is a clear need to fund research on how to effectively counter disinformation on social media. Adversaries, such as Russian-backed disinformation campaigns, aim to promote discord and confusion and therefore are free to experiment with multiple, often conflicting, narratives in pursuit of their goals. Public health communications, in contrast, must remain evidence based while imparting meaningful and compelling messages. Constructing these messages requires both scientific guidance and sociocultural expertise. These online challenges also speak to the compelling need for public health agencies to engage more traditional partners to promote vaccine acceptance. Working with health care providers, community organizations, faith communities, and others, we must ensure trusted, reciprocal, persuasive communications to strengthen vaccine acceptance and protect the health of our communities.

ACKNOWLEDGMENTS

The Institute for Data, Democracy, and Politics is supported in part by the John S. and James L. Knight Foundation (award ECNS21702N).

Note. The Knight Foundation had no role in study design.

CONFLICTS OF INTEREST

M. Dredze holds equity in Sickweather, Inc. and has received consulting fees from Bloomberg LP and Good Analytics, Inc. These organizations did not have any role in the study design, data collection and analysis, decision to publish, or preparation of the editorial.

Footnotes

See also Walter et al., p. 718.

REFERENCES

1. Kennedy J. Populist politics and vaccine hesitancy in Western Europe: an analysis of national-level data. Eur J Public Health. 2019;29(3):512–516. doi: 10.1093/eurpub/ckz004.
2. Broniatowski DA, Jamison AM, Qi S, et al. Weaponized health communication: Twitter bots and Russian trolls amplify the vaccine debate. Am J Public Health. 2018;108(10):1378–1384. doi: 10.2105/AJPH.2018.304567.
3. Freimuth VS, Jamison AM, An J, Hancock GR, Quinn SC. Determinants of trust in the flu vaccine for African Americans and Whites. Soc Sci Med. 2017;193:70–79. doi: 10.1016/j.socscimed.2017.10.001.
4. Quinn SC, Jamison A, An J, Freimuth VS, Hancock GR, Musa D. Breaking down the monolith: understanding flu vaccine uptake among African Americans. SSM Popul Health. 2017;4:25–36. doi: 10.1016/j.ssmph.2017.11.003.
5. Boghardt T. Soviet Bloc intelligence and its AIDS disinformation campaign. Stud Intell. 2009;53(4):1–24.
6. Kahan DM. A risky science communication environment for vaccines. Science. 2013;342(6154):53–54. doi: 10.1126/science.1245724.
7. Dredze M, Wood-Doughty Z, Quinn SC, Broniatowski DA. Vaccine opponents’ use of Twitter during the 2016 US presidential election: implications for practice and policy. Vaccine. 2017;35(36):4670–4672. doi: 10.1016/j.vaccine.2017.06.066.
