ABSTRACT
This study explores the social factors that may impact individuals' evaluation of pandemic‐related misinformation through a socio‐cognitive lens. We conducted eight semi‐structured interviews, and content analysis of the interview transcripts was guided by framework analysis. The social factors revealed in the study are social identity, social groups, social authorities, social spaces, social media, and social algorithms. These factors work together to isolate individuals from heterogeneous information. Social identity may shape the other factors; correspondingly, the information filtered by social groups, authorities, spaces, media, and algorithms reinforces individuals' social identity. This tendency may reinforce biases about pandemic information and put people at risk. The research offers implications for information platforms to reconsider their algorithm designs and a direction for information literacy training programs to move beyond the deficit assumption about individuals.
Keywords: COVID‐19, Information practices, Information world, Misinformation
INTRODUCTION
Pandemic misinformation is incorrect or inaccurate information related to the COVID‐19 pandemic regardless of intent (Gabarron et al., 2021). The World Health Organization (WHO) indicates that the prevalence of COVID‐19 is accompanied by misinformation (WHO, 2020). Widely spread misinformation may exacerbate the pandemic, put people at risk, and threaten democracy (Ali, 2020; Hansson et al., 2021; Islam et al., 2020).
Understanding what factors may impact individuals' perceptions of pandemic misinformation is essential to reduce its negative outcomes. Pandemic misinformation studies usually emphasize individuals' health and/or information literacy, shaped by the principle of autonomy, a paradigm that encourages citizens to take individual responsibility for their health (Morley et al., 2020). However, information phenomena are embedded in social, organizational, and professional contexts (Hartel, 2019). This study focuses on how the surrounding environment can create barriers for an individual trying to identify pandemic misinformation. The researchers explore the following questions: 1) What social factors may impact individuals' evaluation process of pandemic‐related misinformation? 2) How may social factors impact individuals' evaluation process of pandemic‐related misinformation?
METHODOLOGY
This exploratory study adopts a qualitative research design, since individuals' perceptions of pandemic misinformation are heavily shaped by their varied backgrounds. Eight participants were recruited through convenience and snowball sampling. All participants were over 18 years old and living in Columbia, SC, USA at the time of the study. Four participants are native‐born US citizens, and the other four were born in China.
The semi‐structured interviews were conducted in person between March and May 2022. The interviews included two main parts: the context of pandemic misinformation exposure and participants' evaluation process of pandemic misinformation. Participants could decide if they wanted to use pseudonyms. The interviews were audio‐recorded and transcribed verbatim with participants' permission. We developed the initial codebook through inductive content analysis using NVivo. First, we separately read the transcripts, highlighted the sentences related to the research questions, and extracted keywords from participants' statements. Then, we compared the highlights and keywords, classified the keywords, and created codes together. We defined each code using the transcripts and existing literature to resolve coding discrepancies. The coding process continued throughout data collection, which lasted about three months.
FINDINGS AND DISCUSSION
Pandemic Misinformation Evaluation
Participants used various strategies to evaluate pandemic information. They tended to verify suspicious information against different information resources. Embodied experience played an important role in the evaluation process. For example, Yuting followed a diet suggestion that claimed to strengthen the immune system but eventually found it was misinformation. Participants also weighed costs and benefits when facing pandemic information: “There's a lot of tools that you can use to do this, but they take more time than it's worth” (Katie).
Making a Cocoon
Social Identity as a Starting Point
Participants hold specific social identities when interacting with pandemic information. They defined themselves and other people by nationality, region, political affiliation, religious affiliation, personal belief, and social status. These social identities profoundly impact their choices of social groups, social authorities, social spaces, and social media. When talking about news, Ollie said, “Like news sources that are generally considered more conservative, then I generally would be more skeptical. That probably, you know, is my own bias.” The participants born in China but living in Columbia acquired pandemic information from Chinese media; as Hannah said, “I just feel it is easy for me to understand and read.”
Spinning: The Flow of Pandemic Information
Pandemic information flows to an individual from different directions. Social group members are essential pandemic information resources: Stacy heard pandemic misinformation from her grandparents, Joey heard it from co‐workers, and so on. Pandemic information also comes directly from social authorities; typical examples are news about the pandemic and government statements. Social spaces facilitate access to pandemic information, increasing both opportunities and risks: Adrienne's workspace provided them with free access to the New York Times, while Joey encountered misinformation in her lab. Social media are another primary pandemic information source; however, participants critiqued them as lacking evidence (Adrienne) and being profit‐oriented (Hannah).
These factors often create intertwined scenarios for pandemic information flow rather than working separately. Adrienne indicated that their partner listened to many podcasts and read a lot of news during the pandemic, so they would ask their partner for pandemic information. In this scenario, pandemic information flowed from social groups to social media and social authorities. Joey's colleagues showed her misinforming TikTok videos when she went to her lab after getting vaccinated. In this example, pandemic misinformation flowed to her via her social group using social media in a social space. A potential underlying issue identified in our analysis is that information flowing within these social dynamics may tend to be homogeneous rather than heterogeneous: individuals construct information worlds based on their social identities, which can create consensus among the people within an information flow circle (e.g., Chatman, 1991; Granovetter, 1973).
Social Algorithms, The Last Layer
What was clear in our interviews is that participants' interactions with pandemic information were deeply intertwined with the Internet, leaving space for algorithms to control information flow. “I'm not sure you're familiar with TikTok, if you watch one video on that topic, then it just gives you more about the same topic, so you can keep watching it. It's like constantly they pushing you with all the information, like, the bad part of the technology” (Joey). Social algorithms tailor pandemic information to participants based on their social identity, which may be reflected in their interactions with social groups, authorities, spaces, and social media. When participants clicked a New York Times link, liked a speech by Dr. Fauci, or shared statements from the Centers for Disease Control and Prevention or the World Health Organization, the social algorithms automatically decreased the priority of, or even filtered out, information from news sources opposing those perspectives. Social algorithms may relieve people of information overload, but they also create filter bubbles that isolate us from different opinions. They work with social groups, social authorities, social spaces, and social media, tightening the cocoon around an individual and limiting the heterogeneous information flow in one's information world.
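The reinforcement dynamic participants described, where one interaction with a topic raises the priority of similar content and implicitly demotes everything else, can be illustrated with a minimal sketch. This is a hypothetical toy model for illustration only, not any platform's actual ranking system; the class and field names are invented.

```python
from collections import defaultdict

class ToyFeedRanker:
    """Toy model of topic-reinforcing ranking (illustrative only --
    not any real platform's algorithm)."""

    def __init__(self, boost=0.5):
        self.affinity = defaultdict(float)  # topic -> learned preference weight
        self.boost = boost

    def rank(self, items):
        # Topics the user has engaged with before float to the top of the feed.
        return sorted(items, key=lambda it: self.affinity[it["topic"]], reverse=True)

    def record_engagement(self, item):
        # Each click/like/share boosts that topic's future priority,
        # implicitly lowering the relative priority of all other topics.
        self.affinity[item["topic"]] += self.boost

feed = [
    {"topic": "vaccine-skeptic"},
    {"topic": "public-health"},
    {"topic": "sports"},
]
ranker = ToyFeedRanker()

# A single engagement with one topic...
ranker.record_engagement({"topic": "vaccine-skeptic"})
# ...and that topic now leads every subsequent ranking,
# narrowing what the user is likely to see next.
top = ranker.rank(feed)[0]["topic"]
```

Even this crude feedback loop converges on a single topic after one interaction, which mirrors Joey's description of TikTok: watching one video is enough to make the feed "give you more about the same topic."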
CONCLUSION
Individuals may try to verify pandemic information using various literacies and strategies when they deem it necessary. However, the resources they rely on may be determined by their social identities. Information literacy training usually emphasizes checking personal biases; biases, however, can be invisible when we live in a world where heterogeneous information has been filtered out by algorithms. Therefore, information literacy training programs will be more effective if they include training about social algorithms and about how social decisions affect what information one is fed online. In an ideal world, information delivered through social media platforms and search engines would be reconsidered and used to empower users rather than making decisions for them based on fragments of their social identities. As COVID‐19 continues, misinformation about the pandemic will likely remain prevalent. This pilot study may draw more attention to social factors in pandemic misinformation research. The findings may lack generalizability due to the sample size and sampling techniques. Future studies can recruit participants from a wider variety of backgrounds to explore more social factors impacting misinformation evaluation, and quantitative studies can verify the influence of the discussed factors on individuals' misinformation evaluation practices.
ACKNOWLEDGEMENTS
We gratefully thank the participants of this study, as well as Dr. Mónica Colón‐Aguirre and Dr. Vanessa Kitzie for their suggestions on previous versions of this document.
Contributor Information
Yi Wan, Email: yw19@email.sc.edu.
Kim M. Thompson, Email: kimthompson@sc.edu.
REFERENCES
- Ali, S. (2020). Combatting against COVID‐19 & misinformation: A systematic review. Human Arenas, 1–16. https://doi.org/10.1007/s42087-020-00139-1
- Chatman, E. A. (1991). Life in a small world. Journal of the American Society for Information Science, 42(6), 438–449. https://doi.org/10.1002/(SICI)1097-4571(199107)42:6<438::AID-ASI6>3.0.CO;2-B
- Gabarron, E., Oyeyemi, S. O., & Wynn, R. (2021). COVID‐19‐related misinformation on social media: A systematic review. Bulletin of the World Health Organization, 99(6), 455–463A. https://doi.org/10.2471/BLT.20.276782
- Granovetter, M. (1973). The strength of weak ties. American Journal of Sociology, 78(6), 1360–1380. https://www.jstor.org/stable/2776392
- Hansson, S., Orru, K., Torpan, S., Bäck, A., Kazemekaityte, A., Meyer, S. F., … & Pigrée, A. (2021). COVID‐19 information disorder: Six types of harmful information during the pandemic in Europe. Journal of Risk Research, 24(3–4), 380–393. https://doi.org/10.1080/13669877.2020.1871058
- Hartel, J. (2019). Turn, turn, turn. Proceedings of CoLIS, Information Research, 24(4), paper colis1901. http://www.informationr.net/ir/24-4/colis/colis1901.html
- Islam, M. S., Sarkar, T., Khan, S. H., Kamal, A. H. M., Hasan, S. M., Kabir, A., … Seale, H. (2020). COVID‐19–related infodemic and its impact on public health: A global social media analysis. The American Journal of Tropical Medicine and Hygiene, 103(4), 1621–1629.
- Morley, J., Cowls, J., Taddeo, M., & Floridi, L. (2020). Public health in the information age: Recognizing the infosphere as a social determinant of health. Journal of Medical Internet Research, 22(8), e19311.
- WHO. (2020). Countering misinformation about COVID‐19. https://www.who.int/news-room/feature-stories/detail/countering-misinformation-about-covid-19
