Scientific Reports
2025 Oct 6;15:34740. doi: 10.1038/s41598-025-18457-1

Assessing the impact of misinformation during the spread of infectious diseases

Alejandro Bernardin 1, Tomas Perez-Acle 1,2
PMCID: PMC12501366  PMID: 41053411

Abstract

In today’s digital era, the internet offers unprecedented access to information, but it also accelerates the spread of misinformation. Nowhere is this more problematic than in public health, as the COVID-19 pandemic clearly demonstrated. Misinformation can erode trust in science and health authorities, leading people to disregard expert guidance and adopt unverified treatments that endanger population health. We examine how misinformation alters the course of an infectious disease outbreak by modeling the simultaneous uptake of preventive measures and engagement in harmful behaviors. The model captures the competing influences of accurate information and misinformation on individual decision making. Our results show that even a modest influx of misinformation can greatly amplify disease transmission, deepening the epidemic’s severity. These findings highlight the urgent need for robust strategies to curb misinformation and support public health interventions during health crises.

Keywords: Infectious disease, Epidemic, Misinformation, Ebola

Subject terms: Diseases, Health care, Mathematics and computing

Introduction

The exponential growth of the internet has dramatically expanded channels for disseminating information, amplifying both reliable reporting and false or misleading content [1,2]. Here, we define factual information as evidence-consistent guidance and misinformation as an umbrella term for claims that contradict the best available evidence at the time of sharing, irrespective of intent [2]. The stakes of spreading misinformation become particularly high in public health contexts, where access to reliable information is paramount, especially concerning infectious disease outbreaks. This scenario unfolds within a complex web of dynamics heavily influenced by human behavior, which in turn is shaped by the information that individuals receive, primarily through mass media, the internet, and social networks [3].

The COVID-19 pandemic has given rise to “infodemics”, as defined by the World Health Organization: an overwhelming mix of accurate and misleading information flooding digital platforms and social media [4,5]. This phenomenon illustrates the dual impact of digital information on public health crisis management. Specifically, the spread of misinformation on COVID-19, including baseless treatments, speculative ideas, and conspiracy theories [6–8], highlights the substantial public health risks posed by misinformation. Therefore, the role of the internet and social networks in distributing information presents a paradox, as they have the power either to dampen or to intensify the spread of infectious diseases [9].

To elucidate the impact of misinformation during health crises, such as infectious disease outbreaks, our research examines the dynamics of disease transmission among human populations. We emphasize the dual role of factual information and misinformation, which can respectively foster preventive or harmful behaviors. Our approach involves a SEIRD compartmental model embedded in an Agent-Based Modeling (ABM) framework, featuring mechanisms for information decay and differentiating between factual information and misinformation. Furthermore, it incorporates behavioral responses, classified as either preventative or harmful. Through this framework, our objective is to analyze the impact of disseminating both factual information and misinformation on the spread of an infectious disease.

We have chosen Ebola Virus Disease (EVD) as the focus of our case study. EVD is characterized by a high mortality rate and poses a significant challenge to public health. To support this selection in the COVID-19 era, it is crucial to consider that the basic reproduction number (R₀) of EVD is below 1.5 [10], meaning that each infected person transmits the disease to fewer than 1.5 other individuals on average. This suggests that with effective interventions that push R₀ below 1, the threshold at which an outbreak can no longer sustain itself, containment of EVD could be feasible. The selection of EVD is further supported by its unique ability to trigger profound social and behavioral reactions, such as panic, discrimination against healthcare professionals, excessively stringent public policies, and widespread skepticism toward scientific findings, amplified by media and political rhetoric [11,12]. This complexity makes EVD an exemplary case for studying the interplay between infectious disease dynamics and the societal response to misinformation dissemination.

The EVD outbreaks between 2014 and 2016 highlighted the essential role of public health strategies in the management of the disease, especially given the lack of a universally effective vaccine or treatment for most EVD strains when those outbreaks occurred [13]. This period demonstrated the intricate relationship among disease transmission, the spread of misinformation, and societal behaviors—a focal point of our research. The prevalence of misinformation and the consequent erosion of trust in institutions during these outbreaks had a profound impact on public reactions and the success of health policies. This underscores the importance of analyzing the interaction between the dispersion of factual information and misinformation within our model [14,15].

Given the progress since the last major outbreaks, it is noteworthy that although treatments and vaccines for EVD have since advanced, the disease remains a global health concern [16,17]. This backdrop, together with EVD's unique set of characteristics, such as its relatively low R₀, makes it an ideal subject for studying the impacts of factual information, misinformation, and behavioral responses on disease containment and public health strategy.

Our findings suggest that misinformation can significantly alter the course of an infectious disease’s spread, amplifying its impact on populations even with minimal exposure. This phenomenon adds an extra, challenging layer of complexity to the already formidable task of containing epidemics.

Methods

Model overview

Compartmental models, fundamentally established by the seminal contributions of Kermack and McKendrick in 1927 [18], are the workhorse of theoretical epidemiology. Individuals are grouped as susceptible (S), infected (I), and removed (R) based on their health status over time. The removed category comprises individuals who no longer participate in the disease dynamics due to recovery, acquired immunity, or death. Adaptations of these models, including but not limited to SIS (susceptible–infected–susceptible), SEIRD (susceptible–exposed–infected–removed–dead), and SEIRHVD (susceptible–exposed–infected–removed–hospitalized–mechanically ventilated–dead), are designed to accommodate the progression stages of various diseases [19–21]. A critical component of these models is the infection rate, β, which quantifies the transition of individuals from the susceptible to the exposed or infected category, depending on the disease. Although β is often assumed to be constant, this assumption is primarily for analytical tractability. In practical scenarios, β is subject to temporal variations influenced by interventions, changes in pathogen virulence, and the development of herd immunity [22,23]. Recent literature suggests that β depends not only on biological factors but also on the quality and quantity of information individuals have about the infection at any given time [24–27]. To explore this, we introduce two variables to independently account for the influence of factual information (x(t)) and misinformation (y(t)) on disease dynamics, where x and y range from 0 (no information/misinformation) to just below 1 (fully informed/misinformed).
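To make the compartmental dynamics concrete, the following minimal Python sketch integrates an EVD-style SEIRD system with infectious corpses by forward Euler. The rate values follow Table 1; the buried compartment B, the step size, and all function names are our own illustrative additions, not part of the published model.

```python
# Minimal Euler sketch of a SEIRD model with infectious corpses.
# Parameter values follow Table 1; the buried compartment B is
# bookkeeping we add so that total mass is conserved.
def seird_step(state, dt=0.01, beta_I=0.25, beta_D=0.20,
               T_E=11.0, T_I=6.0, T_D=4.0, f=0.7):
    S, E, I, R, D, B = state
    N = S + E + I + R + D              # buried agents no longer interact
    force = (beta_I * I + beta_D * D) / N  # force of infection
    dS = -force * S
    dE = force * S - E / T_E           # incubation: E -> I after ~T_E days
    dI = E / T_E - I / T_I             # infectious period ~T_I days
    dR = (1 - f) * I / T_I             # survivors recover
    dD = f * I / T_I - D / T_D         # fatalities remain infectious ~T_D days
    dB = D / T_D                       # burial removes corpses
    return tuple(v + dt * d for v, d in zip(state, (dS, dE, dI, dR, dD, dB)))

state = (9900.0, 0.0, 100.0, 0.0, 0.0, 0.0)
for _ in range(10_000):  # 100 days at dt = 0.01
    state = seird_step(state)
```

Because the six derivatives sum to zero, the total population is conserved up to floating-point rounding, which is a useful sanity check for any discretization.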

Behavioral-information factor (BIF)

Following our previous approach [28], yet extending it to include misinformation, we propose the adjusted infection rate β(t) as follows:

β(t) = β₀ φ(x(t), y(t))    (1)

where φ(x, y) is the behavioral-information factor (BIF), a function of time through x(t) and y(t). Although the exact form of φ is not specified, it should meet certain criteria: (1) φ(x, y) > 0, as β(t) is always positive; (2) φ approaches 1 as both x and y approach 0, recovering the baseline infection rate β₀; (3) φ is analytic at (0, 0), allowing expansion as a Taylor series; and (4) the effects of factual information and misinformation on β(t) are inversely related, modeled as

φ(x, y) = 1 − αx + γy + H.O.T.    (2)

where H.O.T. denotes the higher-order terms of the Taylor series. A visualization of φ is provided in Supplementary Figure S6.

Information-decay kinetics

To describe the dynamics of factual information (x) and misinformation (y) over time, we write x(t) = λ^i(t) and y(t) = λ_m^j(t), where λ and λ_m are the awareness decay constant and the misinformation decay constant, respectively. Here, i and j index the quality of factual information and misinformation, which decays over time or as it is passed from one individual to another. This decay is modeled by the hand-off rule k → i + 1, as proposed in [29]: when an individual with lower-quality information (k) interacts with someone possessing higher-quality information (i), the resulting information is of quality i + 1, slightly diminished from the original (larger indices imply lower quality). Notably, high-quality information (i = 0) is generated in newly infected individuals, emphasizing the role of direct experience in fostering high-quality awareness about the epidemic [30]. We emphasize that by the “quality” of factual information or misinformation, we refer to its fidelity to the original source. In this context, information that originated long ago and has been transmitted through multiple intermediaries tends to lose quality; that is, it may no longer contain all the original details or may have lost part of its content. Consequently, high-quality information is that which closely preserves the content and accuracy of the original source, regardless of whether it is factual information or misinformation.
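A small Python sketch illustrates these kinetics, assuming the closed form x = λ^i and the one-unit hand-off on transmission, which is our reading of the decay rules; the value of λ is illustrative only.

```python
# Sketch of the information-quality kinetics: the quality index i grows
# (quality decays) by one unit per hand-off, and the awareness state of
# an agent holding quality i is x = lambda ** i.
lam = 0.9  # illustrative awareness decay constant, lambda in [0, 1)

def awareness(i, lam=lam):
    """Awareness-information state of an agent holding quality index i."""
    return lam ** i

def transmit(i_sender, i_receiver):
    """Receiver keeps the better (smaller) quality index; information
    degrades by one unit as it is passed on (k -> i + 1)."""
    return min(i_receiver, i_sender + 1)

assert awareness(0) == 1.0    # fresh, first-hand information
assert awareness(50) < 0.01   # i = 50: essentially unaware
assert transmit(0, 50) == 1   # hearing it first-hand leaves quality 1
```

The i = 50 case corresponds to the near-zero baseline awareness used to initialize the simulation.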

Importantly, our model conceptualizes awareness as knowledge about the epidemic that motivates preventive actions, a critical factor in disease control [31]. Consistent with the general definition of misinformation given above, we operationalize misinformation in the public-health context as evidence-inconsistent beliefs that motivate harmful (risk-promoting) behavior, thereby undermining control efforts. This is an operational refinement rather than a second definition: it restricts “misinformation” to the behaviorally relevant subset used in the model.

Now, simplifying the BIF φ(x, y) to its first-order approximation in x and y, since both variables remain below 1 for all t, and identifying the first-order coefficients α and γ with the preventive and harmful behavior coefficients, respectively, we can reformulate the adjusted infection rate β(t) as

β(t) = β₀ (1 − αx + γy)    (3)

This formulation allows us to recover the original model by Funk et al. [29], where β(t) = β₀(1 − x), as a special case within our broader framework, obtained by setting α = 1 and γ = 0. This extension incorporates the impacts of both factual information and misinformation on disease spread, enriching the model's capacity to explore epidemic dynamics. We summarize all the parameters of our model in Table 1.
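A hypothetical Python rendering of the first-order adjusted rate β(t) = β₀(1 − αx + γy) makes the limiting cases explicit; the function name and call signature are our own, and β₀ = 0.25 d⁻¹ follows Table 1.

```python
# First-order adjusted infection rate: beta(t) = beta0 * (1 - alpha*x + gamma*y).
# Funk et al. is recovered at alpha = 1, gamma = 0.
def adjusted_beta(x, y, beta0=0.25, alpha=1.0, gamma=0.0):
    return beta0 * (1.0 - alpha * x + gamma * y)

assert adjusted_beta(x=0.0, y=0.0) == 0.25              # no information: baseline rate
assert adjusted_beta(x=1.0, y=0.0) == 0.0               # full awareness halts transmission
assert adjusted_beta(x=0.0, y=1.0, gamma=2.0) == 0.75   # misinformation amplifies the rate
```

The last line shows the asymmetry discussed below: because γ is unbounded, misinformation can push β(t) above its baseline, whereas awareness can at most drive it to zero.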

Table 1.

Summary of model parameters used in the ABM–SEIRD framework. We present the parameters for the infectious disease and communication models. The dimensionless parameters are indicated with a “–”. More details in Supplementary Material S2.

Parameter Value Units Description
T_E 11 days Incubation period (E → I)
T_I 6 days Infection period (I → {R, D})
T_D 4 days Death-to-burial period (D → burial)
f 0.7 – Fatality rate (I → D)
β_I 0.25 days⁻¹ Infection rate (I → S)
β_D 0.20 days⁻¹ Infection rate from corpses (D → S)
x [0,1] – Awareness–information state
y [0,1) – Unawareness–information state
λ [0,1] – Awareness decay constant
λ_m [0,1] – Misinformation decay constant
i [0,∞) – Awareness–information quality constant
j [0,∞) – Unawareness–information quality constant
α [0,1] – Preventive behavior coefficient
γ [0,∞) – Harmful behavior coefficient
ρ [0,1] – Ratio of informed individuals with factual information
ρ_m [0,1] – Ratio of informed individuals with misinformation

The transformation of information into actionable behavior is a pivotal aspect of our model. Information, however persistent (as captured by the awareness decay constant λ for factual information and the misinformation decay constant λ_m for misinformation), must ultimately influence behavior. This underscores a fundamental principle: the possession of information alone is insufficient unless it translates into behavioral change [32,33]. In our framework, α and γ serve as coefficients quantifying preventive and harmful behaviors, respectively. This differentiation is crucial, as it acknowledges the varying effectiveness of behaviors in combating or exacerbating disease spread.

For instance, preventive behaviors vary in efficacy, from basic hygiene practices to vaccination, necessitating a model flexible enough to simulate different strategic interventions. Conversely, harmful behaviors also exhibit a spectrum of impact. Notably, behaviors such as avoiding vaccination can significantly affect disease dynamics. More extreme examples, such as the “measles parties” discussed in the literature [34,35], highlight scenarios where misinformation leads to intentionally risky behavior aimed at acquiring natural immunity through infection. These examples illustrate our model's ability to study the effects of various types of behaviors, both preventive and harmful, on the spread of disease. It is important to highlight that the harmful behavior coefficient, γ, can grow without bound. This reflects the reality that harmful behaviors can amplify the infection rate without limit, unlike protective behaviors (α), which, by their nature, cannot reduce the infection rate below zero. This distinction underscores the potentially boundless impact of detrimental actions on the spread of an infection.

Crucially, our model distinguishes between the decay constants of factual information (λ) and misinformation (λ_m). Empirical evidence suggests that misinformation, often more sensational and emotionally charged, tends to spread more widely and persist longer than factual information [1,36]. One example is the “salt-water cure” that urged people to bathe in and drink hot salt water because “Ebola is now airborne”, leading to reported deaths and hospitalizations, an archetypal high-arousal, urgent framing [37]. This distinction is not merely academic but reflects observed phenomena, warranting a different treatment of λ and λ_m in our model. Accordingly, we assume λ_m ≥ λ, so that misinformation exerts a more prolonged influence than factual information. This assumption becomes important for establishing a more realistic setting when analyzing the outcomes of our simulations (see below).
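To illustrate why this asymmetry matters, assume the information state decays as λ^n over n time steps; one can then compare how long factual information and misinformation remain behaviorally relevant. The decay constants and threshold below are illustrative values of our own, not fitted parameters.

```python
import math

# Illustrative comparison of persistence: number of steps until the
# information state lambda ** n drops below a relevance threshold eps.
def persistence_steps(lam, eps=0.05):
    # smallest integer n with lam ** n < eps
    return math.ceil(math.log(eps) / math.log(lam))

assert persistence_steps(0.80) == 14   # factual information fades quickly
assert persistence_steps(0.95) == 59   # misinformation lingers far longer
```

Even a modest gap between the two decay constants yields a several-fold difference in how long the content keeps shaping behavior, which is why λ and λ_m are treated separately.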

Extensive research has been conducted on the spread of EVD, involving both the collection of empirical data and the development of mathematical models to understand its temporal dynamics [12,38–45]. The SEIRD model, which accounts for the significant roles of both the incubation period and the infectivity of deceased individuals, has been particularly effective in capturing the disease's behavior. This model's parameters were initially derived from data that did not consider the impact of information flow on disease transmission [40]. In response to this gap, our study employs an ABM framework designed to simulate the spread of EVD in a population while incorporating the competing dynamics of factual information and misinformation. This approach enables a more nuanced understanding of how factual information and misinformation can influence disease spread. Specifically, we extend the model to adjust the infection rates from infected to susceptible individuals (β_I) and from deceased to susceptible individuals (β_D), incorporating the effects of information and misinformation as follows:

β_I(t) = β_I⁰ (1 − αx + γy),    β_D(t) = β_D⁰ (1 − αx + γy)    (4)

where α and γ are BIF parameters that quantify the effects of preventive and harmful behaviors, respectively, across both live and deceased transmission routes. Similarly, λ and λ_m, entering through x and y, quantify the decay of factual information and the decay of misinformation, respectively. The uniform application of the BIF to both routes underscores our model's approach to studying disease transmission, emphasizing the critical role of information dynamics in public health responses to epidemics.

Implementing the model in an ABM framework

Based on the ABM capabilities of NetLogo 6.1.1 [46], a platform renowned for its versatility in multi-agent simulations and its robust support community, we designed a simulation encompassing 10,000 agents. NetLogo's support for spatially embedded agent dynamics is particularly advantageous for epidemiological simulations, where the spatial proximity between agents significantly influences disease spread [47–49].

In our model, each agent a is defined by four primary attributes: spatial position r_a, epidemiological state s_a, awareness-information state x_a, and unawareness-information state y_a (representing the possession of misinformation that promotes harmful, risk-increasing behavior). The epidemiological states are derived from the SEIRD model, accommodating susceptible (S), exposed (E), infected (I), recovered (R), and deceased (D) states. The awareness-information and unawareness-information states quantify the factual information and misinformation an agent holds about EVD, respectively, influencing both its behavior and its susceptibility.

Agents perform a random walk on a 2D torus (periodic boundaries). This approach ensures that, over time, agents disperse uniformly across the space, although convergence to a uniform distribution is only achieved asymptotically. While agents primarily move independently, their epidemiological and information states evolve through interactions, simulating the transmission of both disease and information. Minor modifications from our prior work [28] include adjustments to the rate of information decay and the introduction of more nuanced agent behaviors based on their information state, enhancing the model's realism and applicability to real-world scenarios. Our simulations extend to 1000 days, a period selected to balance computational feasibility with the need to observe the long-term dynamics of EVD spread and information flow, acknowledging that this timeframe may offer merely a snapshot rather than a comprehensive endpoint of epidemic progression. Each parameter set was simulated 100 times to compute means and standard deviations.
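The toroidal random walk can be sketched in a few lines; positions wrap with modulo arithmetic, mirroring NetLogo's periodic world. The world side length L below is an illustrative stand-in, as the source does not specify the grid dimensions.

```python
import math
import random

L = 85.0  # illustrative world side length; not given in the source

def walk(pos, step=1.0, rng=None):
    """One random-walk step on a 2-torus: pick a uniform heading,
    advance by `step`, and wrap coordinates into [0, L)."""
    rng = rng or random
    theta = rng.uniform(0.0, 2.0 * math.pi)
    x = (pos[0] + step * math.cos(theta)) % L
    y = (pos[1] + step * math.sin(theta)) % L
    return x, y

rng = random.Random(3)
pos = (0.0, 0.0)
for _ in range(1000):
    pos = walk(pos, rng=rng)
```

Because the modulo wrap is applied after every step, an agent can never leave the world, and long runs spread agents approximately uniformly over the patches.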

At the outset of the simulation, agents are uniformly distributed across the 2-torus, which is subdivided into square patches, each with a side length of 1. This spatial arrangement creates a grid-like environment conducive to simulating disease transmission and information dissemination, with interactions restricted to agents sharing the same patch. The patch size and number were chosen to emulate interaction rates that align with the SEIRD model outcomes based on ordinary differential equations (ODEs) detailed by Weitz and Dushoff [40], specifically tailored to EVD dynamics during the 2014–2016 outbreak. For details on the parameters used, see Supplementary Material S1.

In terms of epidemiological dynamics, our simulation delineates two primary processes: infection dynamics and state-transition dynamics. As mentioned before, agents may be in one of five epidemiological states: {S, E, I, R, D}. Infection dynamics are triggered when susceptible agents encounter infected or deceased agents within the same patch. This event initiates a probabilistic transition, governed by a Monte Carlo algorithm, from the S to the E state, denoted S → E. Specifically, during these encounters we generate a random number r from a uniform distribution U(0, 1) for each S agent present. This number is then compared against the infection probability P: if r < P, the transition to the E state occurs.
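The Monte Carlo trial described above can be sketched as follows; the probability used here is the baseline β_I = 0.25 from Table 1, ignoring the BIF adjustment for brevity, and the function name is our own.

```python
import random

# Monte Carlo S -> E trial: draw r ~ U(0, 1) for each susceptible agent
# sharing a patch with an infectious agent; expose the agent when r < P.
def try_expose(p_infect, rng):
    return rng.random() < p_infect

# Over many trials the empirical exposure frequency approaches P.
rng = random.Random(0)
n = 100_000
exposures = sum(try_expose(0.25, rng) for _ in range(n))
rate = exposures / n
```

This is the standard inverse-uniform sampling of a Bernoulli event; averaged over many encounters, the per-contact exposure frequency converges to the prescribed probability.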

The infection probabilities P_I and P_D, for S → E transitions caused by contact with infected (I) and deceased (D) agents, respectively, are adapted from the parameters identified by Weitz and Dushoff [40] for the EVD outbreak. These probabilities incorporate the impact of the awareness-information state

x_a(t) = λ^(i_a(t))

and the unawareness-information state

y_a(t) = λ_m^(j_a(t))

on the infection process, thereby integrating the dynamics of information flow with epidemiological spread. Given the simulation's setup, interactions are predominantly pairwise, owing to an average agent density per patch of approximately 1.38, a condition that also holds for the information-exchange dynamics.

Apart from the susceptible state, agents experience the transitions E → I, I → R, I → D, and D → Ø, reflecting the progression from exposure to infection, recovery or death, and eventual removal from the simulation (signifying burial). Transitions E → I and D → Ø occur deterministically after periods T_E and T_D, respectively. Conversely, the transitions I → R and I → D occur after T_I days, determined probabilistically with probabilities 1 − f and f, respectively, capturing the uncertainty in an infected individual's outcome. Furthermore, the dynamics of these transitions are influenced by each agent's awareness-information state, x_a, and unawareness-information state, y_a. These states dynamically affect the transition probabilities by modulating the rates β_I and β_D. The temporal evolution of x_a(t) and y_a(t) models how the quality of information, whether factual information or misinformation, waxes or wanes over time. This variation in information quality, and its impact on the infection rates β_I(t) and β_D(t), highlights the intricate relationship between the spread of information and the dynamics of the epidemic. Additionally, i_a(t) denotes the awareness-information quality constant of agent a, a time-varying index with i_a = 0 indicating peak quality of awareness-information, a concept paralleled by the unawareness-information quality constant, j_a(t).
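These transition rules can be sketched as a per-agent update; the timer handling and function names below are our own simplification of the NetLogo implementation, with parameter values from Table 1.

```python
import random

# Per-agent state transitions: E -> I and D -> burial are deterministic
# timers; I -> {R, D} is resolved probabilistically with fatality rate f
# once the infectious period T_I has elapsed.
T_E, T_I, T_D, f = 11, 6, 4, 0.7

def step_agent(state, days_in_state, rng):
    if state == "E" and days_in_state >= T_E:
        return "I", 0                                  # incubation over
    if state == "I" and days_in_state >= T_I:
        return ("D", 0) if rng.random() < f else ("R", 0)
    if state == "D" and days_in_state >= T_D:
        return "buried", 0                             # removed from dynamics
    return state, days_in_state + 1

rng = random.Random(1)
state, t = "E", 0
while state == "E":
    state, t = step_agent(state, t, rng)
```

Running the loop walks a newly exposed agent through its deterministic incubation, after which the stochastic I → {R, D} branch takes over.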

To account for heterogeneity in agent responses, the model allows each agent's awareness decay constant λ and misinformation decay constant λ_m to be sampled from distributions p(λ) and p(λ_m), which here are taken as delta distributions for simplicity. These distributions aim to reflect societal characteristics: a society's levels of trust in its institutions, its media, and among its individuals can significantly shape collective behavior in response to an epidemic. Although a direct method to map these sociocultural factors onto specific distributions for λ and λ_m remains elusive, the model posits that high-trust societies are likely to maintain awareness longer than low-trust ones, especially when faced with crucial information on how to handle disease spread [50]. This detailed framework for agent transitions, and the integration of awareness and unawareness dynamics into the model, not only enhances the realism of simulated epidemic scenarios but also opens avenues for exploring the effects of societal traits on epidemic outcomes.

In our model, agents acquire new factual information and misinformation through three principal mechanisms: (1) exposure to a global source that periodically disseminates factual information or misinformation, resetting the quality constant (i or j) to zero for all recipients. This reset reflects the reception of an authoritative, lossless broadcast (e.g., an official Ministry of Health briefing), which completely replaces any prior uncertainty rather than merely incrementally improving existing information; (2) interaction with other agents within the same patch, where factual information or misinformation of superior quality is transferred, the receiving agent adopting quality i + 1 from a sender holding quality i; and (3) acquisition of the disease, upon which exposed agents automatically receive high-quality information, reflected by i = 0. Factual information and misinformation quality degrade over time, with the quality index increasing by one unit per simulation time step. Accordingly, the evolution of an agent's information quality over time is described by:

i_a(t + 1) = 0, upon a factual global broadcast or new infection;
i_a(t + 1) = min(i_a(t), i_b(t) + 1), upon interaction with an agent b holding better-quality information;
i_a(t + 1) = i_a(t) + 1, otherwise (one unit of decay per time step).    (5)

Similarly, the dynamics of misinformation follow a parallel pattern, albeit with nuanced differences to reflect the distinct nature and spread of misinformation:

j_a(t + 1) = 0, upon a misinformation global broadcast;
j_a(t + 1) = min(j_a(t), j_b(t) + 1), upon interaction with an agent b holding better-quality misinformation;
j_a(t + 1) = j_a(t) + 1, otherwise (one unit of decay per time step).    (6)

This model posits that factual information and misinformation are transferred from agents possessing higher-quality content to those with lower-quality content during interactions, underpinning the model's communication dynamics. This assumption is consistent with the two-step flow of communication: more informed opinion leaders, those with greater media exposure and knowledge (high quality), tend to transmit information to less active followers (low quality) [51,52]. Similarly, classical rumor-spreading models [53] formalize this directional flow, in which an informed “spreader” converts an “ignorant” individual upon contact. Crucially, in our model this dynamic operates through both factual information and misinformation. Importantly, these exchanges require that both participating agents be alive: s_a, s_b ≠ D.
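Assuming the hand-off rule in which the receiver adopts the sender's quality degraded by one unit (our reading of the exchange dynamics), a pairwise exchange can be sketched as follows; the function name and liveness flags are our own.

```python
# Pairwise information exchange: the agent with the better (smaller)
# quality index passes it on, degraded by one unit; exchanges require
# that both agents be alive.
def exchange(i_a, i_b, alive_a=True, alive_b=True):
    if not (alive_a and alive_b):
        return i_a, i_b                      # no exchange with the dead
    return min(i_a, i_b + 1), min(i_b, i_a + 1)

assert exchange(3, 20) == (3, 4)             # b adopts a's content, degraded
assert exchange(5, 5) == (5, 5)              # neither side gains anything
assert exchange(0, 9, alive_b=False) == (0, 9)
```

The same update applies symmetrically to the misinformation index j, so a single routine can serve both channels.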

To introduce spatial homogeneity into our simulation, we randomly distributed agents across the grid, adhering to a uniform distribution. The initial setting of the factual-information and misinformation quality constants at i = j = 50 for all agents, apart from those initially infected (for whom i = 0), serves a specific purpose. This value effectively represents a baseline of minimal factual information or misinformation, chosen because it yields an awareness-information/unawareness-information state that is almost negligible when applied to x = λ^i (and likewise y = λ_m^j), for λ, λ_m < 1. This ensures that at the start the majority of agents possess virtually no awareness or unawareness of the pandemic, with x ≈ y ≈ 0, setting a uniform stage for the spread and impact of information as the simulation progresses. The compartmental model is shown in Fig. 1.

Fig. 1.


SEIRD and communication model. Compartmental model of infectious disease dynamics and communication. The continuous lines show the transition between epidemiological states, the black dashed lines the infectious dynamics, and the red dashed lines the flow of information between agents and from the external source. Details of the parameters can be found in Table 1.

In our simulation's initial configuration, 100 agents are designated as infected, I(0) = 100, to catalyze the epidemiological dynamics from the outset, while the remaining 9,900 are susceptible, S(0) = 9,900. This initial infection number was chosen to strike a balance between a realistic start to an outbreak and computational manageability, ensuring that dynamic interactions between health states and information flow can emerge naturally. The model's capacity for agents to concurrently hold factual information and misinformation introduces nuanced dynamics that reflect the complexities of real-world information consumption and belief formation. For example, an agent might be aware of the need for handwashing (factual information) while also believing in unproven treatments such as bleach injection (misinformation), affecting its behavior in ways that are not straightforwardly predictable [54].
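The initialization described above can be sketched as follows; the dictionary-based agent representation is our own simplification of the NetLogo agents.

```python
import random

# Initialization sketch: 10,000 agents, 100 randomly chosen infected
# seeds carrying first-hand information (i = 0); everyone else starts
# susceptible with near-worthless information (quality index 50).
N, N_INFECTED = 10_000, 100
rng = random.Random(0)
infected = set(rng.sample(range(N), N_INFECTED))
agents = [{"state": "I" if a in infected else "S",
           "i": 0 if a in infected else 50,   # factual-information quality
           "j": 50}                           # misinformation quality
          for a in range(N)]
```

Random seeding of the infected agents reproduces the unpredictable entry points of a real outbreak while keeping the initial information landscape uniform.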

Initial infected agents were randomly selected, emphasizing the unpredictable nature of disease spread and ensuring diverse interactions within the simulation environment. This approach, coupled with the distinct starting points for information quality, lays a foundational framework for exploring how factual information, misinformation, and epidemiological states evolve and influence one another over time.

Results and discussion

As a starting point, we explore a scenario without misinformation, including only a component denoting preventive behavior (α) acting on the infection rate β. This inspection allows us to examine more closely how factual information alone influences the spread of the disease and the agents' responses. Subsequently, we extend our investigation to a more complex scenario that incorporates misinformation originating from a central entity and circulating via word of mouth, mirroring real-world conditions where factual information and misinformation coexist and compete. By comparing these two scenarios, we aim to uncover the effects of misinformation on the epidemic's dynamics and the population's behavior, highlighting the significance of the preventive and harmful behavior coefficients, α and γ, respectively, when navigating an environment riddled with factual information and misinformation.

A system with factual information

For our simulations, as detailed in Fig. 2, we employed the parameters delineated in Table 2, concentrating solely on the dissemination and ramifications of factual information. We proceed under the premise that the behaviors of individuals are in perfect concordance with the information they are exposed to, thereby adhering strictly to the recommended preventive actions.

Fig. 2.

Fig. 2

System with factual information. Exploration of the model for a scenario with factual information only. Panel (a) examines the time evolution of the susceptible-agent ratio across varying levels of the awareness decay constant λ and the ratio of informed individuals ρ, with a constant preventive behavior parameter α = 1.0. Panel (b) offers a density plot of the final ratio of susceptible agents at the simulation's conclusion as a function of λ and ρ, under the assumption of preventive behavior α = 1.0. Panel (c) revisits the time evolution of the susceptible-agent ratio for an array of λ and ρ values, this time with the preventive behavior parameter adjusted to α = 0.7. Panel (d) presents the corresponding density plot of the final ratio of susceptible agents for varying λ and ρ, with preventive behavior set at α = 0.7.

Table 2.

Parameters used in the different simulations. In Fig. 2, setting Inline graphic denotes the absence of misinformation, effectively negating the harmful behavior denoted by Inline graphic. In contrast, Figs. 3 and 4 introduce misinformation into the simulations. Each parameter set was executed 100 times.

Inline graphic Inline graphic Inline graphic Inline graphic Inline graphic Inline graphic
Figure 2 0.7, 1.0 0 Variable 0 Variable 0
Figure 3 1.0 1.0 1.0, 0.5 1.0 Variable Variable
Figure 4 1.0 1.0 1.0 1.0, 0.1 Variable Variable

The results, depicted in Fig. 2b and further elaborated in Supplementary Figure S1, illustrate how variations in the decay constant of factual information Inline graphic and the preventive behavior coefficient Inline graphic affect the simulation outcomes. The color gradient indicates the final number of susceptible individuals Inline graphic, with higher values shown in yellow, signaling more effective epidemic control.

Interestingly, even at the highest level of preventive behavior (Inline graphic), the region where the disease is effectively halted (yellow) is small relative to the regions where the disease has a greater impact (blue). This discrepancy becomes more pronounced as the preventive behavior level decreases to Inline graphic, a value that still represents markedly proactive behavior.

A critical observation from our analysis is that the yellow region, denoting effective disease control, emerges only when both the ratio of individuals informed with factual information Inline graphic and the awareness decay constant Inline graphic are high. For a population to benefit, a substantial proportion must remain well informed, and the information must retain its behavioral influence over time. Conversely, a decrease in either Inline graphic or Inline graphic invariably leads to adverse outcomes. These findings emphasize the importance of maintaining a high ratio of people informed with factual information Inline graphic and fostering effective preventive behaviors Inline graphic. Given the uncertainty surrounding Inline graphic in real-world scenarios, prioritizing high values of Inline graphic and Inline graphic could significantly enhance our ability to protect populations during epidemics.
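Because the decay constant drives these results, the convention is worth seeing in miniature. A hedged sketch, assuming awareness fades as A(t) = exp(-t/λ) (our illustrative functional form, not necessarily the model's exact one), so that a larger decay constant means the information retains its behavioral influence longer:

```python
import math

def awareness(t, lam):
    """Remaining behavioral influence of information at time t.

    Hypothetical form A(t) = exp(-t / lam): a LARGER decay constant lam
    means slower fading, i.e. factual information persists longer.
    """
    return math.exp(-t / lam)

# Awareness sampled at t = 0, 10, 20, 30, 40 for short- vs long-lived information.
short_lived = [round(awareness(t, lam=5.0), 3) for t in range(0, 50, 10)]
long_lived = [round(awareness(t, lam=50.0), 3) for t in range(0, 50, 10)]
print(short_lived)
print(long_lived)
```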

This analysis reveals the essential role of reliable information and behavioral compliance in controlling epidemic spread, offering valuable insights for optimizing public health responses.

A system with both factual information and misinformation

For simplicity, in this section, we initially assume that the dissemination rates of both factual information and misinformation among individuals are equivalent. The results of this augmented scenario are presented in Fig. 3, where, in contrast to Fig. 2, the y-axis now represents the awareness decay constant Inline graphic, and the x-axis denotes the misinformation decay constant Inline graphic.

Fig. 3.

System with variable factual information and constant misinformation. We analyze scenarios where the proportion of individuals informed with misinformation is fixed at 1.0, while the proportion receiving factual information varies from 1.0 to 0.5. Different configurations of Inline graphic—the misinformation and awareness decay constants, respectively—are denoted as points A, B, and C, with Inline graphic, Inline graphic, and Inline graphic. The harmful behavior adjustment factor is consistently set to Inline graphic. Panel (a) illustrates the time evolution of the susceptible agent ratio for configurations A, B, and C. Panel (b) displays a density plot of the final susceptible agent ratio at the simulation’s conclusion across varying levels of Inline graphic and Inline graphic. These panels correspond to a population informed with factual information and misinformation in equal proportions (1.0 each). Panel (c) repeats the time-evolution analysis for points A, B, and C under modified conditions. Panel (d) presents the corresponding density plot of the final susceptible agent ratio for different values of Inline graphic and Inline graphic. Here, the ratio of the population informed with factual information is reduced to 0.5, while the misinformation level remains at 1.0.

Examining the temporal evolution of our SEIRD model through Fig. 3a, we observe three distinct simulations—labelled A, B, and C—each characterized by unique pairs of Inline graphic and Inline graphic values: Inline graphic, Inline graphic, and Inline graphic. Interestingly, these simulations reveal a comparable final count of susceptible individuals, suggesting minimal variance in epidemic size across these parameter sets when Inline graphic of the population receives factual information.

This observation is further visualized in Fig. 3b, which displays a density plot of the final number of susceptible individuals. Notably, the region (depicted in yellow) indicating a high final number of susceptibles—corresponding to a low epidemic impact—is relatively small, even when the factual information dissemination rate is at 1.0. This indicates that universal dissemination of factual information alone does not significantly mitigate the spread of the epidemic.

To align our simulation more closely with real-world dynamics, we include a red line representing scenarios where Inline graphic. Given literature suggesting that misinformation tends to persist longer within populations1, we infer that realistic scenarios likely feature Inline graphic. Consequently, simulations falling beneath this red line are considered more plausible, implicitly excluding the optimistic yellow region from real-world applicability. This reinforces the conclusion that, despite global dissemination of factual information, the potential to control the epidemic remains limited when misinformation is present. Figure 3c,d further illustrate the system’s behavior when only Inline graphic of the population receives factual information. Comparison with the scenario in which Inline graphic of the population is informed with factual information shows that a higher dissemination rate only marginally improves outcomes. Moreover, as in the fully informed case, the favorable region predominantly lies above the red line, suggesting its improbability in a real-world context. This analysis underscores the complex challenge of achieving effective epidemic control through factual information dissemination alone, especially in the presence of competing misinformation. For a comprehensive view of how varying degrees of factual information dissemination affect the simulation—from a ratio of 0.0 (no factual information) to 1.0 (everyone informed with factual information)—please refer to Supplementary Figure S2.

Building on our earlier analysis, we now contrast two systems by adjusting the proportion of individuals receiving misinformation while keeping exposure to factual information constant across the entire population (ratio fixed at 1.0, meaning everyone is exposed to factual information). This comparison, visualized in Fig. 4, illustrates the strong impact of misinformation, as seen by comparing panels b and d. In Fig. 4b (identical to Fig. 3b), the entire population is misinformed, and the yellow area lies above the red line, indicating that the epidemic has almost no chance of being stopped. In the second scenario, shown in Fig. 4c,d, only Inline graphic of the population is misinformed; although the yellow area grows, it still remains above the red line. Recall that parameters below the red line represent the likely realistic condition in which the misinformation decay constant Inline graphic exceeds the awareness decay constant Inline graphic (people remember misinformation longer). This further diminishes the likelihood of optimistic scenarios, reinforcing the challenge of achieving meaningful public health outcomes even under a low amount of misinformation. For a comprehensive view of how varying degrees of misinformation dissemination affect the simulation—from a ratio of 0.0 (no misinformation) to 1.0 (everyone is misinformed)—please refer to Supplementary Figure S3. This extended analysis provides deeper insight into the dynamics at play and highlights the critical challenge posed by misinformation in managing public health responses effectively.
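The asymmetry flagged by the red line can be illustrated with a small numeric sketch. We assume, hypothetically (the additive form, parameter names, and values are ours, not the paper's equations), that awareness lowers the effective infection rate while retained misinformation raises it, each decaying exponentially with its own constant:

```python
import math

# Competing influences on the effective infection rate. Hypothetical form:
# beta_eff(t) = beta * (1 - p * A(t) + h * M(t)), with awareness A and
# retained misinformation M decaying exponentially at different speeds.

def beta_eff(t, beta=0.3, p=0.8, h=0.8, lam_aw=10.0, lam_mis=30.0):
    a = math.exp(-t / lam_aw)     # factual-information awareness
    m = math.exp(-t / lam_mis)    # retained misinformation
    return max(0.0, beta * (1.0 - p * a + h * m))

# When misinformation persists longer (lam_mis > lam_aw), the harmful term
# outlasts the preventive one, so the late-time rate exceeds the baseline:
print(beta_eff(60.0))   # above beta = 0.3 for these illustrative values
```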

Fig. 4.

System with constant factual information and variable misinformation. In this exploration of the model, we vary the ratio of individuals informed with misinformation from 1.0 to 0.1. Scenarios are illustrated through points A, B, and C, each representing distinct values of Inline graphic: Inline graphic, Inline graphic, and Inline graphic. Throughout, we maintain the behavior adjustment parameter at Inline graphic. Panel (a) details the time evolution of the susceptible agent ratio at the specified points A, B, and C. Panel (b) displays a density plot of the final ratio of susceptible agents at the end of the simulation for various levels of the awareness decay constant Inline graphic and the misinformation decay constant Inline graphic. In this scenario, the population is equally informed with both factual information and misinformation (ratio of 1.0 each). Panel (c) again shows the time evolution of the susceptible agent ratio for points A, B, and C under a modified information distribution. Panel (d) provides a density plot of the final susceptible agent ratio for different values of Inline graphic and Inline graphic. This analysis keeps the ratio of the population informed with factual information at 1.0 while reducing the misinformation spread to a ratio of 0.1.

Exploration of the solution landscape through dimensional reduction

In the previous section, we conducted simulations varying certain parameters (Inline graphic, Inline graphic, Inline graphic, Inline graphic) while keeping others constant (Inline graphic, Inline graphic), gaining preliminary insights into our model’s behavior and its potential real-world implications. Building upon this, we explore the full parameter space through a grid search to uncover the complete spectrum of possible model behaviors. To manage the complexity of the resulting multidimensional data—originating from a simulation space defined by six variable parameters—we employ the UMAP (Uniform Manifold Approximation and Projection) algorithm55 for dimensionality reduction and clustering. This process transforms each 7-dimensional vector (Inline graphic, Inline graphic, Inline graphic, Inline graphic, Inline graphic, Inline graphic, Inline graphic), where Inline graphic denotes the final count of susceptible individuals, into a more interpretable form, representing each simulation as a point in a two-dimensional UMAP plot.
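The grid search feeding UMAP can be sketched as follows. Parameter names, grid values, and the toy outcome function are illustrative assumptions standing in for full ABM runs; the point is the shape of the data: one 7-dimensional vector (six swept parameters plus the final susceptible count) per simulation.

```python
import itertools

# Hypothetical parameter grid; names and values are our assumptions.
grid = {
    "lam_aw":  [5.0, 20.0, 50.0],   # awareness decay constant
    "lam_mis": [5.0, 20.0, 50.0],   # misinformation decay constant
    "r_info":  [0.0, 0.5, 1.0],     # ratio informed with factual info
    "r_mis":   [0.0, 0.5, 1.0],     # ratio misinformed
    "p_prev":  [0.3, 0.7],          # preventive-behavior strength
    "h_harm":  [0.3, 0.7],          # harmful-behavior strength
}

def toy_s_final(lam_aw, lam_mis, r_info, r_mis, p_prev, h_harm, n=10_000):
    """Placeholder outcome: stands in for one full ABM/SEIRD run."""
    protection = p_prev * r_info * lam_aw - h_harm * r_mis * lam_mis
    return max(0.0, min(float(n), n * protection / 50.0))

names = list(grid)
vectors = [
    combo + (toy_s_final(**dict(zip(names, combo))),)
    for combo in itertools.product(*grid.values())
]
print(len(vectors), len(vectors[0]))  # 324 parameter combinations, 7 dims each
```

Each row of `vectors` could then be embedded with, for example, umap-learn's `UMAP(n_components=2).fit_transform(vectors)` to produce the two-dimensional map.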

The UMAP visualization, depicted in Fig. 5, reveals a critical observation: the yellow region, indicative of scenarios where the epidemic impact is minimal or halted, occupies a notably small portion of the solution landscape. This suggests that the majority of parameter combinations result in unfavorable outcomes, characterized by a low final number of susceptible individuals Inline graphic. A closer examination (see Sup. Figure S4) of this constrained yellow area reveals that successful outcomes correlate with higher awareness decay constants (implying that factual information persists longer in the population) and lower misinformation decay constants (indicating a shorter lifespan for misinformation). Further analysis highlights that scenarios conducive to positive public health outcomes are associated with a stronger emphasis on preventive behavior (higher Inline graphic) over harmful behavior (lower Inline graphic). Despite this, the prevailing real-world trend, that misinformation tends to circulate more extensively than accurate information (manifested as Inline graphic), renders these optimal scenarios challenging to achieve.

Fig. 5.

UMAP visualization. Exploration of the model across all parameter combinations. Each point on the visualization marks the final state of susceptible individuals from a single simulation. The yellow region shows an area where the final counts of susceptible individuals in the simulation are notably high.

In an additional visualization, presented in Fig. 6, we restrict the analysis to simulations where Inline graphic, consistent with literature1 suggesting that misinformation often outlasts factual information. This constraint greatly reduces the number of scenarios leading to the epidemic’s cessation, underscoring the formidable obstacle posed by the proliferation of misinformation. A closer look at this analysis (see Sup. Figure S5) shows that the remaining scenarios in which the final number of susceptible individuals is high are associated with low levels of harmful behavior (low Inline graphic).

Fig. 6.

UMAP visualization when Inline graphic (greater impact of misinformation, according to the literature). Exploration of the model across all parameter combinations, restricted to scenarios where Inline graphic. Each point represents the final state of susceptible individuals from a single simulation. The yellow region marks an area where the final counts of susceptible individuals are notably high.

Conclusion

This study has examined in detail the interplay between factual information and misinformation within the context of infectious disease outbreaks, with a particular focus on EVD. By leveraging a computational modeling approach that combines an ABM with a SEIRD compartmental model, we have studied the complex dynamics that govern the spread of infectious diseases alongside the propagation of both factual information and misinformation.

To do so, we introduced a comprehensive model integrating the dissemination of both infectious diseases and information among a population. This approach allowed us to investigate the pivotal role of information dissemination in combating the spread of infectious diseases, focusing on the impact of factually informed versus misinformed segments within the population. Our findings underscore the potential of strategic information delivery to effectively counter disease proliferation, drawing a parallel to the concept of vaccination in controlling outbreaks. Importantly, we propose that the efficacy of information in curbing disease spread is significantly influenced by the duration for which the information remains relevant within the community, highlighting the critical role of information decay over time.

To the best of our knowledge, the relationship between epidemics and the proliferation of misinformation remains largely uncharted in scholarly research, although a few studies56–61 have begun to address this complex interplay. Notably, Brainard et al. used an ABM to simulate the dissemination of misinformation during a norovirus outbreak, highlighting the significant threat misinformation poses during an epidemic59. A distinct aspect of our study is the incorporation of a temporal decay in the information assimilated by individuals, a feature not considered in the aforementioned research. Moreover, our analysis revealed the detrimental effects of misinformation on public health outcomes, emphasizing the need for robust strategies to mitigate its influence during the spread of an infectious disease.

Our research illuminates the profound influence misinformation can exert on escalating the transmission and severity of infectious diseases. The introduction of even minor amounts of misinformation can significantly amplify disease spread, primarily due to its capacity to encourage harmful behaviors, such as disregarding health advisories and adopting infection-prone practices. In contrast, the propagation of factual information serves as a bulwark in curtailing the impact of outbreaks by fostering preventive behaviors. Nevertheless, the potency of factual information is intricately linked to its dissemination scope and the durability of its behavioral influence. As we move forward, facing new pandemic threats6264, it is imperative to build upon this foundational work, expanding our understanding and developing innovative approaches to safeguard public health in the face of emerging threats.

Of note, the analytical outcomes of our model highlight that optimal public health scenarios, marked by a large reservoir of susceptible individuals at the epidemic’s conclusion, are attainable with higher awareness decay constants (longer-lasting factual information) and lower misinformation decay constants (shorter-lived misinformation). Such findings underline the importance of enduring factual information and of actively combating misinformation in epidemic control. Yet the pervasive spread and lasting presence of misinformation, as opposed to factual information, presents a formidable obstacle. In our favor, misinformation typically represents a smaller share of available information in the real world, as noted by Allen et al.65, a scenario that our model also explores.

Given the challenges associated with mitigating the spread of misinformation, coupled with the insights gleaned from our model and the relevant literature66,67, we suggest adopting techniques often utilized in the dissemination of misinformation, such as amplifying emotional engagement, to improve the spread and retention of factual information. Specifically, this approach aims to increase the awareness decay constant Inline graphic (raising the red line in our density plots, Figs. 3 and 4), thereby enhancing the effectiveness of accurate information dissemination.

By adopting methods that promote a longer decay period for awareness, we can aspire to shift the balance towards more favorable outcomes, as suggested by our model’s predictions. This points to a promising direction for public health communication strategies: harnessing emotional resonance to counteract the spread of misinformation effectively.

As a whole, our work signals the dire consequences of misinformation, urging the adoption of sophisticated communication strategies to enhance factual information spread and fortify societal defenses against both misinformation and infectious diseases. As the global community braces for future pandemics, it becomes imperative to expand upon these findings, crafting innovative strategies to protect public health against the twin threats of disease and misinformation.

Supplementary Information

Acknowledgements

This work was partially supported by Financiamiento Basal para Centros Científicos y Tecnológicos de Excelencia (CCTE) FB210008 to Fundación Ciencia & Vida. This material is based upon work supported by the Air Force Office of Scientific Research under award number FA9550-20-1-0196. The authors acknowledge economic support from Fondecyt de Exploración 13240042. We also acknowledge the National Laboratory for High-Performance Computing (NLHPC), Universidad de Chile. Powered@NLHPC. We thank Samuel Ropert for reviewing the manuscript and providing critical feedback that improved the article. The funders had no role in study design, data collection and analysis, decision to publish, or preparation of the manuscript.

Author contributions

AB and TPA conceptualized the study and drafted the manuscript. AB and TPA conducted the research. AB implemented the models and generated the results, tables, and figures. TPA acquired funding, administered the project, and supervised its execution.

Data availability

All data generated and analysed in this study, the simulation model implemented in NetLogo 6.1.1, and all scripts used for data processing and statistical analysis are publicly available in the GitHub repository https://github.com/alejob/paper_disease_misinformation

Declarations

Competing interests

The authors declare no competing interests.

Footnotes

Publisher’s note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Supplementary Information

The online version contains supplementary material available at 10.1038/s41598-025-18457-1.

References

  • 1.Vosoughi, S., Roy, D. & Aral, S. The spread of true and false news online. Science 359(6380), 1146–1151 (2018). [DOI] [PubMed] [Google Scholar]
  • 2.Lazer, D. M. et al. The science of fake news. Science 359(6380), 1094–1096 (2018). [DOI] [PubMed] [Google Scholar]
  • 3.Ferguson, N. Capturing human behaviour. Nature 446(7137), 733 (2007). [DOI] [PubMed] [Google Scholar]
  • 4.WHO, World Health Organization, infodemic, https://www.who.int/health-topics/infodemic#tab=tab_1, accessed: 2021-10-05 (2021).
  • 5.Gallotti, R., Valle, F., Castaldo, N., Sacco, P. & Domenico, M. Assessing the risks of ‘infodemics’ in response to COVID-19 epidemics. Nat. Hum. Behav.4(12), 1285–1293 (2020). [DOI] [PubMed] [Google Scholar]
  • 6.Tiwari, S., Kanchan, S., Subash, N. R. & Bajpai, P. Prevalence of authentic versus false information in general population during covid 19 pandemic. Indian J. Prevent. Social Med.51(2), 61–71 (2020). [Google Scholar]
  • 7.Linden, S., Roozenbeek, J. & Compton, J. Inoculating against fake news about covid-19. Front. Psychol.11, 2928 (2020). [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 8.Y. M. Rocha, G. A. de Moura, G. A. Desidério, C. H. de Oliveira, F. D. Lourenço, L. D. de Figueiredo Nicolete, The impact of fake news on social media and its influence on health during the covid-19 pandemic: A systematic review, Journal of Public Health 1–10 (2021). [DOI] [PMC free article] [PubMed]
  • 9.Campbell, E. M. & Salathé, M. Complex social contagion makes networks more vulnerable to disease outbreaks. Sci. Rep.10.1038/srep01905 (2012). [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 10.Khan, A., Naveed, M., Ahmad, M. D. E. & Imran, M. Estimating the basic reproductive ratio for the ebola outbreak in liberia and sierra leone. Infect. Diseases Poverty.10.1186/s40249-015-0043-3 (2015). [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 11.Gonsalves, G. & Staley, P. Panic, paranoia, and public health—The aids epidemic’s lessons for ebola. N. Engl. J. Med.371(25), 2348–2349 (2014). [DOI] [PubMed] [Google Scholar]
  • 12.Umeora, O. U. et al. Ebola viral disease in Nigeria: The panic and cultural threat. Afr. J. Med. Health Sci.13(1), 1 (2014). [Google Scholar]
  • 13.J. Hageman, C. Hazim, K. Wilson, P. Malpiedi, N. Gupta, S. Bennett, A. R. Kolwaite, A. J. Tumpey, K. J. Brinsley-Rainisch, B. E. Christensen, C. Gould, A. Fisher, M. Jhung, D. Hamilton, K. Moran, L. Delaney, C. Dowell, M. Bell, A. Srinivasan, M. Schaefer, R. Fagan, N. Adrien, N. Chea, B. J. Park, Infection prevention and control for ebola in health care settings - west africa and united states., MMWR supplements 65 3, 50–6. 10.15585/mmwr.su6503a8 (2016). [DOI] [PubMed]
  • 14.Vinck, P., Pham, P., Bindu, K. K., Bedford, J. & Nilles, E. Institutional trust and misinformation in the response to the 2018–19 Ebola outbreak in North Kivu, Dr Congo: A population-based survey., The Lancet. Infectious diseases19 5, 529–536. 10.1016/S1473-3099(19)30063-5 (2019). [DOI] [PubMed] [Google Scholar]
  • 15.Sell, T., Hosangadi, D. & Trotochaud, M. Misinformation and the us ebola communication crisis: Analyzing the veracity and content of social media messages related to a fear-inducing infectious disease outbreak. BMC Public Health.10.1186/s12889-020-08697-3 (2020). [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 16.Marzi, A. & Feldmann, H. Ebola virus vaccines: An overview of current approaches. Expert Rev. Vaccines13, 521–531. 10.1586/14760584.2014.885841 (2014). [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 17.Dhama, K. et al. Advances in designing and developing vaccines, drugs, and therapies to counter ebola virus. Front. Immunol.10.3389/fimmu.2018.01803 (2018). [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 18.Kermack, W. O. & Mckendrick, À. G. A contribution to the mathematical theory of epidemics. Proc. R. Soc. A Math. Phys. Eng. Sci. 115, 700–721 (1927).
  • 19.Huang, G. & Takeuchi, Y. Global analysis on delay epidemiological dynamic models with nonlinear incidence. J. Math. Biol.63, 125–139. 10.1007/s00285-010-0368-2 (2011). [DOI] [PubMed] [Google Scholar]
  • 20.Li, M. Y. & Muldowney, J. Global stability for the seir model in epidemiology. Math. Biosci.125 2, 155–64. 10.1016/0025-5564(95)92756-5 (1995). [DOI] [PubMed] [Google Scholar]
  • 21.T. Veloz, P. Maldonado, S. Ropert, C. Ravello, S. Mora, A. Barrios, T. Villaseca, C. Valdenegro, T. Perez-Acle, On the interplay between mobility and hospitalization capacity during the covid-19 pandemic: The seirhud model (2020). arXiv:2006.05357.
  • 22.Britton, T., Ball, F. & Trapman, P. A mathematical model reveals the influence of population heterogeneity on herd immunity to sars-cov-2. Science (New York, N.y.)369, 846–849. 10.1126/science.abc6810 (2020). [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 23.M. G. M. Gomes, R. M. Corder, J. G. King, K. e. Langwig, C. Souto-Maior, J. Carneiro, G. Gonçalves, C. Penha-Gonçalves, M. U. Ferreira, R. Águas, Individual variation in susceptibility or exposure to sars-cov-2 lowers the herd immunity threshold, Journal of Theoretical Biology 540 (2020) 111063 – 111063. 10.1016/j.jtbi.2022.111063. [DOI] [PMC free article] [PubMed]
  • 24.Kiss, I. Z., Cassell, J., Recker, M. & Simon, P. L. The impact of information transmission on epidemic outbreaks. Math. Biosci.225(1), 1–10 (2010). [DOI] [PubMed] [Google Scholar]
  • 25.Granell, C., Gómez, S. & Arenas, A. Dynamical interplay between awareness and epidemic spreading in multiplex networks. Phys. Rev. Lett.111(12), 128701 (2013). [DOI] [PubMed] [Google Scholar]
  • 26.Granell, C., Gómez, S. & Arenas, A. Competing spreading processes on multiplex networks: Awareness and epidemics. Phys. Rev. E90(1), 012808 (2014). [DOI] [PubMed] [Google Scholar]
  • 27.Funk, S., Gilad, E. & Jansen, V. Endemic disease, awareness, and local behavioural response. J. Theoret. Biol.264(2), 501–509 (2010). [DOI] [PubMed] [Google Scholar]
  • 28.Bernardin, A., Martínez, A. J. & Perez-Acle, T. On the effectiveness of communication strategies as non-pharmaceutical interventions to tackle epidemics. PloS One16(10), e0257995 (2021). [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 29.Funk, S., Gilad, E., Watkins, C. & Jansen, V. A. The spread of awareness and its impact on epidemic outbreaks. Proc. Natl. Acad. Sci.106(16), 6872–6877 (2009). [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 30.Chen, F. Modeling the effect of information quality on risk behavior change and the transmission of infectious diseases. Math. Biosci.217 2, 125–33. 10.1016/j.mbs.2008.11.005 (2009). [DOI] [PubMed] [Google Scholar]
  • 31.Samanta, S. & Chattopadhyay, J. Effect of awareness program in disease outbreak—A slow-fast dynamics. Appl. Math. Comput.237, 98–109. 10.1016/j.amc.2014.03.109 (2014). [Google Scholar]
  • 32.Funk, S., Salathé, M. & Jansen, V. A. Modelling the influence of human behaviour on the spread of infectious diseases: A review. J. R. Soc. Interface7(50), 1247–1256 (2010). [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 33.Zhang, H., Xie, J.-R., Tang, M. & Lai, Y. Suppression of epidemic spreading in complex networks by local information based behavioral responses. Chaos.10.1063/1.4896333 (2014). [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 34.Kremer, J. R. & Muller, C. P. Measles in Europe—There is room for improvement. Lancet373(9661), 356–358 (2009). [DOI] [PubMed] [Google Scholar]
  • 35.Abruzzi, W. Measles—A serious pediatric disease. J. Pediatr.64(5), 750–752 (1964). [DOI] [PubMed] [Google Scholar]
  • 36.Berger, J. & Milkman, K. L. What makes online content viral?. J. Marketing Res.49(2), 192–205 (2012). [Google Scholar]
  • 37.ABC, Nigerian ebola hoax results in two deaths, ABC News, retrieved from https://abcnews.go.com/Health/nigerian-ebola-hoax-results-deaths/story?id=25842191 (Sep. 2014).
  • 38.WHO Ebola Response Team. Ebola virus disease in West Africa—The first 9 months of the epidemic and forward projections. N. Engl. J. Med. 371, 1481–1495 (2014). [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 39.Victory, K. R., Coronado, F., Ifono, S. O., Soropogui, T. & Dahl, B. A. Ebola transmission linked to a single traditional funeral ceremony—Kissidougou, Guinea, December, 2014–January 2015. MMWR. Morbidity Mortality Weekly Rep.64(14), 386 (2015). [PMC free article] [PubMed] [Google Scholar]
  • 40.Weitz, J. S. & Dushoff, J. Modeling post-death transmission of ebola: Challenges for inference and opportunities for control. Sci. Rep.5, 8751 (2015). [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 41.Althaus, C. L. Estimating the reproduction number of Ebola virus (EBOV) during the outbreak in West Africa. PLoS Curr. Outbreaks 6, 1–11 (2014). [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 42.M. F. Gomes, A. P. y Piontti, L. Rossi, D. Chao, I. Longini, M. E. Halloran, A. Vespignani, Assessing the international spreading risk associated with the 2014 west African Ebola outbreak, PLOS Currents Outbreaks 6 (2014). [DOI] [PMC free article] [PubMed]
  • 43.D. Fisman, E. Khoo, A. Tuite, Early epidemic dynamics of the west african 2014 ebola outbreak: estimates derived with a simple two-parameter model, PLOS Currents Outbreaks 6 (2014). [DOI] [PMC free article] [PubMed]
  • 44.Dénes, A. & Gumel, A. B. Modeling the impact of quarantine during an outbreak of ebola virus disease. Infect. Disease Model.4, 12–27 (2019). [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 45.Abo, S. M. et al. Modelling the daily risk of ebola in the presence and absence of a potential vaccine. Infect. Disease Model.5, 905–917 (2020). [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 46.U. Wilensky, Netlogo. evanston, il: center for connected learning and computer-based modeling, northwestern university (1999).
  • 47.Sok, J. & Fischer, E. A. Farmers’ heterogeneous motives, voluntary vaccination and disease spread: An agent-based model. Eur. Rev. Agric. Econ.47(3), 1201–1222 (2020). [Google Scholar]
  • 48.E. Hunter, B. Mac Namee, J. Kelleher, An open-data-driven agent-based model to simulate infectious disease outbreaks. PloS One. 13(12), e0208775 (2018). [DOI] [PMC free article] [PubMed]
  • 49.Munkhbat, B. A computational simulation model for predicting infectious disease spread using the evolving contact network algorithm. Master Theses. 790 (2019).
  • 50.Ye, M. & Lyu, Z. Trust, risk perception, and COVID-19 infections: Evidence from multilevel analyses of combined original dataset in China. Soc. Sci. Med. 265, 113517 (2020). [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 51.E. Katz, P. F. Lazarsfeld, E. Roper, Personal influence: The part played by people in the flow of mass communications, Routledge, 2017.
  • 52.E. M. Rogers, A. Singhal, M. M. Quinlan, Diffusion of innovations, in: An integrated approach to communication theory and research, Routledge, pp. 432–448 (2014).
  • 53.Daley, D. J. & Kendall, D. G. Stochastic rumours. IMA J. Appl. Math.1(1), 42–55 (1965). [Google Scholar]
  • 54.Rivera, J. M. et al. Evaluating interest in off-label use of disinfectants for COVID-19. Lancet Digital Health2, e564–e566. 10.1016/S2589-7500(20)30215-6 (2020). [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 55.L. McInnes, J. Healy, J. Melville, Umap: Uniform manifold approximation and projection for dimension reduction, arXiv preprint arXiv:1802.03426 (2018).
  • 56.Sontag, A., Rogers, T. & Yates, C. A. Misinformation can prevent the suppression of epidemics. J. R. Soc. Interface19(188), 20210668 (2022). [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 57.Mumtaz, N., Green, C. & Duggan, J. Exploring the effect of misinformation on infectious disease transmission. Systems10(2), 50 (2022). [Google Scholar]
  • 58.Prandi, L. & Primiero, G. Effects of misinformation diffusion during a pandemic. Appl. Netw. Sci.5, 1–20 (2020). [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 59.Brainard, J., Hunter, P. & Hall, I. R. An agent-based model about the effects of fake news on a norovirus outbreak. Revue D’Épidémiologie Et De Santé Publique68(2), 99–107 (2020). [DOI] [PubMed] [Google Scholar]
  • 60.M. R. DeVerna, F. Pierri, Y.-Y. Ahn, S. Fortunato, A. Flammini, F. Menczer, Modeling the amplification of epidemic spread by misinformed populations, arXiv preprint arXiv:2402.11351 (2024). [DOI] [PMC free article] [PubMed]
  • 61.I. M. Bulai, M. Sensi, S. Sottile, A geometric analysis of the sirs compartmental model with fast information and misinformation spreading, arXiv preprint arXiv:2311.06351 (2023).
  • 62.Barclay, E. Predicting the next pandemic. Lancet (London, England)372, 1025–1026. 10.1016/S0140-6736(08)61425-7 (2008). [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 63.R. Izurieta, A. Campos, J. Parikh, T. Gardellini, Which plagues are coming next?, Contemporary Developments and Perspectives in International Health Security - Volume 2 [Working Title] (2021). 10.5772/intechopen.96820.
  • 64.G. Neumann, Y. Kawaoka, Which virus will cause the next pandemic?, Viruses 15 (2023). 10.3390/v15010199. [DOI] [PMC free article] [PubMed]
  • 65.Allen, J., Howland, B., Mobius, M., Rothschild, D. & Watts, D. J. Evaluating the fake news problem at the scale of the information ecosystem. Sci. Adv.6(14), eaay3539 (2020). [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 66.Roozenbeek, J. & Linden, S. Fake news game confers psychological resistance against online misinformation, Palgrave. Communications5, 1–10. 10.1057/s41599-019-0279-9 (2019). [Google Scholar]
  • 67.Lewandowsky, S. & Linden, S. Countering misinformation and fake news through inoculation and prebunking. Eur. Rev. Social Psychol.32, 348–384. 10.1080/10463283.2021.1876983 (2021). [Google Scholar]

Associated Data


Data Availability Statement

All data generated and analysed in this study, the simulation model implemented in NetLogo 6.1.1, and all scripts used for data processing and statistical analysis are publicly available in the GitHub repository: https://github.com/alejob/paper_disease_misinformation


Articles from Scientific Reports are provided here courtesy of Nature Publishing Group
