Abstract
We discuss challenges to implementing evidence-based practice within the broad field of public health preparedness and response and trace the field’s progress in building evidence and translating it to practice since the World Trade Center attacks of 9/11/2001.
We briefly describe analogies to struggles that other professional disciplines face, and we highlight key factors that facilitate and impede the implementation of evidence-based practice.
We recommend a partnership led by funding agencies and closely involving research organizations and professional associations as a means to ensure that the public health preparedness and response field continues to develop an evidence-based culture and practice.
For more than 15 years, the United States has invested substantially in building emergency preparedness and response capability and capacity. Spurred initially by the World Trade Center attacks and “Amerithrax” anthrax incidents in 2001 and reinforced by subsequent events such as the severe acute respiratory syndrome outbreak of 2003 and Hurricane Katrina in 2005, improving the nation’s ability to respond to and recover from “all hazards” became a priority codified in law.1 The Public Health Emergency Preparedness program, for example, has provided more than $11 billion to state and local health departments since 2002 to improve preparedness and emergency response infrastructure and systems. Programs touching other aspects of the health sector, such as the Hospital Preparedness Program, the Urban Areas Security Initiative, and the Cities Readiness Initiative, were also launched; these have provided substantial funding over the same period.2 Concurrent with these myriad investments were assessments of the status of the public health preparedness and response (PHPR) field, with the consensus view being that the scientific evidence base for PHPR activities was severely lacking.3,4
Evidence-based practice (EBP) in public health entails selecting and implementing programs, developing policies, and assessing progress and outcomes on the basis of scientific evidence.5 The National Institutes of Health clarifies that basing decisions and actions on scientific evidence means the “application of principles of scientific reasoning, including systematic uses of data and information systems, and appropriate use of behavioral science theory and program planning models.”6 Despite hundreds of millions of dollars expended on research to address PHPR challenges since the terrorist attacks of 9/11/2001, the evidence base for the field is still described as weak because of heavy reliance on anecdotal reports, narrative reviews, or studies with limited generalizability.7 Although progress has been made in recent years with the execution of well-designed and analytically sound research programs and projects,8,9 the ideal of a preparedness and response field fully grounded in scientific evidence has not yet been realized.
Implementation of EBPs involves challenges separate and distinct from the difficulties associated with building and validating evidence in PHPR. Because outcomes of interest in PHPR are often related to improvements in systems and processes (rarely analogous to the discrete outcomes of clinical research), developing and assessing the evidence in PHPR requires a more nuanced view than does the model traditionally put forth for conducting and grading evidence in medicine. Moreover, disasters, epidemics, and other adverse health events are heterogeneous and usually unexpected, adding complexity to attempts to differentiate the systems, processes, and contexts that are common versus unique to different event types.
Insights from disciplines outside traditional public health and medicine, including systems science and operational fields that have grappled with bringing an evidence-informed approach to their decision-making (e.g., law enforcement), may prove useful for science in PHPR.10–12 Recognizing these conceptual and methodological difficulties, the Centers for Disease Control and Prevention (CDC) recently sponsored the National Academies of Sciences, Engineering, and Medicine to convene a multidisciplinary panel of experts to conduct a large-scale review aimed at assessing the progress of the past 15 years, bringing greater coherence to the existing body of knowledge, and making recommendations for the way forward.13
Despite the complexities of research in this field and the need for a framework tailored to PHPR for assessing the strength of evidence, recent work suggests that a significant downstream problem is inconsistent translation and implementation of scientific knowledge to practice, even where useful evidence exists. For example, surveying the PHPR practice community to determine highest priority information needs, researchers found that disconnects exist not only between practitioners and academic subject matter experts but also between practitioners at Public Health Emergency Preparedness–funded preparedness and response programs (primarily in state and directly funded jurisdictional health departments) and those based in local health departments.
Local health departments, often stretched thin, had on average less awareness of existing research-based information and expressed significantly greater information needs. Additionally, Public Health Emergency Preparedness program directors had substantially different views of the highest priority research needs of the field than did local officials.14 Meanwhile, in this special issue of AJPH, Baseman et al. (p. S369) found that implementation of scientifically informed programs likely to improve PHPR communication practices during emergencies is hindered by a variety of organizational obstacles—despite ready availability of evidence-based tools and trainings. These findings suggest that scientific knowledge that could improve practice often exists but practitioners are either unaware or unable to implement this knowledge in their work.
Difficulty with implementation of EBPs is not unique to the public health preparedness and emergency response arena. The history of other areas of public health and medicine, as well as nonhealth domains such as education15 and law enforcement,16 reveals long and difficult struggles with implementing EBPs. This is particularly true in disciplines in which training and development rely on experiential knowledge and techniques passed on from highly influential master practitioners to their apprentices, without the benefit of systematically evaluating the outcomes that drive standards of practice for the field.17,18 In disciplines such as medicine, for example, the shift to evidence-based care involved decades of cultural change, driven in part by externalities such as increased accountability for clinical outcomes and substantial financial and reputational liability for institutions and practitioners when standards of care are disregarded.19–22
THE CHALLENGES OF IMPLEMENTATION
The implementation of EBP in the PHPR community can be stalled for several reasons. An obvious obstacle is the potential gap between the studies that academic researchers are interested in conducting and the research-based information that state, local, territorial, and tribal public health practitioners require to improve outcomes. Universities and other academic institutions have strong incentives to conduct research and continually develop new programs and practices, especially for promulgation in the academic literature. However, health care and public health delivery systems have different incentives to implement new practices or programs. In addition, practitioners who are aware of limitations in current practice and genuinely interested in EBPs may lack training or experience in articulating their programmatic or practice challenges to researchers in a manner that lends itself to framing scientific studies.14 In resource-constrained environments, opportunities for close engagement of practitioners and researchers may be sporadic or nonexistent.
Although communication challenges between the practice community and researchers can hinder the development of useful applications of current science, even EBPs that would be of considerable benefit may remain largely unknown or underused. Studies on the implementation of EBPs in medicine illustrate this pattern: although medical practitioners place a high value on EBPs, they face significant individual and organizational barriers that limit their ability to access, synthesize, and implement these practices.23–25 For example, evidence-based practices that fall outside the “patient flow culture” of emergency departments (e.g., conducting a nutritional assessment), even if beneficial to longer-term patient outcomes, will be resisted or ignored, whereas status quo practices that support the rapid patient flow goal (e.g., guidelines that speed triage) will persist.26 Practitioners often face multiple barriers to implementing EBPs, including lack of personal time, lack of financial incentive to shift practice behavior, lack of access to information, insufficient support staff and other resources, and the perception that there is either too little evidence or conflicting evidence surrounding a specific practice.27
The existing body of knowledge on dissemination and implementation has yielded some consistent factors related to barriers and facilitators of EBPs. Increasing practitioner involvement with research at multiple stages (generation, participation, and consumption) can increase uptake of EBPs. However, this is only successful if practitioners are both motivated to seek out new information and able to access and understand the implications of research-based recommendations.24 Leadership also plays a crucial role in facilitating the uptake of EBPs: organizations with senior leaders who encourage innovation are more likely to implement and sustain such practices.28 Additionally, organizations choose programs to implement on the basis of congruence with their mission, demonstrated effectiveness in real-world settings, the ability to meet a specific organizational need, and cost-effectiveness.29 Although none of these factors are surprising, all should be routinely considered by those developing EBPs and disseminating them to PHPR organizations.
In a thought-provoking analysis, Kreuter and Wang30 compared private sector product innovation, development, marketing, and distribution to the dissemination and implementation of EBPs in public health. They observe, “Across a range of domains, promising products and ideas routinely fail to gain widespread adoption. In many cases, such failure is not just common, it is an overwhelming probability.”30(p12) By contrast, they note that many researchers in public health seem to operate under the assumption that “every empirically supported intervention should be pushed into wider dissemination,” despite the fact that successful uptake of an evidence-based program or practice requires far more than strong scientific evidence.30(p13)
EVIDENCE DOMAINS VS DEMAND
Ballew et al.31 have argued that evidence in public health comes in three different forms. Type 1 evidence describes a particular issue or problem and indicates “something should be done.” Type 2 evidence proposes an intervention: “This should be done.” Type 3 evidence goes beyond the first two forms and informs us “How something should be done.”31(p186) Ballew et al.31 note that type 3 evidence is often unavailable from published academic articles yet is invaluable in guiding practitioners and policymakers in implementing programs in the real world. Additionally, as Kreuter and Wang remind us with respect to the uptake of EBPs, practitioner “preferences, needs, and capacity matter, as do social forces like colleagues’ opinions and perceived practice norms.”30(p13)
Central to Kreuter and Wang’s discussion is the critical difference between evidence and demand, which they see as analogous to the concepts of “push” and “pull” of dissemination. Scientific evidence leads the academic community to push interventions out to potential adopters, whereas demand, or pull, is what potential adopters actually want because they see these interventions or programs as consequential to their work. Ultimately, Kreuter and Wang put forth the propositions that (1) many evidence-based programs are simply not worth implementing, (2) research-tested versions of programs are rarely ready for widespread use, and (3) most researchers and program developers make poor disseminators and implementers.30 They summarize the state of affairs in public health as follows:
Researchers are busy developing and testing programs. Practitioners are busy delivering programs and services but are open to better solutions when practical and feasible. Between the two lies a substantial gulf that neither group of professionals is particularly well suited to bridge.30(p16)
Although bridging this gulf has been daunting, Kreuter and Wang30 see solutions parallel to those used by private sector firms in the product development and marketing arenas. They conclude that dissemination must be more demand driven, programs and products must be made “practice ready,” and specialists in translation and implementation—not researchers—are needed to ensure the spread and uptake of innovative programs. To address these needs, they propose three components of a “dissemination support system” in public health: (1) user review panels made up of decision-makers from key stakeholder organizations involved in public health practice that identify “practice ready” products, (2) design and marketing teams for EBPs, and (3) dissemination field agents to champion EBPs across the practice community.30 This dissemination support system goes beyond academic–practice partnerships by addressing specific expertise and resources required for successful translation and implementation.
SUPPORTING DISSEMINATION AND IMPLEMENTATION
Where should the dissemination support system for the preparedness and response field reside, and who should sustain it? Such a system will require components that work “hand-in-glove” among three sets of organizations: (1) universities and institutes conducting preparedness and response research, (2) agencies such as the CDC that fund states and localities and have a major stake in translating science to practice, and (3) groups that represent the needs of practice communities, such as the Association of State and Territorial Health Officials and the National Association of County and City Health Officials. Of these three categories of organizations, funding agencies may be best positioned to encourage or even mandate the uptake of EBPs through grant and cooperative agreement requirements; ensuring the application of EBP is also a matter of good stewardship of public funds. However, funding agencies need to be cognizant that successful EBP implementation often hinges on tailoring practices to local needs.32,33
Although Kreuter and Wang’s discussion is compelling, an implication of their argument is that practitioners generally know best what they need and that researchers must bend to the practitioner market when determining which questions to address scientifically and which programs to prioritize for promotion. For the PHPR field, our agreement with this notion is qualified. If one accepts that PHPR has operated for years with a minimal or, at best, uneven evidence base, it is perilous to bend the scientific enterprise solely to practitioner perceptions. Although practitioners are indeed the end users and the paramount source of experiential knowledge to inform the academic community about practice gaps, their varied professional backgrounds and the organizational cultures in which they operate may constrain their perceptions, much as such factors limit the understanding of academics.
In striving to tailor EBP development and dissemination to practitioner demand, it is worthwhile to consider that innovations can originate from within the academic community—innovations that the PHPR field may not immediately recognize as such but that have potential for positive impact over time. Practitioner demand should drive EBP development, even as scientific knowledge emerging from the research community continually informs and shapes that demand. We believe this interplay will contribute to a cultural shift toward reliance on EBP in preparedness and emergency response.
The articles in this special issue span a diverse range of topics, including effective design and dissemination of preparedness training and tools, identification of gaps in knowledge and variation in preparedness capacities, and techniques to address barriers to EBP in preparedness. The knowledge, tools, and settings described in these articles provide an exciting sampling of efforts to disseminate and implement research-informed practice in PHPR programs, and they reveal both successes and long-standing challenges to the uptake and sustainment of EBPs in health agencies.
We have observed a shift in the landscape of preparedness and response science over the past 15 to 20 years: gaps in translation and implementation may now be a greater challenge than gaps in scientific knowledge. This is not to suggest that all the questions that can be answered by research have been answered—far from it. PHPR comprises a remarkably broad array of domains and topics, and new research findings, as well as each subsequent natural disaster or disease outbreak, often raise novel questions. However, the science of dissemination, translation, and implementation now looms large in determining how and when the preparedness and response field will evolve into a truly evidence-based professional discipline. This continued shift is crucial for protecting the health of the public and for ensuring that we can clearly communicate the benefits of preparedness and response programs to policymakers and funding agencies. In our view, agencies that fund preparedness and response programs must continue to support science and drive the uptake of EBPs in partnership with implementation champions at state and local levels.
REFERENCES
- 1. Pandemic and All-Hazards Preparedness Act, Pub L No. 109-417, 109th Congress (December 19, 2006).
- 2. Toner E. Healthcare preparedness: saving lives. Health Secur. 2017;15(1):8–11. doi:10.1089/hs.2016.0090
- 3. Lurie N. Public Health Preparedness in the 21st Century. Santa Monica, CA: RAND Corporation; 2006.
- 4. Nelson C, Lurie N, Wasserman J. Assessing public health emergency preparedness: concepts, tools, and challenges. Annu Rev Public Health. 2007;28:1–18. doi:10.1146/annurev.publhealth.28.021406.144054
- 5. Brownson RC, Fielding JE, Maylahn CM. Evidence-based public health: a fundamental concept for public health practice. Annu Rev Public Health. 2009;30:175–201. doi:10.1146/annurev.publhealth.031308.100134
- 6. National Institutes of Health. Evidence based public health: a guide about evidence based public health information and resources. Available at: https://nihlibrary.nih.gov/resources/subject-guides/evidence-based-public-health. Accessed January 12, 2018.
- 7. Khan Y, Fazli G, Henry B, et al. The evidence base of primary research in public health emergency preparedness: a scoping review and stakeholder consultation. BMC Public Health. 2015;15:432. doi:10.1186/s12889-015-1750-1
- 8. Lurie N, Manolio T, Patterson AP, Collins F, Frieden T. Research as a part of public health emergency response. N Engl J Med. 2013;368(13):1215–1255. doi:10.1056/NEJMsb1209510
- 9. Carbone EG, Wright MM. Hurricane Sandy recovery science: a model for disaster research. Disaster Med Public Health Prep. 2016;10(3):304–305. doi:10.1017/dmp.2015.140
- 10. Sherman LW, MacKenzie DL, Farrington DP, Welsh BC, eds. Evidence-Based Crime Prevention. London: Routledge; 2002.
- 11. Sherman LW. Evidence-Based Policing. Washington, DC: Police Foundation; 1998.
- 12. Links JM, Schwartz BS, Lin S, et al. COPEWELL: a conceptual framework and system dynamics model for predicting community functioning and resilience after disasters. Disaster Med Public Health Prep. 2018;12(1):127–137. doi:10.1017/dmp.2017.39
- 13. Institute of Medicine. Evidence-based practices for public health emergency preparedness and response: assessment of and recommendations for the field. 2018. Available at: http://nationalacademies.org/hmd/Activities/PublicHealth/PublicHealthPreparedness.aspx. Accessed March 1, 2018.
- 14. Siegfried AL, Carbone EG, Meit MB, Kennedy MJ, Yusuf H, Kahn EB. Identifying and prioritizing information needs and research priorities of public health emergency preparedness and response practitioners. Disaster Med Public Health Prep. 2017;11(5):1–10. doi:10.1017/dmp.2016.198
- 15. Sebba J. Developing evidence-informed policy and practice in education. In: Thomas G, Pring R, eds. Evidence Based Practice in Education. Maidenhead, UK: Open University Press; 2004:34–43.
- 16. Bueermann J. Being Smart on Crime With Evidence-Based Policing. Washington, DC: US Department of Justice; 2012.
- 17. Polavarapu H, Kulaylat A, Sun S, Hamed O. 100 years of surgical education: the past, present, and future. Bull Am Coll Surg. 2013;98(7):22–27.
- 18. Siddiqui S. Of mentors, apprenticeship, and role models: a lesson to relearn? Med Educ Online. 2014;19:25428. doi:10.3402/meo.v19.25428
- 19. Rothstein WG. American Medical Schools and the Practice of Medicine: A History. New York, NY: Oxford University Press; 1987.
- 20. Claridge JA, Fabian TC. History and development of evidence-based medicine. World J Surg. 2005;29(5):547–553. doi:10.1007/s00268-005-7910-1
- 21. Evidence-Based Medicine Working Group. Evidence-based medicine: a new approach to teaching the practice of medicine. JAMA. 1992;268(17):2420–2425. doi:10.1001/jama.1992.03490170092032
- 22. Zimerman AL. Evidence-based medicine: a short history of a modern medical movement. Virtual Mentor. 2013;15(1):71–76. doi:10.1001/virtualmentor.2013.15.1.mhst1-1301
- 23. Patelarou AE, Laliotis A, Brokalaki H, Petrakis I, Dafermos V, Koukia E. Readiness for and predictors of evidence-based practice in Greek healthcare settings. Appl Nurs Res. 2016;32:275–280. doi:10.1016/j.apnr.2016.08.010
- 24. Williams B, Perillo S, Brown T. What are the factors of organisational culture in health care settings that act as barriers to the implementation of evidence-based practice? A scoping review. Nurse Educ Today. 2015;35(2):e34–e41. doi:10.1016/j.nedt.2014.11.012
- 25. Foster A, Worrall L, Rose M, O’Halloran R. “That doesn’t translate”: the role of evidence based practice in disempowering speech pathologists in acute aphasia management. Int J Lang Commun Disord. 2015;50(4):547–563. doi:10.1111/1460-6984.12155
- 26. Kirk JW, Nilsen P. Implementing evidence-based practices in an emergency department: contradictions exposed when prioritising a flow culture. J Clin Nurs. 2016;25(3–4):555–565. doi:10.1111/jocn.13092
- 27. Baig M, Sayedalamin Z, Almouteri O, Algarni M, Allam H. Perceptions, perceived barriers and practices of physicians towards evidence-based medicine. Pak J Med Sci. 2016;32(1):49–54. doi:10.12669/pjms.321.8841
- 28. Bennett S, Allen S, Caldwell E, et al. Organisational support for evidence based practice: occupational therapists’ perceptions. Aust Occup Ther J. 2016;63(1):9–18. doi:10.1111/1440-1630.12260
- 29. Hannon PA, Fernandez ME, Williams RS, et al. Cancer control planners’ perceptions and use of evidence-based programs. J Public Health Manag Pract. 2010;16(3):E1–E8. doi:10.1097/PHH.0b013e3181b3a3b1
- 30. Kreuter MW, Wang ML. From evidence to impact: recommendations for a dissemination support system. New Dir Child Adolesc Dev. 2015;149:11–23. doi:10.1002/cad.20110
- 31. Ballew P, Brownson RC, Haire-Joshu D, Heath GW, Kreuter MW. Dissemination of effective physical activity interventions: are we applying the evidence? Health Educ Res. 2010;25(2):185–198. doi:10.1093/her/cyq003
- 32. Ebener PA, Hunter SB, Adams RM. Outcomes Guide for Community Emergency Preparedness. Santa Monica, CA: RAND Corporation; 2017.
- 33. Wang S, Moss JR, Hiller JE. Applicability and transferability of interventions in evidence-based public health. Health Promot Int. 2006;21(1):76–83. doi:10.1093/heapro/dai025