Abstract
The goal of this project was to enhance the capacity of local health departments to translate and implement evidence-based programs in emergency preparedness by using the Getting To Outcomes approach. In our evaluation, local health department staff reported improved capacities. A “Getting To Outcomes Guide for Community Emergency Preparedness” guidebook was produced and is available online.
Improving the capacity of local health departments (LHDs) to engage communities in their own preparedness is a national health security priority, but that capacity is often lacking.
INTERVENTION
The intervention included two components: (1) training and technical assistance for LHD partners to adapt, implement, evaluate, and improve household disaster preparedness programs in their jurisdictions, and (2) creation and use of a guidebook for LHDs to translate and implement evidence-based programs (EBPs).
PLACE AND TIME
We implemented this intervention in 2016 at three large LHDs in southern California (staff size, 340–2800; jurisdictional population, 470 000–3.2 million), which received the Centers for Disease Control and Prevention’s Public Health Emergency Preparedness (PHEP) funding. We selected the LHDs on the basis of preexisting relationships with our research team.
PERSON
Three teams of emergency preparedness program staff at the LHDs participated, including program directors, emergency managers, program coordinators, public health nurses, and health educators. From three to six staff members participated at each LHD (n = 12).
PURPOSE
PHEP Capability One focuses on community preparedness, but LHDs often have only a limited, scattered set of evidence-based approaches for improving preparedness and may lack guidance on identifying, translating, and implementing EBPs. We aimed to enhance the capacity of public health practitioners to adapt these programs to their populations’ needs, while maintaining fidelity to a program’s evidence core.1
IMPLEMENTATION
We chose to use Getting To Outcomes (GTO) to build the LHDs’ workforce capacity to translate and implement EBPs in emergency preparedness. GTO is a 10-step process for planning, implementing, evaluating, and improving programs; it was commissioned by the Substance Abuse and Mental Health Services Administration in 1999, originally for substance abuse prevention programs. GTO has been applied to multiple content domains (www.rand.org/gto) and shown in randomized trials to improve the capacity of practitioners, fidelity of programs, and outcomes of program participants in multiple domains.2 The first six steps, which focus on developing a program plan, are
1. needs and resources assessment,
2. development of goals and outcomes,
3. identification of EBPs and best practices,
4. adaptation of EBPs to fit context,
5. capacity assessment, and
6. program plan development.
The subsequent four steps are meant to be completed after program implementation begins:
7. process evaluation,
8. outcome evaluation,
9. continuous quality improvement, and
10. sustainability.
Through the GTO approach, practitioners receive three key supports to facilitate progression through the 10 steps: (1) written tools that stimulate decision-making for each step (e.g., “Getting To Outcomes Guide for Community Emergency Preparedness” guidebook), (2) training, and (3) technical assistance consistent with the facilitation model of implementation support. Technical assistance is often supported by grant funding in GTO projects.
We provided each of the teams of participating staff from the three LHDs with three separate 2-hour, in-person trainings that together covered all of the GTO steps. A faculty member and doctoral student from the research team conducted the trainings at each of the LHD offices. Prior to each training, the research team completed templates of the GTO step-by-step tools, tailored to community emergency preparedness. The templates were further developed into a GTO guidebook with additional background information and resources relevant to developing an emergency preparedness program. Throughout this project, the research team developed sections of the GTO guidebook corresponding to each training, with a final guidebook produced at the project’s completion.
Using the GTO guidebook, the research team trained LHD staff to use the GTO tools. Each training consisted of a presentation and a team-based workshop facilitated by the researchers. During the first training, the LHDs began to define their target populations and desired outcomes. Using this information, the research team helped them identify relevant EBPs, drawing on the researchers’ knowledge and a national list of preparedness EBPs.3,4 During the second training, the LHDs began to solidify their program plans as they adapted the EBPs. The adaptation process preserved each EBP’s core components, the basic principles required to obtain the expected outcomes,5 while the research team modified noncore components to meet the needs of the target population and fit the LHDs’ capacity to implement the program. During the final training, the LHDs began to develop plans for program evaluation, continuous quality improvement, and sustainability.
In between the in-person trainings, the LHDs completed the tools included in the GTO guidebook. To support them in this process, the doctoral student—supervised by faculty—provided technical assistance through conference calls held once or twice per month, depending on LHD staff availability.
Two of the LHDs adapted educational EBPs with a train-the-trainer format that aimed to improve household preparedness among older adults through stockpiling disaster supplies (e.g., medication) and developing disaster plans. The third aimed to enhance Medical Reserve Corps volunteers’ willingness and ability to respond in a disaster by focusing on their household disaster kits and plans, as well as risk communication and trauma support skills. Each of the LHDs piloted its program, with plans to later evaluate whether it improved emergency preparedness among the LHD’s target populations.
EVALUATION
The research team conducted pre- and postassessments of LHD staff. Because of staff turnover and competing responsibilities, which are common problems in LHDs, there were fewer respondents in the postsurvey (n = 9) than in the presurvey (n = 12). The GTO Practitioner Capacity Scale assessed the perceived knowledge and skills required to implement the best practices incorporated into the 10 GTO steps. Practitioner knowledge to complete GTO steps increased by 9.5%, although the result was nonsignificant (P = .448; Table 1). Practitioner skills increased significantly, by 25.4% (P = .005; Table 2). Despite the small sample size, our evaluation suggests that the intervention can improve capacity among small, engaged groups of practitioners.
TABLE 1—
Practitioner Knowledge Itemsa | Baseline Average Score (n = 12) | Follow-Up Average Score (n = 9) | % Change | t | P |
Evaluate current programming, project, or initiative to assess whether it is meeting its goals and objectives by analyzing and interpreting new or existing data. | 1.83 | 2.00 | 9.3 | 0.482 | .64 |
Assess community strengths in programming by examining existing resources such as existing programs, staff, and availability of volunteers. | 1.83 | 2.33 | 27.3 | 1.590 | .13 |
Determine whether any best practice or evidence-based program is compatible with the goals and objectives of your new program, project, or initiative. | 1.83 | 2.00 | 9.3 | 0.595 | .60 |
Develop objectives (i.e., what you want to change) that are directly linked to program, project, or initiative goals. | 2.42 | 2.33 | −1.4 | 0.249 | .81 |
Examine how the new program, project, or initiative will fit with the philosophy of your local health department. | 2.25 | 2.00 | −11.1 | 0.772 | .45 |
Use results from an evaluation of the program, project, or initiative to improve implementation—such as modifying an activity of the program, project, or initiative. | 2.00 | 2.22 | 11.0 | 0.932 | .36 |
Develop a plan to sustain the program, project, or initiative if it is successful (i.e., determine future funding sources). | 1.83 | 2.44 | 33.3 | 2.491 | .022 |
Average knowledge score | 2.00 | 2.19 | 9.5 | 0.774 | .49 |
a Range = 1 (would need a great deal of help to carry out this task) to 3 (could carry out this task without any help).
TABLE 2—
Practitioner Skills Itemsa | Baseline Average Score (n = 12) | Follow-Up Average Score (n = 9) | % Change | t | P |
Examine your community’s current need. | 4.08 | 5.11 | 25.2 | 1.652 | .12 |
Determine the availability of resources (e.g., staff) in your community. | 4.80 | 5.22 | 8.8 | 0.591 | .56 |
Develop goals (e.g., short-term, intermediate, or long-term) to address your community’s needs (e.g., to improve preparedness). | 4.33 | 4.56 | 5.3 | 0.335 | .74 |
Locate, evaluate, or use best practices. | 4.00 | 4.67 | 16.8 | 1.101 | .29 |
Examine whether your programs, projects, or initiatives duplicate existing efforts in your community. | 3.33 | 4.11 | 23.4 | 1.071 | .30 |
Develop a detailed implementation plan (e.g., staff roles, timelines, target population locations) for your programs, projects, or initiatives. | 3.67 | 4.89 | 33.2 | 1.651 | .12 |
Evaluate whether programs, projects, or initiatives are implemented according to plan. | 3.42 | 4.67 | 36.5 | 1.639 | .12 |
Evaluate how well your programs, projects, or initiatives produced the desired improvements in the participants. | 3.17 | 4.89 | 54.3 | 2.495 | .022 |
Use evaluation feedback to improve your programs, projects, or initiatives. | 3.75 | 5.22 | 39.2 | 1.899 | .07 |
Take actions to keep your programs, projects, or initiatives running. | 4.58 | 5.56 | 21.4 | 1.400 | .18 |
Average skills score | 3.90 | 4.89 | 25.4 | 3.208 | .005 |
a Range = 1 (never) to 7 (very often).
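The percent-change figures in Tables 1 and 2 are derived directly from the baseline and follow-up mean scores. As a minimal sketch (using the summary-row values from the tables; the t statistics cannot be recomputed here because they require respondent-level data), the calculation is:

```python
def pct_change(baseline: float, follow_up: float) -> float:
    """Percent change from the baseline mean to the follow-up mean."""
    return (follow_up - baseline) / baseline * 100

# Summary rows from Tables 1 and 2:
knowledge = pct_change(2.00, 2.19)  # average knowledge score
skills = pct_change(3.90, 4.89)     # average skills score
print(round(knowledge, 1))  # 9.5
print(round(skills, 1))     # 25.4
```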
ADVERSE EFFECTS
The results of the Evidence-Based Practice Attitude Scale6 show a small, nonsignificant (P = .462) reduction in positive attitudes toward EBPs (3.1%) that is unlikely to be meaningful in this small sample (results not presented). We also assessed participants’ perceived organizational support of EBPs via the Organizational Support for Evidence-Based Practices Scale.7 We observed a nonsignificant 21.5% decrease (P = .850) in perceptions of health departments’ organizational support for EBP translation, which was largely driven by a reduction in perceptions about health departments providing financial incentives to use EBPs (results not presented).
SUSTAINABILITY
We tailored GTO to emergency preparedness in the “Getting To Outcomes Guide for Community Emergency Preparedness.” It has tools that facilitate the completion of each step and links to additional resources such as risk assessment tools, examples of EBPs, and evaluation instruments. The final version of the guidebook is available at https://www.rand.org/pubs/tools/TL259.html and https://cphd.ph.ucla.edu/tools-and-resources. As with other capacity development interventions, the knowledge and skills developed remain after program completion and can be applied to future efforts.
PUBLIC HEALTH SIGNIFICANCE
The number of evidence-based emergency preparedness programs is growing4 at the same time that our nation needs to implement these programs in the face of rising disaster severity and frequency. This project demonstrated that a GTO approach could enhance capacity for EBP translation and implementation. Widespread use of the GTO Guide for Community Emergency Preparedness to translate and implement these EBPs could improve public health emergency preparedness nationally. The GTO guide and accompanying training and technical assistance can be used by any LHD. Funding should be made available to LHDs to support working with academic partners in the training and technical assistance components of GTO.
ACKNOWLEDGMENTS
This study was supported under a cooperative agreement with the Centers for Disease Control and Prevention’s (CDC’s) Collaboration With Academia to Strengthen Public Health Workforce Capacity (grant 3 U36 OE000002-04 S05), funded by the CDC and the Office of Public Health and Preparedness and Response through the Association of Schools and Programs of Public Health (ASPPH).
Note. The contents of this article are solely the responsibility of the authors and do not necessarily represent the official views of the CDC, the Department of Health and Human Services, or the ASPPH.
REFERENCES
1. Ennett S, Ringwalt C, Thorne J, et al. A comparison of current practice in school-based substance use prevention programs with meta-analysis findings. Prev Sci. 2003;4(1):1–14. doi: 10.1023/a:1021777109369.
2. Chinman M, Acosta J, Ebener P, Malone PS, Slaughter M. A cluster-randomized trial of Getting To Outcomes’ impact on sexual health outcomes in community-based settings. Prev Sci. 2018;19(4):437–448. doi: 10.1007/s11121-017-0845-6.
3. Eisenman D, Glik D, Gonzalez L, et al. Improving Latino disaster preparedness using social networks. Am J Prev Med. 2009;37(6):512–517. doi: 10.1016/j.amepre.2009.07.022.
4. PERRC toolkits. Available at: https://cdn1.sph.harvard.edu/wp-content/uploads/sites/1609/2017/03/PERRC-Toolkit-Inventory.pdf. Accessed April 27, 2018.
5. Blase K, Fixsen D. Core intervention components: identifying and operationalizing what makes programs work. ASPE Research Brief. February 1, 2013. Available at: https://aspe.hhs.gov/report/core-intervention-components-identifying-and-operationalizing-what-makes-programs-work. Accessed January 8, 2018.
6. Aarons GA. Mental health provider attitudes toward adoption of evidence-based practice: the Evidence-Based Practice Attitude Scale (EBPAS). Ment Health Serv Res. 2004;6(2):61–74. doi: 10.1023/b:mhsr.0000024351.12294.65.
7. Aarons GA, Sommerfeld DH, Walrath-Greene CM. Evidence-based practice implementation: the impact of public versus private sector organization type on organizational support, provider attitudes, and adoption of evidence-based practice. Implement Sci. 2009;4:83. doi: 10.1186/1748-5908-4-83.