Abstract
Context
Sustainability, defined as the existence of structures and processes that allow a program to leverage resources to effectively implement and maintain evidence-based public health, is important in local health departments (LHDs) for retaining the benefits of effective programs.
Objective
Explore the applicability of the Program Sustainability Framework in high- and low-capacity LHDs as defined by national performance standards.
Design
Case study interviews conducted June–July 2013. Standard qualitative methodology was used to code transcripts; codes were developed both inductively and deductively.
Setting
Six geographically diverse LHDs (three high- and three low-capacity)
Participants
35 LHD practitioners
Main Outcome Measures
Thematic reports explored the eight domains (Organizational Capacity, Program Adaptation, Program Evaluation, Communications, Strategic Planning, Funding Stability, Environmental Support, and Partnerships) of the Program Sustainability Framework.
Results
High-capacity LHDs described having environmental support, while low-capacity LHDs reported this was lacking. Both high- and low-capacity LHDs described limited funding; however, high-capacity LHDs reported greater funding flexibility. Partnerships were important to high- and low-capacity LHDs, and both described building partnerships to sustain programming. Regarding organizational capacity, high-capacity LHDs reported better access to and support for adequate staff and staff training compared to low-capacity LHDs. While high-capacity LHDs described integrating program evaluation into implementation and sustainability, low-capacity LHDs reported limited capacity for measurement specifically and evaluation generally. When high-capacity LHDs described program adoption, they discussed it as an opportunity to adapt and evaluate; low-capacity LHDs struggled with programs requiring adaptation. High-capacity LHDs described higher quality communication than low-capacity LHDs. High- and low-capacity LHDs both described strategic planning, but high-capacity LHDs reported efforts to integrate evidence-based public health.
Conclusions
Investments in leadership support for improving organizational capacity, improvements in communication from the top of the organization, integrating program evaluation into implementation, and greater funding flexibility may enhance sustainability of evidence-based public health in LHDs.
Keywords: Program sustainability, Evidence-based public health, Local health departments
Introduction
Today, there is substantial evidence regarding effective public health interventions, programs, and policies.1–4 Evidence-based decision making (EBDM) in local health departments (LHDs), and other public health settings, is the process of translating the best available scientific data about effective programs and policies into practice while considering the local needs and resources of the community.3 The Public Health Accreditation Board (PHAB) has included EBDM in the standards for LHDs pursuing or maintaining voluntary accreditation status.5 Use of evidence-based programs and policies as part of EBDM involves implementation of programs and policies that have been shown to be effective, for example, those in the Community Guide.2 An evidence-based process is related less to the actual programs and policies, and more to the processes within the LHD, for example, leadership, organizational climate and culture, and relationships and partnerships.6
Barriers to EBDM have been well defined and include lack of relevant research, leadership characteristics, and the current political environment.3, 7, 8 In addition, ineffective dissemination of evidence-based programs and policies is a barrier to adoption in LHDs.9, 10 Less is known about what contributes to the sustained use of evidence-based programs and policies after they have been adopted in LHDs; this remains an understudied area in dissemination and implementation science.6, 11
Sustainability has been defined as “the existence of structures and processes that allow a program to leverage resources to effectively implement and maintain evidence-based policies and activities”.12 This concept is complex and has varying terminology; in this definition, the structures could also include resources such as strong organizational infrastructure and leadership. Luke et al. developed the Program Sustainability Framework to assess public health program capacity for sustainability, which includes eight domains.12–14 Five domains (Organizational Capacity, Program Adaptation, Program Evaluation, Communications, and Strategic Planning) are thought to fall in the internal locus of control and involve activities that primarily occur or are managed within the program itself. The remaining three domains (Funding Stability, Environmental Support, and Partnerships) are grouped into the external locus of control, as they are more greatly influenced by factors external to the program. The purpose of this study is to explore differences between high- and low-capacity LHDs across the sustainability framework domains.
Methods
Case Study Guide Development
The interview guide was developed based on previous literature,15–19 prior work by members of the project team,9, 10 and project team input to explore LHD use of EBDM. The guide included the following interview topics: 1) biographical information; 2) awareness of the existence of evidence-based programs and policies and an evidence-based process; 3) administrative support for an evidence-based process; 4) knowledge of the LHD accreditation process; 5) political climate and support for evidence-based programs and policies; 6) dissemination strategies that would further evidence-based programs and policies; and 7) key networks and partnerships to support EBDM. The interview guide was also based, in part, on findings from a national survey conducted by the research team, which examined use of administrative and management evidence-based practices in LHDs.19–21 Interview guide questions were developed to qualitatively supplement the data gaps from the national survey.20, 21 In May 2013, our case study guide underwent cognitive response testing (CRT) to identify questions that were unclear or potentially difficult to answer. CRT is routinely used in refining questionnaires to improve the quality of data collection. These 45–60 minute phone interviews were conducted with directors of LHDs in Missouri and Tennessee by the project manager. The CRT sample (n=6) was selected by members of the research team and included both urban and rural LHDs. Upon verification of consent, all interviews were audio recorded and field notes were taken during testing. Participants were instructed to provide feedback on questions lacking clarity and items that could be viewed as potentially difficult to answer. After the tester verbalized each question, the participant was allowed time to provide relevant feedback on each item. Information from these interviews was used to modify items and formulate the revised questionnaire for reliability testing. The final interview guide included 37 questions across the seven interview topics previously listed.
Case Study Sample Selection
The case study sample was selected using an administrative evidence-based process score from the national survey (described elsewhere20) and was linked to secondary data from the National Public Health Performance Standards Program (NPHPSP). The sample included LHDs falling in the top and bottom quartiles of each survey’s respective scoring system. Seventy-one (14%) LHDs from the national administrative evidence-based processes sample had overall performance score data from NPHPSP. In concordance with NPHPSP scoring methodology, an overall performance score was computed as a simple average of the 10 Essential Public Health Services scores and then ranked into quartiles. We defined “high-capacity” as NPHPSP scores in the top quartile (n=17) and “low-capacity” as scores in the bottom quartile (n=17). Of the 16 LHDs in the final sample, three that scored high and three that scored low on both measures were selected as case study sites. Based on this selection, the high-capacity LHDs were more likely to have leadership and an organizational culture and climate supportive of evidence-based practice (as evidenced by their administrative evidence-based process scores), providing an environment that supports ongoing evidence-based practices and policies.
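For readers who want to see the scoring arithmetic concretely, the following is a minimal sketch in Python (pandas assumed). The DataFrame and the `eps_1` … `eps_10` column names are hypothetical stand-ins for the linked NPHPSP data; the actual linkage and scoring are described in the cited sources.

```python
# Minimal sketch of the capacity classification described above.
# Column names (eps_1 ... eps_10) and the input DataFrame are hypothetical.
import pandas as pd

def classify_capacity(lhds: pd.DataFrame) -> pd.DataFrame:
    eps_cols = [f"eps_{i}" for i in range(1, 11)]  # 10 Essential Public Health Services scores
    # Overall performance score: a simple (unweighted) average of the 10 EPS scores
    lhds["overall_score"] = lhds[eps_cols].mean(axis=1)
    # Rank into quartiles (1 = bottom quartile, 4 = top quartile)
    lhds["quartile"] = pd.qcut(lhds["overall_score"], q=4, labels=[1, 2, 3, 4])
    # High capacity = top quartile; low capacity = bottom quartile
    lhds["capacity"] = lhds["quartile"].map({4: "high", 1: "low"})
    return lhds
```

In the study itself, case study sites were then drawn from the LHDs falling in the extreme quartiles on both the survey and NPHPSP measures.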
Case Study Interviews
Interviews were conducted with 35 practitioners from six LHDs (three low- and three high-capacity) in June–July 2013. The participants included the LHD director or deputy director, who then selected key members of the LHD’s management team responsible for overseeing a division or work unit of the department. An average of five to six interviews was conducted at each of the six LHDs. Each interview lasted between 30 and 60 minutes, depending on the length of answers and the knowledge of the practitioner, and was conducted by two members of the research team. All participants provided informed consent before the interview began. This study was approved by the Human Research Protection Office at Washington University in St. Louis.
Case Study Analysis
The interviews were audio recorded with the respondents’ permission and transcribed verbatim. Standard qualitative methodology was used for data coding in NVivo software. Four team members were trained on coding to ensure reliability among raters. Coders were assigned transcripts to code independently, developing the codebook to capture new themes and subcategories; updated codebooks were distributed after each coding session. Coding pairs then systematically coded three interviews in NVivo, noting any discrepancies and alternate coding. Once these transcripts were coded and the codebook refined, inter-rater reliability was evaluated in NVivo, with a final percent agreement among coders of 98%. Node reports were generated to explore common themes in the high-capacity and low-capacity LHDs. These reports were then summarized into thematic reports for each questionnaire item in each of the eight domains of the sustainability framework.
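NVivo reports percent agreement directly; purely as an illustration of the arithmetic behind the 98% figure, a sketch of percent agreement for a coding pair might look like the following (the code assignments in the example are hypothetical).

```python
# Illustrative percent-agreement calculation for a coding pair.
# NVivo computes this directly; the segment codings below are hypothetical.
def percent_agreement(coder_a: list, coder_b: list) -> float:
    if len(coder_a) != len(coder_b):
        raise ValueError("Both coders must rate the same transcript segments")
    matches = sum(a == b for a, b in zip(coder_a, coder_b))
    return 100.0 * matches / len(coder_a)

# Example: coders agree on 49 of 50 segments -> 98.0
a = ["funding"] * 25 + ["partnerships"] * 25
b = ["funding"] * 25 + ["partnerships"] * 24 + ["communication"]
print(percent_agreement(a, b))  # 98.0
```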
Summary Table
From the thematic reports, a table was generated to compare similarities and differences of high-capacity and low-capacity LHDs using the eight sustainability framework domains defined by Luke et al.12: environmental support, funding stability, partnerships, organizational capacity, program evaluation, program adaptation, communications, and strategic planning. High- and low-capacity LHDs were compared based on their responses to similar themes.14 This information was based primarily on the thematic report on sustainability, but was drawn from other thematic reports as well. Based on the sustainability table, specific themes and patterns were identified and explored.
Results
Table 1 shows the characteristics of the six LHDs included in the final sample, and Table 2 summarizes the themes identified for each of the sustainability domains, separated for high- and low-capacity LHDs; illustrative quotes are provided. Except where it is noted that the responses were similar, respondents from high- and low-capacity LHDs responded differently to the same themes, as illustrated by the quotations. Though each quote comes from a single respondent, it illustrates the perspective of the type of LHD where that respondent works.
Table 1.
| LHD Characteristics | n | % |
|---|---|---|
| Governance Structure | | |
| Local | 2 | 33.3 |
| Shared | 2 | 33.3 |
| State | 2 | 33.3 |
| Geographic Region | | |
| Northeast | 1 | 16.6 |
| Midwest | 2 | 33.3 |
| South | 2 | 33.3 |
| West | 1 | 16.6 |
| Jurisdiction Size | | |
| <25,000 | 0 | 0 |
| 25,000–49,999 | 2 | 33.3 |
| 50,000–99,999 | 2 | 33.3 |
| 100,000–499,999 | 1 | 16.6 |
| 500,000+ | 1 | 16.6 |
Table 2.
| Sustainability Domain | Description | High-Capacity LHDs | Low-Capacity LHDs |
|---|---|---|---|
| Environmental Support | Having a supportive internal and external climate for your program | “Public health is whatever the legislature says it is. So if they want us sampling beach water and give us money to do that, then that's what we will do. Whether there's any benefit to that or not, is not our decision, if that makes sense.” | “We've mostly received a lot of opposition in the political climate for our programs, and even though we're able to show that they are evidence-based, and what we're trying to do really does help in the long run, people find it as intruding on their personal rights.” |
| Funding Stability | Establishing a consistent financial base for your program (meeting long-term needs, adjusting to changing trends, having a plan) | “So we’re not the best paying agency in the world. I hate to say that. And I don’t have a lot of control over that. Not the best paying agency in the world, so we have to take what we can get, sometimes.” “They're willing to spend the money, if need be, or reach out to others in the community.” | “Because we do not have latitude in how we spend money, I think there are probably… it probably impedes our ability to think about solutions to problems that could be affected had we been able to obtain and sustain.” |
| Partnerships | Cultivating connections between your program and its stakeholders (connecting to greater resources/expertise, taking over providing services, advocating for the cause) | “We have to have collaborative people in there. Without them, we just couldn’t achieve a third of what we achieve.” | “There’s no way we could do it on our own. No way. You have to have the community, you have to have the community partners, because if it’s community, then that means you don’t do it by yourself.” “If you’re not doing it, you’re assured that someone else is doing it.” |
| Organizational Capacity | Having the internal support and resources needed to effectively manage your program | “Our director and our senior management team […] have to be of a mind to where we prioritize the evidence-based needs above other pet projects or other types of needs.” “[We] involve the frontline staff to help us develop the solutions to the problems, so it’s not just us as the managers saying, this is what you have to do. We actually get them involved in the whole process.” | “Sometimes we have support in, yes, we would like for you to do that. We think that's great, but time is limited, our funds are limited, and this is what you have to work with.” “If we had more personnel and more resources and more funding, we could be better equipped to implement evidence-based practices always. We have a hard time ever doing any evaluation, just because of limited resources.” |
| Program Evaluation | Assessing your program to inform planning and document results (staying on track with goals/outcomes, collecting data about successes/impact to gain support and funding) | “We plan it, we implement it, we check the data, then we go through the cycle again.” “We now have a standardized quality improvement (QI) process that’s written and each division within the department is asked to participate in one quality assurance project annually.” | “They're not measuring anything, and of course I've only been here two years, and my question is, What are we measuring, and how are we evaluating the program to determine the impact? […] If we can’t show where the outcomes truly are, are we really making an impact? Is that where we need to be spending our dollars?” |
| Program Adaptation | Taking actions that adapt your program to ensure its ongoing effectiveness | “We will identify the problem, obviously, and then go through a process by which we evaluate where the issues and the concerns are and then put into place an action plan.” “Usually people are pretty enthusiastic when it comes to something new. And this is kind of our reputation at this particular public health [department?], try it, if it doesn’t work, move on.” | “You’ve got to prove not only that it works other places, but just because something works somewhere else, doesn’t mean that it’s going to work here, in the eyes of a lot of people.” “The evidence-based programs that exist out there may or may not always be the perfect fit for your program and success in it and trying to put everybody into a mold is… we’re all different.” |
| Communication | Strategic communication with stakeholders and the public about your program (internal and external) | “There is formal monthly meetings and then there is the dissemination via emails and policies that are posted. I don’t think we ever will just post a policy and say, ‘here’s a new policy.’ No. So we’ll at least send the changes by email and talk about them.” “Our director and our senior management team […] have to be of a mind to where we prioritize the evidence-based needs above other pet projects or other types of needs.” | “You’ve just got to show it locally that those things are important and that they matter, they work.” “Our culture is pretty clear that wasted energy on non-evidence-based practices and programs is just that, wasted energy, and really not fitting in with our vision and mission.” |
| Strategic Planning | Using processes that guide your program’s direction, goals, and strategies | “We are very forward thinking and have a vision of where we want this department to go and what we want it to be, I think those are very strong factors in continuing to sustain evidence-based interventions” | “I don't think we've got a systematic way of doing it. I think unfortunately, I have to admit, that it's oftentimes simply forced on us by way of grant opportunities.” |
Environmental Support
The need for environmental support and its importance as an underlying part of the way LHDs operate was recognized by both high- and low-capacity LHDs because, as one participant described, “Public health is whatever the legislature says it is.” However, staff from the high-capacity LHDs described having this support, while low-capacity LHD staff reported it was lacking. In some cases, low-capacity LHDs reported that elected officials and/or health boards were not supportive of programs and even denied the evidence showing the need for them. Although respondents did not discuss other environmental or economic supports, internal politics (i.e., organizational culture and climate) related to the environment was a recurring theme throughout our interviews. The differences between high- and low-capacity LHDs are discussed under Organizational Capacity, below.
Funding Stability
Although both high- and low-capacity LHDs described limited funding and short funding periods, which constrained sustainability, high-capacity LHDs reported greater flexibility in how funds could be used. Thus, funding limitations were reported as a much greater problem by staff at low-capacity LHDs. LHDs did report that programs with one-time expenses were more easily sustained when resources were limited. Low-capacity LHDs were also more likely to see financial constraints as limiting problems rather than as barriers to overcome. They described very little latitude in how funds are used, and this was seen as a major barrier to sustaining evidence-based programs and policies. Further, low-capacity LHDs were limited in their ability to fully implement mandated programming due to funding constraints.
Partnerships
Partnerships were seen as very important to both high- and low-capacity LHDs, and both groups described building partnerships to sustain programming and the necessity of these partnerships to the success of their work. Among high-capacity LHDs, buy-in from partners, achieved particularly by helping partners see program benefits, was seen as a way for programming to be built into a partner’s budget. Low-capacity LHDs similarly saw partnerships as a way to increase sustainability and build community buy-in. They also considered it a marker of success when a partner organization began to ‘own’ a program, and viewed partnerships as an important way to share resources and expertise. Both high- and low-capacity LHDs recognized the effort required to sustain partnerships over time or after a particular program or assessment activity was completed, and noted that these struggles related to staffing and funding.
Organizational Capacity
Both high- and low-capacity LHDs recognized the importance of having adequate staff and staff training. High-capacity LHDs reported having adequate staff and better access to staff training compared to low-capacity LHDs. Low-capacity LHDs reported that maintaining adequate staffing was often a barrier and that recruiting staff was difficult due to salary constraints. In addition, staff at low-capacity LHDs described a disconnect in support from the top to bottom levels of leadership, where upper-level management may not be supportive even when the director is. In contrast, some high-capacity LHDs communicated the importance of EBDM by providing their staff with more opportunities for growth and by involving staff at all levels in decision making.
Program Evaluation
While high-capacity LHDs described integrating evaluation into program implementation and sustainability, low-capacity LHDs reported much more limited capacity for measurement specifically and evaluation more generally. The measurement capabilities described by high-capacity LHDs stemmed from better training for department staff and a culture that prioritized measurement and evaluation, whereas low-capacity LHDs reported being hampered in evaluation efforts by lack of staff capacity and by funding priorities. These differences in measurement capacity were described in the context of both program evaluation and quality improvement (QI), as high-capacity LHDs leveraged their measurement capabilities for both processes. Further, high-capacity LHDs saw program success and results showing improved health as key factors for sustainability.
Program Adaptation
The major difference that emerged between high- and low-capacity LHDs regarding program adaptation was that when high-capacity LHDs described programs that might not fit, they viewed this as an opportunity to evaluate and adapt. Low-capacity LHDs had more difficulty getting past the lack of fit and trying new programs; they described needing proof that a program works locally. Additionally, several low-capacity LHDs reported that innovation and new ideas were not always welcome. These challenges were attributed, in part, to limited funding flexibility and environmental support.
Communication
Open communication was reported more frequently among high-capacity LHDs than low-capacity LHDs, and much of this difference seemed to come from the top down. This was particularly relevant to communicating the importance of EBDM to the organization and to changing perceptions within the LHD so that EBDM became part of its culture. In high-capacity LHDs, a consistent message was reported to come from the top down, reaching all levels of the organization. Among low-capacity LHDs, there was a perception of poor communication about the importance of programs and why EBDM matters, as well as a lack of consistency from the managerial level.
Strategic Planning
High- and low-capacity LHDs both described strategic planning, but high-capacity LHDs reported a greater effort to integrate EBDM into their strategic plans. High-capacity LHDs also described more progress in such planning, including ensuring from the outset that anything implemented would last. Low-capacity LHDs, on the other hand, seemed to be just beginning to incorporate EBDM into their strategic planning, if at all, and this process was less systematic. These LHDs did, however, report having begun health assessments and/or improvement plans, as well as recognizing the need to examine whether programming needs existed or whether others in the community were already filling them.
Discussion
This analysis shows that while there are some similarities between high- and low-capacity LHDs, there are differences that could explain why some LHDs have more capacity for sustaining programs than others. These differences tended to fall under the sustainability framework domains considered to be within internal, rather than external, control. Organizational capacity (particularly leadership support for building capacity), program evaluation, program adaptation, communication (with important differences across the organization’s hierarchy), and funding stability showed important differences between high- and low-capacity LHDs. Though there were some differences, partnerships, strategic planning, and environmental support showed more similarities between the two groups. A previous study likewise found community partnership processes to be similar in high- versus low-capacity agencies.22 Several of the sustainability domains identified through earlier research and the current study have been shown to be modifiable through a combination of training, practice changes, or analytic approaches.7, 23
Leadership support is important because leaders have the ability to affect the adoption of EBDM directly, through allocation of resources (human and material), and indirectly, through encouragement, support, and mentorship.22 Researchers in other health fields have stated that “leadership is critical to build organizational readiness for change”.24 Our findings reflect the importance of leadership in that many of the sustainability domains that low-capacity LHDs struggled with—particularly the internal factors, as well as partnerships, to some degree—were influenced by lack of support or leadership disconnect. High-capacity LHDs seemed to have more active, progressive, and supportive leadership when it came to EBDM, whereas low-capacity LHDs reported that some of their upper-level management was not as supportive.
Leaders’ priorities (i.e., program implementation, measurement, and evaluation) can intentionally or unintentionally affect what programs are implemented or how they are measured and evaluated. If EBDM is mentioned, discussed, or emphasized by organizational leaders but not by mid-level management, it creates an inconsistency within the organization that affects how programs are implemented, evaluated, and prioritized. A similar effect occurs when leaders do not allocate resources in a way that supports EBDM, model positive attitudes towards the use of evidence-based programs and policies, or show sufficient knowledge of evidence-based processes.25 The role of leadership in communication was especially important, as, in the current study, high-capacity LHDs mentioned consistent, constant communication of expectations and information, whereas low-capacity LHDs mentioned poorer communication about the importance of EBDM and a lack of consistency from those in leadership positions. An absence of clarity and consistency among leaders of an organization, even across multiple levels, can influence the implementation and impact of certain initiatives.24 This can reduce organizational capacity as was observed among several of the low-capacity LHDs in the sample.
Previous research distinguishes between transactional leadership, which is based on practical exchanges and emphasizes specific accomplishments and objectives, and transformational leadership, which aims to reach goals through inspiration and motivation and emphasizes specific values.26 Although both transformational and transactional leadership can be associated with encouraging attitudes towards EBDM and staff trying new things, transformational leadership specifically makes staff more likely to find new practices appealing, more likely to adopt them, and more likely to perceive fewer gaps between EBDM and their current practices.27 This may be particularly important for the differences observed between high- and low-capacity LHDs with regard to program adaptation, and the enthusiasm in high-capacity LHDs to adapt programs as opposed to the hesitancy to approach programs requiring adaptation among low-capacity LHDs. Leaders in low-capacity LHDs might seek to adopt this leadership style, which might build a positive attitude toward innovation among their staff.
Improving communication within the LHD, with a focus on the top of the organizational structure, may be important to improving program sustainability. Lack of leadership support, especially from mid-level management, makes it difficult to communicate the importance of EBDM, explain the reasoning behind it, and build enthusiasm and buy-in about its effectiveness.23–26 Internal communication should be seen as a dialogic process of engagement, clarification, negotiation, and perspective-taking, rather than as mere information exchange. This is made even more important by differences between leaders’ and employees’ views of what constitutes good communication. The best predictors of successful change (according to employees) are the sense that employee input is valued, leadership with a clear vision, and measures in place to reduce resistance to change.28 A review of administrative and management evidence-based practices to improve local public health identified participatory decision-making, which involves communicating with employees to get their input, as an effective way to create an environment conducive to EBDM.6 Efforts by low-capacity LHDs to improve internal communication might include incorporating participatory decision-making.
Agencies that have created a QI culture are more likely to have a history of EBDM, to have data collection systems and methods in place, and to see barriers such as budget cuts or health crises as opportunities for improvement. The link between implementation and evidence-based programs and policies is important; we found that LHDs with strong QI practices were more likely to have cultures supportive of evidence-based practices. As mentioned above, this was observed in our sample: high-capacity LHDs reported adapting programs to fit their department, whereas low-capacity LHDs viewed programs not designed specifically to fit the LHD with skepticism. A QI culture also tends to correspond with a perception of QI as something important to the agency, rather than something required by the accreditation process. In contrast, agencies with more informal QI processes are more likely to consider QI a part of accreditation and to see such barriers as overwhelming or insurmountable.29 This was reflected in the differing responses between high-capacity and low-capacity LHDs about program evaluation, as well as program adaptation. High-capacity LHDs were more likely to see evaluation as important and to use that evaluation to adapt or adopt new programs, whereas low-capacity LHDs were more likely to report lacking established methods of evaluation and cited a lack of appropriate programs or funding as a significant barrier. Factors needed to facilitate a QI culture include agency leaders and staff committed to using QI processes, alignment of QI practices with strategic goals, prior experience with QI or EBDM, leaders, partners, and boards that hold the agency accountable for service quality, and a supportive infrastructure with sufficient resources to maintain a QI culture.27–29 For many high-capacity LHDs, program evaluation has involved creating a culture of improvement within the agency that makes evaluation an expectation; when they do face barriers, it is most often in formalizing the process. While some of the challenges faced by low-capacity LHDs are not modifiable, it might be feasible to begin shifting the culture of the organization toward one more supportive of QI. Similar findings have been observed in high- compared to low-capacity primary care practices with regard to implementation of patient-centered medical homes.30 This supports the notion that many factors that may enhance implementation, such as the way challenges are perceived, differ based on organizational capacity, as does the role of organizational leadership in helping to overcome implementation challenges.30
The differences described in program adaptation and evaluation may be related to funding. This is particularly important given recent trends toward funding and job cuts.31 Though funding was a major issue for all the LHDs interviewed, low-capacity LHDs reported much less flexibility in using available funds and in seeking new funds. The challenge of adaptation may be particularly pronounced for low-capacity LHDs, given the risk that an adapted program or practice fails in light of limited resources. Further, limited flexibility may prevent the use of funds for efforts that sustain EBDM, such as program adaptation and evaluation, even though these may ultimately lead to more effective uses of resources. Unfortunately, other work has shown that many new programs are not sustained past the first few years after initial funding has ended.30 Therefore, efforts to adapt and evaluate programs may feel wasted in an atmosphere where programs are not sustained, which may further discourage low-capacity LHDs from investing time in such efforts. Other studies of LHDs have noted local sources of funding as important to program effectiveness, suggesting that local sources of revenue may make LHDs more responsive to local needs, thereby improving intervention effectiveness.32, 33 This may partially explain the finding by Luke et al. that funding stability was one of the two domains most closely associated with perceived program sustainability.12
This study had limitations. Only a small sample of LHDs was included, which may limit the generalizability of the findings; however, we sought a representative sample with regard to geography and capacity. Though we interviewed several employees from each LHD, it was not possible to gain perspectives from every employee. In addition, the LHD director identified the staff to be interviewed, introducing potential selection bias. LHDs may also have tried to present a favorable approach to EBDM in the interviews, introducing social desirability bias. However, this study is strengthened by the in-depth nature of the qualitative data collection, allowing a rich picture of the LHDs to emerge. While the sample was selected to identify high- and low-capacity LHDs, this selection was based on scores assessing capacity, not on actual implementation and sustainment of evidence-based programs and policies (i.e., there was no objective method of measuring implementation/maintenance). Therefore, it is possible that those identified as high-capacity, based on the metrics used, were not actually implementing and sustaining evidence-based programs and policies. Finally, this case study design does not allow for longitudinal assessment, so it is not possible to determine the direction of cause and effect.
Organizational capacity, program evaluation, program adaptation, communication, and funding stability seem to be related to whether an LHD is able to sustain programming. Modest investments in leadership support for improving organizational capacity, improvements in communication from the top of the organization down, integration of program evaluation into implementation, and greater flexibility in funding may enhance the sustainability of evidence-based programming in LHDs. Increased top-down communication and program evaluation would help to build an internal agency culture that research suggests would be more resilient to external pressures, such as funding instability or complicated political environments. Integration of the findings from these case studies may help improve the sustainability of evidence-based programming in LHD settings.
Acknowledgments
The authors thank members of our research team, Dr. Paul Erwin (University of Tennessee, Knoxville), Carolyn Leep (National Association of County and City Health Officials), Dr. Rodrigo Reis (the Pontifical Catholic University of Paraná and the Federal University of Paraná), Janet Canavese, Kathleen Wojciehowski (Missouri Institute for Community Health), Dr. Dorothy Cilenti (University of North Carolina), Dr. Jenine Harris, Dr. Amy Eyler, Dr. Beth Dodson, and Robert Fields (Washington University in St. Louis).
Funding: This study was supported by Robert Wood Johnson Foundation's grant no. 69964 (Public Health Services and Systems Research). This publication was also made possible by Grant Number 1P30DK092950 from the NIDDK and by the Washington University Institute of Clinical and Translational Sciences grant UL1TR000448 from the National Center for Advancing Translational Sciences (NCATS) of the National Institutes of Health (NIH), and its contents are solely the responsibility of the authors and do not necessarily represent the official views of the NIDDK or NIH.
Contributor Information
Rachel G. Tabak, Prevention Research Center in St. Louis, St. Louis, MO, USA; George Warren Brown School at Washington University in St. Louis, St. Louis, MO, USA.
Katie Duggan, Prevention Research Center in St. Louis, St. Louis, MO, USA.
Carson Smith, Prevention Research Center in St. Louis, St. Louis, MO, USA.
Kristelle Aisaka, Prevention Research Center in St. Louis, St. Louis, MO, USA.
Sarah Moreland-Russell, Brown School of Social Work, St. Louis, MO, USA; Brown School, Washington University in St. Louis, St. Louis, MO, USA.
Ross C. Brownson, Prevention Research Center in St. Louis, St. Louis, MO, USA; Brown School, Washington University in St. Louis, St. Louis, MO, USA; Division of Public Health Sciences and Alvin J. Siteman Cancer Center, Washington University School of Medicine, St. Louis, MO, USA.
REFERENCES
1. Centers for Disease Control and Prevention. Ten great public health achievements--United States, 1900–1999. MMWR Morb Mortal Wkly Rep. 1999 Apr 2;48(12):241–243.
2. The Guide to Community Preventive Services. http://www.thecommunityguide.org/. Accessed December 8, 2014.
3. Brownson RC, Fielding JE, Maylahn CM. Evidence-based public health: A fundamental concept for public health practice. Annu Rev Public Health. 2009 Apr 21;30:175–201. doi: 10.1146/annurev.publhealth.031308.100134.
4. Milio N. Evaluation of health promotion policies: tracking a moving target. WHO Reg Publ Eur Ser. 2001;(92):365–385.
5. Public Health Accreditation Board (PHAB). Public Health Accreditation Board Standards: An Overview. 2011. http://www.phaboard.org/wp-content/uploads/PHAB-Standards-Overview-version-1.0.pdf.
6. Allen P, Brownson RC, Duggan K, Stamatakis KA, Erwin PC. The makings of an evidence-based local health department: Identifying administrative and management practices. Frontiers in Public Health Services and Systems Research. 2012;1(2).
7. Brownson RC, Allen P, Duggan K, Stamatakis KA, Erwin PC. Fostering more-effective public health by identifying administrative evidence-based practices: a review of the literature. Am J Prev Med. 2012 Sep;43(3):309–319. doi: 10.1016/j.amepre.2012.06.006.
8. Jacobs JA, Clayton PF, Dove C, et al. A survey tool for measuring evidence-based decision making capacity in public health agencies. BMC Health Serv Res. 2012;12:57. doi: 10.1186/1472-6963-12-57.
9. Brownson RC, Ballew P, Brown KL, et al. The effect of disseminating evidence-based interventions that promote physical activity to health departments. Am J Public Health. 2007 Oct;97(10):1900–1907. doi: 10.2105/AJPH.2006.090399.
10. Brownson RC, Ballew P, Dieffenderfer B, et al. Evidence-based interventions to promote physical activity: what contributes to dissemination by state health departments. Am J Prev Med. 2007 Jul;33(1 Suppl):S66–S73. doi: 10.1016/j.amepre.2007.03.011.
11. Fielding JE. Where is the evidence? Annu Rev Public Health. 2001;22:v–vi.
12. Luke DA, Calhoun A, Robichaux CB, Elliot MB, Moreland-Russell S. The Program Sustainability Assessment Tool: A new instrument for public health programs. Prev Chronic Dis. 2014;11:130184. doi: 10.5888/pcd11.130184.
13. Schell SF, Luke DA, Schooley MW, et al. Public health program capacity for sustainability: a new framework. Implement Sci. 2013;8:15. doi: 10.1186/1748-5908-8-15.
14. Calhoun A, Mainor A, Moreland-Russell S, Maier RC, Brossart L, Luke DA. Using the Program Sustainability Assessment Tool to assess and plan for sustainability. Prev Chronic Dis. 2014;11:130185. doi: 10.5888/pcd11.130185.
15. Steckler A, Goodman RM, McLeroy KR, Davis S, Koch G. Measuring the diffusion of innovative health promotion programs. Am J Health Promot. 1992 Jan-Feb;6(3):214–224. doi: 10.4278/0890-1171-6.3.214.
16. Brink SG, Basen-Engquist KM, O'Hara-Tompkins NM, Parcel GS, Gottlieb NH, Lovato CY. Diffusion of an effective tobacco prevention program. Part I: Evaluation of the dissemination phase. Health Educ Res. 1995 Sep;10(3):283–295. doi: 10.1093/her/10.3.283.
17. Parcel GS, O'Hara-Tompkins NM, Harrist RB, et al. Diffusion of an effective tobacco prevention program. Part II: Evaluation of the adoption phase. Health Educ Res. 1995 Sep;10(3):297–307. doi: 10.1093/her/10.3.297.
18. Riley BL. Dissemination of heart health promotion in the Ontario Public Health System: 1989–1999. Health Educ Res. 2003 Feb;18(1):15–31. doi: 10.1093/her/18.1.15.
19. Erwin PC, Harris JK, Smith C, Leep CJ, Duggan K, Brownson RC. Evidence-based public health practice among program managers in local public health departments. J Public Health Manag Pract. 2013 Nov 18. doi: 10.1097/PHH.0000000000000027.
20. Brownson RC, Reis RS, Allen P, Fields R, Stamatakis KA, Erwin PC. Understanding administrative evidence-based practices: Findings from a survey of local health department leaders. Am J Prev Med. 2014;46(1):49–57. doi: 10.1016/j.amepre.2013.08.013.
21. Reis R, Duggan K, Allen P, Stamatakis K, Erwin P, Brownson R. Developing a tool to assess administrative evidence-based practices in local health departments. Frontiers in Public Health Services and Systems Research. 2014.
22. Alexander JA, Weiner BJ, Metzger ME, et al. Sustainability of collaborative capacity in community health partnerships. Med Care Res Rev. 2003 Dec;60(4 Suppl):130–160. doi: 10.1177/1077558703259069.
23. Allen P, Brownson RC, Duggan K, Stamatakis KA, Erwin PC. The makings of an evidence-based local health department: Identifying administrative and management practices. Frontiers in Public Health Services and Systems Research. 2012;1(2):2.
24. Newhouse RP, Dearholt S, Poe S, Pugh LC, White KM. Organizational change strategies for evidence-based practice. J Nurs Adm. 2007 Dec;37(12):552–557. doi: 10.1097/01.NNA.0000302384.91366.8f.
25. Aarons GA, Ehrhart MG, Farahnak LR, Sklar M. Aligning leadership across systems and organizations to develop a strategic climate for evidence-based practice implementation. Annu Rev Public Health. 2014;35:255–274. doi: 10.1146/annurev-publhealth-032013-182447.
26. Aarons GA. Transformational and transactional leadership: association with attitudes toward evidence-based practice. Psychiatr Serv. 2006 Aug;57(8):1162–1169. doi: 10.1176/appi.ps.57.8.1162.
27. Halm MA. "Inside looking in" or "inside looking out"? How leaders shape cultures equipped for evidence-based practice. Am J Crit Care. 2010 Jul;19(4):375–378. doi: 10.4037/ajcc2010627.
28. Lewis LK. Employee perspectives on implementation communication as predictors of perceptions of success and resistance. Western Journal of Communication. 2006;70(1):23–46.
29. Davis MV, Mahanna E, Joly B, et al. Creating quality improvement culture in public health agencies. Am J Public Health. 2014 Jan;104(1):e98–e104. doi: 10.2105/AJPH.2013.301413.
30. Wise CG, Alexander JA, Green LA, Cohen GR, Koster CR. Journey toward a patient-centered medical home: readiness for change in primary care practices. Milbank Q. 2011 Sep;89(3):399–424. doi: 10.1111/j.1468-0009.2011.00634.x.
31. Savaya R, Spiro S, Elran-Barak R. Sustainability of social programs: a comparative case study analysis. American Journal of Evaluation. 2008;29(4):478–493.
32. Studnicki J, Gipson LS, Berndt DJ, et al. Special healthcare taxing districts: association with population health status. Am J Prev Med. 2007 Feb;32(2):116–123. doi: 10.1016/j.amepre.2006.11.001.
33. Studnicki J, Fisher JW, Kamble S. Race-differentiated outcomes in multiple special healthcare taxing districts. Am J Prev Med. 2010 Mar;38(3):311–316. doi: 10.1016/j.amepre.2009.11.006.