Abstract
We examined variation in the use of evidence-based decision-making (EBDM) practices across local health departments (LHDs) in the United States and the extent to which this variation was predicted by resources, personnel, and governance. We analyzed data from the National Association of County and City Health Officials Profile of Local Health Departments, the Association of State and Territorial Health Officials State Health Departments Profile, and the US Census using 2-level multilevel regression models. We found more workforce predictors than resource predictors. Thus, although resources are related to LHDs’ use of EBDM practices, the way resources are used (e.g., the types and qualifications of personnel hired) may be more important.
In 2003, 15 years after The Future of Public Health was published, the Institute of Medicine noted that the United States was not meeting population health goals; specifically, large health disparities existed among socioeconomic groups, racial groups, and men and women, and the US governmental health system was in disarray.1 To address these problems, the Institute of Medicine recommended that public health system organizations, including state and local health departments (LHDs), adopt a population-level approach to improve the public’s health, make decisions, and take action based on evidence.1 In 2013, disparities still existed; the United States ranked last among 17 peer high-income countries in health outcomes, including life expectancy, infant mortality, adolescent pregnancy, drug-related mortality, and obesity.2
Adopting evidence-based approaches allows LHDs, the local “backbone of our public health system,”1(p27) to effectively use their limited resources to improve the health of the population. These population health approaches focus on lowering disease risk for the entire population and reducing inequities that affect disease patterns. They are a more effective, less costly means to change disease patterns than providing personal health care.3–8 Researchers have differed, however, in how they define evidence-based public health (EBPH).5,7,9,10 Our work relies on the specific definition given by Brownson et al.,7 and we refer to this process as evidence-based decision-making (EBDM). The key processes of EBDM are
making decisions using the best available scientific evidence, systematically using data and information systems, applying program-planning frameworks (that often have a foundation in behavioral science theory), engaging the community in assessment and decision making, conducting sound evaluation, and disseminating what is learned.7(p177)
Little information exists about the types of EBDM practices LHDs use and the frequency with which they use them, although many researchers and practitioners have written about barriers to and facilitators of EBDM.6,11–16 Increasing the extent to which LHDs practice EBDM requires first assessing the extent to which LHDs currently use EBDM practices and then identifying modifiable factors that predict their use. Two frameworks suggest factors that may be related to LHDs’ use of EBDM practices.17,18 Handler et al.17 argued that structural capacity (including information, organizational, physical, human, and fiscal resources) must be in place for the functions of the public health system to be achieved. Meyer et al.18 argued that the organizational capacity of the public health system includes fiscal resources, workforce and human resources, physical infrastructure, interorganizational relations, informational resources, system boundaries and size, governance and decision-making structure, and organizational culture.18 We used these frameworks to identify workforce, fiscal, and governance factors at the state and local level that may predict variation in the use of EBDM across LHDs.
Although investments in public health are associated with decreased mortality19,20 and improved performance across the 10 essential public health services,21 estimates have suggested that public health spending makes up only 3% of national health and medical care spending.22 Moreover, LHD funding continues to decrease. From 2009 to 2010, 44% of LHDs faced budget cuts, and 18% reduced services.23 Thus, even when LHDs are motivated to use EBDM practices, they may not have the financial resources to do so. Both time and money have been reported as barriers to EBDM use.5–7,11,12 We therefore hypothesized that funding for local public health would be associated positively with EBDM practices because of the effect of funding on organizational capacity and outcomes and that budget cuts would be negatively associated with EBDM practices.
Fewer than 1 in 5 LHD workers are trained in public health24; few LHD top executives have formal public health training25 or state-required professional credentialing.26 Previous public health services and systems research has found mixed effects of directors’ qualifications on LHD performance, essential public health services,27,28 the breadth of different LHD services provided,29 and reducing health disparities.30 Although medicine and nursing have had longer histories of evidence-based training than public health, evidence-based practice in these 2 disciplines tends to be clinical, rather than population based, in its focus.7,10,31 Therefore, we hypothesized that we would find a positive relationship between LHDs with public health–trained leaders and EBDM but no relationship between nurse-led or physician-led LHDs and EBDM.
Within public health, some professions have had more exposure to the processes involved in EBDM than others.5 For example, trained epidemiologists use surveillance data to identify community health problems and risk factors.32,33 Preparedness coordinators use syndromic surveillance for the early detection of outbreaks. Trained health educators are skilled in community assessment and the development, adaptation, implementation, and evaluation of evidence-based interventions.34 Not all LHDs have access to personnel with training in these evidence-based approaches.5,7,10,11,35,36 We hypothesized that LHDs that employed epidemiologists, health educators, and emergency preparedness staff would use more EBDM practices. Because many nutritionists work in clinical roles, we did not expect an association. We also hypothesized that other workforce factors, including the per capita workforce and staff attendance at health impact assessment training, would be positively associated with the number of EBDM practices used.
In terms of the local–state governance relationship, we hypothesized that LHDs in states with centralized public health systems (in which LHDs are under the authority of state government), compared with those in decentralized systems (in which LHDs are under the authority of local government),37 would use more EBDM practices because of the authority of the states to set and enforce local standards.38–40 Although the LHD performance research on the impact of boards of health is mixed,41 we hypothesized that a local board of health could influence the use of EBDM by pushing for the use of evidence to increase LHD effectiveness and efficiency42 or to respond to local needs through the use of epidemiology, community health assessment, and planning.42 We hypothesized that other local- and state-level contextual variables (e.g., jurisdiction size, percentage of the population living in poverty) would be important predictors of EBDM because of their influence on LHD performance. We treated these contextual variables as control variables because they are less modifiable.
METHODS
To test the hypotheses, we used cross-sectional data from the National Association of County and City Health Officials (NACCHO) 2010 Profile Study,35 the Association of State and Territorial Health Officials 2010 Profile Survey,43 and the 2010 Census of Population and Housing.44 In 2010, all LHDs (n = 2565) were invited to complete the core survey of the NACCHO study, and a scientifically derived sample of these LHDs (n = 625) was invited to complete an extra set of questions (module 2) that included items pertaining to EBDM. Researchers at the University of Kentucky and Georgia Southern University combined these data into a single harmonized data set.45 Our analyses used data from 516 LHDs that completed module 2 (82.6% of those LHDs invited to complete module 2, representing 20.1% of all LHDs in the country). The LHDs were from 47 states and Washington, DC (Hawaii and Rhode Island do not have LHDs, and no LHDs in Nevada completed module 2). The total number of LHDs per state that completed module 2 ranged from 1 to 25 (median = 10 LHDs per state).
Outcome Variable
We operationalized EBPH as EBDM for population health, and we created an index consisting of items that captured different aspects of EBDM.7 We did not use a scale because EBDM is not a latent construct causing LHDs to engage in specific activities.46 Instead, EBDM is defined by the specific activities in which LHDs engage. We selected items for this index on the basis of the literature and feedback from an expert advisory panel regarding what constitutes EBPH practice. The expert advisory panel (n = 14) consisted of EBPH researchers, state and local public health officials, and representatives of national public health organizations (NACCHO, Association of State and Territorial Health Officials, Public Health Foundation, National Network of Public Health Institutes, Centers for Disease Control and Prevention staff for the Task Force on Community Preventive Services). Interviews elicited from the experts (1) how they define evidence, (2) how they define EBPH, and (3) what LHDs do that would suggest they are using EBPH practices.
On the basis of the Brownson et al.7 definition and the responses of the expert panel members, we selected items from the NACCHO Profile Survey47 that indicated whether the LHD was engaged in EBDM. We established content validity by having 10 experts rate the items on a 5-point scale ranging from “not important/does not reflect evidence-based decision making” to “critical to evidence-based decision making.” All items were rated as “important,” “very important,” or “critical” to EBDM. Scores for the resulting EBDM index could range from 0 to 7, reflecting the number of different EBDM practices reported by an LHD (see Table 1 for scoring).
TABLE 1—
Index for Evidence-Based Decision-Making by Local Health Departments: United States, 2010
| Individual Item and Response | Item Score | % of LHDs Performing This Activity |
| No. of epidemiology or surveillance activities performed directly by LHDa | ||
| 0 types | 0 | 5.1 |
| 1–3 types | 1 | 43.0 |
| 4–7 types | 2 | 51.9 |
| Completed a community health assessment within the past 5 y | ||
| No or missing | 0 | 39.1 |
| Yes | 1 | 60.9 |
| Participated in a community health improvement plan within the past 5 y | ||
| No or missing | 0 | 50.5 |
| Yes | 1 | 49.5 |
| LHD has applied research findings in its own organization within the past 12 mo | ||
| No or missing | 0 | 74.3 |
| Yes | 1 | 25.7 |
| LHD has already used the County Health Rankings to increase public, policymaker, or media awareness of the multiple factors that influence health | ||
| No or missing | 0 | 64.4 |
| Yes | 1 | 35.6 |
| LHD staff used the Community Guide to support or enhance decision-making in the LHD | ||
| Missing, unknown, or LHD staff have not used the Community Guide | 0 | 77.7 |
| LHD staff in a few, many, or all relevant programmatic areas have used the Community Guide | 1 | 22.3 |
Note. Community Guide = Guide to Community Preventive Services4; LHD = local health department.
aEpidemiology or surveillance activity types performed were communicable or infectious disease, injury, environmental health, maternal and child health, syndromic surveillance, chronic disease, and behavioral risk factors.
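The scoring scheme in Table 1 can be sketched as a small function. This is an illustrative sketch only: the dictionary keys below are hypothetical stand-ins, not the actual NACCHO Profile Survey variable names.

```python
# Illustrative sketch of the 0-7 EBDM index in Table 1.
# Key names are hypothetical, not actual NACCHO survey variables.

def ebdm_score(lhd):
    """Return the 0-7 EBDM index score for one LHD's survey responses."""
    score = 0

    # Epidemiology/surveillance activities performed directly:
    # 0 types -> 0 points, 1-3 types -> 1 point, 4-7 types -> 2 points
    n_types = lhd.get("surveillance_types", 0)
    if 1 <= n_types <= 3:
        score += 1
    elif n_types >= 4:
        score += 2

    # Five yes/no items worth 1 point each; "no" and missing both score 0,
    # matching the "No or missing" rows in Table 1.
    binary_items = [
        "community_health_assessment_5y",
        "community_health_improvement_plan_5y",
        "applied_research_findings_12mo",
        "used_county_health_rankings",
        "used_community_guide",
    ]
    score += sum(1 for item in binary_items if lhd.get(item) is True)
    return score

example = {
    "surveillance_types": 5,                  # -> 2 points
    "community_health_assessment_5y": True,   # -> 1 point
    "community_health_improvement_plan_5y": True,
    "applied_research_findings_12mo": False,
    "used_county_health_rankings": None,      # missing scores 0
    "used_community_guide": True,
}
print(ebdm_score(example))  # prints 5
```

Treating missing responses as 0 reflects the index design in Table 1, where “No or missing” rows receive no points.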
Predictor Variables
Contextual variables.
We included predictor and control contextual variables in all models. At the LHD level, we included the presence of a local board of health (predictor), the 2010 population size (in 100 000s), and the percentages of the population that (1) were living in poverty, (2) were aged younger than 18 years, (3) did not identify as White, and (4) had either a college or graduate degree. At the state level, we included state–local governance relationship (predictor), 2010 population size (in millions), and percentage of the state population living in poverty.
Resources.
Predictor variables based on the resources available to the LHD included the total expenditures per 1000 population for the most recently completed fiscal year and revenues per 1000 population from (1) local sources (i.e., city, township, or town and county sources), (2) Medicaid, and (3) Medicare. Because large percentages of data on expenditure and revenue variables were missing, we divided these variables into tertiles and included a fourth “not reported” category to keep as many LHDs in the analysis as possible. We also included an indicator of whether the LHD’s budget for the 2010 fiscal year had decreased from the previous fiscal year. Continuous LHD-level predictor variables included the percentage of total employees laid off or lost via attrition between July 1, 2009, and June 30, 2010, and the total number of clinical services (of 26) performed directly by the LHD (e.g., screening or treatment of different diseases, provision of maternal and child health services). The only state-level resource predictor variable was a continuous measure of expenditures per person.
Workforce.
All workforce predictor variables were measured at the LHD level. Several measures assessed the characteristics of the LHD’s top executive: previous experience as a top LHD executive, tenure in the position (< 5 years, 5–19 years [reference], or ≥ 20 years), and type of degree (whether the top executive held a public health degree, a medical degree, or a nursing degree). Additional measures assessed LHDs’ employment of staff assigned to specific roles or positions that often include tasks key to EBDM (≥ 1 epidemiologists, health educators, nutritionists, or emergency preparedness staff) and attendance by anyone in the LHD at health impact assessment training in the past year. The total number of current LHD employees (full time, part time, and contractual) per 1000 jurisdiction population was a continuous predictor.
Statistical Analyses
We used 2-level multilevel regression models to account for the hierarchical structure of the data. We modeled information about the 516 LHDs at level 1 and information about the 47 states at level 2. Each model included a random intercept for each state. All analyses incorporated statistical weights to account for sampling of LHDs that completed the module 2 supplemental survey and for variation in nonresponse as a function of LHD size (LHDs that served larger populations were more likely to respond to the profile survey than LHDs that served smaller populations). We conducted analyses in 2013 using Mplus 7.0 software (Muthén & Muthén, Los Angeles, CA).48 Because the EBDM index is a count variable, we modeled it using a Poisson distribution.
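Two quantities reported from models like these can be illustrated with a short sketch: the intraclass correlation (ICC) for a two-level random-intercept model, and the conversion of a Poisson coefficient (on the log scale) into an incidence rate ratio (IRR) with a 95% confidence interval. The variance components and coefficient below are illustrative values, not output from the article's fitted models.

```python
import math

def icc(var_between, var_within):
    """ICC for a two-level random-intercept model: the share of total
    outcome variance attributable to the level-2 (state) units."""
    return var_between / (var_between + var_within)

def irr(coef, se, z=1.96):
    """Convert a Poisson regression coefficient (log scale) and its
    standard error into an IRR with a 95% confidence interval."""
    return (math.exp(coef),
            math.exp(coef - z * se),
            math.exp(coef + z * se))

# Illustrative values only (not from the article's models):
print(f"{icc(0.3, 0.7):.2f}")               # prints 0.30
point, lo, hi = irr(0.131, 0.053)
print(f"{point:.2f} ({lo:.2f}, {hi:.2f})")  # prints 1.14 (1.03, 1.26)
```

An IRR of 1.14, for example, means that an LHD in that group is expected to use 14% more EBDM practices than the reference group, holding the other predictors constant.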
RESULTS
As Table 1 shows, we found considerable variation in the percentage of LHDs that used each of the practices making up the EBDM index. Overall, 2.3% of LHDs reported using no EBDM practices; 31% used 1 to 2; 55.8% used 3 to 5; and 10.9% used 6 to 7.
Before adding any predictor variables, we tested an empty model with EBDM as the outcome to calculate the intraclass correlation, which was 0.30, indicating that 30% of the variance in EBDM was between states. In the resources model, LHDs in the middle tertile for expenditures used significantly more EBDM practices than LHDs in the bottom tertile (Table 2). LHDs in the top tertile also tended to use more EBDM practices than LHDs in the bottom tertile, but the difference was not significant (P = .06). LHDs that had experienced a budget cut from the previous year used more EBDM practices than those whose budget had stayed the same, had increased, or was unknown, and LHDs that provided a larger number of clinical services used more EBDM practices. None of the other predictor variables were statistically significant.
TABLE 2—
Descriptive Statistics and Incidence Rate Ratios From Multilevel Poisson Regression Model Testing Associations Between Resources and Evidence-Based Decision-Making: National Association of County and City Health Officials Profile Study, Association of State and Territorial Health Officials Profile Survey, and Census of Population and Housing; United States; 2010
| Predictor variables | Mean (SD) or % | IRR (95% CI) |
| LHD level (level 1)a | ||
| Population (in 100 000s) | 1.99* (3.71) | 1.02* (1.01, 1.03) |
| % in poverty | 14.59 (5.89) | 1.00 (0.98, 1.01) |
| % < 18 y | 23.44* (2.94) | 0.99* (0.97, 1.00) |
| % non-White | 22.70 (19.49) | 1.00 (1.00, 1.00) |
| % with a college education | 30.19 (10.57) | 1.00 (1.00, 1.00) |
| Existence of a local board of health | ||
| Does not have a local board of health | 26.9 | 1.00 (Ref) |
| Has a local board of health | 73.1 | 0.93 (0.82, 1.05) |
| Employees laid off or lost via attrition | 3.65 (6.89) | 1.00 (1.00, 1.00) |
| Revenues from local sources/1000 capita | ||
| Bottom third | 29.8 | 1.00 (Ref) |
| Middle third | 27.7 | 0.88 (0.77, 1.00) |
| Top third | 25.0 | 0.90 (0.77, 1.06) |
| Not reported | 17.7 | 0.80 (0.63, 1.02) |
| Revenues from Medicaid/1000 capita | ||
| Bottom third | 23.3 | 1.00 (Ref) |
| Middle third | 32.0 | 1.01 (0.90, 1.13) |
| Top third | 27.3 | 0.89 (0.75, 1.06) |
| Revenues from Medicare/1000 capita | ||
| Bottom third | 23.9 | 1.00 (Ref) |
| Middle third | 29.0 | 1.06 (0.94, 1.20) |
| Top third | 28.1 | 0.93 (0.82, 1.07) |
| LHD expenditures/1000 capita | ||
| Bottom third | 26.4 | 1.00 (Ref) |
| Middle third | 29.8* | 1.14* (1.03, 1.27) |
| Top third | 29.1 | 1.16 (1.00, 1.35) |
| Not reported | 14.7 | 0.99 (0.78, 1.27) |
| Change in budget from previous y | ||
| Approx. the same, greater, or unknown | 56.8 | 1.00 (Ref) |
| < previous y | 43.2* | 1.10* (1.01, 1.20) |
| Total clinical services | 11.08* (5.17) | 1.03* (1.02, 1.05) |
| State level (level 2)b | ||
| Population, millions | 6.33 (6.94) | 0.99 (0.99, 1.00) |
| % in poverty | 14.87 (3.19) | 0.99 (0.96, 1.02) |
| State governance | ||
| Decentralized | 55.3 | 1.00 (Ref) |
| Centralized | 25.5* | 0.74* (0.55, 0.98) |
| Mixed | 10.6 | 0.90 (0.70, 1.16) |
| Shared | 8.5 | 0.89 (0.72, 1.10) |
| Expenditures/person | 87.62 (42.82) | 1.00 (1.00, 1.00) |
Note. CI = confidence interval; IRR = incidence rate ratio; LHD = local health department. All values are adjusted for other variables in the model.
aSample size for LHDs was n = 444.
bSample size for states was n = 45.
*P < .05.
In the workforce model, we found 6 significant predictors of using EBDM practices (Table 3). LHDs that had a top executive with a public health degree; LHDs that employed epidemiologists, health educators, and emergency preparedness staff; LHDs in which the staff had participated in a training session for health impact assessments in the past year; and LHDs with more employees all used significantly more EBDM practices than other LHDs (Table 3).
TABLE 3—
Descriptive Statistics and Incidence Rate Ratios From Multilevel Poisson Regression Model Testing Associations Between Workforce Factors and Evidence-Based Decision-Making: National Association of County and City Health Officials Profile Study, Association of State and Territorial Health Officials Profile Survey, and Census of Population and Housing; United States; 2010
| Predictor variables | Mean (SD) or % | IRR (95% CI) |
| LHD level (level 1)a | ||
| Population (in 100 000s) | 1.99* (3.71) | 1.02* (1.01, 1.03) |
| % in poverty | 14.59 (5.89) | 0.99 (0.98, 1.00) |
| % < 18 y | 23.44* (2.94) | 0.98* (0.97, 0.99) |
| % non-White | 22.70 (19.49) | 1.00 (1.00, 1.00) |
| % with a college education | 30.19 (10.57) | 1.00 (0.99, 1.00) |
| Existence of a local board of health | ||
| Does not have a local board of health | 26.9 | 1.00 (Ref) |
| Has a local board of health | 73.1 | 0.93 (0.83, 1.05) |
| Top executive’s job experience | ||
| Previous or unknown past experience | 22.9 | 1.00 (Ref) |
| First position as top executive at LHD | 77.1 | 0.99 (0.88, 1.10) |
| Top executive’s tenure, y | ||
| < 5 | 40.9 | 1.01 (0.93, 1.09) |
| 5–19 | 49.4 | 1.00 (Ref) |
| ≥ 20 | 9.7 | 1.00 (0.89, 1.12) |
| Top executive’s education | ||
| No public health degree | 73.6 | 1.00 (Ref) |
| Public health degree (MPH, DrPH, PhD) | 26.4* | 1.17* (1.05, 1.29) |
| No medical degree | 83.3 | 1.00 (Ref) |
| Medical degree (MD or DO) | 16.7 | 0.94 (0.83, 1.06) |
| No nursing degree | 72.5 | 1.00 (Ref) |
| Nursing degree (BSN, MSN, RN) | 27.5 | 1.02 (0.91, 1.14) |
| LHD employees/1000 capita | 0.80* (0.74) | 1.08* (1.02, 1.15) |
| Expertise in the department | ||
| No epidemiologist | 65.7 | 1.00 (Ref) |
| Epidemiologist | 34.3* | 1.13* (1.03, 1.25) |
| No health educator | 42.6 | 1.00 (Ref) |
| Health educator | 57.4* | 1.18* (1.05, 1.34) |
| No nutritionist | 43.2 | 1.00 (Ref) |
| Nutritionist | 56.8 | 1.10 (0.99, 1.22) |
| No emergency preparedness staff | 15.5 | 1.00 (Ref) |
| Emergency preparedness staff | 84.5* | 1.30* (1.07, 1.58) |
| Staff participation in health impact assessment training | ||
| No or unknown participation | 86.4 | 1.00 (Ref) |
| Staff participated in HIA training | 13.6* | 1.12* (1.00, 1.25) |
| State level (level 2)b | ||
| Population (in millions) | 6.33 (6.94) | 1.00 (0.99, 1.01) |
| % in poverty | 14.87 (3.19) | 0.99 (0.97, 1.01) |
| State governance | ||
| Decentralized | 55.3 | 1.00 (Ref) |
| Centralized | 25.5* | 0.74* (0.58, 0.94) |
| Mixed | 10.6 | 0.87 (0.70, 1.09) |
| Shared | 8.5 | 0.92 (0.78, 1.09) |
Note. CI = confidence interval; HIA = health impact assessment; IRR = incidence rate ratio; LHD = local health department. All values are adjusted for other variables in the model.
aSample size for LHDs was n = 476.
bSample size for states was n = 47.
*P < .05.
Across models, LHDs located in states that had a centralized public health governance structure used significantly fewer EBDM practices than those in states with decentralized governance. In addition, LHDs that served larger populations and LHDs for which a smaller percentage of the population was younger than 18 years used more EBDM practices.
DISCUSSION
The study findings suggest several directions for future research and practice. First, the low use of certain strategies is striking, particularly the application of research findings (only 25.7% of LHDs) and the use of the Guide to Community Preventive Services4 (only 22.3%). A majority of LHDs (60.9%) reported completing a community health assessment, but only 49.5% then participated in community health improvement planning. Because all these strategies are considered key to EBDM for population health improvement,4,5,7,8,10,15,49–52 researchers should investigate why some strategies see little use and what is needed to enhance their uptake. Also, continuing to explore the contribution of specific EBDM practices to LHD performance and community health will be important.
Second, leadership of the LHD, as measured by the executive’s academic degree, was significantly related to the number of EBDM practices used. Our finding that LHDs led by executives with public health degrees used more EBDM practices than other LHDs contrasts with other studies that have found a negative association between having a director with a public health degree and LHD performance.27,28 The difference in findings may reflect different outcome measures (performance on essential public health services vs EBDM) or a change in practice since 2004. In this study, having a medical or nursing degree did not predict the use of EBDM practices. Bekemeier and Jones29 found that nurse-led LHDs used a greater breadth of population-focused primary prevention activities than did non–nurse-led LHDs, but they were less likely to conduct community health assessment and planning (2 of the practices in our EBDM measure). Their study also did not investigate the unique impact of the director having a public health degree. In our research, only 25% of the nurse executives held a bachelor’s degree or higher; those without BSN degrees would be less likely to have formal training in community-based prevention.53 Directors with public health training may be more engaged in providing the needed leadership for EBDM—particularly around population health issues—an important driver of EBDM identified in several studies.16 Although director tenure has been associated with performance on the essential public health services, it was not associated with the number of EBDM practices here. This finding may reflect the diversity of education and training among LHD executives or the relatively recent focus on EBDM in public health practice and training programs.
Third, beyond characteristics of the executive, employing other key LHD staff was also important. LHDs that employed people in the roles of epidemiologist, health educator, or preparedness coordinator used more EBDM practices. However, in this study, only 34% of LHDs employed epidemiologists, mirroring a national shortage of epidemiologists32,33,54,55; 57% and 84% of LHDs, respectively, employed health educators and preparedness coordinators. Together, these positions are assigned activities that correspond to critical EBDM practices in our index as well as to core health education and epidemiology competencies.34,56 It is not clear, though, whether the people in these roles held degrees and qualifications that matched the job descriptions. For example, the epidemiologists employed by LHDs may have completed a BS or MPH in epidemiology, a short course, on-the-job training, or none of these. Even if degreed health educators fill health education positions, some training programs focus on individual-level change, in keeping with a patient education role, and others focus on changing programs, systems, and policies. In future research, it will be important to explore the qualifications, skills, and importance of personnel in these key positions as well as the most effective level and mix of staffing for public health service delivery.57 One reason that staff member attendance at health impact assessment training may have been positively associated with the number of EBDM practices used is that these LHDs may have a culture focused on innovation and the use of evidence, including moving toward the use of health impact assessments. LHDs with a larger per capita workforce may have been able to hire personnel who had skills tailored to these roles or had more time for EBDM. Notably, there were more workforce than resource predictors, suggesting that how resources are used is critical.
Fourth, the results from the resources model suggest that spending more money (i.e., being in the top or middle tertile of expenditures) may confer several advantages to LHDs, including higher salaries for more qualified or experienced personnel, advanced technological resources and personnel with data analysis expertise, and less conflict between population and clinical services. LHDs with fewer resources may concentrate on necessary and expected clinical services rather than engage in EBDM processes that could lead to programs that would be too costly to support. The results also indicate that losing funding may stimulate LHDs to use more EBDM practices with the remaining resources. Losing employees through layoffs or attrition was not associated with the number of EBDM practices used, possibly because some LHDs that lost personnel became more strategic in how they used their resources. In the resources model, EBDM was positively associated with the breadth of clinical services, suggesting that there is not necessarily a trade-off between the two. Because the NACCHO data do not allow tracking expenditures to specific public health activities, the possibilities we suggest should be viewed with caution. The large percentage of missing resource data may have obscured some relationships. Further study is needed to examine specifically how resources are used to improve the public’s health and which services may be reduced or eliminated when resources become more restricted. To do this, it will be necessary to institute financial accounting systems that support overall financial and program management (i.e., that clarify linkages among resources, processes, outputs, and outcomes).58 These systems will enable the development of cost measures that allow researchers and practitioners to accurately track the time, personnel, and money needed for specific LHD activities and programs.
Fifth, although Brownson et al.40 found that LHDs with centralized state governance scored higher on administrative evidence-based practices than those with local governance, we found that LHDs in states with decentralized control of LHDs used more EBDM practices than those in centralized states. These differing results, however, may reflect differences in measurement because the Brownson measure encompassed a wider breadth of practices and the perceived value of and access to EBDM skills, but not their use.11,13 This measure also included some practices that may have easily resulted from statewide training, such as quality improvement initiatives. Taken together, studies suggest that some types of LHD performance may be handled better in decentralized states,55 some in mixed or shared states,21,59 and some in centralized states.38,39 In terms of EBDM, it is possible that local autonomy could outweigh the effects of the professionalism and standards brought by state governance, by stimulating innovation and experimentation or by fostering attention to local needs.42 Mays and Smith60 found that LHDs in decentralized states spent an average of 25% more per capita than those in centralized states. Together, these findings suggest that LHDs may invest more in public health when they have the autonomy and local support to do so, whether through spending measures or through the use of EBDM practices that focus on the local jurisdiction. In contrast with this evidence that decentralization is important for EBDM, the presence of local oversight in the form of a local board of health was not associated with more EBDM practices, even when we tested for different types of board responsibilities (e.g., hiring or firing the director). It is likely that even boards with policymaking authority vary in their oversight and activism, and this difference could mask any associations.41
Sixth, our findings regarding jurisdiction size were consistent with those of other studies that have also found jurisdiction size to be 1 of the strongest predictors of LHD performance.21,38,39,61–66 Jurisdiction size may be related to access to in-kind resources such as the availability of expertise from universities and hospitals. Hyde and Shortell67 noted that the inefficiencies resulting from small LHDs have resulted in calls for consolidation since the mid-1940s, but research to suggest the effectiveness of this strategy is still limited.
An implication from this study is that LHDs should be encouraged to be strategic in their hiring and training processes and consider hiring directors with a public health degree as well as epidemiologists and health educators. With expected retirements in the governmental public health system, there may be opportunities for LHDs to make these strategic hires.25,68 Changes in hiring policy and requirements could be promoted through communication with governing officials (e.g., state officials, legislators, board of health members, county commissioners), addressing the public health system’s focus on population health and the importance of requiring hires to hold appropriate degrees and training for EBDM. It is also important to ensure that public health schools and programs provide a pipeline of workers for employment in governmental health and that funding enables LHDs to recruit these workers.68
Much of the current public health workforce does not have public health training,24 and many LHDs do not have the occupational roles that we found to be important for EBDM; however, other resources exist for LHDs. EBDM training courses and tools have been developed and disseminated,4,6,11,15,36,69 and new research is emerging on the use of evidence-based administrative practices.40,70 LHD personnel could be trained to use county health rankings to inform their planning efforts, as well as deidentified patient record or service data from local hospitals or agencies. State centers for health statistics may also provide local or regional data. Training efforts could promote the use of the community guide and the application of research to organizational and community practices. Together, these efforts may increase awareness of EBPH and EBDM as important areas of competency needed for LHD performance. Taking advantage of these resources is necessary but not sufficient in the translation and dissemination process. Harris’s71 research on networks among LHDs suggests that a state-by-state strategy might be most effective in disseminating and implementing evidence-based strategies. Dissemination would best be facilitated through LHDs with similar staffing and funding levels, and implementation might be enhanced through partnerships with the LHDs in the largest jurisdictions. Practice-based research networks are another effective mechanism for facilitating research translation in practice.72 Finally, policymakers could influence the use of EBDM for population health by requiring it as a condition of funding, an important lever identified in at least 1 EBDM study.16
LHDs in smaller jurisdictions face more challenges. To address them, LHDs could develop resource-sharing agreements through which specialized personnel (e.g., epidemiologists) serve several departments, or they could obtain access to hospital epidemiologists. Institutional or individual affiliations with academic institutions may also enhance LHDs’ access to EBDM skills, resources, and personnel.73
Limitations
This study had a number of limitations. The cross-sectional data did not allow us to determine the temporal order of relationships or test causal relationships between our independent and dependent variables. For example, it is possible that LHDs using more EBDM practices see the need for and hire the specific personnel identified in our study rather than the reverse. The secondary data set limited the extent to which we could explore every predictor of public health system performance. For example, we could not assess factors such as access to academic institutions through appointments or other affiliations,74 legally mandated services, or categorical funding, which have been identified as affecting LHDs’ ability to use EBDM.41,73 The data were self-reported and not independently verified. In some cases, particularly in large LHDs, the person who completed the survey may not have known about all LHD activities. Thus, the extent to which LHDs practice EBDM might have been underestimated. Finally, to test linkages between EBDM and practice outcomes, more detail is needed on LHDs’ use and adaptation of evidence-based population health interventions, such as those promoted by the community guide.51
Our outcome measure also had limitations. The survey questions we used addressed only the front end of EBDM because the data set did not include items on the use of interventions, evaluation, and dissemination.35 Thus, our measure could capture only capacity for making decisions about evidence-based interventions. Moreover, no questions addressed the frequency or quality of LHDs’ activities because the NACCHO profile survey asked LHDs only to report whether they performed the activities making up the outcome variable. Thus, LHDs that reported similar EBDM use may have varied in their actual use of these practices. A more informative measure would assess the quantity (or extent) of a practice as well as its quality. Also, our measure of EBDM weighted all activities except epidemiology and surveillance equally; however, it is possible that some activities are more important indicators of EBDM, and the EBDM practices most needed in any LHD may vary considerably depending on available resources and the context of the local community. Finally, this study did not determine whether some combinations of EBDM practices were more important than others.
Conclusions
Despite its limitations, our research contributes to the literature on how to assess LHDs’ use of EBDM practices. Using data from routinely available data sets, we developed an EBDM measure focused on population health practices and identified the extent to which a nationally representative sample of LHDs use EBDM practices. Moreover, we identified modifiable factors at state and local levels that promote the use of these practices. Thus, this research also provides a basis for future studies on the linkages among EBDM, EBPH interventions, and improvement of the public’s health. Finally, our measure of EBDM requires further validation, as well as testing of its association with the use of evidence-based programs and various population health outcomes. Nevertheless, this study has important practical and policy implications because it clearly highlights the underutilization of EBDM practices in LHDs.
Acknowledgments
This research was funded by the Robert Wood Johnson Foundation (grant 69686), and we thank the foundation for its support. We also thank the members of the project's Advisory Panel for providing input to our study, and the National Coordinating Center for Public Health Services and Systems Research at the University of Kentucky College of Public Health, the National Association of County and City Health Officials, and the Association of State and Territorial Health Officials for providing the data used in this study and advice about the data. We appreciate the very helpful advice of 3 anonymous reviewers.
Human Participant Protection
This research was approved as exempt by the institutional review board of the University of North Carolina at Greensboro.
References
- 1. Committee on Assuring the Health of the Public in the 21st Century. The Future of the Public’s Health in the 21st Century. Washington, DC: Institute of Medicine; 2003.
- 2. Panel on Understanding Cross-National Health Differences Among High-Income Countries. US Health in International Perspective: Shorter Lives, Poorer Health. Washington, DC: National Academies Press; 2013.
- 3. Rose G. The Strategy of Preventive Medicine. Oxford, England: Oxford University Press; 1992.
- 4. Briss PA, Brownson RC, Fielding JE, Zaza S. Developing and using the Guide to Community Preventive Services: lessons learned about evidence-based public health. Annu Rev Public Health. 2004;25:281–302. doi: 10.1146/annurev.publhealth.25.050503.153933.
- 5. Brownson RC, Baker EA, Leet TL, Gillespie KN, True W. Evidence-Based Public Health. Oxford, England: Oxford University Press; 2011.
- 6. Brownson RC, Ballew P, Dieffenderfer B, et al. Evidence-based interventions to promote physical activity: what contributes to dissemination by state health departments. Am J Prev Med. 2007;33(1 suppl):S66–S73. doi: 10.1016/j.amepre.2007.03.011.
- 7. Brownson RC, Fielding JE, Maylahn CM. Evidence-based public health: a fundamental concept for public health practice. Annu Rev Public Health. 2009;30:175–201. doi: 10.1146/annurev.publhealth.031308.100134.
- 8. Centers for Disease Control and Prevention. The Guide to Community Preventive Services: what works to improve health. Available at: http://www.thecommunityguide.org/CG-in-Action/index.html. Accessed June 11, 2011.
- 9. Kohatsu ND, Robinson JG, Turner JC. Evidence-based public health: an evolving concept. Am J Prev Med. 2004;27(5):417–421. doi: 10.1016/j.amepre.2004.07.019.
- 10. Brownson RC, Gurney JG, Land GH. Evidence-based decision making in public health. J Public Health Manag Pract. 1999;5(5):86–97. doi: 10.1097/00124784-199909000-00012.
- 11. Baker EA, Brownson RC, Dreisinger M, McIntosh LD, Karamehic-Muratovic A. Examining the role of training in evidence-based public health: a qualitative study. Health Promot Pract. 2009;10(3):342–348. doi: 10.1177/1524839909336649.
- 12. Dodson EA, Baker EA, Brownson RC. Use of evidence-based interventions in state health departments: a qualitative assessment of barriers and solutions. J Public Health Manag Pract. 2010;16(6):E9–E15. doi: 10.1097/PHH.0b013e3181d1f1e2.
- 13. Jacobs JA, Clayton PF, Dove C, et al. A survey tool for measuring evidence-based decision making capacity in public health agencies. BMC Health Serv Res. 2012;12:57. doi: 10.1186/1472-6963-12-57.
- 14. Tabak RG, Khoong EC, Chambers D, Brownson RC. Models in dissemination and implementation research: useful tools in public health services and systems research. Frontiers Public Health Serv Syst Res. 2013;2(1): Article 8.
- 15. Jacobs JA, Jones E, Gabella BA, Spring B, Brownson RC. Tools for implementing an evidence-based approach in public health practice. Prev Chronic Dis. 2012;9:E116. doi: 10.5888/pcd9.110324.
- 16. Sosnowy CD, Weiss LJ, Maylahn CM, Pirani SJ, Katagiri NJ. Factors affecting evidence-based decision making in local health departments. Am J Prev Med. 2013;45(6):763–768. doi: 10.1016/j.amepre.2013.08.004.
- 17. Handler A, Issel M, Turnock B. A conceptual framework to measure performance of the public health system. Am J Public Health. 2001;91(8):1235–1239. doi: 10.2105/ajph.91.8.1235.
- 18. Meyer A-M, Davis M, Mays GP. Defining organizational capacity for public health services and systems research. J Public Health Manag Pract. 2012;18(6):535–544. doi: 10.1097/PHH.0b013e31825ce928.
- 19. Mays GP, Smith SA. Evidence links increase in public health spending to declines in preventable deaths. Health Aff (Millwood). 2011;30(8):1585–1593. doi: 10.1377/hlthaff.2011.0196.
- 20. Erwin PC, Mays GP, Riley WJ. Resources that may matter: the impact of local health department expenditures on health status. Public Health Rep. 2012;127(1):89–95. doi: 10.1177/003335491212700110.
- 21. Mays GP, McHugh MC, Shim K, et al. Institutional and economic determinants of public health system performance. Am J Public Health. 2006;96(3):523–531. doi: 10.2105/AJPH.2005.064253.
- 22. Trust for America’s Health. A Healthier America 2013: Strategies to Move From Sick Care to Health Care in the Next Four Years. Washington, DC: Trust for America’s Health; 2013.
- 23. National Association of County and City Health Officials. Local Health Department Job Losses and Program Cuts: 2008–2010. Washington, DC: National Association of County and City Health Officials; 2011.
- 24. Tilson H, Gebbie KM. The public health workforce. Annu Rev Public Health. 2004;25:341–356. doi: 10.1146/annurev.publhealth.25.102802.124357.
- 25. Gerzoff RB, Richards TB. The education of local health department top executives. J Public Health Manag Pract. 1997;3(4):50–56. doi: 10.1097/00124784-199707000-00010.
- 26. Turnock BJ. Competency-based credentialing of public health administrators in Illinois. J Public Health Manag Pract. 2001;7(4):74–82. doi: 10.1097/00124784-200107040-00012.
- 27. Bhandari MW, Scutchfield FD, Charnigo R, Riddell MC, Mays GP. New data, same story? Revisiting studies on the relationship of local public health systems characteristics to public health performance. J Public Health Manag Pract. 2010;16(2):110–117. doi: 10.1097/PHH.0b013e3181c6b525.
- 28. Scutchfield FD, Knight EA, Kelly AV, Bhandari MW, Vasilescu IP. Local public health agency capacity and its relationship to public health system performance. J Public Health Manag Pract. 2004;10(3):204–215. doi: 10.1097/00124784-200405000-00004.
- 29. Bekemeier B, Jones M. Relationships between local public health agency functions and agency leadership and staffing: a look at nurses. J Public Health Manag Pract. 2010;16(2):E8–E16. doi: 10.1097/PHH.0b013e3181bdebfe.
- 30. Bekemeier B, Grembowski D, Yang Y, Herting JR. Leadership matters: local health department clinician leaders and their relationship to decreasing health disparities. J Public Health Manag Pract. 2012;18(2):E1–E10. doi: 10.1097/PHH.0b013e318242d4fc.
- 31. Satterfield JM, Spring B, Brownson RC, et al. Toward a transdisciplinary model of evidence-based practice. Milbank Q. 2009;87(2):368–390. doi: 10.1111/j.1468-0009.2009.00561.x.
- 32. Moehrle C. Who conducts epidemiology activities in local health departments? Public Health Rep. 2008;123(suppl 1):6–7. doi: 10.1177/00333549081230S103.
- 33. Council of State and Territorial Epidemiologists. 2009 National Assessment of Epidemiology Capacity: Findings and Recommendations. Atlanta, GA: Council of State and Territorial Epidemiologists; 2009.
- 34. National Commission for Health Education Credentialing. A Competency-Based Framework for Health Education Specialists—2010. Whitehall, PA: National Commission for Health Education Credentialing; 2010.
- 35. National Association of County and City Health Officials. 2010 National Profile of Local Health Departments. Washington, DC: National Association of County and City Health Officials; 2011.
- 36. Dreisinger M, Leet TL, Baker EA, Gillespie KN, Haas B, Brownson RC. Improving the public health workforce: evaluation of a training course to enhance evidence-based decision making. J Public Health Manag Pract. 2008;14(2):138–143. doi: 10.1097/01.PHH.0000311891.73078.50.
- 37. Meit M, Sellers K, Kronstadt J, et al. Governance typology: a consensus classification of state-local health department relationships. J Public Health Manag Pract. 2012;18(6):520–528. doi: 10.1097/PHH.0b013e31825ce90b.
- 38. Suen J, Christenson GM, Cooper A, Taylor M. Analysis of the current status of public health practice in local health departments. Am J Prev Med. 1995;11(6 suppl):51–54.
- 39. Richards TB, Rogers JJ, Christenson GM, Miller CA, Taylor MS, Cooper AD. Evaluating local public health performance at a community level on a statewide basis. J Public Health Manag Pract. 1995;1(4):70–83.
- 40. Brownson RC, Reis RS, Allen P, et al. Understanding administrative evidence-based practices: findings from a survey of local health department leaders. Am J Prev Med. 2014;46(1):49–57. doi: 10.1016/j.amepre.2013.08.013.
- 41. Bekemeier B, Chen AL-T, Kawakyu N, Yang Y. Local public health resource allocation: limited choices and strategic decisions. Am J Prev Med. 2013;45(6):769–775. doi: 10.1016/j.amepre.2013.08.009.
- 42. Hays SP, Poes M, Toth J, Mulhall P, Remmert D, O’Rourke T. Public Health Governance and the Relationship With Population Health Outcomes. Champaign: University of Illinois at Urbana-Champaign; 2014.
- 43. Association of State and Territorial Health Officials. 2010 ASTHO State and Territorial Public Health Survey. Arlington, VA: Association of State and Territorial Health Officials; 2011.
- 44. US Census Bureau. Census coverage measurement. Available at: http://www.census.gov/2010census. Accessed March 4, 2012.
- 45. Jones JA. Harmonizing data: the synthesis of knowledge. Paper presented at: Keeneland Conference; April 13, 2011; Lexington, KY.
- 46. DeVellis RF. Scale Development: Theory and Applications. 3rd ed. Thousand Oaks, CA: Sage; 2012.
- 47. National Association of County and City Health Officials. 2010 National Profile of Local Health Departments Study Questionnaire. Washington, DC: National Association of County and City Health Officials; 2011.
- 48. Muthén LK, Muthén BO. Mplus User’s Guide. 7th ed. Los Angeles, CA: Muthén & Muthén; 1998–2012.
- 49. Cilenti D, Brownson RC, Umble K, Erwin PC, Summers R. Information-seeking behaviors and other factors contributing to successful implementation of evidence-based practices in local health departments. J Public Health Manag Pract. 2012;18(6):571–576. doi: 10.1097/PHH.0b013e31825ce8e2.
- 50. Myers BA. Getting people to want sliced bread—an update on dissemination of the Guide to Community Preventive Services. J Public Health Manag Pract. 2003;9(6):545–551. doi: 10.1097/00124784-200311000-00017.
- 51. Wilson KM, Fridinger F. Focusing on public health: a different look at translating research to practice. J Womens Health (Larchmt). 2008;17(2):173–179. doi: 10.1089/jwh.2007.0699.
- 52. Green LW, Ottoson JM, Garcia C, Hiatt RA. Diffusion theory and knowledge dissemination, utilization, and integration in public health. Annu Rev Public Health. 2009;30:151–174. doi: 10.1146/annurev.publhealth.031308.100049.
- 53. Baker EL, Potter MA, Jones DL, et al. The public health infrastructure and our nation’s health. Annu Rev Public Health. 2005;26:303–318. doi: 10.1146/annurev.publhealth.26.021304.144647.
- 54. Council of State and Territorial Epidemiologists. 2010 Epidemiology Enumeration Assessment: Findings and Recommendations. Atlanta, GA: Council of State and Territorial Epidemiologists; 2010.
- 55. Shah GH, Laymon B, Elligers JJ, Leep CJ, Bhutta CB. Community health assessment by local health departments: presence of epidemiologist, governance, and federal and state funds are critical. Frontiers Public Health Serv Syst Res. 2013;2(5): Article 1.
- 56. Centers for Disease Control and Prevention, Council of State and Territorial Epidemiologists. Competencies for Applied Epidemiologists in Governmental Public Health Agencies (AECs). Atlanta, GA: Centers for Disease Control and Prevention; 2008.
- 57. Mays GP, Smith SA, Ingram RC, Racster LJ, Lamberth CD, Lovely ES. Public health delivery systems: evidence, uncertainty, and emerging research needs. Am J Prev Med. 2009;36(3):256–265. doi: 10.1016/j.amepre.2008.11.008.
- 58. Institute of Medicine. For the Public’s Health: Investing in a Healthier Future. Washington, DC: National Academies Press; 2012.
- 59. Mays GP, Halverson PK, Baker EL, Stevens R, Vann JJ. Availability and perceived effectiveness of public health activities in the nation’s most populous communities. Am J Public Health. 2004;94(6):1019–1026. doi: 10.2105/ajph.94.6.1019.
- 60. Mays GP, Smith SA. Geographic variation in public health spending: correlates and consequences. Health Serv Res. 2009;44(5 pt 2):1796–1817. doi: 10.1111/j.1475-6773.2009.01014.x.
- 61. Turnock BJ, Handler A, Hall W, Potsic S, Nalluri R, Vaughn EH. Local public health department effectiveness in addressing the core functions of public health. Public Health Rep. 1994;109(5):653–658.
- 62. Turnock BJ, Handler AS, Miller CA. Core function-related local public health practice effectiveness. J Public Health Manag Pract. 1998;4(5):26–32. doi: 10.1097/00124784-199809000-00005.
- 63. Freund CG, Liu Z. Local health department capacity and performance in New Jersey. J Public Health Manag Pract. 2000;6(5):42–50. doi: 10.1097/00124784-200006050-00007.
- 64. Santerre RE. Jurisdiction size and local public health spending. Health Serv Res. 2009;44(6):2148–2166. doi: 10.1111/j.1475-6773.2009.01006.x.
- 65. Zahner SJ, Vandermause R. Local health department performance: compliance with state statutes and rules. J Public Health Manag Pract. 2003;9(1):25–34. doi: 10.1097/00124784-200301000-00004.
- 66. Kennedy VC. A study of local public health system performance in Texas. J Public Health Manag Pract. 2003;9(3):183–187. doi: 10.1097/00124784-200305000-00002.
- 67. Hyde JK, Shortell SM. The structure and organization of local and state public health agencies in the US: a systematic review. Am J Prev Med. 2012;42(5 suppl 1):S29–S41. doi: 10.1016/j.amepre.2012.01.021.
- 68. Hilliard TM, Bouton ML. Public health workforce research in review: a 25-year retrospective. Am J Prev Med. 2012;42(5 suppl 1):S17–S28. doi: 10.1016/j.amepre.2012.01.031.
- 69. Kumanyika S, Brownson RC, Cheadle A. The L.E.A.D. framework: using tools from evidence-based public health to address evidence needs for obesity prevention. Prev Chronic Dis. 2012;9:E125. doi: 10.5888/pcd9.120157.
- 70. Brownson RC, Allen P, Duggan K, Stamatakis KA, Erwin PC. Fostering more effective public health by identifying administrative evidence-based practices: a review of the literature. Am J Prev Med. 2012;43(3):309–319. doi: 10.1016/j.amepre.2012.06.006.
- 71. Harris JK. Communication ties across the national network of local health departments. Am J Prev Med. 2013;44(3):247–253. doi: 10.1016/j.amepre.2012.10.028.
- 72. Mays GP, Hogg RA, Castellanos-Crus D, Hoover A, Fowler LC. Public health research implementation and translation: evidence from practice-based research networks. Am J Prev Med. 2013;45(6):752–762. doi: 10.1016/j.amepre.2013.08.011.
- 73. Chudgar RB, Shirey LA, Sznycer-Taub M, Read R, Pearson RL, Erwin PC. Local health department and academic institution linkages for community health assessment and improvement processes: a national overview and local case study. J Public Health Manag Pract. 2014;20(3):349–355. doi: 10.1097/PHH.0b013e31829dc26b.
- 74. Miller AL, Krusky AM, Franzen S, Cochran S, Zimmerman MA. Partnering to translate evidence-based programs to community settings: bridging the gap between research and practice. Health Promot Pract. 2012;13(4):559–566. doi: 10.1177/1524839912438749.
