Abstract
Objectives. We examined the perceived importance of scientific resources for decision-making among local health department (LHD) practitioners in the United States.
Methods. We used data from LHD practitioners (n = 849). Respondents ranked important decision-making resources, methods for learning about public health research, and academic journal use. We calculated descriptive statistics and used logistic regression to measure associations of individual and LHD characteristics with importance of scientific resources.
Results. Systematic reviews of scientific literature (24.7%) were most frequently ranked as important among scientific resources, followed by scientific reports (15.9%), general literature review articles (6.5%), and 1 or a few scientific studies (4.8%). Graduate-level education (adjusted odds ratios [AORs] = 1.7–3.5), larger LHD size (AORs = 2.0–3.5), and leadership support (AOR = 1.6; 95% confidence interval = 1.1, 2.3) were associated with a higher ranking of importance of scientific resources.
Conclusions. Graduate training, larger LHD size, and leadership that supports a culture of evidence-based decision-making may increase the likelihood of practitioners viewing scientific resources as important. Targeting communication channels that practitioners view as important can also guide research dissemination strategies.
Local health department (LHD) practitioners develop evidence-based programs and policies by means of a complex decision-making process. Often referred to as evidence-based decision-making (EBDM), this process involves making decisions on the basis of the best available scientific evidence, applying program planning and quality improvement frameworks, engaging the community in assessment and decision-making, and conducting sound evaluation.1 Efforts aimed at boosting the performance of public health agencies should be informed by a clear understanding of decision-making patterns in LHDs, particularly decisions based on scientific information.
Documenting and implementing improvements in EBDM is increasingly important as a foundation of public health services and systems research2–4 and as part of accreditation programs such as the Public Health Accreditation Board Standards.5 The Public Health Accreditation Board has directly included EBDM in its set of accreditation standards by recommending and incentivizing LHDs to “contribute to and apply the evidence base of public health.”5(p3) Understanding how to enhance EBDM among public health directors, managers of programs or divisions, program coordinators, and other public health staff has become relevant for LHDs seeking to attain or maintain accreditation status. Tools such as the Community Guide–Public Health Accreditation Board Crosswalk6 aim to increase the use of evidence-based approaches and support accreditation. The tool’s purpose is to show how implementing the evidence-based interventions in the Guide to Community Preventive Services (referred to hereinafter as the Community Guide) can fulfill the various requirements of the Public Health Accreditation Board.7
The use of scientific resources (e.g., systematic reviews, scientific reports, scientific articles) and capacity-building approaches may enhance EBDM in LHDs.8–10 A growing number of accessible evidence-based resources are available, including the Community Guide.11 However, few studies have documented the perceived importance and reported use of scientific resources in the LHD decision-making process, and significant barriers to their use exist. Enhanced access to resources, training, and leadership that fosters a supportive culture have been identified as key components of overcoming barriers to the use of evidence-based resources.1,8,12 Additionally, to improve the translation of science to practice, researchers need new knowledge about the most appropriate methods for communicating their research findings.13
METHODS
We collected data on the perceived importance and reported use of resources among LHD practitioners from responses to a nationwide survey of LHDs. The sampling frame, questionnaire development and testing, and data collection steps have previously been described.14,15 Briefly, we drew a stratified random sample of 1067 US LHDs from the database of 2565 LHDs maintained by the National Association of County and City Health Officials, with stratification by size of jurisdictional population.
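As an illustration of this sampling step, the sketch below draws a stratified random sample with strata defined by jurisdiction population size. It is a minimal sketch only: the file name, column names, and proportional allocation are assumptions for illustration, not the study’s actual sampling procedure.

```python
import pandas as pd

# Hypothetical input: one row per LHD with a jurisdiction_population column
# (file and column names are assumed for illustration).
lhds = pd.read_csv("naccho_lhd_database.csv")  # ~2565 LHDs in the sampling frame

# Define population-size strata matching the categories reported in Table 1.
bins = [0, 25_000, 50_000, 100_000, 500_000, float("inf")]
labels = ["<25 000", "25 000-49 999", "50 000-99 999", "100 000-499 999", ">=500 000"]
lhds["stratum"] = pd.cut(lhds["jurisdiction_population"], bins=bins, labels=labels)

# Draw the same sampling fraction within each stratum (proportional allocation;
# the study may have allocated differently across strata).
target_n, frame_n = 1067, len(lhds)
sample = (
    lhds.groupby("stratum", observed=True)
        .sample(frac=target_n / frame_n, random_state=2012)
        .reset_index(drop=True)
)
print(sample["stratum"].value_counts())
```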
Data collection occurred over a 4-month period (October 2012–February 2013) by means of an online survey (Qualtrics Survey Research Suite, Qualtrics LLC, Provo, UT) delivered nationally to valid e-mail accounts of 967 LHD leaders; 517 LHD leaders responded (53.5% response rate). At the conclusion of the survey, LHD leaders were prompted to identify program managers in their own LHD. The online survey was then sent directly to e-mail accounts of 522 program managers identified by their LHD leaders; 332 program managers responded (63.6% response rate), representing 196 individual LHDs. We derived individual characteristics from the survey responses, with additional data on LHDs drawn from linked National Association of County and City Health Officials Profile Survey data.16 The separate data sets for LHD leaders and LHD program managers were merged to form a pooled sample. Overall, the online survey was delivered to 1489 unique e-mail accounts, and 849 LHD practitioners responded (57% response rate).
The survey instrument was based in part on a public health systems logic model and related frameworks17–20 and previous evidence-based public health research with local and state health departments, where validated and standardized questions existed.21–26 The questionnaire consisted of 6 sections (biographical data, administrative practices, diffusion attributes, barriers to evidence-based public health, importance and use of resources, competencies in evidence-based public health), with a total of 66 questions. Seven questions were related to the importance and reported use of resources. Three of these questions used a ranking method to measure the perceived importance of resources and reported use of resources (i.e., academic journals most often read). Respondents were asked the following questions:
“When you make decisions about such things as program planning, policy development, or funding, how important to you are the following?”
“What methods allow you to learn about the current findings in public health research?” and
“Which journals do you most often read to stay up to date on current findings in public health?”
For each question, participants were provided with a list of predetermined response options and asked to rank the top 3, with that ranked first being the most important or most often read (for the journal question). Using yes–no and multiple-choice questions, the remaining 4 survey items measured reported use of the Community Guide (2 questions), reported barriers to journal use, and reported facilitators of journal access.
For this analysis, responses ranked in any of the top 3 were given equal weight: any response option given a first, second, or third ranking was grouped into a single dichotomous variable. These dichotomous variables allowed us to compare LHD practitioners who perceived a resource as important (or reported using it) with those who did not.
We calculated descriptive statistics describing the characteristics of the LHD practitioners and the departments in which they worked (n = 849), as well as for all variables measuring the perceived importance and reported use of resources. For the multivariate analysis, we created a new dependent variable, perceived importance of scientific resources, coded as a single dichotomous variable and defined as perceiving any of the following as a top resource: systematic reviews of the body of scientific literature, scientific reports, general literature reviews, or 1 or a few scientific articles.
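To make the coding concrete, the minimal sketch below constructs these dichotomous indicators and the composite dependent variable from hypothetical top-3 rankings; the column names and toy data are assumptions for illustration, not the actual survey variables.

```python
import pandas as pd

# Toy rankings: a value of 1-3 means the respondent ranked that resource first,
# second, or third; NaN means the resource was not ranked in the top 3.
ranks = pd.DataFrame({
    "systematic_reviews_rank": [1, None, 3, None],
    "scientific_reports_rank": [None, 2, None, None],
    "general_lit_reviews_rank": [None, None, None, None],
    "few_scientific_articles_rank": [3, None, None, None],
})

# Equal weighting of the top 3: any first, second, or third ranking counts as
# "important" (1); otherwise 0.
important = ranks.notna().astype(int)

# Composite dependent variable: any scientific resource ranked in the top 3.
important["scientific_resources_important"] = (important.max(axis=1) > 0).astype(int)
print(important["scientific_resources_important"].tolist())  # [1, 1, 1, 0]
```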
Using logistic regression, we calculated adjusted odds ratios (AORs) and 95% confidence intervals (CIs). Independent variables listed in Table 1 that were significant at a P value of less than .2 in unadjusted analyses were retained in the final model to calculate AORs. The AORs represent the odds of perceiving a resource to be important.
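A minimal sketch of this modeling step appears below, using statsmodels under assumed column names (e.g., scientific_resources_important, highest_degree); the P < .2 screening is applied variable by variable, which approximates but may not exactly match the study’s procedure.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Assumed analytic data set: one row per practitioner, with the dichotomous
# outcome and categorical predictors (names are illustrative).
df = pd.read_csv("lhd_practitioner_survey.csv")

outcome = "scientific_resources_important"
candidates = ["job_position", "age_group", "highest_degree", "gender",
              "population_of_jurisdiction", "governance", "census_region",
              "leader_encourages_ebdm"]

# Screen: keep predictors with any unadjusted coefficient significant at P < .2.
retained = []
for var in candidates:
    crude = smf.logit(f"{outcome} ~ C({var})", data=df).fit(disp=0)
    if (crude.pvalues.drop("Intercept") < 0.2).any():
        retained.append(var)

# Final adjusted model with all retained predictors.
formula = f"{outcome} ~ " + " + ".join(f"C({v})" for v in retained)
adjusted = smf.logit(formula, data=df).fit(disp=0)

# Exponentiate coefficients to obtain AORs with 95% confidence intervals.
aor = np.exp(pd.concat([adjusted.params, adjusted.conf_int()], axis=1))
aor.columns = ["AOR", "CI 2.5%", "CI 97.5%"]
print(aor.round(2))
```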
TABLE 1—Importance of Scientific Resourcesa

Characteristic | No. Ranked as Importantb | %c | OR (95% CI) | AORd (95% CI)
--- | --- | --- | --- | ---
Individual | ||||
Job position | ||||
Top executivee (Ref) | 183 | 49.9 | 1.0 | 1.0 |
Manager or other stafff | 118 | 37.0 | 0.6 (0.4, 0.8) | 0.9 (0.6, 1.2) |
Administrator, deputy, or assistant director | 53 | 33.5 | 0.5 (0.3, 0.7) | 0.9 (0.5, 1.4) |
Age, y | ||||
20–39 (Ref) | 50 | 38.8 | 1.0 | 1.0 |
40–49 | 68 | 36.4 | 0.9 (0.6, 1.4) | 1.0 (0.6, 1.6) |
50–59 | 153 | 43.8 | 1.2 (0.8, 1.9) | 0.8 (0.5, 1.3) |
≥ 60 | 83 | 46.4 | 1.4 (0.9, 2.2) | 1.1 (0.7, 1.7) |
Highest degree | ||||
≤ bachelor’s (Ref) | 57 | 25.1 | 1.0 | 1.0 |
Nursing | 54 | 32.0 | 1.4 (0.9, 2.2) | 1.3 (0.8, 2.2) |
Other master’s degree | 101 | 49.3 | 2.9 (1.9, 4.3) | 2.0 (1.3, 3.1) |
MPH or MSPH | 65 | 47.4 | 2.7 (1.7, 4.2) | 1.7 (1.1, 2.8) |
Doctoral degree | 76 | 69.7 | 6.9 (4.1, 11.4) | 3.5 (1.9, 6.3) |
Gender | ||||
Female (Ref) | 211 | 39.7 | 1.0 | 1.0 |
Male | 143 | 45.7 | 1.30 (0.96, 1.70) | 1.0 (0.7, 1.4) |
Health department | ||||
Population of jurisdiction | ||||
< 25 000 (Ref) | 39 | 21.7 | 1.0 | 1.0 |
25 000–49 999 | 70 | 37.6 | 2.2 (1.4, 3.5) | 2.0 (1.2, 3.4) |
50 000–99 999 | 65 | 43.3 | 2.8 (1.7, 4.5) | 2.2 (1.3, 3.7) |
100 000–499 999 | 107 | 51.0 | 3.8 (2.4, 5.9) | 3.0 (1.8, 5.0) |
≥ 500 000 | 72 | 59.5 | 5.3 (3.2, 8.8) | 3.5 (1.9, 6.4) |
Governance structure | ||||
Locally governed (Ref) | 284 | 42.3 | 1.0 | —g |
State governed | 29 | 39.7 | 0.9 (0.6, 1.5) | —g |
Shared governance | 40 | 39.6 | 0.9 (0.6, 1.4) | —g |
Census region | ||||
Northeast (Ref) | 55 | 44.7 | 1.0 | 1.0 |
Midwest | 118 | 37.0 | 0.7 (0.5, 1.1) | 0.8 (0.5, 1.3) |
South | 107 | 40.5 | 0.8 (0.5, 1.3) | 0.60 (0.40, 1.04) |
West | 73 | 51.4 | 1.3 (0.8, 2.1) | 1.1 (0.7, 2.0) |
Leadership structures and practicesh | ||||
Ability to lead efforts in EBDM | 212 | 46.4 | 1.4 (1.1, 1.8) | 1.0 (0.7, 1.5) |
Encourages EBDM use | 232 | 49.7 | 1.9 (1.5, 2.6) | 1.6 (1.1, 2.3) |
Fosters participation of staff in decision-making | 261 | 43.3 | 1.1 (0.8, 1.5) | —g |
Important to hire people with a public health degree | 145 | 52.2 | 1.8 (1.3, 2.4) | 1.4 (0.9, 2.0) |
Important to hire people with public health experience | 186 | 46.9 | 1.40 (1.05, 1.80) | 0.9 (0.6, 1.2) |
Note. AOR = adjusted odds ratio; CI = confidence interval; EBDM = evidence-based decision-making; MPH = Master of Public Health; MSPH = Master of Science in Public Health; OR = odds ratio.
a Perceived importance of scientific resources was defined as perceiving any of the following as a top resource: systematic reviews of the body of scientific literature, scientific reports, general literature reviews, or 1 or a few scientific articles.
b Perceived importance was dichotomized on the basis of whether a respondent ranked the resource in any of their top 3 (first, second, or third most important).
c Row percentages are shown.
d Variables that were significant at the P < .2 level in unadjusted analyses were retained in the final model to calculate AORs. The ORs represent the odds of perceiving a resource to be important.
e Includes top executives, health directors, health officers, commissioners, or equivalent in office of the director.
f Includes managers of a division or program, program coordinators, technical expert positions, or other staff.
g Dash indicates that the variable was not included in the final adjusted model because it was not significant at the P < .2 level in unadjusted analyses.
h 7-point Likert-scale response option; frequencies shown are for respondents who strongly agreed or agreed.
RESULTS
In the sample of 849 LHD practitioners, the largest group comprised top executives (43.5%), followed by managers or other staff (37.8%) and administrators, deputies, or assistant directors (18.7%; Table 2). Most respondents were female (62.9%); the most common age group was 50 to 59 years (41.4%), and the most common educational attainment was a bachelor’s degree or less (26.8%). Respondents averaged 18.5 years (SD = 13.0) of public health experience and 8.2 years (SD = 7.1) in their current position. The largest share of the sample worked in an LHD located in the Midwest (37.6%). Most LHDs (79.4%) were locally governed, 11.9% had shared governance, and 8.6% had a state governance structure.
TABLE 2—

Characteristic | No. | % or Mean ±SD
--- | --- | ---
Individual | ||
Job position | ||
Top executivea | 367 | 43.5 |
Manager or other staffb | 319 | 37.8 |
Administrator, deputy, or assistant director | 158 | 18.7 |
Age, y | ||
20–39 | 129 | 15.3 |
40–49 | 187 | 22.2 |
50–59 | 349 | 41.4 |
≥ 60 | 179 | 21.2 |
Highest degree | ||
≤ bachelor’s degree | 227 | 26.8 |
Nursing | 169 | 20.0 |
Other master’s degree | 205 | 24.2 |
MPH or MSPH | 137 | 16.2 |
Doctoral degree | 109 | 12.9 |
Gender | ||
Female | 531 | 62.9 |
Male | 313 | 37.1 |
Length worked in current position, y | 843 | 8.2 ±7.1 |
Length worked in public health, y | 844 | 18.5 ±13.0 |
Health department | ||
Population of jurisdiction | ||
< 25 000 | 180 | 21.3 |
25 000–49 999 | 186 | 22.0 |
50 000–99 999 | 150 | 17.7 |
100 000–499 999 | 210 | 24.8 |
≥ 500 000 | 121 | 14.3 |
Governance structure | ||
Locally governed | 672 | 79.4 |
State governed | 73 | 8.6 |
Shared governance | 101 | 11.9 |
Census region | ||
Northeast | 123 | 14.5 |
Midwest | 319 | 37.6 |
South | 264 | 31.1 |
West | 142 | 16.7 |
Note. LHD = local health department; MPH = Master of Public Health; MSPH = Master of Science in Public Health.
a Includes top executives, health directors, health officers, commissioners, or equivalent in office of the director.
b Includes managers of a division or program, program coordinators, technical expert positions, or other staff.
Funding guidance from a legislative authority or federal funding source was most often reported as 1 of the most important resources when making decisions about program planning, policy development, or funding (54.5%; Table 3). Half of the sample (49.8%) ranked guidance from the state health agency as 1 of the most important resources, followed by perspectives or priorities of agency leadership (38.9%), success stories and lessons learned from peers (38.4%), and health planning tools (37.2%). The scientific resources most frequently ranked as important were systematic reviews of scientific literature (24.7%), scientific reports (15.9%), general literature review articles (6.5%), and 1 or a few scientific studies (4.8%).
TABLE 3—

Variable | No. (%)a
--- | ---
Perceived importance of resources when making decisions about programs, policy, or funding | |
Funding guidance (legislative authority or federal funding source) | 463 (54.5) |
Guidance from the state health agency | 423 (49.8) |
Perspectives or priorities of agency leadershipb | 330 (38.9) |
Success stories and lessons learned from peers | 326 (38.4) |
Health planning tools (e.g., MAPP or Healthy People 2020) | 316 (37.2) |
Systematic reviews of the body of scientific literature (Community Guide) | 210 (24.7) |
Scientific reports (e.g., IOM reports, surgeon general reports) | 135 (15.9) |
General literature review articles | 55 (6.5) |
1 or a few scientific studies | 41 (4.8) |
Other | 37 (4.4) |
Reports to funders | 21 (2.5) |
Perceived importance of resources when seeking to learn about current findings in public health research | |
Seminars or workshops (phone, webinars, or in-person) | 447 (52.7) |
Professional associations | 410 (48.3) |
E-mail alerts | 288 (33.9) |
Academic journals | 279 (32.9) |
Academic conferences | 187 (22.0) |
Newsletters | 178 (21.0) |
Policy briefs | 143 (16.8) |
Other conferences | 138 (16.3) |
Press releases | 106 (12.5) |
Face-to-face meetings with stakeholders | 91 (10.7) |
Targeted mailings | 45 (5.3) |
Other | 34 (4.0) |
Social media (Facebook, Twitter) | 20 (2.4) |
Media interviews | 6 (0.7) |
CD-ROMs | 5 (0.6) |
Journals most often read to stay up to date on current public health findings | |
Morbidity and Mortality Weekly Report | 194 (22.9) |
American Journal of Public Health | 179 (21.1) |
Public Health Reports | 87 (10.2) |
Journal of Public Health Management and Practice | 82 (9.7) |
Emerging Infectious Diseases | 57 (6.7) |
New England Journal of Medicine | 44 (5.2) |
Other | 43 (5.1) |
Journal of the American Medical Association | 39 (4.6) |
American Journal of Preventive Medicine | 26 (3.1) |
Health Affairs | 13 (1.5) |
Preventing Chronic Disease | 13 (1.5) |
Annual Review of Public Health | 5 (0.6) |
Preventive Medicine | 5 (0.6) |
BMC Public Health | 2 (0.2) |
Frontiers in Public Health Services and Systems Research | 1 (0.1) |
Implementation Science | 0 (0.0) |
Note. IOM = Institute of Medicine; LHD = local health department; MAPP = Mobilizing for Action through Planning and Partnerships.
a Because responses were equally weighted and each respondent could rank up to 3 items in each of the 3 domains, percentages within a domain can total up to 300%. For example, if every respondent ranked the same 3 items within a domain (i.e., complete agreement among respondents), each of those items would be 100%, totaling 300%.
b The percentage of managers and other staff who ranked “perspectives or priorities of agency leadership” in their top 3 (44%) was slightly higher than that of top executives (37%) and administrators, deputy, or assistant directors (35%). This does not affect the relative ranking for managers and other staff, the group for which this variable is likely to be the most meaningful.
Seminars or workshops were most frequently reported as 1 of the most important ways to learn about current findings in public health research (52.7%; Table 3). Almost half (48.3%) rated professional associations as 1 of the most important methods, followed by e-mail alerts (33.9%), academic journals (32.9%), and academic conferences (22.0%). Of the sample, 21.0% perceived newsletters as an important resource, followed by policy briefs (16.8%), other conferences (16.3%), and press releases (12.5%). Social media were perceived as important by 2.4% of the sample. Among the 813 respondents who provided data about use of the Community Guide, more than one third (36.9%) were not familiar with this resource, and 21.8% did not personally use it in practice.
Respondents were asked to rank their top 3 most-often-read journals from a predetermined list of 15 public health or health journals. On average, respondents selected fewer than 1 journal from the list of 15. Morbidity and Mortality Weekly Report (22.9%), American Journal of Public Health (21.1%), Public Health Reports (10.2%), and the Journal of Public Health Management and Practice (9.7%) were the most frequently read journals in the overall sample (Table 3).
Respondents who ranked academic journals as a top resource for learning about public health research were asked to indicate how their agency gained access to journals. Among these respondents (n = 279), 68.1% reported using an agency subscription, followed by access through a state health agency (22.9%) and access through academic partners (12.5%). Respondents who did not rank academic journals as a top resource (n = 570) were asked to indicate reasons for not using journals. Among these respondents, almost half (47.4%) reported that subscriptions were too expensive, followed by lack of access to journals (32.4%). An additional 30.2% indicated other barriers to journal use, including not enough time or competing priorities, information that was not practical, and journals that were used but were not a top 3 resource.
Several characteristics of respondents and LHDs were associated with perceived importance of scientific resources in logistic regression models. After adjustment, several variables remained significantly associated (P < .05) with perceived importance of scientific resources (Table 1). For highest degree, other master’s degree (AOR = 2.0; 95% CI = 1.3, 3.1), MPH or MSPH degree (AOR = 1.7; 95% CI = 1.1, 2.8), and doctoral degree (AOR = 3.5; 95% CI = 1.9, 6.3) were all associated with perceived importance of scientific resources. A jurisdiction population of 25 000 or larger (AORs ranging from 2.0 to 3.5) was also associated with perceived importance of scientific resources. Lastly, leadership that encouraged EBDM was associated with perceived importance of scientific resources (AOR = 1.6; 95% CI = 1.1, 2.3).
DISCUSSION
The findings reveal that a relatively low percentage of practitioners viewed systematic reviews of the scientific literature as an important resource in decision-making. Similarly, a relatively low percentage of practitioners were familiar with the Community Guide or reported using it. These findings are generally consistent with the 2013 National Profile of Local Health Departments, which showed that 38% of LHDs do not use the Community Guide.27 Many LHD practitioners may not be using scientific resources to address public health problems locally. However, using the best available scientific evidence is a cornerstone of EBDM and increases the likelihood of programmatic and policy effectiveness, with implications for public health outcomes.
Characteristics of LHD practitioners and the departments in which they work may influence the use of scientific resources for EBDM, which is a requirement for LHDs progressing toward or maintaining Public Health Accreditation Board accreditation.28 Improving the use of scientific resources also aligns with the goals of LHDs seeking to boost quality improvement efforts or to broadly improve performance. Most important, applying the best evidence in practice is likely to improve performance toward achieving benchmarks for community health improvement.
Funding guidance and state health departments are also important contributors to decision-making in LHDs, suggesting that the LHD workforce, although largely locally governed, is likely to be influenced by foundations and by state and federal partners. Though perhaps not surprising, these results suggest that decision-making practices may be difficult to modify without support from authoritative sources. To improve performance in evidence-based public health, funding streams and state health departments should provide incentives that support these practices across the LHD system.
Overall, this study suggests considerable disagreement among LHD practitioners in the range of resources viewed as important, both for decision-making and for learning about public health research. Some of this disagreement likely reflects the diversity of the LHD workforce and the structure of the LHD system. Scientific resources, in particular, are more likely to be viewed as important in decision-making among specific segments of the LHD workforce. Identifying and addressing workforce characteristics, such as educational attainment, may help inform future studies examining potential strategies for improving organizational performance and EBDM in the LHD system. Moreover, practitioners in LHDs serving larger populations were more likely to view scientific resources as important, suggesting that new and creative strategies are needed to reach smaller, more rural LHDs.
The implications of these findings for LHDs serving smaller populations should be understood in the context of rural public health. LHDs in rural areas often confront the challenges of limited health care access.29 Consequently, they may prioritize delivery of health care services over the population-level services that are the focus of resources such as the Community Guide. These rural LHDs, often led by nurses, may also lack training and education in how to develop and administer population-level services.29
Certain leadership structures and practices have previously been identified as positively associated with performance measures.30 These administrative evidence-based practices consist of 5 major domains, including leadership. The current findings suggest that leadership encouragement may shape decision-making practices: although providing training and access to scientific resources is an important first step in the EBDM process, strong leadership may be necessary to nudge LHD practitioners to use evidence-based resources. LaPelle et al.8 found that strong leadership supporting the use of evidence-based resources can foster a culture change, which could be especially important for health departments in which EBDM is not a common organizational practice.
Most LHD practitioners rely on seminars, workshops, and professional associations to learn about public health research, suggesting that linkages to scientific evidence do not often occur through traditional academic sources (e.g., journals, professional conferences). Additional efforts are needed to better understand how targeted strategies (e.g., knowledge brokering31) might contribute to more effective translation of research into local public health practice. The research community should evaluate other media for communicating scientific evidence that practitioners more widely prefer. Active dissemination is 1 approach researchers may adopt to spread evidence-based information and research through determined channels using planned strategies.32 Researchers can facilitate the use of scientific resources by practicing active dissemination, which may improve EBDM among LHD practitioners. However, studies have suggested that many researchers spend little time on dissemination and lack the infrastructure necessary to support this activity.33 Funding incentives (e.g., from the Centers for Disease Control and Prevention or the National Institutes of Health) may also act as catalysts for supporting dissemination efforts among researchers. To guide the dissemination process, researchers (and the broader research community, including funders and professional associations) should tailor efforts to specific audiences34 through defined communication messages and channels.35–37 These results offer insight into the communication channels that LHD practitioners view as important for finding research, which can inform active dissemination strategies for future evaluation.
Limitations
A few limitations are worth noting. The scope and definition of EBDM used in these analyses do not capture the concept in its entirety: EBDM involves integrating the best available scientific evidence into public health practice, but it also includes community engagement, application of program planning and quality improvement frameworks, and evaluation. The data were self-reported and cross-sectional, limiting inferences. LHD practitioners may have over- or underestimated the perceived importance and reported use of resources. If a respondent did not rank a resource in the top 3, it did not necessarily mean that the resource was unimportant or unused; however, because the practical aim of the study was to capture the most important and relevant resources in the decision-making process, we viewed the top 3 as a reasonable cutoff. Finally, nonresponse bias is another potential limitation, given the relatively low response rate (57%). Moreover, small LHDs and LHDs located in the Northeast census region were less likely than other groups to respond; thus, additional caution is warranted when applying the study findings to these groups.
In analyzing characteristics of LHDs, multiple potential sources of clustering may introduce unknown degrees of dependency (e.g., multiple responses per LHD; overlapping structures of programs within LHDs, states, or regions). We were unable to account for these complex structures in our analyses. In a separate analysis, we restricted the sample to 1 response per LHD, based on the initial sample of leaders (n = 517), to examine results unaffected by multiple responses within LHDs. The point estimates and confidence intervals were highly similar to those from the full data set.
Conclusions
This study suggests that graduate training and leadership practices may enhance the perceived importance of scientific resources among LHD practitioners. Funding guidance is also an important driver of decision-making and needs to be carefully constructed to reflect the principles of EBDM. Additionally, this study has implications for translating research into local public health practice. Only a few academic journals are widely read among LHD practitioners, and most individuals rely heavily on seminars, workshops, and their professional associations (e.g., the National Association of County and City Health Officials) when seeking out research findings. Future research and practice should aim to design and evaluate strategies that improve access to and uptake of scientific resources among LHD practitioners.
Acknowledgments
This study was supported by the Robert Wood Johnson Foundation (69964; Public Health Services and Systems Research). This article is the product of a Prevention Research Center and was also supported by a cooperative agreement from the Centers for Disease Control and Prevention (Cooperative Agreement Number U48/DP001903).
We thank members of our research team, Carolyn Leep (National Association of City and County Health Officials), Rodrigo Reis, PhD (the Pontifical Catholic University of Parana and the Federal University of Parana), Paul Erwin, MD, DrPH (University of Tennessee), and Carson Smith (Washington University in St. Louis).
Note. The findings and conclusions in this article are those of the authors and do not necessarily represent the official position of the Centers for Disease Control and Prevention.
Human Participant Protection
Human participant approval was obtained from the Washington University in St. Louis institutional review board.
References
1. Brownson RC, Fielding JE, Maylahn CM. Evidence-based public health: a fundamental concept for public health practice. Annu Rev Public Health. 2009;30:175–201. doi: 10.1146/annurev.publhealth.031308.100134.
2. Lenaway D, Corso LC, Buchanan S, Thomas C, Astles R. Quality improvement and performance: CDC’s strategies to strengthen public health. J Public Health Manag Pract. 2010;16(1):11–13. doi: 10.1097/PHH.0b013e3181c115ee.
3. Randolph GD, Stanley C, Rowe B, et al. Lessons learned from building a culture and infrastructure for continuous quality improvement at Cabarrus Health Alliance. J Public Health Manag Pract. 2012;18(1):55–62. doi: 10.1097/PHH.0b013e31822d2e23.
4. Riley WJ, Bender K, Lownik E. Public health department accreditation implementation: transforming public health department performance. Am J Public Health. 2012;102(2):237–242. doi: 10.2105/AJPH.2011.300375.
5. Public Health Accreditation Board. Public Health Accreditation Board standards: an overview. Available at: http://www.phaboard.org/wp-content/uploads/PHAB-Standards-Overview-Version-1.0.pdf. Accessed October 24, 2014.
6. National Association of County and City Health Officials, Centers for Disease Control and Prevention. The Community Guide—Public Health Accreditation Board standards crosswalk: a tool to support accreditation and increase use of evidence-based approaches. Available at: http://www.thecommunityguide.org/uses/Community%20Guide-PHAB%20Crosswalk%20Version%201.pdf. Accessed October 24, 2014.
7. Mercer SL, Banks SM, Verma P, Fisher JS, Corso LC, Carlson V. Guiding the way to public health improvement: exploring the connections between the Community Guide’s evidence-based interventions and health department accreditation standards. J Public Health Manag Pract. 2014;20(1):104–110. doi: 10.1097/PHH.0b013e3182aa444c.
8. LaPelle NR, Dahlen K, Gabella BA, Juhl AL, Martin E. Overcoming inertia: increasing public health departments’ access to evidence-based information and promoting usage to inform practice. Am J Public Health. 2014;104(1):77–79. doi: 10.2105/AJPH.2013.301404.
9. Revere D, Turner AM, Madhavan A, et al. Understanding the information needs of public health practitioners: a literature review to inform design of an interactive digital knowledge management system. J Biomed Inform. 2007;40(4):410–421. doi: 10.1016/j.jbi.2006.12.008.
10. Whitener BL, Van Horne VV, Gauthier AK. Health services research tools for public health professionals. Am J Public Health. 2005;95(2):204–207. doi: 10.2105/AJPH.2003.035030.
11. Jacobs JA, Jones E, Gabella BA, Spring B, Brownson RC. Tools for implementing an evidence-based approach in public health practice. Prev Chronic Dis. 2012;9:E116. doi: 10.5888/pcd9.110324.
12. LaPelle NR, Luckmann R, Simpson EH, Martin ER. Identifying strategies to improve access to credible and relevant information for public health professionals: a qualitative study. BMC Public Health. 2006;6:89. doi: 10.1186/1471-2458-6-89.
13. Stamatakis KA, McBride TD, Brownson RC. Communicating prevention messages to policy makers: the role of stories in promoting physical activity. J Phys Act Health. 2010;7(suppl 1):S99–S107. doi: 10.1123/jpah.7.s1.s99.
14. Brownson RC, Reis RS, Allen P, et al. Understanding administrative evidence-based practices: findings from a survey of local health department leaders. Am J Prev Med. 2014;46(1):49–57. doi: 10.1016/j.amepre.2013.08.013.
15. Reis RS, Duggan K, Allen P, Stamatakis KA, Erwin PC, Brownson RC. Developing a tool to assess administrative evidence-based practices in local health departments. Frontiers in Public Health Services and Systems Research. 2014;3(3):2.
16. National Association of County and City Health Officials. National Profile of Local Health Departments Survey: Core and Modules. Washington, DC: National Association of County and City Health Officials; 2010. [Data file]
17. Hajat A, Cilenti D, Harrison LM, et al. What predicts local public health agency performance improvement? A pilot study in North Carolina. J Public Health Manag Pract. 2009;15(2):E22–E33. doi: 10.1097/01.PHH.0000346022.14426.84.
18. Handler A, Issel M, Turnock B. A conceptual framework to measure performance of the public health system. Am J Public Health. 2001;91(8):1235–1239. doi: 10.2105/ajph.91.8.1235.
19. Mays GP, Smith SA, Ingram RC, Racster LJ, Lamberth CD, Lovely ES. Public health delivery systems: evidence, uncertainty, and emerging research needs. Am J Prev Med. 2009;36(3):256–265. doi: 10.1016/j.amepre.2008.11.008.
20. Tilburt JC. Evidence-based medicine beyond the bedside: keeping an eye on context. J Eval Clin Pract. 2008;14(5):721–725. doi: 10.1111/j.1365-2753.2008.00948.x.
21. Brownson RC, Ballew P, Brown KL, et al. The effect of disseminating evidence-based interventions that promote physical activity to health departments. Am J Public Health. 2007;97(10):1900–1907. doi: 10.2105/AJPH.2006.090399.
22. Brownson RC, Ballew P, Dieffenderfer B, et al. Evidence-based interventions to promote physical activity: what contributes to dissemination by state health departments. Am J Prev Med. 2007;33(1 suppl):S66–S73. doi: 10.1016/j.amepre.2007.03.011.
23. Brownson RC, Ballew P, Kittur ND, et al. Developing competencies for training practitioners in evidence-based cancer control. J Cancer Educ. 2009;24(3):186–193. doi: 10.1080/08858190902876395.
24. Dreisinger M, Leet TL, Baker EA, Gillespie KN, Haas B, Brownson RC. Improving the public health workforce: evaluation of a training course to enhance evidence-based decision making. J Public Health Manag Pract. 2008;14(2):138–143. doi: 10.1097/01.PHH.0000311891.73078.50.
25. Jacobs JA, Clayton PF, Dove C, et al. A survey tool for measuring evidence-based decision making capacity in public health agencies. BMC Health Serv Res. 2012;12:57. doi: 10.1186/1472-6963-12-57.
26. Jacobs JA, Dodson EA, Baker EA, Deshpande AD, Brownson RC. Barriers to evidence-based decision making in public health: a national survey of chronic disease practitioners. Public Health Rep. 2010;125(5):736–742. doi: 10.1177/003335491012500516.
27. National Association of County and City Health Officials. 2013 national profile of local health departments. Available at: http://www.naccho.org/topics/infrastructure/profile/upload/2013-National-Profile-of-Local-Health-Departments-report.pdf. Accessed October 24, 2014.
28. Public Health Accreditation Board. Public Health Accreditation Board standards and measures. Version 1.5. Available at: http://www.phaboard.org/wp-content/uploads/SM-Version-1.5-Board-adopted-FINAL-01-24-2014.docx.pdf. Accessed October 24, 2014.
29. Meit M, Knudson A. Why is rural public health important? A look to the future. J Public Health Manag Pract. 2009;15(3):185–190. doi: 10.1097/PHH.0b013e3181a117b4.
30. Brownson RC, Allen P, Duggan K, Stamatakis KA, Erwin PC. Fostering more-effective public health by identifying administrative evidence-based practices: a review of the literature. Am J Prev Med. 2012;43(3):309–319. doi: 10.1016/j.amepre.2012.06.006.
31. Dobbins M, Robeson P, Ciliska D, et al. A description of a knowledge broker role implemented as part of a randomized controlled trial evaluating three knowledge translation strategies. Implement Sci. 2009;4:23. doi: 10.1186/1748-5908-4-23.
32. Rabin BA, Brownson RC. Developing the terminology for dissemination and implementation research. In: Brownson RC, Colditz GA, Proctor EK, editors. Dissemination and Implementation Research in Health: Translating Science to Practice. New York, NY: Oxford University Press; 2012:23–51.
33. Brownson RC, Jacobs JA, Tabak RG, Hoehner CM, Stamatakis KA. Designing for dissemination among public health researchers: findings from a national survey in the United States. Am J Public Health. 2013;103(9):1693–1699. doi: 10.2105/AJPH.2012.301165.
34. Lomas J. Diffusion, dissemination, and implementation: who should do what? Ann N Y Acad Sci. 1993;703:226–235; discussion 235–237. doi: 10.1111/j.1749-6632.1993.tb26351.x.
35. Owen N, Goode A, Fjeldsoe B, Sugiyama T, Eakin E. Designing for the dissemination of environmental and policy initiatives and programs for high-risk groups. In: Brownson RC, Colditz GA, Proctor EK, editors. Dissemination and Implementation Research in Health: Translating Science to Practice. New York, NY: Oxford University Press; 2012:114–127.
36. National Cancer Institute. Designing for Dissemination: Conference Summary Report. Washington, DC: National Cancer Institute; 2002.
37. Slater MD. Theory and method in health audience segmentation. J Health Commun. 1996;1(3):267–283. doi: 10.1080/108107396128059.