Abstract
Dissemination of prevention-focused evidence-based programs (EBPs) from research to community settings may improve population health and reduce health disparities, but such flow has been limited. Academic–community partnerships using community-based participatory research (CBPR) principles may support increased dissemination of EBPs to community-based organizations (CBOs). This qualitative study examined the EBP-related perceptions and needs of CBOs targeting underserved populations. As part of PLANET MassCONECT, a CBPR study, we conducted six key informant interviews with community leaders and four focus groups with CBO staff members in Boston, Worcester and Lawrence, Massachusetts, in 2008. Working definitions of EBPs among CBO staff members varied greatly from typical definitions used by researchers or funders. Key barriers to using EBPs included resource constraints, program adaptation challenges and conflicts with organizational culture. Important facilitators of EBP usage included program supports for implementation and adaptation, collaborative technical assistance and perceived benefits of using established programs. This exploratory study highlights differences among key stakeholders regarding the role of evidence in program planning and delivery. An updated perspective should better incorporate CBO views on evidence and place much-needed emphasis on the impact of context on EBP dissemination in community settings.
Introduction
Modifiable behaviors play a leading role in health promotion and disease prevention, and interventions to promote healthy behaviors could have a significant impact on population health [1, 2]. Accordingly, a number of interventions and campaigns have been developed that successfully address risky behaviors and promote healthy behaviors [1, 3–5]. Despite the availability of effective programs, it is unclear how best to translate knowledge from controlled intervention settings to population settings. This gap between research and practice has recently been an important focus of attention in public health. The impetus stems from the assumption that translation of evidence-based programs (EBPs) to population settings will have a pronounced influence on population health [6, 7]. Community-based organizations (CBOs) are vital channels for disseminating health promotion EBPs given their tremendous influence on the public’s health and their ability to support community participation in preventive care and the healthcare system overall [7–9]. CBOs conducting health outreach among underserved populations are of particular importance, as improvements in the dissemination and large-scale implementation of EBPs are thought to be a critical approach to addressing health disparities [10–12]. CBOs working with the underserved can serve as an efficient channel of program delivery: they understand the needs of their clientele intimately and may reach them with greater sensitivity to local cultures and constraints than other organizations [13, 14].
Despite the promise of program delivery in community settings, successful dissemination of EBPs to CBOs has been limited, and the reasons for the research–practice gap are not clear [7, 15]. One explanation is that earlier attempts to increase EBP usage relied on a top-down ‘push’ approach from the research and funding communities. Currently proposed strategies also include a ‘pull’ approach, driven by the needs and capacity of practitioners implementing EBPs in practice settings, as well as attempts to change the overall climate around the use of evidence in decision-making processes and to improve evaluation activities [16–18]. The historic mismatch between needs and offerings is reflected in common barriers to EBP usage in community settings: (i) expensive, overly complex and/or time-consuming interventions, (ii) organizational barriers, including limited funding and staff capacity or the lack of an organizational champion for EBPs, (iii) failure of interventions to address outcomes of interest to CBOs, (iv) challenges in finding, using and adapting EBPs, (v) lack of effective community engagement in promoting the intervention and (vi) challenges in implementing the EBP as designed [6, 7, 15, 19–21].
Academic–community partnerships may offer a useful alternative to top-down efforts to create change [17, 22]. These partnerships are only one of many leverage points for increasing the spread of EBPs, but they are an important (and likely underutilized) resource, and thus the focus of our attention here. One way for academic–community partnerships to improve dissemination of EBPs is through a two-pronged approach of participatory research and capacity building. A community-based participatory research (CBPR) approach builds on strengths and resources held by the community, combines knowledge and action to benefit all partners, utilizes an iterative process that supports colearning and empowerment, considers health from positive and ecological perspectives and facilitates collaborative, equitable involvement of all partners throughout the research process [23]. This strengths perspective and emphasis on end-user voices align well with capacity-building efforts: processes that assist communities in managing, developing and utilizing the resources at their disposal to solve health problems, largely at their own initiative. These efforts represent an intentional shift away from top-down technical assistance toward an endogenous process, owned and driven by those who will ultimately benefit from and sustain changes in their systems [24, 25].
Given the goals of capacity-building and sustainable change, it is crucial to look at practitioners’ perspectives on the use of EBPs as part of a new way of work, which relies on the use of evidence as the driver of decision-making processes. Despite the potential to improve system-level capacity for EBP delivery, adopting a new way of work is often associated with challenges related to culture change and other contextual factors [16, 26–28]. In this arena, part of the cultural disconnect may relate to the idea of research evidence. Stakeholders may understand ‘evidence’ differently (with CBOs emphasizing tacit knowledge or community preferences), available evidence may not be seen as relevant to community settings and/or decision-makers may not see research evidence as an important driver of decision-making processes [18, 29, 30]. Solutions may rely on taking a more inclusive approach regarding the range of evidence considered meaningful in these settings or assisting research users with applying available research to local circumstances [31, 32].
An associated challenge for EBP dissemination in CBOs relates to the need to understand and intervene on the processes by which, and the contexts within which, programs are delivered [11, 15, 33]. As described by Brownson et al. [30], individual, interpersonal, organizational, sociocultural and political/economic contextual variables impact intervention development, implementation and adaptation. Understanding the impact of context on EBP usage is particularly important for CBOs targeting the underserved, as EBPs are typically developed for other target audiences. A recent review focused on spreading effective programs to community settings noted a lack of diversity among target audiences and settings for most studies [34]. This results in a high degree of mismatch in terms of audience attributes, staff characteristics, organizational resources and/or community-level factors [15, 35, 36]. CBOs utilizing programs developed for other populations/contexts must engage in program adaptation, or the modification of a tested program to meet the needs of the context in which it is being implemented [15]. This is a major stumbling block for organizations, as they must balance the tension between adapting EBPs to the contextual characteristics that support adoption and implementation and maintaining the intervention elements that have been tested and proven [37]. Given the challenges described here and the public health importance of CBOs targeting the underserved, it is appropriate and necessary to study this subset of organizations in detail.
By better understanding the ground-level challenges of organizations working with vulnerable populations, we can better assess the needs of such organizations related to EBP utilization. In this way, researchers and program developers can improve their product offerings to be more attractive to and appropriate for community-based practitioners, and thus increase demand for EBPs [8]. With this in mind, two broad research questions guided the study of organizations working with underserved groups. First, how do those conducting health outreach in organizations targeting the underserved understand EBPs? Second, what are the perceived barriers and facilitators that influence usage of EBPs by these organizations?
Methods
Study design
Data for this study come from formative research conducted as a part of PLANET MassCONECT, a large-scale dissemination/knowledge translation project aimed at building capacity to adopt EBPs among CBOs working with the underserved in three Massachusetts communities: Boston, Lawrence and Worcester. The three cities represent considerable diversity in terms of population size, racial/ethnic composition and socioeconomic status (SES) (Table I). This capacity-building initiative focuses on developing practitioner skills in finding, adapting and evaluating EBPs. Rather than focusing on a specific program, the intervention attempts to build skills around a systematic approach to program planning, one that centers on the use of EBPs.
Table I.
Exemplar sociodemographic characteristics of Boston, Lawrence and Worcester, Massachusetts, data from the 2000 Census [38]
| City | Population (thousands) | Hispanic (of any race) (%) | White (%) | Black (%) | Families living below poverty line (%) |
|---|---|---|---|---|---|
| Boston | 589 | 14 | 54 | 25 | 15 |
| Lawrence | 72 | 60 | 49 | 5 | 21 |
| Worcester | 173 | 15 | 77 | 7 | 14 |
The aim of this project is to achieve capacity-building goals through CBPR processes, thus promoting sustainable change in multiple ways. In line with this philosophy, this study was conducted in collaboration with the Community Project Advisory Committee (C-PAC), an advisory group that includes community partners from each community as well as investigators, dissemination partners and study staff. This group was involved in study design and interpretation of findings. This strategy ensures that community context is understood as part of the intervention strategy and not as variation to be controlled for in an analysis [39]. Given that some C-PAC members would ultimately serve as respondents, we solicited high-level feedback from community partners early in the development of interview guides and materials, thus limiting potential conflicts and biases. We utilized key informant interviews and focus group discussions, two qualitative methods that support the goal of obtaining in-depth understanding in an area in which the important questions and concerns have not already been defined [40, 41]. These methods allow for open conversation between the moderator and respondent(s) and for extensive and flexible probing [42].
Participants
We utilized a purposive sampling technique to select study participants. We conducted six key informant interviews with community leaders, two from each of the three communities, during the summer and fall of 2008. These individuals were members of the C-PAC and represented three community coalitions engaged in health outreach in Boston, Lawrence and Worcester. Some of these individuals were coalition leaders, others held leadership roles in non-profit organizations. The purpose of these interviews was to solicit community leaders’ perspectives on the use of EBPs in local CBOs.
Focus group respondents, referred to throughout as general practitioners, were staff members (managers, project planners and field staff) from CBOs that conduct health program planning in Boston, Lawrence and Worcester, Massachusetts. Focus group respondents were recruited with assistance from our partners in the three communities. We made an effort to draw representatives from organizations working with diverse racial, ethnic and socioeconomic groups. We conducted four focus groups: one in English in each of Boston, Lawrence and Worcester, and one in Spanish in Lawrence, given the predominance of Spanish-speaking staff members in that area. These discussions were held between September and December 2008. A total of 31 individuals participated in the focus groups. Across the focus group and key informant pools, the majority of the 37 participants were female, with substantial diversity in terms of race/ethnicity (Table II).
Table II.
Participant characteristics (n = 37)
| | Number of groups | Male | Female | White (non-Latino/Hispanic) | Black (non-Latino/Hispanic) | Latino/Hispanic (any race) | Other |
|---|---|---|---|---|---|---|---|
| Focus groups (n = 31) | | | | | | | |
| Boston | 1 | 0 | 7 | 1 | 5 | 0 | 1 |
| Lawrence | 2 | 1 | 15 | 3 | 0 | 11 | 2 |
| Worcester | 1 | 1 | 7 | 4 | 3 | 1 | 0 |
| Key informant interviews (n = 6) | n/a | 2 | 4 | 2 | 2 | 2 | 0 |
Data collection
The key informant interview and focus group guides both focused on respondents’ understanding of and experiences with EBPs and other health promotion programs. Given our interest in perspectives on the class of programs known as EBPs, the guides probed respondents’ experiences with the range of health promotion programs and EBPs that they had used in practice settings. Respondents were also asked about current practices in program planning and evaluation, EBP adoption and implementation and barriers to/facilitators of EBP usage. Additional information was collected for an assessment related to the intervention under development, but those data were excluded from this analysis. Key informant interviews were conducted by research staff with experience in qualitative methods and took approximately 1 hour to complete. The English focus groups were conducted by a cultural anthropologist with extensive qualitative research experience; an experienced Spanish-speaking facilitator moderated the discussion held in Spanish. The focus groups were approximately 2 hours in length and were held at locations easily accessible to respondents in each of the three communities. Each participant was paid an incentive of $35 for his or her participation.
Data analysis
The focus groups and key informant interviews were audiotaped and transcripts were created for data analysis. The Spanish language group’s transcripts were translated into English for review. First, three members of the research team reviewed the data using the crystallization/immersion method, an inductive technique that allows researchers to examine transcripts and highlight (or crystallize) the most important concepts [43]. This step was followed by coding of the data according to structured hierarchical database indexing, utilizing dedicated qualitative data analysis software, NVivo 8 [44]. Structural coding reports were generated utilizing NVivo and read by three members of the research team, prompting the identification and refining of key themes. Data from the focus group discussions and key informant interviews were reviewed separately to allow for comparison of themes and patterns. Following established techniques of qualitative research aimed at increasing the validity of findings, the research team presented data and interpretations to C-PAC members to gauge perceived accuracy. These efforts, or member-checks, prompted the research team to refine theme definition and interpretation of the data [45].
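To make the coding workflow concrete, the sketch below illustrates, in Python, the general shape of the hierarchical (structural) coding and source-separated reporting described above. It is a minimal illustration only: the study itself used NVivo 8 rather than custom code, and the codebook, function names and example excerpts shown here are hypothetical.

```python
# Minimal, hypothetical sketch of hierarchical ("structural") coding of
# qualitative transcript excerpts. The study used NVivo 8; this stand-in
# only illustrates the workflow's shape, and all names and data are invented.
from collections import defaultdict

# Hierarchical codebook: parent theme -> child codes.
CODEBOOK = {
    "barriers": ["resource_constraints", "adaptation", "organizational_culture"],
    "facilitators": ["program_supports", "technical_assistance", "credibility"],
}

# Each coded excerpt records its source type so that focus group and key
# informant data can be reviewed separately, as in the analysis.
excerpts = [
    {"source": "focus_group", "code": "barriers/resource_constraints",
     "text": "It's a lot of paperwork ..."},
    {"source": "key_informant", "code": "barriers/organizational_culture",
     "text": "I don't think it's part of the culture."},
]

def validate(code: str) -> None:
    """Ensure a code path exists in the hierarchical codebook."""
    parent, _, child = code.partition("/")
    if parent not in CODEBOOK or child not in CODEBOOK[parent]:
        raise ValueError(f"Unknown code: {code}")

def coding_report(items):
    """Group excerpts by source type, then by code, mirroring the separate
    structural coding reports generated for each data source."""
    report = defaultdict(lambda: defaultdict(list))
    for item in items:
        validate(item["code"])
        report[item["source"]][item["code"]].append(item["text"])
    return report

for source, codes in coding_report(excerpts).items():
    print(source)
    for code, quotes in sorted(codes.items()):
        print(f"  {code}: {len(quotes)} excerpt(s)")
```

Reviewing such code-by-source summaries, with the quoted excerpts attached, is roughly what the structural coding reports described above provide, and it forms the starting point for the theme refinement and member-checking steps.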
Results
Familiarity with and conceptualization of EBPs
While respondents were familiar with the terms ‘evidence-based’ and ‘evidence-based program’, their definitions of EBPs varied considerably. Community leaders (interviewed as key informants) typically gave definitions of EBPs quite similar to those used by academics. Among general practitioners (interviewed in focus groups), some noted that EBPs were comprehensive programs tested in a research setting or programs with demonstrated success for a given population. However, the common theme was that the use of data at any point in the lifecourse of a program was taken as an indication that the program was an EBP. Such applications of data included needs assessments, program development, program selection and program evaluation. The following quotes represent typical definitions of EBPs:
So I would say that … (the) program that we currently have through the (government organization) would be an evidence-based project because it was supported by data gathered through the school system … So it was all based on data collection, and then, how do we then create a model to provide.—Participant 1 (Focus Group, Boston)
For me, that is evidence-based because they are looking to obtain information about areas with the most need … To be able to collect that information we know where the need is, what we can do, and what programs we have to implement to continue helping those that still do not have that help.—Participant 2 (Focus Group, Lawrence)
(Evidence-based programs mean that) somebody has systematically studied and demonstrated the efficacy of a particular intervention or approach, and that they’ve published it, usually in a peer-reviewed journal.—Participant 3 (Key Informant, Boston)
A program that has shown success, based on positive, or whatever outcomes that are proven and have been shown to work.—Participant 4 (Focus Group, Worcester)
General practitioners noted that EBPs are often identified as such by funders, researchers or national agencies. They also noted that their organizations would like to use EBPs but were not currently doing so. Discussions of EBPs consistently included mention of external funding agencies and their requirements related to this class of programs, requirements that often appeared to be fulfilled without perceived benefit to the organization delivering the program.
I have learned, now, that you read the end, first—what they want, the outcomes, and, um … you read that first, and then you work to the outcomes so that you’ll, or, you know, work toward it, but . . . that’s like testing to the test.—Participant 4 (Focus Group, Worcester)
Barriers to EBP usage
Community leaders and general practitioners described three broad groups of barriers to using EBPs in practice: resource constraints, challenges with program adaptation and organizational culture. Resource constraints, particularly those related to staff capacity, material resources and time, were cited as the predominant barrier to EBP usage. Respondents identified insufficient staff and/or a lack of the expertise needed to apply for grants, implement programs as designed and conduct the intense and complex evaluations required by funders. In this vein, evaluations and reporting were typically discussed as effort taken to fulfill funder requirements rather than work that had value to the organization, a classic illustration of the ‘top-down’ approach discussed earlier.
It’s a lot of paperwork! … First they said you had to do an evidence-based program. And then they said, ‘Okay, but their evidence-based evaluation isn’t enough, on top of that, you have to do ours.’ … I’ve always found it too restrictive …—Participant 5 (Focus Group, Boston)
The evaluation … tends to be more process evaluation, than outcome evaluation … Some grants are fairly rudimentary in their outcome, in their evaluation. They just look at, ‘Did you provide the service?’ and they don’t really look at who was served or what the outcome of the person. You know, the people we serve, were they any better off at the end? Although, I think everybody would like to get to that point, nobody really knows exactly how to do it, given the limited resources to do it.—Participant 6 (Key Informant, Worcester)
A second set of barriers related to the difficulty of adapting programs for the target population. Respondents noted that programs were often developed in contexts dramatically different from those in which they were to be implemented. General practitioners who had used EBPs reported having adapted programs to fit their community, typically in terms of language, literacy levels and cultural relevance.
I’m thinking like, across the country. So, if something may work really well in the Midwest, you have to think of inner-city populations of people … you’re gonna definitely find, clearly a difference in what works, scientifically, here.—Participant 1 (Focus Group, Boston)
We can adapt all we want, but then we have to really say, ‘Is it really … are we maintaining and keeping what it’s supposed to be?’ Cause when you translate, you kind of lose it. Not the whole thing, but you lose pieces of it.—Participant 7 (Key Informant, Lawrence)
We don’t change the concept, we change sort of the way maybe it’s delivered.—Participant 8 (Key Informant, Lawrence)
Another facet of this barrier was the perception that funders had placed prohibitive restrictions on adaptation. Greater flexibility in modifying programs to meet the needs of the target audience (based on culture, age, language, etc.) was seen as vital.
That’s all, I find that’s a challenge sometimes when … a funder kind of says … ‘Stick to the letter!’—Participant 1 (Focus Group, Boston)
A third class of barriers centered on organizational culture: both community leaders and general practitioners noted that using EBPs is not part of institutional culture in CBOs and that changing organizational practice to incorporate EBPs is challenging. General practitioners noted that rather than seeking tested programs, they typically developed health promotion programs based on their own knowledge or used programs that local organizations providing similar services had used successfully. Community leaders echoed this view:
Sometimes, if there’s an important individual who’s interested in evidence-based programs, or who has heard about something that works, they’re more likely to do it. But I don’t think it’s part of the culture.—Participant 6 (Key Informant, Worcester)
I think a lot of people believe that solutions need to be developed locally. And, so, what I’ve seen more often is trying sometimes to reinvent the wheel too many times.—Participant 9 (Key Informant, Worcester)
Facilitators of EBP usage
Three important facilitators of EBP usage described by community leaders and general practitioners were as follows: program supports for implementation and adaptation, collaborative technical assistance and perceived benefits of using proven programs. Programs with established guidelines, manuals, training opportunities and provisions for adaptation were seen as easier to adopt. Such supports were expected to reduce the costs and effort required to start and utilize the program.
One of the pieces that usually comes with an evidence-based program is that there is a training that goes along with it so you don’t have to come up with a training for people. And also, an evaluation that comes up with it. Because, coming up with your own evaluation is a killer.—Participant 5 (Focus Group, Boston)
One of the grants that we’ve got … what they did was give you some intensive training on the background of how to do it … They had all the evidence for you, and gave you the tools to work with it so that you could modify and adapt it to your situation and take it. It helped.—Participant 10 (Focus Group, Worcester)
Community leaders and general practitioners also emphasized the importance of collaborative technical assistance in successful adoption, adaptation, implementation and evaluation. General practitioners provided detail regarding these processes and noted that it was easier to implement EBPs when they could access support from funders, researchers and others with whom they had an established relationship that allowed for open conversation and negotiation as needed. They noted that they were able to learn how to modify EBPs to meet their needs through technical assistance. Guidance around evaluation, particularly in terms of setting up metrics and demonstrating impact, was cited as a valuable support. In addition to staff associated with the program being adopted, other academic partners and graduate students or interns were listed as potential sources of assistance. The ability to take advantage of academic partners as resources appeared to be facilitated through strong relationships with these partners, including the types developed through the CBPR processes related to the project.
One of the difficulties was, they have a very controlled population (but) … our population was very varied … So that was really difficult to quantify. You could see the result, but, it wasn’t as easy to quantify the way they did it in their report. So, it was a big drawback. But I had to call some of the guys … and they were very helpful to tell me what to do.—Participant 11 (Focus Group, Worcester)
We (went) back to the people who developed the curriculum … (to) say, “This doesn’t work for our population, this age. We have to drop it down.” And they said, “You know, as long as there was this much knowledge that they already had, then you could drop it down.”—Participant 5 (Focus Group, Boston)
When you demand all of this sort of technical stuff, we need more assistance from the funders, from the hospitals, from scientific-based community, whatever it is … so that we can go and do the work. We need your partnership more than just, you know, plopping money on us.—Participant 5 (Focus Group, Boston)
Another set of facilitators of EBP usage described by community leaders and general practitioners included confidence in a program based on the research that supported it and the credibility of the program developer, as well as the prestige of using a recognized program or one developed by a prestigious organization. General practitioners noted that they appreciated the ability to see the results and successes that other organizations had achieved. They expected initial costs to be lower when using a previously developed program, particularly as they were able to learn from other organizations’ experience and save some of the effort associated with trial and error. They also suggested that they found it easier to use programs that were identified as strong EBPs by funders.
It has the accreditation, the American Diabetes Association stamp behind it, so I certainly feel that I’m comfortable and, um, you know, that it’s got that background.—Participant 12 (Focus Group, Lawrence)
I think it just lends credibility to what you’re trying to do. With funders, whether it’s your … your current funder, or if you’re seeking new funding … there’s a credibility behind it when you can say that you use evidence-based programs.—Participant 13 (Focus Group, Lawrence)
Discussion
This exploratory study highlights opportunities to facilitate the dissemination of EBPs to CBOs serving underserved populations by better understanding their needs and perspectives. First, we found a disconnect between the ways EBPs were understood by those delivering health programs compared with those often developing and disseminating such programs. Second, our data suggest support and partnership from academics and program developers can help CBOs successfully incorporate the context of practice settings into program delivery, particularly related to program adoption and adaptation. Increased collaboration among community partners, the research community, and funding organizations, through CBPR and other processes, can provide support for addressing the many challenges to EBP dissemination in community settings.
A first step toward increasing the spread of EBPs requires an understanding of what constitutes an EBP to different stakeholders. Our study suggests that the range of definitions of EBPs varies considerably among general practitioners (focus group respondents), community leaders (key informant interviewees), funders and academics, with general practitioners taking a broader view of this class of programs. For example, we found that general practitioners viewed the inclusion of data/evidence at any point in the program as the key requirement for a program being ‘evidence based’. Contrast this with the definition offered by the National Cancer Institute’s ‘Using What Works’ curriculum, which trains practitioners on the use and adaptation of EBPs. The training manual notes that an EBP must have been implemented with a specific group, evaluated rigorously and found to be effective [46]. Similar definitions, focused on scientific evaluation, positive outcomes and often publication in a peer-reviewed format, are often offered by academics and governmental agencies [7, 47, 48]. Our data suggest that there may be a gap in conceptualization between the research and practice worlds regarding how science and data come into play in programming efforts. Despite the differences, all is not lost. For example, we found that community leaders in this study used definitions closer to those of academics and funders in terms of the utility of evidence for decision making. While our research design prohibits us from making definitive statements about the differences between groups, further study to determine whether this pattern holds may be useful in finding an intervention point. These leaders already serve as a bridge between practitioners and external groups, such as policymakers, academics and others, and may be able to lead efforts to change the conversation around EBPs in practice settings.
Another potential point of intervention relates to the recognition among community leaders and general practitioners that data matter for this work, though how and where data should inform programming choices may be viewed differently from academics. This can serve as a starting point to reframe the discussion in two ways. First, it provides impetus to apply principles of evidence-based public health, which integrates community needs and preferences with development and implementation of effective interventions to improve population health [49, 50]. This perspective recognizes the value of diverse types of evidence to drive practices and policies and stresses systematic data-driven program planning, delivery and evaluation practices [51, 52]. In addition to increasing the relevance of evidence presented to practitioners, recognizing the utility of a range of types of evidence will allow the field of health promotion to take advantage of the vast knowledge held by practitioners rather than focusing only on flow from research to practice settings [7, 18, 53]. Taking this type of strengths perspective is characteristic of CBPR efforts [23, 54].
At the same time, an improved focus on the utility of data will also help CBOs improve program delivery. CBO staff members may benefit from prompts to use data and evidence across the lifecourse of program planning and delivery. Respondents in this study appeared to perceive evaluation as an onerous mandate required by the funders. Emphasizing the ability to use data to drive continuous improvement, and thereby improve services and community health, may lead to more favorable views of the process by CBO staff members. This will allow community organizations to take advantage of existing knowledge and apply the evidence-based perspective in a cyclical process involving practice, evaluation and adjustments to practice [55, 56]. For example, YMCA afterschool childcare sites are using data-driven decision-making methods and principles of organizational learning for such efforts [57]. A perspective shift regarding EBPs, and a new way of work—which requires systematic and data-driven approaches to program planning and delivery—can have tremendous impact on the type and quality of health promotion programs offered by CBOs.
In addition to prompting the development of a common understanding of the role of evidence in program planning and delivery, these findings also echo the call to better incorporate the context of practice settings into the process of program development and dissemination [6, 30, 51]. For example, consistent with the literature, we found that resource constraints, particularly in terms of material resources and staff capacity, were major barriers to EBP usage [6, 15, 36]. This is particularly critical for organizations that work with the underserved, as demand for services seldom corresponds with available resources. Context may find a place in the picture more easily if the production and dissemination of EBPs are viewed as iterative processes that require interaction between program developers/distributors and practice-based users [58].
Similarly, ongoing interaction with partners may provide CBOs targeting the underserved with much-needed support related to program adaptation challenges. There is often a large contextual mismatch between program source and implementation site [15, 36], which may be mitigated by collaborative technical assistance from program developers or others who can balance the requirements of science and practice [59, 60]. As described by study respondents, supportive partnerships that allow for multiple interactions and negotiation are vital, another move away from the top-down technical support of the past. Through collaboration and changing the culture around knowledge translation, the value of academic–community partnerships and CBPR work extends through both the research and practice domains and can support the adoption of evidence-based decision making [49, 54].
As an exploratory, qualitative assessment, this study has some limitations. First, given the focus group and key informant interview methods, the generalizability of the findings is limited. Additionally, a wide range of factors impact program dissemination, and we did not attempt to address all of these factors in this study; instead, we focused on in-depth exploration of a small number of concepts. Despite these limitations, the study makes a valuable contribution to the literature based on a series of strengths. First, our selection strategies allowed us to home in on a group that is not often studied: we selected community-based organizations targeting underserved populations and recruited respondents strategically to ensure diversity in terms of the constituents they serve. Second, we interviewed a wide range of practitioners, from field staff, program planners and managers (focus groups) to community leaders (key informant interviews), to ensure a diversity of perspectives. Third, by using a CBPR approach for this study, we ensured that practitioners’ perspectives were incorporated throughout. Future research regarding practitioners’ working definitions of EBPs, as well as opportunities to use capacity-building efforts and partnerships to support evidence-based decision making, will be vital.
The findings from this study and the extant literature suggest that a collaborative partnership between community organizations, academics and funders is required to understand: (i) the range of ways evidence can be incorporated into programming and (ii) the ways in which the realities of practice impact dissemination and implementation. These interactions are particularly important for organizations serving communities of racial/ethnic minorities and those of low SES given the pressing challenges faced by these organizations and their constituents. By ensuring a place for community context in dissemination efforts, we can improve the sustainable success of EBP dissemination in community-based settings.
Funding
National Cancer Institute at the National Institutes of Health (R01 CA132651 to K.V., PI).
Acknowledgments
The study was conducted in collaboration with the PLANET MassCONECT C-PAC, which includes the following members: Community Partners: Chrasandra Reeves, MHA, Certified MCH (Boston Alliance for Community Health); Chyke Doubeni, MD, PhD (University of Massachusetts Medical School); Clara Savage, EdD (Common Pathways); E. John Hess, MRP (Great Brook Valley Health Center); Ediss Gandelman, MBA, MEd (Beth Israel Deaconess Medical Center); Erline Achille (Boston Public Health Commission); Milagro Grullón, MM (Lawrence Community Connections); Nashira Baril, MPH (Boston Public Health Commission) and Vilma Lora (YWCA of Greater Lawrence/Mayor’s Health Task Force). Investigators: K. ‘Vish’ Viswanath, PhD (Harvard School of Public Health/Dana-Farber Cancer Institute); Karen Emmons, PhD (Harvard School of Public Health/Dana-Farber Cancer Institute); Elaine Puleo, PhD (University of Massachusetts) and Glorian Sorensen, PhD, MPH (Harvard School of Public Health/Dana-Farber Cancer Institute). PLANET MassCONECT Study Team: Jaclyn Alexander-Molloy, MS, Cassandra Andersen, Carmenza Bruff, Elizabeth Eichel, Josephine Crisostomo, MPH, Lisa Lowery, Sara Minsky, MPH, Shoba Ramanadhan, ScD, MPH (Dana-Farber Cancer Institute).
Conflict of interest statement
None declared.
References
- 1. Curry SJ, Byers T, Hewitt M, editors; Institute of Medicine. Fulfilling the Potential of Cancer Prevention and Early Detection. Washington, DC: The National Academies Press; 2003.
- 2. Institute of Medicine. Promoting Health: Intervention Strategies from Social and Behavioral Research. Washington, DC: National Academy Press; 2000.
- 3. Viswanath K, Finnegan JR. Community health campaigns and secular trends: insights from the Minnesota Heart Health Program and community trials in heart disease prevention. In: Hornik RC, editor. Public Health Communication: Evidence for Behavior Change. Mahwah, NJ: Lawrence Erlbaum Associates; 2002. pp. 289–312.
- 4. Emmons KM. Behavioral and social science contributions to the health of adults in the United States. In: Smedley BD, Syme SL, editors. Promoting Health: Intervention Strategies from Social and Behavioral Research. Washington, DC: National Academy Press; 2000. pp. 254–321.
- 5. Hornik RC. Public health communication: making sense of contradictory evidence. In: Hornik RC, editor. Public Health Communication: Evidence for Behavior Change. Mahwah, NJ: Lawrence Erlbaum Associates; 2002.
- 6. Glasgow RE, Emmons KM. How can we increase translation of research into practice? Types of evidence needed. Annu Rev Public Health. 2007;28:413–33. doi: 10.1146/annurev.publhealth.28.021406.144145.
- 7. Kerner J, Guirguis-Blake J, Hennessy KD, et al. Translating research into improved outcomes in comprehensive cancer control. Cancer Causes Control. 2005;16:27–40. doi: 10.1007/s10552-005-0488-y.
- 8. Maibach EW, Van Duyn MAS, Bloodgood B. A marketing perspective on disseminating evidence-based approaches to disease prevention and health promotion. Prev Chronic Dis. 2006;3.
- 9. World Health Organization. Health 21: Health for All in the 21st Century. Copenhagen, Denmark: World Health Organization Regional Office for Europe; 1999.
- 10. Institute of Medicine. Unequal Treatment: Confronting Racial and Ethnic Disparities in Health Care. Washington, DC: National Academies Press; 2003.
- 11. Institute of Medicine. Challenges and Successes in Reducing Health Disparities: Workshop Summary. Washington, DC: Institute of Medicine; 2008.
- 12. Viswanath K. Public communication and its role in reducing and eliminating health disparities. In: Thomson GE, Mitchell F, Williams MB, editors. Examining the Health Disparities Research Plan of the National Institutes of Health: Unfinished Business. Washington, DC: Institute of Medicine; 2006. pp. 215–53.
- 13. Stephens KK, Rimal RN. Expanding the reach of health campaigns: community organizations as meta-channels for the dissemination of health information. J Health Commun. 2004;9:97–111. doi: 10.1080/10810730490271557.
- 14. Griffith DM, Allen JO, DeLoney EH, et al. Community-based organizational capacity building as a strategy to reduce racial health disparities. J Prim Prev. 2010;31:31–9. doi: 10.1007/s10935-010-0202-z.
- 15. Fixsen DL, Naoom SF, Blase KA, et al. Implementation Research: A Synthesis of the Literature. Tampa, FL: University of South Florida, Louis de la Parte Florida Mental Health Institute, The National Implementation Research Network; 2005.
- 16. Orleans CT. Increasing the demand for and use of effective smoking-cessation treatments: reaping the full health benefits of tobacco-control science and policy gains–in our lifetime. Am J Prev Med. 2007;33:S340–8. doi: 10.1016/j.amepre.2007.09.003.
- 17. Green LW, Mercer SL. Can public health researchers and agencies reconcile the push from funding bodies and the pull from communities? Am J Public Health. 2001;91:1926–8. doi: 10.2105/ajph.91.12.1926.
- 18. Lavis JN, Lomas J, Hamid M, et al. Assessing country-level efforts to link research to action. Bull World Health Organ. 2006;84:620–8. doi: 10.2471/blt.06.030312.
- 19. Rogers EM. Diffusion of Innovations. New York, NY: The Free Press; 2003.
- 20. Kerner J, Rimer B, Emmons K. Introduction to the special section on dissemination: dissemination research and research dissemination: how can we close the gap? Health Psychol. 2005;24:443–6. doi: 10.1037/0278-6133.24.5.443.
- 21. McKleroy VS, Galbraith JS, Cummings B, et al. Adapting evidence-based behavioral interventions for new settings and target populations. AIDS Educ Prev. 2006;18:59–73. doi: 10.1521/aeap.2006.18.supp.59.
- 22. Minkler M, Wallerstein N, Wilson N. Improving health through community organization and community building. In: Glanz K, Rimer B, Viswanath K, editors. Health Behavior and Health Education: Theory, Research and Practice. San Francisco, CA: Jossey-Bass Inc. Publishers; 2008. pp. 287–312.
- 23. Israel BA, Schulz AJ, Parker EA, et al. Review of community-based research: assessing partnership approaches to improve public health. Annu Rev Public Health. 1998;19:173–201. doi: 10.1146/annurev.publhealth.19.1.173.
- 24. Hawe P, Noort M, King L, et al. Multiplying health gains: the critical role of capacity-building within health promotion programs. Health Policy. 1997;39:29–42. doi: 10.1016/s0168-8510(96)00847-0.
- 25. Organization for Economic Co-operation and Development. The Challenge of Capacity Development: Working Towards Good Practice. Paris: Development Assistance Committee, Organization for Economic Co-operation and Development; 2006.
- 26. Bender KW, Cedeno JE, Cirone JF, et al. Process innovation–case studies of critical success factors. Eng Manag J. 2000;12:17–24.
- 27. Wandersman A, Duffy J, Flaspohler P, et al. Bridging the gap between prevention research and practice: the interactive systems framework for dissemination and implementation. Am J Community Psychol. 2008;41:171–81. doi: 10.1007/s10464-008-9174-z.
- 28. Tidd J, Bessant J, Pavitt K. Managing Innovation. New York, NY: John Wiley & Sons; 2001.
- 29. Kothari A, Armstrong R. Community-based knowledge translation: unexplored opportunities. Implement Sci. 2011;6:59. doi: 10.1186/1748-5908-6-59.
- 30. Brownson RC, Baker EA, Leet TL, et al. Evidence-Based Public Health. New York, NY: Oxford University Press; 2011.
- 31. Upshur REG, VanDenKerkhof EG, Goel V. Meaning and measurement: an inclusive model of evidence in health care. J Eval Clin Pract. 2001;7:91–6. doi: 10.1046/j.1365-2753.2001.00279.x.
- 32. Weiss CH, Murphy-Graham E, Petrosino A, et al. The fairy godmother and her warts: making the dream of evidence-based policy come true. Am J Eval. 2008;29:29–47.
- 33. Institute of Medicine. Crossing the Quality Chasm: A New Health System for the 21st Century. Washington, DC: National Academies Press; 2001.
- 34. Rabin BA, Glasgow RE, Kerner JF, et al. Dissemination and implementation research on community-based cancer prevention: a systematic review. Am J Prev Med. 2010;38:443–56. doi: 10.1016/j.amepre.2009.12.035.
- 35. Castro FG, Barrera M Jr, Martinez CR Jr. The cultural adaptation of prevention interventions: resolving tensions between fidelity and fit. Prev Sci. 2004;5:41–5. doi: 10.1023/b:prev.0000013980.12412.cd.
- 36. Brownson RC, Fielding JE, Maylahn CM. Evidence-based public health: a fundamental concept for public health practice. Annu Rev Public Health. 2009;30:175–201. doi: 10.1146/annurev.publhealth.031308.100134.
- 37. Rabin BA, Brownson RC, Kerner JF, et al. Methodologic challenges in disseminating evidence-based interventions to promote physical activity. Am J Prev Med. 2006;31:24–34. doi: 10.1016/j.amepre.2006.06.009.
- 38. U.S. Census Bureau. State and County QuickFacts. Washington, DC: U.S. Census Bureau; 2000.
- 39. Warnecke RB, Oh A, Breen N, et al. Approaching health disparities from a population perspective: the National Institutes of Health Centers for Population Health and Health Disparities. Am J Public Health. 2008;98:1608–15. doi: 10.2105/AJPH.2006.102525.
- 40. Gilchrist VJ, Williams RL. Key informant interviews. In: Crabtree BF, Miller WL, editors. Doing Qualitative Research. Thousand Oaks, CA: SAGE Publications; 1999.
- 41. Brown JB. The use of focus groups in clinical research. In: Crabtree BF, Miller WL, editors. Doing Qualitative Research. Thousand Oaks, CA: SAGE Publications; 1999.
- 42. Fontana A, Frey JH. Interviewing: the art of science. In: Denzin NK, Lincoln YS, editors. Handbook of Qualitative Research. Thousand Oaks, CA: SAGE Publications; 1994.
- 43. Borkan J. Immersion/crystallization. In: Crabtree BF, Miller WL, editors. Doing Qualitative Research. Thousand Oaks, CA: SAGE Publications; 1999. pp. 179–94.
- 44. QSR International Pty Ltd. NVivo Qualitative Data Analysis Software, Version 8. Melbourne, Australia: QSR International; 2008.
- 45. Lincoln YS, Guba EG. Naturalistic Inquiry. Beverly Hills, CA: Sage; 1985.
- 46. National Cancer Institute. Using What Works: Adapting Evidence-Based Programs to Fit Your Needs. Washington, DC: U.S. Department of Health and Human Services, National Institutes of Health; 2006.
- 47. Cooney SM, Huser M, Small S, et al. Evidence-Based Programs: An Overview. What Works, Wisconsin—Research to Practice Series, Issue 6. Madison, WI: University of Wisconsin–Madison and University of Wisconsin–Extension; 2007.
- 48. Centers for Disease Control and Prevention. 2009 Compendium of Evidence-Based HIV Prevention Interventions. Atlanta, GA: Centers for Disease Control and Prevention; 2009.
- 49. Kohatsu ND, Robinson JG, Torner JC. Evidence-based public health: an evolving concept. Am J Prev Med. 2004;27:417–21. doi: 10.1016/j.amepre.2004.07.019.
- 50. Brownson RC, Gurney JG, Land G. Evidence-based decision making in public health. J Public Health Manag Pract. 1999;5:86–97. doi: 10.1097/00124784-199909000-00012.
- 51. Green LW, Ottoson JM, García C, et al. Diffusion theory and knowledge dissemination, utilization, and integration in public health. Annu Rev Public Health. 2009;30:151–74. doi: 10.1146/annurev.publhealth.031308.100049.
- 52. Brownson RC, Diem G, Grabauskas V, et al. Training practitioners in evidence-based chronic disease prevention for global health. Promot Educ. 2007;14:159–63.
- 53. Dearing JW, Greene SM, Stewart WF, et al. If only we knew what we know: principles for knowledge sharing across people, practices, and platforms. Transl Behav Med. 2011;1:15–25. doi: 10.1007/s13142-010-0012-0.
- 54. Wilson MG, Lavis JN, Travers R, et al. Community-based knowledge transfer and exchange: helping community-based organizations link research to action. Implement Sci. 2010;5:33. doi: 10.1186/1748-5908-5-33.
- 55. Rychetnik L, Hawe P, Waters E, et al. A glossary for evidence based public health. J Epidemiol Community Health. 2004;58:538–45. doi: 10.1136/jech.2003.011585.
- 56. Institute for Healthcare Improvement. The Breakthrough Series: IHI’s Collaborative Model for Achieving Breakthrough Improvement. IHI Innovation Series White Paper. Boston, MA: Institute for Healthcare Improvement; 2003.
- 57. Wiecha JL, Nelson TF, Roth BA, et al. Disseminating health promotion practices in after-school programs through YMCA learning collaboratives. Am J Health Promot. 2010;24:190–8. doi: 10.4278/ajhp.08022216.
- 58. Baker EL, White LE, Lichtveld MY. Reducing health disparities through community-based research. Public Health Rep. 2001;116:517–9. doi: 10.1016/S0033-3549(04)50083-3.
- 59. Lattimore D, Griffin SL, Wilcox S, et al. Understanding the challenges encountered and adaptations made by community organizations in translation of evidence-based behavior change physical activity interventions: a qualitative study. Am J Health Promot. 2010;24:427–34. doi: 10.4278/ajhp.081024-QUAL-252.
- 60. Kothari A, Edwards N, Brajtman S, et al. Fostering Interactions: The Networking Needs of Community Health Nursing Researchers and Decision Makers. Health Studies Publications, Paper 9. London, ON: University of Western Ontario; 2005.
