Abstract
Mini-grants are an increasingly common tool for engaging communities in evidence-based interventions (EBIs) for promoting public health. This paper describes efforts by four centers of the Centers for Disease Control and Prevention (CDC)/National Cancer Institute (NCI)-funded Cancer Prevention and Control Research Network to design and implement mini-grant programs to disseminate EBIs for cancer prevention and control. For each program, the paper describes the source of EBIs, funding levels, selection criteria, timeframe, number and size of grants, types of organizations funded, selected accomplishments, training and technical assistance (TA), and evaluation topics and methods. Grant size ranged from $1,000 to $10,000 (median = $6,250). These mini-grant programs were characterized by their emphasis on training and TA for evidence-based programming and on dissemination of interventions from NCI’s Research-Tested Intervention Programs and CDC’s Community Guide. All projects had an evaluation component, although evaluations varied in scope. The mini-grant processes described here can serve as a model for organizations, such as state health departments, working to bridge the gap between research and practice.
Introduction
Appreciation is mounting for the importance of evidence-based approaches, especially to address the growing challenge of chronic disease in the U.S.1–4 Given concerns about efficient use of health care resources,5,6 evidence-based interventions are gaining ground as strategies for meeting the nation’s triple aim of health care reform: achieving “better health, better health care, and better value.”7,8 Interest in evidence-based approaches also arises from a growing inventory of evidence-based programs and approaches in public health.9 This is especially true in the cancer prevention and control field, in which the National Cancer Institute (NCI) maintains a searchable database of Research-Tested Intervention Programs (RTIPs)10 and the Centers for Disease Control and Prevention (CDC) publishes The Guide to Community Preventive Services (Community Guide),11 which summarizes recommended strategies for a range of health issues, including cancer prevention and control. Unfortunately, even with these web-based resources, a gap persists between what is known from research and what is implemented in practice.2,12
The science of how to disseminate evidence-based research, guidelines, and interventions continues to evolve.12–14 Recent research demonstrates that multi-modal dissemination strategies involving participation and interaction are most likely to be effective.12,15 In a systematic review of community-based cancer prevention studies, trainings (e.g., train-the-trainer, certificate training, workshops) supplemented with some technical assistance (TA) were the most common dissemination strategy.15 Electronic methods are also increasingly used to disseminate evidence-based program information and related resources. In addition to web-based warehouses of evidence-based interventions, webinars and electronically delivered resources provide an efficient means of conducting program trainings and providing basic TA prior to program implementation.16 Hardcopy facilitator guides, training and implementation manuals, and packaged educational materials for in-person, interactive trainings remain important for dissemination17 and continue to be evaluated and compared with web-based toolkits and TA.18 Throughout the implementation phase of an intervention, multiple methods of communication (e.g., e-mail, teleconference, site visits) have been used for project oversight, guidance, and TA, and this communication is often tailored to the needs and capabilities of each organization.19
Even with web-based inventories of effective intervention strategies and training on how to find and adapt them, funds, either internal or external to the implementing organization, are typically needed to implement specific evidence-based interventions.20–22 Several dissemination projects have provided monetary resources in the form of mini-grants to community partners to support the implementation of evidence-based health promotion interventions.23–30 Funding can incentivize community-based organizations to implement evidence-based approaches and can provide practical and much-needed fiscal aid. The enthusiasm of a community with a recognized health need, funds for the actual implementation of the evidence-based intervention, and oversight and TA can build capacity for evidence-based approaches to health behavior change and improvement of health outcomes.17,19 For example, the Texas Tobacco Prevention Initiative provided $2,000 mini-grants for up to two years to schools to follow CDC Guidelines to Prevent Tobacco Use and Addiction.27 The People with Arthritis Can Exercise (PACE) program gave $2,000 mini-grants to community organizations to support training and program start-up costs.28 Other mini-grant initiatives have demonstrated that small seed funding to community agencies can help promote healthier nutrition and physical activity environments while strengthening partnerships, coalitions, and community-based participatory research.31,32 However, more work is needed to determine what types of resources and TA contribute to successful dissemination of interventions.31,33,34
The Interactive Systems Framework is a useful conceptual tool for understanding the dissemination and implementation process.35 It identifies three interconnected systems. The first, the Prevention Synthesis and Translation System, involves distilling research and packaging it for use by practitioners; web-based resources such as RTIPs and the Community Guide fit within this system. The second, the Prevention Support System, focuses on building general and innovation-specific capacity of organizations that will adopt and implement evidence-based approaches; mini-grants, combined with training and TA, are one approach to operationalizing this system, which has been identified as a critical component of the research translation process. The third, the Prevention Delivery System, focuses on applying these capacities to the actual implementation of evidence-based approaches in real-world settings.
The purpose of the current paper is to describe how the CDC- and NCI-funded Cancer Prevention and Control Research Network (CPCRN) has used mini-grants to promote evidence-based strategies for cancer prevention and control at multiple sites and in a variety of settings. Mini-grant programs such as those discussed here can serve as models for others interested in promoting evidence-based approaches to cancer and chronic disease prevention. They also represent one approach to establishing a Prevention Support System that actively supports the translation of research into practice.
Methods
The CPCRN is a national network of academic, community, and public health partners who work together to reduce the burden of cancer. Four of the ten CPCRN centers have used mini-grants to help achieve the CPCRN’s mission of accelerating the adoption of evidence-based cancer prevention and control in communities through increased understanding of the dissemination and implementation process. Each center’s mini-grant program is briefly described below and in Tables 1–4.
Table 1.
Description of the CPCRN Mini-Grants Programs
| Description | Total/Summary (across 6 grant programs) | Emory University (Program A) | Emory University (Program B) | University of South Carolina | Texas A&M | University of Texas (Program A) | University of Texas (Program B) |
|---|---|---|---|---|---|---|---|
| Source of Evidence-Based Strategy in RFA | |||||||
| Research Tested Intervention Programs (RTIPS) | 4 | X | X | X | X | ||
| Community Guide | 4 | X | X | X | X | ||
| Evaluation data | 3 | X | X | X | |||
| CDC initiatives/best practices | 2 | X | X | ||||
| Peer-reviewed Literature | 2 | X | X | ||||
| Health Topics Covered in RFA | |||||||
| Physical activity | 5 | X | X | X | X | X | |
| Nutrition | 4 | X | X | X | X | ||
| Cancer screening | 4 | X | X | X | X | ||
| Tobacco | 3 | X | X | X | |||
| HPV vaccination | 2 | X | X | ||||
| Other cancer control activities | 1 | X | |||||
| Selection Criteria | |||||||
| Detailed plan | 6 | X | X | X | X | X | X |
| Organizational capacity | 6 | X | X | X | X | X | X |
| Budget justification | 6 | X | X | X | X | X | X |
| Documented need | 5 | X | X | X | X | X | |
| Diversity (e.g. geography, topics, etc.) | 5 | X | X | X | X | X | |
| Academic partner | 2 | X | X | ||||
| Evaluation plan | 2 | X | X | X | |||
| Impact/significance | 2 | X | X | ||||
| Number of Grantees | 44 | 16 | 7 | 3 | 5 | 2 | 11 |
| Number of Applicants | 105 | 40 | 11 | 12 | 6 | 4 | 32 |
| Number of Cohorts | 10 | 3 | 1 | 1 | 1 | 1 | 3 |
| Funding Period (in months) | 9–24 | 12 & 24 | 12 | 12 | 12 | 9–12 | 12 |
| Award Size (in dollars) | 1,000–10,000 | 4,000 & 8,000 | 1,000–4,000 | 10,000 | 10,000 | 3,500 | 4,500 & 8,000 |
| Types of Organizations Funded | |||||||
| Community-based organizations | 5 | X | X | X | X | X | |
| Health care | 4 | X | X | X | X | ||
| Coalitions/networks | 3 | X | X | X | |||
| Faith-based organizations | 3 | X | X | X | |||
| Schools | 3 | X | X | X | |||
| Worksites | 1 | X | |||||
Emory CPCRN Mini-Grants Program
The Emory CPCRN provided funding, training, and TA to organizations in largely rural southwest Georgia to implement evidence-based strategies to promote physical activity, healthy eating, and tobacco control. Over four funding cycles beginning in 2007, 23 community organizations in 16 southwest Georgia counties were awarded mini-grants ranging from $1,000 to $8,000 (median of $4,000) to support chronic disease prevention efforts. Sites selected strategies based on recommendations from the Community Guide, RTIPs, and CDC’s Healthy Communities Program. Mini-grants in the first three funding cycles (2007–2012) focused on implementing seven different evidence-based packaged programs (Program A). Training workshops were conducted for community practitioners on how to find, select, and implement evidence-based prevention programs.36 TA topics and process evaluation focused on implementation of core elements and program adaptation.17,30 The fourth round of mini-grants (2012–2014) focused on changing organizational policies and environments to promote healthy eating and physical activity and to reduce secondhand smoke exposure in faith-based organizations (Program B). These changes included strategies such as offering healthy foods and beverages at events, limiting unhealthy food options, promoting use of places to exercise such as walking trails and stairs, and implementing tobacco-free policies at organizational events. Training and TA shifted to focus on how grantees could formulate, enact, implement, and maintain policy and environmental changes within their organizations.37 The key distinction between the two programs was the switch from pre-packaged programs in Program A to policy and environmental change in Program B.
Texas A&M University CPCRN Mini-Grants Program
In 2010, the Texas A&M CPCRN established a mini-grant program to increase physical activity as a means of cancer prevention and control. Drawing on a community partner engagement model encouraged by the CDC’s CPCRN initiative, Texas A&M, with direct assistance from Emory University, introduced a mini-grant program to the Brazos Valley Health Partnership. As a result of initial meetings with this collaborative network, a request for applications was distributed, a training session was conducted, and five mini-grants of $10,000 each were awarded to community-based organizations in four counties in the Brazos Valley of Texas. Based on Community Guide evidence-based strategies for increasing physical activity, grantees implemented one or more of the following strategies in their projects: enhanced infrastructure, increased safety, and improved access to places for physical activity. Grantees were funded to make improvements (e.g., signs, water fountains, a pet waste disposal system) to an existing community trail; to create walking maps and improve dissemination strategies to increase walking; to create a downtown walking trail with maps and signage; to install lights around a park and walking trail; and to create a safe, fenced outdoor play space.
University of South Carolina CPCRN Mini-Grants Program
The South Carolina Cancer Prevention and Control Research Network implemented a mini-grant initiative to address cancer-related health disparities among high-risk populations across the state.19 Applicants were asked to implement an evidence-based intervention from the RTIPs website. The three mini-grant projects, funded at $10,000 each, adapted evidence-based programs to meet the self-identified needs of their communities, effectively using community health educators and partners to build public health capacity. The projects gave grantees a means of community building and improving community relations. While the focus of the projects was cancer related, the evidence-based programs promoted improvements in physical activity and healthful eating, potentially having a positive impact on various chronic conditions, including diabetes and obesity, in addition to cancer.
University of Texas (UT), Houston CPCRN Mini-Grants Program
The purpose of the UT CPCRN mini-grants program was to accelerate the adoption and use of effective cancer control programs by providing funds and TA to enable community-based organizations to adapt, implement, and evaluate such programs in underserved and minority communities in Texas. Building on the experience of the Emory CPCRN and others, the UT CPCRN initiated a mini-grants program in 2011. After a solicitation process that announced the program, UT funded two projects at $3,500 each (Program A). In 2012, the UT CPCRN began collaborating with other Texas programs to increase the resources available for mini-grants. Recognizing a common purpose of community capacity building and dissemination of evidence-based cancer control, two NCI-funded initiatives, the Community Networks Program and the Center for Clinical and Translational Sciences Community Engagement Component, along with the Cervical Cancer Free Texas program, contributed resources to the mini-grants program. Following feedback from community partners, the UT team increased the award from $3,500 to $8,000 after the first year and was able to increase the number of available awards. Identifying a need among community partners to better articulate plans and design evaluation approaches, the UT team recruited research fellows to assist applicants (with the evidence base and evaluation plans) and grant recipients (with implementation and evaluation). A broad range of evidence-based programs was acceptable. Examples of evidence-based approaches included interventions delivering one-on-one education and small media (e.g., brochures) to increase breast and cervical cancer screening and social support interventions (e.g., walking support groups) in community settings to increase physical activity. Topics were expanded from cancer screenings and HPV vaccination in 2011 (Program A) to include physical activity, nutrition, and other cancer control activities (Program B) in later years. In Program A, the Community Guide was not specifically mentioned in the RFA; it was added in Program B. Impact and significance were added as selection criteria in Program B, and training and additional forms of technical assistance were also added. These changes were made based on feedback from the Review Committee.
Results
Description of the Mini-Grants Programs
The mini-grant programs are described in Table 1. Across the six mini-grant programs, the most common sources for evidence-based strategies were the RTIPs website from the NCI10 and the Community Guide website from the CDC,11 with four programs each using them to identify strategies for dissemination. Other sources included evaluation data, CDC-identified best practices, and the peer-reviewed literature. The UT mini-grant program used a broader definition of evidence and allowed grantees more latitude in selecting a strategy as long as they could justify it. For example, in addition to RTIPs and the Community Guide, evidence could come from evaluations, the peer-reviewed literature, or websites such as the Center for Training and Research Translation (Center TRT), which lists both research- and practice-tested interventions.38
The mini-grants programs focused on both primary and secondary prevention of cancer. Across programs, physical activity was most commonly addressed, followed by nutrition and cancer screening. Tobacco use and HPV vaccinations were addressed by fewer of the grants programs.
Many similarities existed in selection criteria across the grant programs, partly because the programs built on each other’s work, Requests for Applications, and lessons learned. All of the programs assessed organizational capacity to manage the project, including leadership and experience. Assessment of other capacity indicators was more varied across the grant programs but included organizational structure, resources, and proposed staffing. All of the programs also assessed the quality of the work plans, with several paying attention to fidelity to the original evidence-based intervention and/or plans for adaptation. Five included community need for the program, and all six included budget justification as selection criteria. Most also considered the diversity of the applications, including geographic distribution, proposed topics, organizational capacity, and populations served. Two programs required an academic partner as part of their selection process. Two assessed the quality of the evaluation plan and the potential impact and significance of the proposed project.
Across 10 cohorts of grantees, 44 organizations were funded out of 105 applicants. The majority of the mini-grant programs funded community-based or health care organizations, with three programs also funding faith-based organizations, schools, and coalitions/networks. Worksites were funded in one program. Funding periods ranged from 9 to 24 months, with a median length of 12 months. Award size ranged from $1,000 to $10,000, with a median of $6,250.
Training and TA
All of the grant programs involved training, TA, or both (Table 2). Training took several forms across the mini-grant programs. Five programs included training for potential applicants. Formats included a six-hour in-person training, a one-hour conference call, and a two-hour meeting with community leaders combined with a one-hour training for interested applicants. Common topics included defining, finding, adapting, implementing, and evaluating evidence-based approaches. Detailed reviews of the Request for Applications were also common in these trainings.
Table 2.
Description of Training and Technical Assistance Provided through the CPCRN Mini-Grants Programs
| Description | Total Combined | Emory University (Program A) | Emory University (Program B) | University of South Carolina | Texas A&M | University of Texas (Program A) | University of Texas (Program B) |
|---|---|---|---|---|---|---|---|
| Training Audience | |||||||
| Potential applicants/applicants | 5 | X | X | X | X | X | |
| Grantees | 4 | X | X | X | X | ||
| Training Formats | |||||||
| Conference calls | 4 | X | X | X | X | ||
| In-person | 4 | X | X | X | X | ||
| Webinars | 3 | X | X | X | |||
| Training Topics | |||||||
| Request for Applications | 5 | X | X | X | X | X | |
| Defining evidence | 4 | X | X | X | X | ||
| Evaluation | 4 | X | X | X | X | ||
| Finding evidence-based approaches | 5 | X | X | X | X | X | |
| Adapting evidence-based approaches | 3 | X | X | X | |||
| Implementing evidence-based approaches | 3 | X | X | X | |||
| Sustainability | 3 | X | X | X | |||
| Health disparities | 1 | X | |||||
| Technical Assistance Topics | |||||||
| Administrative issues | 6 | X | X | X | X | X | X |
| Troubleshooting/addressing barriers | 6 | X | X | X | X | X | X |
| Budget | 5 | X | X | X | X | X | |
| Progress updates | 6 | X | X | X | X | X | X |
| Adaptation/fit | 4 | X | X | X | X | ||
| Defining evidence | 4 | X | X | X | X | ||
| Evaluation | 4 | X | X | X | X | ||
| Sharing resources and tools | 4 | X | X | X | X | ||
| Sustainability | 4 | X | X | X | X | ||
| Core elements | 3 | X | X | X | |||
| Finding evidence-based strategies | 3 | X | X | X | |||
| Implementation | 3 | X | X | X | |||
| Dissemination | 1 | X | |||||
| Average number of contacts via phone, e-mail and in-person (per month) | 2.2 | 3.1 | 2.9 | >1 | 2.5 | NA | NA |
NA = not applicable due to no documentation of TA
CPCRN mini-grant programs also provided trainings for the selected grantees. These included kick-off events and more in-depth training on the selected evidence-based strategies. Strategies for sustaining the evidence-based approach after grant funding ended were also covered in many of the trainings. TA was provided to grantees in all of the mini-grant programs. In the programs that documented TA, contact occurred monthly, with more frequent contact documented in some cases. Frequently covered topics included administrative and budgeting issues and defining evidence. Troubleshooting and issues related to implementation, including adaptation and identifying core elements of an evidence-based intervention, were also addressed through TA.
Evaluation
Evaluation questions varied by mini-grant program and covered a range of domains: implementation, context, adaptation, outcomes, sustainability, partnerships, reach, facilitators, barriers, and achievement of objectives (see Table, Supplemental Digital Content 1). Most of the mini-grant programs assisted grantees in evaluating their own progress, and several of the programs relied heavily on final reports produced by the grantees to evaluate progress and outcomes. Four programs required grantees to conduct their own evaluations (see Table, Supplemental Digital Content 2). For example, the University of South Carolina mini-grants program required an evaluation component, and faculty and staff liaisons provided TA to grantees who conducted their own evaluations and provided results in reports and presentations. The Texas A&M grants program was the only one in which grantees were encouraged, though not required, to conduct an outcome evaluation using pre- and post-surveys. The Emory programs were unique in that the Emory CPCRN conducted cross-site process evaluations with multiple data collection strategies (e.g., surveys, interviews, TA logs) in an effort to learn about the adaptation and implementation process. While Emory grantees were not required to conduct evaluations, several opted to do so on their own.
Selected Accomplishments
The CPCRNs captured and measured major accomplishments across a wide spectrum of outcomes, from dissemination products to leveraged funds for sustainability to health behavior change. In the two Emory programs, one rural coalition received a national foundation Healthy Communities grant and local foundation grants due in part to initial pilot work from the mini-grant. In addition, new rural Faith-Health Networks were formed in part because of the Emory partnerships and associated trainings, and the work was disseminated through three papers, two posters, and multiple conference presentations, often with community co-authors.17,30 The Emory CPCRN grants program also informed the development of a national training curriculum for the CPCRN.
The South Carolina CPCRN’s mini-grant program documented reach and health outcomes as assessed by the grantees. The intended combined reach of the South Carolina initiative was approximately 880 participants; the actual total reach of the three evidence-based community interventions was 1,072. Evaluations revealed increased physical activity, dietary improvements, and greater breast and cervical cancer screening participation among participants. The University of South Carolina grants program also resulted in a published manuscript and conference abstract with grantees as co-authors, and one grantee received additional funding from a state cancer alliance.
The Texas A&M mini-grant program enabled select grantees to leverage additional academic and business partnerships to expand and sustain their projects. The mini-grant mechanism also served as a model for two other programmatic areas (e.g., obesity prevention efforts). Positive results of the mini-grant program included increased access to physical activity, increased awareness of places for physical activity, changes in knowledge about the benefits of physical activity, and an overall increase in physical activity behaviors. Partners who were most successful in sustaining their programs were those who used creative approaches to leveraging additional partnerships. For example, one grantee that created a downtown walking trail was able to engage the local downtown business association to help with trail promotion and map dissemination. The downtown association has since taken on the project and even expanded the trail.
The University of Texas mini-grant program highlights sustainability of partnerships and leveraged funding as key outcomes. The program led to leveraged funding for a new project in which the grantee is the lead organization and the UT CPCRN is responsible for the evaluation. A second grantee has become a UT CPCRN partner in a project to provide navigation and cancer prevention services to underserved clients. A third grantee has been a partner in a CDC-funded cross-CPCRN project to evaluate special events. At least one of the grantees is also working on a manuscript and is planning to use the data to obtain a larger grant for a project in another location. These partnerships also enabled dissemination of evidence-based projects to populations that the researchers might not have reached on their own, for example, lesbian, gay, bisexual, and transgender populations.
Discussion
This paper describes mini-grants as a strategy for disseminating evidence-based approaches to cancer prevention and control and illustrates one approach to linking the science of prevention to its practice. One of the first considerations in designing each of the mini-grant programs was identifying which interventions to disseminate. The majority of the mini-grant programs chose to draw from RTIPs and the Community Guide, both well-established but still underutilized sources of evidence for prevention programs focused on cancer and related risk factors. In a survey of cancer control planners in eight states, Hannon et al. found that only 28% had ever used the Community Guide and just 13% had used RTIPs.22 Given the relatively high interest in the CPCRN mini-grant programs, our experience suggests that mini-grants may be an effective way to encourage community-based organizations to consider implementing evidence-based approaches from these federal government websites.
A related consideration is what counts as evidence. The standard for classifying programs as evidence-based varies by discipline and by funding agency but generally requires that the program be tested and the results reported in the peer-reviewed literature.1 Brownson and colleagues define four levels of evidence for decision making: evidence-based (review of multiple studies), effective (peer-reviewed study), promising (written program evaluation), and emerging (ongoing work or practice-based summary).2 The strongest evidence is generally based on a test of the program or policy in more than one controlled study (e.g., CDC’s Community Guide, which synthesizes across studies). Effective programs and strategies based on peer-reviewed studies are available in the peer-reviewed literature and on websites such as RTIPs. Examples of promising or emerging programs can be found in evaluation reports or on websites that report practice-tested interventions (e.g., the Center for Training and Research Translation).38 While practitioners and researchers may not agree on the definition of evidence, with their different perspectives valuing different kinds of evidence, funders of mini-grant programs need to be clear on what they will accept as evidence.2 That may vary from program to program, as it did across the CPCRN centers.
All of the grant programs involved training and TA. Aside from mini-grant administration, the topics covered can be categorized into general and innovation-specific capacity building per the Interactive Systems Framework.35 Examples of general capacity building include defining, finding, and adapting evidence-based approaches, evaluation, and sustainability. Innovation-specific content focused on troubleshooting and addressing barriers and on applying general capacities to the specific interventions being implemented (e.g., adapting an intervention to a specific organizational context). Across the mini-grant programs, trainings tended to address general capacity building, and TA tended to be more innovation-specific.
Interestingly, Wandersman et al. identify creating partnerships, developing leadership skills, and stabilizing infrastructure as general capacity building.35 In applying the model to teen pregnancy prevention, Lesesne and colleagues viewed a “science-based approach” as the innovation, whereas the CPCRN mini-grant programs viewed specific interventions as the innovation while at the same time building general capacity for an evidence-based approach.39 Regardless of how capacities are categorized (general or innovation-specific), all of the CPCRN mini-grant programs considered organizational capacity when selecting grantees. Leadership and experience were common selection criteria across programs; organizational structure, organizational resources, and staffing were also important indicators of organizational capacity in several of the programs.
The size of the mini-grants varied considerably, with $4,000, $8,000, and $10,000 awards being most common. CPCRNs giving the larger awards were encouraged to do so by community partners or were emphasizing environmental changes that required some capital expenditures (e.g., lighting, fencing, bike racks). With a few exceptions, the grants were not large enough to attract local government agencies but did attract a broad range of other types of organizations.
Another observation from the CPCRN mini-grant programs is that evaluation approaches varied widely. One CPCRN site conducted a cross-site evaluation focused on implementation, and others provided TA so grantees could conduct their own evaluations, typically focused on outcomes. Interestingly, none conducted rigorous outcome evaluations similar to the original study methods. This raises the question of when a rigorous outcome evaluation is warranted, given the resources and expertise needed to do it well. At a minimum, outcome evaluation should occur if the program has been adapted to such an extent that the underlying logic and/or core elements are altered, or if the intervention is implemented in a very different context or population such that determinants of the health problem may differ.40
Mini-grants to select, adapt, and implement cancer prevention and control interventions provide an excellent opportunity to explore the real challenges of translating research into practice.17,41 Future mini-grant efforts may benefit from an increased emphasis on evaluation and from common questions across programs so the field can begin to accumulate a knowledge base about this form of prevention support system. Exploring whether mini-grants are best used to disseminate programs or to promote environmental change would also be valuable, given the potential for the latter to have greater reach and impact.42 Relevant evaluation questions include: What level of capacity is needed for effective implementation? What are the critical organizational capacity domains that predict successful implementation? How do various contextual factors influence implementation? For specific interventions, which components can be adapted while maintaining sufficient fidelity to core elements to yield comparable outcomes? Can mini-grants lead to sustainable programs and/or environmental changes? A better understanding of the role of mini-grants as a dissemination strategy, as well as a deeper understanding of the dissemination and adaptation process, will help close the remaining gap between research and practice.
Supplementary Material
Acknowledgement
The authors thank the grantees for their participation in the project.
Sources of Funding
This publication was supported by the CDC and the National Cancer Institute through the Cancer Prevention and Control Research Network, a network within the CDC’s Prevention Research Centers Program (Emory University, U48DP001909; Texas A&M Health Science Center, U48DP001924; University of South Carolina, U48DP001936; and University of Texas at Houston, U48DP001949). The findings and conclusions in this report are those of the authors and do not necessarily represent the official position of the CDC and NCI.
Footnotes
Conflict of Interest
The authors report no conflicts of interest.
References
- 1. Brownson RC, Baker EA, Leet TL, Gillespie KN, True WR. Evidence-Based Public Health. 2nd ed. New York: Oxford University Press; 2010.
- 2. Brownson RC, Fielding JE, Maylahn CM. Evidence-based public health: a fundamental concept for public health practice. Annu Rev Public Health. 2009;30(1):175–201.
- 3. Institute of Medicine. Living Well with Chronic Illness: A Call for Public Health Action. Washington, DC: The National Academies Press; 2012.
- 4. U.S. Department of Health and Human Services. U.S. Department of Health and Human Services Inventory of Programs, Activities, and Initiatives Focused on Improving the Health of Individuals with Multiple Chronic Conditions. Office of the Assistant Secretary for Health; 2011.
- 5. 111th Congress. Patient Protection and Affordable Care Act: Public Law 111–148; 2010.
- 6. Kaiser Family Foundation. Health Care Costs: A Primer: Key Information on Health Care Costs and Their Impact. 2012. http://kff.org/health-costs/issue-brief/health-care-costs-a-primer/. Accessed August 29, 2014.
- 7. Berwick DM, Nolan TW, Whittington J. The triple aim: care, health, and cost. Health Aff. 2008;27(3):759–769.
- 8. Ory MG, Ahn S, Jiang L, et al. Successes of a national study of the Chronic Disease Self-Management Program: meeting the triple aim of health care reform. Med Care. 2013;51(11):992–998.
- 9. Ory M, Smith M. Exemplifying the evidence base for health promotion programs across populations and settings. Fam Community Health. 2012;35(3):188–191.
- 10. National Cancer Institute. Research-Tested Intervention Programs. Cancer Control PLANET. http://rtips.cancer.gov/rtips/. Accessed March 4, 2011.
- 11. Centers for Disease Control and Prevention. The Guide to Community Preventive Services. http://www.thecommunityguide.org. Accessed March 4, 2011.
- 12. Sanchez MA, Vinson CA, La Porta M, Viswanath K, Kerner JF, Glasgow RE. Evolution of Cancer Control PLANET: moving research into practice. Cancer Causes Control. 2012;23(7):1205–1212.
- 13. Ciliska D, Robinson P, Armour T, et al. Diffusion and dissemination of evidence-based dietary strategies for the prevention of cancer. Nutr J. 2005;4(1):13.
- 14. McCormack L, Sheridan S, Lewis M, et al. Communication and Dissemination Strategies to Facilitate the Use of Health-Related Evidence. Evidence Reports/Technology Assessments No. 213. Rockville, MD: Agency for Healthcare Research and Quality; 2013.
- 15. Rabin BA, Glasgow RE, Kerner JF, Klump MP, Brownson RC. Dissemination and implementation research on community-based cancer prevention: a systematic review. Am J Prev Med. 2010;38(4):443–456.
- 16. Allen P, Sequeira S, Jacob RR, et al. Promoting state health department evidence-based cancer and chronic disease prevention: a multi-phase dissemination study with a cluster randomized trial component. Implement Sci. 2013;8(1):141.
- 17. Carvalho ML, Honeycutt S, Escoffery C, Glanz K, Sabbs D, Kegler MC. Balancing fidelity and adaptation: implementing evidence-based chronic disease prevention programs. J Public Health Manag Pract. 2013;19(4):348–356.
- 18. Smith SA, Blumenthal DS. Efficacy to effectiveness transition of an educational program to Increase Colorectal Cancer Screening (EPICS): study protocol of a cluster randomized controlled trial. Implement Sci. 2013;8(1):86.
- 19. McCracken JL, Friedman DB, Brandt HM, et al. Findings from the community health intervention program in South Carolina: implications for reducing cancer-related health disparities. J Cancer Educ. 2013;28(3):412–419.
- 20. Dreisinger M, Leet TL, Baker EA, Gillespie KN, Haas B, Brownson RC. Improving the public health workforce: evaluation of a training course to enhance evidence-based decision making. J Public Health Manag Pract. 2008;14(2):138–143.
- 21. Glasgow RE, Lichtenstein E, Marcus AC. Why don’t we see more translation of health promotion research to practice? Rethinking the efficacy-to-effectiveness transition. Am J Public Health. 2003;93(8):1261–1267.
- 22. Hannon PA, Fernandez ME, Williams RS, et al. Cancer control planners’ perceptions and use of evidence-based programs. J Public Health Manag Pract. 2010;16(3):E1–E8.
- 23. Báezconde-Garbanati L, Beebe LA, Pérez-Stable EJ. Building capacity to address tobacco-related disparities among American Indian and Hispanic/Latino communities: conceptual and systemic considerations. Addiction. 2007;102(s2):112–122.
- 24. Campbell MK, Motsinger BM, Ingram A, et al. The North Carolina Black Churches United for Better Health Project: intervention and process evaluation. Health Educ Behav. 2000;27(2):241–253.
- 25. Collins C, Harshbarger C, Sawyer R, Hamdallah M. The diffusion of effective behavioral interventions project: development, implementation, and lessons learned. AIDS Educ Prev. 2006;18(4 Suppl A):5–20.
- 26. Elliott DS, Mihalic SF. Issues in disseminating and replicating effective prevention programs. Prev Sci. 2004;5(1):47–53.
- 27. Gingiss P, Boerm M, Roberts-Gray C. Follow-up comparisons of intervention and comparison schools in a state tobacco prevention and control initiative. J Sch Health. 2006;76(3):98–103.
- 28. Gyurcsik NC, Brittain DR. Partial examination of the public health impact of the People with Arthritis Can Exercise (PACE) program: reach, adoption, and maintenance. Public Health Nurs. 2006;23(6):516–522.
- 29. Hallfors D, Godette D. Will the ‘principles of effectiveness’ improve prevention practice? Early findings from a diffusion study. Health Educ Res. 2002;17(4):461–470.
- 30. Honeycutt S, Carvalho M, Glanz K, Daniel SD, Kegler MC. Research to reality: a process evaluation of a mini-grants program to disseminate evidence-based nutrition programs to rural churches and worksites. J Public Health Manag Pract. 2012;18(5):431–439.
- 31. Johnson DB, Smith LT, Bruemmer B. Small-grants programs: lessons from community-based approaches to changing nutrition environments. J Am Diet Assoc. 2007;107(2):301–305.
- 32. Sloane DC, Diamant AL, Lewis LB, et al. Improving the nutritional resource environment for healthy living through community-based participatory research. J Gen Intern Med. 2003;18(7):568–575.
- 33. Rohrbach LA, D’Onofrio CN, Backer TE, Montgomery SB. Diffusion of school-based substance abuse prevention programs. Am Behav Sci. 1996;39(7):919–934.
- 34. Rohrbach LA, Grana R, Sussman S, Valente TW. Type II translation: transporting prevention interventions from research to real-world settings. Eval Health Prof. 2006;29(3):302–333.
- 35. Wandersman A, Duffy J, Flaspohler P, et al. Bridging the gap between prevention research and practice: the Interactive Systems Framework for Dissemination and Implementation. Am J Community Psychol. 2008;41(3):171–181.
- 36. Escoffery C, Carvalho M, Kegler MC. Evaluation of the Prevention Programs That Work curriculum to teach use of public health evidence to community practitioners. Health Promot Pract. 2012;13(5):707–715.
- 37. Longest B. Health Policymaking in the United States. Chicago: Health Administration Press; 2009.
- 38. Center for Training and Research Translation. http://www.centertrt.org/. Accessed August 8, 2014.
- 39. Lesesne CA, Lewis K, White C, Green D, Duffy J, Wandersman A. Promoting science-based approaches to teen pregnancy prevention: proactively engaging the three systems of the Interactive Systems Framework. Am J Community Psychol. 2008;41:379–392.
- 40. Bartholomew LK, Parcel GS, Kok G, Gottlieb NH, Fernandez ME. Planning Health Promotion Programs: An Intervention Mapping Approach. 3rd ed. San Francisco: Jossey-Bass; 2011.
- 41. Vanderpool RC, Gainor SJ, Conn ME, Spencer C, Allen AR, Kennedy S. Adapting and implementing evidence-based cancer education interventions in rural Appalachia: real world experiences and challenges. Rural Remote Health. 2011;11(4):1807.
- 42. Frieden T. A framework for public health action: the health impact pyramid. Am J Public Health. 2010;100(4):590–595.