Published in final edited form as: Cancer Causes Control. 2022 Jul 17;33(9):1181–1191. doi: 10.1007/s10552-022-01604-8

Building implementation science capacity among practitioners of cancer control: Development of a pilot training curriculum

Joseph A Astorino a, Sarah Kerch a, Mandi L Pratt-Chapman a
PMCID: PMC9534372  NIHMSID: NIHMS1837709  PMID: 35842850

Abstract

Purpose

Cancer control interventions are difficult to implement with fidelity while also tailoring them to fit local contexts. Engaged approaches are recommended to advance equity. On-the-ground practitioners are needed as collaborators in the implementation process alongside research teams, but few trainings are designed with them in mind.

Methods

The Cancer Control Implementation Science Base Camp (CCISBC) was created to build capacity among cancer control practitioners to implement evidence-based cancer screening programs in specific contexts. Development of the curriculum included: 1) performing a literature review assessing extant curricula, 2) comparing competencies of these curricula, 3) applying user-centered design, 4) producing learning materials, 5) recruiting two teams to test a pilot, 6) running the pilot, and 7) evaluating results.

Results

Nine competencies overlapped across four of the curricula scanned in this study; all nine served as the basis for learning objectives. Principles that emerged from design sessions included: staying clear about terminology, supporting knowledge brokerage, reframing theories, models, and frameworks as tools, and including equity in everything. In pilot testing, learners' knowledge of implementing evidence-based cancer screening increased by an average of 74.5% and their confidence by 75%. Evidence suggests that the training increased learners' skill in implementing evidence-based interventions (EBIs) with a health equity lens.

Conclusions

To scale practice-based evidence, practitioners will need to be engaged, and this engagement is optimized when practitioners are trained to collaborate on implementation research. The CCISBC is a feasible program for developing capacity among comprehensive cancer control practitioners in order to optimize EBIs tailored to context.

Keywords: Capacity Building, Training, Implementation Science, Comprehensive Cancer Control Practitioners, Cancer Screening, Health Equity

Background

While many resources exist for researchers, few equip clinical and public health partners to understand implementation principles well enough to optimize the fit of interventions to specific contexts. We aimed to build capacity for co-creation between cancer control practitioners and researchers. Adult professionals have reported learning practice skills on the job or through self-study, while learning research skills in their formal education [1]. The literature shows that leaders of cancer control coalitions [2], navigators [3], and leaders of health departments [4] could benefit from more extensive knowledge of the science of implementation. Some scholars argue that all people who have a hand in implementation should be trained [5]. While this is not feasible to accomplish via a one-time event, training cancer control practitioners to build capacity for team-based implementation locally can extend the reach of evidence-based interventions (EBIs).

Educational initiatives, including mini-courses, bootcamps, workshops, toolkits, guidebooks, institutes, certificate programs, Master's and PhD programs, mentored training, and clinician-scholar programs, equip researchers [6,7,8], as well as some practitioners, with competencies in implementation science [9,10,11,12].

However, most available facilitated or synchronous implementation science trainings are intended for researchers. While guides and written materials exist for practitioners, there is no accessible, no-cost, facilitated, and interactive training for cancer control practitioners that aligns with adult learning approaches [9,13,14]. Systematic methods have been used to develop domains and competencies that inform training for cancer control practitioners generally [15]. Although these cancer control competencies overlap substantially with implementation science, they do not make implementation science specifically accessible to practitioners [16].

Some existing work fills this structured training gap. An example specific to cancer prevention and control is the Cancer Prevention and Control Research Network's (CPCRN) "Putting Public Health Evidence in Action Training Workshop." The CPCRN is a network of centers working to accelerate the adoption of evidence-based approaches in cancer prevention and control, and the workshop accordingly covers how to integrate evidence-based practices into program planning. The workshop facilitator's guide lists nine key objectives describing what trainees should be able to do upon completing the workshop [17]. The Implementation Science team at the National Cancer Institute has also produced Implementation Science at a Glance (ISG), a resource guide introducing the field to cancer control practitioners [18]. Organized through a phased approach, the workbook orients practitioners through the stages of implementation and aims to serve as a digestible first step into the implementation science field. The CCISBC builds on these key principles through the design of a facilitated, interactive training program.

The Center for Implementation (TCI) is a private organization that, for a fee, supports organizations in building implementation capacity. By scanning the published and grey literature, TCI produced a guidebook of nine core domains and thirty-seven competencies for implementation practitioners [11]. Similarly, a collaboration between the University of North Carolina and the National Implementation Research Network produced the "Implementation Support Practitioner Competencies" (UNC), with the goal of outlining the skills needed to build capacity to integrate EBIs into everyday practice [12]. This work was produced through a literature review, environmental scan, expert review, and content validation [12]. Recognizing that the linear pipeline model of knowledge translation was becoming obsolete in Canadian research sectors, scholars have also begun identifying proficiency standards for new careers in knowledge mobilization (KMB) work, specifically within new job roles titled "research impact practitioners" [10]. Eleven domains and eighty competencies were identified through stakeholder consultation, literature review, and an iterative process of categorization [10].

Table 2 compares domains and competencies across the curricular materials reviewed. The domains carried into the CCISBC from the sample curricula were those with the highest representation (3–8 occurrences).

Table 2:

Curricular Comparison Across Final Domains (Bold) and Competencies Included in Design

TIDIRC UCSF CU-MTDIRC CCP CPCRN ISG TCI UNC KMB
Introduction and Background to Implementation Science X X
Understand terminology, types of expertise, and distinction from other research fields X X
Understand equity as a process and product in implementation science
Appreciate multi-level approaches in Implementation Science
Understand the Problem & Assess the Context X X X X X
Use data to understand the problem X X X X X
Assess readiness X X X X
Understand the system, context, and culture X X X X X X X X
Assess contextual fit X X X X X X
Support individuals/groups to prioritize needs and opportunities X X X X X X X
Understand power structures and complex Challenges X X X X X X
Critically reflect on issue X X X X
Use Evidence and Theories to Inform All Aspects of Implementation X X X X X X
Synthesize and appraise evidence X X X X X X X X
Use evidence and theory to plan implementation strategies X X X X X X X X X
Adapt the program, practice, and/or implementation strategies to the local context X X X X X X X
Facilitate Implementation X X X X X
Identify champions X X X X X
Use process models and frameworks to guide implementation X X X X X X X
Develop and execute an implementation plan X X X X X X X
Address resistance to change X X X X
Develop action plans to address challenges X X X X
Conduct quality improvement cycles X X X X X X X
Evaluate X X X X X X
Use a framework to guide evaluation X X X X X X X X
Assess implementation quality X X X X X X
Sustainability Planning X X X
Assess factors that influence sustainability X X X X X X
Identify how to scale down an ineffective but often used intervention X
Establish a sustainability plan X

Domains were finalized, and the process described above was used to construct fifteen learning objectives, as described next. Table 3 presents the finalized curriculum design.

Table 3:

CCISBC Final Learning Domains and Objectives

Introduction
  • Define basic implementation science terms
  • Understand how implementation science can be used to improve cancer screening
  • Describe how to implement evidence-based interventions through a health equity lens

Assess Context
  • Explain the key elements of assessing context
  • Develop a plan to assess context

Finding Evidence
  • Describe sources and examples of evidence-based interventions (EBIs)

Adapting Evidence
  • Identify key process components of adapting an EBI

Implementation Strategies
  • Propose implementation strategies that fit the unique needs of a specific intervention

Facilitating
  • Determine implementation strategies based on context
  • Identify approaches to quality improvement
  • Identify needed adaptations to optimize success

Evaluation
  • Describe a framework's use in evaluation
  • Explore and identify measurable outcomes of implementation quality

Sustainability
  • Identify elements critical for sustaining an intervention
  • Describe how to integrate a sustainability tool into cancer screening implementation planning

Gaps in existing curricula

Despite these resources, access to tailored practitioner training is limited. Existing training programs accessible to practitioners often include learning domains related to choosing EBIs but are not designed with implementation science at the core. Another issue is confusion around implementer roles: formal roles for implementation practice are currently lacking. Several initiatives aim to clarify the training of "implementation practitioners" [12] as an emerging role. Implementation practitioners often work between front-line service delivery and management, supporting staff in delivering, sustaining, and scaling up evidence-based practices [9]. Their professional roles vary; implementation projects are often only one aspect of their job duties, and the individuals currently performing this work have differing capacity depending on their level in the organization [9]. Without this specificity, it is challenging to design tailored, user-centered curricula and training materials. Research calling for further clarification of this role has paved the way for core workforce competencies revolving around knowledge, attitudes, and skills [9]. By integrating the existing literature on training competencies for implementation researchers with emerging research on implementation practitioners, a curriculum was developed to distribute the work of implementation among diverse professionals, including cancer control practitioners.

The intended audience for this training was defined early on as Comprehensive Cancer Control (CCC) practitioners, including both program staff and coalition leadership. The National Comprehensive Cancer Control Program (NCCCP) was established by the Centers for Disease Control and Prevention (CDC) in 1998 to promote a collaborative approach for reducing the burden of cancer through evidence-based cancer control strategies [19]. NCCCP provides support to CCC programs in all 50 states, the District of Columbia, eight tribes or tribal organizations, and seven Pacific Island Jurisdictions and US territories. CCC practitioners from these funded programs and coalitions see themselves as "implementors," but not necessarily as implementation practitioners. Consequently, CCC practitioners can be considered one of the last stops in the knowledge pipeline [20]. Although other sectors have developed formal competency clusters for implementation support practitioners, there remains a need for an accessible, foundational implementation science training for practitioners in cancer control. Cancer control practitioners do not formally perform research, yet they are not on the ground enough to perform change management or knowledge mobilization within specific organizations. Rather, because of their rich understanding of the contextual differences across their regional landscapes, cancer control practitioners often act as knowledge brokers working between scientific research, clinical reality, and public health practice. Finding and adapting evidence-based practices to context is central to this role, and facilitation skills for working with different levels of evidence for each type of stakeholder are paramount. A recent assessment by the Comprehensive Cancer Control National Partnership (CCCNP) surveyed 66 CCC programs and coalitions (n=96 respondents) regarding their training and technical assistance needs [21]. Although most respondents (84.5%) reported some familiarity with implementation science, practitioners described challenges in applying findings from this research and in becoming involved in implementation studies. Sixty-one percent expressed interest in implementation science training for cancer control, and 54% were interested in engaging in practice-based research. The purpose of the CCISBC was to fill this gap in accessible, tailored implementation science training for cancer control practitioners working to apply lessons from implementation science to improve the fit and rigor of evidence-based cancer control practices.

Methods

Conceptual Framework

User-centered design was used to develop an understanding of the end-user to inform all decisions, such as recruitment, designing content, and evaluating the results.

Participant Recruitment

Evidence suggests that training multiple people together creates momentum for change in settings such as cancer control coalitions and programs [22]. Previous research on capacity-building initiatives within cancer control also identified interaction with peers as being correlated with satisfaction with the training [23]. Thus, teams were recruited for a pilot version of the training via a set of three videos explaining what implementation science is, how to assemble a team, and how to recruit partners for that team. Recruiting aimed to select teams from diverse geographic regions and historically excluded populations. Team member roles could include: coalition member or staff, cancer control program director or staff, executive staff from a screening site, at least one clinical or public health champion, and another person involved in screening for the coalition or program.

Curriculum Development

Development of the curriculum included: 1) scanning the environment and performing a literature review assessing extant curricula, 2) comparing the competencies of implementation science curricula, 3) employing user-centered design methods to develop an intuitive understanding of the learner to inform all decisions such as recruitment planning, designing content and activities, as well as evaluating the results, 4) producing learning materials, 5) recruiting teams from the cancer control audience to apply to the training program, 6) running the full CCISBC pilot model with two teams and 7) evaluating the pilot program.

Scan of Existing Curricula.

A broad sample of published research about curriculum development and curricular materials was identified through a scan of the literature. Domains and competencies across nine sources were analyzed to examine wording and scope: the Training Institute for Dissemination and Implementation Research in Cancer (TIDIRC) program, the University of California San Francisco (UCSF) online certificate program, the University of Colorado Anschutz (CU) graduate certificate program, Washington University in St. Louis's Mentored Training in Dissemination and Implementation Research in Cancer (MT-DIRC) program, the Cancer Control Practitioner Competencies, the Cancer Prevention and Control Research Network's (CPCRN) workshop, Implementation Science at a Glance, The Center for Implementation, the University of North Carolina and National Implementation Research Network Implementation Support Practitioner Competencies, and the Knowledge Mobilization standards [24,7,8,13,15,17,18,11,12,10]. Each curriculum document was examined, and a list of all domains and competencies was generated to align competencies across sources. The materials were then analyzed for the presence or absence of each domain and competency; when a domain or competency matched, a linked cell with the matching text was created within a table, enabling comparison across different styles of implementation research and practice. High-priority competencies (those that appeared across multiple sources) guided the selection of competencies and learning objectives. Through content analysis and comparison of the text of each domain and competency across materials, a preliminary outline was created based on frequency and end-user needs.
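To illustrate how such a presence/absence comparison can be operationalized, the sketch below tallies competency frequency across curricula. It is a minimal sketch: the entries shown are illustrative placeholders patterned on Table 2, not the study's actual coding file.

```python
# Illustrative sketch of the presence/absence tally used to prioritize
# competencies across curricula. Labels and memberships are placeholders
# patterned on Table 2, not the study's actual coding data.

# Each competency maps to the set of curricula in which it appeared.
coding = {
    "Use evidence and theory to plan implementation strategies":
        {"TIDIRC", "UCSF", "CU-MTDIRC", "CCP", "CPCRN", "ISG", "TCI", "UNC", "KMB"},
    "Assess contextual fit":
        {"UCSF", "CPCRN", "ISG", "TCI", "UNC", "KMB"},
    "Establish a sustainability plan":
        {"TCI"},
}

# Rank competencies by how many of the nine curricula include them;
# high-frequency items were candidates for CCISBC learning objectives.
ranked = sorted(coding.items(), key=lambda kv: len(kv[1]), reverse=True)
for competency, sources in ranked:
    print(f"{len(sources)}/9  {competency}")
```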

Subject matter expert feedback.

A steering committee (n=11) composed of diverse stakeholders was recruited to inform the project. Members included cancer control practitioners and implementation science experts from sectors such as academia, government, healthcare, and consulting. Feedback was sought from the five cancer control practitioners, who advocated for the needs of the intended audience. Scientists provided perspectives on what their practice partners would need to know in order to collaborate on implementation research projects. Priority competencies were determined via brainstorming meetings, workgroups, and iterative reviews. Six domains and fourteen competencies were prioritized and then translated into fifteen specific learning objectives that framed each session of the CCISBC. All steering committee and workgroup meetings were recorded and transcribed.

Core competencies.

Brownson et al. defined a competency as "a cluster of related knowledge, attitudes, and skills that affect the major part of one's job and can be measured against well-accepted standards and improved through training" [15]. Table 2 shows the comparison of curricular materials. The highest-frequency domains included "Assessing the Context," "Using Evidence to Inform All Aspects of Implementation," "Facilitate Implementation," and "Evaluate Implementation." Although "Inspiring Stakeholders and Developing Relationships" was also a high-frequency item, the steering committee judged that, while the topic is important, training on how to do it is not necessary for this audience; it was therefore eliminated from the curriculum design. Other domains from the sample curricula that were not included were those with only one occurrence, those related to advanced research (measures, designs), and those related to relational practices (communication, team building, facilitation, change management, leadership, etc.). Relational domains were excluded because the purpose of this training program is not to refine people-management skills; rather, these topics were discussed throughout the training and refracted through the key stages of implementation. The domains of policy and managing legal issues are specialized topics that may not be general enough to be useful across most cancer control settings. Some items appeared as both domains and competencies depending on the curriculum (e.g., communication, adaptation/fidelity). Similarly, adapting interventions was listed as a competency in nearly all curricula but was elevated to a domain in one of the curricular materials. Topics such as multi-level approaches and health equity were not summarized in the published literature in a way that emphasizes these domains, although they are implicitly included and cut across multiple sequential phases of implementation; thus we added these as specific competencies. For example, a multi-level approach cuts across sequential domains because it is a perspective that affects how each implementation phase is approached, making it crucial to include individual, group/team, organizational, and community levels at every stage of the process [25]. Health equity was only implicit as a domain or competency in the literature reviewed for this paper, but research demonstrating the centrality of this topic led to including it as a foundational principle and outcome [26, 27].

Figure 1 focuses on a sample of four curricular materials representing the diversity of implementation capacity-building initiatives, highlighting overlapping and unique competencies. Nine competencies overlapped among all four curricula, and all nine were developed into learning objectives for the CCISBC training. Unique competencies demonstrated different approaches to training, highlighting skill clusters such as facilitation, execution, project management, or research. Each of these was discussed by the project team and steering committee, and relevant competencies were tailored, added, or eliminated based on the objectives of the CCISBC. By comparing domains and competencies in Table 2 and Figure 1, learning objectives were prioritized.

Figure 1. Overlapping and unique competencies among a sample of four curricular materials extracted from the environmental scan. CPCRN = Cancer Prevention and Control Research Network; TCI = The Center for Implementation; UNC = University of North Carolina Implementation Support Practitioner Competencies; CU-MTDIRC = University of Colorado Graduate Certificate in Implementation Science and Washington University Mentored Training in D/I Research Program. Note: numbers represent shared learning competencies between curricula.

Pilot curriculum.

Slide decks based on content catalogued from the literature review and environmental scan were produced and presented by six two-person teams, each composed of an implementation researcher and a cancer control practitioner. Interactive questions were developed to center the learner's experience and demonstrate the variation among learners. A companion guide integrated tools in a seamless format with accessible language. A hypothetical case study made each session come to life for learners, with details demonstrating how implementation science can be used to solve real-world problems. An implementation blueprint was developed by adapting the implementation research logic model [28]. This interactive process allowed teams to create a logic model collaboratively by developing a focused objective, mapping context, and selecting EBIs. Importantly, learners were guided in determining the core and adaptable components of the intervention, a key step in tailoring for equitable implementation, as well as in choosing implementation strategies, evaluation metrics, and sustainability factors. To develop facilitation competencies, three case studies of breast, colorectal, and lung cancer screening were chosen and plain-language summaries were written. The authors of each study were contacted and agreed to participate in an interactive panel. Learners read these case studies before the session and then had time in breakout groups to develop questions for the panelists. Predetermined questions focused on common competencies from the environmental scan were also included and sent in advance to all panelists.
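To make the blueprint's structure concrete, here is a hypothetical sketch of the fields such a logic-model activity collects. The field names are assumptions drawn from the narrative above, not the actual CCISBC instrument.

```python
# Hypothetical sketch of an implementation blueprint record, loosely
# modeled on the implementation research logic model [28]. Field names
# are assumptions based on the description above, not the real tool.
from dataclasses import dataclass, field

@dataclass
class ImplementationBlueprint:
    objective: str                     # focused objective for the team
    context: list[str]                 # findings from the context assessment
    ebis: list[str]                    # selected evidence-based interventions
    core_components: list[str]         # elements that must stay intact
    adaptable_components: list[str]    # elements that can be tailored
    strategies: list[str]              # chosen implementation strategies
    evaluation_metrics: list[str]      # outcomes to track
    sustainability_factors: list[str] = field(default_factory=list)

# Illustrative usage with made-up content.
blueprint = ImplementationBlueprint(
    objective="Increase colorectal cancer screening at clinic X within 12 months",
    context=["rural catchment", "limited navigation staff"],
    ebis=["mailed FIT outreach"],
    core_components=["mailed test kit", "reminder call"],
    adaptable_components=["letter language", "delivery timing"],
    strategies=["identify champions", "audit and feedback"],
    evaluation_metrics=["reach", "adoption", "fidelity"],
)
```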

Evaluation of Pilot study

Evaluation was developed using the Kirkpatrick Model as a framework [29]. This model structured the evaluation questions around primary reactions, short-term knowledge outcomes, and longer-term behavioral outcomes. Questions asked about the perceived quality of the training as well as changes in knowledge, motivation, intentions, and capacity as a result of participating. Process evaluation included tracking the number of applications, the demographics of applicants, attendance, the number of people completing the pre- and post-workshop survey, and the number of teams completing and submitting the implementation blueprint (a modified logic model). Initial reactions and short-term changes in knowledge were measured with a survey at the end of the training. The majority of questions were developed based on the learning objectives. This process had been used for a similar training developed for practitioners of knowledge translation; a retrospective pre-post design was added as an innovation for the CCISBC evaluation plan [30]. Other questions related to satisfaction with content and delivery, overall accessibility, the time allotted to different components, and self-reported assessment of each team's implementation blueprint. Lastly, the survey asked about self-reported knowledge gains for implementing cancer screening interventions. Descriptive analysis was run on the results, and the change in mean scores from pre- to post-pilot was analyzed.
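As a rough sketch of this descriptive analysis, the code below computes the mean, standard deviation, and mean change for one retrospective pre-post item. It assumes 5-point Likert responses; the response vectors are hypothetical illustrations, not the pilot's raw data.

```python
# Minimal sketch of the descriptive retrospective pre-post analysis.
# Assumes 5-point Likert items (1 = low, 5 = high); the response
# vectors below are hypothetical, not the pilot's actual data.
from statistics import mean, stdev

def summarize(item: str, pre: list[int], post: list[int]) -> None:
    """Report M (SD) before and after the training, plus mean change."""
    change = mean(post) - mean(pre)
    print(f"{item}: pre {mean(pre):.2f} ({stdev(pre):.3f}), "
          f"post {mean(post):.2f} ({stdev(post):.3f}), change {change:+.2f}")

# Example item with n = 6 respondents (illustrative values only).
summarize(
    "Basic implementation science terms",
    pre=[1, 2, 2, 2, 3, 2],
    post=[4, 4, 4, 5, 4, 4],
)
```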

Results

Findings

Six of eight pilot participants completed the evaluation of the initial pilot phase. The majority were CCC program staff. Evaluation data were positive, with 33% strongly agreeing and 66% agreeing that they were satisfied with the content. Most (83%) agreed that the content and delivery of the training were accessible. A majority (66%) of learners also felt they could apply the plan created in team huddles to their coalition's work, and 91% felt they could put lessons learned from the training into their work. In terms of implementing evidence-based cancer screening, learners' knowledge increased by an average of 74.5% and their confidence by 75% in these key areas. Lastly, a series of retrospective pre-post questions was used to assess effectiveness (see Table 4).

Table 4:

Selected Summary of Retrospective Pre-Post Evaluation Data (n=6)

How do you rate your ability to explain the following concepts? | Knowledge Before Base Camp, M (SD) | Knowledge After Base Camp, M (SD) | Change from Pre- to Post-intervention, M
How implementation science can be used to improve cancer screening | 2.33 (1.211) | 4.33 (0.516) | +2.00
How to implement evidence-based interventions through a health equity lens | 3.50 (1.049) | 4.17 (0.753) | +0.67
Basic implementation science terms | 2.00 (0.894) | 4.17 (0.408) | +2.17
Sources and examples of EBI | 4.33 (0.516) | 4.83 (0.408) | +0.50
Factors critical for sustaining an intervention | 3.17 (0.983) | 4.17 (0.408) | +1.00

Comparing means across questions measuring learning-objective knowledge, the evaluation data suggest the training was most effective at increasing understanding of how implementation science can be used to improve cancer screening, from pre-training (M=2.33, n=6) to post-training (M=4.33, n=6). The mean also increased for the learning objective "how to implement EBIs through a health equity lens," from pre-training (M=3.50, n=6) to post-training (M=4.17, n=6), and knowledge of implementation science terminology increased from pre-training (M=2.00, n=6) to post-training (M=4.17, n=6). Other notable knowledge increases occurred for sourcing and sustaining EBIs. These data suggest that the CCISBC content is effective at meeting its primary goals and objectives.

On the other hand, 50% of respondents were neutral about the delivery method, as shown by responses to the statement "I am satisfied with the delivery." In particular, all learners felt there was not enough time. Suggestions for improvement included restructuring the sequence and timing and making the materials easier to navigate, both of which will be addressed in future iterations.

Discussion

Cancer screening and early detection is a priority of the NCCCP, but there is often a gap in knowledge about how to implement EBIs for screening [31]. Research indicates that cancer control practitioners can benefit from improved understanding of how to integrate EBIs into strategic plans, such as CCC plans, and strategies to implement these interventions [32].

The CCISBC contributes to the integration and implementation of EBIs in several ways. First, it promotes a shared language. Terminology often originates from fields ranging from epidemiology to economics, and the meaning of disciplinary concepts may differ from the definitions of the same terms within implementation science. Researchers also use terminology differently than practitioners in some cases, and concepts from one sector often mean something different in another (e.g., research, evaluation, quality improvement). To overcome these challenges, we developed instructive definitions, a glossary, and an interactive space to tag concepts that were unclear throughout the training.

Second, the CCISBC provided applied practice in tailoring cancer screening EBIs to diverse contexts. Examples from breast, colorectal, and lung cancer within case studies, anecdotes, and panelist presentations helped target the instructional material at a level of granularity that showed the value of the training for current job duties. We created interactive team huddles to model this domain through action. To structure the instructional content related to evidence, we broke the large domain of "Using Evidence and Theories to Inform All Aspects of Implementation" into several smaller domains, and learners were supported in adapting evidence based on contextual needs.

Third, we made implementation frameworks accessible through plain language and practical tools. Drawing from the Consolidated Framework for Implementation Research (CFIR) and RE-AIM, we created an implementation blueprint activity for teams to practice applying tools to solve real problems. We also provided access to tools for solving other problems after the training, such as the Framework for Reporting Adaptations and Modifications-Enhanced (FRAME) for adapting evidence [33], the Expert Recommendations for Implementing Change (ERIC) menu of implementation strategies [34], and the Program Sustainability Assessment Tool (PSAT) for assessing the sustainability of interventions [35].

One weakness of extant frameworks was a lack of centering equity [36]. Thus, we integrated equity into each step of the CCISBC training. SMARTIE objectives (the traditional Specific, Measurable, Achievable, Relevant, and Time-bound framework with Inclusion and Equity added) set the tone for teams to add equity components to their planning and to include the population of focus in each step. Spending significant time on contextual assessment can also ensure a match between an intervention and its setting, potentially preventing an intervention from making conditions worse [37]. Similarly, adapting interventions proactively can address health inequities [38]. Including critical reflection and quantitative measures of equity in an evaluation plan builds accountability into the quality of implementation, especially in marginalized communities. Lastly, sustainability of interventions is an equity problem: communities with fewer resources often need to build capacity in order to sustain changes such as screening interventions. Thus, several "equity reflections" were included throughout the training to demonstrate that equity as process is as important as equity as outcome.

Our study has limitations. Evaluation results underrepresent those without the capacity to attend a three-day synchronous training due to staffing constraints or time-zone differences. The initial pilot study also had a small sample, and teams had difficulty recruiting executive and clinical stakeholders to attend the training. Given these limitations, the findings presented here reflect only the individual team members who attended the pilot and cannot be generalized to other cancer control coalitions or programs. These data were nevertheless useful for refining future iterations of the CCISBC.

Conclusion

The CCISBC aims to elevate practice-based evidence as a means of achieving equitable implementation by developing practitioners' capacity to collaborate with researchers. Interest in implementation science within the cancer control arena is growing. If academics are going to scale practice-based evidence, practitioners need to be engaged, and this is only possible when practitioners' capacity to collaborate in fields such as implementation science is increased. The CCISBC is a feasible, applied program with strong preliminary findings for enhancing cancer control practitioners' capacity to engage in evidence-based research and practice with an implementation science foundation. The CCISBC will be regularly available as part of the George Washington University Cancer Center's technical assistance offerings for cancer control practitioners. Materials for applying to participate in this training synchronously [39] or asynchronously [40], as well as the other implementation science trainings and tools reviewed in this article, are available on the technical assistance portal [41].

Table 1:

Curricular Training Materials Reviewed

Name | Intended Audience | Training Format | Documentation of Domains and Competencies
Training Institute for Dissemination and Implementation Research in Cancer (TIDIRC) | Researchers | Facilitated course and open-access course | Online modules
University of California San Francisco Certificate in Implementation Science (UCSF) | Researchers | Online certificate | Peer-reviewed article describing curriculum development
University of Colorado Graduate Certificate Program (CU-MTDIRC) | Researchers | Synchronous online classes | Non-peer-reviewed document describing evaluation
Developing Competencies for Training Practitioners in Evidence-Based Cancer Control (CCP) | Practitioners | Not affiliated with specific training | Peer-reviewed article
Cancer Prevention and Control Research Network: Putting Public Health Evidence in Action (CPCRN) | Practitioners | Training materials available for download | Toolkit
National Cancer Institute's Implementation Science at a Glance (ISG) | Practitioners | Not affiliated with specific training | Guidebook
The Center for Implementation (TCI) | Practitioners | Online training program | Core competencies guide used to develop trainings
University of North Carolina Implementation Support Practitioner Profile (UNC) | Practitioners | Learning hub | Peer-reviewed article describing curriculum development
Development of a Framework for Knowledge Mobilization and Impact Competencies (KMB) | Practitioners | Not affiliated with specific training | Peer-reviewed article

Acknowledgements and Funding

We are grateful to Rachel Silber, Leila Habib, and Sarah Adler for providing contributions to the CCISBC. We are also grateful to Steering Committee Members: Heather Brandt, Christi Cahill, David Chambers, Gloria Coronado, Shauntay Davis-Patterson, Polly Hager, Erin Hahn, Caleb Levell, Tamara Robinson, Randy Schwartz, and Kelly Wells Sittig; their input was invaluable to the development and delivery process. This project was supported by Cooperative Agreement #NU58DP006461-03 from the Centers for Disease Control and Prevention (CDC). The views expressed in written workshop materials or publications and by speakers and moderators do not necessarily reflect the official policies of the Department of Health and Human Services, nor does the mention of trade names, commercial practices, or organizations imply endorsement by the U.S. Government.

List of Abbreviations

CCC: Comprehensive Cancer Control
CCCNP: Comprehensive Cancer Control National Partnership
CCISBC: Cancer Control Implementation Science Base Camp
CCP: Cancer Control Practitioners
CDC: Centers for Disease Control and Prevention
CFIR: Consolidated Framework for Implementation Research
CPCRN: Cancer Prevention and Control Research Network
CU: University of Colorado
EBIs: Evidence-based interventions
ERIC: Expert Recommendations for Implementing Change
FRAME: Framework for Reporting Adaptations and Modifications-Enhanced
ISG: Implementation Science at a Glance
KMB: Knowledge Mobilization Competencies
MT-DIRC: Mentored Training in Dissemination and Implementation Research in Cancer
NCCCP: National Comprehensive Cancer Control Program
NCI: National Cancer Institute
PSAT: Program Sustainability Assessment Tool
TCI: The Center for Implementation
TIDIRC: Training Institute for Dissemination and Implementation Research in Cancer
UCSF: University of California San Francisco
UNC: University of North Carolina

Footnotes

Competing interests

No competing interests.

Ethics approval and consent to participate

The project received ethical approval via waiver by the Institutional Review Board of the George Washington University.

Availability of data and materials

Data sharing not applicable to this article as no datasets were generated or analyzed during the current study.

References:

1. Schultes MT, Aijaz M, Klug J, Fixsen DL. Competences for implementation science: what trainees need to learn and where they learn it. Adv Health Sci Educ. 2021;26(1):19–35. doi:10.1007/s10459-020-09969-8
2. Hayes NS, Hohman K, Vinson C, Pratt-Chapman M. Comprehensive cancer control in the U.S.: summarizing twenty years of progress and looking ahead. Cancer Causes Control. 2018;29(12):1305–1309. doi:10.1007/s10552-018-1124-y
3. Shockney L. Team-Based Oncology Care: The Pivotal Role of Oncology Navigation; 2018. Accessed September 28, 2021. https://search.ebscohost.com/login.aspx?direct=true&scope=site&db=nlebk&db=nlabk&AN=1782174
4. Allen P, Jacob RR, Lakshman M, Best LA, Bass K, Brownson RC. Lessons Learned in Promoting Evidence-Based Public Health: Perspectives from Managers in State Public Health Departments. J Community Health. 2018;43(5):856–863. doi:10.1007/s10900-018-0494-0
5. Proctor EK, Chambers DA. Training in dissemination and implementation research: a field-wide perspective. Transl Behav Med. 2017;7(3):624–635. doi:10.1007/s13142-016-0406-8
6. Padek M, Colditz G, Dobbins M, et al. Developing educational competencies for dissemination and implementation research training programs: an exploratory analysis using card sorts. Implement Sci. 2015;10(1):114. doi:10.1186/s13012-015-0304-3
7. Gonzales R, Handley MA, Ackerman S, O'Sullivan PS. A framework for training health professionals in implementation and dissemination science. Acad Med. Published online January 2012. doi:10.1097/ACM.0b013e3182449d33
8. Morrato EH, Rabin B, Proctor J, et al. Bringing it home: expanding the local reach of dissemination and implementation training via a university-based workshop. Implement Sci. 2015;10(1):94. doi:10.1186/s13012-015-0281-6
9. Albers B, Metz A, Burke K. Implementation support practitioners – a proposal for consolidating a diverse evidence base. BMC Health Serv Res. 2020;20(1):368. doi:10.1186/s12913-020-05145-1
10. Bayley JE, Phipps D, Batac M, Stevens E. Development of a framework for knowledge mobilisation and impact competencies. Evid Policy J Res Debate Pract. 2018;14(4):725–738. doi:10.1332/174426417X14945838375124
11. Moore J, Khan S. Core Competencies for Implementation Practice. Published online April 2020. https://static1.squarespace.com/static/5b1150d95ffd205e7185bf2d/t/5e9ed8d8e188753f74bf6ce4/1587468523111/Core+Competencies+for+Implementation+Practice_v5.pdf
12. Metz A, Louison L, Ward C, Burke K. Implementation support practitioner profile: Guiding principles and core competencies for implementation practice. Published online 2017. https://nirn.fpg.unc.edu/sites/nirn.fpg.unc.edu/files/imce/images/IS%20Practice%20Profile-single%20page%20printing-v10-November%202020_1.pdf
13. Meissner HI, Glasgow RE, Vinson CA, et al. The U.S. training institute for dissemination and implementation research in health. Implement Sci. 2013;8(1):12. doi:10.1186/1748-5908-8-12
14. Davis R, D'Lima D. Building capacity in dissemination and implementation science: a systematic review of the academic literature on teaching and training initiatives. Implement Sci. 2020;15(1):97. doi:10.1186/s13012-020-01051-6
15. Brownson RC, Ballew P, Kittur ND, et al. Developing Competencies for Training Practitioners in Evidence-Based Cancer Control. J Cancer Educ. 2009;24(3). doi:10.1080/08858190902876395
16. Ramaswamy R, Mosnier J, Reed K, Powell BJ, Schenck AP. Building capacity for Public Health 3.0: introducing implementation science into an MPH curriculum. Implement Sci. 2019;14(1):18. doi:10.1186/s13012-019-0866-6
17. Friedman DB, Escoffery C, Noblet SB, Agnone CM, Flicker KJ. Building Capacity in Implementation Science for Cancer Prevention and Control Through a Research Network Scholars Program. J Cancer Educ. Published online July 9, 2021. doi:10.1007/s13187-021-02066-3
18. National Cancer Institute. Implementation Science at a Glance. https://cancercontrol.cancer.gov/sites/default/files/2020-07/NCI-ISaaG-Workbook.pdf
19. Major A, Stewart SL. Celebrating 10 years of the National Comprehensive Cancer Control Program, 1998 to 2008. Prev Chronic Dis. 2009;6(4):A133.
20. Brown CH, Curran G, Palinkas LA, et al. An Overview of Research and Evaluation Designs for Dissemination and Implementation. Annu Rev Public Health. 2017;38(1):1–22. doi:10.1146/annurev-publhealth-031816-044215
21. Comprehensive Cancer Control, American Cancer Society. 2021 Comprehensive Cancer Control National Partnership Survey; 2021:1–10.
22. Jacobs JA, Duggan K, Erwin P, et al. Capacity building for evidence-based decision making in local health departments: scaling up an effective training approach. Implement Sci. 2014;9:124. doi:10.1186/s13012-014-0124-x
23. Mainor AG, Decosimo K, Escoffrey C, et al. Scaling Up and Tailoring the "Putting Public Health in Action" Training Curriculum. Health Promot Pract. 2018;19(5):664–672. doi:10.1177/1524839917741486
24. National Cancer Institute. Training Institute for Dissemination and Implementation Research in Cancer (TIDIRC) OpenAccess. https://cancercontrol.cancer.gov/is/training-education/training-in-cancer/TIDIRC-open-access
25. Neta G. Ensuring the Value of Cancer Research: Opportunities in Implementation Science. Trends Cancer. 2021;7(2):87–89. doi:10.1016/j.trecan.2020.10.003
26. Chinman M, Woodward EN, Curran GM, Hausmann LRM. Harnessing Implementation Science to Increase the Impact of Health Equity Research. Med Care. 2017;55(Suppl 2):S16–S23. doi:10.1097/MLR.0000000000000769
27. Baumann AA, Cabassa LJ. Reframing implementation science to address inequities in healthcare delivery. BMC Health Serv Res. 2020;20(1):190. doi:10.1186/s12913-020-4975-3
28. Smith JD, Li DH, Rafferty MR. The Implementation Research Logic Model: a method for planning, executing, reporting, and synthesizing implementation projects. Implement Sci. 2020;15(1):84. doi:10.1186/s13012-020-01041-8
29. Kirkpatrick DL, Kirkpatrick JD. Evaluating Training Programs: The Four Levels. 3rd ed. San Francisco, CA: Berrett-Koehler; 2006.
30. Moore JE, Rashid S, Park JS, Khan S, Straus SE. Longitudinal evaluation of a course to build core competencies in implementation practice. Implement Sci. 2018;13(1):106. doi:10.1186/s13012-018-0800-3
31. Brouwers MC, De Vito C, Bahirathan L, et al. What implementation interventions increase cancer screening rates? A systematic review. Implement Sci. 2011;6(1):111. doi:10.1186/1748-5908-6-111
32. Soori M, Platz EA, Kanarek N. Inclusion of Evidence-Based Breast Cancer Control Recommendations and Guidelines in State Comprehensive Cancer Control Plans. Prev Chronic Dis. 2020;17:200046. doi:10.5888/pcd17.200046
33. Wiltsey Stirman S, Baumann AA, Miller CJ. The FRAME: an expanded framework for reporting adaptations and modifications to evidence-based interventions. Implement Sci. 2019;14(1):58. doi:10.1186/s13012-019-0898-y
34. Powell BJ, Waltz TJ, Chinman MJ, et al. A refined compilation of implementation strategies: results from the Expert Recommendations for Implementing Change (ERIC) project. Implement Sci. 2015;10(1):21. doi:10.1186/s13012-015-0209-1
35. Calhoun A, Mainor A, Moreland-Russell S, Maier RC, Brossart L, Luke DA. Using the Program Sustainability Assessment Tool to Assess and Plan for Sustainability. Prev Chronic Dis. 2014;11:130185. doi:10.5888/pcd11.130185
36. Woodward EN, Matthieu MM, Uchendu US, Rogal S, Kirchner JE. The health equity implementation framework: proposal and preliminary study of hepatitis C virus treatment. Implement Sci. 2019;14(1):26. doi:10.1186/s13012-019-0861-y
37. Senier L, McBride CM, Ramsey AT, Bonham VL, Chambers DA. Blending Insights from Implementation Science and the Social Sciences to Mitigate Inequities in Screening for Hereditary Cancer Syndromes. Int J Environ Res Public Health. 2019;16(20):3899. doi:10.3390/ijerph16203899
38. Brownson RC, Kumanyika SK, Kreuter MW, Haire-Joshu D. Implementation science should give higher priority to health equity. Implement Sci. 2021;16(1):28. doi:10.1186/s13012-021-01097-0
39. Cancer Control Implementation Science Base Camp. Accessed May 25, 2022. https://cancercontroltap.smhs.gwu.edu/news/cancer-control-implementation-science-base-camp-0
40. GW School of Medicine and Health Sciences. Accessed May 25, 2022. https://cme.smhs.gwu.edu
41. Cancer Control Technical Assistance Portal. Accessed May 25, 2022. https://cancercontroltap.smhs.gwu.edu
