
Incorporating implementation science principles into curricular design

Michael Gottlieb 1, Julie Bobitt 2, Pavitra Kotini‐Shah 3, Shaveta Khosla 3, Dennis P Watson 4

Abstract

Implementation science (IS) is an approach focused on increasing the application of evidence‐based health interventions in practice through purposive and thoughtful planning to maximize uptake, scalability, and sustainability. Many of these principles can be readily applied to medical education to help augment traditional approaches to curriculum design. In this paper, we summarize key components of IS with an emphasis on application to the medical educator.

INTRODUCTION

Implementation science (IS) is “the scientific study of methods to promote the systematic uptake of research findings and other [evidence‐based practices] into routine practice to improve the quality and effectiveness of health services.” 1 There are often notable time delays when translating knowledge into practice, with some literature suggesting that the process may take up to 17 years. 2 , 3 IS evolved out of a need to improve translation of evidence‐based medicine to practice and has been increasingly utilized for clinical applications. 4 By planning the design and implementation of an intervention with a focus on uptake, scalability, and sustainability, implementation scientists seek to ensure a broader and more successful intervention.

Educators are often tasked with similar challenges when creating curricula to train learners in core content. In fact, many parallels can be drawn between IS principles in health interventions and educational interventions (Table 1). A common model used for curricular design is Kern's six‐step approach to curriculum development. 5 While this model can serve as a valuable foundation, IS principles can build on it by offering a unique lens geared toward implementation, dissemination, and sustainability. In the following sections, we provide a brief overview of IS and discuss how IS principles can be applied to curricular design to enhance uptake, scalability, and sustainability.

TABLE 1.

Comparison of IS components in health interventions versus educational interventions.

Evidence‐based practice
  Health intervention examples: clinical interventions with an established evidence base.
  Educational intervention examples: core curriculum requirements; teaching learners to implement evidence‐based medicine in practice.

Determinants
  Health intervention examples: clinical setting; patient population; clinicians; hospital leadership/administration.
  Educational intervention examples: educational environment; learner population; faculty; educational leadership/administration.

Process
  Health intervention examples (implementation strategies): staff educational sessions; flyers; intervention monitoring and staff feedback.
  Educational intervention examples (learning activities): group work; simulation sessions; handouts; quizzes; feedback sessions.

Outcomes
  Health intervention examples: rate of intervention occurrence; patient and staff acceptability; changes in practice.
  Educational intervention examples: attendance; participation and assignment completion; clinical application of knowledge or skills acquired; improvements in patient care.

Abbreviation: IS, implementation science.

IMPLEMENTATION SCIENCE

IS can be divided into three key components: determinants, process, and outcomes. 4 Determinants refer to the factors that may influence the implementation outcome and can include both positive factors (facilitators) and negative factors (barriers). Process refers to the strategies and mechanisms used to implement a given intervention and should link directly to the determinants. Outcomes refer to the measures used to identify the success of a given implementation strategy. Each of these components plays a key role in the implementation of a program, and all three are interlinked (Figure 1). 6

FIGURE 1. Implementation research logic model. From “The Implementation Research Logic Model: a method for planning, executing, reporting, and synthesizing implementation projects,” by Smith JD, Li DH, Rafferty MR, 2020, Implement Sci, 15(1), p. 84. Copyright 2007 by Smith JD. Reprinted with permission. 6
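For readers who prefer to see these linkages made explicit, the following is a minimal sketch in Python (purely illustrative; the class and field names are our own shorthand, not part of any published IS toolkit) of how each determinant–strategy–mechanism–outcome chain in a logic model might be recorded so that unlinked elements are easy to detect:

```python
from dataclasses import dataclass, field

@dataclass
class LogicModelRow:
    """One linked chain in an implementation research logic model.

    Field names are illustrative only; they mirror the columns of
    Figure 1 (determinants -> strategies -> mechanisms -> outcomes).
    """
    determinant: str   # facilitator or barrier identified up front
    strategy: str      # implementation strategy addressing it
    mechanism: str     # hypothesized reason the strategy should work
    outcomes: list[str] = field(default_factory=list)  # measures of success

@dataclass
class LogicModel:
    program: str
    rows: list[LogicModelRow] = field(default_factory=list)

    def unlinked_rows(self) -> list[LogicModelRow]:
        # Flag chains that stop short of an outcome, i.e., elements
        # not yet linked all the way across the model.
        return [r for r in self.rows if not r.outcomes]
```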

APPLICATION TO MEDICAL EDUCATION

IS principles can be directly applied to a multitude of areas in medical education to help enhance the design and implementation of educational interventions. Herein, we present a case example of an emergency medicine residency program seeking to revise its ultrasound training. We demonstrate how an IS‐informed approach can enhance traditional curricular design approaches, using the Kern six‐step model as an example. 5

Step 1: Problem identification and general needs assessment

This begins by identifying a problem, followed by analysis of the current approach and comparison with the ideal approach. 5 In this case, program faculty identify a problem of having insufficient ultrasound training for their residents. They identify several affected groups, including learners, faculty, and patients and then compare the current approach with national guidelines (e.g., ACGME, ACEP). 7 , 8 This stage would parallel the guideline‐ or evidence‐based practice in IS models.

Step 2: Targeted needs assessment

In practice, this often involves a literature review, interviews, focus groups, or surveys of learners. 5 Applying IS principles would expand the needs assessment to include a broader range of participants and contributing factors to identify a more expansive list of key determinants. 9 , 10 While several IS frameworks exist, the Consolidated Framework for Implementation Research (CFIR) is one of the most common (Table S1). 11 Adapting CFIR to a medical education context, one could begin with innovation‐specific determinants, such as the source (e.g., ultrasound‐trained faculty), relative advantage (e.g., improved format/structure), resources (e.g., ultrasound machines, simulators), and cost (e.g., time required to run the curriculum). They could assess the inner setting (e.g., ultrasound faculty, nonclinical time available for teaching, leadership support, middleware for ultrasound examination storage and quality improvement) and outer setting (e.g., institutional support, expectation of adequate ultrasound training upon graduation). 7 , 12 They could assess individual characteristics among implementation facilitators (i.e., ultrasound faculty), midlevel leaders (e.g., core faculty), high‐level leaders (e.g., chair, program director), and innovation recipients (e.g., residents/students). They could also assess process factors (e.g., competing curricular priorities from faculty/residents, key roles/responsibilities). While educators do not need to assess each CFIR construct, they can use this as a guide to consider the broader range of potential determinants in their needs assessment. By accounting for these determinants (both positive and negative) across a wider range of domains and components, educators can design their curricula with a more tailored model to enhance implementation success.
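As a concrete illustration, the brief sketch below organizes a determinants inventory by CFIR domain for the ultrasound case. This is a hypothetical example only: the domain labels follow the CFIR constructs cited above, while the entries and the facilitator/barrier coding are our own.

```python
# Hypothetical CFIR-informed determinants inventory for the ultrasound
# curriculum example; "+" marks facilitators and "-" marks barriers.
determinants = {
    "innovation": [
        ("+", "ultrasound-trained faculty available as the source"),
        ("-", "cost: nonclinical time required to run the curriculum"),
    ],
    "inner_setting": [
        ("+", "leadership support from the program director"),
        ("-", "limited middleware for examination storage and QI"),
    ],
    "outer_setting": [
        ("+", "ACGME/ACEP expectation of ultrasound competence"),
    ],
    "individuals": [
        ("-", "variable faculty comfort teaching cardiac ultrasound"),
    ],
    "process": [
        ("-", "competing curricular priorities"),
    ],
}

# Pull out the barriers that the implementation plan must address.
barriers = [text for items in determinants.values()
            for sign, text in items if sign == "-"]
print(f"{len(barriers)} barriers to address in the implementation plan")
```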

Step 3: Goals and objectives

Goals communicate the curriculum's overall purpose, whereas objectives guide the educational and evaluation methods. 5 Seen through an IS lens, goals and objectives frame the overarching program and its implementation, ranging from the selection of determinants through the outcomes assessed, to ensure continuity throughout each stage. Using the implementation research logic model framework from Figure 1, each item added to a box should be directly linked to the others. For example, if one specific objective were “by completion of residency, all residents will be able to perform a cardiac ultrasound with ≥80% accuracy,” the program faculty might assess how their determinants align with that objective (e.g., faculty interest and skills in cardiac ultrasound, volume of ultrasound available, machine capabilities, availability of simulators) and then tailor the intervention, implementation, and outcome approaches to build upon identified strengths or address identified weaknesses.
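Continuing the logic model sketch introduced earlier (all entries hypothetical), the cardiac ultrasound objective might be encoded as one fully linked chain, making any gap between a determinant and a measurable outcome immediately visible:

```python
# Hypothetical entries for the cardiac ultrasound objective, using the
# LogicModelRow/LogicModel sketch defined earlier.
row = LogicModelRow(
    determinant="few faculty comfortable teaching cardiac ultrasound",
    strategy="train-the-trainer sessions led by an ultrasound champion",
    mechanism="builds faculty teaching self-efficacy and consistency",
    outcomes=["all residents perform cardiac ultrasound with >=80% accuracy"],
)
model = LogicModel(program="EM residency ultrasound curriculum", rows=[row])
assert not model.unlinked_rows()  # every determinant traces to an outcome
```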

Step 4: Education strategies

Education strategies are the means by which the curricular intervention is achieved. 5 This step parallels the clinical interventions used in IS and should align closely with the other implementation components. Importantly, when determining interventions, educators should consider the ideal approach and contextualize it within their current state (identified by the determinants in the needs assessment stage). Educators can then use implementation strategies to help bolster, and ensure fidelity to, the intended intervention. Using the cardiac ultrasound objective, faculty may determine that the best approach includes a combination of didactics, simulation, and hands‐on practice. In the next step, we discuss how to use implementation strategies to optimize the buy‐in and conduct of these strategies.

Step 5: Implementation

Implementation includes both the IS strategies and the proposed mechanisms by which they work. The Expert Recommendations for Implementing Change (ERIC) study provides a robust list of implementation strategies spread across nine domains: using evaluative and iterative strategies, providing interactive assistance, adapting and tailoring to context, developing stakeholder interrelationships, training and educating stakeholders, supporting clinicians, engaging consumers, utilizing financial strategies, and changing infrastructure. 13 While the number of strategies will vary with the curricular innovation, one must consider available resources and ensure consistent alignment across all components when selecting implementation strategies. Examples could include tailoring the ultrasound curriculum based on postgraduate surveys, identifying local ultrasound champions within and external to the department (e.g., cardiology, critical care), identifying academic partnerships (e.g., shared training sessions, resident ultrasound rotation exchange program), engaging opinion leaders (e.g., core faculty, chief residents), actively involving participants (i.e., residents) in the process, conducting more frequent evaluations of quality and fidelity to the curriculum, and supporting/incentivizing faculty (e.g., nonclinical time, incentive compensation).
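One way to keep such selections organized is to track them against the ERIC domains named above. The sketch below is a hypothetical bookkeeping structure (not part of the ERIC toolkit itself); the domain labels come from the taxonomy cited above, and the strategy entries are the examples from this case:

```python
# Hypothetical mapping of chosen implementation strategies to ERIC
# domains, to check that selections stay aligned and resourced.
strategies_by_domain = {
    "use evaluative and iterative strategies": [
        "tailor curriculum based on postgraduate surveys",
        "frequent evaluations of quality and fidelity",
    ],
    "develop stakeholder interrelationships": [
        "identify ultrasound champions (cardiology, critical care)",
        "engage opinion leaders (core faculty, chief residents)",
    ],
    "train and educate stakeholders": [
        "shared training sessions with academic partners",
    ],
    "utilize financial strategies": [
        "nonclinical time and incentive compensation for faculty",
    ],
}

for domain, items in strategies_by_domain.items():
    print(f"{domain}: {len(items)} strategy(ies) selected")
```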

Step 6: Evaluation and feedback

Program evaluation is contingent upon the implementation and fidelity of the curriculum. For example, an intervention seeking to improve residents’ ability to perform cardiac ultrasound will likely be unsuccessful, despite a well‐designed intervention, if poor implementation results in varied attendance and disinterested faculty. Therefore, it is important to use a wider lens that includes evaluation of both the program and its implementation.

Multiple outcome frameworks have been described in the IS literature. 14 The Reach, Effectiveness, Adoption, Implementation, and Maintenance (RE‐AIM) framework is a common and practical tool that can evaluate the implementation longitudinally from the initial reach of the program through sustained maintenance over time (Table 2). 15 , 16 RE‐AIM is versatile and has been previously used to evaluate several online medical education curricula during COVID‐19 17 , 18 , 19 , 20 as well as mentoring programs for faculty underrepresented in medicine. 21 RE‐AIM has advantages over narrower tools, such as the Kirkpatrick model, 22 by providing a more expansive view of the effect of the innovation and its implementation. A more in‐depth review of RE‐AIM with practical examples and tools is available at https://re‐aim.org/.

TABLE 2.

Reach, Effectiveness, Adoption, Implementation, and Maintenance (RE‐AIM) framework.

Reach: The absolute number, proportion, and representativeness of individuals who are willing to participate in a given initiative, intervention, or program.
Effectiveness: The impact of an intervention on important outcomes, including potential negative effects, quality of life, and economic outcomes.
Adoption: The absolute number, proportion, and representativeness of settings and intervention agents (i.e., people who deliver the program) who are willing to initiate a program.
Implementation: At the setting level, implementation refers to the intervention agents’ fidelity to the various elements of an intervention's protocol, including consistency of delivery as intended and the time and cost of the intervention. At the individual level, implementation refers to clients’ use of the intervention strategies.
Maintenance: The extent to which a program or policy becomes institutionalized or part of the routine organizational practices and policies. Within the RE‐AIM framework, maintenance also applies at the individual level, where it has been defined as the long‐term effects of a program on outcomes ≥6 months after the most recent intervention contact.

Applying RE‐AIM to the ultrasound curriculum, faculty could assess reach by tracking total resident attendance at didactic lectures and the representativeness of those attendees (e.g., postgraduate year, prior ultrasound experience). They could assess effectiveness using participant satisfaction, multiple‐choice test questions, objective structured clinical examinations, and standardized direct observation tools. 23 Adoption could include measures of faculty willingness to help teach the curriculum. Implementation could be assessed via adherence to the educational intervention, tracking of any adaptations made, and focus groups with faculty and participants to solicit feedback. Finally, maintenance could be assessed by continuation of the educational program in subsequent years and sustained engagement. 24
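As a worked illustration of the first two constructs, the sketch below (hypothetical data throughout) computes reach as a proportion with a simple representativeness tally by postgraduate year, and adoption as the proportion of faculty willing to teach:

```python
from collections import Counter

# Hypothetical attendance log: (resident_id, postgraduate_year).
attendees = [("r01", 1), ("r02", 1), ("r03", 2), ("r04", 3), ("r05", 3)]
# Hypothetical roster of 12 residents cycling through PGY 1-3.
all_residents = {f"r{i:02d}": (i - 1) % 3 + 1 for i in range(1, 13)}

# Reach: proportion of the eligible population who participated,
# plus a representativeness check across postgraduate years.
reach = len({rid for rid, _ in attendees}) / len(all_residents)
by_pgy = Counter(pgy for _, pgy in attendees)

# Adoption: proportion of faculty willing to deliver the program.
teaching_faculty, total_faculty = 6, 10
adoption = teaching_faculty / total_faculty

print(f"Reach: {reach:.0%} of residents; attendance by PGY: {dict(by_pgy)}")
print(f"Adoption: {adoption:.0%} of faculty willing to teach")
```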

PLANNING FOR SUSTAINABILITY AND SCALABILITY

While not an explicit step in Kern's six‐step model, curricular maintenance, enhancement, and dissemination are discussed as additional relevant elements. 5 These also serve as critical components within IS, where they are often referred to as sustainability and scalability, and they should be considered and planned for early in the development of a program or curriculum. One IS framework is the Designing for Dissemination and Sustainability (D4DS) model. 25 In D4DS, a dedicated design‐focused phase is emphasized early in the process, wherein the program is analyzed in terms of both product design and dissemination design (i.e., messaging, packaging, distribution). 25 This approach emphasizes active engagement of participants and influencers through the following: (a) participatory codesign and end‐user involvement; (b) application of dissemination and IS theories/frameworks; (c) incorporation of marketing and business theories; (d) context‐ and situation‐specific analysis; (e) use of systems, engineering, and complexity science approaches; and (f) incorporation of tools from communication fields and the arts (e.g., media production, advertising, graphic design). 25

Similar models could be applied to curricular design. At the program level, this should include deliberate attention to building sustainability by identifying strategies that can be sustained well beyond the initial implementation period (e.g., developing future leaders and pathways to leadership, identifying sustainable funding mechanisms). At the classroom level, this could include developing standardized materials (e.g., handouts, instructor guides, presentation slides). These approaches should also consider how to enhance scalability, whether through development of train‐the‐trainer materials that allow other programs to launch similar initiatives or through virtual expansion via massive open online courses.

CONCLUSIONS

Implementation science is an approach to increasing the incorporation of evidence‐based interventions into practice that emphasizes purposive and thoughtful planning to maximize uptake, scalability, and sustainability. Many of these principles align well with medical education and can augment traditional approaches to curriculum design. By utilizing the strategies described above, faculty can help ensure more robust curriculum implementation, evaluation, and sustainability.

CONFLICT OF INTEREST STATEMENT

The authors declare no conflicts of interest.

Supporting information

Table S1. Consolidated framework for implementation research constructs and definitions. Adapted from Damschroder et al. 11


Gottlieb M, Bobitt J, Kotini‐Shah P, Khosla S, Watson DP. Incorporating implementation science principles into curricular design. AEM Educ Train. 2024;8:e10996. doi: 10.1002/aet2.10996

Supervising Editor: Daniel J. Egan

REFERENCES

1. Bauer MS, Damschroder L, Hagedorn H, Smith J, Kilbourne AM. An introduction to implementation science for the non‐specialist. BMC Psychol. 2015;3(1):32. doi: 10.1186/s40359-015-0089-9
2. Institute of Medicine. Crossing the Quality Chasm: A New Health System for the 21st Century. National Academies Press; 2001. doi: 10.17226/10027
3. Grant J, Green L, Mason B. Basic research and health: a reassessment of the scientific basis for the support of biomedical science. Res Eval. 2003;12(3):217‐224. doi: 10.3152/147154403781776618
4. Damschroder LJ. Clarity out of chaos: use of theory in implementation research. Psychiatry Res. 2020;283:112461. doi: 10.1016/j.psychres.2019.06.036
5. Thomas PA, Kern DE, Hughes MT, Tackett SA, Chen BY, eds. Curriculum Development for Medical Education: A Six‐Step Approach. 2nd ed. Johns Hopkins University Press; 2022.
6. Smith JD, Li DH, Rafferty MR. The implementation research logic model: a method for planning, executing, reporting, and synthesizing implementation projects. Implement Sci. 2020;15(1):84. doi: 10.1186/s13012-020-01041-8
7. American College of Emergency Physicians. Ultrasound guidelines: emergency, point‐of‐care, and clinical ultrasound guidelines in medicine. Ann Emerg Med. 2023;82(3):e115‐e155. doi: 10.1016/j.annemergmed.2023.06.005
8. Accreditation Council for Graduate Medical Education. Emergency Medicine Defined Key Index Procedure Minimums. Accessed July 8, 2023. https://www.acgme.org/globalassets/pfassets/programresources/em_key_index_procedure_minimums_103117.pdf
9. Gottlieb M, Wagner E, Wagner A, Chan T. Applying design thinking principles to curricular development in medical education. AEM Educ Train. 2017;1(1):21‐26. doi: 10.1002/aet2.10003
10. Chan TM, Jordan J, Clarke SO, et al. Beyond the CLAIM: a comprehensive needs assessment strategy for creating an advanced medical education research training program (ARMED‐MedEd). AEM Educ Train. 2022;6(1):e10720. doi: 10.1002/aet2.10720
11. Damschroder LJ, Reardon CM, Widerquist MAO, Lowery J. The updated consolidated framework for implementation research based on user feedback. Implement Sci. 2022;17(1):75. doi: 10.1186/s13012-022-01245-0
12. Beeson MS, Bhat R, Broder JS, et al. The 2022 model of the clinical practice of emergency medicine. J Emerg Med. 2023;64(6):659‐695. doi: 10.1016/j.jemermed.2023.02.016
13. Waltz TJ, Powell BJ, Matthieu MM, et al. Use of concept mapping to characterize relationships among implementation strategies and assess their feasibility and importance: results from the expert recommendations for implementing change (ERIC) study. Implement Sci. 2015;10(1):109. doi: 10.1186/s13012-015-0295-0
14. Lewis CC, Fischer S, Weiner BJ, Stanick C, Kim M, Martinez RG. Outcomes for implementation science: an enhanced systematic review of instruments using evidence‐based rating criteria. Implement Sci. 2015;10:155. doi: 10.1186/s13012-015-0342-x
15. Gaglio B, Shoup JA, Glasgow RE. The RE‐AIM framework: a systematic review of use over time. Am J Public Health. 2013;103(6):e38‐e46. doi: 10.2105/AJPH.2013.301299
16. Glasgow RE, Harden SM, Gaglio B, et al. RE‐AIM planning and evaluation framework: adapting to new science and practice with a 20‐year review. Front Public Health. 2019;7:64. doi: 10.3389/fpubh.2019.00064
17. Yilmaz Y, Sarikaya O, Senol Y, et al. RE‐AIMing COVID‐19 online learning for medical students: a massive open online course evaluation. BMC Med Educ. 2021;21(1):303. doi: 10.1186/s12909-021-02751-3
18. Nagji A, Yilmaz Y, Zhang P, et al. Converting to connect: a rapid RE‐AIM evaluation of the digital conversion of a clerkship curriculum in the age of COVID‐19. AEM Educ Train. 2020;4(4):330‐339. doi: 10.1002/aet2.10498
19. Rose CC, Haas MRC, Yilmaz Y, et al. ALiEM connect: large‐scale, interactive, virtual residency programming in response to COVID‐19. Acad Med. 2021;96(10):1419‐1424. doi: 10.1097/ACM.0000000000004122
20. Gisondi MA, Keyes T, Zucker S, Bumgardner D. Teaching LGBTQ+ health, a web‐based faculty development course: program evaluation study using the RE‐AIM framework. JMIR Med Educ. 2023;9:e47777. doi: 10.2196/47777
21. Beech BM, Calles‐Escandon J, Hairston KG, Langdon SE, Latham‐Sadler BA, Bell RA. Mentoring programs for underrepresented minority faculty in academic medical centers: a systematic review of the literature. Acad Med. 2013;88(4):541‐549. doi: 10.1097/ACM.0b013e31828589e3
22. Kirkpatrick DL, Kirkpatrick JD. Evaluating Training Programs: The Four Levels. 3rd ed. Berrett‐Koehler; 2006.
23. Damewood SC, Leo M, Bailitz J, et al. Tools for measuring clinical ultrasound competency: recommendations from the ultrasound competency work group. AEM Educ Train. 2020;4(Suppl 1):S106‐S112. doi: 10.1002/aet2.10368
24. Luke DA, Calhoun A, Robichaux CB, Elliott MB, Moreland‐Russell S. The program sustainability assessment tool: a new instrument for public health programs. Prev Chronic Dis. 2014;11:130184. doi: 10.5888/pcd11.130184
25. Kwan BM, Brownson RC, Glasgow RE, Morrato EH, Luke DA. Designing for dissemination and sustainability to promote equitable impacts on health. Annu Rev Public Health. 2022;43:331‐353. doi: 10.1146/annurev-publhealth-052220-112457

