Abstract
Background
It is essential that medical education (MedEd) fellows achieve desired outcomes prior to graduation. Despite the increase in postgraduate MedEd fellowships in emergency medicine (EM), there is no consistently applied competency framework. We sought to develop entrustable professional activities (EPAs) for EM MedEd fellows.
Methods
From 2021 to 2022, we used a modified Delphi method to achieve consensus for EPAs. EM education experts generated an initial list of 173 EPAs after literature review. In each Delphi round, panelists were asked to make a binary choice of whether to include the EPA. We determined an inclusion threshold of 70% agreement a priori. After the first round, given the large number of EPAs meeting inclusion threshold, panelists were instructed to vote whether each EPA should be included in the “20 most important” EPAs for a MedEd fellowship. Modifications were made between rounds based on expert feedback. We calculated descriptive statistics.
Results
Seventeen experts completed four Delphi rounds, each with a 100% response rate. After Round 1, 87 EPAs were eliminated and two were combined. Following Round 2, 46 EPAs were eliminated, seven were combined, and three were included in the final list. After the third round, one EPA was eliminated and 13 were included. After the fourth round, 11 EPAs were eliminated. The final list consisted of 16 EPAs in the domains of career development, education theory and methods, research and scholarship, and educational program administration.
Conclusions
We developed a list of 16 EPAs for EM MedEd fellowships, the first step in implementing competency‐based MedEd.
INTRODUCTION
Postgraduate medical education (MedEd) fellowships have gained popularity in recent years, especially in emergency medicine (EM). The general aim of these fellowships is to train fellows to be outstanding clinical educators, researchers, and leaders, capable of advancing theory, practice, and scholarship in MedEd and primed to secure positions as academic core faculty, clerkship directors, and assistant residency directors. 1 , 2 , 3 , 4 Postgraduate EM MedEd fellowships can be 1‐year, typically with a focus on teaching, or 2‐year, typically incorporating training in scholarship. 3 , 4 Additionally, some fellowships offer fellows the opportunity to complete a master's degree in an education‐ or research‐related field during the fellowship. 3 , 4 Exemplar curricula and objectives have been published previously; however, no standardized curricula or objectives currently exist for EM MedEd fellowships. 5 , 6 MedEd fellowships are not governed by the Accreditation Council for Graduate Medical Education (ACGME). The Society for Academic Emergency Medicine (SAEM) created a formal endorsement process for EM MedEd scholarship fellowships in 2013, but not all fellowships apply or are ultimately certified. 7 , 8
Components and outcomes of both 1‐year teaching and 2‐year education scholarship fellowships are variable and site‐dependent. 5 , 6 A limited number of studies have assessed outcomes of MedEd fellowships, but obtaining objective outcome data on fellows and their post‐fellowship career paths remains a challenge. 9 , 10 , 11 Department chairs, fellowship directors, and other major stakeholders continue to perceive a need for further training in many areas. 11 Having a unified competency‐based framework for EM MedEd fellowships could help minimize variability and clarify expected outcomes. It would also give institutional leaders and potential employers confidence that graduates are prepared for job tasks, a need that has been previously identified. 11
In recent years, competency‐based education has gained momentum in health professions education. 12 Entrustable professional activities (EPAs) provide an objective competency‐based framework, with an EPA defined as "a unit of professional practice that can be entrusted to a sufficiently competent learner or professional." 13 Many specialties and organizations, including the Association of American Medical Colleges (AAMC), have embraced EPAs as a useful strategy in competency‐based education because EPAs are "executable within a given time frame, observable, measurable, confined to a qualified personnel." 13 , 14 , 15 , 16 , 17 Within competency‐based education, competencies provide a framework that describes the qualities of professionals, guiding both learners and supervisors. 18 EPAs translate this theoretical framework into practice. 18 EPAs allow for a more integrated, holistic evaluation of trainees, assessing not only specific skills but also a trainee's readiness to perform a professional activity. 13 The potential benefit of EPAs can extend to any level of learner, and there have been calls to establish EPAs for teachers in MedEd. 19 In response to this call, EPAs have been developed for graduate students in health professions education. 20 , 21 , 22 Although clinician educator milestones were released in 2022, that work is not specifically designed for EM MedEd fellows, and corresponding EPAs have not been created. 23 Despite these advances in related areas, EPAs specific to postgraduate EM MedEd fellowships do not exist. Fellowships differ from degree‐granting graduate programs in structure and curricula, so there is a need for EPAs in this unique setting.
Assessing MedEd fellows' competency is essential to ensure graduates have the appropriate training to be the next leaders in their field. This is especially true for education fellows, as they have been shown to consistently hold departmental and academic leadership positions after graduation. 1 Further, a standardized set of outcome competencies offers a framework for fellowship directors to ensure their fellows are growing and progressing appropriately, as well as to provide fellows with specific, objective, anchored, and actionable feedback and continued professional development. The objective of this study was to develop an outcomes‐based framework of EPAs for MedEd fellows in one specialty (i.e., EM) that could be applied to MedEd fellowships across all specialties or to education components of other training programs.
METHODS
Study design
We used a modified Delphi method administered via an online survey platform (SurveyMonkey) to achieve expert consensus on EPAs for postgraduate MedEd fellowships. The modified Delphi technique is a systematic group consensus strategy designed to increase content validity. 24 , 25 , 26 It was intentionally selected because it can mitigate bias arising from study group interactions, assure anonymity of responses, and provide controlled feedback to participants after each round. 25 This study was reviewed by the institutional review board of the David Geffen School of Medicine at UCLA and determined to be exempt.
Setting and participants
We identified EM education experts to participate in the Delphi process, preferably those with prior experience developing EPAs and representing diverse career stages, genders, and regions of the United States, through review of the Council of Residency Directors in Emergency Medicine MedEd Fellowship Community of Practice, personal knowledge of the study authors, and snowball sampling. These experts included graduates of MedEd fellowships and national leaders in EM MedEd who have held critical roles in running MedEd fellowships at their respective institutions. All experts who were asked to contribute agreed to participate. We collected data between June 2021 and August 2022.
Study protocol
We first conducted a literature search to review the current body of knowledge surrounding core curricular content and assessment methods in MedEd fellowships, as well as existing EPAs from other fields such as clinical medicine and graduate health professions education. 14 , 21 , 27 , 28 All 17 members of the Delphi panel convened for a drafting meeting via an online videoconferencing platform (Zoom Video Communications Inc., 2011). At the beginning of this meeting, one of the authors with expertise in competency‐based MedEd (CBME; HC) provided an orientation to the goals and process to ensure a shared mental model. The panel then drafted an initial list of EPAs and domains. Initial domains were drawn from existing curricula and literature. 5 , 6 We utilized an electronic platform (SurveyMonkey, 1999) to administer the Delphi surveys and collect data. In each Delphi round, we asked panelists to make a binary choice of whether the EPA should be included in the final list of EPAs for fellowship directors to adopt into their fellowships. While Delphi studies commonly utilize Likert scales, a variety of methods can be used, including binary rating. 29 , 30 , 31 We determined a threshold of 70% agreement by majority vote of the Delphi panel for inclusion a priori, consistent with prior literature on consensus thresholds. 32 In each round, panelists were asked to provide written comments about the reasoning behind their votes, linguistic edits to the current draft EPAs, and suggestions for combining or adding EPAs. The first and last authors made modifications between rounds based on feedback from the expert panelists. The panelists were then asked to vote on the revised EPAs and to provide additional edits or comments in all subsequent rounds. Panelists were given access to all results and comments from the previous round. We planned to continue the Delphi process until consensus was reached for all items or there was evidence of diminishing returns. 33 In the first round, we did not place a limit on the number of "yes" votes. After the first round, due to the significant number of EPAs that met the threshold for inclusion and a desire to create a list of EPAs that were essential and manageable for fellows to achieve during the 1‐ or 2‐year duration of a MedEd fellowship, panelists were instructed to vote "yes" or "no" on whether each EPA should be included in a list of the "20 most important" EPAs for a MedEd fellowship. We used the same inclusion threshold of 70% "yes" and added an exclusion threshold of 70% "no." Panelists were given similar voting instructions in Rounds 3 and 4.
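For readers who want the decision rules in concrete form, the following is a minimal Python sketch of the per‐round tallying described above. The vote data, EPA title, and function name are hypothetical illustrations rather than study artifacts, and the exclusion rule applies only from Round 2 onward.

```python
# Minimal sketch of the per-round tallying described above; data shapes and
# names are illustrative assumptions, not study artifacts. Reflects Rounds 2-4
# (Round 1 applied only the 70% inclusion threshold).

THRESHOLD = 0.70  # a priori consensus threshold

def tally_round(votes: dict[str, list[str]]) -> dict[str, str]:
    """Classify each EPA as include, exclude, or advance to the next round.

    votes maps an EPA title to the panelists' binary ballots ("yes"/"no").
    """
    decisions = {}
    for epa, ballots in votes.items():
        yes_frac = ballots.count("yes") / len(ballots)
        no_frac = ballots.count("no") / len(ballots)
        if yes_frac >= THRESHOLD:
            decisions[epa] = "include"  # >=70% "yes": added to the final list
        elif no_frac >= THRESHOLD:
            decisions[epa] = "exclude"  # >=70% "no": eliminated
        else:
            decisions[epa] = "advance"  # no consensus: revised and revoted
    return decisions

# Hypothetical ballot from a 17-member panel: 13/17 = 76% "yes" -> include.
print(tally_round({"Perform a needs assessment": ["yes"] * 13 + ["no"] * 4}))
```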
After the conclusion of the Delphi process, we evaluated the quality of the resulting EPAs to ensure they met EPA standards. Five authors (four with experience developing EPAs and critical roles in MedEd fellowships at their institutions, plus the first author) independently applied a minimally modified version of the EQual rubric to assess the quality and structure of each EPA title and made modifications as indicated. 34 Because items within the EQual rubric contain the phrase "clinical outcome," which does not align with the educationally oriented EPAs of this study, members of the group were asked to substitute "educational outcome" for "clinical outcome." Following independent scoring, the five authors met to discuss the results and modify EPA titles accordingly.
Data analysis
We calculated and reported descriptive statistics for item responses during each Delphi round and for EPA EQual scores. We used 4.07 as our cutoff based on prior literature, which suggests that EPAs with scores below this cut point may require revision. 35
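As a worked illustration of this scoring arithmetic under stated assumptions (five reviewers, 14 items each scored 1 to 5, invented ratings), the sketch below computes the mean total score, divides by 14 to obtain the mean average score, and flags values below the 4.07 cutoff. The numbers follow the format of Table 2 but are not real study data.

```python
# Hypothetical EQual scoring sketch: five reviewers each score an EPA on the
# rubric's 14 items (1-5 scale); the ratings below are invented for illustration.

N_ITEMS = 14
CUTOFF = 4.07  # published cut score; EPAs scoring below it may need revision

def equal_summary(reviewer_item_scores: list[list[int]]) -> tuple[float, float, bool]:
    """Return (mean total score, mean average score, needs-revision flag)."""
    totals = [sum(items) for items in reviewer_item_scores]  # one total per reviewer
    mean_total = sum(totals) / len(totals)                   # e.g., 59.6 in Table 2
    mean_average = mean_total / N_ITEMS                      # e.g., 59.6 / 14 = 4.26
    return mean_total, mean_average, mean_average < CUTOFF

# Five made-up reviewers, each giving uniform item scores for simplicity:
ratings = [[4] * 14, [5] * 14, [4] * 14, [4] * 14, [5] * 14]
total, avg, revise = equal_summary(ratings)
print(f"mean total = {total:.1f}, mean average = {avg:.2f}, revise = {revise}")
# -> mean total = 61.6, mean average = 4.40, revise = False
```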
RESULTS
We invited 17 experts to participate in the Delphi process, all of whom agreed to participate. The characteristics of the expert panelists are described in Table 1.
TABLE 1.
Characteristics of Delphi panelists in the development of EPAs for EM MedEd fellowships.
| Characteristic | n (%), total n = 17 |
|---|---|
| Region of practice | |
| West | 8 (47) |
| South | 3 (18) |
| Northeast | 1 (6) |
| Midwest | 5 (29) |
| Academic rank | |
| Fellow | 3 (18) |
| Assistant professor | 4 (24) |
| Associate professor | 5 (29) |
| Professor | 4 (24) |
| Emeritus | 1 (6) |
| Completed a MedEd fellowship | 10 (59) |
| Experience directing a MedEd fellowship | 12 (71) |
| Experience in developing EPAs | 4 (24) |
Abbreviations: EPA, entrustable professional activity; MedEd, medical education.
The initial drafting resulted in the generation of 173 EPAs in four core domains: career development, education theory and methods, research and scholarship, and educational program administration. We completed four Delphi rounds, with a 100% response rate across all rounds. Figure 1 shows the detailed results of the Delphi process. After Round 1, 85 EPAs met the 70% inclusion criterion. Given this large number of EPAs meeting the threshold for inclusion, and a desire to create a list of EPAs that were essential and manageable for fellows to achieve during the 1‐ or 2‐year duration of a MedEd fellowship, we passed these 85 EPAs on to the next round and subsequently instructed panelists to vote "yes" or "no" on whether each EPA should be included in a list of the "20 most important" EPAs for a MedEd fellowship. Following Round 2, three items met the threshold for inclusion in the final list of EPAs. Thirty‐one EPAs did not meet the threshold for inclusion or exclusion and advanced to Round 3. After the third round, 13 EPAs met the inclusion threshold and were added to the final list. After the fourth and final round, no additional EPAs met the inclusion threshold. Given that we had completed four rounds and no additional EPA met the inclusion threshold in the final round, we determined that we had reached the point of diminishing returns and stopped the process. 33 The final list after the four Delphi rounds consisted of 16 EPAs, with representation in each of the four initial domains. Six EPAs did not achieve consensus (Box 1; see Table S1 for full results of the modified Delphi process).
FIGURE 1.

EPA study Delphi process. After the generation of 173 potential EPAs, 87 were eliminated in Round 1 for not reaching the 70% "yes, should be an EPA" threshold. Because a high number of potential EPAs (85) remained above the 70% threshold, group members were then asked to select the "20 most important"; EPAs at or above 70% "yes" were included in the final list (3) and EPAs at or above 70% "no" were eliminated (46). Based on feedback, EPAs were combined, added, or collapsed into existing EPAs between rounds. The same thresholds were carried into each subsequent round, with 16 EPAs included in the final list and six EPAs never meeting the threshold for inclusion or exclusion. EPA, entrustable professional activity.
BOX 1. EPA titles that did not achieve consensus.
The results of applying the EQual rubric to the 16 EPAs resulting from the modified Delphi process are shown in Table 2. Two EPAs did not meet the threshold of 4.07 and were revised. We also revised additional EPAs for clarity and format according to consistent comments from the group members. The final list of EPA titles after the modified Delphi process and application of the EQual rubric is shown in Box 2.
TABLE 2.
Application of the EQual rubric to the 16 generated EPAs.
| Domain | EPA title | Mean total score, n = 5 | Mean average score (mean total/14; cut score 4.07), n = 5 |
|---|---|---|---|
| Career development | Create academic CV | 59.6 | 4.26 |
| | Develop an educator portfolio | 62 | 4.43 |
| | Participate in mentorship relationships | 56.8 | 4.06 |
| Education theory and methods | Perform a needs assessment | 64.2 | 4.59 |
| | Create a curriculum informed by education theory | 65.4 | 4.67 |
| | Apply evidence‐based teaching methods to didactic instruction | 62.2 | 4.44 |
| | Utilize a variety of bedside teaching techniques | 63.8 | 4.56 |
| | Incorporate diversity and inclusion in educational methods | 56.8 | 4.06 |
| | Evaluate a curriculum | 62.8 | 4.49 |
| | Provide feedback to learners | 59.6 | 4.26 |
| | Create an individualized learning plan to support the struggling learner | 68.2 | 4.87 |
| Research and scholarship | Design a scholarly project using appropriate conceptual framework, methods, and assessment tools | 64.6 | 4.61 |
| | Assess quality of MedEd research | 60.6 | 4.33 |
| Educational program administration | Participate in program evaluation | 59.8 | 4.27 |
| | Apply a variety of strategies to assess learners | 61 | 4.36 |
| | Participate in education committees | 63 | 4.5 |
Note: Application of the EQual rubric to the 16 EPAs that met threshold. The EQual rubric is a 14‐item rubric with items scored from 1 to 5. Mean total score refers to the mean of the total score for that specific EPA across our five reviewers. The "mean average score" is the mean total score divided by the number of items (14). A cut score of 4.07 was chosen based on prior published literature; EPAs with scores below 4.07 may require revision.
Abbreviations: EPA, entrustable professional activity; MedEd, medical education.
BOX 2. Final list of MedEd fellowship EPA titles.
| Career development |
| Education theory and methods |
| Research and scholarship |
| Educational program administration |
*Tense of the EPA was changed.
DISCUSSION
This study adds to existing literature by creating the first set of EPA titles for postgraduate MedEd fellows in EM. It is crucial that MedEd fellows have defined outcomes of their training as a first step to identify areas of focus for growth and development as well as graduation targets. 36 As this type of training becomes more desired by both institutional leaders and trainees and the number of fellowships increases, having a robust outcomes framework toward which to target assessment and requisite graduation thresholds can protect all stakeholders and lend further legitimacy to the value of this training. 1 , 8 , 11 , 37
The EPAs developed in this study align with the core curricular domains of postgraduate MedEd fellowships both internal and external to the specialty of EM, including teaching skills, learning theory, curriculum design, educational program administration, research and scholarship, and career development. 5 , 9 , 38 Additionally, they were built upon the framework of CBME, which is now the predominant training paradigm globally. 12 , 39 , 40 Many of the EPAs developed in this study also align with EPAs developed for graduate programs in health professions education, particularly in the areas of education and scholarship, which is not surprising given the shared goals and curricular content of these training programs. 20 , 22 , 27 Our EPAs appear more distinctive in the domains of career development and education program administration, which may point to the unique aspects of MedEd fellowship training in EM. Because they apply this outcomes‐based framework, the EPAs developed in this study are well positioned to meet the current needs of MedEd systems. 12 , 39 , 40 Additionally, as many MedEd fellowship graduates go on to serve in leadership positions in MedEd, the application of this framework can have macro‐level influence, ultimately impacting the patients served by the physicians these fellows train. 1 , 10 , 12 , 39 , 40 , 41
This list of core EPAs has multiple applications. It is the first step in implementing competency‐based education in EM MedEd fellowships, as the EPAs created in this study, once comprehensively described, can be adopted by all EM MedEd fellowships as a competency‐based outcomes framework. 42 This work can be followed by incorporating a full map of competencies and knowledge, skills, and attitudes/objectives for each EPA title. While this study was focused on MedEd fellowships in EM in the United States and therefore may have limited generalizability, MedEd is not specialty specific. There is significant overlap in MedEd fellowship core content between EM and other specialties, so this EPA framework can also be applied and/or adapted in existing postgraduate MedEd fellowships in other specialties and can guide those seeking to develop such fellowships. From this work, education leaders can begin to operationalize the other four core components of competency‐based MedEd: developmental progression, individualized learning, coaching, and assessment. 36 These EPAs could be organized according to anticipated developmental progression, and observable practice activities that map to these EPAs could be created and sequenced progressively to support the development of expertise. 36 , 43 These EPAs could guide fellowship directors in creating or iterating specific curricula and learning experiences that lead to achievement of these competencies prior to graduation. Beyond content, the resulting list of EPAs can also focus attention on the resources, design, and personnel factors necessary for the successful administration of an education fellowship. Having this competency framework can help fellowship directors and faculty target coaching efforts and tailor teaching practices to the individual fellow's stage of progress in achieving these EPAs. Finally, these EPAs could be applied as part of a fellowship's system of assessment, serving as standards to be met and a source of data for important feedback. Such feedback may guide changes in individual fellow behavior (e.g., more dedicated study of a particular topic or attempts at different teaching methods). Improved feedback to fellows can also enhance their professional development. 44 , 45
There were several EPAs that did not meet consensus after four rounds. We hypothesize that this may be due to several of the nonconsensus EPAs having significant overlap with other listed EPAs. This likely resulted in our experts choosing the option they felt was best or most specific. Other nonconsensus EPAs were felt by some panelists to be too basic for MedEd fellows or natural byproducts of MedEd fellowships, so when trying to create a list of the “most important” EPAs, they preferred other options. These nonconsensus EPAs could be utilized as a list of “optional” EPAs or serve as a guide for how individual programs can design optional EPAs to incorporate as necessary to meet their individual program needs.
There are still many unanswered questions on the topic of a unified outcomes framework of EPAs for MedEd fellowships. We have provided content validity evidence, but future studies could add further validity evidence to support the use of these EPAs in practice as programmatic assessment tools are designed and implemented and developmental progression is tracked. While we anticipate that use of these EPAs in MedEd fellowships will lead to a more standardized set of expectations and outcomes for fellows and provide a framework for high‐quality feedback that will enhance their growth and development, future studies evaluating the impact of broad utilization and changes in fellow and programmatic behavior are needed.
LIMITATIONS
The study has limitations that must be considered in interpreting the results. The Delphi panel experts may not be representative of the larger pool of MedEd fellowship stakeholders. Although we attempted to gather broad representation of medical educators within the field, it is possible that a differently composed panel would have yielded different results. Additionally, because this method was implemented electronically, discussion and elaboration were limited; however, this controlled interaction is key to the design of a Delphi. Other consensus methods, such as nominal group technique, may have yielded different results with the added benefit of in‐person discussion, although at the expense of anonymity in the consensus process. Furthermore, we utilized only EPA titles during the Delphi process rather than comprehensive EPA descriptions. 42 While we believe that the shared expertise of the panelists and the initial group drafting meeting, which allowed for discussion, elaboration, and clarification of the initially created EPAs, led to a shared understanding of the EPAs involved in the Delphi process, it is possible that the absence of a full description created some ambiguity. Finally, expert responses in the modified Delphi process may not be truly independent, considering how involved EM experts may be in national groups that shape expectations of MedEd fellowships.
CONCLUSIONS
We developed an outcomes framework of 16 entrustable professional activities for emergency medicine medical education fellows. This is the first step in implementing competency‐based assessment in medical education fellowships.
AUTHOR CONTRIBUTIONS
Stephen Villa: study concept and design, acquisition of data, data analysis, interpretation of the data, drafting of the manuscript, critical revision of the manuscript for important intellectual content, final approval of the submitted manuscript, agreement to be accountable for all aspects of the work. Holly Caretta‐Weyer: study concept and design, data analysis, interpretation of the data, technical support, critical revision of the manuscript for important intellectual content, final approval of the submitted manuscript, agreement to be accountable for all aspects of the work. Lalena M. Yarris: study concept and design, interpretation of the data, critical revision of the manuscript for important intellectual content, final approval of the submitted manuscript, agreement to be accountable for all aspects of the work. Samuel O. Clarke: study design, interpretation of the data, critical revision of the manuscript for important intellectual content, final approval of the submitted manuscript, agreement to be accountable for all aspects of the work. Wendy C. Coates: study concept and design, interpretation of the data, critical revision of the manuscript for important intellectual content, final approval of the submitted manuscript, agreement to be accountable for all aspects of the work. Kimberly A. Sokol: study concept and design, interpretation of the data, critical revision of the manuscript for important intellectual content, final approval of the submitted manuscript, agreement to be accountable for all aspects of the work. Amanda Jurvis: study concept and design, interpretation of the data, critical revision of the manuscript for important intellectual content, final approval of the submitted manuscript, agreement to be accountable for all aspects of the work. Dimitrios Papanagnou: study design, interpretation of the data, critical revision of the manuscript for important intellectual content, final approval of the submitted manuscript, agreement to be accountable for all aspects of the work. James Ahn: study design, interpretation of the data, critical revision of the manuscript for important intellectual content, final approval of the submitted manuscript, agreement to be accountable for all aspects of the work. Emily Hillman: study concept and design, interpretation of the data, critical revision of the manuscript for important intellectual content, final approval of the submitted manuscript, agreement to be accountable for all aspects of the work. Melanie Camejo: study concept and design, interpretation of the data, critical revision of the manuscript for important intellectual content, final approval of the submitted manuscript, agreement to be accountable for all aspects of the work. Nicole Deiorio: study design, interpretation of the data, critical revision of the manuscript for important intellectual content, final approval of the submitted manuscript, agreement to be accountable for all aspects of the work. Kathryn M. Fischer: study design, interpretation of the data, critical revision of the manuscript for important intellectual content, final approval of the submitted manuscript, agreement to be accountable for all aspects of the work. Meg Wolff: study design, interpretation of the data, critical revision of the manuscript for important intellectual content, final approval of the submitted manuscript, agreement to be accountable for all aspects of the work. 
Sara Dimeo: study design, interpretation of the data, critical revision of the manuscript for important intellectual content, final approval of the submitted manuscript, agreement to be accountable for all aspects of the work. Jaime Jordan: study concept and design, data analysis, interpretation of the data, drafting of the manuscript, critical revision of the manuscript for important intellectual content, study supervision, final approval of the submitted manuscript, agreement to be accountable for all aspects of the work.
CONFLICT OF INTEREST STATEMENT
The authors declare no conflicts of interest.
Supporting information
Data S1.
Villa S, Caretta‐Weyer H, Yarris LM, et al. Development of entrustable professional activities for emergency medicine medical education fellowships: A modified Delphi study. AEM Educ Train. 2024;8:e10944. doi: 10.1002/aet2.10944
Supervising Editor: Sally Santen
REFERENCES
- 1. Jordan J, Ahn J, Diller D, et al. Outcome assessment of medical education fellowships in emergency medicine. AEM Educ Train. 2021;5(4):e10650. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 2. Yarris LM, Jordan J, Coates WC. Education scholarship fellowships: an emerging model for creating educational leaders. J Grad Med Educ. 2016;8(5):668‐673. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 3. EMRA Match. EMRA. Accessed May 30, 2023. https://webapps.acep.org/utils/spa/match#/search/map
- 4. Society for Academic Emergency Medicine. Fellowship Directory. Accessed January 8, 2023. https://member.saem.org/SAEMIMIS/SAEM_Directories/Fellowship_Directory/SAEM_Directories/P/FellowshipList.aspx
- 5. Yarris LM, Coates WC, Lin M, et al. A suggested core content for education scholarship fellowships in emergency medicine. Acad Emerg Med. 2012;19(12):1425‐1433. [DOI] [PubMed] [Google Scholar]
- 6. Coates WC, Lin M, Clark S, et al. Defining a core curriculum for education scholarship fellowships in emergency medicine. Acad Emerg Med. 2012;19(12):1411‐1418. [DOI] [PubMed] [Google Scholar]
- 7. Society for Academic Emergency Medicine. Fellowship Approval Program. Accessed January 8, 2023. https://www.saem.org/about‐saem/Services/fellowship‐approval‐program
- 8. Gottlieb M, Chan T, Clark S, et al. Emergency medicine education research since the 2012 consensus conference: how far have we come and what's next? AEM Educ Train. 2019;22(4):S57‐S66. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 9. Coates WC, Runde DP, Yarris LM, et al. Creating a cadre of fellowship‐trained medical educators: a qualitative study of faculty development program leaders’ perspectives and advice. Acad Med. 2016;91(12):1696‐1704. [DOI] [PubMed] [Google Scholar]
- 10. Jordan J, Gisondi MA, Buchanavage J, et al. Is it worth it? A qualitative analysis of the impact of medical education fellowships on careers. AEM Educ Train. 2022;6(6):e10819. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 11. Jordan J, Yarris L, Santen S, et al. Creating a cadre of fellowship‐trained medical educators part II: a formal needs assessment to structure postgraduate fellowships in medical education scholarship and leadership. Acad Med. 2017;92:1181‐1188. [DOI] [PubMed] [Google Scholar]
- 12. Holmboe ES. The transformational path ahead: competency‐based medical education in family medicine. Fam Med. 2021;53(7):583‐589. [DOI] [PubMed] [Google Scholar]
- 13. ten Cate O, Chen H, Hoff R, Peters H, Bok H, van der Schaaf M. Curriculum development for the workplace using entrustable professional activities (EPAs): AMEE guide No. 99. Med Teach. 2015;37(11):983‐1002. [DOI] [PubMed] [Google Scholar]
- 14. Hart D, Franzen D, Beeson M, et al. Integration of entrustable professional activities with the milestones for emergency medicine residents. West J Emerg Med. 2019;20(1):35‐42. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 15. Pugh D, Cavalcanti R, Halman S, et al. Using the entrustable professional activities framework in the assessment of procedural skills. J Grad Med Educ. 2017;9(2):209‐214. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 16. Greenberg J, Minter R. The future of competency‐based education in surgery may already be here. Ann Surg. 2019;269(3):407‐408. [DOI] [PubMed] [Google Scholar]
- 17. Core Entrustable Professional Activities (EPAs) for Entering Residency: Summary of the 10‐School Pilot, 2014–2021. Association of American Medical Colleges. Accessed February 1, 2023. https://www.aamc.org/about‐us/mission‐areas/medical‐education/cbme/core‐epas
- 18. ten Cate O. Nuts and bolts of entrustable professional activities. J Grad Med Educ. 2013;5(1):157‐158. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 19. Dewey CM, Jonker G, Ten Cate O, Turner TL. Entrustable professional activities (EPAs) for teachers in medical education: has the time come? Med Teach. 2017;39(8):894‐896. [DOI] [PubMed] [Google Scholar]
- 20. van Bruggen L, van Dijk EE, van der Schaaf M, Kluijtmans M, Ten Cate O. Developing entrustable professional activities for university teachers in the health professions. Med Teach. 2022;44(4):425‐432. [DOI] [PubMed] [Google Scholar]
- 21. Gruppen LD, Burkhardt JC, Fitzgerald JT, et al. Competency‐based education: programme design and challenges to implementation. Med Educ. 2016;50(5):532‐539. [DOI] [PubMed] [Google Scholar]
- 22. Zaeri R, Gandomkar R. Developing entrustable professional activities for doctoral graduates in health professions education: obtaining a national consensus in Iran. BMC Med Educ. 2022;22:424. doi: 10.1186/s12909-022-03469-6 [DOI] [PMC free article] [PubMed] [Google Scholar]
- 23. Boyle T, Chou C, Croom N, et al. The Clinician Educator Milestones Project. Accessed October 2, 2023. https://www.acgme.org/what‐we‐do/accreditation/milestones/resources/clinician‐educator‐milestones/
- 24. Hasson F, Keeney S, McKenna HP. Research guidelines for the Delphi survey technique. J Adv Nurs. 2000;32(4):1008‐1015. [PubMed] [Google Scholar]
- 25. Nasa P, Jain R, Juneja D. Delphi methodology in healthcare research: how to decide its appropriateness. World J Methodol. 2021;11(4):116‐129. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 26. Waggoner J, Carline JD, Durning SJ. Is there a consensus on consensus methodology? Descriptions and recommendations for future consensus research. Acad Med. 2016;91(5):663‐668. [DOI] [PubMed] [Google Scholar]
- 27. Entrustable Professional Activities . University of Michigan Medical School. Accessed May 30, 2023. https://medicine.umich.edu/dept/lhs/education‐professional‐development/graduate‐programs/masters‐health‐professions‐education/curriculum/entrustable‐professional‐activities
- 28. Fessler HE, Addrizzo‐Harris D, Beck JM, et al. Entrustable professional activities and curricular milestones for fellowship training in pulmonary and critical care medicine: report of a multisociety working group. Chest. 2014;146(3):813‐834. [DOI] [PubMed] [Google Scholar]
- 29. Robertson S, Kremer P, Aisbett B, Tran J, Cerin E. Consensus on measurement properties and feasibility of performance tests for the exercise and sport sciences: a Delphi study. Sports Med Open. 2017;3(2):2. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 30. Skulimowski AMJ. Expert Delphi survey as a cloud‐based decision support service. 2017 IEEE 10th Conference on Service‐Oriented Computing and Applications (SOCA). IEEE; 2017:190‐197. doi: 10.1109/SOCA.2017.33 [DOI] [Google Scholar]
- 31. Taze D, Hartley C, Morgan AW, Chakrabarty A, Mackie SL, Griffin KJ. Developing consensus in histopathology: the role of the Delphi method. Histopathology. 2022;81(2):159‐167. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 32. Diamond IR, Grant RC, Feldman BM, et al. Defining consensus: a systematic review recommends methodologic criteria for reporting of Delphi studies. J Clin Epidemiol. 2014;67(4):401‐409. [DOI] [PubMed] [Google Scholar]
- 33. Santaguida P, Dolovich L, Oliver D, et al. Protocol for a Delphi consensus exercise to identify a core set of criteria for selecting health related outcome measures (HROM) to be used in primary health care. BMC Fam Pract. 2018;19(1):152. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 34. Taylor DR, Park YS, Egan R, et al. EQual, a novel rubric to evaluate entrustable professional activities for quality and structure. Acad Med. 2017;92:S110‐S117. [DOI] [PubMed] [Google Scholar]
- 35. Meyer E, Taylor D, Uijtdehaage S, et al. EQual rubric evaluation of the Association of American Medical Colleges' core entrustable professional activities for entering residency. Acad Med. 2020;95(11):1755‐1762. [DOI] [PubMed] [Google Scholar]
- 36. Van Melle E, Frank J, Holmboe E, et al. A core components framework for evaluating implementation of competency based medical education programs. Acad Med. 2019;94(7):1002‐1009. [DOI] [PubMed] [Google Scholar]
- 37. Clarke SO, Jordan J, Yarris LM, et al. The view from the top: academic emergency department chairs' perspectives on education scholarship. AEM Educ Train. 2017;2(1):26‐32. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 38. Cataldi ML, Kelly‐Hedrick M, Nanavati J, Chisolm MS, Anne LW. Post‐residency medical education fellowships: a scoping review. Med Educ Online. 2021;26(1):1920084. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 39. Shah N, Desai C, Jorwekar G, Badyal D, Singh T. Competency‐based medical education: an overview and application in pharmacology. Indian J Pharmacol. 2016;48(Suppl 1):S5‐S9. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 40. Ryan M, Holmboe E, Chandra S. Competency‐based medical education: considering its past, present, and a post–COVID‐19 era. Acad Med. 2022;97(3S):S90‐S97. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 41. Hall A, Schumaker D, Thoma B, et al. Outcomes of competency‐based medical education: a taxonomy for shared language. Med Teach. 2021;43(7):788‐793. [DOI] [PubMed] [Google Scholar]
- 42. Ten Cate O, Taylor DR. The recommended description of an entrustable professional activity: AMEE guide No. 140. Med Teach. 2021;43(10):1106‐1114. [DOI] [PubMed] [Google Scholar]
- 43. Warm EJ, Mathis BR, Held JD, et al. Entrustment and mapping of observable practice activities for resident assessment. J Gen Intern Med. 2014;29(8):1177‐1182. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 44. Bates CC, Morgan DN. Seven elements of effective professional development. Read Teach. 2018;71(5):623‐626. [Google Scholar]
- 45. Nevins SR, Floden R. Intensive mentoring as a way to help beginning teachers develop balanced instruction. J Teach Educ. 2009;60(2):112‐122. [Google Scholar]