Abstract
Introduction
The Association of American Medical Colleges (AAMC) proposed thirteen core Entrustable Professional Activities (EPAs) that all graduates should be able to perform under indirect supervision upon entering residency. Because an underlying premise is that graduates ready to do so will be better prepared for the responsibilities of residency, we explored the relationship between postgraduate year (PGY)-1 residents’ self-assessed preparedness to perform core EPAs under indirect supervision at the start of residency and their ease of transition to residency.
Methods
Using response data from a questionnaire administered in September 2019 to PGY-1 residents who graduated from AAMC core EPA pilot schools, we examined between-group differences and independent associations of PGY-1 position type, specialty, and “EPA-preparedness” score (the proportion of EPAs the resident reported being prepared to perform under indirect supervision at the start of residency) with ease of transition to residency (from 1 = much harder to 5 = much easier than expected).
Results
Of 274 questionnaire respondents (19% of 1438 graduates), 241 (88% of 274) had entered PGY-1 training and completed all questionnaire items of interest. EPA-preparedness score (mean 0.71 [standard deviation 0.26]) correlated with ease of transition (3.1 [0.9]; correlation = .291, p < .001). In linear regression controlling for specialty (among other variables), EPA-preparedness score (β-coefficient 1.08; 95% confidence interval .64–1.52; p < .001) predicted ease of transition to residency.
Conclusion
Graduates who felt prepared to perform many of the core EPAs under indirect supervision at the start of PGY-1 training reported an easier-than-expected transition to residency.
Supplementary Information
The online version contains supplementary material available at 10.1007/s40670-021-01370-3.
Keywords: Undergraduate medical education, Entrustable Professional Activities, Residency preparation, Graduate medical education, Specialty, Competency-based medical education
Introduction
In 2014, the Association of American Medical Colleges (AAMC) convened experts across the undergraduate to graduate medical education continuum to develop core Entrustable Professional Activities (EPAs) that all graduates should be ready to perform under indirect supervision upon entering residency regardless of specialty [1]. The final list of 13 core EPAs is shown in Table 1. In 2015, the AAMC convened the core EPA pilot project with ten participating schools to evaluate the feasibility of teaching and assessing the 13 core EPAs, on the premise that those graduates ready to perform these activities under indirect supervision would be better prepared to transition to the responsibilities of residency [2]. Each pilot school implemented curricula and formative assessments in an entrustment framework for at least 4, and up to 13, core EPAs. Although there were differences across schools in specific approaches to core EPA implementation, all schools developed a set of shared guiding principles for their implementation activities [1], summarized in Online Resource 1. Timelines for implementation activities also varied across the 10 schools: 6 schools implemented activities in the clinical years starting with the graduating class of 2019 and 4 schools started doing so with subsequent graduating classes (i.e., the graduating class of 2020 or the graduating class of 2021) [1, 3].
Table 1.
AAMC Core EPAs for Entering Residency [1]. The 13 core Entrustable Professional Activities for entering residency
EPA number | Description |
---|---|
EPA1 | Gather a history and perform a physical examination |
EPA2 | Prioritize a differential diagnosis following a clinical encounter |
EPA3 | Recommend and interpret common diagnostic and screening tests |
EPA4 | Enter and discuss orders/prescriptions |
EPA5 | Document a clinical encounter in the patient record |
EPA6 | Provide an oral presentation of a clinical encounter |
EPA7 | Form clinical questions and retrieve evidence to advance patient care |
EPA8 | Give or receive a patient handover to transition care responsibility |
EPA9 | Collaborate as a member of an interprofessional team |
EPA10 | Recognize a patient requiring urgent or emergent care and initiate evaluation and management |
EPA11 | Obtain informed consent for tests and/or procedures |
EPA12 | Perform general procedures of a physician |
EPA13 | Identify system failures and contribute to a culture of safety and improvement |
EPA, Entrustable Professional Activity
All pilot schools were offered the opportunity for their learners to participate in AAMC-administered surveys about their academic and professional development. We analyzed data previously collected in an early postgraduate year (PGY)-1 survey administered by the AAMC to pilot schools’ class of 2019 graduates (both graduates of schools that had implemented core EPA curricula and formative assessments in the clinical years for the class of 2019 and graduates of schools that had not) to determine whether graduates who felt prepared to perform core EPAs under indirect supervision at the start of PGY-1 training reported an easier-than-expected transition to residency.
Materials and Methods
In September 2019, the AAMC administered a 13-item questionnaire, including the items shown in Online Resource 2, to class of 2019 graduates of nine pilot schools (one school had declined participation) under an Institutional Review Board–approved protocol. The questionnaire included multiple-choice items about specialty, preparedness to perform each EPA under indirect supervision at the start of residency, level of supervision received when each EPA was first performed during residency, and ease of transition to PGY-1 training, as well as open-ended items about medical school activities that were, or could have been, helpful in preparing for PGY-1 training. Three months into PGY-1 training, the AAMC sent email invitations to graduates to complete the confidential online questionnaire; non-respondents received up to four reminders, and no incentives were offered for completion.
In 2020, for our present study, a database of de-identified, individualized records for all nine participating schools’ class of 2019 graduates was created for analysis. As described in more detail below, this database included early PGY-1 questionnaire response data, AAMC Student Records System data [4], and United States Medical Licensing Examination (USMLE) Step 2 Clinical Knowledge (CK) scores and Step 2 Clinical Skills (CS) results.
Early PGY-1 Questionnaire Data
Based on responses to the item, “Are you currently doing a preliminary/transitional year of training?,” we created a dichotomous position-type variable (“preliminary” including all preliminary/transitional year responses vs. “categorical” for all “categorical position” responses). Based on responses to items about specialty and position type, we created a seven-category variable for PGY-1 training specialty, including “internal medicine/neurology” (categorical internal medicine [alone or combined], categorical neurology, and preliminary internal medicine), “family medicine” (categorical family medicine and transitional year), “pediatrics” (all pediatrics), “surgery” (all categorical and preliminary surgery specialties), “emergency medicine” (all emergency medicine), “obstetrics and gynecology” (all obstetrics and gynecology), and “other unspecified” (all other positions not specified in the preceding categories, e.g., categorical psychiatry, categorical anesthesiology, categorical pathology).
Based on responses to the item about preparedness to perform each of 13 different activities, corresponding to core EPAs (with “intravenous [IV] line placement” selected as the procedure for EPA12: Perform general procedures of a physician) under indirect supervision at the start of residency, we created a dichotomous variable for preparedness to perform each activity under indirect supervision (yes [“I was prepared to perform this activity under indirect supervision”] vs. no [“I was prepared to do this activity under direct supervision” and “I was not prepared to do this activity”]).
Our main outcome of interest, ease of transition to residency, was based on responses to the item, “How was your transition from medical student to PGY-1 resident regarding the responsibilities you assumed professionally?” Response options included “much harder than I expected,” “somewhat harder than I expected,” “just about as I expected (not easier or harder),” “somewhat easier than I expected,” and “much easier than I expected.” We analyzed this outcome as a continuous variable, from 1 = “much harder than I expected” to 5 = “much easier than I expected.”
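The variable codings described above can be sketched as follows. This is an illustrative Python sketch, not the study’s actual analysis code (the study used Stata); the function names are ours, and the response strings paraphrase the questionnaire options.

```python
# Illustrative sketch of the variable codings described above (hypothetical
# helper names; the study's analyses were performed in Stata).

def code_preparedness(response: str) -> int:
    """Dichotomize preparedness: 1 if prepared under indirect supervision,
    0 for the direct-supervision and not-prepared responses."""
    indirect = "I was prepared to perform this activity under indirect supervision"
    return 1 if response == indirect else 0

# Ease of transition coded as a continuous 1-5 outcome.
TRANSITION_SCALE = {
    "much harder than I expected": 1,
    "somewhat harder than I expected": 2,
    "just about as I expected (not easier or harder)": 3,
    "somewhat easier than I expected": 4,
    "much easier than I expected": 5,
}

def code_transition(response: str) -> int:
    """Map an ease-of-transition response to its 1-5 numeric code."""
    return TRANSITION_SCALE[response]
```
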
Based on responses to two open-ended questions (“During medical school, what activities/experiences helped you prepare for the start of your PGY-1 training?” and “How can medical schools better prepare students for the start of PGY-1 training?”), two AAMC research staff inductively developed organizational and classification codes to describe activities and experiences.
AAMC Student Records System Data
We used AAMC Student Records System (SRS) data for each graduate’s medical school, sex, age, and degree program. We created a dichotomous variable for degree program (all combined MD/other vs. MD). We created a dichotomous variable for each graduate’s medical school based on the school’s timeline for implementation of core EPAs (i.e., curriculum content and formative assessments) in an entrustment framework (implemented for class of 2019 graduates vs. not implemented for class of 2019 graduates).
USMLE Step 2 CK and Step 2 CS Data
The National Board of Medical Examiners (NBME) provided permission for release of first attempt USMLE Step 2 CK 3-digit scores and Step 2 CS pass/fail results for graduates in our database. We included Step 2 CK and Step 2 CS as non-self-reported measures that have content alignment with one or more core EPAs and have been shown by other investigators to be associated with performance during residency [5–8].
Data Analysis
Records from these different sources were linked using unique AAMC identifiers to create a single database of individual-level, de-identified records for analysis. We used chi-square tests, Pearson correlations, and analysis of variance to test the significance of bivariate associations, and multivariable linear regression to identify predictors of ease of transition to residency. We performed all quantitative analyses using Stata 15 (StataCorp, College Station, TX) and categorized all questionnaire respondents’ open-ended comments using Atlas.ti 6.2 (ATLAS.ti Scientific Software Development GmbH, Berlin, Germany).
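The bivariate correlation analysis can be sketched as follows. This is a minimal, from-first-principles Python illustration on invented toy values (the study’s analyses were run in Stata); it shows only the Pearson correlation step, not significance testing.

```python
# Minimal sketch of a Pearson correlation, as used in the bivariate
# analyses; the toy data below are invented for illustration only.
from math import sqrt

def pearson_r(x, y):
    """Pearson correlation coefficient between two equal-length sequences."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sqrt(sum((a - mx) ** 2 for a in x))
    sy = sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

# Hypothetical toy data: EPA-preparedness scores vs. 1-5 ease-of-transition
# ratings for five residents.
prep = [0.2, 0.5, 0.7, 0.9, 1.0]
ease = [2, 3, 3, 4, 4]
r = pearson_r(prep, ease)  # positive correlation on this toy sample
```
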
Results
Of the nine schools’ 1575 class of 2019 graduates, valid email addresses were available for 1438 (91% of 1575); these 1438 graduates who received the questionnaire invitation comprised the potential respondents’ group. Of these 1438 graduates, 274 (19%) completed the questionnaire at least in part (“respondents”) and 1164 (81%) did not (“non-respondents”). Of the 274 respondents, 241 (88%) who responded to all study items of interest comprised our sample. Characteristics of all 1438 graduates, grouped by respondent status and final study sample inclusion status, are shown in Table 2. As shown, those included in the final sample differed from those excluded by sex (p = .029) and Step 2 CK score (p = .023) but not by degree program (p = .913), medical school implementation status (p = .682), age (p = .870), or Step 2 CS results (p = .336).
Table 2.
Characteristics of graduates invited to complete the early PGY-1 questionnaire, grouped by respondent status and by final sample inclusion status, N = 1438
Characteristic | Totala N = 1438 | Respondentsa n = 274 | Non-respondentsa n = 1164 | p-value | Included in final samplea n = 241 | Not included in final samplea n = 1197 | p-value |
---|---|---|---|---|---|---|---|
Sex | |||||||
Men | 775 (54) | 133 (49) | 642 (55) | .055b | 114 (48) | 661 (55) | .029b |
Women | 662 (46) | 140 (51) | 522 (45) | 126 (52) | 536 (45) | ||
Missing | 1 (< 1) | 1 (< 1) | 0 (0) | 1 (< 1) | 0 (0) | ||
Degree program | |||||||
MD | 1286 (89) | 247 (90) | 1039 (89) | .668 | 216 (90) | 1070 (89) | .913 |
Combined MD/other | 152 (11) | 27 (10) | 125 (11) | 25 (10) | 127 (11) | ||
Medical school core EPA implementation status | |||||||
Not implemented for class of 2019 | 781 (54) | 147 (54) | 634 (54) | .807 | 128 (53) | 653 (55) | .682 |
Implemented for class of 2019 | 657 (46) | 127 (46) | 530 (46) | 113 (47) | 544 (45) | ||
N = 1431 | n = 274 | n = 1157 | n = 241 | n = 1190 | |||
USMLE Step 2 CS | |||||||
Pass | 1360 (95) | 262 (96) | 1098 (95) | .622 | 232 (96) | 1128 (95) | .336 |
Fail | 71 (5) | 12 (4) | 59 (5) | 9 (4) | 62 (5) | ||
Mean [SD] | Mean [SD] | Mean [SD] | Mean [SD] | Mean [SD] | |||
Age (years) | 28.5 [2.7] | 28.5 [2.9] | 28.5 [2.6] | .978 | 28.5 [23.1] | 28.5 [2.6] | .870 |
N = 1432 | n = 274 | n = 1158 | n = 241 | n = 1191 | |||
Mean [SD] | Mean [SD] | Mean [SD] | Mean [SD] | Mean [SD] | |||
USMLE Step 2 CK score | 244.0 [16.5] | 245.5 [16.0] | 243.6 [16.6] | .081 | 246.2 [15.5] | 243.5 [16.6] | .023 |
PGY postgraduate year, EPA Entrustable Professional Activities, USMLE United States Medical Licensing Examination, CK clinical knowledge, CS clinical skills
aPercentages shown are for column percentages within each category; totals may not add up to 100% due to rounding
bP-value for 2 × 2 chi-square that excluded 1 individual for whom information regarding sex was missing
Table 3 shows the study sample characteristics grouped by ease of transition. As shown, the mean [standard deviation] ease of transition was 3.1 [0.9]—indicating, on average, the transition to residency was just about as respondents had expected it would be (response options range from 1 = “much harder than I expected” to 5 = “much easier than I expected”). Not shown in Table 3, neither age (correlation = .04; p = .540) nor Step 2 CK score (correlation = −.06; p = .380) was associated with ease of transition.
Table 3.
Characteristics of PGY-1 residents grouped by ease of transition, N = 241
Characteristic | # (%)a | Mean [SD] ease of transitionb | p-value |
---|---|---|---|
Sexc | 240 | 3.1 [0.9] | .790 |
Men | 114 (48) | 3.1 [0.9] | |
Women | 126 (52) | 3.1 [1.0] | |
Degree program | 241 | 3.1 [0.9] | .083 |
MD | 216 (90) | 3.1 [0.9] | |
Combined MD/other | 25 (10) | 3.4 [1.0] | |
Medical school | 241 | 3.1 [0.9] | .389 |
Not implemented | 128 (53) | 3.0 [0.9] | |
Implemented | 113 (47) | 3.2 [0.9] | |
USMLE Step 2 CS | 241 | 3.1 [0.9] | .959 |
Pass | 232 | 3.1 [0.9] | |
Fail | 9 | 3.1 [1.1] | |
PGY-1 specialty | 241 | 3.1 [0.9] | .070 |
Internal medicine | 82 (34) | 3.1 [0.9] | |
Surgery | 42 (17) | 3.1 [0.7] | |
Pediatrics | 30 (12) | 3.2 [1.0] | |
Emergency medicine | 23 (10) | 3.0 [0.7] | |
Obstetrics and gynecology | 18 (7) | 2.5 [0.9] | |
Family medicine | 23 (10) | 3.3 [1.2] | |
All other/unspecified | 23 (10) | 3.4 [1.0] | |
PGY-1 position type | 241 | 3.1 [0.9] | .087 |
Preliminary | 40 (17) | 3.3 [1.0] | |
Categorical | 201 (83) | 3.0 [0.9] |
PGY postgraduate year, SD standard deviation, EPA Entrustable Professional Activity, USMLE United States Medical Licensing Examination, CS clinical skills
aPercentages shown are for column percentages within each category; totals may not add up to 100% due to rounding
bValues shown are based on the following scale: 1 = “much harder than I expected,” 2 = “somewhat harder than I expected,” 3 = “just about as I expected (not easier or harder),” 4 = “somewhat easier than I expected,” 5 = “much easier than I expected”
cSex information not available for one graduate
Table 4 shows associations between reported preparedness to perform each of the 13 EPAs under indirect supervision at the start of residency and ease of transition. As shown, preparedness to perform 11 of the 13 EPAs under indirect supervision at the start of residency was associated (each p < .05) with ease of transition; neither EPA1 nor EPA5 was associated with ease of transition (p = .533 and p = .221, respectively). As also shown in Table 4, the proportion of residents who were prepared at the start of the PGY-1 year to perform each activity under indirect supervision varied across activities, from 34% (81/240) for EPA12 to 97% (232/238) for EPA1.
Table 4.
Preparedness to perform each activity at the start of the PGY-1 year under indirect supervision grouped by ease of transition
“I was prepared to do this activity under indirect supervision” | # (%)a | Mean [SD] ease of transitionb | p-value |
---|---|---|---|
1. Gather history/perform physical examination | 238 | 3.1 [0.9] | .533 |
Yes | 232 (97) | 3.1 [0.9] | |
No | 6 (3) | 3.3 [1.5] | |
2. Prioritize differential diagnosis following clinical encounter | 241 | 3.1 [0.9] | .039 |
Yes | 210 (87) | 3.1 [0.9] | |
No | 31 (13) | 2.8 [1.0] | |
3. Recommend/interpret common diagnostic tests/procedures | 241 | 3.1 [0.9] | .009 |
Yes | 158 (66) | 3.2 [0.9] | |
No | 83 (34) | 2.9 [0.9] | |
4. Enter and discuss orders and prescriptions | 241 | 3.1 [0.9] | .004 |
Yes | 109 (45) | 3.3 [0.9] | |
No | 132 (55) | 2.9 [0.9] | |
5. Document clinical encounter in patient record | 241 | 3.1 [0.9] | .221 |
Yes | 217 (90) | 3.1 [0.9] | |
No | 24 (10) | 2.9 [1.0] | |
6. Provide oral presentation of clinical encounter | 241 | 3.1 [0.9] | .014 |
Yes | 220 (91) | 3.1 [0.9] | |
No | 21 (9) | 2.6 [1.0] | |
7. Form clinical questions and retrieve evidence to advance patient care | 240 | 3.1 [0.9] | .003 |
Yes | 201 (84) | 3.2 [0.9] | |
No | 39 (16) | 2.7 [1.0] | |
8. Give/receive patient handover to transition care responsibility | 240 | 3.1 [0.9] | < .001 |
Yes | 160 (67) | 3.3 [0.8] | |
No | 80 (33) | 2.8 [1.0] | |
9. Collaborate as member of interprofessional team | 239 | 3.1 [0.9] | .007 |
Yes | 200 (84) | 3.2 [0.9] | |
No | 39 (16) | 2.7 [1.0] | |
10. Recognize a patient requiring urgent/emergent care and initiate evaluation and management | 240 | 3.1 [0.9] | .006 |
Yes | 124 (52) | 3.3 [0.9] | |
No | 116 (48) | 2.9 [0.9] | |
11. Obtain informed consent for tests and/or procedures | 240 | 3.1 [0.9] | .008 |
Yes | 146 (61) | 3.2 [0.9] | |
No | 94 (39) | 2.9 [1.0] | |
12. Insert an intravenous (IV) line | 240 | 3.1 [0.9] | .036 |
Yes | 81 (34) | 3.3 [0.8] | |
No | 159 (66) | 3.0 [1.0] | |
13. Identify system failures and contribute to a culture of safety and improvement | 239 | 3.1 [0.9] | < .001 |
Yes | 134 (56) | 3.3 [0.8] | |
No | 105 (44) | 2.9 [1.0] |
PGY, postgraduate year; SD, standard deviation
aPercentages shown are for column percentages within each category; totals may not add up to 100% due to rounding
bValues shown are based on the following scale: 1 = “much harder than I expected,” 2 = “somewhat harder than I expected,” 3 = “just about as I expected (not easier or harder),” 4 = “somewhat easier than I expected,” 5 = “much easier than I expected”
As shown in Online Resource 3, which tabulates resident preparedness to perform each activity under indirect supervision at the start of PGY-1 training grouped by specialty, the proportion of residents who were prepared at the start of the PGY-1 year to perform each activity under indirect supervision varied by specialty for EPAs 2, 4, 7, 12, and 13, as follows. For EPA2, the overall proportion of 87% (210/241) ranged by specialty (p = .044) from 70% (16/23; “all other specialties”) to 98% (41/42, “surgery”). For EPA4, the overall proportion of 45% (109/241) ranged by specialty (p = .001) from 17% (3/18, “obstetrics and gynecology”) to 67% (28/42, “surgery”). For EPA7, the overall proportion of 84% (201/240) ranged by specialty (p = .048) from 70% (16/23; “all other specialties”) to 96% (22/23, “family medicine”). For EPA12, the overall proportion of 34% (81/240) ranged by specialty (p = .004) from 6% (1/18, “obstetrics and gynecology”) to 57% (13/23, “emergency medicine”). Finally, for EPA13, the overall proportion of 56% (134/239) ranged by specialty (p = .005) from 30% (7/23, “family medicine”) to 81% (34/42, “surgery”).
Online Resource 4 tabulates associations between preparedness to perform each of these 13 activities under indirect supervision at the start of residency and the level of supervision (direct or indirect) the first time the resident performed the activity during residency; these associations were significant for all 13 activities (each p < .05). Residents who had not felt prepared to perform an activity under indirect supervision were generally underrepresented among those who reported performing it under indirect supervision the first time and overrepresented among those who reported performing it under direct supervision the first time and those who reported not yet having performed it during residency. Numbers of residents who reported initially performing an activity under indirect supervision despite not having felt prepared to do so were generally low, ranging from 2 residents for EPA1 (2/236, 1%) to 36 residents for EPA11 (36/237, 15%). Finally, most residents had performed EPAs 1–11 since starting residency, but far fewer had performed EPA12 and EPA13.
Due to the small study sample size, we took a conservative approach to inclusion of variables to examine as possible predictors of ease of transition, including in the multivariable linear regression model all variables significant at p < .10 in bivariate analysis. We created a composite EPA-preparedness score as the proportion of EPAs that the resident had felt prepared to perform under indirect supervision at the start of residency, from among those EPAs that were associated (at p < .10) with ease of transition in bivariate analysis (Table 4) and that essentially all residents had performed since starting residency. Thus, this composite EPA-preparedness score included preparedness to perform each of 9 activities: EPAs 2–4 and 6–11. Among all 241 residents for these 9 EPA items, there were 9 sporadic missing responses (9/2169; 0.4%); individual denominators were adjusted accordingly in calculating composite EPA-preparedness scores for residents with missing responses. The EPA-preparedness score (mean = 0.71 [standard deviation = 0.26]) correlated with ease of transition (correlation = 0.291; p < .001) and was also higher among the 113 graduates of schools that had implemented core EPAs for their class of 2019 graduates than among the 128 graduates of schools that had not done so (0.76 [0.25] vs. 0.66 [0.26], respectively; p = .003). We included EPA-preparedness score, degree program, PGY-1 position type, and PGY-1 specialty in the linear regression model, as each of these variables was associated at p < .10 with ease of transition in bivariate analysis. The multivariable linear regression results are shown in Fig. 1. As shown, both EPA-preparedness score (β-coefficient 1.08; 95% confidence interval .64–1.52; p < .001) and PGY-1 specialty (“surgery” [β-coefficient −.53; 95% confidence interval −.98 to −.08; p = .022] and “obstetrics and gynecology” [β-coefficient −.89; 95% confidence interval −1.43 to −.35; p = .001], each vs. “all other/unspecified specialties”) predicted ease of transition. Combined MD/other (vs. MD) degree program graduation (β-coefficient −.13; 95% confidence interval −.51 to .24; p = .493) and preliminary (vs. categorical) position type (β-coefficient −.22; 95% confidence interval −.54 to .11; p = .196) were not independently associated with ease of transition.
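The composite-score construction, including the denominator adjustment for sporadic missing item responses, can be sketched as follows. This is an illustrative Python sketch (the study's computations were done in Stata); the function name is ours, and the example response lists are invented.

```python
# Illustrative sketch of the composite EPA-preparedness score: the
# proportion of the nine retained items (EPAs 2-4 and 6-11) the resident
# reported being prepared to perform under indirect supervision.
# (Hypothetical helper; the study used Stata.)

def epa_preparedness_score(item_responses):
    """Each element is 1 (prepared under indirect supervision),
    0 (not prepared), or None (missing). The denominator counts only
    non-missing items, mirroring the adjustment described in the text."""
    answered = [r for r in item_responses if r is not None]
    if not answered:
        return None  # no usable items
    return sum(answered) / len(answered)

# Invented examples:
# prepared on 6 of 9 items -> score 6/9
score_complete = epa_preparedness_score([1, 1, 1, 0, 1, 0, 1, 0, 1])
# one missing item, prepared on 5 of the 8 answered -> score 5/8
score_with_missing = epa_preparedness_score([1, 1, None, 0, 1, 0, 1, 0, 1])
```
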
Fig. 1.
Results of multivariable linear regression model predicting ease of transition, N = 241. Abbreviations: EPAs, Entrustable Professional Activities; OBGYN, obstetrics and gynecology; PGY, postgraduate year. a“EPA-preparedness score” ranges from 0 to 1 and refers to the proportion of nine EPAs (EPAs 2–4 and 6–11) that the respondent reported being prepared to perform under “indirect supervision” at the start of residency
Detailed results of the open-ended comment categorization are shown in Online Resource 5, which shows the classification of codes describing activities/experiences from comments made in response to open-ended questions, and Online Resource 6, which shows examples of these verbatim comments. For the question, “During medical school, what activities/experiences helped you prepare for the start of your PGY-1 training?”, hands-on clinical experiences were the most frequently described (n = 161, including 47 respondents who explicitly noted sub-internships or acting internships). For the question, “How can medical schools better prepare students for the start of PGY-1 training?”, practical learning/training was the most frequently described category (n = 101).
Discussion
Residents who reported that they had been prepared to perform core EPAs under indirect supervision at the start of training felt that their transition to residency was easier than expected. PGY-1 specialty was also independently associated with ease of transition. Our study differs from several other studies of trainees’ self-assessments of their preparedness to perform core EPAs in timing of administration: individuals responded to our questionnaire about their preparedness to perform these activities at the start of residency after they had completed several months of residency, rather than at resident orientation or at medical school graduation [9–11]. Our study also differs from previous work on the level of supervision provided to new residents performing various activities. For items on our questionnaire about the level of supervision residents reported receiving the first time they performed various core EPAs during PGY-1 training, we defined “supervisor” to include a more senior resident, fellow, or attending physician. This approach addresses a limitation of a previous national survey administered by the NBME to residents regarding the supervision they received in their initial months of training when performing a range of activities (including some that were subsequently described as core EPAs) [12]. In that survey, items about supervision pertained only to supervision by faculty; as the authors noted, “… our findings are tempered by certain limitations. Although our phrasing of the supervision question in terms of the attending physician was consistent with prior research, the fact remains that new residents often receive supervision from more senior residents …” [12].
Consistent with other studies of learners’ perspectives, we observed differences across EPAs in learners’ self-reported preparedness to perform these activities [9–11]. Supervisors have similarly reported differences across EPAs in their perceptions of learners’ preparedness to perform these activities [11, 13]. At least 90% of graduates in our study sample felt prepared to perform EPAs 1, 5, and 6 under indirect supervision at the start of residency, EPAs that were identified as “reporter” level in the Reporter-Interpreter-Manager-Educator (“RIME”) framework of the 13 core EPAs [14]. In contrast to some other core EPAs, learners may have extensive opportunities throughout the curriculum to practice these three core EPAs with feedback, gradually progressing during medical school to less supervision. In a single-institution study of the extent to which activities described as core EPAs were already incorporated in faculty assessments of students, Colbert-Goetz and colleagues noted that EPAs 1, 5, and 6, as well as 2 and 9, encompassed responsibilities often assigned to clerkship students; written comments by their physician assessors most frequently aligned with these EPAs [15]. Unlike EPAs 1, 5, and 6, there were several EPAs, including EPAs 4, 8, 10, and 11, that most residents in our study had performed since starting training but that relatively lower proportions felt, in retrospect, they had been prepared to perform under indirect supervision at the start of training. Relatively lower levels of preparedness for these EPAs have also been reported by others [9, 11, 13]. Focused curricular content and assessment for these activities, such as has been described for EPA8 [16–18], may be warranted for students and incoming residents; residents in our study specifically commented on the need for more opportunities to practice EPAs 4 and 8 prior to graduation.
Finally, many residents in our study reported that they had not performed EPA12 or EPA13 since starting residency.
Our study has notable limitations. One limitation is the low questionnaire response rate; similarly low rates for questionnaires administered to residents already engaged in training have been reported by others [19, 20]. We also observed selection bias in our final sample by sex and Step 2 CK score; however, neither of these variables was associated with our outcome of interest. We also note that, despite our relatively small study sample, the proportional distributions of PGY-1 specialties and position types among residents in our study sample were generally similar to the proportional distributions of specialty and position-type matches for class of 2019 graduates nationally [21, 22], as shown in Online Resource 7. Our study sample composition regarding sex and degree program was similar to the national composition among AAMC 2019 Graduation Questionnaire respondents [10].
Our findings about preparedness to perform EPAs and ease of transition to residency reflect residents’ self-assessments only and are subject to recall bias. Several studies have documented gaps, at various stages in training, between learners’ EPA self-assessments and supervisors’ assessments of their trainees as a group; learners as a group generally rated themselves more highly than did supervisors [11, 23]. The non-self-assessment data available to us were standardized test performance data collected by the NBME (Step 2 CK and Step 2 CS); we did not have access to identified data from supervisors’ assessments or other performance assessments (e.g., in-service scores, milestone ratings as reported to the Accreditation Council for Graduate Medical Education), which is a further limitation of our study.
All residents in our study had attended Liaison Committee for Medical Education (LCME)-accredited core EPA pilot schools; thus, our findings may not generalize to graduates of non-pilot LCME-accredited or non-LCME-accredited schools. Previous investigators have reported EPA-specific differences in incoming residents’ self-assessment of readiness to perform EPAs in association with type of medical school attended [9]. Finally, our study was an observational retrospective study; although there was an association between core EPA preparedness score and ease of transition, causality cannot be inferred.
Educators at medical schools implementing core EPAs in an entrustment framework may find the Guiding Principles shared across the core EPA pilot schools (shown in Online Resource 1) of value [1]. For some EPAs, acting internships/sub-internships (rotations that typically provide students increased autonomy and clinical responsibilities) rather than third-year clerkship rotations may provide more substantive opportunities for students to regularly perform, and get feedback on, many of the core EPAs. Investigators at one medical school participating in the AAMC core EPA pilot recently reported on the design of an acting internship curriculum focused on EPA4, EPA6, EPA8, EPA9, and EPA10; this may serve as a useful model for other educators interested in incorporating core EPA curricula in acting internships/sub-internships [24].
It is important to note that all pilot schools in our study implemented the core EPAs for formative assessment purposes only; workplace-based assessments, generally implemented in a student-driven process, were provided as formative feedback. Theoretical decisions about graduating students’ readiness for entrustment in core EPAs under indirect supervision have been made only for program evaluation and process improvement purposes among core EPA pilot schools; these decisions have not had any impact on student advancement or graduation, nor is this information included in Medical Student Performance Evaluations or shared with residency program directors [25].
An important area for future research regarding the core EPAs is examination of the extent to which decisions made about students’ readiness to perform core EPAs under indirect supervision, among other evidence of students’ skills in performing core EPAs, may be associated with subsequent performance during residency—for example, as judged by program directors. The Coalition for Physician Accountability’s Undergraduate-Graduate Review Committee recommendations, recently released for public comment, noted that “… challenges in the transition between medical school and residency that are negatively impacting the UME-GME transition … include overreliance on licensure examination scores in the absence of valid, trustworthy measures of students’ competence and clinical abilities …” [26]. Thus, such evidence about students’ skills in performing core EPAs could inform criteria used in the resident selection process in the future. Information about readiness to perform core EPAs under indirect supervision, or other evidence of students’ skills in performing core EPAs, may also have value as part of a post-match “warm handover” from undergraduate medical education (UME) to graduate medical education (GME) [27].
Although we identified several variables associated with ease of transition, there are likely other contributory variables not included in our study. For example, in a national survey of internal medicine residents about important skills for internship, “identifying when to seek additional help and expertise” and “prioritizing clinical tasks and managing time efficiently” were two of the most highly ranked of 10 listed skills [28]. Residents in that survey selected the sub-internship/acting internship as the most helpful course in preparing for internship, an observation aligned with residents’ comments about the value of these experiences in our study [28]. Thus, future research examining the extent to which graduates’ participation in specific required or elective curricular experiences during medical school (e.g., “night on call” requirements, sub-internships/acting internships, residency preparation courses) may be associated with ease of transition could be informative.
Finally, there may be specialty-specific differences in program directors’ expectations for their incoming PGY-1 residents regarding readiness to perform the core EPAs under indirect supervision [29, 30]. In a national survey of internal medicine program directors, most respondents felt it was not necessary that incoming PGY-1 residents be ready to perform EPA12 under indirect supervision at the start of training but indicated that they must or should be ready to perform all the other core EPAs [29]. A recently published consensus statement from the Association of Program Directors in Surgery on ideal senior medical student experiences for preparedness for general surgery internship includes the recommendation, “We support that medical students should achieve entrustability in the AAMC Core EPAs” [30]. Recognizing these specialty-specific differences in program directors’ expectations could inform medical school efforts to prepare students for the transition to residency (e.g., through specialty-specific residency preparation courses developed in collaboration with, and with the support of, program directors in various specialties).
Conclusion
Graduates who felt prepared to perform many of the core EPAs under indirect supervision at the start of PGY-1 training reported an easier-than-expected transition to residency. Our findings provide support for medical schools’ efforts to prepare their students for readiness to perform many of the core EPAs under indirect supervision at the start of residency and can also inform specialty-specific efforts by program directors and other graduate medical educators, in close collaboration with undergraduate medical educators, to further optimize the UME to GME transition [26, 27, 31, 32].
Supplementary Information
Below is the link to the electronic supplementary material.
Acknowledgements
We thank Lynn Shaull, senior research analyst at the Association of American Medical Colleges, Washington, DC, USA, for assistance with classification of the activities/experiences reported by respondents from the open-ended questions. We thank the National Board of Medical Examiners, Philadelphia, PA, USA, for permission to use United States Medical Licensing Examination Step 2 Clinical Knowledge and Step 2 Clinical Skills data. Preliminary results of this analysis were presented at the AMEE virtual meeting, September 2020.
Funding
This work was supported by the Association of American Medical Colleges and by the medical schools participating in the core EPAs for entering residency pilot. All participating pilot institutions and individuals can be found at https://www.aamc.org/initiatives/coreepas/pilotparticipants/.
Availability of Data and Material
The data that support the findings of this study are available from the Association of American Medical Colleges and the National Board of Medical Examiners. Because these data were collected on a confidential, identified basis, restrictions apply to the availability of these data to protect the identity of participating individuals and of participating institutions in this study. Requests for AAMC data can be submitted at https://www.aamc.org/request-aamc-data and requests for NBME data can be submitted at https://nbme.org/services/request-data.
Declarations
Ethics Approval
The AAMC Human Subjects Protection Program staff reviewed this study and determined that it was exempt from further Institutional Review Board review because it does not involve human subjects.
Informed Consent
Not applicable.
Conflict of Interest
The authors are all directly involved in the Core EPA pilot in various roles. Three authors are employed by the organization (AAMC) that convened the Core EPA pilot; the remaining five authors are faculty at medical schools participating in the core EPAs for entering residency pilot and are involved in core EPA implementation at their respective schools. The authors declare no competing interests.
Footnotes
Publisher's Note
Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.
References
- 1.Lomis K, Amiel JM, Ryan MS, Esposito K, Green M, Stagnaro-Green A, et al. Implementing an entrustable professional activities framework in undergraduate medical education: early lessons from the AAMC core entrustable professional activities for entering residency pilot. Acad Med. 2017;92(6):765–70. doi: 10.1097/ACM.0000000000001543. [DOI] [PubMed]
- 2.Brown D, Warren JB, Hyderi A, Drusin RE, Moeller J, Rosenfeld M, et al. Finding a path to entrustment in undergraduate medical education: a progress report from the AAMC core entrustable professional activities for entering residency entrustment concept group. Acad Med. 2017;92(6):774–9. doi: 10.1097/ACM.0000000000001544. [DOI] [PubMed]
- 3.Moeller JJ, Warren JB, Crowe RM, Wagner DP, Cutrer WB, Hyderi AA, et al. Determining entrustment: a progress report from the AAMC core EPA entrustment concept group. Med Sci Educ. 2020;30:395–401. doi: 10.1007/s40670-020-00918-z. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 4.Association of American Medical Colleges. Student Records System. Association of American Medical Colleges website. © 2019. In https://www.staging.aamc.org/services/srs/. Accessed December 12, 2020.
- 5.Winward ML, Lipner RS, Johnston MM, Cuddy MM, Clauser BE. The relationship between communication scores from the USMLE Step 2 Clinical Skills examination and communication ratings for first-year internal medicine residents. Acad Med. 2013;88(5):693–698. doi: 10.1097/ACM.0b013e31828b2df1. [DOI] [PubMed] [Google Scholar]
- 6.Cuddy MM, Winward ML, Johnston MM, Lipner RS, Clauser BE. Evaluating validity evidence for USMLE Step 2 Clinical Skills data gathering and data interpretation scores: does performance predict history-taking and physical examination ratings for first-year internal medicine residents? Acad Med. 2016;91(1):133–139. doi: 10.1097/ACM.0000000000000908. [DOI] [PubMed] [Google Scholar]
- 7.Marcus-Blank B, Dahlke JA, Braman JP, Borman-Shoap E, Tiryaki E, Chipman J, et al. Predicting performance of first-year residents: correlations between structured interview, licensure exam, and competency scores in a multi-institutional study. Acad Med. 2019;94(3):378–387. doi: 10.1097/ACM.0000000000002429. [DOI] [PubMed] [Google Scholar]
- 8.Sharma A, Schauer DP, Kelleher M, Kinnear B, Sall D, Warm E. USMLE Step 2 CK: best predictor of multimodal performance in an internal medicine residency. J Grad Med Educ. 2019;11(4):412–9. doi: 10.4300/JGME-D-19-00099.1. [DOI] [PMC free article] [PubMed]
- 9.Pearlman RE, Pawelczak MA, Bird JB, Yacht AC, Farina GA. Incoming interns perceived preparedness for core entrustable professional activities. Med Sci Educ. 2019;29:247–253. doi: 10.1007/s40670-018-00685-y. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 10.Association of American Medical Colleges. Medical School Graduation Questionnaire. 2019 All schools’ report. In https://www.aamc.org/system/files/2019-08/2019-gq-all-schools-summary-report.pdf. Accessed May 30, 2021.
- 11.Lindeman BM, Sacks BC, Lipsett PA. Graduating students’ and surgery program directors’ views of the Association of American Medical Colleges core entrustable professional activities for entering residency: where are the gaps? J Surg Educ. 2015;72:e184–e192. doi: 10.1016/j.jsurg.2015.07.005. [DOI] [PubMed] [Google Scholar]
- 12.Raymond MR, Mee J, King A, Haist SA, Winward ML. What new residents do during their initial months of training. Acad Med. 2011;86(10):S59–S62. doi: 10.1097/ACM.0b013e31822a70ff. [DOI] [PubMed] [Google Scholar]
- 13.Pearlman RE, Pawelczak M, Yacht AC, Akbar S, Farina GA. Program director perceptions of proficiency in the core entrustable professional activities. J Grad Med Educ. 2017;9:588–592. doi: 10.4300/JGME-D-16-00864.1. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 14.Meyer EG, Kelly WF, Hemmer PA, Pangaro LN. The RIME model provides a context for entrustable professional activities across undergraduate medical education. Acad Med. 2018;93(6):954. doi: 10.1097/ACM.0000000000002211. [DOI] [PubMed] [Google Scholar]
- 15.Colbert-Getz JM, Lappie K, Northrup M, Roussel D. To what degree are the 13 entrustable professional activities already incorporated into physicians’ performance schemas for medical students? Teach Learn Med. 2019;31(4):361–369. doi: 10.1080/10401334.2019.1573146. [DOI] [PubMed] [Google Scholar]
- 16.Gaffney S, Farnan JM, Hirsch K, McGinty M, Arora VM. The modified, multi-patient observed simulated handoff experience (M-OSHE): assessment and feedback for entering residents on handoff performance. J Gen Intern Med. 2016;31(4):438–441. doi: 10.1007/s11606-016-3591-8. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 17.Lescinskas E, Stewart D, Shah C. Improving handoffs: implementing a training program for incoming internal medicine residents. J Grad Med Educ. 2018;10(6):698–701. doi: 10.4300/JGME-D-18-00244.1. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 18.Posel N, Hoover ML, Bergman S, Grushka J, Rosenzveig A, Fleiszer D. Objective assessment of the entrustable professional activity handover in undergraduate and postgraduate surgical learners. J Surg Educ. 2019;76(5):1258–1266. doi: 10.1016/j.jsurg.2019.03.008. [DOI] [PubMed] [Google Scholar]
- 19.Stan VA, Correa R, Deslauriers JR, Faynboym S, Shah T, Widge AS. Support, technology and mental health: correlates of trainee workplace satisfaction. Perspect Med Educ. 2020;9:31–40. doi: 10.1007/s40037-019-00555-2. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 20.Dyrbye LN, Satele D, Sloan J, Shanafelt TD. Ability of the physician well-being index to identify residents in distress. J Grad Med Educ. 2014;6(1):78–84. doi: 10.4300/JGME-D-13-00117.1. [DOI] [PMC free article] [PubMed]
- 21.National Resident Matching Program, Results and Data: 2019 Main Residency Match®. National Resident Matching Program, Washington, DC. 2019. In https://mk0nrmp3oyqui6wqfm.kinstacdn.com/wp-content/uploads/2019/04/NRMP-Results-and-Data-2019_04112019_final.pdf. Accessed May 30, 2021.
- 22.American Urological Association. 2019 Urology Residency Match Statistics. In file:///C:/Users/dandriole/Downloads/2019-Urology-Residency-Match-Statistics%20(1).pdf. Accessed May 30, 2021
- 23.Soukoulis V, Gusic ME. Comparing student and clerkship director perspectives about readiness to perform the core entrustable professional activities at the start of the clerkship curriculum. Med Sci Educ. 2018;28:277–280. doi: 10.1007/s40670-018-0547-0. [DOI] [Google Scholar]
- 24.Garber AM, Feldman M, Ryan M, Santen SA, Dow A, Goldberg SR. Core EPAs in the acting internship: early outcomes from an interdepartmental experience. Med Sci Educ. 2021;31:527–533. doi: 10.1007/s40670-021-01208-y. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 25.Brown DR, Moeller JJ, Grbic D, Biskobing DM, Crowe R, et al. Outcomes of the entrustment process in the core EPAs (Entrustable Professional Activities) for entering residency pilot. Acad Med. 2021. In press.
- 26.Coalition for Physician Accountability. Initial summary report and preliminary recommendations of the undergraduate medical education to graduate medical education review committee (UGRC). https://physicianaccountability.org/wp-content/uploads/2021/04/UGRC-Initial-Summary-Report-and-Preliminary-Recommendations-1.pdf. Accessed May 16, 2021.
- 27.Walker CA, Splinter A, Khan M, Schaffernocker T, Verbeck N, Kman N, McCallister JW. Educational handoffs between medical school and residency: a national survey of residency program directors. MedEdPublish. Published online February 27, 2018. doi: 10.15694/mep.2018.0000047.1.
- 28.Pereira AG, Harrell HE, Weissman A, Smith CD, Dupras D, Kane GC. Important skills for internship and the fourth-year medical school courses to acquire them: a national survey of internal medicine residents. Acad Med. 2016;91(6):821–826. doi: 10.1097/ACM.0000000000001134. [DOI] [PubMed] [Google Scholar]
- 29.Angus SV, Vu TR, Willett LL, Call S, Halvorsen AJ, Chaudhry S. Internal medicine residency program directors’ views of the core entrustable professional activities for entering residency: an opportunity to enhance communication of competency along the continuum. Acad Med. 2017;92(6):785–791. doi: 10.1097/ACM.0000000000001419. [DOI] [PubMed] [Google Scholar]
- 30.LaFemina J, Ahuva V, Alseidi A, Balters M, Brasel K, Clark III C, et al. APDS consensus statement: ideal senior medical student experiences for preparedness for general surgery internship. J Surg Educ. Published online July 28, 2020. doi: 10.1016/j.jsurg.2020.07.015. [DOI] [PubMed]
- 31.American College of Surgeons. ACS/APDS/ASE Resident Prep Curriculum. In https://www.facs.org/education/program/resident-prep. Accessed May 30, 2021.
- 32.American College of Obstetricians and Gynecologists. Postmatch curriculum. In https://www.acog.org/education-and-events/creog/curriculum-resources/postmatch-curriculum. Accessed May 30, 2021.