Abstract
Introduction
The last 3 decades have seen significant changes in medical education and in the corresponding assessment of medical trainees. Competency-based medical education provided a more comprehensive model than the previous time-based process but remained insufficient. Introduced in 2005, entrustable professional activities (EPAs) offer a more robust curriculum development and assessment process, especially with regard to clinician-oriented workplace-based assessments. Despite their intuitive match with the decisions clinicians make daily in the clinical environment, the development of specialty-specific EPAs and corresponding culturally situated assessment tools has lagged.
Methods
To address this gap, a 90-minute faculty development workshop was created to introduce faculty to EPAs and their assessment and to provide hands-on practice developing and using EPAs.
Results
Previous facilitations of this workshop received favorable responses from participants regarding level of detail, understanding of the content, and intent to employ EPAs at their own institutions.
Discussion
Implementation of EPAs into the assessment portfolio of medical trainees following this workshop can increase confidence in determining when a trainee is ready for independent practice.
Keywords: Competency-Based Education, Entrustable Professional Activities, Competency-Based Medical Education
Educational Objectives
By the end of this session, attendees will be able to:
1. Review the changes in medical education and assessment from a time-based to a competency-based model.
2. Discuss ongoing challenges in assessing medical trainees.
3. Describe the role of entrustable professional activities in assessment.
Introduction
The landscape of medical education continues to evolve, and the quest to identify an appropriate, unifying framework for assessment of medical trainees continues. In the United States, the ACGME competencies and milestones were the first step toward providing a more learner-centered approach and are clearly focused on the individual skills required by society and patients.1–3 Competency-based medical education represents a significant advance in the field. However, the intention has not yet been fully realized because the reductionist approach to assessment risks graduating trainees who are still unprepared to do the integrated job required of physicians.2,4 Thus, a more robust framework for assessing trainees in the clinical environment is needed.
Entrustable professional activities (EPAs), as outlined by ten Cate, offer the needed framework for workplace-based assessments.2 Individual milestones and competencies separate the tasks of clinical work into discrete elements and thus fail to equate to the ability to practice clinically in an integrated fashion. EPAs, through a focus on patient care outcomes, allow clinical educators to assess a medical trainee's ability to integrate and apply medical knowledge, skills, and attitudes to novel patients and contexts. In this way, EPAs allow the competencies to be combined and reframed in the language of the work done by practicing clinicians. Assessment with EPAs is not reduced to a list of person descriptors but is focused on work descriptors, allowing assessment of multiple skills as well as their integration and application.5
Although introduced in 2005, EPAs are still in the process of being operationalized by graduate medical education programs in all disciplines across the United States. EPAs are gaining popularity, though, because assessment of trainees on EPAs is based on the level of independence a supervisor can allow. Using the language of trust, EPA assessment operationalizes the decisions clinical supervisors make every day with trainees. Therefore, the ability of frontline or clinical faculty to understand the creation, use, and limitations of EPAs is extremely relevant. During this 90-minute faculty development workshop, we introduce faculty to competency-based medical education and to the ongoing challenges surrounding assessment methods in the clinical environment, as well as describe the role of EPAs in medical trainee assessments. No prerequisite knowledge is needed to attend this workshop, and the ideal context would include all faculty within a division in order to create the context- and culture-specific aspects of the EPAs and the definitions equating to the levels of trust.
Methods
Overview
Complex concepts with which learners have limited or no prior experience require both didactic instruction and an opportunity to apply or practice what has been learned. Thus, this faculty development workshop combines focused didactics interspersed with small-group activities followed by large-group discussion to allow for immediate application of the concepts provided in the didactic portions. Specifically, the group activities allow faculty participants both to make decisions of trust after reading a complex vignette and to develop EPAs for a nonmedical setting.
This workshop may be facilitated by two to five people, with any combination of facilitators presenting portions of the PowerPoint slides (Appendix E) while the others facilitate the small-group sessions. In addition to a laptop and projection system for displaying the PowerPoint slides, a whiteboard with dry-erase markers or an easel with broad-tipped markers is needed. Copies of the attendee notes page (Appendix A) should be on hand for distribution to learners at the start of the workshop. When providing this workshop at a single institution within a single division/department, no specific seating arrangement is necessary. When providing this workshop at a meeting that includes participants from multiple institutions or divisions/departments, the room should be equipped with round tables, and participants should be encouraged to sit at the tables in groups of five to eight. Handouts should be printed on different colors of paper for easy identification.
Workshop Organization
The total workshop takes approximately 90 minutes to administer (50 minutes of didactics, 40 minutes of activities). The workshop is organized into the four broad sections outlined below.
1. Introduction to assessment and competency-based medical education: This section utilizes slides 1–7 of the PowerPoint presentation (Appendix E) and should take approximately 10 minutes. Appendix A is distributed to participants at the beginning.
2. ACGME competencies and milestones: This section utilizes slides 8–15. Slide review requires 10 minutes, and Group Activity A (“Current Challenges in Assessing Your Trainees”) requires a further 10 minutes. This activity is explained in detail below.
3. EPAs and their assessment: This section is delivered in three subsections.
   a. The first subsection utilizes slides 16–20 for a 10-minute introduction to EPAs.
   b. The second subsection utilizes slides 21–28 as a lead-in to Group Activity B (“Would You Trust This Resident?”—Appendix B). This subsection should take approximately 20 minutes (5 minutes for the slides, 15 minutes for the activity).
   c. The third subsection utilizes slides 29–33 as a lead-in to Group Activity C (“Developing Coffeehouse EPAs”—Appendix C). This subsection should take approximately 20 minutes (2 minutes for the slides, 10 minutes to discuss the activity in small groups, 8 minutes to discuss it as a large group).
4. Next steps, summary, and conclusions: This section utilizes slides 34–42 and should take approximately 10 minutes.
Attendee Activities
Activity A: Audience Question—“What Are the Challenges You Have With the Assessments You Are Currently Using?”
For a single institution or division, participants in groups of two are asked to discuss their challenges for 3 minutes. Individuals are then asked to share their answers with the larger group for 7 minutes. One of the facilitators records the challenges identified on the whiteboard or easel. For multiple institutions and divisions, participants discuss the question at their tables for 5 minutes. Each small group then shares the challenges it has identified with the larger group for 5 minutes. Facilitators record these challenges on a whiteboard or easel.
Activity B: “Would You Trust This Resident?”
This exercise asks the participants to decide and discuss reasons why they would or would not trust a resident to perform a specified procedure based on the clinical vignette provided. Full activity description and activity handout are found in Appendix B. One of the facilitators records audience responses regarding why participants chose to trust the resident on a whiteboard or easel.
For single institutions or divisions, participants have 5 minutes to individually read the vignette and reflect, leaving 10 minutes to discuss the trust decision and factors playing into it as a large group. For multiple institutions and divisions, participants have 7 minutes to read the vignette and discuss it at the table, leaving 8 minutes to discuss the trust decision and factors playing into it as a large group.
Activity C: “Developing Coffeehouse EPAs”
This is a small-group exercise asking participants to create EPAs for one of three coffeehouse professional activities (Appendix C). Participants learn the components of an EPA in the preceding didactic section. By asking them to develop EPAs for three different levels of employees, this exercise also simulates the concept of EPAs for different levels of medical trainees (cashier = medical student, barista = resident, manager = fellow), which are discussed in the subsequent section of the workshop. Participants have 10 minutes to create the EPAs in small groups, with 5 minutes to discuss them with the large group. Facilitators record audience responses on a whiteboard or easel. Both single-institution/division and multi-institution/division groups perform this activity in the same way.
Results
This workshop was delivered to 23 faculty members at a master of health professions education residency day. Attendees were from multiple institutions and held various degrees, but all were involved in the education of clinical trainees. Specifically, attendees included two doctors of philosophy (one in neuroscience and one in measurement, evaluation, and statistics), one doctor of education, 18 physicians, one physiotherapist, and one physician assistant. Six attendees were from countries other than the United States (i.e., Lebanon, France, Canada, and Ireland).
The seminar evaluation form (Appendix D) was completed anonymously using a 5-point Likert scale (1 = strongly disagree, 5 = strongly agree). The responses to the workshop were favorable, with a mean score of 4.4. Full results are provided in the Table.
Table. Workshop Evaluation Scoresᵃ (N = 23).
| Question | Average Score | Score Range |
|---|---|---|
| I was well informed about the objectives of this seminar. | 4.5 | 4–5 |
| The presentation style was effective. | 4.4 | 4–5 |
| The activities in this seminar gave me sufficient practice and feedback. | 4.1 | 3–5 |
| The difficulty level of this seminar was appropriate. | 4.5 | 4–5 |
| The instructors were well prepared. | 4.6 | 4–5 |
| I feel comfortable using EPAs as an assessment model in my practice. | 4.1 | 3–5 |
Abbreviation: EPAs, entrustable professional activities.
ᵃ5-point Likert scale (1 = strongly disagree, 5 = strongly agree).
Narrative Feedback/Comments
• “Great workshop! I really like the coffee house EPA. I would maybe then take it back and do a med ed EPA as a group.”
• “Provide example for coffee EPA that participants can use to model their activity.”
• “Define EPAs more in the beginning of your presentation.”
• “I think the workshop was excellent! We had great discussion points and it was fine as well. One suggestion perhaps other participants could have a little more time to talk. Overall excellent discussion and participation.”
Discussion
Despite 10 years of research and literature on EPAs, their operationalization in medical education has been challenging. Given the impact of an evolving medical education landscape on frontline clinicians (many without specific training in medical education), the dearth of literature regarding the introduction of frontline faculty to the creation and use of EPAs in their daily work is surprising. To address this gap, five clinical physicians involved in health professions education developed this workshop as part of an assignment for obtaining their master of health professions education degrees from the University of Illinois at Chicago. The developers had varying backgrounds in terms of clinical specialty (internal medicine, family medicine, urology, pediatrics, general surgery), residency training location (United States, France, Lebanon), and institutional affiliation (community and university based). Understandably, these differences led to challenges during development because of time differences, additional clinical and academic expectations, and varying levels of experience with EPAs.
The importance of EPAs in providing meaningful assessments of medical trainees made the development of this workshop an important step in training frontline faculty. A breadth of theoretical information on this topic exists, making initial decisions about what content to include difficult. In keeping with the goal of creating a workshop for frontline faculty, the decision was made to focus on the creation and assessment of EPAs. This focus allowed the workshop discussions to create important institution-focused outcomes and a shared mental model acceptable to the faculty using the final product. This workshop's success with and acceptance by a diverse group of educators, along with its lack of medical or specialty specificity, suggest that it can be adapted easily to other institutions and settings.
Assessments based on EPAs are still in their infancy with regard to determinations of validity. That said, EPAs are subject to the same rigor and, potentially, the same criticisms as other workplace-based assessments. Specifically, rater bias (i.e., construct-irrelevant variance) is a strong potential challenge that needs to be addressed. Construct underrepresentation from too few items, cases, or observations is another threat to the validity of EPA assessment tools. Because of the newness of the assessment process, no information about the generalizability of the assessments exists. In general, as the development of EPAs and corresponding assessment tools moves forward, the focus must be on ensuring that the EPAs are both context- and culture-specific while continuing to allow for comparison of ability across programs. This presents an opportunity for future workshops.
Finally, there have been recent moves to introduce EPAs in undergraduate medical education. By adjusting the assessment process to accept the lack of full independence at this level of training, EPAs can provide a curricular and assessment framework for undergraduate medical education. Using EPAs in this way creates a continuum between undergraduate and graduate medical education. Future opportunities for workshops would include the incorporation of EPAs at this level of training.
Appendices
A. Attendee Notes.docx
B. Would You Trust This Resident.docx
C. Developing Coffeehouse EPAs.docx
D. Seminar Evaluation Form.docx
E. EPA Faculty Development Presentation.pptx
All appendices are peer reviewed as integral parts of the Original Publication.
Disclosures
None to report.
Funding/Support
None to report.
Ethical Approval
Reported as not applicable.
References
1. Swing SR. The ACGME Outcome Project: retrospective and prospective. Med Teach. 2007;29(7):648–654. https://doi.org/10.1080/01421590701392903
2. ten Cate O. Entrustability of professional activities and competency-based training. Med Educ. 2005;39(12):1176–1177. https://doi.org/10.1111/j.1365-2929.2005.02341.x
3. Snell LS, Frank JR. Competencies, the tea bag model, and the end of time. Med Teach. 2010;32(8):629–630. https://doi.org/10.3109/0142159X.2010.500707
4. Tekian A, Hodges BD, Roberts TE, Schuwirth L, Norcini J. Assessing competencies using milestones along the way. Med Teach. 2015;37(4):399–402. https://doi.org/10.3109/0142159X.2014.993954
5. ten Cate O, Scheele F. Viewpoint: competency-based postgraduate training: can we bridge the gap between theory and clinical practice? Acad Med. 2007;82(6):542–547. https://doi.org/10.1097/ACM.0b013e31805559c7
