Competency-based medical education (CBME) organizes the educational experience around competencies, emphasizes performance outcomes, promises greater accountability to patients and society, and is flexible and learner-centered.1 Competencies are multifaceted and integrated; hence, competency assessment must rely on an integrated assessment system focused on performance in the workplace. The most appropriate approach is to observe the workplace activities of trainees and to infer competence from learner performance on those activities.2 Entrustable Professional Activities (EPAs) are an attractive assessment framework because they focus on day-to-day activities and inherently address multiple competencies and skills at once.2,3
First described by ten Cate, an EPA is an essential professional work activity or task of medical practice that requires specialized knowledge and skills and encompasses multiple competencies.2,3 As an assessment methodology, EPAs allow supervisors to observe a learner performing authentic professional work. For example, the EPA of performing an appendectomy requires that a physician have the knowledge and skills to determine when and how to perform the procedure, be able to explain to the patient why the procedure is necessary and what to expect, and coordinate with other health care providers to complete the procedure safely. The supervisor, or a group of supervisors, observes how the learner has performed the task and judges the degree to which they would entrust the learner to perform the EPA with a given level of supervision in the future. Observation of a learner conducting an EPA enables a supervisor to determine the learner's ability to perform that activity with decreasing supervision and increasing autonomy.2,3
In this issue of JGIM, Warm et al. describe an innovative application of competency-based assessment in Internal Medicine training.4 Their innovation addresses how to create a robust system of assessment by reducing competencies into units smaller than EPAs and milestones (a milestone being a significant point in a learner's development, demonstrated progressively over the course of their education).5 Warm et al. propose a structure of connected assessment units, linking EPAs and milestones to competencies, to create an incremental and progressive assessment system. Their units, termed Observable Practice Activities (OPAs), are collections of learning objectives framed as activities that must be observed in daily practice in order to inform entrustment decisions. OPAs are described as small units, a number of which map to an EPA and to one or more competencies and milestones. The authors further categorize OPAs into content and process OPAs: content OPAs are activities specific to the discipline being learned, while process OPAs are activities conserved across rotations (an example the authors provide is "minimizing unfamiliar terms during patient encounters"). Ultimately, the OPAs and their related EPAs, competencies, and milestones are expected to be mastered by (or before) the end of Internal Medicine training.
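For readers who think in terms of data models, the hierarchy described here, in which many small OPAs map upward to EPAs, competencies, and milestones, can be pictured as a simple many-to-many mapping. The following Python sketch is purely illustrative: the class names, milestone codes, and the second example OPA are hypothetical constructions, not elements of the authors' system.

```python
from dataclasses import dataclass, field
from enum import Enum


class OPAKind(Enum):
    CONTENT = "content"  # discipline-specific activities
    PROCESS = "process"  # activities conserved across rotations


@dataclass
class OPA:
    name: str
    kind: OPAKind
    epas: list = field(default_factory=list)        # EPAs this OPA maps to
    milestones: list = field(default_factory=list)  # milestones this OPA maps to


# The process OPA is taken from the article; the content OPA is hypothetical.
opas = [
    OPA("Minimize unfamiliar terms during patient encounters",
        OPAKind.PROCESS,
        epas=["Communicate effectively with patients"],
        milestones=["ICS1"]),
    OPA("Interpret an electrocardiogram",
        OPAKind.CONTENT,
        epas=["Evaluate acute chest pain"],
        milestones=["MK2", "PC3"]),
]

# Invert the mapping: which OPAs supply evidence for each milestone?
by_milestone = {}
for opa in opas:
    for m in opa.milestones:
        by_milestone.setdefault(m, []).append(opa.name)

print(by_milestone)
```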
Warm et al. address an issue the assessment community has grappled with: even though milestones and EPAs conceptualize competencies as measurable units, both remain broad and encompassing, and are often not easy to measure.6 Through OPAs, Warm et al. provide a set of measurements for determining a learner's knowledge and ability to perform a larger set of EPAs. They also demonstrate a successfully implemented assessment system that maps observed behaviors to reporting milestones and EPAs. In this mapping process, the authors ensured that each milestone was represented by ten or more OPAs, providing a multi-pronged means of measuring learner performance. Of key importance is this system's ability to provide the institution with metrics for determining whether the proposed OPAs are appropriate and show growth in a given milestone that is incremental and suited to the learners' level.
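The coverage requirement mentioned above, ten or more OPAs per milestone, lends itself to a simple automated check. The sketch below is an assumption-laden illustration, not the authors' implementation: the OPA names beyond the article's own example, the milestone codes, and the dictionary representation are all invented, and a real mapping would hold hundreds of entries.

```python
# Hypothetical OPA-to-milestone mapping (invented codes, tiny sample).
opa_to_milestones = {
    "Reconcile medications at discharge": ["PC2", "SBP1"],
    "Minimize unfamiliar terms during patient encounters": ["ICS1"],
    "Present a patient on morning rounds": ["ICS1", "PC1"],
}

MIN_OPAS_PER_MILESTONE = 10  # the coverage level described by Warm et al.

# Count how many OPAs provide evidence for each milestone.
counts = {}
for milestones in opa_to_milestones.values():
    for m in milestones:
        counts[m] = counts.get(m, 0) + 1

under_covered = {m: n for m, n in counts.items() if n < MIN_OPAS_PER_MILESTONE}
print("Milestones needing additional OPAs:", under_covered)
```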
Warm et al. created the OPA framework by involving many residents and faculty. The OPAs were developed through the active involvement and engagement of program directors, chief residents, and faculty members from across clinical settings and content areas, strengthening both the developmental process and the substance of the OPAs. Their methodology provided proof of feasibility for large-scale application, with active engagement serving to build institutional understanding of OPAs.
One of the challenges for Warm et al. will be to further delineate training on the use of OPAs. One concern about CBME has been that competence gets broken into the smallest observable units of behavior, creating endless nested lists of abilities that frustrate learners and teachers.1 Faculty development for front-line clinical teachers will need to include education on how to distinguish OPAs from similar constructs (milestones, EPAs), how these constructs build upon one another, and how performance is best judged. The entrustment rating attached to each OPA is essential, and best practices for educating faculty members to make entrustment decisions will be critical.
Warm et al. present examples of progress from three different residents with varying rates of performance over time. This display reveals a systematic process that allows progress to be judged against individual growth. At present, it is not definitively known how learners should progress on each of the milestones. As educators, we make determinations about our learners' progress based on educated deductions and experience with learners over time. The data displayed by Warm et al. allow us to understand how learners perform on OPAs at various levels and to make progress decisions grounded in actual learner performance data. A challenge will be ensuring that a process for documenting defensible decisions about resident progression is maintained consistently and carefully over time. Warm et al. point out that residents showed progression on most, but not all, OPAs and milestones. OPAs that do not readily show progression may be inappropriate, inadequate, or flawed in design. Likewise, a program that finds its second-year residents are not reaching independence on an appropriate OPA should look back at its curriculum and determine whether, and where, educational experiences are lacking.
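One way to operationalize "progression on an OPA" is to fit a trend to a resident's entrustment ratings over time and flag OPAs whose trend is flat or negative for curricular review. The sketch below rests on assumed data: the 1-to-5 rating scale, the observation months, and the use of a least-squares slope are illustrative choices, not details taken from the article.

```python
from statistics import mean

# (training_month, entrustment_rating) pairs for one resident on one OPA;
# higher ratings indicate less required supervision (illustrative 1-5 scale).
ratings = [(1, 1), (3, 2), (5, 2), (8, 3), (11, 4)]

def trend(points):
    """Least-squares slope: average change in rating per training month."""
    mx = mean(x for x, _ in points)
    my = mean(y for _, y in points)
    num = sum((x - mx) * (y - my) for x, y in points)
    den = sum((x - mx) ** 2 for x, _ in points)
    return num / den

slope = trend(ratings)
# A flat or negative slope flags an OPA (or a curricular gap) worth review.
print(f"Entrustment trend: {slope:+.3f} rating points per month")
```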
Ultimately, the challenge for faculty members involved in decision making about residents' competence will be learning to interpret what a given rate of progress means for each learner: Should all OPAs be weighted equally, or do some carry greater weight? What if entrustment occurs but is subsequently followed by an error leading to loss of entrustment? How many supervised observations are needed to form an entrustment decision? Does the competence of the faculty members making entrustment decisions need to be factored into the assessment?
If institutions elect to move forward with assessment via OPAs, a few key design questions will require contemplation. Should OPAs be developed first, following the example of Warm et al., with the entire curriculum then written with OPAs as its subunits? Or should EPAs be developed first and OPAs derived from them? The benefit of starting with OPAs is a meticulous look at the activities learners actually engage in, yielding a list of activities to be completed. The drawback is that the larger perspective, namely what learners should be participating in regardless of what occurs in real time, may be neglected. The benefit of creating EPAs first is that developers start from the activities we expect residents to engage in and then delineate the details within those activities. The drawback of the latter approach is the difficulty of deciding which OPAs fall under each EPA; OPAs, particularly those that are setting-dependent, might not align with what actually occurs during the educational experience.
Warm et al. provide a framework that advances our knowledge of the assessment of competencies, milestones, and EPAs, and they set the stage for future research in this area. Next steps should include examining resident performance on content and process OPAs to determine how learners progress in each type. Research also needs to address the process for updating OPAs: What will be the timeline and mechanism by which institutions update them? What is the optimal method for tracking OPAs and related assessment data and reporting them to individual residents and the residency program? And finally, how should resident progression be handled for OPAs independently accomplished early in training versus those accomplished late in training, or not at all?
With OPAs, learners focus on mastering smaller tasks, and these smaller tasks can be structured as building blocks toward independence in larger activities. As conceptualized, OPAs have important implications for assessment in both graduate and undergraduate medical education. Many institutions are moving toward EPAs, and OPAs may be key to determining our learners' incremental achievement of entrustment in EPAs and milestones, and ultimately to judging our learners' competence.
REFERENCES
1. Frank JR, Mungroo R, Ahmad Y, Wang M, De Rossi S, Horsley T. Toward a definition of competency-based education in medicine: a systematic review of published definitions. Med Teach. 2010;32:631–637. doi:10.3109/0142159X.2010.500898.
2. Ten Cate O. Entrustability of professional activities and competency-based training. Med Educ. 2005;39:1176–1177. doi:10.1111/j.1365-2929.2005.02341.x.
3. Ten Cate O. Trust, competence and the supervisor's role in postgraduate training. BMJ. 2006;333:748–751. doi:10.1136/bmj.38938.407569.94.
4. Warm EJ, Mathis BR, Held JD, et al. Entrustment and mapping of observable practice activities for resident assessment. J Gen Intern Med. doi:10.1007/s11606-014-2801-5.
5. Accreditation Council for Graduate Medical Education. Milestones. https://www.acgme.org/acgmeweb/tabid/430/ProgramandInstitutionalAccreditation/NextAccreditationSystem/Milestones.aspx.
6. Babbott S. Watching closely at a distance: key tensions in supervising resident physicians. Acad Med. 2010;85:1408–1417. doi:10.1097/ACM.0b013e3181eab0ec.