
Table 5.

Reflections and lessons learned from EPA implementation.


Unintended consequence: In some programs, EPAs became the sole target of assessments. The EPA observation form template became the default assessment tool to the exclusion of a “suite” of assessments that capture both EPA and non-EPA based data across different levels of Miller’s pyramid.
PA principles: 4, 8
Reflections and lessons learned:
  • Program leaders and CCs have identified that relying solely on EPA observation form data results in assessment gaps; they are well positioned to serve as agents to refine the system of assessment [38,39,40,41].

  • A suite of assessment methods and tools that addresses multiple levels of Miller’s pyramid and captures content beyond the EPA framework is necessary to obtain a holistic view of trainee development and to support high-stakes decisions about progress by a CC.

  • During large-scale implementation, change management efforts will necessarily devote resources to new elements (e.g., the EPA framework), but the integration of existing elements (i.e., the suite of assessment methods) that will be carried forward must also be supported.


Unintended consequence: Observation and assessment of EPAs are perceived by trainees as high stakes.
PA principles: 1, 2, 3, 7
Reflections and lessons learned:
  • The EPA system has been viewed as a set of requirements to progress through the CBD stages of training rather than as a framework to guide opportunities for coaching and growth.

  • National guidelines for the context variety and number of successful EPA observations required for achievement have been interpreted as strict requirements, which has promoted a “checklist” mentality around the collection of EPA-based assessment data [42,43].

    • The Royal College has disseminated a technical guide and statement of essential requirements to clarify for programs and trainees that the context variety and number of successful EPA observations should serve as guidance to CCs rather than strict criteria [29,44].

  • There is an ongoing need to create safe learning environments that promote a growth mindset and enable workplace-based assessments to be perceived by learners as low stakes and positive. Research suggests that a) the trainee’s interaction with the assessor and b) their understanding of the meaning and consequences of the assessment influence their perception of the assessment stakes [43].

    • The Royal College model of coaching in the moment [15] was developed to help programs and faculty establish positive trainee-assessor interactions that emphasize actionable feedback and optimize the learning function of assessment.

    • National initiatives to clarify the role of EPA observations for programs and residents have been developed and disseminated [45,46]. These initiatives emphasize the learning function of EPA observations, that pass/fail decisions are not made on a single observation, and that many data points collected from various sources are used to inform decisions about EPA achievement and progress.


Abbreviations: CBD, Competence by Design; CC, Competence Committee; EPA, entrustable professional activity; PA, programmatic assessment.