JBJS Open Access. 2021 Apr 9;6(2):e20.00010. doi: 10.2106/JBJS.OA.20.00010

Entrustable Professional Activities in Orthopaedics

Adam Watson 1,2,a, Timothy Leroux 2,3, Darrell Ogilvie-Harris 2,3, Markku Nousiainen 2,4, Peter C Ferguson 5, Lucas Murnahan 2,6, Tim Dwyer 2,6
PMCID: PMC8154482  PMID: 34056510

Abstract

Background:

An entrustable professional activity (EPA) is defined as a core task of a specialty that is entrusted to a trainee once sufficient competence has been reached. A group of EPAs reflects the activities that clinicians commonly do on a day-to-day basis. Lists of EPAs have been created for most medical subspecialties, but not orthopaedic surgery. The aim of this study was to create a peer-reviewed list of essential EPAs that a resident must perform independently before completing orthopaedic residency training.

Methods:

A focus group of 7 orthopaedic surgeons from the University of Toronto developed a comprehensive list of 285 EPAs. For each subspecialty group, the opinions of at least 15 surgeons were obtained, including academic and nonacademic surgeons as well as subspecialty-trained and non–subspecialty-trained surgeons. A modified Delphi method was used to rank each EPA on a five-point scale from not important to mandatory for a resident to be competent in before exiting training. Two Delphi rounds were conducted; an EPA advanced to the next round only if more than 50% of surgeons considered it mandatory. The final list of EPAs was ratified by the focus group of academic surgeons involved in the study.

Results:

Seventy-five of the 107 surgeons invited (70%) responded to the survey. Nearly half (125) of the 285 EPAs were discarded after the first Delphi round. Two additional EPAs were suggested by respondents, and a further 113 EPAs were discarded after the second Delphi round, leaving 49 final EPAs across 8 subspecialties in orthopaedic surgery.

Conclusions:

Expert consensus was used to create a list of EPAs considered mandatory for completion of resident training in orthopaedics in our province. The final 49 peer-reviewed EPAs will provide a valuable benchmark for curriculum design and assessment in orthopaedic surgery for other programs in the competency-based era.


An entrustable professional activity (EPA) is “a critical part of professional work that can be identified as a unit, to be entrusted to a trainee once sufficient competence has been reached”1. EPAs are considered core tasks that must be taught during postgraduate training, consisting of a group of tasks that make up a management or assessment process2. An example of an EPA is “Managing a patient with knee arthritis,” which would involve being able to assess and diagnose a patient at their first clinical presentation, maximize their nonoperative care, perform a knee replacement competently, and follow them up appropriately on the ward and in clinic. An EPA combines all the tasks, knowledge, and skills that are required to function independently on this core activity3. Completing an EPA also requires a resident to demonstrate a certain level of competency or proficiency in multiple clinical skills4. For example, performing a surgical procedure requires a good knowledge of anatomy, instrumentation, and details of the technical procedure, while also requiring nontechnical skills such as communication with both health care professionals and with patients3,5.

EPAs are becoming increasingly recognized as key components of good curriculum design3,6,7. Importantly, EPAs can bridge the gap between theoretical competencies and the assessment of competence3,8, which matters as competency-based medical education (CBME) becomes more widespread in both undergraduate and postgraduate medical education. CBME is an outcomes-based approach to education that involves identifying the abilities required of the physician and then designing the curriculum to both support and assess the achievement of these predetermined competencies9. Furthermore, the Accreditation Council for Graduate Medical Education (ACGME) has mandated that all orthopaedic residency training programs in the United States align their assessments with its 6 core competencies through the ACGME milestones project10. The ACGME core competencies are patient care, medical knowledge, practice-based learning and improvement, interpersonal and communication skills, professionalism, and systems-based practice11. Use of EPAs can help focus a curriculum on tasks that residents must be able to perform, unsupervised and at a competent level, by the end of training1, and provides a means of translating these 6 core competencies into clinical practice12.

Key to the idea of an EPA is entrustment, whereby a supervisor entrusts tasks and responsibilities to a trainee to perform unsupervised. In this way, an EPA can be used as a statement of awarded responsibility, signifying that a trainee is competent to perform a specific task unsupervised13. This formal entrustment ensures that a trainee is given an appropriate amount of responsibility, which is beneficial to patient care and trainee progression. EPAs are also considered to be easier to assess than competencies because they are framed around professional activities that are familiar to the daily work of those assessing14.

To date, EPAs have been created for pediatrics15, internal medicine4,14,16, family medicine17, anesthesiology18, and psychiatry19. At this time, there is no published, peer-reviewed list of EPAs for orthopaedic surgery; the development of such a list would serve as a valuable starting point for discussion across training programs in North America. The purpose of this study was to use expert consensus to develop a list of EPAs that a trainee should be able to perform independently at a competent level before graduation from orthopaedic training.

Materials and Methods

The methodology published by Shaughnessy et al.17 was used as a guideline. A literature review focusing on competencies in orthopaedic training was performed using 2 textbooks of orthopaedics, Campbell's Operative Orthopaedics20 and Miller's Review of Orthopaedics21. These references were chosen because they are comprehensive, widely used, and cover topics pertinent to a North American orthopaedic graduate medical program. Previously published lists of core competencies in orthopaedic surgery were cross-referenced to create a list of potential EPAs6,7,22. After creation, the list was broken down into the common subspecialties of orthopaedics in North America: trauma, pediatrics, lower-limb arthroplasty, tumor, upper limb, foot and ankle, spine, and sports, similar to previously published competency lists6,7.

The list of potential EPAs was reviewed by a focus group of 7 academic orthopaedic surgeons from a single institution, each with fellowship training in the aforementioned subspecialties of orthopaedics. Each member holds a university appointment. The list was reviewed for completeness, with additional EPAs added and some reworded.

Review of the literature and orthopaedic textbooks resulted in 271 possible EPAs, broken down into 8 subspecialties: trauma, pediatrics, lower-limb arthroplasty, tumor, upper limb, foot and ankle, spine, and sports. After further review and discussion during the focus group stage, the 7 surgeons involved in the study provided 14 more, resulting in a total of 285. The complete list can be found in Appendix 1.

Once a comprehensive list of EPAs was developed, a modified Delphi methodology was used to obtain consensus on the most critical EPAs for a competency-based orthopaedic curriculum. The Delphi technique is a means of aggregating expert opinion through a series of iterative questionnaires, with the goal of reaching group consensus on a specific topic23. Each iteration involves a feedback process, allowing participants to re-evaluate their initial judgments based on comments and feedback from other participants24. The modified Delphi technique begins the process with a set of carefully selected items, rather than with an initial open-ended questionnaire used to generate the list to be evaluated, thus providing a solid grounding in previously developed work25.

For each subspecialty except tumor, both academic and nonacademic specialist surgeons from the province of Ontario, Canada, were approached to evaluate the list of EPAs. A minimum of 5 academic and 5 community-based surgeons were enrolled to review each subspecialty list. In addition, 5 surgeons without specific training in that subspecialty were asked to review each list, in an effort to ensure that the EPAs remained applicable to the general orthopaedic surgeon. As orthopaedic oncology is typically performed in academic centers, nonacademic specialists were not used for that subspecialty.

An invitation email was sent to all participants with information on the background of EPAs and the rationale behind the study. It invited them to participate in the research, with a link to an online survey tool (www.surveymonkey.com). A formal consent process was coded into the online survey program and had to be completed before starting the survey. Demographic data about each respondent were obtained, including academic vs. community practice, years in practice, and subspecialization.

Participants were sent a link to the survey to rank the EPAs in their subspecialty. Some participants had multiple subspecialties or were invited as nonexpert respondents and were therefore sent more than 1 link. Each EPA was rated on a 5-point Likert scale (Table I) taken from a previous study6. It was specified that the primary aim of this round of Delphi questioning was to produce a list of EPAs in which a trainee exiting general orthopaedic training, rather than a subspecialty fellow, must be independently proficient. A free-text question asking respondents to suggest any EPAs that had been left out was included.

TABLE I.

Entrustable Professional Activity Rating Scale*

Ranking: 1 = Not important; 2 = Somewhat unimportant; 3 = Somewhat important; 4 = Important; 5 = Mandatory
*

Each entrustable professional activity was rated on a 5-point Likert scale taken from Kellam et al.6.

Using a previously published methodology6, the results were analyzed with a weighted mean of the rankings shown in Table I. All EPAs that had a weighted mean greater than 3.5 and that more than 50% of respondents ranked as mandatory were retained.
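To make the retention rule concrete, the short sketch below applies it to invented ratings; this is an illustration of the calculation only, not the study's actual analysis, which was performed within the online survey software. The weighted mean of the rankings in Table I is equivalent to the ordinary mean of the individual 1-to-5 ratings when each respondent contributes one rating; the EPA names are taken from Table IV, but the ratings are hypothetical.

```python
# Minimal sketch of the round-1 Delphi retention rule, using hypothetical ratings.
from statistics import mean

# Hypothetical ratings on the 5-point scale in Table I
# (1 = not important ... 5 = mandatory), one rating per respondent.
ratings = {
    "Management of the patient with open fracture": [5, 5, 5, 5, 4, 5, 5, 5],
    "Management of the patient with hallux rigidus": [3, 4, 2, 5, 3, 4, 5, 2],
}

def retained(scores, mean_threshold=3.5, mandatory_threshold=0.5):
    """Retain an EPA only if its weighted mean exceeds 3.5 and more than
    50% of respondents rated it 5 (mandatory)."""
    weighted_mean = mean(scores)
    mandatory_fraction = sum(s == 5 for s in scores) / len(scores)
    return weighted_mean > mean_threshold and mandatory_fraction > mandatory_threshold

for epa, scores in ratings.items():
    pct = sum(s == 5 for s in scores) / len(scores)
    status = "retained" if retained(scores) else "discarded"
    print(f"{epa}: mean={mean(scores):.2f}, mandatory={pct:.0%} -> {status}")
```

With these invented ratings, the first EPA clears both thresholds and is retained, whereas the second (weighted mean of exactly 3.5 and 25% mandatory) is discarded.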

The second-stage Delphi survey included the retained EPAs together with feedback on the percentage of surgeons who had considered each one mandatory. This was intended to inform participants of the entire group's view of the importance of each EPA26, allowing them to reassess their judgments based on the group's feedback24. In this round, any new EPAs suggested in the free-text component of stage 1 were included and flagged as new. Finally, a summary of all the EPAs discarded after stage 1 was included, and respondents were instructed to list any that they felt strongly should not be discarded. The final list of EPAs comprised those with a weighted mean greater than 3.5 that more than 50% of respondents ranked as mandatory, similar to the second-stage Delphi in previous research6.

The data were collated by the online survey software and analyzed with measures of central tendency (mean, median, and mode) to present information concerning the collective judgments of respondents26.

Research Ethics Board approval was obtained at our institution (REB #2016-0109-E).

Results

Seventy-five of the 107 surgeons invited (70%) responded to the first-stage Delphi questionnaire. The demographic data of these respondents, particularly their years after residency and focus of practice, are presented in Table II. The overall number of respondents across all subspecialties was 122, as some respondents covered multiple subspecialties.

TABLE II.

Demographic Data of Respondents*

Subspecialty No. of Respondents Years After Residency (%) Practice Type (%)
0-5 6-10 11-15 >15 Academic Community Mixed
Sports 19 10 (52.5) 5 (26.3) 3 (15.7) 1 (5.2) 13 (68.4) 5 (26.3) 1 (5.2)
Lower-limb arthroplasty 16 8 (50) 4 (25) 2 (12.5) 2 (12.5) 10 (62.5) 5 (31.3) 1 (6.3)
Tumor 10 4 (40) 2 (20) 2 (20) 2 (20) 10 (100) 0 0
Pediatrics 16 10 (62.5) 3 (18.8) 2 (12.5) 1 (6.3) 11 (68.8) 5 (31.3) 0
Foot and ankle 15 8 (53.3) 3 (20) 2 (13.3) 2 (13.3) 8 (53.3) 4 (26.6) 3 (20)
Spine 15 7 (46.6) 4 (26.6) 3 (20) 1 (6.6) 10 (66.6) 5 (33.3) 0
Upper limb 16 4 (25) 5 (31.3) 3 (18.8) 4 (25) 10 (62.5) 4 (25) 2 (12.5)
Trauma 15 4 (26.6) 6 (40) 3 (20) 2 (13.3) 10 (66.6) 4 (26.6) 1 (6.6)
*

The demographic data of respondents, particularly with reference to their years after residency and focus of practice.

During the first Delphi stage, 125 of the 285 EPAs initially described had a weighted mean <3.5 and were considered mandatory by less than 50% of respondents. These were discarded and are listed in Appendix 2. The breakdown of the number of EPAs discarded for each subspecialty can be seen in Table III.

TABLE III.

Discarded EPAs at Each Stage*

Subspecialty EPAs at Development Stage Discarded at First Delphi Stage Suggested Additional EPAs Analyzed at Second Stage Discarded at Second Stage Final List
Sports 48 24 0 24 19 5
Lower-limb arthroplasty 20 6 0 14 8 6
Tumor 22 10 0 12 8 4
Pediatrics 66 42 0 24 18 6
Foot and ankle 19 6 0 13 10 3
Spine 25 9 0 16 12 4
Upper limb 38 19 1 20 15 5
Trauma 47 9 1 39 23 16
Total 285 125 2 162 113 49
*

EPA = entrustable professional activity.

A breakdown of the number of EPAs discarded at each stage, categorized by subspecialty.

Additional EPAs were suggested via the free-text question at the end of each subspecialty survey. One EPA was added in upper limb (managing a patient with a high-pressure injection injury) and one in trauma (emergent management of a knee dislocation).

All 75 (100%) of the first-round respondents completed the second-stage Delphi survey. One hundred sixty-two EPAs were analyzed, of which 113 were discarded because their weighted mean was less than 3.5 and less than 50% of surgeons considered them mandatory. The EPAs discarded at this second stage are listed in Appendix 3.

The final list of 49 EPAs is presented in Table IV, including the weighted mean and percentage of surgeons that considered it mandatory at each survey stage.

TABLE IV.

Final List of EPAs*

Specialty EPA First Round Second Round
Mean % Mean %
Sports Management of the patient with shoulder instability 4.89 86 4.67 72
Management of the patient with ACL tear 4.78 75 4.79 84
Management of the patient with patellar/quadriceps tendon rupture 4.78 75 4.95 95
Management of the patient with meniscal tear 4.78 75 4.84 84
Management of the patient with Achilles tendon rupture 4.89 87 4.95 95
Spine Management of the patient with spinal cord injury 4.86 86 4.69 81
Management of the patient with cauda equina injury 4.79 93 4.88 86
Management of the patient with lumbar herniated disc 4.57 64 4.25 51
Management of the patient with spinal infection 4.71 78 4.44 56
Trauma Management of the patient with compartment syndrome of leg, forearm and foot 5.00 100 5.00 100
Management of the patient with open fracture 5.00 100 5.00 100
Management of the patient with clavicle fracture 4.54 62 4.28 55
Management of the patient with proximal humerus fracture 4.62 69 4.33 50
Management of the patient with humeral shaft fracture 4.54 69 4.53 53
Management of the patient with elbow dislocation 4.92 92 4.76 78
Management of the patient with olecranon fractures 4.69 69 4.76 78
Management of the patient with forearm fractures, including Galeazzi and Monteggia 4.54 80 4.61 67
Management of the patient with distal radius fracture 5.00 100 4.83 89
Management of the patient with hip dislocation 4.77 76 4.56 72
Management of the patient with femoral neck fracture, including subtrochanteric fractures 5.00 100 4.94 94
Management of the patient with femoral shaft fracture 5.00 100 5.00 100
Management of the patient with knee dislocation NA NA 4.72 83
Management of the patient with tibial plateau fracture 4.77 76 4.61 67
Management of the patient with tibial shaft fracture 4.92 92 4.83 83
Management of the patient with an ankle fracture 5.00 100 4.94 94
Upper limb Management of the patient with subacromial impingement 4.80 80 4.78 78
Management of the patient with rotator cuff tear 5.00 100 4.89 89
Management of the patient with adhesive capsulitis 4.94 94 4.78 83
Management of the patient with distal biceps rupture 5.00 100 4.67 78
Management of the patient with carpal tunnel syndrome 4.60 73 4.44 61
Tumor Management of the patient with soft-tissue sarcoma NA NA 4.55 61
Management of the patient with bone sarcoma NA NA 4.91 91
Management of the patient with myeloma and bony metastatic disease NA NA 4.55 64
Management of common benign bone tumors NA NA 4.73 82
Pediatrics Management of the patient with nonaccidental injury 4.87 93 4.95 94
Management of the patient with humerus fractures, including supracondylar 5.00 100 4.89 89
Management of the patient with lateral condyle fracture 4.87 86 4.67 67
Management of the patient with distal radius fractures 5.00 100 4.94 94
Management of the patient with the acutely painful hip 4.93 93 5.00 100
Management of the patient with slipped capital femoral epiphysis 5.00 100 4.94 94
Arthroplasty Nonoperative management of the patient with hip osteoarthritis 4.88 86 4.93 93
Nonoperative management of the patient with knee osteoarthritis 4.75 80 4.93 93
Managing the patient undergoing a simple TKR 4.81 80 4.87 87
Management of the patient with an infected TKR 4.69 66 4.40 60
Management of the patient undergoing THR 4.87 86 4.87 87
Management of the patient with an infected THR 4.75 73 4.47 60
Foot and ankle Management of the patient with ankle osteoarthritis 4.40 63 4.56 64
Management of the patient with hallux valgus 4.73 71 4.44 69
Management of the patient with hallux rigidus 4.67 64 4.38 63
*

ACL = anterior cruciate ligament, EPA = entrustable professional activity, NA = not applicable, THR = total hip replacement, and TKR = total knee replacement.

The final list of EPAs, including the weighted mean and percentage of surgeons that considered it mandatory at each survey stage.

Discussion

CBME was introduced over 2 decades ago and has been widely adopted by the medical community27,28, influencing the recent reform of undergraduate and postgraduate training programs. Although lists of core competencies7 and milestones11 have been developed to guide training programs, how to focus the assessment of competence has been less clear. Using expert consensus, we introduce a series of EPAs for orthopaedic residency training that can be used in the assessment of competency.

An EPA is an individual unit of professional activity that represents the critical components of what clinicians do and is focused on outcomes of care29,30. Each EPA is made up of several different competencies; as such, an EPA does not replace competencies but rather translates them into clinical practice in a meaningful way13,29. Rather than evaluating the attainment of a single competency, an EPA states that a trainee can perform a relevant clinical task safely and independently, which may be much more meaningful. Furthermore, the assessment of an EPA does not focus simply on the performance of technical skills. Management of patient conditions requires competency in many nontechnical skills, such as professionalism, interpersonal and communication skills, and systems-based practice13, which should also be assessed.

Many medical specialties have begun to develop EPAs as a foundation for assessment in CBME14,17-19. However, the implementation of EPAs in graduate medical education has so far been relatively limited31. Incorporation of EPAs into assessment has been used to determine the clinical skills of general surgery32,33, pediatric34, and family medicine residents8,35. Valentine et al.36 used EPA assessment in general practice to demonstrate that entrustment levels changed over time, a finding mirrored in a study of pediatric fellows37. In 2016, Dwyer et al. used simulation to assess the performance of orthopaedic residents in ankle fracture, hip fracture, and total knee arthroplasty EPAs, finding that senior residents performed better than junior residents2.

There are challenges associated with the creation of lists of EPAs for training programs. The first challenge is developing a list of EPAs that reflects only the critical activities performed by the medical or surgical specialist. ten Cate and Scheele have proposed that 20 to 30 EPAs should serve as the foundation for a postgraduate curriculum12, on the basis that a more exhaustive list may compromise the feasibility of an EPA-based curriculum, resulting in a burdensome checklist exercise that provides little insight into trainee ability29. In our study, we used the Delphi method to develop a list of 49 EPAs that more than 50% of our experts determined were mandatory for a trainee exiting general orthopaedic training to be proficient in; this exceeds the recommended number but reflects the fact that orthopaedics is an extremely broad specialty. Despite this, many subspecialists might be concerned about the limited number of hand and wrist EPAs, for example. Although the management of distal radius fractures in both the adult and the pediatric population made the list, as well as the management of carpal tunnel syndrome, this limited number of hand EPAs may reflect the fact that hand surgery in North America is primarily performed by subspecialty surgeons rather than the general orthopaedic surgeon.

However, it is important to note that these EPAs do not define all of the knowledge that must be obtained during orthopaedic training; rather, they serve as a guiding platform for the assessment of competence during training. For these reasons, this list of orthopaedic EPAs is presented for wider discussion by training programs as appropriate, depending on their specific circumstances and resources. Although we believe that the list of mandatory EPAs we have created will be extremely useful, it is not meant to be definitive or mandatory for all training programs. Rather, it can serve as a scientific basis to which programs can add, or from which they can delete, as required. For example, training programs may decide that some other EPAs in our original list of 285 in Appendix 1 are important components of training; alternatively, programs might decide that 49 EPAs are too burdensome for assessment during training and shorten the list. These EPAs were developed by participants from a wide range of institutions, albeit from a single province within Canada. Broader consensus and participation across North America might serve as further validation of this list. A study by Freedman and Bernstein38 used department chairmen to create a list of orthopaedic topics for medical students; a formal review of our list of orthopaedic EPAs by program heads across North America may help provide further validation.

Although there continues to be widespread adoption of CBME, criticisms of and challenges with this training and assessment model have emerged39-41, particularly for postgraduate programs. At this time, there is limited evidence that competency-based training is superior to traditional time-based training. Nousiainen et al.42 published a review of the first 8 years of a competency-based training program in orthopaedic residency, demonstrating comparable graduation rates and high satisfaction in a cohort of trainees within a competency-based model. Although the competency-based model was both more expensive and more complex to organize, it was associated with a decrease in training duration for some of the participating residents. However, at this time, there is no evidence that CBME results in improved graduate performance, and without suitable metrics, this may be difficult to prove.

This article has several limitations. We used the modified Delphi technique to develop the list of EPAs; different methods of list generation may have produced a shorter or longer list of mandatory EPAs. However, the modified Delphi methodology is commonly accepted and widely used in this manner. In this study, 2 textbooks were used to create the initial list of possible EPAs; a different selection of textbooks might have altered that initial list. We also used a minimum of 5 academic and 5 community surgeons with subspecialty expertise to review each list; the number of participants required for a modified Delphi technique is unknown24. However, the size of a Delphi panel is generally under 5043, with fewer participants required when the backgrounds of the participants are more homogeneous24. We have made no designation as to the case complexity applicable to each EPA. For example, will the management of a patient with an ankle fracture be reserved for healthy patients with simple fracture patterns, or will it include more challenging cases such as patients with diabetes, elderly patients, and comminuted fractures? We have also not determined timelines for attaining independence on all EPA tasks. If it is mandatory for trainees to perform these EPAs independently before completing their training, a training program based on EPAs will need flexible timelines to accommodate trainee variation. This flexibility is in line with the 2010 Carnegie Report, which stated the need to reform medical training toward fixed standards with flexible timelines44.

Finally, as previously mentioned, we acknowledge that there is a lack of evidence that independence on these activities correlates directly with success in independent practice, or that CBME results in improved performance of graduates in the workplace; as such, further study is required.

Conclusion

Expert consensus was used to create a list of EPAs considered mandatory for completion of resident training in orthopaedics in our province. The final list of 49 peer-reviewed EPAs will provide a valuable benchmark for curriculum design and assessment in orthopaedic surgery for other programs in the competency-based era.

Appendix

Supporting material provided by the authors is posted with the online version of this article as a data supplement at jbjs.org (http://links.lww.com/JBJSOA/A248, http://links.lww.com/JBJSOA/A249, and http://links.lww.com/JBJSOA/A250). This content was not copy-edited or verified by JBJS.

Acknowledgments

Note: The authors would like to thank Professor Debra Nestel and Associate Professor Martin Richardson for their guidance, and the Royal Australian College of Surgeons for its support through the Ian and Ruth Gough Surgical Education Fellowship.

Footnotes

Investigation performed at Women's College Hospital, Toronto, Ontario, Canada

Disclosure: The Disclosure of Potential Conflicts of Interest forms are provided with the online version of the article (http://links.lww.com/JBJSOA/A245).

References

1. ten Cate O, Scheele F. Competency-based postgraduate training: can we bridge the gap between theory and clinical practice? Acad Med. 2007;82(6):542-7.
2. Dwyer T, Wadey V, Archibald D, Kraemer W, Shantz JS, Townley J, Ogilvie-Harris D, Petrera M, Ferguson P, Nousiainen M. Cognitive and psychomotor entrustable professional activities: can simulators help assess competency in trainees? Clin Orthop Relat Res. 2016;474(4):926-34.
3. Mulder H, Ten Cate O, Daalder R, Berkvens J. Building a competency-based workplace curriculum around entrustable professional activities: the case of physician assistant training. Med Teach. 2010;32(10):e453-9.
4. Chang A, Bowen JL, Buranosky RA, Frankel RM, Ghosh N, Rosenblum MJ, Thompson S, Green ML. Transforming primary care training--patient-centered medical home entrustable professional activities for internal medicine residents. J Gen Intern Med. 2013;28(6):801-9.
5. Carraccio C, Burke AE. Beyond competencies and milestones: adding meaning through context. J Grad Med Educ. 2010;2(3):419-22.
6. Kellam JF, Archibald D, Barber JW, Christian EP, D'Ascoli RJ, Haynes RJ, Hecht SS, Hurwitz SR, Kellam JF, McLaren AC, Peabody TD, Southworth SR, Strauss RW, Wadey VMR. The core competencies for general orthopaedic surgeons. J Bone Joint Surg Am. 2017;99(2):175-81.
7. Wadey VM, Dev P, Buckley R, Walker D, Hedden D. Competencies for a Canadian orthopaedic surgery core curriculum. J Bone Joint Surg Br. 2009;91(12):1618-22.
8. Shaughnessy AF, Chang KT, Sparks J, Cohen-Osher M, Gravel J, Jr. Assessing and documenting the cognitive performance of family medicine residents practicing outpatient medicine. J Grad Med Educ. 2014;6(3):526-31.
9. Frank JR, Snell LS, Cate OT, Holmboe ES, Carraccio C, Swing SR, Harris P, Glasgow NJ, Campbell C, Dath D, Harden RM, Iobst W, Long DM, Mungroo R, Richardson DL, Sherbino J, Silver I, Taber S, Talbot M, Harris KA. Competency-based medical education: theory to practice. Med Teach. 2010;32(8):638-45.
10. Nasca TJ, Philibert I, Brigham T, Flynn TC. The next GME accreditation system—rationale and benefits. N Engl J Med. 2012;366(11):1051-6.
11. Mery CM, Greenberg JA, Patel A, Jaik NP. Teaching and assessing the ACGME competencies in surgical residency. Bull Am Coll Surg. 2008;93(7):39-47.
12. ten Cate O. Nuts and bolts of entrustable professional activities. J Grad Med Educ. 2013;5(1):157-8.
13. ten Cate O. AM last page: what entrustable professional activities add to a competency-based curriculum. Acad Med. 2014;89(4):691.
14. Caverzagie KJ, Cooney TG, Hemmer PA, Berkowitz L. The development of entrustable professional activities for internal medicine residency training: a report from the Education Redesign Committee of the Alliance for Academic Internal Medicine. Acad Med. 2015;90(4):479-84.
15. Jones MD, Jr, Rosenberg AA, Gilhooly JT, Carraccio CL. Perspective: competencies, outcomes, and controversy—linking professional activities to competencies to improve resident education and practice. Acad Med. 2011;86(2):161-5.
16. Lowry BN, Vansaghi LM, Rigler SK, Stites SW. Applying the milestones in an internal medicine residency program curriculum: a foundation for outcomes-based learner assessment under the next accreditation system. Acad Med. 2013;88(11):1665-9.
17. Shaughnessy AF, Sparks J, Cohen-Osher M, Goodell KH, Sawin GL, Gravel J, Jr. Entrustable professional activities in family medicine. J Grad Med Educ. 2013;5(1):112-8.
18. Jonker G, Hoff RG, Ten Cate OT. A case for competency-based anaesthesiology training with entrustable professional activities: an agenda for development and research. Eur J Anaesthesiol. 2015;32(2):71-6.
19. Boyce P, Spratt C, Davies M, McEvoy P. Using entrustable professional activities to guide curriculum development in psychiatry training. BMC Med Educ. 2011;11:96.
20. Canale ST, Azar FA, Beaty JH, Campbell WC. Campbell's Operative Orthopaedics. 13th ed. Philadelphia, PA: Elsevier; 2017.
21. Miller MD, Thompson SR. Miller's Review of Orthopaedics. 7th ed. Philadelphia, PA: Elsevier; 2016.
22. Woolf AD, Walsh NE, Akesson K. Global core recommendations for a musculoskeletal undergraduate curriculum. Ann Rheum Dis. 2004;63(5):517-24.
23. Dalkey N. The Delphi method: an experimental study of group opinion. 1969. Available at: https://www.rand.org/pubs/research_memoranda/RM5888.html. Accessed July 29, 2016.
24. Hsu CC, Sandford BA. The Delphi technique: making sense of consensus. Pract Assess Res Evaluat. 2007;12:1-8.
25. Custer RL, Scarcella JA, Stewart BR. The modified Delphi technique—a rotational modification. J Vocational Tech Educ. 1999;15(2):50-8.
26. Hasson F, Keeney S, McKenna H. Research guidelines for the Delphi survey technique. J Adv Nurs. 2000;32(4):1008-15.
27. Sklar DP. Competencies, milestones, and entrustable professional activities: what they are, what they could be. Acad Med. 2015;90(4):395-7.
28. Pangaro L, ten Cate O. Frameworks for learner assessment in medicine: AMEE guide no. 78. Med Teach. 2013;35(6):e1197-210.
29. El-Haddad C, Damodaran A, McNeil HP, Hu W. The ABCs of entrustable professional activities: an overview of “entrustable professional activities” in medical education. Intern Med J. 2016;46(9):1006-10.
30. Carraccio C, Englander R, Holmboe ES, Kogan JR. Driving care quality: aligning trainee assessment and supervision through practical application of entrustable professional activities, competencies, and milestones. Acad Med. 2016;91(2):199-203.
31. O'Dowd E, Lydon S, O'Connor P, Madden C, Byrne D. A systematic review of 7 years of research on entrustable professional activities in graduate medical education, 2011-2018. Med Educ. 2019;53(3):234-49.
32. Wagner JP, Lewis CE, Tillou A, Agopian VG, Quach C, Donahue TR, Hines OJ. Use of entrustable professional activities in the assessment of surgical resident competency. JAMA Surg. 2018;153(4):335-43.
33. Stahl CC, Collins E, Jung SA, Rosser AA, Kraut AS, Schnapp BH, Westergaard M, Hamedani AG, Minter RM, Greenberg JA. Implementation of entrustable professional activities into a general surgery residency. J Surg Educ. 2020;77(4):739-48.
34. Larrabee JG, Agrawal D, Trimm F, Ottolini M. Entrustable professional activities: correlation of entrustment assessments of pediatric residents with concurrent subcompetency milestones ratings. J Grad Med Educ. 2020;12(1):66-73.
35. Schultz K, Griffiths J, Lacasse M. The application of entrustable professional activities to inform competency decisions in a family medicine residency program. Acad Med. 2015;90(7):888-97.
36. Valentine N, Wignes J, Benson J, Clota S, Schuwirth LW. Entrustable professional activities for workplace assessment of general practice trainees. Med J Aust. 2019;210(8):354-9.
37. Mink RB, Schwartz A, Herman BE, Turner DA, Curran ML, Myers A, Hsu DC, Kesselheim JC, Carraccio CL; Steering Committee of the Subspecialty Pediatrics Investigator Network (SPIN). Validity of level of supervision scales for assessing pediatric fellows on the common pediatric subspecialty entrustable professional activities. Acad Med. 2018;93(2):283-91.
38. Freedman KB, Bernstein J. The adequacy of medical school education in musculoskeletal medicine. J Bone Joint Surg Am. 1998;80(10):1421-7.
39. Brightwell A, Grant J. Competency-based training: who benefits? Postgrad Med J. 2013;89(1048):107-10.
40. Glass JM. Competency based training is a framework for incompetence. BMJ. 2014;348:g2909.
41. Hodges BD. A tea-steeping or i-Doc model for medical education? Acad Med. 2010;85(9 suppl):S34-44.
42. Nousiainen MT, Mironova P, Hynes M, Glover Takahashi S, Reznick R, Kraemer W, Alman B, Ferguson P; CBC Planning Committee. Eight-year outcomes of a competency-based residency training program in orthopedic surgery. Med Teach. 2018;40(10):1042-54.
43. Altschuld JW, Watkins R. A primer on needs assessment: more than 40 years of research and practice. In: Needs Assessment: Trends and a View Toward the Future. New Directions for Evaluation. San Francisco, CA: Jossey-Bass; 2014:5-18.
44. Cate OT. A primer on entrustable professional activities. Korean J Med Educ. 2018;30(1):1-10.
