Journal of Graduate Medical Education. 2014 Mar;6(1 Suppl 1):317–319. doi: 10.4300/JGME-06-01s1-39

Development of the Educational Milestones for Surgery

Thomas H Cogbill, Susan R Swing, on behalf of the Surgery Milestone Working Group
PMCID: PMC3966598  PMID: 24701295

Milestone Development

A working group (box) to develop educational Milestones for surgery was first convened in 2009. Under the leadership of Richard H. Bell Jr, MD, of the American Board of Surgery (ABS), the group consisted of 12 members representing the ABS, the Accreditation Council for Graduate Medical Education (ACGME), the Residency Review Committee for Surgery (RRC-Surgery), and the Association of Program Directors in Surgery (APDS). The group included 1 resident member and an expert in surgical evaluation and performance assessment. In 2011, 5 new members were added: 2 new members each from the RRC-Surgery and the APDS, and a new chair from the ABS, Thomas H. Cogbill, MD.

Box Surgery Milestone Working Group Members

Thomas H. Cogbill, MD, Gundersen Health System, Chair

Stanley W. Ashley, MD, Harvard Medical School; Brigham and Women's Hospital

Richard H. Bell Jr, MD, American Board of Surgery, Past Chair (2009–2010); Northwestern University

Karen R. Borman, MD, Abington Memorial Hospital

Jo Buyske, MD, American Board of Surgery

Joseph B. Cofer, MD, University of Tennessee–Chattanooga Medical Center

Adeline M. Deladisma, MD, Indiana Surgical Specialists

Mark L. Friedell, MD, University of Missouri-Kansas City

James C. Hebert, MD, University of Vermont College of Medicine

Mark A. Malangoni, MD, American Board of Surgery

Peggy Simpson, EdD, Accreditation Council for Graduate Medical Education (ACGME)

Susan R. Swing, PhD, ACGME

Paula M. Termuhlen, MD, Medical College of Wisconsin

R. James Valentine, MD, University of Texas Southwestern Medical Center

Charles W. Van Way III, MD, University of Missouri-Kansas City School of Medicine

Thomas V. Whalen, MD, Lehigh Valley Health Network, 2009–2010

Reed G. Williams, PhD, Indiana University Medical School

The working group began the development of the Milestones by reviewing the Dreyfus model of professional development (from novice to expert),1 along with perspectives from the literature on expertise,2 and contemporary methods of evaluation for surgeons in training.3–5 The working group then undertook a multistep project to define the primary domains of practice in which surgical trainees needed to develop proficiency. Eight distinct domains of surgical practice were selected: (1) care for diseases and conditions; (2) coordination of care; (3) performance of operations and procedures; (4) self-directed learning; (5) teaching; (6) improvement of care; (7) maintenance of physical and emotional health; and (8) performance of administrative tasks. These domains provided a framework for identifying Milestones integrally related to residents' learning and performance of patient care and related professional responsibilities. The group identified the ACGME competencies that were most relevant to each domain and then developed Milestones for each of these competency-domain pairs.

The group drew from the widely used SCORE curriculum to describe advancing expertise in patient care and technical skills.6 For example, early levels of patient care for the domain “Performance of Operations and Procedures” describe the expectation that residents perform steps in some essential/common operations identified in the SCORE curriculum, whereas higher levels describe the expectation that residents perform all essential operations without the need for direct supervision and have significant experience with complex operations in the SCORE curriculum. The working group also referenced the 2011 report from the ACGME expert panel on Milestone development for professionalism, interpersonal and communication skills, practice-based learning and improvement, and systems-based practice to modify draft Milestones. A guiding principle was to make the Milestones as succinct as possible while ensuring that the essential skills, knowledge, and behaviors for unsupervised surgical practice were represented. With this in mind, the working group conducted numerous reviews of progressive iterations of the Milestones and made revisions to reduce redundancy and fine-tune the Milestones.

Organization of the Surgery Milestones

The final product consisted of 16 subcompetencies, each mapped to both an ACGME competency and a single domain of surgical practice. For each subcompetency, descriptors were composed to define a critical deficiency and a “Milestone set” consisting of 4 levels of performance; these descriptors were designed to discriminate between each of the levels of performance. The 16 sets of Surgery Milestones map to the ACGME competencies as follows: 3 Milestone sets for patient care; 2 for medical knowledge; 2 for systems-based practice; 3 for practice-based learning and improvement; 3 for professionalism; and 3 for interpersonal and communication skills. The Milestone sets map to the domains of surgical practice as follows: 5 sets for care for diseases and conditions; 2 for coordination of care; 3 for performance of operations and procedures; 1 for self-directed learning; 1 for teaching; 2 for improvement of care; 1 for maintenance of physical and emotional health; and 1 for performance of administrative tasks.

Milestone Testing and Validation

In December 2011, an alpha pilot test of a semiannual evaluation form for the Surgery Milestones was conducted in the 8 surgical residencies represented by 8 members of the Surgery Milestone Working Group. The results and comments were compiled and shared with the entire working group. The document underwent significant revision, with reordering of some descriptors by level of performance, incorporation of more consistent language, and removal of duplicative descriptions. The revised document and a summary of the process used in its development were shared with the ABS, the APDS, and the RRC-Surgery at the spring meeting of each organization. Additional revisions were made in response to suggestions from these organizations.

In June 2012, the working group launched a 2-phase beta pilot test. An invitation was sent to 22 diverse general surgery residency programs; residencies with a stakeholder member of the Milestone Working Group were excluded. The test involved Clinical Competency Committee (CCC) evaluation of a sample of a program's residents (2 residents per postgraduate year) at 2 times approximately 6 months apart, the first at the end of academic year 2011–2012 and the second in midacademic year 2012–2013. The pilot was designed to investigate whether the Milestones would detect change in resident learning and performance over a 6-month period, and whether the predicted patterns of performance across levels of training and levels of Milestones would occur. Participants also completed surveys at the end of each phase. Among the information collected were participant perspectives on Milestone clarity, usefulness relative to existing methods, and feasibility of implementation, including time to complete the Milestone report. The survey also asked participants to identify Milestones that needed revision.

Eighteen programs completed Phase 1 in August 2012 and 17 programs completed Phase 2 in January 2013. The typical CCC consisted of 4 members and the average time to complete the semiannual resident evaluation was 18 minutes in Phase 1 and 14 minutes in Phase 2. For most residents, CCCs selected higher Milestone performance levels for Phase 2 than Phase 1 reporting. Milestone evaluation results showed the expected progression for most subcompetencies.

Faculty participants provided positive evaluations of the Milestones, with 97% of respondents reporting that the descriptors were clearly understandable. In addition, 85% concluded that use of the Milestones allowed for meaningful evaluation of the resident, and 76% stated that Milestone-based evaluations provided a fairer and more systematic semiannual evaluation than their program's current review process. The working group also trialed 1 addition to the CCC process: the pilot instructions asked that all members review all residents in the context of the CCC. A Phase 2 survey question asked participants which CCC approach they anticipated using in the future. The most frequent response cited the need to distribute the Milestone reporting work across time, committee members, and education sites.

The results of this comprehensive pilot were shared with the ABS, ACS, and APDS at their respective annual meetings. In May 2013, the Surgery Milestone Working Group considered the comments and suggestions received from the participants of the beta pilot test as well as from members of the aforementioned organizations. A considerable number of additions, revisions, and deletions were made to incorporate this feedback, and the document was finalized. The final draft of the Surgery Milestone document was posted to the ACGME website in June 2013.

Evaluation Using the Milestones

Early in Milestone design, the working group considered the process by which Milestone evaluations would be accomplished. An evaluation plan was drafted that identified a set of existing evaluation tools that could be used at each level of training. The working group's discussions emphasized that the evaluation process would need to be continuous and practical, and that new evaluation tools would need to be developed to allow CCCs to complete the semiannual reports for all residents.

Future Development and Refinement

The working group intends the educational Milestones in surgery to be a dynamic document. Future changes will be made based on a review of suggestions and comments by surgery program directors and CCC members at surgery residency programs. The group anticipates that changes also will be needed to parallel advances in surgical care and education. The working group is eager to learn how the document will perform both for individual resident evaluation and as a tool for the accumulation of national aggregate data on resident performance.

References

  1. Batalden P, Leach D, Swing S, Dreyfus H, Dreyfus S. General competencies and accreditation in graduate medical education. Health Aff (Millwood). 2002;21(5):103–111. doi: 10.1377/hlthaff.21.5.103.
  2. Ericsson KA, Charness N, Feltovich PJ, Hoffman RR, editors. The Cambridge Handbook of Expertise and Expert Performance. Cambridge, UK: Cambridge University Press; 2006.
  3. Sanfey H, Williams RG, Chen X, Dunnington GL. Evaluating resident operative performance: a qualitative analysis of expert opinions. Surgery. 2011;150(4):759–770. doi: 10.1016/j.surg.2011.07.058.
  4. Williams RG, Sanfey H, Chen XP, Dunnington GL. A controlled study to determine measurement conditions necessary for a reliable and valid operative performance assessment: a controlled prospective observational study. Ann Surg. 2012;256(1):177–187. doi: 10.1097/SLA.0b013e31825b6de4.
  5. Tabuenca A, Welling R, Sachdeva AK, Blair PG, Horvath K, Tarpley J, et al. Multi-institutional validation of a web-based core competency assessment system. J Surg Educ. 2007;64(6):390–394. doi: 10.1016/j.jsurg.2007.06.011.
  6. Bell RH. National curricula, certification and credentialing. Surgeon. 2011;9(suppl 1):10–11. doi: 10.1016/j.surge.2010.11.007. Epub 2011 Feb 22.
