Table 3. Existing methods for determining surgical competency
| Method | Basic structure | Assessment | Procedure examples | Validated¹ for practicing surgeons | Validated¹ for surgical trainees |
|---|---|---|---|---|---|
| **Licensing bodies assessments** | | | | | |
| State Medical Boards[50-52] | Mandatory to practice. Required to demonstrate competency through CME. However, individual states may evaluate professional conduct when a physician fails to provide an appropriate quality of care | Must regularly participate in CME activities and may require board certification. May require competency evaluation by an independent evaluator or approved assessment program if there are signs of dyscompetence | - | Unclear | - |
| ABMS[18,19] | Voluntary certification to show knowledge of standards of practice. Rigorous process of evaluation every 10 yr with MOC | MOC consists of 4-part assessment: Licensure/professional standing, participation in CME programs, cognitive expertise through examination, and documentation of quality of care and/or audits or peer review | - | Unclear | - |
| Provincial Licensing Bodies in Canada[53-55] | Mandatory to practice. Required to demonstrate competency through CME. Provincial licensing bodies identify those with deficiencies in competence, requiring peer review | Must regularly participate in CME activities. If there is evidence of dyscompetence, a rigorous individualized assessment of the surgeon's practice is performed, with emphasis on quality of care | - | Unclear | - |
| Fellows of the RCPSC[56,57] | Voluntary certification to show commitment to competent practice. Evaluation and successful completion of MOC program every 5 yr | Must participate in CPD activities. MOC based on 3-section framework: Group learning, self-learning, and assessment | - | Unclear | - |
| **Non-licensing bodies assessments** | | | | | |
| OSATS[58-61] | Multi-station and timed, with bench and live model simulations or surgical procedures. Peer evaluated with rating scale | Checklist and global rating scale by expert examiner to evaluate technical skill. Does not assess decision making or concrete surgical aspects | Laparoscopic gastric bypass. Saphenofemoral dissection. Meniscectomy, transtibial or anteromedial femoral tunnel | Yes | Yes |
| C-SATS[26,31] | Video-recorded surgical performance evaluated with a validated rating scale | Crowds of anonymous and independent reviewers, including those without medical training, evaluate surgical skill with validated performance tools such as OSATS | Urinary bladder closure. Robotic surgery skills | No | Yes |
| O-SCORE[27,62] | Surgical procedure peer evaluated with rating scale | Surgical experts rate performance with a 9-item tool and scaling system to assess competence to perform the procedure independently | Open reduction internal fixation of hip, wrist, or ankle. Arthroplasty (total hip or hemi). Knee arthroscopy | No | Yes |
| GOALS[63,64] | Laparoscopic procedure peer evaluated with rating scale | Surgical experts evaluate performance with 5-point rating scale of 5 items unique to laparoscopy | Laparoscopic cholecystectomy | No | Yes |
| GEARS[65,66] | Robotic procedures peer evaluated with rating scale | Surgical experts evaluate performance with 5-point rating scale of 6 items unique to robotic surgery | Inanimate simulators: continuous suturing. Prostatectomy | No | Yes |
| Direct Objective Metric Measures[67,68] | Skill/surgical procedure measured with concrete aspects | Measurement of stiffness and failure load for each repair construct, with comparison to expected rehabilitation loads | Tibial plafond fracture reduction. Distal radius fracture reduction | No | Yes |
¹Methods to determine surgical competency are deemed valid for (1) experienced surgeons; or (2) residents/trainees if the assessment (continuing medical education, maintenance of certification, or rated technical skill) correlated with experience level and/or with patient outcomes. Validity was shown for specific procedures within specific subspecialties; for example, experienced bariatric surgeons with higher-rated technical skill in laparoscopic gastric bypass surgery had patients with fewer postoperative complications[46]. Generalized validity of the technical skill assessments has yet to be shown in the literature, although validity was typically demonstrated across several procedures. ABMS: American Board of Medical Specialties; CME: Continuing medical education; MOC: Maintenance of certification; RCPSC: Royal College of Physicians and Surgeons of Canada; CPD: Continuing professional development; OSATS: Objective structured assessment of technical skill; C-SATS: Crowdsourced assessment of technical skills; O-SCORE: Ottawa Surgical Competency Operative Room Evaluation; GOALS: Global operative assessment of laparoscopic skills; GEARS: Global evaluative assessment of robotic skills.