Background:
Robust assessment of skills acquisition and surgical performance during training is vital to ensuring operative competence among orthopaedic surgeons. A move to competency-based surgical training requires the use of tools that can assess surgical skills objectively and systematically. The aim of this systematic review was to describe the evidence for the utility of assessment tools used in evaluating operative performance in trauma and orthopaedic surgical training.
Methods:
We performed a comprehensive literature search of MEDLINE, Embase, and Google Scholar databases to June 2019. From eligible studies we abstracted data on study aim, assessment format (live theater or simulated setting), skills assessed, and tools or metrics used to assess surgical performance. The strengths, limitations, and psychometric properties of the assessment tools are reported on the basis of previously defined utility criteria.
Results:
One hundred and five studies published between 1990 and 2019 were included. Forty-two studies involved open orthopaedic surgical procedures, and 63 involved arthroscopy. The majority of studies (85%) were conducted in the simulated environment. There was wide variation in the types of assessment tools in use, the strengths and weaknesses of which are assessor- and setting-dependent.
Conclusions:
Current technical skills-assessment tools in trauma and orthopaedic surgery are largely procedure-specific and limited to research use in the simulated environment. An objective technical skills-assessment tool that is suitable for use in the live operative theater requires development and validation, to ensure proper competency-based assessment of surgical performance and readiness for unsupervised clinical practice.
Clinical Relevance:
Trainers and trainees can gain further insight into the technical skills assessment tools that they use in practice through the utility evidence provided.
Within an educational paradigm shift toward competency-based measures of performance in surgical training1, there is a need to evaluate surgical skills objectively and systematically, and hence, there is a drive toward developing more reliable and valid measures of surgical competence1-3.
Several surgical skill-assessment tools are currently in use in orthopaedic training, and studies evaluating the ability of these tools to objectively measure surgical performance have been performed. To our knowledge, this is the first systematic appraisal of the evidence for these assessment tools. It is imperative that the modernization of surgical curricula be supported by evidence-based tools for assessing technical skill, enabling summative judgments on progression through training and readiness for unsupervised operating.
The aim of this systematic review was to evaluate the orthopaedic surgical-competency literature and report on the metrics and tools used for skills assessment in trauma and orthopaedic surgical training; their utility with respect to validity, reliability, and impact on learning; and evidence for strengths and weaknesses of the various tools.
Materials and Methods
This review was conducted in accordance with the Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) guidelines4 and registered with PROSPERO (International Prospective Register of Systematic Reviews)5.
Data Sources
We performed a comprehensive literature search of MEDLINE, Embase, and Google Scholar electronic databases. The search strategy was developed by collating keywords from an initial scoping search (Table I). Categories 1, 2, and 3 were combined using Boolean “AND/OR” operators and results were limited to human subjects. No date or language limits were applied. The last search was performed in June 2019. Duplicates were removed, and retrieved titles were screened for initial eligibility.
TABLE I.
Search Strategy
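Purely as a hypothetical illustration (the terms below are our own and are not those of Table I), keyword categories of this kind can be combined with Boolean operators in an Ovid-style interface as follows:

```
1. exp Orthopedics/ OR orthop?edic*.mp OR arthroscop*.mp
2. (resident* OR trainee* OR registrar* OR "surgical training").mp
3. (assess* OR competen* OR "technical skill*" OR "rating scale*").mp
4. 1 AND 2 AND 3
5. limit 4 to humans
```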
Study Selection
Eligible for inclusion were primary empirical research studies assessing postgraduate surgical resident performance in open or arthroscopic orthopaedic surgical skills in a simulated or live operative theater environment. Nonempirical studies and those that focused solely on patient or procedural outcome, or only described a training intervention, were excluded. A deliberately broad search strategy was employed to capture all studies in which an orthopaedic surgical skill was assessed.
Title and Abstract Review
The search identified 2,305 citations. Initial title screening was undertaken by 1 author (H.K.J., a doctoral researcher), with obviously irrelevant studies excluded. One hundred and eighty-seven abstracts subsequently underwent screening by 2 authors (H.K.J. and A.W.C., an attending surgeon), and 106 were retrieved in full text. Of these, 105 were included in the final review (1 study was excluded at full-text review because the participants were not surgical residents). Studies were rejected at screening if they were not empirical research, if the study participants were undergraduates, or if nontechnical skills were being assessed; studies reporting simulator protocol development or validation were also excluded at this stage. The reference lists of full-text articles were examined for relevant studies, and those found by hand searching were subjected to the same eligibility screening process.
Data Extraction and Analysis
Data items relevant to the review objectives were extracted into a structured form to ensure consistency. The first reviewer undertook data extraction for all studies. Extracted data included study aim, setting, assessment format, number and training stage of participants, skills assessed, assessment tool and/or metrics, assessment tool category, study results, and “take-home” message related to the assessment tool. Assessment tools were classified by the type of method; the following categories were defined: traditional written assessments, objective assessment of technical skill, procedure-specific rating scale, individual procedural metrics, movement analysis, psychomotor testing, and subjective assessments.
Results
Search Results
One hundred and six articles were evaluated in detail, 1 of which was excluded at full-text review because the participants were not surgeons-in-training; 105 articles were therefore included in the review. The flow of studies is shown in Figure 1.
Fig. 1.
PRISMA flowchart.
Study Aims, Setting, and Participants
The studies were broadly split into 3 categories: studies measuring the impact of a simulation training intervention (26 studies6-31), studies assessing the construct validity of a simulator designed for training surgeons (42 studies32-73), and studies validating an assessment tool (37 studies74-110) (see Appendix Tables 1 and 2, column 1). Of the included studies, 60% assessed arthroscopic skill involving the knee (34 studies)6,8,9,13,15,17,19,31-33,36,38,39,41,42,47,48,54-56,63,74,75,77-83,86,87,89,91, the shoulder (25 studies)7,12,15,16,18,20,37,40,44,45,49,51,53,54,56,63,76,77,81,82,88,90-92,110, the hip (3 studies)43,50,85, the ankle (1 study)14, and basic general arthroscopic skills (6 studies)10,11,34,52,57,84. The majority (70%) of the studies assessing arthroscopic skill concerned diagnostic arthroscopy; procedural arthroscopic skills assessed included arthroscopic Bankart repair (3 studies), rotator cuff repair (1 study), labral repair (1 study), meniscal repair (2 studies), and anterior cruciate ligament graft preparation (2 studies) and insertion (1 study) (see Appendix Table 1, column 5). The 42 studies that assessed open surgical procedures are shown in Appendix Table 2; as shown in column 5 of the table, the open procedures assessed included dynamic hip screw (DHS) fixation (4 studies), cannulated hip pinning (2 studies), and hemiarthroplasty (1 study) for a fractured femoral neck; spinal pedicle screw placement (6 studies); open surgical approaches to the shoulder (1 study); hand-trauma skills including nail-bed repair, Z-plasty, metacarpal fracture fixation, and tendon repair (1 study each); and various open reduction and internal fixation (ORIF) procedures for fractures of the forearm (7 studies), ankle (2 studies), and tibia (1 study), and complex articular fractures (1 study). Elective hand procedures, including trigger-finger release (1 study) and carpal tunnel decompression (3 studies), and elective hip (1 study) and knee (1 study) arthroplasty were also assessed.
The majority of studies (85%) assessed skills in the simulated setting, 10 studies assessed skills in the live operative theater, and 10 studies assessed skills in both the simulated and live operative theater. Overall, 2,088 orthopaedic resident participants were involved in the studies, with experience levels ranging from postgraduate year (PGY) 1 to 10.
Assessment Format
The assessment format varied considerably (see Appendix Tables 1 and 2, column 3). Fifty-nine studies assessed performance using live observation, and 50 used post-hoc analysis of video footage by experts. Simulator-derived metrics were used in 72 studies. Final-product analysis by expert assessors was used in 3 studies, and biomechanical testing of the final product was used in 7.
Assessment Tools or Metrics
A wide variety of assessment tools were used (see Appendix Table 3). Traditional assessments, such as written examinations, were used in 5 studies. Objective assessment of technical skills was widely used and took many forms: task-specific checklists (20 studies), global rating scales (19 studies), and novel objective skills-assessment tools for both arthroscopy (22 studies) and open surgery (6 studies). Procedure-specific rating scales were used for both arthroscopic (7 studies) and open procedures (6 studies). Individual procedural metrics, such as final-product analysis and procedure time, were used in 56 studies. Movement analysis using simulator-derived metrics, such as hand movements, gaze tracking, hand-position checking, and instrument speed and path length, was used in 22 studies. Psychomotor testing using commercial dexterity tests was used in 5 studies. Subjective assessment measures were used in 4 studies.
Quality Assessment
Van Der Vleuten described a series of utility criteria, known as the “utility index,” which is a widely accepted framework for assessing an evaluation instrument111. The features of the utility index are described in Table II. Each assessment tool was appraised for utility; the evidence for each of the various technical skills-assessment tools in current use is summarized according to the utility index criteria (see Appendix Table 3, columns 5 to 11). There was a wide spread of utility characteristics among the different tools, and their heterogeneity precludes any formal analysis. The strengths and limitations of the respective tools are presented in Appendix Table 3, columns 3 and 4.
TABLE II.
Utility Criteria111 for Effective Assessment
| Criterion | Description |
| --- | --- |
| Validity | The extent to which the instrument assesses the skills that it claims to assess |
| Content validity | The appropriateness of the variables measured by the assessment instrument122 |
| Construct validity | The effectiveness of the assessment instrument at differentiating between different skill levels122 |
| Concurrent validity | The extent to which the assessment instrument agrees with existing performance measures122 |
| Reliability | The reproducibility of the results |
| Feasibility/acceptability | The extent to which the instrument is usable by the target audience |
| Educational impact | The extent to which the instrument itself influences learning |
| Cost-effectiveness* | The extent to which the assessment instrument delivers value for money |

*Not evaluated in this review.
Discussion
Robust assessment of competency and operative skill in trauma and orthopaedic surgery is a topical issue in training. The primary goals of surgical-competency assessment are to provide a platform for learning through feedback, to make summative judgments about capability and progression through training, to maintain standards within the profession, and ultimately, to protect patients from incompetent surgeons1.
To our knowledge, this review is the first comprehensive analysis of the tools currently available for assessing technical skill and operative competency in trauma and orthopaedic surgical training.
The results show that none of the tools currently used for assessing technical skill in orthopaedic surgical training fulfill the criteria of Norcini et al. for effective assessment112. There is a similar deficiency of utility evidence in technical skills-assessment tools in general surgery113 and vascular surgery1, which face the same challenges as trauma and orthopaedics in moving toward a competency-based approach to training1.
Checklists and global rating scales were commonly used tools for technical skills assessment in the reviewed studies (see Appendix Table 3). Checklists deconstruct a task into discrete steps and may have educational value for teaching procedural sequencing to novice residents. However, they do not capture the quality of performance, and their rigid binary scoring does not allow for deviation when there is >1 acceptable way of undertaking a procedure. Another disadvantage of checklists is an early ceiling effect1. Checklists have the advantage that they can be administered by nonexpert assessors, and judgments on performance can be made either live or from video footage. They also can be used in both the simulated and live theater environments. They show reasonable construct validity68,77,96,98, concurrent validity37,77,96,102,103, and reliability37,88,114. With their limitations in mind, checklists are perhaps most appropriate for novice learners in a formative setting1.
Global rating scales use generic domains with a Likert-type scale and descriptive anchors to capture the quality of performance61,66,93. They are generalizable between procedures and can be used to assess complex procedures for which there is >1 accepted method. They can discriminate between competent and expert performance, and many studies have demonstrated their content17,96 and concurrent validity17,77,85,96,98,103 and their reliability17,37,66,96. However, they require expert surgeon evaluators, are more time-consuming to administer, and may be susceptible to assessor bias, as domains of assessment such as instrument handling and respect for tissue are inherently subjective. The ability of global rating scales to distinguish between all levels of performance, and the absence of a ceiling effect, make them useful for high-stakes, summative assessment1 and for the assessment of advanced residents.
Several novel objective assessment tools that combine task-specific checklists with a global rating scale have been developed. The front-runners among these are the Arthroscopic Surgical Skill Evaluation Tool (ASSET)36,37,77, which pairs a task-specific checklist with an 8-domain global rating scale anchored by end and middle descriptors, and the Objective Structured Assessment of Technical Skills (OSATS) tool23,93 (see Appendix Table 3). While the ASSET is obviously restricted to arthroscopic procedures, both have a growing body of evidence across all domains of the utility index (Table II). The hybrid approach of combining a task-specific checklist and a global rating scale brings the strengths of both together within a single tool, but such instruments can become long and burdensome to complete, which negatively impacts their feasibility and acceptability in a busy workplace in which training assessment competes with service pressures.
The OSATS tool is in current use in training programs in obstetrics/gynecology115 and ophthalmology116 and is popular with residents117. It captures the quality of performance and can distinguish competence from mastery, as well as the stages of progression in between. Several studies in this review demonstrated the validity, reliability, feasibility, and educational value of the OSATS tool in trauma and orthopaedics in the simulated setting (see Appendix Table 3, columns 5 to 11). Further work is required to assess its utility in the live operative theater.
There are a variety of procedure-specific rating scales that have been developed for both open21,32,58,70,99,118 and arthroscopic7,76,81,82,90,92 procedures (see Appendix Table 3). Most are in the early stages of validation and are likely to be most useful for the research setting. They are not practical for the live workplace environment given the variety of procedures that are undertaken within a typical training rotation; a generic tool that may be applied to the assessment of all procedures is more feasible.
Motion analysis (see Appendix Table 3) is also promising for assessing technical skill, particularly in arthroscopy, and several studies in this review demonstrated its utility6,13,31,34,41,50,66,74,75,86. Its use to date has been largely restricted to the research setting, and further work on transfer validity and potential educational impact is required. Some of the obvious barriers, such as sterility concerns, have been mitigated by using elbow-mounted instead of hand-mounted sensors in the live operative theater31. Hand-motion analysis can generate a sophisticated data profile that can detect subtle improvements in surgical performance and may be able to measure the attainment of mastery. Other motion parameters, such as gaze tracking6, triangulation time74, instrument path length12,15,40,48,49,51,55,56,63,110, and collisions38,55, have demonstrated construct validity and feasibility in the simulated environment but are unlikely to be useful in the live operative theater, as most of these measurements are derived from the simulator itself.
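To make these metrics concrete, the following minimal Python sketch (our own illustration, not taken from any of the included studies; it assumes a tracker that reports instrument-tip positions as (x, y, z) coordinates in millimeters at a fixed sampling rate) shows how path length and mean instrument speed are typically derived:

```python
import math

def path_length_mm(positions):
    """Total instrument path length: the sum of Euclidean distances
    between consecutive tracked 3-D tip positions (in mm)."""
    return sum(math.dist(a, b) for a, b in zip(positions, positions[1:]))

def mean_speed_mm_per_s(positions, sample_rate_hz):
    """Mean instrument speed, derived from total path length and the
    tracker's (assumed fixed) sampling rate."""
    duration_s = (len(positions) - 1) / sample_rate_hz
    return path_length_mm(positions) / duration_s

# A shorter, smoother path for the same task is the usual expert signature.
track = [(0.0, 0.0, 0.0), (4.0, 3.0, 0.0), (4.0, 3.0, 12.0)]
print(path_length_mm(track))           # 17.0 mm
print(mean_speed_mm_per_s(track, 20))  # 170.0 mm/s over 0.1 s of tracking
```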
Individual procedural metrics can also be used to assess technical skill (see Appendix Table 3). Final-product analysis provides an objective assessment of final-product quality, from which technical proficiency is inferred. Examples include tip-apex distance in DHS fixation58,62, screw position22,30,59,71,95, and articular congruency73,93. Orthopaedics has the advantage of the routine use of intraoperative and postoperative radiographs, from which relevant, real-life final-product metrics such as implant position can easily be measured. Final-product analysis is objective, efficient, and straightforward to perform, and a nonspecialist assessor who has been appropriately trained can make the measurements. In the simulated setting, invasive final-product measures, such as biomechanical testing of a fracture construct, can be used to assess procedural success. Final-product analysis is appealing because it relates technical performance to real-world, clinically relevant measures of operative success. Conclusions regarding its construct validity are, however, mixed, with almost as many studies refuting construct validity59,65,68,73,84 as demonstrating it22,24,30,58,61,71,72,97, and the studies analyzed did not demonstrate evidence of reliability.
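As an example of how one such radiographic metric is computed, the sketch below implements the classic tip-apex distance (TAD) calculation: the tip-to-apex distance is measured on both the anteroposterior (AP) and lateral radiographs, each measurement is corrected for magnification using the known true diameter of the lag screw, and the 2 corrected values are summed. The function, parameter names, and numeric values are ours, for illustration only; a TAD of >25 mm is the commonly cited threshold for increased risk of lag-screw cut-out.

```python
def tip_apex_distance_mm(ap_tip_to_apex, ap_screw_diam,
                         lat_tip_to_apex, lat_screw_diam,
                         true_screw_diam):
    """Tip-apex distance: AP and lateral tip-to-apex measurements, each
    scaled by (true screw diameter / measured screw diameter) to correct
    for radiographic magnification, then summed (all values in mm)."""
    ap_corrected = ap_tip_to_apex * (true_screw_diam / ap_screw_diam)
    lat_corrected = lat_tip_to_apex * (true_screw_diam / lat_screw_diam)
    return ap_corrected + lat_corrected

# Illustrative values: 12 mm and 11 mm measured on films in which a
# screw shaft of true diameter 8 mm appears as 9.2 mm (~15% magnification).
tad = tip_apex_distance_mm(12.0, 9.2, 11.0, 9.2, true_screw_diam=8.0)
print(round(tad, 1))  # 20.0 -> within the commonly cited 25 mm threshold
```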
Procedure time was extensively used as a procedural metric to assess technical skill in the included studies. It is easy to measure in both the simulated and in vivo settings, and it relies on the intuitive assumption that speed equates to proficiency. This assumption is potentially problematic: extrinsic patient and staff factors beyond the surgeon's immediate control can influence procedure time, and time gives no indication of the quality of performance; a procedure may be completed quickly because the surgeon is a masterfully efficient operator, or because the procedure was rushed and performed carelessly. The evidence for the construct and concurrent validity of procedure time is mixed, with many studies showing that it can discriminate between experience levels6,11,12,30,31,33,34,40,43-45,47,48,50,51,53-56,61,63,64,67,78,86,110 and performs well against other types of assessment6,18,47,86, and others showing that it cannot18,20,23,57,60,62,72,73,99,102. Both final-product analysis and procedure time are therefore unlikely to be useful in isolation, but they could serve as adjunctive measures of technical proficiency.
Limitations
This review is limited to the assessment of technical skills in trauma and orthopaedic surgery; the assessment of nontechnical skills for surgeons was not considered in our analysis. Nontechnical skills are undoubtedly an essential dimension of surgical competence and are rightly beginning to receive attention in the surgical education literature119. The perfect technical skills-assessment tool is therefore never going to be usable in isolation to comprehensively assess competence, but rather should form a key part of a battery of evidence-based assessment tools.
Implications and Recommendations
There is growing dissatisfaction with the current technical skills-assessment tools within the surgical education community105,120, and an increasingly urgent need to develop an evidence-based assessment tool that is generalizable to the broad range of technical and nontechnical skills in trauma and orthopaedic surgery, and that satisfies the utility criteria.
The Procedure Based Assessment, currently the main tool used for high-stakes assessment in the U.K. training system, is lengthy to complete, comprising 40 to 50 tick boxes and 12 free-text spaces105. It was initially implemented prior to any formal validation beyond an initial consensus-setting (Delphi) process to define the domains105,121. Several years after its introduction, a large, pan-surgical-specialty validation study was undertaken109, with a particular focus on demonstrating the reliability of the rating scales105. Within this study, orthopaedics appears underrepresented: the totality of the Procedure Based Assessment validity evidence relates to 2 orthopaedic procedures involving 7 residents. Subsequent validation work, using more traditional frameworks in general and vascular surgery, has demonstrated that the Procedure Based Assessment is a valid and reliable measure of performance105 and responsive to change105, but there remains a deficiency of evidence for its utility in orthopaedics, which is surprising given that it is the current gold-standard assessment in the U.K. training system (see Appendix Table 3).

Adding to the problem, engagement with the Procedure Based Assessment has been poor105, and it remains unpopular120. A national survey of trauma and orthopaedic resident attitudes toward procedure-based assessments (PBAs) in the U.K. found that more than half of respondents agreed or strongly agreed with the statement "completing PBAs is nothing but a form-filling exercise," 60% agreed or strongly agreed that there are "barriers to the successful use of PBAs by residents"120, and only one-third believed that PBAs should be used for high-stakes assessment in training, such as the Annual Review of Competence Progression120. Further work has identified reasons for this poor engagement: the Procedure Based Assessment is burdensome to complete; its coarse rating scale of blunt, binary descriptors cannot distinguish mastery or higher-order skills; and it contributes to general assessment fatigue105.
The Procedure Based Assessment was among the earliest formal tools for technical skills assessment in orthopaedic surgical training, and its creators deserve recognition for beginning the process of objectively assessing the technical skills of surgeons-in-training. We propose, however, that the Procedure Based Assessment is no longer appropriate for use in summative assessment in a modern competency-based training environment. The OSATS tool and the ASSET show promise as replacements for the Procedure Based Assessment, and validation work on these tools, with a particular focus on their use in the live operative theater, should continue.
Conclusions
The evidence for the utility of the technical skills-assessment tools currently used in trauma and orthopaedic surgical training is inadequate to support their use in summative high-stakes assessment of competency. An assessment tool that is generalizable to the broad range of technical and nontechnical skills relevant to trauma and orthopaedics, that satisfies the utility criteria, and that is cost-effective and feasible requires development.
Appendix
Supporting material provided by the authors is posted with the online version of this article as a data supplement at jbjs.org (http://links.lww.com/JBJSREV/A611).
Footnotes
Investigation performed at Clinical Trials Unit, Warwick Medical School, Coventry, United Kingdom
Disclosure: H.K.J. holds a Versus Arthritis Educational Research Fellowship (grant number 20485). On the Disclosure of Potential Conflicts of Interest forms, which are provided with the online version of the article, one or more of the authors checked “yes” to indicate that the author had a relevant financial relationship in the biomedical arena outside the submitted work and “yes” to indicate that the author had other relationships or activities that could be perceived to influence, or have the potential to influence, what was written in this work (http://links.lww.com/JBJSREV/A610).
References
- 1. Mitchell EL, Arora S, Moneta GL, Kret MR, Dargon PT, Landry GJ, Eidt JF, Sevdalis N. A systematic review of assessment of skill acquisition and operative competency in vascular surgical training. J Vasc Surg. 2014 May;59(5):1440-55. Epub 2014 Mar 19.
- 2. Okoro T, Sirianni C, Brigden D. The concept of surgical assessment: part 1 – introduction. Bulletin of the Royal College of Surgeons of England. 2010 October;92(9):322-3.
- 3. Okoro T, Sirianni C, Brigden D. The concept of surgical assessment: part 2 – available tools. Bulletin of the Royal College of Surgeons of England. 2010 October;92(9):324-6.
- 4. Shamseer L, Moher D, Clarke M, Ghersi D, Liberati A, Petticrew M, Shekelle P, Stewart LA; PRISMA-P Group. Preferred reporting items for systematic review and meta-analysis protocols (PRISMA-P) 2015: elaboration and explanation. BMJ. 2015 January 2;350:g7647. Erratum in: BMJ. 2016 Jul 21;354:i4086.
- 5. National Institute for Health Research. PROSPERO international prospective register of systematic reviews. Accessed 15 April 2020. https://www.crd.york.ac.uk/prospero/
- 6. An VVG, Mirza Y, Mazomenos E, Vasconcelos F, Stoyanov D, Oussedik S. Arthroscopic simulation using a knee model can be used to train speed and gaze strategies in knee arthroscopy. Knee. 2018 December;25(6):1214-21. Epub 2018 Jun 20.
- 7. Angelo RL, Ryu RKN, Pedowitz RA, Beach W, Burns J, Dodds J, Field L, Getelman M, Hobgood R, McIntyre L, Gallagher AG. A proficiency-based progression training curriculum coupled with a model simulator results in the acquisition of a superior arthroscopic Bankart skill set. Arthroscopy. 2015 October;31(10):1854-71. Epub 2015 Sep 2.
- 8. Bhattacharyya R, Davidson DJ, Sugand K, Bartlett MJ, Bhattacharya R, Gupte CM. Knee arthroscopy simulation: a randomized controlled trial evaluating the effectiveness of the Imperial Knee Arthroscopy Cognitive Task Analysis (IKACTA) tool. J Bone Joint Surg Am. 2017 October 4;99(19):e103.
- 9. Camp CL, Krych AJ, Stuart MJ, Regnier TD, Mills KM, Turner NS. Improving resident performance in knee arthroscopy: a prospective value assessment of simulators and cadaveric skills laboratories. J Bone Joint Surg Am. 2016 February 3;98(3):220-5.
- 10. Çetinkaya E, Çift H, Aybar A, Erçin E, Güler GB, Poyanlı O. The timing and importance of motor skills course in knee arthroscopy training. Acta Orthop Traumatol Turc. 2017 July;51(4):273-7. Epub 2017 Jul 8.
- 11. Chong ACM, Pate RC, Prohaska DJ, Bron TR, Wooley PH. Validation of improvement of basic competency in arthroscopic knot tying using a bench top simulator in orthopaedic residency education. Arthroscopy. 2016 July;32(7):1389-99. Epub 2016 Apr 23.
- 12. Gomoll AH, Pappas G, Forsythe B, Warner JJP. Individual skill progression on a virtual reality simulator for shoulder arthroscopy: a 3-year follow-up study. Am J Sports Med. 2008 June;36(6):1139-42. Epub 2008 Mar 6.
- 13. Jackson WFM, Khan T, Alvand A, Al-Ali S, Gill HS, Price AJ, Rees JL. Learning and retaining simulated arthroscopic meniscal repair skills. J Bone Joint Surg Am. 2012 September 5;94(17):e132.
- 14. Martin KD, Patterson D, Phisitkul P, Cameron KL, Femino J, Amendola A. Ankle arthroscopy simulation improves basic skills, anatomic recognition, and proficiency during diagnostic examination of residents in training. Foot Ankle Int. 2015 July;36(7):827-35. Epub 2015 Mar 11.
- 15. Rahm S, Wieser K, Bauer DE, Waibel FW, Meyer DC, Gerber C, Fucentese SF. Efficacy of standardized training on a virtual reality simulator to advance knee and shoulder arthroscopic motor skills. BMC Musculoskelet Disord. 2018 May 16;19(1):150.
- 16. Rebolledo BJ, Hammann-Scala J, Leali A, Ranawat AS. Arthroscopy skills development with a surgical simulator: a comparative study in orthopaedic surgery residents. Am J Sports Med. 2015 June;43(6):1526-9. Epub 2015 Mar 13.
- 17. Cannon WD, Garrett WE Jr, Hunter RE, Sweeney HJ, Eckhoff DG, Nicandri GT, Hutchinson MR, Johnson DD, Bisson LJ, Bedi A, Hill JA, Koh JL, Reinig KD. Improving residency training in arthroscopic knee surgery with use of a virtual-reality simulator. A randomized blinded study. J Bone Joint Surg Am. 2014 November 5;96(21):1798-806.
- 18. Dunn JC, Belmont PJ, Lanzi J, Martin K, Bader J, Owens B, Waterman BR. Arthroscopic shoulder surgical simulation training curriculum: transfer reliability and maintenance of skill over time. J Surg Educ. 2015 Nov-Dec;72(6):1118-23. Epub 2015 Aug 19.
- 19. Howells NR, Gill HS, Carr AJ, Price AJ, Rees JL. Transferring simulated arthroscopic skills to the operating theatre: a randomised blinded study. J Bone Joint Surg Br. 2008 April;90(4):494-9.
- 20. Waterman BR, Martin KD, Cameron KL, Owens BD, Belmont PJ Jr. Simulation training improves surgical proficiency and safety during diagnostic shoulder arthroscopy performed by residents. Orthopedics. 2016 May 1;39(3):e479-85. Epub 2016 May 2.
- 21. Butler BA, Lawton CD, Burgess J, Balderama ES, Barsness KA, Sarwark JF. Simulation-based educational module improves intern and medical student performance of closed reduction and percutaneous pinning of pediatric supracondylar humeral fractures. J Bone Joint Surg Am. 2017 December 6;99(23):e128.
- 22. Gottschalk MB, Yoon ST, Park DK, Rhee JM, Mitchell PM. Surgical training using three-dimensional simulation in placement of cervical lateral mass screws: a blinded randomized control trial. Spine J. 2015 January 1;15(1):168-75. Epub 2014 Sep 4.
- 23. LeBlanc J, Hutchison C, Hu Y, Donnon T. A comparison of orthopaedic resident performance on surgical fixation of an ulnar fracture using virtual reality and synthetic models. J Bone Joint Surg Am. 2013 May 1;95(9):e60-5: S1-5.
- 24. Nousiainen MT, Omoto DM, Zingg PO, Weil YA, Mardam-Bey SW, Eward WC. Training femoral neck screw insertion skills to surgical trainees: computer-assisted surgery versus conventional fluoroscopic technique. J Orthop Trauma. 2013 February;27(2):87-92.
- 25. Ruder JA, Turvey B, Hsu JR, Scannell BP. Effectiveness of a low-cost drilling module in orthopaedic surgical simulation. J Surg Educ. 2017 May-Jun;74(3):471-6. Epub 2016 Nov 7.
- 26. Sonnadara RR, Van Vliet A, Safir O, Alman B, Ferguson P, Kraemer W, Reznick R. Orthopedic boot camp: examining the effectiveness of an intensive surgical skills course. Surgery. 2011 June;149(6):745-9. Epub 2011 Jan 14.
- 27. Sonnadara RR, Garbedian S, Safir O, Nousiainen M, Alman B, Ferguson P, Kraemer W, Reznick R. Orthopaedic boot camp II: examining the retention rates of an intensive surgical skills course. Surgery. 2012 June;151(6):803-7.
- 28. Sonnadara RR, Garbedian S, Safir O, Mui C, Mironova P, Nousiainen M, Ferguson P, Alman B, Kraemer W, Reznick R. Toronto orthopaedic boot camp III: examining the efficacy of student-regulated learning during an intensive, laboratory-based surgical skills course. Surgery. 2013 July;154(1):29-33.
- 29. Tonetti J, Vadcard L, Girard P, Dubois M, Merloz P, Troccaz J. Assessment of a percutaneous iliosacral screw insertion simulator. Orthop Traumatol Surg Res. 2009 November;95(7):471-7. Epub 2009 Oct 3.
- 30. Xiang L, Zhou Y, Wang H, Zhang H, Song G, Zhao Y, Han J, Liu J. Significance of preoperative planning simulator for junior surgeons' training of pedicle screw insertion. J Spinal Disord Tech. 2015 February;28(1):E25-9.
- 31. Garfjeld Roberts P, Alvand A, Gallieri M, Hargrove C, Rees J. Objectively assessing intraoperative arthroscopic skills performance and the transfer of simulation training in knee arthroscopy: a randomized controlled trial. Arthroscopy. 2019 April;35(4):1197-1209.e1. Epub 2019 Mar 14.
- 32. Brusalis CM, Lawrence JTR, Ranade SC, Kerr JC, Pulos N, Wells L, Ganley TJ. Can a novel, low-cost simulation model be used to teach anterior cruciate ligament graft preparation? J Pediatr Orthop. 2017 June;37(4):e277-81.
- 33. Cannon WD, Nicandri GT, Reinig K, Mevis H, Wittstein J. Evaluation of skill level between trainees and community orthopaedic surgeons using a virtual reality arthroscopic knee simulator. J Bone Joint Surg Am. 2014 April 2;96(7):e57.
- 34. Colaco HB, Hughes K, Pearse E, Arnander M, Tennent D. Construct validity, assessment of the learning curve, and experience of using a low-cost arthroscopic surgical simulator. J Surg Educ. 2017 Jan-Feb;74(1):47-54. Epub 2016 Oct 5.
- 35. Coughlin RP, Pauyo T, Sutton JC 3rd, Coughlin LP, Bergeron SG. A validated orthopaedic surgical simulation model for training and evaluation of basic arthroscopic skills. J Bone Joint Surg Am. 2015 September 2;97(17):1465-71.
- 36. Dwyer T, Slade Shantz J, Chahal J, Wasserstein D, Schachar R, Kulasegaram KM, Theodoropoulos J, Greben R, Ogilvie-Harris D. Simulation of anterior cruciate ligament reconstruction in a dry model. Am J Sports Med. 2015 December;43(12):2997-3004. Epub 2015 Oct 12.
- 37. Dwyer T, Schachar R, Leroux T, Petrera M, Cheung J, Greben R, Henry P, Ogilvie-Harris D, Theodoropoulos J, Chahal J. Performance assessment of arthroscopic rotator cuff repair and labral repair in a dry shoulder simulator. Arthroscopy. 2017 July;33(7):1310-8. Epub 2017 Mar 25.
- 38. Escoto A, Trejos AL, Naish MD, Patel RV, Lebel ME. Force sensing-based simulator for arthroscopic skills assessment in orthopaedic knee surgery. Stud Health Technol Inform. 2012;173:129-35.
- 39. Fucentese SF, Rahm S, Wieser K, Spillmann J, Harders M, Koch PP. Evaluation of a virtual-reality-based simulator using passive haptic feedback for knee arthroscopy. Knee Surg Sports Traumatol Arthrosc. 2015 April;23(4):1077-85. Epub 2014 Feb 12.
- 40. Gomoll AH, O'Toole RV, Czarnecki J, Warner JJP. Surgical experience correlates with performance on a virtual reality simulator for shoulder arthroscopy. Am J Sports Med. 2007 June;35(6):883-8. Epub 2007 Jan 29.
- 41. Howells NR, Brinsden MD, Gill RS, Carr AJ, Rees JL. Motion analysis: a validated method for showing skill levels in arthroscopy. Arthroscopy. 2008 March;24(3):335-42.
- 42. Insel A, Carofino B, Leger R, Arciero R, Mazzocca AD. The development of an objective model to assess arthroscopic performance. J Bone Joint Surg Am. 2009 September;91(9):2287-95.
- 43. Khanduja V, Lawrence JE, Audenaert E. Testing the construct validity of a virtual reality hip arthroscopy simulator. Arthroscopy. 2017 March;33(3):566-71. Epub 2016 Dec 16.
- 44. Martin KD, Belmont PJ, Schoenfeld AJ, Todd M, Cameron KL, Owens BD. Arthroscopic basic task performance in shoulder simulator model correlates with similar task performance in cadavers. J Bone Joint Surg Am. 2011 November 2;93(21):e1271-5.
- 45. Martin KD, Cameron K, Belmont PJ, Schoenfeld A, Owens BD. Shoulder arthroscopy simulator performance correlates with resident and shoulder arthroscopy experience. J Bone Joint Surg Am. 2012 November 7;94(21):e160.
- 46. Martin KD, Akoh CC, Amendola A, Phisitkul P. Comparison of three virtual reality arthroscopic simulators as part of an orthopedic residency educational curriculum. Iowa Orthop J. 2016;36:20-5.
- 47. McCarthy A, Harley P, Smallwood R. Virtual arthroscopy training: do the "virtual skills" developed match the real skills required? Stud Health Technol Inform. 1999;62:221-7.
- 48. McCarthy AD, Moody L, Waterworth AR, Bickerstaff DR. Passive haptics in a knee arthroscopy simulator: is it valid for core skills training? Clin Orthop Relat Res. 2006 January;442:13-20.
- 49. Pedowitz RA, Esch J, Snyder S. Evaluation of a virtual reality simulator for arthroscopy skills development. Arthroscopy. 2002 Jul-Aug;18(6):E29.
- 50. Pollard TCB, Khan T, Price AJ, Gill HS, Glyn-Jones S, Rees JL. Simulated hip arthroscopy skills: learning curves with the lateral and supine patient positions: a randomized trial. J Bone Joint Surg Am. 2012 May 16;94(10):e68.
- 51. Rahm S, Germann M, Hingsammer A, Wieser K, Gerber C. Validation of a virtual reality-based simulator for shoulder arthroscopy. Knee Surg Sports Traumatol Arthrosc. 2016 May;24(5):1730-7. Epub 2016 Feb 9.
- 52. Rose K, Pedowitz R. Fundamental arthroscopic skill differentiation with virtual reality simulation. Arthroscopy. 2015 February;31(2):299-305. Epub 2014 Oct 11.
- 53. Srivastava S, Youngblood PL, Rawn C, Hariri S, Heinrichs WL, Ladd AL. Initial evaluation of a shoulder arthroscopy simulator: establishing construct validity. J Shoulder Elbow Surg. 2004 Mar-Apr;13(2):196-205.
- 54. Tuijthof GJM, van Sterkenburg MN, Sierevelt IN, van Oldenrijk J, Van Dijk CN, Kerkhoffs GMMJ. First validation of the PASSPORT training environment for arthroscopic skills. Knee Surg Sports Traumatol Arthrosc. 2010 February;18(2):218-24. Epub 2009 Jul 24.
- 55. Tashiro Y, Miura H, Nakanishi Y, Okazaki K, Iwamoto Y. Evaluation of skills in arthroscopic training based on trajectory and force data. Clin Orthop Relat Res. 2009 February;467(2):546-52. Epub 2008 Sep 13.
- 56. Tofte JN, Westerlind BO, Martin KD, Guetschow BL, Uribe-Echevarria B, Rungprai C, Phisitkul P. Knee, shoulder, and fundamentals of arthroscopic surgery training: validation of a virtual arthroscopy simulator. Arthroscopy. 2017 March;33(3):641-646.e3. Epub 2016 Dec 16.
- 57. Wong IH, Denkers M, Urquhart N, Farrokhyar F. Construct validity testing of the Arthroscopic Knot Trainer (ArK). Knee Surg Sports Traumatol Arthrosc. 2015 March;23(3):906-11. Epub 2013 May 18.
- 58. Akhtar K, Sugand K, Sperrin M, Cobb J, Standfield N, Gupte C. Training safer orthopedic surgeons. Construct validation of a virtual-reality simulator for hip fracture surgery. Acta Orthop. 2015;86(5):616-21.
- 59. Aoude A, Alhamzah H, Fortin M, Jarzem P, Ouellet J, Weber MH. The use of computer-assisted surgery as an educational tool for the training of orthopedic surgery residents in pedicle screw placement: a pilot study and survey among orthopedic residents. Can J Surg. 2016 December;59(6):391-8.
- 60. Blyth P, Stott NS, Anderson IA. Virtual reality assessment of technical skill using the Bonedoc DHS simulator. Injury. 2008 October;39(10):1127-33. Epub 2008 Jun 13.
- 61. Christian MW, Griffith C, Schoonover C, Zerhusen T Jr, Coale M, O'Hara N, Henn RF 3rd, O'Toole RV, Sciadini M. Construct validation of a novel hip fracture fixation surgical simulator. J Am Acad Orthop Surg. 2018 October 1;26(19):689-97.
- 62. Froelich JM, Milbrandt JC, Novicoff WM, Saleh KJ, Allan DG. Surgical simulators and hip fractures: a role in residency training? J Surg Educ. 2011 Jul-Aug;68(4):298-302. Epub 2011 Apr 16.
- 63. Garfjeld Roberts P, Guyver P, Baldwin M, Akhtar K, Alvand A, Price AJ, Rees JL. Validation of the updated ArthroS simulator: face and construct validity of a passive haptic virtual reality simulator with novel performance metrics. Knee Surg Sports Traumatol Arthrosc. 2017 February;25(2):616-25. Epub 2016 Apr 16.
- 64. Giurin I, Bréaud J, Rampal V, Rosello O, Solla F. A simulation model of nail bed suture and nail fixation: description and preliminary evaluation. J Surg Res. 2018 August;228:142-6. Epub 2018 Apr 11.
- 65. Hohn EA, Brooks AG, Leasure J, Camisa W, van Warmerdam J, Kondrashov D, Montgomery W, McGann W. Development of a surgical skills curriculum for the training and assessment of manual skills in orthopedic surgical residents. J Surg Educ. 2015 Jan-Feb;72(1):47-52. Epub 2014 Aug 6.
- 66. Leong JJH, Leff DR, Das A, Aggarwal R, Reilly P, Atkinson HDE, Emery RJ, Darzi AW. Validation of orthopaedic bench models for trauma surgery. J Bone Joint Surg Br. 2008 July;90(7):958-65.
- 67. Lopez G, Wright R, Martin D, Jung J, Bracey D, Gupta R. A cost-effective junior resident training and assessment simulator for orthopaedic surgical skills via fundamentals of orthopaedic surgery: AAOS exhibit selection. J Bone Joint Surg Am. 2015 April 15;97(8):659-66.
- 68. Mayne IP, Brydges R, Moktar J, Murnaghan ML. Development and assessment of a distal radial fracture model as a clinical teaching tool. J Bone Joint Surg Am. 2016 March 2;98(5):410-6.
- 69. Qassemyar Q, Boulart L. A 4-task skills examination for residents for the assessment of technical ability in hand trauma surgery. J Surg Educ. 2015 Mar-Apr;72(2):179-83. Epub 2014 Dec 10.
- 70. Rambani R, Ward J, Viant W. Desktop-based computer-assisted orthopedic training system for spinal surgery. J Surg Educ. 2014 Nov-Dec;71(6):805-9. Epub 2014 Jun 23.
- 71. Shi J, Hou Y, Lin Y, Chen H, Yuan W. Role of visuohaptic surgical training simulator in resident education of orthopedic surgery. World Neurosurg. 2018 March;111:e98-104. Epub 2017 Dec 15.
- 72. Sugand K, Wescott RA, Carrington R, Hart A, Van Duren BH. Teaching basic trauma: validating FluoroSim, a digital fluoroscopic simulator for guide-wire insertion in hip surgery. Acta Orthop. 2018 August;89(4):380-5. Epub 2018 May 10.
- 73. Yehyawi TM, Thomas TP, Ohrt GT, Marsh JL, Karam MD, Brown TD, Anderson DD. A simulation trainer for complex articular fracture surgery. J Bone Joint Surg Am. 2013 July 3;95(13):e92.
- 74. Alvand A, Khan T, Al-Ali S, Jackson WF, Price AJ, Rees JL. Simple visual parameters for objective assessment of arthroscopic skill. J Bone Joint Surg Am. 2012 July 3;94(13):e97.
- 75. Alvand A, Logishetty K, Middleton R, Khan T, Jackson WFM, Price AJ, Rees JL. Validating a global rating scale to monitor individual resident learning curves during arthroscopic knee meniscal repair. Arthroscopy. 2013 May;29(5):906-12.
- 76. Bayona S, Akhtar K, Gupte C, Emery RJH, Dodds AL, Bello F. Assessing performance in shoulder arthroscopy: the Imperial Global Arthroscopy Rating Scale (IGARS). J Bone Joint Surg Am. 2014 July 2;96(13):e112. Epub 2014 Jul 2.
- 77. Dwyer T, Slade Shantz J, Kulasegaram KM, Chahal J, Wasserstein D, Schachar R, Devitt B, Theodoropoulos J, Hodges B, Ogilvie-Harris D. Use of an objective structured assessment of technical skill after a sports medicine rotation. Arthroscopy. 2016 December;32(12):2572-2581.e3. Epub 2016 Jul 27.
- 78. Elliott MJ, Caprise PA, Henning AE, Kurtz CA, Sekiya JK. Diagnostic knee arthroscopy: a pilot study to evaluate surgical skills. Arthroscopy. 2012 February;28(2):218-24. Epub 2011 Oct 28.
- 79. Koehler RJ, Nicandri GT. Using the arthroscopic surgery skill evaluation tool as a pass-fail examination. J Bone Joint Surg Am. 2013 December 4;95(23):e1871-6.
- 80. Koehler RJ, Amsdell S, Arendt EA, Bisson LJ, Braman JP, Butler A, Cosgarea AJ, Harner CD, Garrett WE, Olson T, Warme WJ, Nicandri GT. The Arthroscopic Surgical Skill Evaluation Tool (ASSET). Am J Sports Med. 2013 June;41(6):1229-37. Epub 2013 Apr 2.
- 81. Middleton RM, Baldwin MJ, Akhtar K, Alvand A, Rees JL. Which global rating scale? A comparison of the ASSET, BAKSSS, and IGARS for the assessment of simulated arthroscopic skills. J Bone Joint Surg Am. 2016 January 6;98(1):75-81.
- 82. Nwachukwu B, Gaudiani M, Hammann-Scala J, Ranawat A. A checklist intervention to assess resident diagnostic knee and shoulder arthroscopic efficiency. J Surg Educ. 2017 Jan-Feb;74(1):9-15. Epub 2016 Aug 23.
- 83. Olson T, Koehler R, Butler A, Amsdell S, Nicandri G. Is there a valid and reliable assessment of diagnostic knee arthroscopy skill? Clin Orthop Relat Res. 2013 May;471(5):1670-6. Epub 2012 Dec 20.
- 84. Pedowitz RA, Nicandri GT, Angelo RL, Ryu RKN, Gallagher AG. Objective assessment of knot-tying proficiency with the Fundamentals of Arthroscopic Surgery Training Program Workstation and Knot Tester. Arthroscopy. 2015 October;31(10):1872-9. Epub 2015 Aug 19.
- 85. Phillips L, Cheung JJH, Whelan DB, Murnaghan ML, Chahal J, Theodoropoulos J, Ogilvie-Harris D, Macniven I, Dwyer T. Validation of a dry model for assessing the performance of arthroscopic hip labral repair. Am J Sports Med. 2017 July;45(9):2125-30. Epub 2017 Mar 29.
- 86. Price AJ, Erturan G, Akhtar K, Judge A, Alvand A, Rees JL. Evidence-based surgical training in orthopaedics: how many arthroscopies of the knee are needed to achieve consultant level performance? Bone Joint J. 2015 October;97-B(10):1309-15.
- 87. Slade Shantz JA, Leiter JR, Collins JB, MacDonald PB. Validation of a global assessment of arthroscopic skills in a cadaveric knee model. Arthroscopy. 2013 January;29(1):106-12. Epub 2012 Nov 20.
- 88. Gallagher AG, Ryu RKN, Pedowitz RA, Henn P, Angelo RL. Inter-rater reliability for metrics scored in a binary fashion-performance assessment for an arthroscopic Bankart repair. Arthroscopy. 2018 July;34(7):2191-8. Epub 2018 May 2.
- 89. Hodgins JL, Veillette C, Biau D, Sonnadara R. The knee arthroscopy learning curve: quantitative assessment of surgical skills. Arthroscopy. 2014 May;30(5):613-21.
- 90. Hoyle AC, Whelton C, Umaar R, Funk L. Validation of a global rating scale for shoulder arthroscopy: a pilot study. Shoulder Elbow. 2012 January;4(1):16-21.
- 91. Koehler RJ, Goldblatt JP, Maloney MD, Voloshin I, Nicandri GT. Assessing diagnostic arthroscopy performance in the operating room using the Arthroscopic Surgery Skill Evaluation Tool (ASSET). Arthroscopy. 2015 December;31(12):2314-9.e2. Epub 2015 Aug 28.
- 92. Talbot CL, Holt EM, Gooding BWT, Tennent TD, Foden P. The Shoulder Objective Practical Assessment Tool: evaluation of a new tool assessing residents learning in diagnostic shoulder arthroscopy. Arthroscopy. 2015 August;31(8):1441-9. Epub 2015 Apr 22.
- 93. Anderson DD, Long S, Thomas GW, Putnam MD, Bechtold JE, Karam MD. Objective Structured Assessments of Technical Skills (OSATS) does not assess the quality of the surgical result effectively. Clin Orthop Relat Res. 2016 April;474(4):874-81.
- 94. Backstein D, Agnidis Z, Regehr G, Reznick R. The effectiveness of video feedback in the acquisition of orthopedic technical skills. Am J Surg. 2004 March;187(3):427-32.
- 95. Bergeson RK, Schwend RM, DeLucia T, Silva SR, Smith JE, Avilucea FR. How accurately do novice surgeons place thoracic pedicle screws with the free hand technique? Spine (Phila Pa 1976). 2008 July 1;33(15):E501-7.
- 96. Bernard JA, Dattilo JR, Srikumaran U, Zikria BA, Jain A, LaPorte DM. Reliability and validity of 3 methods of assessing orthopedic resident skill in shoulder surgery. J Surg Educ. 2016 Nov-Dec;73(6):1020-5. Epub 2016 Jun 3.
- 97. Burns GT, King BW, Holmes JR, Irwin TA. Evaluating internal fixation skills using surgical simulation. J Bone Joint Surg Am. 2017 March 1;99(5):e21.
- 98. MacEwan MJ, Dudek NL, Wood TJ, Gofton WT. Continued validation of the O-SCORE (Ottawa Surgical Competency Operating Room Evaluation): use in the simulated environment. Teach Learn Med. 2016;28(1):72-9.
- 99. Pedersen P, Palm H, Ringsted C, Konge L. Virtual-reality simulation to assess performance in hip fracture surgery. Acta Orthop. 2014 August;85(4):403-7. Epub 2014 Apr 30.
- 100. Putnam MD, Kinnucan E, Adams JE, Van Heest AE, Nuckley DJ, Shanedling J. On orthopedic surgical skill prediction—the limited value of traditional testing. J Surg Educ. 2015 May-Jun;72(3):458-70. Epub 2014 Dec 24.
- 101. Williams JF, Watson SL, Baker DK, Ponce BA, McGwin G, Gilbert SR, Khoury JG. Psychomotor testing for orthopedic residency applicants: a pilot study. J Surg Educ. 2017 Sep-Oct;74(5):820-7. Epub 2017 Mar 7.
- 102. Van Heest A, Putnam M, Agel J, Shanedling J, McPherson S, Schmitz C. Assessment of technical skills of orthopaedic surgery residents performing open carpal tunnel release surgery. J Bone Joint Surg Am. 2009 December;91(12):2811-7.
- 103. VanHeest A, Kuzel B, Agel J, Putnam M, Kalliainen L, Fletcher J. Objective structured assessment of technical skill in upper extremity surgery. J Hand Surg Am. 2012 February;37(2):332-7: 337.e1-4.
- 104. Beard JD, Marriott J, Purdie H, Crossley J. Assessing the surgical skills of trainees in the operating theatre: a prospective observational study of the methodology. Health Technol Assess. 2011 January;15(1):i-xxi, 1-162.
- 105. Davies RM, Hadfield-Law L, Turner PG. Development and evaluation of a new formative assessment of surgical performance. J Surg Educ. 2018 Sep-Oct;75(5):1309-16. Epub 2018 Mar 24.
- 106. Gofton WT, Dudek NL, Wood TJ, Balaa F, Hamstra SJ. The Ottawa Surgical Competency Operating Room Evaluation (O-SCORE): a tool to assess surgical competence. Acad Med. 2012 October;87(10):1401-7.
- 107. Hawkes DH, Harrison WJ. Critiquing operative fracture fixation: the development of an assessment tool. Eur J Orthop Surg Traumatol. 2017 December;27(8):1083-8. Epub 2017 Mar 23.
- 108. Hoffer MM, Hsu SC. Hand function in selection of orthopedics residents. Acad Med. 1990 October;65(10):661.
- 109. Marriott J, Purdie H, Crossley J, Beard JD. Evaluation of procedure-based assessment for assessing trainees' skills in the operating theatre. Br J Surg. 2011 March;98(3):450-7. Epub 2010 Nov 24.
- 110. Martin KD, Patterson DP, Cameron KL. Arthroscopic training courses improve trainee arthroscopy skills: a simulation-based prospective trial. Arthroscopy. 2016 November;32(11):2228-32. Epub 2016 May 25.
- 111. Van Der Vleuten CPM. The assessment of professional competence: developments, research and practical implications. Adv Health Sci Educ Theory Pract. 1996 January;1(1):41-67.
- 112. Norcini J, Anderson B, Bollela V, Burch V, Costa MJ, Duvivier R, Galbraith R, Hays R, Kent A, Perrott V, Roberts T. Criteria for good assessment: consensus statement and recommendations from the Ottawa 2010 Conference. Med Teach. 2011;33(3):206-14.
- 113. van Hove PD, Tuijthof GJM, Verdaasdonk EGG, Stassen LPS, Dankelman J. Objective assessment of technical surgical skills. Br J Surg. 2010 July;97(7):972-87.
- 114. Angelo RL, Ryu RKN, Pedowitz RA, Gallagher AG. The Bankart performance metrics combined with a cadaveric shoulder create a precise and accurate assessment tool for measuring surgeon skill. Arthroscopy. 2015 September;31(9):1655-70. Epub 2015 Jul 31.
- 115. The Royal College of Obstetricians and Gynaecologists. OSATS. Accessed 15 April 2020. https://www.rcog.org.uk/en/careers-training/about-specialty-training-in-og/assessment-and-progression-through-training/workplace-based-assessments/osats/
- 116. The Royal College of Ophthalmologists. Objective assessment of surgical and technical skills (OSATS). Accessed June 2019. https://www.rcophth.ac.uk/curriculum/ost/assessments/workplace-based-assessments/objective-assessment-of-surgical-and-technical-skills-osats/
- 117. Tsagkataki M, Choudhary A. Mersey Deanery ophthalmology trainees' views of the objective assessment of surgical and technical skills (OSATS) workplace-based assessment tool. Perspect Med Educ. 2013 February;2(1):21-7.
- 118. Rambani R, Viant W, Ward J, Mohsen A. Computer-assisted orthopedic training system for fracture fixation. J Surg Educ. 2013 May-Jun;70(3):304-8. Epub 2013 Feb 22.
- 119. Agha RA, Fowler AJ, Sevdalis N. The role of non-technical skills in surgery. Ann Med Surg (Lond). 2015 October 9;4(4):422-7.
- 120. Hunter AR, Baird EJ, Reed MR. Procedure-based assessments in trauma and orthopaedic training—the trainees' perspective. Med Teach. 2015 May;37(5):444-9. Epub 2014 Sep 4.
- 121. Pitts D, Rowley DI, Sher JL. Assessment of performance in orthopaedic training. J Bone Joint Surg Br. 2005 September;87(9):1187-91.
- 122. Bartlett JD, Lawrence JE, Stewart ME, Nakano N, Khanduja V. Does virtual reality simulation have a role in training trauma and orthopaedic surgeons? Bone Joint J. 2018 May 1;100-B(5):559-65.

