Clinical Orthopaedics and Related Research
2023 Jan 31;481(3):564–579. doi: 10.1097/CORR.0000000000002506

CORR Synthesis: How Have Film Review and Motion Analysis Been Used to Enhance Orthopaedic Surgical Performance?

Jack C Casey 1, Alan H Daniels 1
PMCID: PMC9928675  PMID: 36719752

In the Beginning…

The use of film review, defined as studying footage of oneself or others to improve skills, is common in many professions; it is also frequently used to study athletic performance. Video analysis has been used to determine the match-play demands of different positions in professional rugby [48], and it is frequently used by swimmers to evaluate their performances after competitions [79]. Technological advances have enhanced performance review capabilities. In tennis, the Hawk-Eye player tracking system records player coordinates every 40 ms, and the data collected allow for calculations of speed, acceleration, distance, and direction changes, which provide insight into a player’s style [31]. Motion analysis, defined as the study of motion parameters to improve performance, is also becoming common practice. Small devices can be affixed to athletes to track speed, position, and acceleration data [53, 85]. These data can then be analyzed to gain a more complete understanding of an athlete’s physical capabilities, and this has been used in rugby [27, 29], field hockey [28], Australian football [45, 88], soccer [74], swimming [35], and cross-country skiing [54].
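Metrics such as speed and acceleration can be derived from position samples by finite differences. The sketch below is a minimal illustration using the 40-ms sampling interval mentioned above; the coordinates are invented, and this is not Hawk-Eye's actual algorithm:

```python
# Sketch: deriving speed and acceleration from tracked position samples,
# as a player-tracking system might. The coordinates are invented; the
# 40-ms interval matches the sampling rate described in the text.
import math

DT = 0.040  # sampling interval in seconds (40 ms)

def speeds(positions):
    """Instantaneous speed (m/s) between consecutive (x, y) samples."""
    return [math.dist(p1, p0) / DT for p0, p1 in zip(positions, positions[1:])]

def accelerations(speed_series):
    """Rate of change of speed (m/s^2) between consecutive speed samples."""
    return [(s1 - s0) / DT for s0, s1 in zip(speed_series, speed_series[1:])]

# A player moving 0.2 m per 40-ms sample along one axis (about 5 m/s):
track = [(0.0, 0.0), (0.2, 0.0), (0.4, 0.0)]
v = speeds(track)
a = accelerations(v)
```

Distance traveled and direction changes can be accumulated from the same samples, which is how a per-sample trace yields the summary statistics described above.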

The best athletes in the world are dedicated to training and review their performance with coaches. Similarly, performance review is common in chess, where even the best players review their own games and study the games of other players. Surgery, similar to all endeavors that require accuracy and precision, involves lifelong dedicated practice and learning. Physicians at all stages of their careers, from chief residents to physicians in practice for more than 30 years, have found a film review session with a retired expert surgeon to be productive [42]. In that study, an expert surgeon reviewed the physicians’ surgical footage with them and adjusted the discussion to the experience level of the physician being coached. General surgery residents must complete the Fundamentals of Laparoscopic Surgery and the Fundamentals of Endoscopic Surgery training programs to obtain general surgery certification according to the American Board of Surgery [6]. The residents complete tasks while being graded on precision, efficiency, speed, and accuracy [26, 86]. Although the exact grading metrics are confidential to protect the integrity of the examination, these programs are examples of how validated assessments of surgical skill that correlate with operative performance [60, 80] have been incorporated into surgical training requirements in a nonorthopaedic specialty. Surgeons of all specialties can benefit from advances in technology that allow for refinement of surgical skill. With the advent of unobtrusive motion sensors that give insight into a surgical trainee’s economy of motion [59], orthopaedic surgery training could benefit from these advanced technologies.

The Argument

Surgical skill historically has been taught using high-volume exposure [34]. Work-hour restrictions have shifted the paradigm, necessitating the creation of more formal and efficient surgical curricula to ensure surgeons finish training with high levels of proficiency despite fewer hours of experience. The American Board of Orthopaedic Surgeons refers to this as a “transition from time-based programs for the acquisition of knowledge and skills to promoting competency-based curricula” [4]. The coronavirus disease 2019 (COVID-19) pandemic has also decreased surgical procedure volume for orthopaedic residents, which further emphasizes the need for better methods of teaching surgical skill [37]. Programs have reported various novel, competency-based teaching methods such as the “surgical games,” in which residents complete simulated tasks, such as a cadaveric carpal tunnel release or a Sawbones model TKA, under direct observation by attending surgeons [16]. Although individual studies such as this have been successful, the cost and time requirements associated with the transition to competency-based curricula seen in countries outside the United States have raised concerns that full-fledged adoption of this educational approach will overburden American facilities [22]. Standardized video review and motion analysis could ease this burden.

Direct observation requires faculty surgeons to be present to grade the trainee, which can be time prohibitive and therefore unrealistic as a longitudinal training method in which skill progression is tracked. Motion analysis would be especially useful in an orthopaedic training setting because data can be processed and presented rapidly without the need for faculty to be present. Film review is also an efficient training tool because hours of footage can be condensed by nonmedical staff so only pertinent parts of the procedure are included for surgeon and trainee review. Furthermore, blinded film review and motion analysis enable an objective, unbiased assessment of trainees. Orthopaedic surgery training is slowly catching up to athletic training in its use of technology to measure performance, but there is still no standardized curriculum that includes video review and motion analysis with set benchmarks for surgical skill acquisition.

Rapid curriculum changes risk being expensive and ineffective [22], and the evidence regarding implementation of film review and motion analysis into orthopaedic training is not all positive. Of the few randomized controlled trials assessing the effect of augmenting surgical learning with film review, most showed no improvements relative to conventional learning methods [8, 9, 21]. In contrast, when film review and motion analysis are used to assess trainees, there are many observational studies that demonstrate the efficacy of these tools in differentiating between experience levels and tracking learning curves, with the caveat that not all motion parameters can represent orthopaedic skill. This review will examine how film review and motion analysis have been used in orthopaedics.

Essential Elements

We performed this review via detailed, standardized searches of PubMed and Embase through May 30, 2022 (Supplemental Table 1; http://links.lww.com/CORR/A997). We searched for original prospective (randomized and nonrandomized) or cross-sectional studies that evaluated or relied on the use of film review or motion analysis in orthopaedic surgical training. Searches were not limited by language. We followed a Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA)-style process to screen the search returns [66]. Screening was performed in EndNote (EndNote 20, Clarivate). We combined the PubMed and Embase search results for motion analysis and film review. After removing duplicates, 973 studies remained. We excluded 781 studies during abstract review because they were not relevant to the topic; excluded topics included, but were not limited to, the use of instructional didactic videos for orthopaedic trainees, videos for patients before or after orthopaedic procedures, and the use of motion technologies to assess postoperative outcomes. We included studies that examined the use of film review or motion analysis to directly enhance or measure surgical performance or procedural learning curves, in which participants were trainees in their first year of orthopaedic residency or later in their orthopaedic careers. We excluded studies in which medical students were the only participants, but we included studies that used medical students as a novice group for comparison with orthopaedic residents. We included high-quality studies using cadavers or simulators. We also included studies examining procedures frequently performed by orthopaedic residents even when orthopaedic residents were not the participants; for instance, we included a study of plastic surgery residents performing carpal tunnel releases in which some of the video reviewers were orthopaedic surgeons.
We excluded case series, case reports, systematic reviews, and expert opinion articles. We excluded studies in which film review or motion analysis was solely used to grade trainees, and in which the efficacy of these grading measures was not being examined.
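The duplicate-removal step described above can be sketched as a small bookkeeping routine. This is a hypothetical illustration with invented records and titles; the actual screening was performed in EndNote as noted above:

```python
# Hypothetical sketch of deduplicating combined database exports before
# abstract screening. Records and titles are invented for illustration.
def deduplicate(records):
    """Keep one record per normalized title."""
    seen, unique = set(), []
    for rec in records:
        key = rec["title"].strip().lower()
        if key not in seen:
            seen.add(key)
            unique.append(rec)
    return unique

pubmed = [{"title": "Motion analysis in arthroscopy"},
          {"title": "Video review of fracture fixation"}]
embase = [{"title": "Motion Analysis in Arthroscopy "},  # duplicate of a PubMed hit
          {"title": "Learning curves in simulated TKA"}]

screened = deduplicate(pubmed + embase)
print(len(screened))  # 3 unique records after removing the cross-database duplicate
```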

After full-text review, we included 27 studies investigating the use of film review in orthopaedics including five randomized controlled trials [8, 9, 21, 36, 47] and 22 observational studies [3, 11, 12, 15, 20, 23, 24, 30, 41, 44, 50-52, 56, 61, 64, 65, 69, 77, 78, 81, 82]. We included 28 studies investigating the use of motion analysis in orthopaedics, including four randomized controlled trials [40, 58, 62, 70] and 24 observational studies [2, 7, 13, 14, 17, 18, 32, 38, 39, 43, 49, 57, 63, 67, 68, 71-73, 75, 76, 83, 84, 89, 90].

We assessed the quality of evidence using the Grading of Recommendations, Assessment, Development, and Evaluations criteria, which assesses the quality of evidence for each outcome [33] (Supplemental Table 2; http://links.lww.com/CORR/A998). Based on this assessment, seven randomized controlled trials [8, 9, 21, 36, 47, 58, 62] contributed moderate-quality evidence while all observational studies and two randomized controlled trials [40, 70] contributed very low–quality evidence. The quality of evidence was downgraded for several reasons, most commonly for small sample size and inconsistency of results between studies.

What We (Think) We Know

Does Film Review Enhance Learning?

Although junior residents often find film review sessions with more experienced surgeons useful [44], learning augmented with film review had no objective effect on resident learning relative to conventional learning methods for performance on a fracture reduction simulation [21], simulated vascular anastomosis [9], or plating and tension band wiring of surgical model fractures [8]. In contrast to these findings, reviewing surgical microscope footage with an expert surgeon enhanced early learning for neurosurgical residents performing lumbar spine procedures in the operating room, although the conventional-learning students eventually reached the same performance level [36]. Evidence more strongly in favor of supplementing conventional learning with film review also exists. Surgeons became more skilled in minimally invasive plate osteosynthesis of distal radius fractures after reviewing video clips of their own and their colleagues’ mistakes [23]. In another study, learning augmented by film review with an expert surgeon improved residents’ performance of reduction and fixation of intraarticular tibial plafond fractures in a model relative to residents who practiced without film review [47]. The evidence for film review as a tool to enhance learning is currently mixed, whereas evidence in favor of film review as a tool to grade trainees is more concrete.

Does Film Review Allow for Grading Trainees or Tracking of Learning Curves?

Procedures can be recorded from head-mounted cameras, glasses, arthroscope tools, microscopes, and stationary cameras. Surgical footage can then be graded using checklists that assess whether a trainee completed predefined tasks or via a global rating scale that assesses how well the tasks were performed and the quality of the final product. Blinded grading of surgical footage is overwhelmingly supported in previous research for its ability to differentiate among skill levels, with only one study [64] reporting difficulty differentiating above a certain skill threshold (Table 1). Surgical footage grading has been successful for basic surgical tasks such as suturing [12, 77] and more advanced surgical procedures [11, 20, 24, 41, 50, 51, 52, 65, 69, 78, 81, 82]. Surgical footage grading is also useful for tracking improvements in skill over time [3, 61], and was found to be an effective method of grading arthroscopic Bankart repair skill with high inter-rater reliability [30].
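A minimal sketch of the two grading instruments just described (a task checklist and a global rating scale) follows; the checklist items and rating domains are invented examples, not any validated instrument:

```python
# Sketch of the two video-grading instruments described above. The
# checklist items and rating domains are invented, not a validated tool.
def checklist_score(completed):
    """Task checklist: 1 point per predefined step completed."""
    return sum(1 for done in completed.values() if done)

def global_rating_score(ratings, scale_max=5):
    """Global rating scale: each domain rated 1..scale_max; report the total."""
    assert all(1 <= r <= scale_max for r in ratings.values())
    return sum(ratings.values())

checklist = {"diagnostic sweep": True, "probe meniscus": True, "view patella": False}
grs = {"instrument handling": 4, "economy of motion": 3, "flow of procedure": 4}

print(checklist_score(checklist))  # steps completed out of 3
print(global_rating_score(grs))    # points out of 15 possible
```

The checklist captures whether tasks were done; the global rating scale captures how well, which is why studies often report both.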

Table 1.

Summary of studies that used film review to enhance orthopaedic surgical performance

Procedure studied Article Study design Population studied Finding
Film review: grading orthopaedic trainees
Basic surgical skills (such as suturing) Beason et al. [12] Observational PGY-1 surgical residents (general surgery, orthopaedic surgery, plastic surgery, urology, ENT, vascular surgery, neurosurgery) Capable of evaluating skill level
Sanfey et al. [77] Observational PGY-1 residents (general surgery, orthopaedic surgery, otolaryngology, plastic surgery, urology) This is a pilot study presenting 2 years of data; Beason et al. [12] present the 10-year findings.
Applying a dynamic compression plate on a cadaveric porcine model Leong et al. [56] Observational Novices (senior house officers in a basic surgical rotation), intermediates (specialist registrars in trauma and orthopaedics), experts (consultant orthopaedic surgeons) Differentiated by experience level
Applying a short arm plaster cast to a simulator Moktar et al. [64] Observational Medical students, orthopaedic residents, orthopaedic fellows, orthopaedic technologist Differentiated medical students from orthopaedic fellows and an orthopaedic technologist, but did not make any other differentiations
ACL reconstruction on a dry model Dwyer et al. [24] Observational Residents, orthopaedic sports medicine fellows, staff surgeons Differentiated by experience level
Carpal tunnel decompressions in the OR Davies et al. [20] Observational Surgeons of differing abilities Differentiated by experience level
Open reduction and internal fixation of hip fractures in the OR Taylor et al. [81] Observational Orthopaedic residents (PGY-2, 3, and 5) Nonexpert grading of video is capable of evaluating skill level
Taylor et al. [82] Observational Orthopaedic residents (PGY-2-5) and a staff surgeon Differentiated by experience level
Open reduction and internal fixation of fractures in an animal model Hoyt et al. [41] Observational Junior residents (PGY-1, 2), senior residents (PGY-3, 4), chiefs/attendings (PGY-5+) Differentiated by experience level
Arthroscopic hip labral repairs on models Phillips et al. [69] Observational Orthopaedic surgery residents (junior: PGY-1, 2, 3; senior: PGY-4, 5), sports medicine fellows, staff surgeons Differentiated by experience level
Simulated shoulder arthroscopy procedures Bayona et al. [11] Observational Novices, intermediate surgeons (<50 arthroscopic procedures as main surgeon), expert surgeons (>50 arthroscopic procedures) Differentiated by experience level
Diagnostic knee arthroscopy on a virtual reality simulator Mulligan et al. [65] Observational Medical students, intermediate orthopaedic trainees (PGY-5 to 11 with <50 independent arthroscopies), expert consultant orthopaedic surgeons (>50 knee arthroscopies per year) Differentiated by experience level
Diagnostic arthroscopy of knees and shoulders in the OR Koehler et al. [51] Observational Orthopaedic residents (PGY-3, 4, 5), a sports medicine fellow, sports medicine faculty members Differentiated fellow and faculty from residents but did not differentiate among residents
Diagnostic arthroscopy on cadaveric knees Koehler et al. [50] Observational Orthopaedic residents (organized by knee arthroscopy procedure experience and PGY level), orthopaedic faculty Differentiated by experience level
Koehler et al. [52] Observational Orthopaedic residents (organized by knee arthroscopy procedure experience and PGY level), orthopaedic sports medicine surgeons Differentiated by experience level
Slade Shantz et al. [78] Observational Novice (residents without prior arthroscopy experience), experienced residents (residents with previous arthroscopy experience), fellows (sports medicine fellows in the 11th month of fellowship), faculty (physicians with most of their practice consisting of arthroscopic procedures) Differentiated by experience level
Simulated arthroscopic meniscal repairs Alvand et al. [3] Observational Orthopaedic residents (PGY-2-4) Tracked a learning curve
Minimally invasive unilateral laminotomy on a lumbar stenosis model Melcher et al. [61] Observational Orthopaedic and neurologic surgery residents Tracked a learning curve
Arthroscopic Bankart repair Gallagher et al. [30] Observational PGY-4, 5 orthopaedic residents High interrater reliability between two raters using a procedure-specific checklist
Carpal tunnel releases on patients in the OR Bjorklund et al. [15] Observational Plastic surgery residents (PGY-1, 5, 6) This study revealed that blinded graders from other institutions are best for grading
Film review: enhancing learning
Fracture/dislocation reduction and initial splinting of patients who present to the emergency department with a displaced traumatic fracture or dislocation requiring a closed reduction Jain et al. [44] Observational Junior orthopaedic residents (PGY-1-3) performed the procedures; all residents (PGY-1-5) and fellowship-trained orthopaedic surgeons attended a film review session Junior residents and faculty found the film review sessions helpful
Fracture reduction simulation Dickerson et al. [21] Randomized controlled trial Orthopaedic surgery residents (PGY-1-5) Learning augmented with film review had no effect on performance relative to conventional learning methods
Simulated vascular anastomosis Backstein et al. [9] Randomized controlled trial First-year surgical residents (orthopaedic, cardiac surgery, plastic surgery, neurology, otolaryngology, general surgery) Learning augmented with film review had no effect on performance relative to conventional learning methods
Plating and tension band wiring of surgical model fractures Backstein et al. [8] Randomized controlled trial Orthopaedic surgery residents (PGY-1-5) Learning augmented with film review had no effect on performance relative to conventional learning methods
Lumbar spine procedures in the OR Heiland et al. [36] Randomized controlled trial Neurosurgery residents (nine PGY-1-3, three PGY-4, 5) Learning augmented with film review enhanced early learning relative to conventional methods, but both groups eventually reached the same performance level
Minimally invasive plate osteosynthesis of distal radius fractures Ducournau et al. [23] Observational Surgeons (level 3/5 clinical fellows as measured on the Objective Structured Assessment of Technical Skill scale, or non-tenured hospital surgeons) Film review improved performance
Reduction and fixation of an intra-articular tibial plafond fracture model Karam et al. [47] Randomized controlled trial Orthopaedic residents (PGY-1, 2) Learning augmented with film review improved performance relative to conventional learning methods

OR = operating room; PGY = postgraduate year; ACL = anterior cruciate ligament; ENT = ear, nose, and throat.

Does Motion Analysis Enhance Learning?

There is a paucity of evidence regarding whether students learn better when they have access to their motion metrics, and the available evidence is indecisive. Relative to conventional training methods, training on a simulator with access to motion metrics improved residents’ learning of cadaveric anterior-approach THA [58] but conferred no advantage when learning diagnostic knee arthroscopy [62].

Does Motion Analysis Allow for Grading Trainees?

Surgical Tasks Other Than Arthroscopy

Most studies assessing motion analysis were performed on arthroscopic procedures (Table 2). Of the studies examining surgical tasks other than arthroscopy, each assessed a different surgical task and collected different motion parameters, so a definitive conclusion cannot be ascertained from their combined results. The results, however, showed that some motion parameters can differentiate surgeons by skill level in nonarthroscopic surgical tasks while other motion parameters cannot. This inconsistency suggests more research should be done before motion analysis is used to grade orthopaedic trainees while they perform nonarthroscopic tasks.

Table 2.

Summary of studies that used motion analysis to enhance orthopaedic surgical performance

Procedure studied Article Study design Population studied Motion metric collected Finding
Motion analysis: arthroscopy
Arthroscopic knee procedures on a virtual reality simulator (diagnostic knee arthroscopy, loose body retrieval, partial meniscectomy) Chang et al. [17] Observational Novice arthroscopists (medical students and PGY-1-3 orthopaedic residents), proficient arthroscopists (PGY-4, 5 orthopaedic residents and orthopaedic staff surgeons) Tool distance traveled Differentiated by experience level
Damage to tissue Did not differentiate by experience level
Dammerer et al. [18] Observational Medical students, orthopaedic residents Tool distance traveled Differentiated by experience level, tracked a learning curve
Putzer et al. [72] Observational Medical students (never performed arthroscopy), orthopaedic residents (taken an arthroscopy course and performed <5 arthroscopic surgeries) Tool distance traveled Differentiated by experience level
Damage to tissue Did not differentiate by experience level
Diagnostic virtual reality hip arthroscopy and loose body removal Bishop et al. [13] Observational Novices (medical students and PGY-1, 2 orthopaedic residents), intermediates (PGY-3, 4 orthopaedic residents), senior trainees (PGY-5 orthopaedic residents, orthopaedic fellows), attending faculty Camera and grasper path lengths, scratching the femoral and acetabular cartilage These metrics were used in a calculation of a total simulator score, which was unreliable in differentiating users by experience level
Virtual reality arthroscopic ACL reconstruction Antonis et al. [7] Observational Novices (orthopaedic surgeons with no experience or <15 ACL reconstructions), experts (orthopaedic surgeons with >100 ACL reconstruction procedures as primary surgeons) Damage to femur and tibia cartilage, distance traveled by tools Did not differentiate by experience level
Diagnostic arthroscopy of the knee, hip, and shoulder Alvand et al. [2] Observational Novices (medical students), orthopaedic resident group (performed between 50 and 150 arthroscopic procedures), expert group (consultant orthopaedic surgeons: >700 arthroscopic procedures) Hand motion collected from sensors worn on the dorsum of hands (number of hand movements and path length of hands) Differentiated by experience level
Howells et al. [39] Observational Nonsurgeons (no arthroscopic experience), junior surgeons (orthopaedic surgeons with <50 arthroscopies), senior surgeons (orthopaedic surgeons with >200 arthroscopies) Differentiated by experience level
Howells et al. [40] Randomized controlled trial Junior orthopaedic trainees (<2 years of surgical training, <10 arthroscopies) Tracked a learning curve
Pollard et al. [70] Randomized controlled trial Orthopaedic trainees (junior and senior) Tracked a learning curve
Simulated arthroscopic meniscal repair Jackson et al. [43] Observational Orthopaedic residents (>20 diagnostic knee arthroscopies) Tracked a learning curve
Cadaveric diagnostic arthroscopies Rose et al. [75] Observational Novices (PGY-3 orthopaedic residents), intermediates (PGY-4 orthopaedic residents), fellowship-trained orthopaedic surgeons Wireless sensors worn on the arms, sternum, and lumbar spine (arm position) Differentiated by experience level
Loose body removals on a shoulder arthroscopy simulator Tronchot et al. [83] Observational Intermediate (orthopaedic surgeons who performed <100 arthroscopic surgeries), expert (orthopaedic surgeons with subspecialization in arthroscopic shoulder surgery: >100) Tool trajectory (average velocity, motion economy, and motion smoothness) Differentiated by experience level
Tool acceleration and path length Did not differentiate by experience level
Touching a moving sphere in a shoulder arthroscopy simulator Gomoll et al. [32] Observational Medical students, orthopaedic residents, attending orthopaedic surgeons Ratio of tool path length to ideal path Differentiated by experience level
Probe velocity and collisions with tissue Did not differentiate by experience level
Pedowitz et al. [67] Observational Medical students, orthopaedic residents, orthopaedic faculty Ratio of tool path length to ideal path Differentiated by experience level
Probe velocity and collisions with tissue Did not differentiate by experience level
Arthroscopic Bankart sutures on a shoulder simulator Howells et al. [38] Observational Fellowship-trained lower limb orthopaedic surgeons Hand motion collected from sensors worn on the dorsum of hands (number of hand movements and path length of hands) Tracked a learning curve
Simulated shoulder arthroscopic diagnostic tasks and arthroscopic Bankart labral repairs Middleton et al. [63] Observational Medical students, interns, orthopaedic trainees, and orthopaedic faculty Differentiated by experience level
Standardized tasks on an arthroscopic virtual reality simulator Kirby et al. [49] Observational Novices (no independent arthroscopies), intermediates (<100 independent arthroscopies), experts (>100 independent arthroscopies) Elbow and wrist sensors (number of hand movements) Differentiated by experience level
Basic virtual reality arthroscopic tasks (interacting with shapes) Rose et al. [76] Observational Novices (medical students), intermediates (orthopaedic residents), experts (faculty) Tool path length, tool path length relative to an ideal path, ambidexterity (difference between tool path length in right and left hand), center deviation (angle between the actual and optimal camera-arthroscope center line) These metrics differentiated by experience level, but were unreliable and inconsistent, working only for certain tasks depending on whether the tools were being used with a dominant or nondominant hand
Vaghela et al. [84] Observational Orthopaedic interns, residents, fellows, and attending surgeons Camera percent alignment, camera path length, instrument path length Tool path length differentiated by experience level, but these metrics were unreliable and inconsistent, working only for certain tasks depending on whether the tools were being used with a dominant or nondominant hand
Virtual reality arthroscopic Tetris game Pedowitz et al. [68] Observational Orthopaedic residents (PGY-2, 3, 4), orthopaedic faculty Camera and grasper path length in dominant vs non-dominant hand Differentiated by experience level
Motion analysis: surgical tasks other than arthroscopy
Simulated antegrade femoral nailing (proximal guidewire entry and distal locking) Racy et al. [73] Observational Orthopaedic trainees, orthopaedic consultants Drill tip distance (economy of movement) Differentiated by experience level
Synthetic bone drilling Pourkand et al. [71] Observational Orthopaedic residents (PGY-1), orthopaedic faculty Arm motion Did not differentiate by experience level
Articular fracture reductions on a model Yehyawi et al. [90] Observational Junior orthopaedic residents (PGY-1, 2), senior orthopaedic residents (PGY-4, 5) Total hand distance Differentiated by experience level
Number of hand movements Did not differentiate by experience level
Cannulating synthetic lumbar pedicles Woodrow et al. [89] Observational Junior neurosurgery residents (PGY-2-4), neurosurgeons with focused practices in spinal surgery Mean and peak forces Differentiated by experience level
Elbow and probe motion Did not differentiate by experience level
Virtual reality hemilaminectomy Bissonnette et al. [14] Observational Medical students, orthopaedic and neurosurgery residents (junior and senior), spine fellows, spine surgeons Burr angle variance and force applied to the dura Differentiated by experience level
Motion analysis: enhancing learning
Cadaveric anterior approach total hip arthroplasty Logishetty et al. [58] Randomized controlled trial Orthopaedic residents (PGY-3-5) Feedback metrics included: hand path lengths, component orientation, total time Virtual reality training with access to motion metric feedback improved learning of the procedure relative to conventional training methods
Virtual reality anterior approach total hip replacement Logishetty et al. [57] Observational Orthopaedic residents (PGY-1-4), expert hip surgeons (>100 total hip replacements) Efficiency of movement (dominant and non-dominant hand path length, head movement) Residents trained with a virtual reality anterior approach total hip replacement simulator, which tracked a learning curve and showed improvements in efficiency of movement that reached expert surgeon levels
Diagnostic knee arthroscopy Middleton et al. [62] Randomized controlled trial Medical students, interns Feedback metrics included: total time, cartilage damage, camera path length, instrument path length Training with access to motion metric feedback on a virtual reality arthroscopy simulator had no effect on learning of the procedure relative to training on a benchtop arthroscopy simulator without motion feedback

PGY = postgraduate year; ACL = anterior cruciate ligament.

Arthroscopy

Many arthroscopy simulators offer summary feedback of performance in the form of motion metrics, including distance traveled by the arthroscopic camera, probe, and grasper (Table 3). Another metric is roughness, the maximum depth applied to damageable tissues by a tool in the simulation. Although tool-distance-traveled metrics differentiated between trainees with different skill levels performing diagnostic knee arthroscopy, loose body retrievals, and partial meniscectomies on a virtual reality arthroscopy simulator [17, 18, 72], the roughness parameter was not different between trainees with different experience levels [17, 72]. In other studies, neither tool distance traveled nor damage to tissues were reliable metrics in differentiating users by experience level during diagnostic virtual reality hip arthroscopy and loose body removal [13], as well as during virtual reality arthroscopic ACL reconstruction [7].
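A hedged sketch of how two of these simulator metrics could be computed from tracked tool-tip samples follows; the coordinates and depth values are invented, and real simulators compute these metrics internally:

```python
# Sketch of two simulator metrics discussed above: tool path length
# (distance traveled by the tracked tip) and a roughness-style metric
# (maximum depth a tool pressed into damageable tissue). All sample
# values are invented for illustration.
import math

def path_length(samples):
    """Total distance traveled by a tool tip over (x, y, z) samples, in mm."""
    return sum(math.dist(a, b) for a, b in zip(samples, samples[1:]))

def roughness(depth_samples_mm):
    """Maximum penetration depth into tissue across the procedure."""
    return max(depth_samples_mm, default=0.0)

tip = [(0, 0, 0), (3, 4, 0), (3, 4, 12)]   # mm
print(path_length(tip))                    # 5 + 12 = 17 mm
print(roughness([0.0, 0.4, 1.1, 0.7]))     # 1.1 mm
```

A shorter path length is generally read as greater economy of motion, whereas roughness is meant to capture tissue handling; as the studies above show, only the former has consistently tracked experience level.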

Table 3.

Summary of common motion metrics collected in motion analysis studies

Motion sensor Motion metric collected What the motion metric represents
Sensors in arthroscopic tools Distance traveled by arthroscopic camera, probe, and grasper Tool distance traveled
Roughness Maximum depth applied to damageable tissues by a tool
Average velocity, motion smoothness, economy of motion, acceleration, path length, ratio of path length to an ideal path, collision with tissue Probe trajectory
Wireless sensors on the dorsum of hands Number of hand movements and path length of hands Hand motion
Wireless sensors on the dorsal forearm, lateral arm, sternum, and lumbar spine 3-dimensional angular joint movement of the shoulder and elbow; trunk movement Arm motion
Elbow sensor (three fingerbreadths below the lateral epicondyle in line with the radius) Motion and rotation Hand motion
Wrist sensor (midpoint of the dorsum of each wrist in line with the ulnar styloid) Motion and rotation Hand motion

Hand motion metrics, including the number of hand movements and path length of hands from wireless sensors worn on the dorsum of the hands, have been consistently useful in differentiating between trainees with different skill levels [2, 39] and tracking learning curves [40, 70] while trainees perform diagnostic arthroscopy of the knee, hip, and shoulder, as well as simulated arthroscopic meniscal repair [43]. Wireless sensors on the arms, sternum, and lumbar spine provided insight into performance differences between residents and experienced orthopaedic surgeons performing cadaveric diagnostic arthroscopies. As experience increased, surgeons kept their shoulders more adducted and wrists supinated. Novices had more variable shoulder and elbow movements, and their arms spent less time around their mean joint position [75].
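A hand-movement count is typically derived by segmenting the sensor trace into discrete excursions. The sketch below shows one common approach, threshold crossing on a speed trace; the threshold and trace values are invented, not taken from any of the cited studies:

```python
# Hedged sketch: counting discrete hand movements from a wrist-sensor
# speed trace by threshold crossing. The threshold and the trace below
# are invented illustrative values.
def count_movements(speeds_mm_s, threshold=20.0):
    """Count excursions where speed rises above the threshold, then falls back."""
    moving, count = False, 0
    for s in speeds_mm_s:
        if not moving and s > threshold:
            moving, count = True, count + 1
        elif moving and s <= threshold:
            moving = False
    return count

trace = [5, 8, 30, 45, 12, 6, 25, 40, 38, 10, 4]
print(count_movements(trace))  # 2 distinct movements in this trace
```

Fewer, smoother movements for the same task are the signature of expertise that these sensor studies repeatedly detect.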

Probe trajectory metrics, including average velocity, motion economy, and motion smoothness, differentiated between intermediate and expert orthopaedic surgeons performing loose body removals on a shoulder arthroscopy simulator, although there was no difference for acceleration or path length [83]. When users touched a moving sphere in a shoulder arthroscopy simulator, the ratio of their path length to an ideal path differentiated between experience levels, whereas probe velocity and collisions with tissues did not [32, 67].
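The path-ratio metric mentioned above can be illustrated with a small sketch; the probe coordinates and target are hypothetical, and this is not the simulators' actual implementation:

```python
import math

def path_ratio(positions, target):
    """Economy of motion: length of the path actually traveled divided by the
    ideal straight-line path from the start position to the target."""
    traveled = sum(math.dist(a, b) for a, b in zip(positions, positions[1:]))
    return traveled / math.dist(positions[0], target)

# A probe that detours before reaching the target at (4, 0, 0):
detour = [(0, 0, 0), (0, 3, 0), (4, 0, 0)]
print(path_ratio(detour, (4, 0, 0)))  # 2.0 (traveled 8 units; ideal path is 4)
```

A ratio of 1.0 would indicate a perfectly economical path; larger values indicate wasted motion.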

Hand sensors tracked improvements in the number of hand movements and hand path length as fellowship-trained lower limb surgeons practiced arthroscopic Bankart sutures on a shoulder simulator [38], and they differentiated users by experience level in simulated shoulder arthroscopic diagnostic tasks and arthroscopic Bankart labral repairs [63]. Elbow and wrist sensors similarly provided data on the number of hand movements and differentiated users of a shoulder arthroscopy simulator by experience level, although elbow sensors were more accurate [49].

During basic virtual reality arthroscopic tasks, such as interacting with shapes, movement metrics were less successful in differentiating users by experience level [76, 84], although camera and grasper path length metrics collected during a virtual reality Tetris game showed that orthopaedic faculty performed equivalently with the left and right hands, whereas residents lacked ambidexterity [68] (Table 2).

Knowledge Gaps

The American Board of Orthopaedic Surgeons and the Accreditation Council for Graduate Medical Education oversee orthopaedic surgical training and set training milestones to ensure graduating residents have developed proficiency in orthopaedic knowledge and surgical skill. The American Board of Orthopaedic Surgeons has set the performance standards for orthopaedic surgeons since 1934 and has appropriately adapted its standards over time to best serve the needs of the public. Currently, there are no standardized guidelines on the use of film review or motion analysis in orthopaedic surgical training as outlined by the American Board of Orthopaedic Surgeons or the Accreditation Council for Graduate Medical Education. Orthopaedic surgery training requirements are moving in that direction, however; the American Board of Orthopaedic Surgeons and the Residency Review Committee for Orthopaedic Surgery implemented mandatory surgical skills simulation training for postgraduate year 1 residents in 2013 [5]. This training is intended to improve basic skills early in training to aid in the acquisition of advanced skills later. Residency programs can fulfill this requirement through any simulation training program, but the American Board of Orthopaedic Surgeons website contains a suggested set of 17 modules, each focusing on a different orthopaedic skill ranging from suturing and knot tying to basic arthroplasty skills. The modules include instructions for setting up the simulation training and suggested readings for each skill, but they offer no objective, validated way to grade the trainee. The Global Index for Technical Skills tool was found to be efficacious in assessing orthopaedic residents’ technical skills in the American Board of Orthopaedic Surgeons modules [10], but it requires live grading and could be further improved by grading via film review and motion analysis.

Implementation of these technologies would be expensive. There would be costs associated with purchasing cameras and motion sensors and hiring staff to design the curriculum and organize the film and motion data. These costs could be justified if the technologies were to help programs educate more-effective surgeons, although most studies in this review used surgery models. The ability of film review and motion analysis to differentiate between the skill levels of surgeons suggests the findings of these studies can be translated into an operating room setting, because more-skilled surgeons would be more skilled in both a simulated and real environment. Nonetheless, more studies demonstrating the effectiveness of these technologies in a real operating room setting would better support their widespread use. More studies are also needed to determine the utility of these technologies in common procedures such as hip arthroplasty.

The evidence that film review and motion analysis are useful as surgical training tools supports their use as ways to grade trainees. Meanwhile, studies examining their use as augments to learning, in which students review their own film or motion data, had mixed results. Although students subjectively report satisfaction with learning augmented by access to footage and motion data, objective data on the skill improvements resulting from this learning method only partially support it. Another dilemma is whether the interpretation of motion analysis data is as effective without video footage as it is when assessed in the context of video. For instance, film review and motion analysis were used to assess the performance of trainees in a simulated hip arthroscopy task. Grading of video footage of the participants' hands and of arthroscope footage had excellent interrater reliability and showed differences based on experience level. Similarly, motion metrics of hand path length and the number of hand movements showed differences based on experience level [25]. Both methods were efficacious in evaluating trainees, but motion metrics were limited to objective movement data, whereas the video graders could also consider trainee decision-making. In that study, the researchers considered video grading superior because the motion data could not account for situations in which a trainee moved in an efficient but unsafe way. Arthroscopy simulators collect data on tool collisions with tissue, but there are instances in which a trainee makes unsafe maneuvers that do not result in tissue impact. Therefore, interpreting motion data alongside procedure video may be best practice.

When considering implementing film review into a surgical curriculum, an important question is whether it would be more efficient to have trainees watch footage of experts perform procedures rather than their own surgical footage, which may contain mistakes. Trainees who watched a video of nonexperts performing virtual reality arthroscopic tasks performed better, according to the metric of camera path length, than trainees who watched experts performing the virtual reality arthroscopic tasks when both groups completed the tasks themselves [55]. Similarly, surgeons improved their performance of minimally invasive plate osteosynthesis of distal radius fractures after watching video clips of their own and their colleagues’ mistakes [23]. These findings likely can be attributed to the process of learning from seeing what not to do, suggesting surgical trainees may benefit from watching each other’s, and their own, operative video footage.

Barriers and How to Overcome Them

Regarding film review, there are several barriers, each with potential solutions. When editing surgical footage, it is imperative to include a holistic representation of the trainee's performance rather than focusing only on the technically challenging parts of the procedure. Although condensing footage saves time, excessive condensation can edit out mistakes, making the grading of surgical footage less useful. For instance, short (2-minute) videos of a procedure in which a vein patch was inserted into an artery did not include footage of trainees tangling a suture because this part was inadvertently edited out to keep the video short [19]. As a result, some trainees received higher scores for their snapshot footage than they did when a global assessment was used to grade the same performance. A second barrier is the time required. Film review saves graders valuable time because they need only watch the short, edited clips, but the editing process itself is time intensive. It took 1 to 2 hours per video to edit footage of plastic surgery residents performing carpal tunnel releases, ultimately condensing the original 10- to 15-minute surgical videos into 6- to 8-minute clips [15]. Artificial intelligence can streamline this process by automatically segmenting surgical footage into chunks for review [46], but until this technology is more widely tested, hiring and training staff to edit the clips may be a necessary but costly approach. Another logistical challenge with film review is determining how many graders are necessary to assess performance; it has been estimated that an assessment should rely on the grades of five to seven different experts to account for the idiosyncrasies of the judges [88]. This raises the issue of who should grade the videos.
A study in which plastic surgery residents performed carpal tunnel releases on patients in the operating room revealed a potential bias in grading: trainees received different scores depending on whether the grader knew their identity and performance history [15]. Expert graders, including orthopaedic and plastic surgeons from outside institutions, were blinded to the identity, postgraduate year, and performance history of the residents. Faculty graders, some of whom were present for the procedures, were not blinded and consistently assigned higher scores to the residents than the expert graders did. Moreover, interrater reliability among faculty graders was fair to moderate, whereas that of the expert graders was moderate to substantial [15]. These findings suggest that expert surgeons from outside institutions may be best suited to grade surgical footage. As for the method of recording, a lightweight, easy-to-wear camera with long battery life would enable reliable and feasible video recording that does not compromise sterility or disrupt surgical flow.

Motion analysis also has barriers to implementation, each with potential solutions. Motion sensors must be unobtrusive and have minimal impact on a sterile procedure if they are to be worn in the operating room. Elbow sensors are a promising choice because they provide reliable motion data without disrupting surgical flow [49]. A second barrier arises when deciphering the data collected. When assessing motion data in nonarthroscopic surgical procedures, care must be taken to ensure the data represent surgical tasks rather than motions such as reaching for tools or interacting with colleagues. One solution is to synchronize video recordings with motion data, using video of the surgical procedure to remove motion data collected during nonsurgical parts of the procedure [90]. Syncing motion data with video footage also addresses the concern that motion data alone are insufficient without the context video provides. Motion data are therefore most useful when gathered and interpreted alongside video recordings. Another barrier to widespread adoption of motion analysis is the learning curve associated with collecting and understanding motion data. Most of the motion analysis studies in this review involved arthroscopic tasks, which is understandable because many arthroscopy simulators include software that automatically collects and interprets motion data. For nonarthroscopic tasks, institutions would need to purchase motion sensors and software to interpret the data, making motion analysis more difficult to implement than film review. Motion analysis has a clear role in defining new metrics to assess surgical trainees, and it has even been proposed as a method of determining which medical students can achieve the arthroscopic dexterity necessary for a career in orthopaedic surgery [1]. There were just as many studies on motion analysis as on film review, and more are to come; implementation of motion technology will become easier as it is studied further. Metrics of hand motion, arm position, arm motion, and tool motion can assess surgical proficiency in a variety of procedures and could play a large role in the future of orthopaedic surgical training.
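A minimal sketch of the video-synchronization idea described above: motion samples are kept only if they fall inside intervals that video review has marked as surgical activity. The timestamps and intervals are hypothetical:

```python
def filter_by_intervals(samples, surgical_intervals):
    """Keep only (timestamp, position) samples recorded during video-annotated
    surgical intervals, discarding motion such as reaching for tools."""
    return [(t, pos) for t, pos in samples
            if any(start <= t <= end for start, end in surgical_intervals)]

# Samples at 1 Hz; video review marked seconds 0-2 and 5-6 as surgical tasks.
samples = [(t, (float(t), 0.0, 0.0)) for t in range(8)]
kept = filter_by_intervals(samples, [(0, 2), (5, 6)])
print([t for t, _ in kept])  # [0, 1, 2, 5, 6]
```

Metrics such as path length or movement counts would then be computed only over the retained samples, so that nonsurgical motion does not inflate them.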

5-year Forecast

We believe film review and motion analysis will not only be implemented into orthopaedic surgical training programs to improve resident training, but they will also serve as tools to improve performance and efficiency of practicing surgeons. We believe these changes will not come in the form of mandates; rather, we speculate individual programs will begin using these technologies to produce more-skilled surgeons. Regarding resident training, we foresee standardized collection of longitudinal data on trainees as they progress through their training, enabling better skills evaluation and easier identification of areas for improvement. Surgical skill thresholds, as assessed by motion analysis, will likely be established to ensure trainees achieve an appropriate level of proficiency in basic surgical skills before assisting in the operating room. We also predict video assessment of surgical technique will be used as a safe alternative to intraoperative feedback, allowing time for more detailed scrutiny without the risk of harming performance in the moment. With the advent of improved video and motion feedback systems, surgeons and surgical trainees will acquire and refine surgical skills in an efficient, safe, and reliable manner.

Footnotes

Each author certifies that there are no funding or commercial associations (consultancies, stock ownership, equity interest, patent/licensing arrangements, etc.) that might pose a conflict of interest in connection with the submitted article related to the author or any immediate family members.

All ICMJE Conflict of Interest Forms for authors and Clinical Orthopaedics and Related Research® editors and board members are on file with the publication and can be viewed on request.

The opinions expressed are those of the writer, and do not reflect the opinion or policy of CORR® or The Association of Bone and Joint Surgeons®.

This work was performed at the Department of Orthopaedic Surgery, Warren Alpert Medical School of Brown University, University Orthopedics, Providence, RI, USA.

References

1. Alvand A, Auplish S, Khan T, Gill HS, Rees JL. Identifying orthopaedic surgeons of the future: the inability of some medical students to achieve competence in basic arthroscopic tasks despite training: a randomised study. J Bone Joint Surg Br. 2011;93-B:1586-1591.
2. Alvand A, Khan T, Al-Ali S, Jackson WF, Price AJ, Rees JL. Simple visual parameters for objective assessment of arthroscopic skill. J Bone Joint Surg Am. 2012;94:e97.
3. Alvand A, Logishetty K, Middleton R, et al. Validating a global rating scale to monitor individual resident learning curves during arthroscopic knee meniscal repair. Arthroscopy. 2013;29:906-912.
4. American Board of Orthopaedic Surgery. ABOS knowledge, skills, and behavior program onboarding handbook. Available at: https://www.abos.org/wp-content/uploads/2022/01/5488-ABOS_KSB-Handbook_UPDATE_011221.pdf. Accessed May 30, 2022.
5. American Board of Orthopaedic Surgery. ABOS surgical skills modules for PGY-1 residents. Available at: https://www.abos.org/residents/residency-skills-modules/abos-surgical-skills-modules-for-pgy-1-residents/. Accessed May 30, 2022.
6. American Board of Surgery. General surgery training requirements. Available at: https://www.absurgery.org/default.jsp?certgsqe_training. Accessed May 30, 2022.
7. Antonis J, Bahadori S, Gallagher K, Immins T, Wainwright TW, Middleton R. Validation of the anterior cruciate ligament (ACL) module of the VirtaMed virtual reality arthroscopy trainer. Surg Technol Int. 2019;35:311-319.
8. Backstein D, Agnidis Z, Regehr G, Reznick R. The effectiveness of video feedback in the acquisition of orthopedic technical skills. Am J Surg. 2004;187:427-432.
9. Backstein D, Agnidis Z, Sadhu R, MacRae H. Effectiveness of repeated video feedback in the acquisition of a surgical technical skill. Can J Surg. 2005;48:195-200.
10. Bagley JJ, Piazza B, Lazarus MD, Fox EJ, Zhan X. Resident training and the assessment of orthopaedic surgical skills. JB JS Open Access. 2021;6:e20.00173.
11. Bayona S, Akhtar K, Gupte C, Emery RJ, Dodds AL, Bello F. Assessing performance in shoulder arthroscopy: the imperial global arthroscopy rating scale (IGARS). J Bone Joint Surg Am. 2014;96:e112.
12. Beason AM, Hitt CE, Ketchum J, Rogers H, Sanfey H. Verification of proficiency in basic skills for PGY-1 surgical residents: 10-year update. J Surg Educ. 2019;76:e217-e224.
13. Bishop ME, Ode GE, Hurwit DJ, et al. The arthroscopic surgery skill evaluation tool global rating scale is a valid and reliable adjunct measure of performance on a virtual reality simulator for hip arthroscopy. Arthroscopy. 2021;37:1856-1866.
14. Bissonnette V, Mirchi N, Ledwos N, Alsidieri G, Winkler-Schwartz A, Del Maestro RF. Artificial intelligence distinguishes surgical training levels in a virtual reality spinal task. J Bone Joint Surg Am. 2019;101:e127.
15. Bjorklund KA, Sommer N, Neumeister MW, Kasten SJ. Establishing validity evidence for an operative performance rating system for plastic surgery residents. J Surg Educ. 2019;76:529-539.
16. Blevins JL, Felix KJ, Ling DI, et al. Surgical games: a simulation-based structured assessment of orthopedic surgery resident technical skill. J Surg Educ. 2020;77:1605-1614.
17. Chang J, Banaszek DC, Gambrel J, Bardana D. Global rating scales and motion analysis are valid proficiency metrics in virtual and benchtop knee arthroscopy simulators. Clin Orthop Relat Res. 2016;474:956-964.
18. Dammerer D, Putzer D, Wurm A, Liebensteiner M, Nogler M, Krismer M. Progress in knee arthroscopy skills of residents and medical students: a prospective assessment of simulator exercises and analysis of learning curves. J Surg Educ. 2018;75:1643-1649.
19. Datta V, Bann S, Mandalia M, Darzi A. The surgical efficiency score: a feasible, reliable, and valid method of skills assessment. Am J Surg. 2006;192:372-378.
20. Davies RM, Hadfield-Law L, Turner PG. Development and evaluation of a new formative assessment of surgical performance. J Surg Educ. 2018;75:1309-1316.
21. Dickerson P, Grande S, Evans D, Levine B, Coe M. Utilizing intraprocedural interactive video capture with google glass for immediate postprocedural resident coaching. J Surg Educ. 2019;76:607-619.
22. Dougherty PJ, Andreatta P. CORR® curriculum-orthopaedic education: competency-based medical education-how do we get there? Clin Orthop Relat Res. 2017;475:1557-1560.
23. Ducournau F, Meyer N, Xavier F, Facca S, Liverneaux P. Learning a MIPO technique for distal radius fractures: mentoring versus simple experience versus deliberate practice. Orthop Traumatol Surg Res. 2021;107:102939.
24. Dwyer T, Slade Shantz J, Chahal J, et al. Simulation of anterior cruciate ligament reconstruction in a dry model. Am J Sports Med. 2015;43:2997-3004.
25. Erturan G, Alvand A, Judge A, Pollard TCB, Glyn-Jones S, Rees JL. Prior generic arthroscopic volume correlates with hip arthroscopic proficiency: a simulator study. J Bone Joint Surg Am. 2018;100:e3.
26. Fundamentals of Laparoscopic Surgery. FLS program description. Available at: https://www.flsprogram.org/index/fls-program-description/. Accessed May 30, 2022.
27. Gabbett T, Jenkins D, Abernethy B. Physical collisions and injury during professional rugby league skills training. J Sci Med Sport. 2010;13:578-583.
28. Gabbett TJ. GPS analysis of elite women's field hockey training and competition. J Strength Cond Res. 2010;24:1321-1324.
29. Gabbett TJ, Seibold AJ. Relationship between tests of physical qualities, team selection, and physical match performance in semiprofessional rugby league players. J Strength Cond Res. 2013;27:3259-3265.
30. Gallagher AG, Ryu RKN, Pedowitz RA, Henn P, Angelo RL. Inter-rater reliability for metrics scored in a binary fashion-performance assessment for an arthroscopic Bankart repair. Arthroscopy. 2018;34:2191-2198.
31. Giles B, Peeling P, Kovalchik S, Reid M. Differentiating movement styles in professional tennis: a machine learning and hierarchical clustering approach. Eur J Sport Sci. Published online December 30, 2021. DOI: 10.1080/17461391.2021.2006800.
32. Gomoll AH, O'Toole RV, Czarnecki J, Warner JJP. Surgical experience correlates with performance on a virtual reality simulator for shoulder arthroscopy. Am J Sports Med. 2007;35:883-888.
33. Guyatt G, Oxman AD, Akl EA, et al. GRADE guidelines: 1. Introduction-GRADE evidence profiles and summary of findings tables. J Clin Epidemiol. 2011;64:383-394.
34. Haluck RS, Krummel TM. Computers and virtual reality for surgical education in the 21st century. Arch Surg. 2000;135:786-792.
35. Hamidi Rad M, Aminian K, Gremeaux V, Massé F, Dadashi F. Swimming phase-based performance evaluation using a single IMU in main swimming techniques. Front Bioeng Biotechnol. 2021;9:793302.
36. Heiland DH, Petridis AK, Maslehaty H, et al. Efficacy of a new video-based training model in spinal surgery. Surg Neurol Int. 2014;5:1.
37. Higginbotham DO, Zalikha AK, Stoker SK, Little BE. The impact of COVID-19 on the orthopaedic surgery residency experience. Spartan Med Res J. 2021;6:25963.
38. Howells NR, Auplish S, Hand GC, Gill HS, Carr AJ, Rees JL. Retention of arthroscopic shoulder skills learned with use of a simulator. Demonstration of a learning curve and loss of performance level after a time delay. J Bone Joint Surg Am. 2009;91:1207-1213.
39. Howells NR, Brinsden MD, Gill RS, Carr AJ, Rees JL. Motion analysis: a validated method for showing skill levels in arthroscopy. Arthroscopy. 2008;24:335-342.
40. Howells NR, Gill HS, Carr AJ, Price AJ, Rees JL. Transferring simulated arthroscopic skills to the operating theatre: a randomised blinded study. J Bone Joint Surg Br. 2008;90-B:494-499.
41. Hoyt BW, Clark DM, Lundy AE, Schroeder NS, Wagner SC, Langhammer C. Validation of a high-fidelity fracture fixation model for skill acquisition in orthopedic surgery residents. J Surg Educ. 2022;79:1282-1294.
42. Hu YY, Peyre SE, Arriaga AF, et al. Postgame analysis: using video-based coaching for continuous professional development. J Am Coll Surg. 2012;214:115-124.
43. Jackson WF, Khan T, Alvand A, et al. Learning and retaining simulated arthroscopic meniscal repair skills. J Bone Joint Surg Am. 2012;94:e132.
44. Jain NS, Schwarzkopf R, Scolaro JA. Video review as a tool to improve orthopedic residents' performance of closed manipulative reductions. J Surg Educ. 2017;74:663-667.
45. Johnston RJ, Watsford ML, Pine MJ, Spurrs RW, Murphy A, Pruyn EC. Movement demands and match performance in professional Australian football. Int J Sports Med. 2012;33:89-93.
46. Kadkhodamohammadi A, Sivanesan Uthraraj N, Giataganas P, et al. Towards video-based surgical workflow understanding in open orthopaedic surgery. Computer Methods in Biomechanics and Biomedical Engineering: Imaging & Visualization. 2021;9:286-293.
47. Karam MD, Thomas GW, Koehler DM, et al. Surgical coaching from head-mounted video in the training of fluoroscopically guided articular fracture surgery. J Bone Joint Surg Am. 2015;97:1031-1039.
48. King T, Jenkins D, Gabbett T. A time-motion analysis of professional rugby league match-play. J Sports Sci. 2009;27:213-219.
49. Kirby GS, Guyver P, Strickland L, et al. Assessing arthroscopic skills using wireless elbow-worn motion sensors. J Bone Joint Surg Am. 2015;97:1119-1127.
50. Koehler RJ, Amsdell S, Arendt EA, et al. The arthroscopic surgical skill evaluation tool (ASSET). Am J Sports Med. 2013;41:1229-1237.
51. Koehler RJ, Goldblatt JP, Maloney MD, Voloshin I, Nicandri GT. Assessing diagnostic arthroscopy performance in the operating room using the arthroscopic surgery skill evaluation tool (ASSET). Arthroscopy. 2015;31:2314-2319.e2.
52. Koehler RJ, Nicandri GT. Using the arthroscopic surgery skill evaluation tool as a pass-fail examination. J Bone Joint Surg Am. 2013;95:e187-e1876.
53. Larsson P. Global positioning system and sport-specific testing. Sports Med. 2003;33:1093-1101.
54. Larsson P, Henriksson-Larsén K. Combined metabolic gas analyser and dGPS analysis of performance in cross-country skiing. J Sports Sci. 2005;23:861-870.
55. LeBel ME, Haverstock J, Cristancho S, van Eimeren L, Buckingham G. Observational learning during simulation-based training in arthroscopy: is it useful to novices? J Surg Educ. 2018;75:222-230.
56. Leong JJH, Leff DR, Das A, et al. Validation of orthopaedic bench models for trauma surgery. J Bone Joint Surg Br. 2008;90-B:958-965.
57. Logishetty K, Gofton WT, Rudran B, Beaulé PE, Cobb JP. Fully immersive virtual reality for total hip arthroplasty: objective measurement of skills and transfer of visuospatial performance after a competency-based simulation curriculum. J Bone Joint Surg Am. 2020;102:e27.
58. Logishetty K, Rudran B, Cobb JP. Virtual reality training improves trainee performance in total hip arthroplasty: a randomized controlled trial. Bone Joint J. 2019;101-B:1585-1592.
59. Mackenzie CF, Yang S, Garofalo E, et al. Enhanced training benefits of video recording surgery with automated hand motion analysis. World J Surg. 2021;45:981-987.
60. McCluney AL, Vassiliou MC, Kaneva PA, et al. FLS simulator performance predicts intraoperative laparoscopic skill. Surg Endosc. 2007;21:1991-1995.
61. Melcher C, Hussain I, Kirnaz S, et al. Use of a high-fidelity training simulator for minimally invasive lumbar decompression increases working knowledge and technical skills among orthopedic and neurosurgical trainees. Global Spine J. Published online February 28, 2022. DOI: 10.1177/21925682221076044.
62. Middleton RM, Alvand A, Garfjeld Roberts P, Hargrove C, Kirby G, Rees JL. Simulation-based training platforms for arthroscopy: a randomized comparison of virtual reality learning to benchtop learning. Arthroscopy. 2017;33:996-1003.
63. Middleton RM, Vo A, Ferguson J, et al. Can surgical trainees achieve arthroscopic competence at the end of training programs? A cross-sectional study highlighting the impact of working time directives. Arthroscopy. 2017;33:1151-1158.
64. Moktar J, Popkin CA, Howard A, Murnaghan ML. Development of a cast application simulator and evaluation of objective measures of performance. J Bone Joint Surg Am. 2014;96:e76.
65. Mulligan A, Vaghela KR, Jeyaseelan L, Lee J, Akhtar K. Transferable global rating scales in the validation of the ArthroSim™ virtual reality arthroscopy simulator. Surg Technol Int. 2020;37:306-311.
66. Page MJ, McKenzie JE, Bossuyt PM, et al. The PRISMA 2020 statement: an updated guideline for reporting systematic reviews. BMJ. 2021;372:n71.
67. Pedowitz RA, Esch J, Snyder S. Evaluation of a virtual reality simulator for arthroscopy skills development. Arthroscopy. 2002;18:e29.
68. Pedowitz R, Nicandri G, Tuchschmid S. Asymmetry in dominant/non-dominant hand performance differentiates novices from experts on an arthroscopy virtual reality serious game. Stud Health Technol Inform. 2016;220:289-294.
69. Phillips L, Cheung JJH, Whelan DB, et al. Validation of a dry model for assessing the performance of arthroscopic hip labral repair. Am J Sports Med. 2017;45:2125-2130.
70. Pollard TC, Khan T, Price AJ, Gill HS, Glyn-Jones S, Rees JL. Simulated hip arthroscopy skills: learning curves with the lateral and supine patient positions: a randomized trial. J Bone Joint Surg Am. 2012;94:e68.
71. Pourkand A, Salas C, Regalado J, et al. Objective evaluation of motor skills for orthopedic residents using a motion tracking drill system: outcomes of an ABOS approved surgical skills training program. Iowa Orthop J. 2016;36:13-19.
72. Putzer D, Dammerer D, Baldauf M, Lenze F, Liebensteiner MC, Nogler M. A prospective assessment of knee arthroscopy skills between medical students and residents-simulator exercises for partial meniscectomy and analysis of learning curves. Surg Innov. 2022;29:398-405.
73. Racy M, Barrow A, Tomlinson J, Bello F. Development and validation of a virtual reality haptic femoral nailing simulator. J Surg Educ. 2021;78:1013-1023.
74. Rampinini E, Impellizzeri FM, Castagna C, Coutts AJ, Wisløff U. Technical performance during soccer matches of the Italian Serie A league: effect of fatigue and competitive level. J Sci Med Sport. 2009;12:227-233.
75. Rose M, Curtze C, O'Sullivan J, et al. Wearable inertial sensors allow for quantitative assessment of shoulder and elbow kinematics in a cadaveric knee arthroscopy model. Arthroscopy. 2017;33:2110-2116.
76. Rose K, Pedowitz R. Fundamental arthroscopic skill differentiation with virtual reality simulation. Arthroscopy. 2015;31:299-305.
77. Sanfey H, Ketchum J, Bartlett J, et al. Verification of proficiency in basic skills for postgraduate year 1 residents. Surgery. 2010;148:759-767.
78. Slade Shantz JA, Leiter JR, Collins JB, MacDonald PB. Validation of a global assessment of arthroscopic skills in a cadaveric knee model. Arthroscopy. 2013;29:106-112.
79. Smith DJ, Norris SR, Hogg JM. Performance evaluation of swimmers: scientific tools. Sports Med. 2002;32:539-554.
80. Sroka G, Feldman LS, Vassiliou MC, Kaneva PA, Fayez R, Fried GM. Fundamentals of laparoscopic surgery simulator training to proficiency improves laparoscopic performance in the operating room-a randomized controlled trial. Am J Surg. 2010;199:115-120.
81. Taylor LK, Thomas GW, Karam MD, Kreiter CD, Anderson DD. Assessing wire navigation performance in the operating room. J Surg Educ. 2016;73:780-787.
82. Taylor LK, Thomas GW, Karam MD, Kreiter CD, Anderson DD. Developing an objective assessment of surgical performance from operating room video and surgical imagery. IISE Trans Healthc Syst Eng. 2018;8:110-116.
83. Tronchot A, Berthelemy J, Thomazeau H, et al. Validation of virtual reality arthroscopy simulator relevance in characterising experienced surgeons. Orthop Traumatol Surg Res. 2021;107:103079.
84. Vaghela KR, Trockels A, Lee J, Akhtar K. Is the virtual reality fundamentals of arthroscopic surgery training program a valid platform for resident arthroscopy training? Clin Orthop Relat Res. 2022;480:807-815.
85. Varley MC, Fairweather IH, Aughey RJ. Validity and reliability of GPS for measuring instantaneous velocity during acceleration, deceleration, and constant motion. J Sports Sci. 2012;30:121-127.
86. Vassiliou MC, Dunkin BJ, Fried GM, et al. Fundamentals of endoscopic surgery: creation and validation of the hands-on test. Surg Endosc. 2014;28:704-711.
87. Williams RG, Sanfey H, Chen XP, Dunnington GL. A controlled study to determine measurement conditions necessary for a reliable and valid operative performance assessment: a controlled prospective observational study. Ann Surg. 2012;256:177-187.
88. Wisbey B, Montgomery PG, Pyne DB, Rattray B. Quantifying movement demands of AFL football using GPS tracking. J Sci Med Sport. 2010;13:531-536.
  • 89.Woodrow SI, Dubrowski A, Khokhotva M, Backstein D, Rampersaud YR, Massicotte EM. Training and evaluating spinal surgeons: the development of novel performance measures. Spine. 2007;32:2921-2925. [DOI] [PubMed] [Google Scholar]
  • 90.Yehyawi TM, Thomas TP, Ohrt GT, et al. A simulation trainer for complex articular fracture surgery. J Bone Joint Surg Am. 2013;95:e92. [DOI] [PMC free article] [PubMed] [Google Scholar]

Articles from Clinical Orthopaedics and Related Research are provided here courtesy of The Association of Bone and Joint Surgeons