Abstract
Purpose of Review
Imaging technologies (X-ray, CT, MRI, and ultrasound) have revolutionized orthopedic surgery, allowing for more efficient diagnosis, monitoring, and treatment of musculoskeletal ailments. The current review investigates recent literature surrounding the impact of augmented reality (AR) imaging technologies on orthopedic surgery. In particular, it investigates the impact that AR technologies may have on provider cognitive burden, operative times, occupational radiation exposure, and surgical precision and outcomes.
Recent Findings
Many AR technologies have been shown to lower provider cognitive burden and reduce operative time and radiation exposure while improving surgical precision in pre-clinical cadaveric and sawbones models. So far, only a few platforms focusing on pedicle screw placement have been approved by the FDA. These technologies have been implemented clinically with mixed results when compared to traditional free-hand approaches.
Summary
It remains to be seen if current AR technologies can deliver on their multitude of promises, and the ability to do so seems contingent upon continued technological progress. Additionally, the impact of these platforms will likely be highly conditional on clinical indication and provider type. It remains unclear if AR will be broadly accepted and utilized or if it will be reserved for niche indications where it adds significant value. One thing is clear: orthopedics’ high utilization of pre- and intra-operative imaging, combined with the relative ease of tracking rigid structures like bone as compared to soft tissues, has made it the clear beachhead market for AR technologies in medicine.
Keywords: Augmented reality, Extended reality, Orthopedics, Imaging, Surgical navigation
Introduction
The field of orthopedics has been continually transformed by revolutionary new imaging technologies, starting with the common X-ray in 1895 and moving on to ultrasound (1950s), computerized tomography (CT) (1972), and magnetic resonance imaging (MRI) (1977), with each technological step enhancing the ability of the orthopedic surgeon to diagnose, monitor, and treat musculoskeletal ailments. Advances in these technologies have led to step-wise higher-quality image acquisition at a faster speed, lower cost, and with less radiation exposure. The next revolutionary advancement in imaging may not be in the acquisition, but instead in the display of these images within the realm of extended reality (ER).
Extended reality refers to a family of immersive technologies, including virtual reality (VR), augmented reality (AR), and mixed reality (MR), which seek to merge the physical and virtual worlds. Virtual reality completely immerses the user in a digitally created world using a head-mounted device that creates a 360-degree view of an artificial world and is often combined with 3D audio and haptic feedback devices to create a more immersive experience. VR has seen prominent use in simulation-based medical training, with some applications assisting with pre-operative planning [1]. However, because the user loses sight of their real environment, there are currently no relevant intra-operative applications.
Augmented reality (AR) overlays or augments the real world with virtual content. The digital content can be displayed through AR glasses or via screens, tablets, and smartphones. The most recognizable application that brought attention to this technology was Pokémon Go [2]. Because users maintain their view of the real world, this technology has several interesting clinical use cases. Mixed reality (MR), also known as merged or hybrid reality, is a more advanced form of AR in which digital and real-world objects coexist and interact with one another. This allows virtual objects to be anchored in the physical world. For example, a 3D reconstructed pelvis can be anchored over the patient’s native anatomy such that it will remain in the correct anatomic position no matter how the AR device moves in relation to the patient. These applications require significantly more processing power than either VR or AR. For the purpose of this review, the term augmented reality (AR) will be used to encompass both AR and MR.
Augmented Reality
There are three overarching types of augmented reality: projection based, video-see-through (VST), and optical-see-through (OST). In projection-based AR, a 2D or 3D image is projected onto the site of interest (Fig. 1a) [3•, 4–7]. The advantage of this approach is in the direct overlay of the image on the site of interest and in the ability of all nearby observers to simultaneously see and interact with the AR scene. However, its efficacy can be limited by lighting conditions and changes in the surface contours or appearance of the projection site.
Fig. 1.
Three common types of augmented reality (AR). a Projection-based AR demonstrated by a laser-projected QWERTY keyboard. b Video-see-through (VST) AR demonstrated by an interior decorating mobile application. c Optical-see-through (OST) AR demonstrated by the Microsoft HoloLens displaying surgical guidance lines. Fig. 1a cropped to emphasize the subject, labeled for reuse according to the Creative Commons Attribution-Share Alike 2.0 Generic License (From Wikimedia Commons). Fig. 1b and c labeled for reuse according to the Creative Commons Attribution-Share Alike 4.0 International License (From Wikimedia Commons)
In VST, the digital content is superimposed on a live video feed of the real world which can be displayed on a monitor, tablet, or phone screen [8, 9••, 10••, 11–13, 14•, 15–26]. This has been popularized by mobile gaming (e.g., Pokémon Go) and the design industry where it allows for visualization of virtual objects in a physical space, such as a couch in a living room (Fig. 1b).
In OST, the digital content is projected onto an optically clear lens that the user is able to see through. This is typically accomplished via a head-mounted display (HMD) such as Google Glass or the Microsoft HoloLens [20, 21, 27, 28, 29•, 30–33]. This can create an incredibly immersive experience; however, only the wearer of the HMD can see and interact with the virtual content, which can impede effective communication in the OR (Fig. 1c). Additionally, HMDs are often bulky and present an ergonomic challenge to surgeons who would be required to wear these devices for the duration of a surgery.
A high-fidelity AR experience in the operating room is built upon three key technologies: visual rendering, image registration, and tracking [17, 34–38]. Visual rendering is the process of generating a 2D or 3D image by means of a computer program. The complexity of the rendering and the speed at which it needs to be rendered and refreshed for a given application dictate the required processing power of the AR device, which has important implications for the ergonomics of the system [34, 36]. Image registration is the creation and overlaying of physical and virtual coordinate systems, which allows for the accurate placement of virtual content within the physical scene. The accuracy of this process is critical for surgical outcomes, as an inaccurate registration of even a few millimeters could lead to drastic surgical complications [34].
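To make the registration step concrete, the following is a minimal sketch (in Python with NumPy, not drawn from any of the cited systems) of paired-point rigid registration: fiducial positions identified in the CT volume are matched to the same fiducials digitized by the tracking camera, a least-squares rotation and translation is computed via singular value decomposition, and the residual fiducial registration error is reported in millimeters. All point coordinates are hypothetical.

```python
import numpy as np

def rigid_register(ct_pts, tracker_pts):
    """Least-squares rigid transform (R, t) mapping CT-space fiducials onto
    their tracker-space counterparts; both inputs are (N, 3) arrays, N >= 3."""
    ct_c, tr_c = ct_pts.mean(axis=0), tracker_pts.mean(axis=0)
    H = (ct_pts - ct_c).T @ (tracker_pts - tr_c)                 # cross-covariance
    U, _, Vt = np.linalg.svd(H)
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])  # guard against reflection
    R = Vt.T @ D @ U.T
    t = tr_c - R @ ct_c
    return R, t

def fiducial_registration_error(ct_pts, tracker_pts, R, t):
    """Root-mean-square residual in the same units as the inputs (e.g., mm)."""
    residuals = (R @ ct_pts.T).T + t - tracker_pts
    return float(np.sqrt((residuals ** 2).sum(axis=1).mean()))

# Hypothetical example: four fiducials localized in the CT volume (mm) and the
# same fiducials digitized by the tracking camera in room coordinates.
ct = np.array([[0, 0, 0], [50, 0, 0], [0, 40, 0], [0, 0, 30]], dtype=float)
theta = np.deg2rad(20)
R_true = np.array([[np.cos(theta), -np.sin(theta), 0.0],
                   [np.sin(theta),  np.cos(theta), 0.0],
                   [0.0, 0.0, 1.0]])
tracker = (R_true @ ct.T).T + np.array([100.0, -20.0, 5.0])

R, t = rigid_register(ct, tracker)
print(fiducial_registration_error(ct, tracker, R, t))  # ~0 mm for noise-free points
```

A residual of even a few millimeters at this step would propagate directly into every virtual overlay, which is why registration accuracy is treated above as a critical determinant of surgical outcome.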
Tracking is the process by which objects are followed, and their exact positions within the overlaid virtual and visual coordinate systems are determined. It is critical to track the patient’s anatomy as well as any tools that enter the surgical field. Tracking is done via optical and/or electromagnetic signals and can be accomplished with or without the aid of markers, with each approach having unique strengths and weaknesses [36].
Marker-based tracking requires physical markers to be placed within the operative field, adding additional time, workflow changes, and potential obstructions. However, markers provide a higher level of accuracy with a lower computational burden than marker-less tracking. The two most common types of markers are spherical markers, which either actively emit or passively reflect incident IR light, and fiducial markers, which have a unique size, shape, or pattern to enable easy registration (Fig. 2) [19, 27, 28, 34, 35, 37, 38, 41–43]. Marker-based tracking for tools most often involves the attachment of IR or fiducial markers in a unique 3D arrangement that allows the machine vision application to accurately determine the 3D position of the tool within the coordinate system (Fig. 2a and c) [19, 27, 28, 37, 41–43].
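As an illustration of how a tracked marker array is turned into a tool-tip position, the sketch below fits a rigid transform between a hypothetical manufacturer-specified sphere geometry and the sphere centers reported by the tracking camera, then maps a known tip offset into room coordinates. The geometry, offsets, and measurements are invented for illustration and do not correspond to any commercial tracker.

```python
import numpy as np

def fit_rigid(src, dst):
    """Rotation R and translation t such that dst ≈ R @ src + t (Kabsch fit)."""
    sc, dc = src.mean(axis=0), dst.mean(axis=0)
    U, _, Vt = np.linalg.svd((src - sc).T @ (dst - dc))
    d = np.sign(np.linalg.det(Vt.T @ U.T))          # keep a proper rotation
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    return R, dc - R @ sc

# Hypothetical tool definition: IR sphere centers and the instrument tip
# expressed in the tool's own coordinate frame (mm).
spheres_tool = np.array([[0.0, 0.0, 0.0], [60.0, 0.0, 0.0], [30.0, 45.0, 0.0]])
tip_tool = np.array([30.0, 15.0, -180.0])            # tip sits below the marker plane

# Sphere centers as reported by the stereo IR camera in room coordinates
# (values invented, with a little simulated measurement noise).
spheres_cam = np.array([[212.1, 40.3, 905.6],
                        [268.9, 19.8, 910.2],
                        [247.4, 63.5, 940.1]])

R, t = fit_rigid(spheres_tool, spheres_cam)
tip_cam = R @ tip_tool + t                            # tool tip in room coordinates
print(np.round(tip_cam, 1))
```

Because the unique 3D arrangement of the spheres is what disambiguates the tool’s orientation, marker arrays avoid collinear layouts.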
Fig. 2.
Surgical markers. a Reflective IR tracking spheres used to monitor the 3D position of an acetabular impactor. b Optical fiducial markers to register placement of femoral and tibial cutting guides. c Reflective IR tracking spheres on a posterior-superior-iliac-spine reference frame and awl-tipped tap. Fig. 2a cropped to emphasize the subject, labeled for reuse according to the Creative Commons Public Domain Mark 1.0 [39]. Fig. 2b Reprint permission given by Pixee Médical. Fig. 2c labeled for reuse according to the Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 License [40]
Marker-less tracking of the patient and/or tools can be accomplished using either a single-camera or stereoscopic camera setup in conjunction with computer vision algorithms [6, 8, 14•, 17, 30–32, 44]. The benefits of marker-less tracking are reduced setup time and easier workflow integration. However, this method is far more computationally demanding, and the level of accuracy is not currently acceptable for clinical use. For OST-HMD-based AR, it is also critical to track the location and vantage point of the wearer. This is typically accomplished through some combination of onboard sensors and cameras in the HMD but is sometimes aided by the use of external cameras [34, 36].
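For intuition on the stereoscopic case, the sketch below triangulates a single 3D point from its pixel coordinates in two calibrated cameras using linear (direct linear transform) triangulation. The camera intrinsics, baseline, and landmark are hypothetical, and real marker-less systems add feature detection, matching, and surface reconstruction on top of this step.

```python
import numpy as np

def triangulate(P1, P2, uv1, uv2):
    """Linear (DLT) triangulation of one 3D point from its pixel coordinates
    uv1 and uv2 in two views with 3x4 projection matrices P1 and P2."""
    A = np.vstack([uv1[0] * P1[2] - P1[0],
                   uv1[1] * P1[2] - P1[1],
                   uv2[0] * P2[2] - P2[0],
                   uv2[1] * P2[2] - P2[1]])
    _, _, Vt = np.linalg.svd(A)
    X = Vt[-1]                                   # null-space solution (homogeneous)
    return X[:3] / X[3]

# Hypothetical calibration: identical intrinsics, second camera offset 100 mm along x.
K = np.array([[800.0, 0.0, 320.0], [0.0, 800.0, 240.0], [0.0, 0.0, 1.0]])
P1 = K @ np.hstack([np.eye(3), np.zeros((3, 1))])
P2 = K @ np.hstack([np.eye(3), np.array([[-100.0], [0.0], [0.0]])])

# Project a known landmark (mm, camera-1 frame) to get the pixels each camera sees,
# then recover it from the two pixel observations alone.
X_true = np.array([25.0, -10.0, 600.0, 1.0])
uv1 = (P1 @ X_true)[:2] / (P1 @ X_true)[2]
uv2 = (P2 @ X_true)[:2] / (P2 @ X_true)[2]
print(np.round(triangulate(P1, P2, uv1, uv2), 2))   # recovers [25., -10., 600.]
```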
Clinical Advantages of AR
Image-based orthopedic procedures require the use of external monitors, which demand frequent attention shifts, create hand-eye coordination problems, and impose a large cognitive load, all of which have the potential to reduce efficiency in the OR and impact accuracy, creating a higher chance of surgical error [45, 46]. Augmented reality technologies have immense potential to solve these problems and reduce cognitive load, freeing cognitive reserve to focus on education, mentorship, and managing complications if they arise [45, 46].
Attention shifts can be eliminated by using OST-HMDs to display fluoroscopic images in the corner of a surgeon’s view of the operative field. When this technology was tested in a sawbones model, orthopedic surgeons no longer needed to avert their eyes and head when placing femoral head guide wires, resulting in improved tip-apex distance, less radiation exposure, and shorter insertion times than with conventional fluoroscopy (Fig. 3) [33]. Hand-eye coordination problems can be solved by augmenting the operative site with fluoroscopic images in a precise overlay in what has been coined video-augmented-fluoroscopy (VAF) (Fig. 4a). This is accomplished with a camera-augmented mobile C-arm (CamC) that uses a double mirror system to co-register the video and fluoroscopic feeds with an accuracy of <1 mm [13, 14•, 15–19, 24, 42]. VAF has been investigated for distal locking of intramedullary nails (IMNs) and for K-wire insertion in a variety of trauma models, with initial reports demonstrating faster operative times, lower X-ray requirements, and subjective reports from operators of improved spatial orientation [14•, 17, 19, 24].
Fig. 3.
Using an OST-HMD to eliminate attention shift. a Surgeon shifting attention to fluoroscopy monitor. b Surgeon wearing smart glasses showing fluoroscopy images no longer needs to shift attention away from the operative field. Fig. 3 labeled for reuse according to the Creative Commons Attribution-NonCommercial 4.0 License [33]
Fig. 4.
AR applications in surgery. a Video-augmented-fluoroscopy (VAF) showing fluoroscopic images overlaid on a live video feed of the operative field to aid with start point localization and drilling in a percutaneous bovine model. b Digitally reconstructed radiographs (DRRs) showing digitally rendered coronal, sagittal, and transverse radiographs at the level of the surgeon’s instrument. c Projector overlaying a rendering of the patient’s CT scan to assist with start point localization. d Integral videography (IV) system: left side showing the system setup and right side showing the surgeon’s view of an in situ 3D visualization of the patient’s knee based on pre-operative CT scans. Fig. 4a reproduced with permission of John Wiley and Sons [14•]. Fig. 4b reproduced with permission of Elsevier [8]. Fig. 4c reproduced with permission of Elsevier [6]. Fig. 4d reproduced with permission of Elsevier [47]
The aforementioned technologies have a limited impact on cognitive load because they still require the surgeon to create 3D mental reconstructions from 2D images, a process that can be counterintuitive and error-prone due to projective simplification [42]. AR technologies can use pre- or intra-operative CT scans to create 3D anatomic overlays and/or provide a live feed of digitally reconstructed radiographs (DRRs) at different viewing angles without the need to move the C-arm or take additional X-rays (Fig. 4b) [42, 48]. The 3D reconstructions are registered to the patient anatomy using markers attached to anatomic landmarks (Fig. 2c) or, in more sophisticated systems, via optical surface mapping and machine vision approaches [13, 15–17, 48, 49]. Fischer et al. investigated the impact of three successively more advanced approaches, (1) conventional X-ray fluoroscopy, (2) VAF, and (3) 3D visualization with DRRs, on key operative outcomes and showed significant improvements in operative time, radiation exposure, and task load as measured by the Surgical Task Load Index (SURG-TLX) [17].
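At its simplest, a DRR is a simulated radiograph obtained by integrating CT attenuation along ray paths through the volume. The toy sketch below uses parallel rays and a crude Hounsfield-unit-to-attenuation conversion to produce two orthogonal views of a synthetic volume; clinical systems instead cast perspective rays through the registered CT using the calibrated C-arm geometry, and all values here are illustrative.

```python
import numpy as np

def parallel_drr(ct_hu, axis, voxel_cm):
    """Toy parallel-beam DRR: convert Hounsfield units to an approximate linear
    attenuation coefficient (water ~0.2 /cm), integrate along one volume axis,
    and map the line integrals to image intensities in [0, 1)."""
    mu = np.clip(0.2 * (1.0 + ct_hu / 1000.0), 0.0, None)   # HU -> 1/cm, approximate
    line_integral = mu.sum(axis=axis) * voxel_cm             # integrate along the rays
    return 1.0 - np.exp(-line_integral)                      # denser tissue appears brighter

# Hypothetical 128^3 CT volume with 0.1 cm isotropic voxels: soft tissue (~40 HU)
# containing a denser, bone-like cylinder (~700 HU) running along the first axis.
ct = np.full((128, 128, 128), 40.0)
jj, kk = np.meshgrid(np.arange(128), np.arange(128), indexing="ij")
ct[:, (jj - 64) ** 2 + (kk - 64) ** 2 < 15 ** 2] = 700.0

view_a = parallel_drr(ct, axis=1, voxel_cm=0.1)   # one simulated viewing direction
view_b = parallel_drr(ct, axis=2, voxel_cm=0.1)   # an orthogonal view, no extra X-rays
print(view_a.shape, view_b.shape)
```

Because the views are computed from the registered CT rather than acquired, arbitrarily many viewing angles can be generated without additional radiation, which is the property the systems above exploit.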
The other main approach to reducing cognitive load by providing in situ 3D visualizations has been the use of projection-based methods [3•, 4–7, 50]. Projection-based approaches avoid the ergonomic concerns of HMDs and the spatial discontinuity of VST while allowing multiple viewers to see and interact with the AR environment. Initial forays into projection-based methods used projectors that directly overlay mapped anatomic information on the patient’s skin (Fig. 4c) [3•, 6, 7]. Gavaghan et al. showed that this visualization approach improved spatial understanding and reduced the need for sight diversion in both a biopsy and bone tumor resection model [7]. In the spine surgery arena, Wu et al. demonstrated that this approach facilitated faster localization of pedicle screw entry points intra-operatively [6]. However, accuracy is limited by body habitus and potential changes in posture from pre-operative imaging; thus, this system is unable to guide insertion trajectory [6]. Xu et al. improved upon this technology and demonstrated accurate screw guidance with an acceptable degree of error in a pre-clinical model (Fig. 4c) [3•].
Another approach to projection-based AR is the autostereoscopic image overlay of integral videography (IV). Autostereoscopy is a means of generating 3D images without the need for special headgear or glasses on the part of the viewer. Integral videography captures and reproduces a light field using a 2D array of micro-lenses to create autostereoscopic images. By using a half-silvered mirror device, the operator can directly perceive the real environment while also viewing a reflected IV video overlay, creating the illusion of a 3D image inside the patient’s body (Fig. 4d) [4, 5, 50, 51]. This allows for the precise depth perception that is essential to an AR navigation system and avoids any issues with tissue deformation that would obscure visualization with a conventional projector. Siemionow et al. demonstrated the ability of an optically tracked IV system to facilitate starting point localization for thoracolumbar instrumentation in a cadaveric model [52•]. However, these systems can be bulky and time-consuming to set up and have the potential to impair visualization of the operative field.
While orthopedics and other surgical disciplines have trended from open to minimally invasive procedures, this approach often increases radiation exposure for both the patient and the whole medical team, as surgeons need to take repetitive X-ray images to control the procedure and validate the placement of tools and implants [53–62]. AR technologies can allow surgeons to track the movement of their tools and implants in relation to patient anatomy in real time, obviating the need for repetitive X-ray exposure. Using the CamC system to co-register fluoroscopic images to the live native scene promises to make fluoroscopic image acquisition more targeted and efficient, drastically decreasing the patient and occupational radiation dose. Studies have demonstrated significant reductions in the number of X-rays taken, by up to 46%, as well as lower occupational radiation exposure as measured by dosimeter [14, 17, 42, 63]. These approaches have primarily been used for establishing starting points for percutaneous access in trauma surgeries.
Another approach to reducing occupational radiation exposure is using 3D virtual reconstructions registered to the patient’s anatomy to provide real-time instrument tracking and DRRs, obviating the need for intra-operative fluoroscopy [5, 11•, 64, 65•]. Several groups (Edström et al. [11•], Elmi-Terander et al. [11•, 64], and Burström et al. [65•]) have utilized intra-operative cone-beam CT (CBCT) scans to create and register these 3D models. With CBCT, providers can step out of the room or properly shield themselves so that they experience near-zero radiation exposure. Other groups, however, aim to remove ionizing radiation from the operating room altogether. Ma et al. proposed using intra-operative ultrasound to register a pre-op CT to the patient’s spinal anatomy, while Dibble et al. proposed using a posterior-superior-iliac-crest pin/marker combination to register a pre-op CT to the native anatomy [4, 27]. These solutions have much more market potential, as they can be implemented in most operative settings without requiring a highly expensive intra-operative CBCT scanner.
A controversial area is the impact of augmented reality technology on operative times, as these systems introduce new equipment that can require technical setup and complicate surgical workflow. These technologies often increase total procedure time, by up to 65 min in some studies [66–70]. Many proposed technologies have claimed to significantly reduce operative time: Fischer et al.’s CBCT-DRR system showed a 60% time-savings for the insertion of K-wires in phantom trauma models, and the SureShot system showed an 11-min time-savings for the placement of distal locking IMN screws in live patients [17, 71]. However, whether these reported modest time improvements translate to real-world benefits is unclear, and further research on cost-effectiveness is needed.
Furthermore, many technologies that appear to be time-saving in pre-clinical models fail to live up to that promise when utilized in a clinical setting. Elmi-Terander et al. demonstrated a decrease in navigation time per pedicle screw in a cadaveric model using a CBCT AR system. However, when this technology was translated to the clinical setting, the time per pedicle screw was equivalent to that of the freehand technique, and the average operative time was unchanged [9, 15, 64].
As far as accuracy and precision are concerned, a number of groups have studied AR systems in phantom models for a variety of applications: pedicle screw placement [3•, 5, 29•, 31, 64, 65•, 72•, 73], tumor resection [16, 74], distal IMN interlocking [4], K-wire or needle placement [21, 33, 42, 75], arthroplasty [13, 43, 44, 76], and vertebroplasty [77], with some groups showing improvement over traditional freehand approaches. However, the jump from pre-clinical sawbones and cadaveric models to live patients is substantial, with significantly more degrees of freedom introduced.
Spinal surgery, and in particular pedicle screw placement, has received the most attention and as a result has the most technologies at a clinical stage [35]. However, clinical data on the accuracy of augmented reality surgical navigation (ARSN) have been limited and at times contradictory. Su et al. compared CT-guided navigation with the freehand technique and found that navigation improved screw placement accuracy in the thoracic but not the lumbar spine [78]. Chan et al. performed a systematic review of small cohorts and found moderate evidence of a decreased breach rate using ARSN as compared to freehand [79]. A recent review by Elmi-Terander et al. demonstrated improved screw accuracy as well as a higher percentage of screws without a cortical breach in the ARSN as opposed to the freehand group [9••].
While data has been mixed on the efficacy of ARSN for pedicle screw (PS) placement in the general population, there is compelling evidence that it can improve the accuracy of PS placement in more challenging conditions such as scoliosis, in which screws are more likely to be mal-positioned [80••]. Comparing the results of scoliosis surgeries from one surgeon using both ARSN and freehand (FH) approaches, the ARSN approach allowed the surgeon to increase the PS density and minimize the use of hooks, creating better constructs that will hopefully obviate the need for revision surgery [10••]. In a similar study, using ARSN as opposed to FH led to significantly lower rates of revision surgery for mal-positioned PS (1.35% vs. 4.38%; p < 0.01) [81].
However, there are significant limitations to accuracy that need to be addressed in future systems. Jin et al. showed that accuracy decreases significantly as the distance from the IR-marker spinous process reference frame increases: the risk of screw misplacement doubles at 2 levels and quadruples at 3 levels away from the reference frame [80••]. There is also the risk of an operator or assistant inadvertently touching the reference frame during the procedure; if this causes any movement of the tracker, it will need to be repositioned and a repeat CBCT scan taken to re-register the scene [80••].
Conclusion
Orthopedics’ high utilization of pre- and intra-operative imaging, combined with the relative ease of tracking rigid structures like bone as compared to soft tissues, has made it the clear beachhead market for AR technologies in medicine. AR technologies hold the promise of reducing the cognitive burden on providers, allowing them to operate with more precision and accuracy, all while lowering operative time and radiation exposure. However, it is unclear if current technologies deliver on this promise, and the ability to do so seems contingent upon continued technological progress. The accessibility, efficacy, and costs of these technologies will continue to improve, but their true cost-effectiveness has yet to be determined and remains a major roadblock to continued development. AR technologies appear to be at an inflection point, with an increasing number of technologies moving from proof of concept to investigational clinical trials, a trend that can be expected to accelerate as technological advancements solve workflow issues, lower costs, and expand access to these technologies.
Code Availability
Not applicable
Author Contribution
Andrew Furman – Idea generation, manuscript research, writing, and preparation
Dr. Wellington Hsu – Idea generation, manuscript editing
Data availability
Not applicable
Declarations
Conflict of Interest
Andrew Furman – No conflicts
Dr. Wellington Hsu – Advisory board member of Stryker, Medtronic, Asahi, Bioventus
Footnotes
This article is part of the Topical Collection on The Use of Technology in Orthopaedic Surgery—Intraoperative and Post Operative Management
Publisher’s note
Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.
References
Papers of particular interest, published recently, have been highlighted as: • Of importance •• Of major importance
1. Verhey JT, Haglin JM, Verhey EM, Hartigan DE. Virtual, augmented, and mixed reality applications in orthopedic surgery. Int J Med Robot Comput Assist Surg. 2020;16:e2067. doi: 10.1002/rcs.2067.
2. Chong Y, Sethi DK, Loh CHY, Lateef F. Going forward with Pokemon Go. J Emerg Trauma Shock. 2018;11:243–246. doi: 10.4103/JETS.JETS_87_17.
3. Xu B, Yang Z, Jiang S, Zhou Z, Jiang B, Yin S. Design and validation of a spinal surgical navigation system based on spatial augmented reality. Spine (Phila Pa 1976). 2020;45:E1627–E1e33. doi: 10.1097/BRS.0000000000003666.
4. Ma L, Zhao Z, Zhang B, Jiang W, Fu L, Zhang X, Liao H. Three-dimensional augmented reality surgical navigation with hybrid optical and electromagnetic tracking for distal intramedullary nail interlocking. Int J Med Robot Comput Assist Surg. 2018;14:e1909. doi: 10.1002/rcs.1909.
5. Ma L, Zhao Z, Chen F, Zhang B, Fu L, Liao H. Augmented reality surgical navigation with ultrasound-assisted registration for pedicle screw placement: a pilot study. Int J Comput Assist Radiol Surg. 2017;12:2205–2215. doi: 10.1007/s11548-017-1652-z.
6. Wu J-R, Wang M-L, Liu K-C, Hu M-H, Lee P-Y. Real-time advanced spinal surgery via visible patient model and augmented reality system. Comput Methods Prog Biomed. 2014;113:869–881. doi: 10.1016/j.cmpb.2013.12.021.
7. Gavaghan K, Oliveira-Santos T, Peterhans M, Reyes M, Kim H, Anderegg S, Weber S. Evaluation of a portable image overlay projector for the visualisation of surgical navigation data: phantom studies. Int J Comput Assist Radiol Surg. 2012;7:547–556. doi: 10.1007/s11548-011-0660-7.
8. Nguyen NQ, Priola SM, Ramjist JM, Guha D, Dobashi Y, Lee K, Lu M, Androutsos D, Yang V. Machine vision augmented reality for pedicle screw insertion during spine surgery. J Clin Neurosci. 2020;72:350–356. doi: 10.1016/j.jocn.2019.12.067.
9. Elmi-Terander A, Burström G, Nachabé R, et al. Augmented reality navigation with intraoperative 3D imaging vs fluoroscopy-assisted free-hand surgery for spine fixation surgery: a matched-control study comparing accuracy. Sci Rep. 2020;10:707. doi: 10.1038/s41598-020-57693-5.
10. Edström E, Burström G, Persson O, et al. Does augmented reality navigation increase pedicle screw density compared to free-hand technique in deformity surgery? Single surgeon case series of 44 patients. Spine (Phila Pa 1976). 2020;45:E1085–E1e90. doi: 10.1097/BRS.0000000000003518.
11. Edström E, Burström G, Omar A, et al. Augmented reality surgical navigation in spine surgery to minimize staff radiation exposure. Spine (Phila Pa 1976). 2020;45:E45–e53. doi: 10.1097/BRS.0000000000003197.
12. Burström G, Nachabe R, Homan R, Hoppenbrouwers J, Holthuizen R, Persson O, Edström E, Elmi-Terander A. Frameless patient tracking with adhesive optical skin markers for augmented reality surgical navigation in spine surgery. Spine (Phila Pa 1976). 2020;45:1598–1604. doi: 10.1097/BRS.0000000000003628.
13. Alexander C, Loeb AE, Fotouhi J, Navab N, Armand M, Khanuja HS. Augmented reality for acetabular component placement in direct anterior total hip arthroplasty. J Arthroplast. 2020;35:1636–41.e3. doi: 10.1016/j.arth.2020.01.025.
14. Weidert S, Wang L, Landes J, et al. Video-augmented fluoroscopy for distal interlocking of intramedullary nails decreased radiation exposure and surgical time in a bovine cadaveric setting. Int J Med Robot Comput Assist Surg. 2019;15:e1995. doi: 10.1002/rcs.1995.
15. Elmi-Terander A, Burström G, Nachabe R, Skulason H, Pedersen K, Fagerlund M, Ståhl F, Charalampidis A, Söderman M, Holmin S, Babic D, Jenniskens I, Edström E, Gerdhem P. Pedicle screw placement using augmented reality surgical navigation with intraoperative 3D imaging. Spine (Phila Pa 1976). 2019;44:517–525. doi: 10.1097/brs.0000000000002876.
16. Cho HS, Park MS, Gupta S, Han I, Kim HS, Choi H, Hong J. Can augmented reality be helpful in pelvic bone cancer surgery? An in vitro study. Clin Orthop Relat Res. 2018;476:1719–1725. doi: 10.1007/s11999.0000000000000233.
17. Fischer M, Fuerst B, Lee SC, Fotouhi J, Habert S, Weidert S, Euler E, Osgood G, Navab N. Preclinical usability study of multiple augmented reality concepts for K-wire placement. Int J Comput Assist Radiol Surg. 2016;11:1007–1014. doi: 10.1007/s11548-016-1363-x.
18. Fallavollita P, Brand A, Wang L, Euler E, Thaller P, Navab N, Weidert S. An augmented reality C-arm for intraoperative assessment of the mechanical axis: a preclinical study. Int J Comput Assist Radiol Surg. 2016;11:2111–2117. doi: 10.1007/s11548-016-1426-z.
19. Londei R, Esposito M, Diotte B, Weidert S, Euler E, Thaller P, Navab N, Fallavollita P. Intra-operative augmented reality in distal locking. Int J Comput Assist Radiol Surg. 2015;10:1395–1403. doi: 10.1007/s11548-015-1169-2.
20. Ponce BA, Jennings JK, Clay TB, May MB, Huisingh C, Sheppard ED. Telementoring: use of augmented reality in orthopaedic education: AAOS exhibit selection. J Bone Joint Surg Am. 2014;96:e84. doi: 10.2106/JBJS.M.00928.
21. U-Thainual P, Fritz J, Moonjaita C, Ungi T, Flammang A, Carrino JA, Fichtinger G, Iordachita I. MR image overlay guidance: system evaluation for preclinical use. Int J Comput Assist Radiol Surg. 2013;8:365–378. doi: 10.1007/s11548-012-0788-0.
22. Shen F, Chen B, Guo Q, Qi Y, Shen Y. Augmented reality patient-specific reconstruction plate design for pelvic and acetabular fracture surgery. Int J Comput Assist Radiol Surg. 2013;8:169–179. doi: 10.1007/s11548-012-0775-5.
23. Yeo CT, Ungi T, U-Thainual P, Lasso A, McGraw RC, Fichtinger G. The effect of augmented reality training on percutaneous needle placement in spinal facet joint injections. IEEE Trans Biomed Eng. 2011;58:2031–2037. doi: 10.1109/tbme.2011.2132131.
24. Navab N, Heining S-M, Traub J. Camera augmented mobile C-arm (CAMC): calibration, accuracy study, and clinical applications. IEEE Trans Med Imaging. 2010;29:1412–1423. doi: 10.1109/tmi.2009.2021947.
25. Fischer GS, Deguet A, Csoma C, Taylor RH, Fayad L, Carrino JA, Zinreich SJ, Fichtinger G. MRI image overlay: application to arthrography needle insertion. Comput Aided Surg. 2007;12:2–14. doi: 10.3109/10929080601169930.
26. Fichtinger G, Deguet A, Masamune K, Balogh E, Fischer GS, Mathieu H, Taylor RH, Zinreich SJ, Fayad LM. Image overlay guidance for needle insertion in CT scanner. IEEE Trans Biomed Eng. 2005;52:1415–1424. doi: 10.1109/tbme.2005.851493.
27. Dibble CF, Molina CA. Device profile of the XVision-spine (XVS) augmented-reality surgical navigation system: overview of its safety and efficacy. Expert Rev Med Devices. 2021;18:1–8. doi: 10.1080/17434440.2021.1865795.
28. Urakov TM. Augmented reality-assisted pedicle instrumentation: versatility across major instrumentation sets. Spine (Phila Pa 1976). 2020;45:E1622–E16e6. doi: 10.1097/BRS.0000000000003669.
29. Dennler C, Jaberg L, Spirig J, et al. Augmented reality-based navigation increases precision of pedicle screw insertion. J Orthop Surg Res. 2020;15:174. doi: 10.1186/s13018-020-01690-x.
30. Urakov TM, Wang MY, Levi AD. Workflow caveats in augmented reality-assisted pedicle instrumentation: cadaver lab. World Neurosurg. 2019;126:e1449–e1e55. doi: 10.1016/j.wneu.2019.03.118.
31. Gibby JT, Swenson SA, Cvetko S, Rao R, Javan R. Head-mounted display augmented reality to guide pedicle screw placement utilizing computed tomography. Int J Comput Assist Radiol Surg. 2019;14:525–535. doi: 10.1007/s11548-018-1814-7.
32. Liu H, Auvinet E, Giles J, Rodriguez Y, Baena F. Augmented reality based navigation for computer assisted hip resurfacing: a proof of concept study. Ann Biomed Eng. 2018;46:1595–1605. doi: 10.1007/s10439-018-2055-1.
33. Hiranaka T, Fujishiro T, Hida Y, Shibata Y, Tsubosaka M, Nakanishi Y, Okimura K, Uemoto H. Augmented reality: the use of the PicoLinker smart glasses improves wire insertion under fluoroscopy. World J Orthop. 2017;8:891–894. doi: 10.5312/wjo.v8.i12.891.
34. Lungu AJ, Swinkels W, Claesen L, Tu P, Egger J, Chen X. A review on the applications of virtual reality, augmented reality and mixed reality in surgical simulation: an extension to different kinds of surgery. Expert Rev Med Devices. 2021;18:47–62. doi: 10.1080/17434440.2021.1860750.
35. Keating TC, Jacobs JJ. Augmented reality in orthopedic practice and education. Orthop Clin North Am. 2021;52:15–26. doi: 10.1016/j.ocl.2020.08.002.
36. Park BJ, Hunt SJ, Martin C 3rd, Nadolski GJ, Wood BJ, Gade TP. Augmented and mixed reality: technologies for enhancing the future of IR. J Vasc Interv Radiol. 2020;31:1074–1082. doi: 10.1016/j.jvir.2019.09.020.
37. Nguyen NQ, Cardinell J, Ramjist JM, Lai P, Dobashi Y, Guha D, Androutsos D, Yang VXD. An augmented reality system characterization of placement accuracy in neurosurgery. J Clin Neurosci. 2020;72:392–396. doi: 10.1016/j.jocn.2019.12.014.
38. Vávra P, Roman J, Zonča P, Ihnát P, Němec M, Kumar J, Habib N, el-Gendi A. Recent development of augmented reality in surgery: a review. J Healthc Eng. 2017;2017:4574172–4574179. doi: 10.1155/2017/4574172.
39. Bradley MP, Benson JR, Muir JM. Accuracy of acetabular component positioning using computer-assisted navigation in direct anterior total hip arthroplasty. Cureus. 2019;11:e4478. doi: 10.7759/cureus.4478.
40. Sadrameli SS, Jafrani R, Staub BN, Radaideh M, Holman PJ. Minimally invasive, stereotactic, wireless, percutaneous pedicle screw placement in the lumbar spine: accuracy rates with 182 consecutive screws. Int J Spine Surg. 2018;12:650–658. doi: 10.14444/5081.
41. Van Duren BH, Sugand K, Wescott R, Carrington R, Hart A. Augmented reality fluoroscopy simulation of the guide-wire insertion in DHS surgery: a proof of concept study. Med Eng Phys. 2018;55:52–59. doi: 10.1016/j.medengphy.2018.02.007.
42. Andress S, Johnson A, Unberath M, Winkler AF, Yu K, Fotouhi J, Weidert S, Osgood G, Navab N. On-the-fly augmented reality for orthopedic surgery using a multimodal fiducial. J Med Imaging (Bellingham). 2018;5(2):021209. doi: 10.1117/1.jmi.5.2.021209.
43. Fotouhi J, Alexander CP, Unberath M, Taylor G, Lee SC, Fuerst B, Johnson A, Osgood GM, Taylor RH, Khanuja H, Armand M, Navab N. Plan in 2-D, execute in 3-D: an augmented reality solution for cup placement in total hip arthroplasty. J Med Imaging (Bellingham). 2018;5(2):021205. doi: 10.1117/1.jmi.5.2.021205.
44. Pokhrel S, Alsadoon A, Prasad PWC, Paul M. A novel augmented reality (AR) scheme for knee replacement surgery by considering cutting error accuracy. Int J Med Robot. 2019;15:e1958. doi: 10.1002/rcs.1958.
45. Mentis HM, Chellali A, Manser K, Cao CGL, Schwaitzberg SD. A systematic review of the effect of distraction on surgeon performance: directions for operating room policy and surgical training. Surg Endosc. 2016;30:1713–1724. doi: 10.1007/s00464-015-4443-z.
46. Cleary K, Peters TM. Image-guided interventions: technology review and clinical applications. Annu Rev Biomed Eng. 2010;12:119–142. doi: 10.1146/annurev-bioeng-070909-105249.
47. Liao H, Ishihara H, Tran HH, Masamune K, Sakuma I, Dohi T. Precision-guided surgical navigation system using laser guidance and 3D autostereoscopic image overlay. Comput Med Imaging Graph. 2010;34:46–54. doi: 10.1016/j.compmedimag.2009.07.003.
48. Reaungamornrat S, Otake Y, Uneri A, Schafer S, Mirota DJ, Nithiananthan S, Stayman JW, Kleinszig G, Khanna AJ, Taylor RH, Siewerdsen JH. An on-board surgical tracking and video augmentation system for C-arm image guidance. Int J Comput Assist Radiol Surg. 2012;7:647–665. doi: 10.1007/s11548-012-0682-9.
49. Lee SC, Fuerst B, Fotouhi J, Fischer M, Osgood G, Navab N. Calibration of RGBD camera and cone-beam CT for 3D intra-operative mixed reality visualization. Int J Comput Assist Radiol Surg. 2016;11:967–975. doi: 10.1007/s11548-016-1396-1.
50. Ma L, Fan Z, Ning G, Zhang X, Liao H. 3D visualization and augmented reality for orthopedics. Adv Exp Med Biol. 2018;1093:193–205. doi: 10.1007/978-981-13-1396-7_16.
51. Liao H, Hata N, Nakajima S, Iwahara M, Sakuma I, Dohi T. Surgical navigation by autostereoscopic image overlay of integral videography. IEEE Trans Inf Technol Biomed. 2004;8:114–121. doi: 10.1109/titb.2004.826734.
52. Siemionow KB, Katchko KM, Lewicki P, Luciano CJ. Augmented reality and artificial intelligence-assisted surgical navigation: technique and cadaveric feasibility study. J Craniovertebr Junction Spine. 2020;11:81–85. doi: 10.4103/jcvjs.JCVJS_48_20.
53. Baumgartner R, Libuit K, Ren D, et al. Reduction of radiation exposure from C-arm fluoroscopy during orthopaedic trauma operations with introduction of real-time dosimetry. J Orthop Trauma. 2016;30:e53–e8. https://journals.lww.com/jorthotrauma/Fulltext/2016/02000/Reduction_of_Radiation_Exposure_From_C_Arm.11.aspx. Accessed 14 July 2021.
54. Yu E, Khan SN. Does less invasive spine surgery result in increased radiation exposure? A systematic review. Clin Orthop Relat Res. 2014;472:1738–1748. doi: 10.1007/s11999-014-3503-3.
55. Yeo CH, Gordon R, Nusem I. Improving operating theatre communication between the orthopaedics surgeon and radiographer. ANZ J Surg. 2014;84:316–319. doi: 10.1111/ans.12482.
56. Müller MC, Strauss A, Pflugmacher R, Nähle CP, Pennekamp PH, Burger C, Wirtz DC. Evaluation of radiation exposure of personnel in an orthopaedic and trauma operation theatre using the new real-time dosimetry system "dose aware". Z Orthop Unfall. 2014;152:381–388. doi: 10.1055/s-0034-1368603.
57. Bronsard N, Boli T, Challali M, de Dompsure R, Amoretti N, Padovani B, Bruneton G, Fuchs A, de Peretti F. Comparison between percutaneous and traditional fixation of lumbar spine fracture: intraoperative radiation exposure levels and outcomes. Orthop Traumatol Surg Res. 2013;99:162–168. doi: 10.1016/j.otsr.2012.12.012.
58. Lee K, Lee KM, Park MS, Lee B, Kwon DG, Chung CY. Measurements of surgeons’ exposure to ionizing radiation dose during intraoperative use of C-arm fluoroscopy. Spine (Phila Pa 1976). 2012;37:1240–1244. https://journals.lww.com/spinejournal/Fulltext/2012/06150/Measurements_of_Surgeons__Exposure_to_Ionizing.9.aspx. Accessed 14 July 2021.
59. Müller LP, Suffner J, Wenda K, Mohr W, Rommens PM. Radiation exposure to the hands and the thyroid of the surgeon during intramedullary nailing. Injury. 1998;29:461–468. doi: 10.1016/S0020-1383(98)00088-6.
60. Blattert TR, Fill UA, Kunz E, Panzer W, Weckbach A, Regulla DF. Skill dependence of radiation exposure for the orthopaedic surgeon during interlocking nailing of long-bone shaft fractures: a clinical study. Arch Orthop Trauma Surg. 2004;124:659–664. doi: 10.1007/s00402-004-0743-9.
61. Gausden EB, Christ AB, Zeldin R, Lane JM, McCarthy MM. Tracking cumulative radiation exposure in orthopaedic surgeons and residents: what dose are we getting? J Bone Joint Surg Am. 2017;99:1324–1329. doi: 10.2106/JBJS.16.01557.
62. Mehlman CT, DiPasquale TG. Radiation exposure to the orthopaedic surgical team during fluoroscopy: "how far away is far enough?". J Orthop Trauma. 1997;11:392–398. doi: 10.1097/00005131-199708000-00002.
63. Von Der Heide AM, Fallavollita P, Wang L, et al. Camera-augmented mobile C-arm (CamC): a feasibility study of augmented reality imaging in the operating room. Int J Med Robot Comput Assist Surg. 2018;14:e1885. doi: 10.1002/rcs.1885.
64. Elmi-Terander A, Nachabe R, Skulason H, Pedersen K, Söderman M, Racadio J, Babic D, Gerdhem P, Edström E. Feasibility and accuracy of thoracolumbar minimally invasive pedicle screw placement with augmented reality navigation technology. Spine (Phila Pa 1976). 2018;43:1018–1023. doi: 10.1097/brs.0000000000002502.
65. Burström G, Nachabe R, Persson O, Edström E, Elmi Terander A. Augmented and virtual reality instrument tracking for minimally invasive spine surgery: a feasibility and accuracy study. Spine (Phila Pa 1976). 2019;44:1097–1104. doi: 10.1097/BRS.0000000000003006.
66. Kraus M, Weiskopf J, Dreyhaupt J, Krischak G, Gebhard F. Computer-aided surgery does not increase the accuracy of dorsal pedicle screw placement in the thoracic and lumbar spine: a retrospective analysis of 2,003 pedicle screws in a level I trauma center. Global Spine J. 2015;5:93–101. doi: 10.1055/s-0034-1396430.
67. Doke T, Liang JT, Onogi S, Nakajima Y. Fluoroscopy-based laser guidance system for linear surgical tool insertion depth control. Int J Comput Assist Radiol Surg. 2015;10:275–283. doi: 10.1007/s11548-014-1079-8.
68. Hawi N, Liodakis E, Suero EM, Stuebig T, Citak M, Krettek C. Radiological outcome and intraoperative evaluation of a computer-navigation system for femoral nailing: a retrospective cohort study. Injury. 2014;45:1632–1636. doi: 10.1016/j.injury.2014.05.039.
69. Wilharm A, Marintschev I, Hofmann GO, Gras F. 2D-fluoroscopic based navigation for Gamma 3 nail insertion versus conventional procedure- a feasibility study. BMC Musculoskelet Disord. 2013;14:74. doi: 10.1186/1471-2474-14-74.
70. Hamming NM, Daly MJ, Irish JC, Siewerdsen JH. Automatic image-to-world registration based on x-ray projections in cone-beam CT-guided interventions. Med Phys. 2009;36:1800–1812. doi: 10.1118/1.3117609.
71. Moreschini O, Petrucci V, Cannata R. Insertion of distal locking screws of tibial intramedullary nails: a comparison between the free-hand technique and the SURESHOT™ Distal Targeting System. Injury. 2014;45:405–407. doi: 10.1016/j.injury.2013.09.023.
72. Elmi-Terander A, Burström G, Nachabe R, et al. Pedicle screw placement using augmented reality surgical navigation with intraoperative 3D imaging: a first in-human prospective cohort study. Spine (Phila Pa 1976). 2019;44:517–525. doi: 10.1097/BRS.0000000000002876.
73. Carl B, Bopp M, Saß B, Nimsky C. Microscope-based augmented reality in degenerative spine surgery: initial experience. World Neurosurg. 2019;128:e541–ee51. doi: 10.1016/j.wneu.2019.04.192.
74. Cho HS, Park YK, Gupta S, Yoon C, Han I, Kim HS, Choi H, Hong J. Augmented reality in bone tumour resection: an experimental study. Bone Joint Res. 2017;6:137–143. doi: 10.1302/2046-3758.63.BJR-2016-0289.R1.
75. Agten CA, Dennler C, Rosskopf AB, Jaberg L, Pfirrmann CWA, Farshad M. Augmented reality-guided lumbar facet joint injections. Investig Radiol. 2018;53:495–498. doi: 10.1097/RLI.0000000000000478.
76. Ogawa H, Hasegawa S, Tsukada S, Matsubara M. A pilot study of augmented reality technology applied to the acetabular cup placement during total hip arthroplasty. J Arthroplast. 2018;33:1833–1837. doi: 10.1016/j.arth.2018.01.067.
77. Abe Y, Sato S, Kato K, Hyakumachi T, Yanagibashi Y, Ito M, Abumi K. A novel 3D guidance system using augmented reality for percutaneous vertebroplasty. J Neurosurg Spine. 2013;19:492–501. doi: 10.3171/2013.7.spine12917.
78. Su P, Zhang W, Peng Y, Liang A, Du K, Huang D. Use of computed tomographic reconstruction to establish the ideal entry point for pedicle screws in idiopathic scoliosis. Eur Spine J. 2012;21:23–30. doi: 10.1007/s00586-011-1962-8.
79. Chan A, Parent E, Narvacan K, San C, Lou E. Intraoperative image guidance compared with free-hand methods in adolescent idiopathic scoliosis posterior spinal surgery: a systematic review on screw-related complications and breach rates. Spine J. 2017;17:1215–1229. doi: 10.1016/j.spinee.2017.04.001.
80. Jin M, Liu Z, Qiu Y, Yan H, Han X, Zhu Z. Incidence and risk factors for the misplacement of pedicle screws in scoliosis surgery assisted by O-arm navigation—analysis of a large series of one thousand, one hundred and forty five screws. Int Orthop. 2017;41:773–780. doi: 10.1007/s00264-016-3353-6.
81. Fichtner J, Hofmann N, Rienmüller A, Buchmann N, Gempt J, Kirschke JS, Ringel F, Meyer B, Ryang YM. Revision rate of misplaced pedicle screws of the thoracolumbar spine–comparison of three-dimensional fluoroscopy navigation with freehand placement: a systematic analysis and review of the literature. World Neurosurg. 2018;109:e24–e32. doi: 10.1016/j.wneu.2017.09.091.