Abstract
Augmented reality (AR) navigation refers to novel technologies that superimpose images, such as radiographs and navigation pathways, onto a view of the operative field. The development of AR navigation has focused on improving the safety and efficacy of neurosurgical and orthopedic procedures. In this review, the authors focus on 3 types of AR technology used in spine surgery: AR surgical navigation, microscope-mediated heads-up display, and AR head-mounted displays. Microscope AR and head-mounted displays offer the advantage of reducing attention shift and line-of-sight interruptions inherent in traditional navigation systems. With the U.S. Food and Drug Administration’s recent clearance of the XVision AR system (Augmedics, Arlington Heights, IL), the adoption and refinement of AR technology by spine surgeons will only accelerate.
Keywords: augmented reality, mixed reality, computer-assisted spine surgery, spine navigation, accuracy
Introduction
Augmented reality (AR) surgical technology refers to devices that integrate computer-generated images with intraoperative visualization of the surgical field. Early work in the 1980s focused on the design of a system that could display computed tomography (CT) imaging in an operating room microscope, but the accuracy and resolution of these images were limited by the prevailing technology [32]. Research in the 1990s saw further development of microscope-mediated AR technology that included improved alignment accuracy and graphical representation of the 3D images, along with automatic calibration [9,20]. An early head-mounted display (HMD) system was developed concurrently and featured a headpiece that projected AR images onto the surgeon’s visual field, eliminating the need for a microscope [3].
The past decade has witnessed substantial progress in the development of sophisticated AR tools that can be implemented in surgery. In the fields of neurosurgery and orthopedic surgery, AR has been investigated most often in the context of spine instrumentation and pedicle screw placement. While freehand techniques have traditionally been used for screw placement, navigating anatomical landmarks can be technically challenging. Estimated accuracy of freehand techniques ranges from 93.0% to 98.5%, and errors in screw placement are associated with significant morbidity, including neurological and vascular injuries [2,23,31]. These risks vary with the severity of the underlying pathology, with studies estimating a malpositioning rate of 15% in patients with severe spinal deformity [18]. Navigational systems have helped improve accuracy and minimize screw misplacement, but these systems are limited by attention shift and line-of-sight interruption. Attention shift refers to the need to look away from the patient to focus on an external monitor, while line-of-sight interruption refers to any item that can interfere with intraoperative navigation by blocking the tracking markers or camera. The advent of robotic-assisted spine surgery has also been helpful in improving accuracy, as preplanned trajectories can be locked on target by rigid arms. However, robotic systems are often associated with a high cost of acquisition [16,17,38]. AR technology is designed to address these limitations and improve the screw accuracy rate, while offering surgeons real-time feedback on precise anatomical locations.
In addition to pedicle screw placement, AR has been reported to assist in the placement of K-wires and acetabular cups, percutaneous placement of sacroiliac screws, and orthopedic tumor resection [6,12,13,14,39]. AR is also being explored and used across several surgical fields, including neurosurgery, orthopedic surgery, general surgery, cardiovascular surgery, and otolaryngology [34]. Commercial HMD applications, such as Google Glass, Optinvent, and Microsoft HoloLens, are available for use across several occupations; their utility in surgical settings is being investigated. AR has also emerged as a tool for resident and medical student education [19].
In this review, we provide an overview of 3 main types of AR technology used in spine surgery: AR navigation using an operating room monitor, microscope-mediated heads-up display (HUD), and AR-HMD. We note the advantages and limitations of each category, offer clinical pearls for use of AR-HMD, and conclude by discussing the future of AR and potential new applications. These technologies are summarized in Table 1.
Table 1.
Technology for augmented reality for spine surgery.
| Technology | Features | Outcomes | Advantages | Disadvantages |
|---|---|---|---|---|
| ARSN | C-arm detector system with integrated optical video cameras | Cadaveric: 85–94% accuracy; patients: 94.1% accuracy, 1.2% screw revision rate [31,11,12] | Skin markers for tracking, limits LOS interruptions | Attention shift, requirement of a hybrid OR |
| Microscope HUD | Intraoperative imaging and navigation data integrated into microscope display | Patients: 0.87–1.11 mm registration error [6,5] | Limits attention shift and LOS interruption, extensive history and case reports, limited radiation | Only feasible in microsurgery, no published screw accuracy data |
| HMDs | Wearable headpiece | — | Limits attention shift and LOS interruption | Mechanical discomfort, learning curve, sensory overload |
| Moverio | Transparent binocular display with camera, Wi-Fi, Bluetooth | Phantom: axial plane EIA 2.09° ± 1.3°, sagittal plane EIA 1.98° ± 1.8° [1] | Lightweight, flexible hinge design | Limited data |
| Google Glass | Android operating system, camera, glasses, Wi-Fi, Bluetooth | Patients: 15% decrease in operating time; 90% of surgeons reported positive outcomes [42,37] | Lightweight (35 g), voice controlled, live streaming, displays patient records | HIPAA compliance, battery life (2 hours), processing speed, display size |
| HoloLens | See-through lenses, 4 light cameras, computer, Wi-Fi, Bluetooth | Phantom: 20% decrease in total operating time [41]; cadaveric: MAE 4.3° ± 2.3°, MTE 3.4 ± 1.6 mm [29] | Mixed reality, voice controlled, hand tracking, adjustable interpupillary distance | Large frames, battery life (2–3 hours), FDA clearance only for preoperative planning |
| XVision | Optical tracking camera, integrated graphics processor, headlight, and transparent near-eye displays that project onto the surgeon's retina | Cadaveric: 96.7–99.1% accuracy [26,27]; patients: 98.0% accuracy [24] | FDA clearance, vendor and implant agnostic, adjustable, published clinical data | Limited clinical results, intraoperative CT scan required, registration markers unavailable for certain rigid locations |

ARSN augmented reality surgical navigation, LOS line-of-sight, OR operating room, HUD heads-up display, HMD head-mounted display, EIA error of insertion angle, MAE mean angular error between trajectories, MTE mean translational error between entry points, FDA U.S. Food & Drug Administration, CT computed tomography.
Augmented Reality Surgical Navigation
Augmented reality surgical navigation (ARSN) enhances traditional image guidance by using video cameras to combine intraoperative 3-dimensional (3D) imaging with a navigation path for screw placement [10]. Elmi-Terander et al [11] developed one such device that consists of a C-arm flat detector system with optical video cameras integrated into the detector panel and directed toward the isocenter of the C-arm. The video cameras provide video input, and optical markers are placed on the skin around the incision to allow for continuous patient tracking. The C-arm uses 3D CT to plan and guide screw placement and trajectory, and it automatically segments the vertebrae and pedicles. The optimal navigation path is specified by the operator. The CT and navigation path are then overlaid onto a monitor that streams a video of the procedure. The monitor shows navigation paths overlaid on images of the patient's bony anatomy to aid in planning, alignment, and placement of the pedicle screws. The authors compared accuracy of pedicle screw placement in the thoracic spine of a cadaveric model between ARSN and freehand techniques and reported overall accuracy of 85% using ARSN compared with 64% accuracy using the freehand technique [11]. Peh et al [30] conducted a cadaveric study using a similar system, and reported pedicle screw placement accuracy of 94% with ARSN compared with 88% with fluoroscopy, although this difference was not statistically significant.
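Conceptually, overlaying a planned navigation path on the video stream requires projecting 3D points from CT/navigation space into the tracked camera's 2D image. The following is a minimal sketch of that projection step, using a generic pinhole-camera model; it is illustrative only, not the vendor's implementation, and every name and parameter here is an assumption for demonstration.

```python
import numpy as np

def project_to_video(points_3d, R, t, fx, fy, cx, cy):
    """Project 3D points (navigation space, mm) into 2D pixel coordinates.

    R, t  : rigid pose mapping navigation space into camera space
            (estimated in practice from the tracked optical markers)
    fx,fy : camera focal lengths in pixels; cx,cy : principal point
    """
    cam = points_3d @ R.T + t        # navigation -> camera coordinates
    x = cam[:, 0] / cam[:, 2]        # perspective divide
    y = cam[:, 1] / cam[:, 2]
    return np.stack([fx * x + cx, fy * y + cy], axis=1)

# A planned screw trajectory, sampled as points along its axis, can be
# projected this way each frame and drawn over the video feed.
```

The same transform would be re-estimated continuously from the skin markers, which is why the system tolerates partial marker occlusion.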
More recently, Elmi-Terander et al used their ARSN system to place 253 pedicle screws in the thoracic and lumbosacral spines of 20 patients. They achieved an overall accuracy of 94.1% with an intraoperative screw revision rate of 1.2%, and no adverse events occurred [10]. No comparison group was included in their study, but the authors argued that their accuracy rate and navigation time were comparable to those reported in the literature. A later study by Edström et al [8] reported low patient and staff radiation exposure using ARSN, which they attributed to real-time dosimeter feedback, a larger field of view, and a reduced need for postoperative CT.
Augmented reality surgical navigation offers several advantages. The system relies on skin markers for patient tracking, eliminating the need for placement of a reference frame on the spine. The 4 cameras integrated into the C-arm panel are able to maintain vision as long as at least 5 of the skin markers are visible from the cameras, limiting line-of-sight interruptions [10]. These systems also offer intraoperative imaging capability, allowing for detection of misaligned screws and helping avoid the need for revision surgery [30]. However, attention shift remains a disadvantage of ARSN, as the surgeon must look up at a monitor to observe the AR-enhanced images, and this introduces a distraction from the operative field. The technology is also limited by the resolution capabilities of the remote screen and by the availability of a hybrid operating room and an ARSN system. The system described by Elmi-Terander et al [10] is sold by Philips but is not currently available in the United States, pending clearance by the Food and Drug Administration (FDA).
Microscope-Mediated Heads-up Display
Microscope-mediated HUDs attempt to minimize the issue of attention shift by integrating AR images directly into the operating microscope. The surgeon looks through the microscope and can simultaneously visualize the operative field and 3D navigation images without having to look up at a monitor [4]. HUDs have been in development for several decades, with initial reports in the 1980s of devices that could superimpose preoperative CT data onto a microscope's view of the surgical field. Roberts et al assessed the feasibility and quality of such a system in craniotomies for meningioma and glioblastoma multiforme. However, the technological limitations of the era precluded HUD use in procedures, as the available computing power could not deliver sufficient image resolution and reformatting speed [32]. Improvements continued into the 2000s, at which point several commercial microscope-based HUDs were available to supplement traditional navigation systems. Nakamura et al [29] described successful image-guided microsurgery for cerebral arteriovenous malformations using a microscope-based HUD that superimposed the extent of the malformation and draining vein onto the surgeon's view of the operative field. Nonetheless, these early reports featured substantial technological limitations.
Spinal navigation is more challenging than cranial navigation, in part due to the flexibility of the spine and difficulties in the registration process. In fact, until recently, there were few reports on using HUDs for spine surgery as new technology had to emerge to overcome such limitations. For instance, Kosterhon et al [21] developed a microscope HUD to assist in visualization of resection planes for osteotomies, and used paired point matching based on anatomical landmarks to overcome challenges in spinal registration. The navigated osteotomy was preplanned and performed entirely in 3D, and the authors argue that the HUD improves resection accuracy and can be useful in complicated spinal procedures. Similarly, Umebayashi et al generated AR models using intraoperative cone beam CT imaging and integrated the navigation data to their microscope. They demonstrated that microscope HUD is feasible and safe for transvertebral anterior cervical foraminotomy and posterior cervical laminoforaminotomy [36].
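The paired-point matching mentioned above is typically solved as a rigid least-squares fit between corresponding anatomical landmarks identified on the preoperative scan and on the patient. As a hedged illustration, assuming the standard Kabsch (SVD-based) algorithm rather than any specific vendor's method, the fit and its residual registration error can be sketched as:

```python
import numpy as np

def paired_point_registration(source, target):
    """Least-squares rigid transform (R, t) mapping source landmarks onto
    target landmarks, via the Kabsch algorithm. Inputs are (N, 3) arrays."""
    src_c = source.mean(axis=0)
    tgt_c = target.mean(axis=0)
    H = (source - src_c).T @ (target - tgt_c)   # 3x3 cross-covariance
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))      # guard against reflection
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = tgt_c - R @ src_c
    return R, t

def registration_error(source, target, R, t):
    """Root-mean-square residual after applying the fitted transform."""
    mapped = source @ R.T + t
    return float(np.sqrt(((mapped - target) ** 2).sum(axis=1).mean()))
```

Residuals of this kind are what reports such as Carl et al's 0.87 ± 0.28 mm registration error quantify; spine registration is harder than cranial registration largely because intersegmental motion violates the rigid-body assumption.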
Additional investigations into microscope HUDs were performed by Carl et al [6,4], who reported successful implementation of a microscope HUD for spine surgery for degenerative spine conditions and resection of spinal tumors. They used a navigation camera to track both the microscope and a reference array attached to the patient, and AR provided 4-dimensional outlines of the bony anatomy and tumor extent. Much of the workflow was automated, and they simultaneously displayed the microscope video and superimposed AR images on operating room monitors. More recently, Carl et al [5] summarized their experience using a microscope HUD on 42 patients with pathologies including spinal deformity, degeneration, and neoplasms. They reported successful use of AR in all 42 patients, with an average of 7.1 objects segmented for display, including target tumors and spinal vertebrae, along with a low registration error (0.87 ± 0.28 mm). The authors state that microscope-based AR is reliable and effective across a variety of surgical procedures, and argue that it can also be used as an education tool for residents and students.
Augmented reality HUD technology is relatively understudied in the orthopedic literature. Automatic registration ensures high accuracy, and its use has been successfully reported in several surgical procedures. It can be readily integrated into the surgical workflow and substantially reduces attention shift compared with ARSN. Surgeons who are familiar with microscopic surgery may find microscope-mediated AR to be simpler than other AR modalities. Still, the system would benefit from increased display resolution and technological improvements in the AR rendering. A substantial limitation is that an AR HUD device is only feasible for microscopic surgery [5].
Head-Mounted Display
A head-mounted display (HMD) refers to a device worn on the head, most commonly a headset, that projects AR onto a small optical display or directly onto the surgeon’s retina. The setup allows surgeons to simultaneously view the operative field and an AR hologram, which generally consists of radiographic images or 3D computer renderings. It has the advantage of minimizing 2 major limitations in other AR technologies: attention shift and line-of-sight interruption. There are substantial engineering challenges associated with integrating a tracking camera with an HMD device; as a result, only the XVision System (XVS) has received 510(k) clearance by the FDA [7]. The following sections describe current applications of AR-HMD in spine surgery.
Moverio BT-35E Smart Glasses
In 2013, Abe et al designed a novel AR guidance system using the Epson Moverio, a see-through binocular LCD HMD, for 3D visualization of needle trajectory during percutaneous vertebroplasty [1]. Preoperative needle trajectories developed from CT scans were overlaid onto a detected marker on the patient's skin. The technology was tested in 40 phantom trials as well as in 5 patients who underwent surgery for osteoporotic vertebral fractures. No pedicle breaches were reported, and the accuracy was similar to that of conventional fluoroscopy-assisted procedures. However, the authors noted that technological limitations preclude its use in complicated deformities, and that confirmation of needle tip position required fluoroscopic guidance. More recently, Saylany et al used the Moverio HMD to continuously view intraoperative radiographs while performing single-level arthroplasty for right-sided disc herniation repair; the surgeon was able to complete the surgery successfully, though there were challenges noted with switching between surgical loupes and the glasses [33].
Google Glass
Google Glass is an optical HMD that can be secured to surgical loupes and has been used in both non-surgical and surgical settings. A recent systematic review by Wei et al [41] identified 27 studies examining intraoperative use of Google Glass across several surgical specialties, with uses including recording and sharing procedures, displaying patient records, and monitoring vital signs. Yoon et al were the first to report using Google Glass to assist in spine instrumentation by displaying neuronavigation images and live video in the upper right corner of the surgeon’s vision. They found that operative time was reduced to 4.13 minutes per screw using Google Glass compared with a baseline of 4.86 minutes per screw; 90% of surgeons reported that the HMD improved focus on the patient, as they no longer had to turn their heads to gaze at an external monitor. Google Glass has been discontinued for retail purposes and its availability is therefore limited; the new Enterprise Edition model is available only for industrial use and professional applications. Additionally, further refinement of the technology is necessary, as it is currently limited by its short battery life of 2 hours, its image resolution and processing speed, and small display size [42].
Microsoft HoloLens
The HoloLens is a widely studied, see-through HMD developed by Microsoft that consists of mixed-reality smartglasses that merge virtual reality and AR. The headset combines several sensors and an integrated computer, and the interpupillary distance can be calibrated for each user. It responds to gestures and voice commands and allows the user to manipulate and visualize 3D holograms. Tepper et al [35] argue that its design, functionality, and technical capabilities are advantageous compared to Google Glass.
Wanivenhaus et al customized the HoloLens to capture the different locations of pedicle screws and generate a 3D holographic rod template that was tested in a lumbosacral spine model [40]. The hologram could be moved and rotated, and served as a template for bending the implant. They found that the HoloLens decreased the total time spent on bending and inserting the rod by 20% while improving the accuracy of manual rod bending. They argue that the HMD is both simple and cost-effective, but note that additional technical work is required for clinical applications.
Additional studies by Gibby et al and Liebmann et al describe using the HoloLens to guide pedicle screw placement in lumbar phantom models. CT imaging was used to develop preprocedural planning of the appropriate angle and depth for screw placement and was integrated into the HoloLens, allowing the surgeon to superimpose the virtual trajectory guides and CT images on the phantom model. Both studies found promising results in terms of accuracy and time required to place the needles, but they noted that a real-time tracking system, rather than static CT images, should be integrated for use in patients [15,22]. A more recent cadaveric study by Müller et al [28] evaluated pedicle screw instrumentation when combining 3D intraoperative fluoroscopy with the HoloLens HMD, and found that it achieved similar accuracy to a gold standard high-end tracking system.
The Microsoft HoloLens is a sophisticated HMD that offers hands-free use and manipulation of 3D images in the intraoperative field. The device can potentially be used for both preoperative planning and intraoperative navigation, and can also display 2D content, such as PDF files and webpages showing the operative protocol. However, it offers only 2 to 3 hours of battery life and is FDA-cleared only for preoperative surgical planning [35]. Nonetheless, the technology is only 5 years old, and technological improvements by Microsoft may someday allow for its use in actual procedures.
XVision
Finally, the XVision system (XVS) is a new AR-HMD that features an adjustable headset, built-in tracking system, and integrated headlight. Its transparent headset projects holograms directly onto the surgeon's retina, allowing for 3D superimposition of the bony anatomy over the real spine. The anatomy is obtained via automated segmentation of an intraoperative cone beam CT scan, and the surgeon can also see 2-dimensional (2D) sagittal and axial projections within the headset [7]. Cadaveric studies have shown accuracy rates ranging from 96.7% to 99.1% in open thoracolumbar pedicle screw insertion [25,27]. The FDA granted its first 510(k) clearance of an AR-HMD to the XVS for use in thoracolumbar pedicle screw insertion. Shortly thereafter, Molina et al [26] published a case report of the first-in-human use of AR-HMD, showing clinical accuracy of 100% using the Gertzbein-Robbins grading scale. Recently, Liu et al published their experience with 205 consecutively placed pedicle screws using the XVS on 28 patients with heterogeneous pathologies including deformity, degeneration, tumor, and trauma. They found an overall accuracy of 98.0% for screw placement, in line with the reported accuracy of navigation. More cases are needed to evaluate the XVS, including direct comparisons with navigation techniques. Nonetheless, it illustrates the potential for incorporating AR-HMD into spine surgery, and the XVS is being evaluated for surgical use beyond pedicle screw insertion. Current potential uses include the planning of spinal tumor resections and osteotomies and placement of S2-alar-iliac screws [26].
The XVS has the major advantage of minimizing line-of-sight interruption by using an integrated tracking camera built into the HMD, which excludes external obstructing sources and minimizes attention shift, as the surgeon does not need to look up at a remote monitor [25,27]. Additionally, the XVS is vendor-agnostic and can be used with any pedicle screw instrumentation system. It can be used for both open and minimally invasive surgery [7,26]. Nonetheless, published data are only available from a single institution. The XVS is in its nascency, and several technical limitations have been noted, including mechanical and visual discomfort, visual obstruction of anatomical structures by holographic images, and the need for intraoperative, rather than preoperative, CT scan for registration. Initial use of the system may result in sensory overload, and a learning curve is to be expected [7,26]. Additionally, the interspinous clamp is placed in the operative field and may be accidentally disrupted. As improvements are made to the system, many of these limitations will likely be resolved in future iterations.
The use of AR-HMD in spine surgery is rapidly evolving. AR-HMD can minimize line-of-sight interruption and attention shift, decrease operative time, increase surgical accuracy, and reduce radiation exposure. Several technologies are being actively explored for surgical use, and additional human trials incorporating AR-HMD are expected in the coming years.
Technical Pearls for Using AR-HMD
A surgeon’s initial experience with an AR-HMD can be disorienting due to the mixing of real visual input with holographic data projected onto the retina, but the user will likely adapt to this new modality of visual stimulation. Additionally, the holographic data can at times obscure visualization of anatomy and the surgical field, an issue compounded by the current version’s inability to accommodate surgical headlights. The surgeon can adjust the position of the lenses or can toggle the holographic images on and off using a foot pedal to provide an unobscured view as needed [26]. In our experience with the XVision system, we have elected to drill the pilot holes using a surgical headlight before the intraoperative registration CT. We then verify the location of the holes using the XVision system and adjust if needed [23].
Future Directions
Recent advances in AR technology promise an exciting future for spine surgery. We anticipate that the next several years will see increased competition and innovation within the AR field, along with integration of AR and robotic technology. Future versions of an AR-HMD device can improve the surgeon's ability to interact with and manipulate the holographic display and radiographic scans. The technology can also be enhanced to alert surgeons when they are approaching critical structures, such as nerves and blood vessels, providing real-time feedback that can reduce intraoperative complications. Eventually, the HMD may be consolidated into a simple pair of glasses or a contact lens, thus reducing visual and mechanical discomfort. Furthermore, AR technology has the potential to serve as a valuable tool for preoperative planning and to enrich education for residents and medical students. Several AR machines have already been implemented in medical education, including the ImmersiveTouch System, which provides haptic feedback to residents in training for pedicle screw placement [24]. Nonetheless, FDA clearance of AR platforms for clinical use is an evolving process without clear pathways [37].
Finally, although most of the literature on AR-HMD has focused on surgery for degenerative and deformity pathologies, surgeons are exploring the use of AR for trauma surgery, tumor resection, and placement of S2–alar–iliac screws. There is also active exploration in using the XVS AR-HMD for advanced maneuvers, including interbody insertion, decompression, osteotomies, and corpectomies.
In conclusion, the past decade has witnessed rapid advancements in AR technology that can be used by spine and orthopedic surgeons to improve accuracy, precision, safety, and ease of operation. Research into ARSN, microscope-mediated HUDs, and HMDs is ongoing, and there is a robust and competitive market for AR devices. HUDs and HMDs offer the significant advantage of reducing attention shift, and the XVision HMD is currently cleared for intraoperative use. As the technology improves and more studies are published, AR will likely be adopted increasingly in preoperative planning, intraoperative navigation, and medical education.
Footnotes
Declaration of Conflicting Interests: The author(s) declared the following potential conflicts of interest with respect to the research, authorship, and/or publication of this article: Timothy F. Witham, MD, is a medical advisory board member, owns stock or stock options, and receives consultancy fees from Augmedics, the manufacturer of a device discussed in this article. Nicholas Theodore, MD, reports relationships with Neurosurgery Research & Education Foundation (NREF), Defense Advanced Research Projects Agency (DARPA), Globus Medical, Depuy Synthes, and Stryker; and holds patents and receives personal fees for legal case reviews. The other authors declared no potential conflicts of interest.
Funding: The author(s) received no financial support for the research, authorship, and/or publication of this article.
Human/Animal Rights: All procedures followed were in accordance with the ethical standards of the responsible committee on human experimentation (institutional and national) and with the Helsinki Declaration of 1975, as revised in 2013.
Required Author Forms: Disclosure forms provided by the authors are available with the online version of this article as supplemental material.
References
1. Abe Y, Sato S, Kato K, et al. A novel 3D guidance system using augmented reality for percutaneous vertebroplasty. J Neurosurg Spine. 2013;19(4):492–501. doi:10.3171/2013.7.SPINE12917
2. Abul-Kasim K, Ohlin A. The rate of screw misplacement in segmental pedicle screw fixation in adolescent idiopathic scoliosis: the effect of learning and cumulative experience. Acta Orthop. 2011;82(1):50–55. doi:10.3109/17453674.2010.548032
3. Barnett GH, Steiner CP, Weisenberger J. Adaptation of personal projection television to a head-mounted display for intra-operative viewing of neuroimaging. J Image Guid Surg. 1995;1(2):109–112.
4. Carl B, Bopp M, Saß B, Nimsky C. Microscope-based augmented reality in degenerative spine surgery: initial experience. World Neurosurg. 2019;128:e541–e551. doi:10.1016/j.wneu.2019.04.192
5. Carl B, Bopp M, Saß B, Pojskic M, Voellger B, Nimsky C. Spine surgery supported by augmented reality. Glob Spine J. 2020;10(suppl 2):41S–55S. doi:10.1177/2192568219868217
6. Carl B, Bopp M, Saß B, Voellger B, Nimsky C. Implementation of augmented reality support in spine surgery. Eur Spine J. 2019;28(7):1697–1711. doi:10.1007/s00586-019-05969-4
7. Dibble CF, Molina CA. Device profile of the XVision-spine (XVS) augmented-reality surgical navigation system: overview of its safety and efficacy. Expert Rev Med Devices. 2021;18:1–8. doi:10.1080/17434440.2021.1865795
8. Edström E, Burström G, Omar A, et al. Augmented reality surgical navigation in spine surgery to minimize staff radiation exposure. Spine (Phila Pa 1976). 2020;45(1):E45–E53. doi:10.1097/BRS.0000000000003197
9. Edwards PJ, King AP, Maurer CR, et al. Design and evaluation of a system for microscope-assisted guided interventions (MAGI). In: Lecture Notes in Computer Science, vol 1679. Springer; 1999:842–852. doi:10.1007/10704282_91
10. Elmi-Terander A, Burström G, Nachabe R, et al. Pedicle screw placement using augmented reality surgical navigation with intraoperative 3D imaging: a first in-human prospective cohort study. Spine (Phila Pa 1976). 2019;44(7):517–525. doi:10.1097/BRS.0000000000002876
11. Elmi-Terander A, Skulason H, Soderman M, et al. Surgical navigation technology based on augmented reality and integrated 3D intraoperative imaging: a spine cadaveric feasibility and accuracy study. Spine (Phila Pa 1976). 2016;41(21):E1303–E1311. doi:10.1097/BRS.0000000000001830
12. Fischer M, Fuerst B, Lee SC, et al. Preclinical usability study of multiple augmented reality concepts for K-wire placement. Int J Comput Assist Radiol Surg. 2016;11(6):1007–1014. doi:10.1007/s11548-016-1363-x
13. Fotouhi J, Alexander CP, Unberath M, et al. Plan in 2-D, execute in 3-D: an augmented reality solution for cup placement in total hip arthroplasty. J Med Imaging. 2018;5(2):021205. doi:10.1117/1.JMI.5.2.021205
14. Gavaghan K, Oliveira-Santos T, Peterhans M, et al. Evaluation of a portable image overlay projector for the visualisation of surgical navigation data: phantom studies. Int J Comput Assist Radiol Surg. 2012;7(4):547–556. doi:10.1007/s11548-011-0660-7
15. Gibby JT, Swenson SA, Cvetko S, Rao R, Javan R. Head-mounted display augmented reality to guide pedicle screw placement utilizing computed tomography. Int J Comput Assist Radiol Surg. 2019;14(3):525–535. doi:10.1007/s11548-018-1814-7
16. Godzik J, Walker CT, Theodore N, Uribe JS, Chang SW, Snyder LA. Minimally invasive transforaminal interbody fusion with robotically assisted bilateral pedicle screw fixation: 2-dimensional operative video. Oper Neurosurg. 2019;16(3):E86–E87. doi:10.1093/ons/opy288
17. Jiang B, Pennington Z, Zhu A, et al. Three-dimensional assessment of robot-assisted pedicle screw placement accuracy and instrumentation reliability based on a preplanned trajectory. J Neurosurg Spine. 2020;33(4):519–528. doi:10.3171/2020.3.SPINE20208
18. Jin M, Liu Z, Qiu Y, Yan H, Han X, Zhu Z. Incidence and risk factors for the misplacement of pedicle screws in scoliosis surgery assisted by O-arm navigation—analysis of a large series of one thousand, one hundred and forty five screws. Int Orthop. 2017;41(4):773–780. doi:10.1007/s00264-016-3353-6
19. Jud L, Fotouhi J, Andronic O, et al. Applicability of augmented reality in orthopedic surgery: a systematic review. BMC Musculoskelet Disord. 2020;21(1):103. doi:10.1186/s12891-020-3110-2
20. Kiya N, Dureza C, Fukushima T, Maroon JC. Computer navigational microscope for minimally invasive neurosurgery. Minim Invasive Neurosurg. 1997;40(3):110–115. doi:10.1055/s-2008-1053429
21. Kosterhon M, Gutenberg A, Kantelhardt SR, Archavlis E, Giese A. Navigation and image injection for control of bone removal and osteotomy planes in spine surgery. Oper Neurosurg. 2017;13(2):297–304. doi:10.1093/ons/opw017
22. Liebmann F, Roner S, von Atzigen M, et al. Pedicle screw navigation using surface digitization on the Microsoft HoloLens. Int J Comput Assist Radiol Surg. 2019;14(7):1157–1165. doi:10.1007/s11548-019-01973-7
23. Liu A, Jin Y, Cottrill E, et al. Clinical accuracy and initial experience with augmented reality-assisted pedicle screw placement: the first 205 screws [published online ahead of print 2021]. J Neurosurg Spine.
24. Luciano CJ, Banerjee PP, Bellotte B, et al. Learning retention of thoracic pedicle screw placement using a high-resolution augmented reality simulator with haptic feedback. Neurosurgery. 2011;69(suppl 1):ons14–ons19. doi:10.1227/NEU.0b013e31821954ed
25. Molina CA, Phillips FM, Colman MW, et al. A cadaveric precision and accuracy analysis of augmented reality-mediated percutaneous pedicle implant insertion. J Neurosurg Spine. 2021;34:316–324. doi:10.3171/2020.6.SPINE20370
- 26.Molina CA, Sciubba DM, Greenberg JK, Khan M, Witham T.Clinical accuracy, technical precision, and workflow of the first in human use of an augmented-reality head-mounted display stereotactic navigation system for spine surgery. Oper Neurosurg. 2020;20(3):300–309. 10.1093/ons/opaa398 [DOI] [PubMed] [Google Scholar]
- 27.Molina CA, Theodore N, Karim Ahmed A, et al. Augmented reality-assisted pedicle screw insertion: a cadaveric proof-of-concept study. J Neurosurg Spine. 2019;31(1):139–146. 10.3171/2018.12.SPINE181142 [DOI] [PubMed] [Google Scholar]
- 28.Müller F, Roner S, Liebmann F, Spirig JM, Fürnstahl P, Farshad M.Augmented reality navigation for spinal pedicle screw instrumentation using intraoperative 3D imaging. Spine J. 2020;20(4):621–628. 10.1016/j.spinee.2019.10.012. [DOI] [PubMed] [Google Scholar]
- 29.Nakamura M, Tamaki N, Tamura S, Yamashita H, Hara Y, Ehara K.Image-guided microsurgery with the Mehrkoordinaten Manipulator system for cerebral arteriovenous malformations. J Clin Neurosci. 2000;7:10–13. 10.1054/jocn.2000.0702. [DOI] [PubMed] [Google Scholar]
- 30.Peh S, Chatterjea A, Pfarr J, et al. Accuracy of augmented reality surgical navigation for minimally invasive pedicle screw insertion in the thoracic and lumbar spine with a new tracking device. Spine J. 2020;20(4):629–637. 10.1016/j.spinee.2019.12.009. [DOI] [PubMed] [Google Scholar]
- 31.Qian BP, Zhang YP, Qiao M, Qiu Y, Mao SH.Accuracy of freehand pedicle screw placement in surgical correction of thoracolumbar kyphosis secondary to ankylosing spondylitis: a computed tomography investigation of 2314 consecutive screws. World Neurosurg. 2018;116:e850–e855. 10.1016/j.wneu.2018.05.116. [DOI] [PubMed] [Google Scholar]
- 32.Roberts DW, Strohbehn JW, Hatch JF, Murray W, Kettenberger H.A frameless stereotaxic integration of computerized tomographic imaging and the operating microscope. J Neurosurg. 1986;65(4):545–549. 10.3171/jns.1986.65.4.0545 [DOI] [PubMed] [Google Scholar]
- 33.Saylany A, Spadola M, Blue R, Sharma N, Ozturk AK, Yoon JW.The use of a novel heads-up display (HUD) to view intra-operative x-rays during a one-level cervical arthroplasty. World Neurosurg. 2020;138:369–373. 10.1016/j.wneu.2020.03.073. [DOI] [PubMed] [Google Scholar]
- 34.Shuhaiber JH.Augmented reality in surgery. Arch Surg. 2004;139(2):170–174. 10.1001/archsurg.139.2.170. [DOI] [PubMed] [Google Scholar]
- 35.Tepper OM, Rudy HL, Lefkowitz A, et al. Mixed reality with hololens: where virtual reality meets augmented reality in the operating room. Plast Reconstr Surg. 2017;140(5):1066–1070. 10.1097/PRS.0000000000003802. [DOI] [PubMed] [Google Scholar]
- 36.Umebayashi D, Yamamoto Y, Nakajima Y, Fukaya N, Hara M.Augmented reality visualization-guided microscopic spine surgery: transvertebral anterior cervical foraminotomy and posterior foraminotomy. JAAOS Glob Res Rev. 2018;2(4):e008. 10.5435/jaaosglobal-d-17-00008 [DOI] [PMC free article] [PubMed] [Google Scholar]
- 37.U.S. Food & Drug Administration. Public workshop—medical extended reality: toward best evaluation practices for virtual and augmented reality in medicine. https://www.fda.gov/media/136890/download Published March 5, 2020. Accessed June 18, 2021.
- 38.Vo CD, Jiang B, Azad TD, Crawford NR, Bydon A, Theodore N.Robotic spine surgery: current state in minimally invasive surgery. Glob Spine J. 2020;10(suppl 2):34S–40S. 10.1177/2192568219878131 [DOI] [PMC free article] [PubMed] [Google Scholar]
- 39.Wang H, Wang F, Leong APY, Xu L, Chen X, Wang Q.Precision insertion of percutaneous sacroiliac screws using a novel augmented reality-based navigation system: a pilot study. Int Orthop. 2016;40(9):1941–1947. 10.1007/s00264-015-3028-8 [DOI] [PubMed] [Google Scholar]
- 40.Wanivenhaus F, Neuhaus C, Liebmann F, Roner S, Spirig JM, Farshad M.Augmented reality-assisted rod bending in spinal surgery. Spine J. 2019;19(10):1687–1689. 10.1016/j.spinee.2019.06.019. [DOI] [PubMed] [Google Scholar]
- 41.Wei NJ, Dougherty B, Myers A, Badawy SM.Using Google glass in surgical settings: systematic review. JMIR Mhealth Uhealth. 2018;6(3):e54. 10.2196/mhealth.9409. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 42.Yoon JW, Chen RE, Han PK, Si P, Freeman WD, Pirris SM.Technical feasibility and safety of an intraoperative head-up display device during spine instrumentation. Int J Med Robot Comput Assist Surg. 2017;13(3):e1770–e1795. 10.1002/rcs.1770. [DOI] [PubMed] [Google Scholar]
