Journal of Minimally Invasive Surgery. 2025 Sep 15;28(3):122–129. doi: 10.7602/jmis.2025.28.3.122

In-vivo evaluation of an augmented reality enhanced ultrasound needle guidance system for minimally invasive procedures in porcine models: a preclinical comparative study

Sanjit Datta 1, Robert F Short 2, Jeffrey W Milsom 3, Charles Martin III 4, Gaurav Gadodia 5, Gabrielle Stefy Bailey 6, Crew Weunski 6, Michael Evans 6, Bradley B Pua 7
PMCID: PMC12439044  PMID: 40947929

Abstract

Purpose

This study compared the accuracy, safety, and efficacy of standard-of-care (SOC) ultrasound and augmented reality needle guidance system (ARNGS) used adjunctively for percutaneous needle placement in porcine models.

Methods

Four live swine underwent a model creation procedure in which metallic fiducials were percutaneously implanted into the livers (n = 8 per animal; 32 total) and kidneys (n = 4 per animal; 16 total) to serve as “lesions.” Computed tomography was used to create three-dimensional volumetric images of the anatomy. Four physicians, with limited previous ARNGS experience and blinded to the target locations, positioned needles at the targets using either SOC alone or ARNGS + SOC.

Results

No adverse events occurred. Mean target registration error (TRE) was 3.0 mm (95% confidence interval [CI], 2.4–3.6 mm; n = 22) with SOC (average needle depth, 8.0 cm) and 2.9 mm (95% CI, 2.2–3.5 mm; n = 24) with ARNGS + SOC (average needle depth, 7.6 cm). The first-attempt success rate was 39.1% (9/23) for SOC and 41.7% (10/24) for ARNGS + SOC. There was no significant difference in TRE or first-pass success rate between the two groups (p > 0.05). Needle repositions were significantly fewer with ARNGS + SOC (0.8 vs. 3.0, p = 0.01).

Conclusion

In this preclinical study, ARNGS + SOC was as accurate and safe as SOC alone for needle targeting of implanted targets. The reduction in needle repositioning suggests its potential to streamline procedures and reduce the risk of complications. This novel image fusion method merits further evaluation.

Keywords: Augmented reality, Ultrasonography, Interventional radiography, Needle biopsy, Computed tomography angiography

INTRODUCTION

Percutaneous approaches are a preferred method to diagnose and treat soft-tissue tumors, largely due to their minimally invasive nature [1,2]. Percutaneous procedures require the operator to interpret two-dimensional (2D) displays of ultrasound (US) and computed tomography (CT) images while navigating a needle accurately and safely through a three-dimensional (3D) anatomical space. Furthermore, standard-of-care (SOC) imaging modalities are limited by image resolution, a narrow field of view, and poor depth of penetration (US), as well as radiation exposure (CT) [3].

Augmented reality (AR) is an emerging technology that has the potential to overcome many of the limitations of SOC imaging. While AR solutions are increasingly used for minimally invasive interventions in the orthopedic and neurosurgical spaces, where targets are fairly fixed and rigid, the adoption of AR for minimally invasive, soft-tissue procedures deep within body cavities has been limited by the effects of respiratory motion and tissue deformation [4–11]. To compensate for these hurdles, AR solutions for soft-tissue procedures would need to render and project imaging obtained intraprocedurally or otherwise fuse and display real-time imaging simultaneously.

An augmented reality navigational guidance system (ARNGS; XR90, MediView XR, Inc.) that incorporates and displays real-time, electromagnetically tracked and registered US imaging with AR images has previously shown usability, accuracy, and efficacy in benchtop phantom and cadaveric models [7,11]. In this study, we hypothesized that ARNGS + SOC compared to SOC alone would be comparable in accuracy, safety, and efficacy for percutaneous needle placements in the presence of respiratory motion in soft tissues of a live porcine model.

METHODS

Study population

Four healthy female swine, weighing 44.8 to 50.4 kg, were test subjects for this study. Swine were separated into two groups; porcine models in the test group were targeted with the ARNGS + SOC and those in the control group were targeted with SOC alone. After the procedure was completed, the animals were humanely euthanized in compliance with the United States Food and Drug Administration Good Laboratory Practice guidelines.

Augmented reality system

The ARNGS uses the HoloLens 2 (Microsoft Corp.) to register and project CT-based anatomical models, real-time US, and a virtual needle trajectory. The ARNGS interfaces with a commercially available US system and probe (Vivid iq, GE HealthCare). The system works by mounting an electromagnetic (EM) field generator (Northern Digital Inc.) underneath the subject to create a cylindrical EM field within the operating space to register stereoscopic anatomy and track sensorized instruments, including the needle (eTRAX, CIVCO Medical Solutions) and US probe bracket (Fig. 1). The ARNGS streams real-time US imaging to a heads-up display that is displayed in space. Real-time B-sector US imaging is also projected from the EM-tracked probe. The 3D anatomy, segmented from preoperative CT imaging, is projected and registered in the procedural space in coordination with the virtual projections generated from the tracked US probe and needle in the headset’s coordinate space (Fig. 2).
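
The paper does not detail the registration math used by the XR90, but the workflow described above (registration markers sensed both optically and electromagnetically over known fiducial positions) is consistent with a standard paired-point rigid registration. The sketch below is an illustration only, not the vendor's algorithm: a generic Kabsch/SVD estimate of the rigid transform that maps EM-tracker coordinates into the registered headset/CT frame. All names and values are hypothetical.

```python
import numpy as np

def rigid_registration(src: np.ndarray, dst: np.ndarray):
    """Kabsch/SVD estimate of rotation R and translation t with dst ~= R @ src + t.

    src, dst: (N, 3) arrays of paired fiducial positions, e.g. registration
    markers measured in the EM-tracker frame (src) and in the headset/CT
    frame (dst). Generic paired-point registration, for illustration only.
    """
    src_c, dst_c = src.mean(axis=0), dst.mean(axis=0)
    H = (src - src_c).T @ (dst - dst_c)          # 3x3 cross-covariance matrix
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))       # guard against reflections
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = dst_c - R @ src_c
    return R, t

# Hypothetical usage: map an EM-tracked needle tip into the registered frame.
em_fiducials = np.array([[0.0, 0.0, 0.0], [10.0, 0.0, 0.0],
                         [0.0, 10.0, 0.0], [0.0, 0.0, 10.0]])   # cm, made up
true_rot = np.array([[0.0, -1.0, 0.0],
                     [1.0,  0.0, 0.0],
                     [0.0,  0.0, 1.0]])
headset_fiducials = em_fiducials @ true_rot.T + np.array([5.0, 2.0, 1.0])
R, t = rigid_registration(em_fiducials, headset_fiducials)
needle_tip_em = np.array([3.0, 4.0, 5.0])
needle_tip_registered = R @ needle_tip_em + t
```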

Fig. 1.

Fig. 1

The components of the augmented reality needle guidance system (ARNGS). (A) The ARNGS mounts on a commercial ultrasound cart. (B) Registration markers, equipped with optical and electromagnetic sensors, are used to register the system. (C) An electromagnetic field generator is mounted under the operating table.

Fig. 2.

Fig. 2

View of augmented reality needle guidance system (ARNGS) image projections. (A) Heads-up display streaming the real-time ultrasound projection. (B) View of a second operator using the ultrasound B-mode sector projection intersecting with stereoscopic segmented computed tomography (CT) anatomy to set an electromagnetic (EM)-tracked needle trajectory (green). (C) Operator wearing the HoloLens 2 headset and using the ARNGS.

Target implantation and segmentation

During a model creation procedure, animals were anesthetized and positioned supine on the table. Using US imaging, a single proceduralist percutaneously placed echogenic coils (IZI Medical Products) into the liver (n = 8 per animal; 32 total) and kidney parenchyma (n = 4 per animal; 16 total) of each swine to create simulated tumor targets. Immediately after coil implantation, commercially available radiopaque fiducial markers (Beekley Spot Markers, Beekley Corp.) were placed on the skin, and a contrast-enhanced CT scan was performed. Images were segmented using a commercially available segmentation platform (Axial3D). Segmented images were uploaded onto the ARNGS and used to display stereoscopic 3D anatomical images.

Procedure

Test group

After the model creation procedure, animals were anesthetized again 2 to 3 days later. Registration markers, equipped with EM and optical sensors, were placed over the fiducial markers and the ARNGS was registered to the subject.

Targeting procedures were completed by four physicians: two attending interventionalists, one interventional radiology resident, and one surgeon with experience in minimally invasive surgical techniques. Proceduralists, blinded to the specific locations of the coils, used the ARNGS + SOC to guide their needle to the target coil. The virtual representation of the tracked needle and US was utilized for trajectory planning. Proceduralists then positioned and advanced their tracked needle to the target using the ARNGS + SOC. Needle redirects (defined as forward/backward repositions of the needle) and needle attempts (defined as a complete withdrawal and reentry into the body) were recorded by an independent observer. Once each target was reached, images of the US and EM coordinates of the tracked instruments were simultaneously collected to obtain target registration error (TRE), in-plane angular error, and out-of-plane angular error of the needle. Treatment time was recorded from the time the US probe touched the skin to the time the target was reached. This procedure was repeated for each target lesion. Total animal procedure time was measured as the elapsed time between the start of the first treatment and the time the last target was reached for each animal. Safety was evaluated by overall animal health and the recording of adverse events occurring while using ARNGS + SOC or SOC. Efficacy was evaluated by the proceduralists' identification of error messages, adverse events related to device operation, and a post-procedure operator usability survey detailing their ARNGS experience (Table 1).

Table 1.

Physician ratings on system usability

Question Average physician score
Overall, I am satisfied with how easy it is to use this system. 1.8
It was simple to use this system. 2.0
I was able to complete the tasks and scenarios quickly using this system. 1.5
I am comfortable using this system. 1.8
It was easy to learn to use this system. 1.8
I could become productive quickly using this system. 1.0
The system gave error messages that clearly told me how to fix problems. 1.7
Whenever I made a mistake using the system, I could recover easily and quickly. 1.8
The information (such as online help, on-screen messages, and other documentation) provided with the system was clear. 1.3
It was easy to find the information I needed. 1.5
The information was effective in helping me complete the tasks and scenarios. 1.5
The organization of information on the system was clear. 1.3
The interface of the system was pleasant. 1.3
I liked using the interface of this system. 1.3
This system has all the functions and capabilities I expect it to have. 2.0
Overall, I am satisfied with this system. 1.5
The system increased my overall confidence during the procedure. 1.5
The system was INEFFECTIVE in aiding the execution of the procedure. 4.5
I expect this application will decrease the length of the procedure. 1.5
The system DID NOT decrease the time required for localization and planning. 4.0
The system is ergonomically designed. 1.3
I would NOT use this application for long periods of time because it is uncomfortable. 6.8
Capturing data points through voice commands felt intuitive. 2.8
The head-mounted display HAD DIFFICULTY differentiating between the operator’s voice and background noise. 6.8
The device improved my ability to visualize the intended target(s). 2.8
The device DID NOT improve my ability to view surrounding anatomy and related structures during the procedure. 5.8
The system will likely decrease the radiation dose received by the patient during the procedure. 1.3
I will NOT use the device to target locations traditionally inaccessible by computed tomography or ultrasound. 5.0
I anticipate that this system will allow me to perform this procedure more efficiently. 1.3
The system DID NOT adequately compensate for respiratory motion. 4.0
I feel this device allows me to stay connected to the patient undergoing the procedure. 1.0
I thought the system WAS A DISTRACTION from my surroundings and the patient during the procedure. 6.0

Scale was 1–7 (1, strongly agree; 7, strongly disagree).

Control group

Proceduralists completed the same targeting procedure for control models using SOC, including pre-procedural CT imaging and US. The ARNGS was not utilized for control group procedures, but the EM tracking system remained active to measure TRE, in-plane angular error, and needle planarity. The same measurements were collected as with the ARNGS + SOC.

Target registration error, in-plane angular error, and needle planarity

Once the target was reached, the operator held the tracked needle and US in place, ensuring that the target, needle tip, and a portion of the needle trajectory were visible, while recordings of the 2D US were saved and EM coordinates of the tracked instruments were simultaneously recorded. On the US recording, operators marked the target, needle tip, and a point along the needle trajectory. These data were used to measure TRE, in-plane angular error, and needle planarity.

The accuracy of the ARNGS + SOC was measured using TRE, defined as the post-registration Euclidean distance between the tip of the needle and the center of the target after the needle has been placed. TRE measures how close a user can guide the needle tip, P_tip, toward an identifiable location of the target center on the US display, P_ctr,US. TRE is calculated using the distance formula:

TRE = \sqrt{(P_{tip,x} - P_{ctr,US,x})^2 + (P_{tip,y} - P_{ctr,US,y})^2 + (P_{tip,z} - P_{ctr,US,z})^2}

In-plane needle angular error is defined as the angle between the physical needle trajectory and the theoretical ideal trajectory to reach the target. Needle planarity, or the out-of-plane needle angle, is defined as the angle between the tracked needle and the tracked ultrasound (US) imaging plane; it helps evaluate the reliability of the 2D TRE measurement by establishing how far out of plane the needle was during data collection. To measure planarity, the proceduralist withdrew the needle while grasping it at the point of skin insertion, so that the depth of insertion could be captured with a calibrated caliper after removal.
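
The metrics above reduce to simple vector operations. The following is a minimal, hypothetical sketch (not the system's software), assuming the needle tip, target center, needle direction, ideal trajectory, and US-plane normal are available as 3D coordinates in a common registered frame.

```python
import numpy as np

def tre_mm(p_tip, p_ctr_us):
    """Target registration error: Euclidean distance between needle tip and target center."""
    return float(np.linalg.norm(np.asarray(p_tip, float) - np.asarray(p_ctr_us, float)))

def _unit(v):
    v = np.asarray(v, float)
    return v / np.linalg.norm(v)

def in_plane_angular_error_deg(needle_dir, ideal_dir):
    """Angle between the physical needle trajectory and the ideal trajectory to the target."""
    c = np.clip(np.dot(_unit(needle_dir), _unit(ideal_dir)), -1.0, 1.0)
    return float(np.degrees(np.arccos(c)))

def planarity_deg(needle_dir, us_plane_normal):
    """Out-of-plane angle between the tracked needle and the tracked US imaging plane."""
    s = np.clip(abs(np.dot(_unit(needle_dir), _unit(us_plane_normal))), 0.0, 1.0)
    return float(np.degrees(np.arcsin(s)))

# Hypothetical example coordinates (mm) and direction vectors, for illustration only.
print(tre_mm([101.2, 54.0, 33.1], [103.0, 52.9, 33.8]))                # ~2.2 mm
print(in_plane_angular_error_deg([0.0, 0.97, 0.24], [0.0, 1.0, 0.0]))  # degrees
print(planarity_deg([0.05, 0.97, 0.24], [1.0, 0.0, 0.0]))              # degrees
```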

RESULTS

All animals survived the model creation procedure; three swine experienced mild postanesthesia discomfort, which resolved with appropriate veterinary intervention. Two swine experienced mild post-procedural weight loss deemed clinically insignificant. Overall, the animals remained in good health with no major adverse events, indicating system safety. In the SOC group, one implanted target was missed during the acute procedure, resulting in a sample size of 23 for the SOC group.

Mean TRE was not statistically different between the ARNGS + SOC and SOC alone groups (95% confidence interval [CI] for the difference, –1.0 to 0.7 mm). Mean TRE using the ARNGS was 2.9 mm (95% CI, 2.2–3.5 mm; n = 24) at an average needle depth of 7.6 cm. In the SOC group, one target was not visible under ultrasound, making TRE assessment infeasible; consequently, the SOC group included 22 cases for TRE analysis. The mean TRE in the SOC group was 3.0 mm (95% CI, 2.4–3.6 mm; n = 22) at an average needle depth of 8.0 cm (Table 2). The average in-plane angle error was 7.1° (95% CI, 5.4°–8.8°; n = 18) for the ARNGS group and 6.3° (95% CI, 5.3°–7.3°; n = 17) for the SOC group. After assessing the data for normality, a two-sample t-test demonstrated no statistically significant difference in in-plane angle error between the ARNGS + SOC and SOC groups at a confidence level of 95% (p > 0.05). There was also no statistically significant difference in mean planarity between the ARNGS + SOC and SOC groups at a confidence level of 95% (p > 0.05). The average absolute value of planarity was 4.8° (95% CI, 3.1°–6.5°) for ARNGS + SOC and 9.5° (95% CI, 7.1°–12.0°) for the SOC group.
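
As an illustration of the comparison reported above and summarized in Table 2, the sketch below (hypothetical, not the study's analysis code) computes a pooled two-sample t-test, the 95% CI for the difference of means, and Cohen's d from two arrays of per-target TRE values; the random data stand in for the unpublished raw measurements.

```python
import numpy as np
from scipy import stats

def compare_means(a, b, alpha=0.05):
    """Pooled two-sample t-test, 95% CI for the difference of means, and Cohen's d."""
    a, b = np.asarray(a, float), np.asarray(b, float)
    n1, n2 = len(a), len(b)
    t_stat, p_value = stats.ttest_ind(a, b, equal_var=True)
    sp = np.sqrt(((n1 - 1) * a.var(ddof=1) + (n2 - 1) * b.var(ddof=1)) / (n1 + n2 - 2))
    diff = a.mean() - b.mean()
    half_width = stats.t.ppf(1 - alpha / 2, n1 + n2 - 2) * sp * np.sqrt(1 / n1 + 1 / n2)
    return {
        "difference_of_means": diff,
        "ci_95": (diff - half_width, diff + half_width),
        "pooled_sd": sp,
        "cohens_d": diff / sp,
        "p_value": p_value,
    }

# Hypothetical per-target TRE values (mm) with the reported group means and SDs.
tre_arngs_soc = np.random.default_rng(0).normal(2.9, 1.5, 24)
tre_soc_alone = np.random.default_rng(1).normal(3.0, 1.4, 22)
print(compare_means(tre_arngs_soc, tre_soc_alone))
```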

Table 2.

Comparison of TRE between groups

Variable                         TRE
                                 ARNGS + SOC    SOC alone
Descriptive statistics
 No. of samples                  24             22a)
 Mean ± SD (mm)                  2.9 ± 1.5      3.0 ± 1.4
 SEM                             0.3            0.3
 Mean needle depth (cm)          7.6            8.0
Estimation for difference
 Difference of means             –0.2
 Pooled SD                       1.4
 95% CI for difference           –1.0 to 0.7
 Effect size (Cohen’s d)         –0.1

The table presents the results of the two-sample t-test and the 95% confidence interval (CI) for the difference in means of target registration error (TRE).

ARNGS, augmented reality needle guidance system (XR90: MediView XR, Inc., Cleveland, OH, USA); SOC, standard of care ultrasound; SD, standard deviation; SEM, standard error of means.

a)Sample size for SOC TRE = 22 due to one excluded dataset.

The first-attempt success rate was recorded as a categorical value (pass/fail). A chi-square test and Fisher exact test were applied to analyze the data, with no statistically significant difference (Pearson chi-square, p = 0.859; Fisher exact test, p > 0.999) between the groups. Overall, the ARNGS + SOC group had a first-attempt success rate of 41.7% (10/24), while the SOC group had a first-attempt success rate of 39.1% (9/23). The mean number of needle attempts was not significantly different between the two groups (difference of means, 0.04; 95% CI, –0.2 to 0.3). The average number of needle repositions was 0.8 (95% CI, 0.4–1.1; n = 24) for the ARNGS + SOC group, compared to 3.0 (95% CI, 1.5–5.0; n = 22) for the SOC group. Inner/outer fence criteria identified a mild outlier among needle repositions, so one sample was excluded from analysis. Needle repositions were significantly fewer with ARNGS + SOC than with SOC (difference of means, –2.3; 95% CI, –4.3 to –0.6). Mean treatment time was evaluated as a continuous variable. One outlier in the ARNGS + SOC group was identified using the interquartile range method (upper bound, 9.75) and excluded from the final analysis. After removal, the mean treatment time for ARNGS + SOC was 2.7 minutes (95% CI, 1.9–3.4 minutes; n = 23). Mean treatment time for the SOC group was 4.5 minutes (95% CI, 2.8–6.2 minutes; n = 23). Mean total animal procedure time was 188 minutes (n = 2) for the ARNGS + SOC group and 125 minutes (n = 2) for the SOC group (Table 3).
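
For reference, the categorical comparison and the interquartile-range (inner/outer fence) outlier screening described above can be reproduced in outline as follows. This is a hypothetical sketch using the published counts; the treatment-time values are invented for illustration.

```python
import numpy as np
from scipy import stats

# First-attempt success counts reported above: 10/24 (ARNGS + SOC) vs. 9/23 (SOC alone).
table = np.array([[10, 24 - 10],
                  [9, 23 - 9]])
chi2, p_chi2, dof, expected = stats.chi2_contingency(table, correction=False)
odds_ratio, p_fisher = stats.fisher_exact(table)

def iqr_fences(x):
    """Inner (1.5 x IQR) and outer (3 x IQR) fences for outlier screening."""
    q1, q3 = np.percentile(np.asarray(x, float), [25, 75])
    iqr = q3 - q1
    inner = (q1 - 1.5 * iqr, q3 + 1.5 * iqr)   # values outside: mild outliers
    outer = (q1 - 3.0 * iqr, q3 + 3.0 * iqr)   # values outside: extreme outliers
    return inner, outer

# Hypothetical treatment times (min); any value beyond the inner fence is excluded.
times = [2.1, 2.4, 2.8, 3.0, 3.3, 12.0]
inner, outer = iqr_fences(times)
kept = [t for t in times if inner[0] <= t <= inner[1]]
```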

Table 3.

Procedural outcomes by treatment group

Variable                                      Treatment group
                                              ARNGS + SOC     SOC alone
First attempt success rate                    10/24 (41.7%)   9/23 (39.1%)
Average of attempts per treatment             1.2             1.2
Average of repositions per treatment          0.8             3.0a)
Average treatment time per procedure (min)    2.7b)           4.5
Total procedure time per swine (min)          188             125
Average needle depth (cm)                     7.6             8.0

ARNGS, augmented reality needle guidance system; SOC, standard of care.

a)For SOC, average of repositions per treatment, n = 22 after exclusion of one outlier. b)For ARNGS + SOC treatment time per procedure, n = 23 after exclusion of one outlier.

Device usability and system efficacy were assessed using a post-procedural Likert scale (1, strongly agree to 7, strongly disagree). Mean scores for each question are presented in Table 1. Moreover, no system error messages or device-related adverse events were identified, further indicating ARNGS + SOC efficacy.

DISCUSSION

Minimally invasive percutaneous procedures are increasingly used to diagnose and treat soft-tissue tumors, and due to their complexity and importance, the guidelines released by the Society of Interventional Radiology mandate that all interventional radiologists must be trained and competent in these procedures [12]. However, even with trained interventionalists, applications may be limited by the use of current 2D image-guidance to target complex 3D anatomy [1,2]. AR systems, such as the ARNGS, may mitigate this issue by providing increased depth perception and improved spatial awareness by fusing US imaging with 3D anatomy. While the accuracy and benefits of AR systems, including the ARNGS, have been established in benchtop and cadaver models, to our knowledge, this is the first evaluation of an AR system used alongside SOC imaging in vivo in soft tissues affected by respiratory motion.

This study suggests that ARNGS + SOC can assist with precise needle placements deep within the body cavity in the presence of respiratory motion (TRE, 2.9 mm and 3.0 mm). While the study found no statistically significant difference in accuracy between ARNGS + SOC and SOC, various factors could have influenced the results. The implanted coils serving as simulated targets in this evaluation were highly conspicuous under US, allowing for easy visualization in both the ARNGS + SOC group and the SOC group. However, in clinical use, lesions are not always easily visualized with a single imaging modality. Early human use of the ARNGS + SOC has demonstrated the potential to increase an operator’s ability to visualize the target when one imaging modality alone is insufficient [13]. Additionally, proceduralists were not pretrained on the ARNGS but rather learned to use the system on the day of the procedure. Even with this minimal training and familiarity with the ARNGS + SOC, accuracy was still equivalent to SOC. With additional training, the ARNGS + SOC may offer favorable accuracy, and experienced clinicians could quickly adopt this approach in various clinical settings. Despite current limitations, the ARNGS + SOC demonstrated a significant reduction in needle repositions compared to SOC alone (0.8 vs. 3.0, p = 0.01) and a reduction in treatment time (2.7 minutes vs. 4.5 minutes), highlighting its potential to decrease treatment time and to reduce the risk of complications associated with multiple needle passes, while providing comparable accuracy in the presence of respiratory motion.

Overall, animal procedure time was longer in the ARNGS + SOC group, likely due to the placement of registration markers, other disposables, and the process of optical and EM registration. Future iterations should consider improvements to the workflow that decrease overall procedure time and allow for registration of instruments and stereoscopic displays without pre-procedure imaging that includes fiducials. Other considerations include smaller form factor registration markers, refinement of the software to include device control of the US, the addition of color flow, and compatibility with multiple US probes for optimal image quality. While this study was not intended to demonstrate statistically superior accuracy to SOC, future enhancements of the ARNGS may allow for such comparisons.

Usability survey results indicate high satisfaction among the proceduralists, with scores reflecting the ease of use, comfort, and overall confidence in the ARNGS. The integration of real-time US with 3D anatomical projections may have contributed to these favorable scores, as the image fusion may enhance spatial awareness and depth perception. This is crucial in procedures affected by respiratory motion, as maintaining accurate trajectories can be challenging. These findings suggest that the ARNGS may facilitate minimally invasive techniques on more challenging targets, potentially leading to increased use of minimally invasive procedures and avoiding the need for open surgical procedures, thereby reducing the risk of infection and shortening recovery time [2,10,14–16]. Lastly, operators indicated that the system may allow them to perform procedures more efficiently with decreased procedure time. This may lead to improved patient safety, enhanced patient access to care, and decreased resource utilization.

The primary limitation of this study was the utilization of porcine models as the study subjects. Although swine have anatomy and breathing patterns similar to those of humans, further research is needed to demonstrate the utility and efficacy of the ARNGS in a clinical setting. Other limitations include the small sample of users (n = 4) and the overall lack of previous in vivo ARNGS experience among users, which may have led to the increased procedure time. While survey scores demonstrated overall device usability, further studies should be conducted with a larger group of users with varying degrees of previous system experience.

In conclusion, this early preclinical study demonstrates the accuracy, safety, and efficacy of ARNGS + SOC compared to SOC alone for percutaneous needle placements in vivo. The reduction in needle repositions suggests its potential to streamline procedures and reduce the risk of complications. With further evaluation, this image fusion system may improve clinical performance and patient outcomes.

Acknowledgments

The authors would like to thank Adam Cargill and Amy Grey for their guidance and editorial support.

Notes

Ethics statement

This prospective study was conducted in accordance with United States Food and Drug Administration Good Laboratory Practice (GLP) regulations at a GLP-certified facility, and the protocol was approved by the Institutional Animal Care and Use Committee of North American Science Associates, Inc. (Minneapolis, MN, USA) (IACUC # VOL002-IS75).

Authors’ contributions

Conceptualization: CM, GG, ME

Formal analysis: CW

Investigation: SD, RFS, JWM, BBP

Methodology: CM, GG, ME, CW

Project administration: GSB, ME

Visualization: GSB

Writing–original draft: SD, GG, ME, CW

Writing–review & editing: RFS, JWM, CM, GSB, BBP

All authors read and approved the final manuscript.

Conflict of interest

Sanjit Datta reports no conflicts of interest. Robert F. Short is an active consultant and receives speaking fees from Ethicon, Johnson & Johnson, and Neuwave. Within the last year, he has served as a consultant to Galvanize Therapeutics and MediView XR Inc. Jeffrey W. Milsom is an active consultant for Olympus Corporation, Grumpy Innovations, Qmark Medical, and Samothrace Medical Innovations. Charles Martin III serves on the Interventional Oncology Scientific Advisory Board for Boston Scientific and the Business Strategy Advisory Board. He is a consultant for and reports consulting fees from Terumo Medical and Medtronic, and also serves as a consultant for Cook Medical, MediView XR Inc., and Okami Medical Corp. He has received grants from the Cleveland Clinic Lerner Research Institute Chair’s Research Award Grant, NCAI-CC Grant, and VeloSano Pilot Award Grant. He is listed as an inventor on a patent titled “Use of Holographic Guidance in Multiplanar Imaging; Use of Holography with Ultrasound,” owned by Cleveland Clinic and licensed to MediView XR Inc., and on another patent owned by MediView XR Inc. titled “Augmented Reality System and Method with Periprocedural Data Analytics.” Gaurav Gadodia serves as the Director of Medical Affairs for MediView XR Inc. and is a shareholder. Gabrielle Stefy Bailey, Crew Weunski, and Michael Evans are employees of MediView XR Inc. without fiduciary responsibilities. Gabrielle Stefy Bailey and Michael Evans are listed as inventors on a patent owned by MediView XR Inc. titled “Augmented Reality System and Method with Periprocedural Data Analytics.” Bradley B. Pua is a consultant for GE Healthcare and Galvanize Therapeutics and receives royalties from Oxford University Press. He is also a member of MediView XR Inc.’s Clinical Advisory Board.

Funding/support

The study was funded by MediView XR, Inc. No external funding reported.

Data availability

The data presented in this study are available upon reasonable request to the corresponding author.

References

  • 1. Fonseca AZ, Santin S, Gomes LG, Waisberg J, Ribeiro MA. Complications of radiofrequency ablation of hepatic tumors: frequency and risk factors. World J Hepatol. 2014;6:107–113. doi: 10.4254/wjh.v6.i3.107.
  • 2. Ni JY, Xu LF, Sun HL, et al. Percutaneous ablation therapy versus surgical resection in the treatment for early-stage hepatocellular carcinoma: a meta-analysis of 21,494 patients. J Cancer Res Clin Oncol. 2013;139:2021–2033. doi: 10.1007/s00432-013-1530-1.
  • 3. Kim E, Ward TJ, Patel RS, et al. CT-guided liver biopsy with electromagnetic tracking: results from a single-center prospective randomized controlled trial. AJR Am J Roentgenol. 2014;203:W715–W723. doi: 10.2214/AJR.13.12061.
  • 4. Auloge P, Cazzato RL, Ramamurthy N, et al. Augmented reality and artificial intelligence-based navigation during percutaneous vertebroplasty: a pilot randomised clinical trial. Eur Spine J. 2020;29:1580–1589. doi: 10.1007/s00586-019-06054-6.
  • 5. Borde T, Saccenti L, Li M, et al. Smart goggles augmented reality CT-US fusion compared to conventional fusion navigation for percutaneous needle insertion. Int J Comput Assist Radiol Surg. 2025;20:107–115. doi: 10.1007/s11548-024-03148-5.
  • 6. Carl B, Bopp M, Saß B, Pojskic M, Nimsky C. Augmented reality in intradural spinal tumor surgery. Acta Neurochir (Wien). 2019;161:2181–2193. doi: 10.1007/s00701-019-04005-0.
  • 7. Gadodia G, Yanof J, Hanlon A, et al. Early clinical feasibility evaluation of an augmented reality platform for guidance and navigation during percutaneous tumor ablation. J Vasc Interv Radiol. 2022;33:333–338. doi: 10.1016/j.jvir.2021.11.014.
  • 8. Bhatt FR, Orosz LD, Tewari A, et al. Augmented reality-assisted spine surgery: an early experience demonstrating safety and accuracy with 218 screws. Global Spine J. 2023;13:2047–2052. doi: 10.1177/21925682211069321.
  • 9. Park BJ, Hunt SJ, Nadolski GJ, Gade TP. Augmented reality improves procedural efficiency and reduces radiation dose for CT-guided lesion targeting: a phantom study using HoloLens 2. Sci Rep. 2020;10:18620. doi: 10.1038/s41598-020-75676-4.
  • 10. Park BJ, Hunt SJ, Martin C, et al. Augmented and mixed reality: technologies for enhancing the future of IR. J Vasc Interv Radiol. 2020;31:1074–1082. doi: 10.1016/j.jvir.2019.09.020.
  • 11. Gadodia G, Evans M, Weunski C, et al. Evaluation of an augmented reality navigational guidance platform for percutaneous procedures in a cadaver model. J Med Imaging (Bellingham). 2024;11:062602. doi: 10.1117/1.JMI.11.6.062602.
  • 12. Siragusa DA, Cardella JF, Hieb RA, et al. Requirements for training in interventional radiology. J Vasc Interv Radiol. 2013;24:1609–1612. doi: 10.1016/j.jvir.2013.08.002.
  • 13. Finos K, Datta S, Sedrakyan A, Milsom JW, Pua BB. Mixed reality in interventional radiology: a focus on first clinical use of XR90 augmented reality-based visualization and navigation platform. Expert Rev Med Devices. 2024;21:679–688. doi: 10.1080/17434440.2024.2379925.
  • 14. Krasnick BA, Sindram D, Simo K, et al. Tumor ablation using 3-dimensional electromagnetic-guided ultrasound versus standard ultrasound in a porcine model. Surg Innov. 2019;26:420–426. doi: 10.1177/1553350619825717.
  • 15. Kurup AN, Schmit GD, Morris JM, et al. Avoiding complications in bone and soft tissue ablation. Cardiovasc Intervent Radiol. 2017;40:166–176. doi: 10.1007/s00270-016-1487-y.
  • 16. Sastry AV, Swet JH, Murphy KJ, et al. A novel 3-dimensional electromagnetic guidance system increases intraoperative microwave antenna placement accuracy. HPB (Oxford). 2017;19:1066–1073. doi: 10.1016/j.hpb.2017.08.001.
