
Virtual and Augmented Reality in Interventional Radiology: Current Applications, Challenges, and Future Directions

Ahmed Elsakka 1, Brian J Park 2, Brett Marinelli 3, Nathaniel C Swinburne 1, Javin Schefflein 1

Abstract

Virtual Reality (VR) and Augmented Reality (AR) are emerging technologies with the potential to revolutionize Interventional Radiology (IR). These innovations offer advantages in patient care, interventional planning, and educational training by improving the visualization and navigation of medical images. Despite progress, several challenges hinder their widespread adoption, including limitations in navigation systems, cost, clinical acceptance, and technical constraints of AR/VR equipment. However, ongoing research holds promise, with recent advancements such as shape-sensing needles and improved organ deformation modeling. The development of deep learning techniques, particularly for medical imaging segmentation, presents a promising avenue to address existing accuracy and precision issues. Future applications of AR/VR in IR include simulation-based training, preprocedural planning, intraprocedural guidance, and increased patient engagement. As these technologies advance, they are expected to facilitate telemedicine, enhance operational efficiency, and improve patient outcomes, marking a new frontier in interventional radiology.

Keywords: Virtual reality, Augmented reality, Mixed reality, Interventional radiology, Interventional oncology

I. Introduction

Augmented reality (AR) and virtual reality (VR) are rapidly growing cutting-edge technologies that offer advanced navigational solutions. In recent years they have become increasingly important tools in interventional radiology (IR), with many potential applications.

AR and VR can be positioned on a continuum that spans from actual reality to pure virtual reality. AR merges medical imaging with the physical world, enabling healthcare professionals to interact with virtual objects within the real environment. This involves projecting preoperatively reconstructed computed tomography (CT) and magnetic resonance (MR) images onto the operator’s head-mounted display or directly onto the patient. On the other hand, VR involves the construction of a fully immersive anatomical environment using computer-generated images or graphics, allowing for activities like participating in virtual surgical simulations.1

In the field of IR, the significance of AR as a navigation tool has grown considerably. Both AR and VR have the capability to provide interactive visualization and navigation of medical images, enabling more precise and efficient planning and guidance for interventional procedures. Moreover, AR and VR offer a wide range of potential advantages, including increased accuracy, quicker diagnosis, decreased radiation exposure, and improved patient satisfaction. They can be utilized to register real-time medical images with pre-operative data, enabling physicians to monitor their position during the procedure and identify anatomical structures and vessels. This can contribute to a reduction in complications and an increase in intervention accuracy.2
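To make the registration step concrete, intensity-based alignment of an intraprocedural scan to preoperative data can be prototyped with open-source toolkits such as SimpleITK. The following is a minimal rigid-registration sketch, not the method of any system cited here; the file names and parameter choices are illustrative assumptions:

```python
import SimpleITK as sitk

# Hypothetical inputs: preoperative CT (fixed) and an intraprocedural
# cone-beam CT (moving) to be aligned to it.
fixed = sitk.ReadImage("preop_ct.nii.gz", sitk.sitkFloat32)
moving = sitk.ReadImage("intraop_cbct.nii.gz", sitk.sitkFloat32)

# Initialize a rigid (6 degree-of-freedom) transform from the image geometries.
initial = sitk.CenteredTransformInitializer(
    fixed, moving, sitk.Euler3DTransform(),
    sitk.CenteredTransformInitializerFilter.GEOMETRY,
)

reg = sitk.ImageRegistrationMethod()
reg.SetMetricAsMattesMutualInformation(numberOfHistogramBins=50)  # multimodal-friendly
reg.SetMetricSamplingStrategy(reg.RANDOM)
reg.SetMetricSamplingPercentage(0.1)
reg.SetInterpolator(sitk.sitkLinear)
reg.SetOptimizerAsGradientDescent(learningRate=1.0, numberOfIterations=100)
reg.SetOptimizerScalesFromPhysicalShift()
reg.SetInitialTransform(initial, inPlace=False)

transform = reg.Execute(fixed, moving)

# Resample the intraprocedural image into the preoperative frame for overlay.
aligned = sitk.Resample(moving, fixed, transform, sitk.sitkLinear, 0.0)
sitk.WriteImage(aligned, "intraop_in_preop_frame.nii.gz")
```

A deployed navigation system would add tracking and deformation handling on top of this rigid core, but the metric/optimizer loop above is the standard starting point.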

AR can create three-dimensional (3D) models of patient anatomy and medical images, enabling physicians to further their understanding of the patient’s anatomy and plan the intervention accordingly. Additionally, VR can be employed to construct virtual simulations of interventional procedures, allowing physicians and trainees to practice and review the procedure before performing it on an actual patient. This can increase accuracy and minimize the risk of complications by creating a safer and more realistic training environment.3

This review provides a comprehensive overview of the current applications of AR and VR in interventional radiology and their potential benefits. It also discusses the challenges, limitations, and future directions of these technologies.

II. Virtual Reality in Interventional Radiology

Virtual reality simulators have emerged as a valuable tool for training young interventional radiologists. A typical virtual reality simulator for endovascular intervention consists of a haptic device, a computer processor running simulation software, and two screens providing two-dimensional (2D) and three-dimensional (3D) displays.4 Virtual reality-based simulation technology allows for testing medical devices and new therapies, training physicians, and planning interventional procedures using patient-specific data.5

AR has been used within the field of radiology as a tool for medical students and residents to conceptualize complex anatomy. For example, AR is used in training radiology residents to understand head and neck anatomy, including complex inner ear and temporal bone structures.6 Uppot et al reported the use of immersive reality in monthly interventional radiology lectures for medical students. These interactive lectures involve students viewing interventional procedure suites through stereoscopic viewers attached to their personal smartphones. The instructors present a case and ask related questions about diagnosis, indications, contraindications, type of equipment, and type of sedation. Students are then transported to a virtual IR suite, where they can look around and select equipment to use for the particular case.6

Simulation, including VR/AR, is rapidly becoming established in some areas of IR residency and fellowship training. AR can help create immersive scenarios within a real IR suite to improve performance before complex cases or to simulate the use of new equipment before actual use. Some studies have shown better performance and shorter learning curves with virtual simulation training than with conventional methods. A comparative study evaluated the educational impact of a novel pulsatile human cadaver model and a virtual reality simulator in teaching basic endovascular surgical skills. It included 24 medical students with no previous endovascular experience, and the results showed that the learning curve for endovascular left renal artery catheterization was shorter when a virtual reality simulator (VRS) was used in combination with a pulsatile human cadaver model (PHCM) than with the PHCM alone (control group).7

A phase 1 randomized controlled prospective study evaluated the applicability of VR to the teaching of percutaneous nephrostomy involving 63 participants (31 medical students, 31 residents, and one fellow). The participants first underwent baseline testing on a VRS after which they were randomized to receive either no additional training (control arm) or two 30-min sessions of training on the VRS (intervention arm). Participants who received VRS training showed significant improvement in almost 80% of the objective and subjective parameters, whereas those in the control arm showed no significant improvement in any of the parameters. The values for variables such as fluoroscopy time, puncture attempts, and vascular injuries were lower in the intervention arm, demonstrating a real shortening of the learning curve, as well as suggesting that VR-based teaching could reduce morbidity and mortality among patients undergoing percutaneous nephrostomy.8

A prospective randomized controlled study investigated the feasibility and effectiveness of AR glasses for central venous catheter placement with 32 operators (respiratory therapists, n=19 [59%]; physicians, n=11 [34%]; sleep technicians, n=2 [6%]). While the results showed no difference in internal jugular vein cannulation time or total procedure time with AR compared to conventional instruction, the AR-trained group showed significantly greater adherence to a 24-point checklist covering procedural technique as instructed.9

In addition to simulations, AR devices offer telemedicine platforms that enable sharing environments for collaborative experiences with other users. Existing interactive platforms allow remote consultants to project live annotations into the AR display of another operator, offering remote real-time instruction or expert assistance. Additionally, AR could offer remote training by broadcasting procedures performed with AR on a large scale, allowing interventional radiologists in rural settings or developing countries to visualize live or recorded procedures performed by experts.3

AR/VR can improve patient compliance by allowing patients to learn about their own disease through a simulation of the procedure in the angiography suite. This could reduce patient anxiety during the actual procedure and help patients better understand its steps. One institution reported making 360-degree recordings of the IR suite, including the check-in desk, procedure room, and postprocedural recovery area, available to patients in advance of their scheduled appointments. By connecting stereoscopic viewers to their smartphones, patients can look around and orient themselves to the procedure suites and recovery rooms.6

VR has a potential role in preprocedural and intraprocedural planning for endovascular procedures commonly performed in IR. VR takes advanced 3D vascular reconstructions derived from cross-sectional imaging datasets and allows operators to better understand complex anatomy by manipulating the images in a VR workspace, and to replicate a device and train with it through a VR demonstration with a controller and actuators.10

A pictorial report illustrated the potential role of VR in preprocedural and intraprocedural planning for endovascular procedures involving a complex splenic artery aneurysm and hepatocellular carcinoma embolization with complex feeding vascular supply. These cases demonstrated VR's ability to provide true 3D viewing and maneuverability of tortuous vascular anatomy that can be difficult to elucidate on 2D imaging, which can reduce operator confidence.10 Furthermore, depth perception using a true 3D view would also facilitate percutaneous procedures such as those in the chest, mediastinum, liver, and pelvis. Data suggest that vessel diameter and length measurements with true 3D are more precise than with current technology, especially for tortuous arteries, allowing for appropriate device sizing and selection. VR can be combined with 3D printing of implants to ensure accuracy before expending resources to fabricate the implant.10
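To illustrate why 2D projections can mislead length measurements for tortuous vessels, the following minimal numpy sketch, using hypothetical centerline coordinates, compares a vessel's true 3D centerline length with its apparent length after projection onto a single imaging plane:

```python
import numpy as np

# Hypothetical centerline points (mm) of a tortuous vessel, ordered proximal to distal.
centerline = np.array([
    [0.0, 0.0, 0.0],
    [5.0, 2.0, 4.0],
    [9.0, 6.0, 9.0],
    [12.0, 11.0, 12.0],
])

# True 3D length: sum of segment lengths along the centerline.
true_len = np.linalg.norm(np.diff(centerline, axis=0), axis=1).sum()

# Apparent length on a 2D projection (drop the depth axis, here z).
proj_len = np.linalg.norm(np.diff(centerline[:, :2], axis=0), axis=1).sum()

print(f"3D length: {true_len:.1f} mm, projected: {proj_len:.1f} mm "
      f"({100 * (1 - proj_len / true_len):.0f}% foreshortening)")
```

Any out-of-plane course shortens the projected length, which is the geometric reason true-3D measurement is more precise for device sizing in tortuous anatomy.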

III. Augmented Reality in Interventional Radiology

The use of augmented reality (AR) for procedural planning and guidance has demonstrated the ability to improve several subjective and objective metrics in IR.

AR technology can overlay holographic 3D diagnostic images of a particular lesion directly onto the patient in the interventional suite (Figure 1). This allows operators to visualize the fused anatomy and proceed with targeting the lesion while still standing next to the patient and interacting with the surrounding environment, providing depth information that is absent from 2D angiographic images.6

Figure 1. Microwave ablation of an endophytic renal mass using enhanced visualization with augmented reality. A three-dimensional rendering of the preoperative MRI was projected and manually registered to the patient using a Microsoft HoloLens headset (Microsoft, Redmond, WA).
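Registering a hologram to the patient, as in Figure 1, amounts to estimating a rigid transform between the hologram and the patient. Manual registration is one approach; when paired landmarks are available (e.g., skin fiducials identified in both the preoperative scan and headset tracking space), the transform can instead be solved in closed form with the Kabsch algorithm. A minimal numpy sketch with hypothetical coordinates, shown for illustration rather than as the method used in the figure:

```python
import numpy as np

def rigid_landmark_registration(src, dst):
    """Least-squares rigid transform (Kabsch) mapping src points onto dst points.

    src, dst: (N, 3) arrays of paired landmarks, e.g., fiducials in
    preoperative-image space and in headset-tracking space.
    Returns R (3x3) and t (3,) such that dst ~= src @ R.T + t.
    """
    src_c = src - src.mean(axis=0)
    dst_c = dst - dst.mean(axis=0)
    U, _, Vt = np.linalg.svd(src_c.T @ dst_c)
    # Reflection guard: force a proper rotation (determinant +1).
    d = np.sign(np.linalg.det(Vt.T @ U.T))
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = dst.mean(axis=0) - R @ src.mean(axis=0)
    return R, t

# Hypothetical fiducials (mm): image space vs. headset space (pure translation here).
img = np.array([[0, 0, 0], [50, 0, 0], [0, 40, 0], [0, 0, 30]], dtype=float)
hmd = img + np.array([100.0, -20.0, 5.0])

R, t = rigid_landmark_registration(img, hmd)
fre = np.linalg.norm(img @ R.T + t - hmd, axis=1).mean()  # fiducial registration error
print(R, t, f"FRE = {fre:.3f} mm")
```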

Moreover, AR 3D spatial information can give the interventional radiologist a better understanding of complex vascular anatomy. Before the procedure, the interventional radiologist can simulate ideal fluoroscopic angles and positions that delineate tortuous vessel courses and branchpoints. During the procedure, a virtual 3D roadmap can be placed anywhere within the IR suite as a reference to augment vessel selection and catheter positioning.3 AR offers several benefits, including improved interventional accuracy, lower overall procedure time, less radiation exposure, and less operator head movement when viewing images.1,3
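One way to make the "ideal fluoroscopic angle" simulation concrete: a bifurcation is maximally splayed when the imaging axis lies along the normal of the plane containing the parent and branch vessels. A minimal numpy sketch under that assumption, with hypothetical vessel direction vectors and a simplified angle convention (vendor conventions differ):

```python
import numpy as np

# Hypothetical direction vectors (patient coordinates: x=left, y=posterior, z=cranial).
parent = np.array([0.0, 0.2, 1.0])   # parent vessel course
branch = np.array([0.6, 0.1, 0.8])   # branch vessel course

# Viewing axis that splays the bifurcation: normal of the plane spanned by the vessels.
n = np.cross(parent, branch)
n /= np.linalg.norm(n)

# Convert the viewing axis into approximate C-arm angulations (simplified convention).
lao_rao = np.degrees(np.arctan2(n[0], n[1]))   # rotation within the axial plane
cran_caud = np.degrees(np.arcsin(n[2]))        # tilt out of the axial plane
print(f"suggested angulation: {lao_rao:.0f} deg LAO/RAO, {cran_caud:.0f} deg cranial/caudal")
```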

AR in interventional oncology (IO)

Interventional oncology deals with the diagnosis and treatment of cancer and cancer-related problems using a targeted, minimally invasive approach performed under image guidance. Precise image guidance is critical to the success of interventional oncology procedures.11

AR allows the interventional radiologist to navigate 3D virtual objects superimposed upon real-world objects, all on a single screen. This can increase precision and improve the ergonomics and performance of interventional radiologists, as demonstrated in an experimental study of a novel system that used augmented reality for needle guidance in liver thermal ablation.12 The novel AR system was evaluated in three experimental models: a male anthropomorphic trunk phantom, a porcine model, and a cadaver with liver metastasis, with a reported targeting accuracy of < 5 mm in all cases.12

Park et al evaluated the utility of visualizing preprocedural MR images in 3D space using AR before transarterial embolization (TAE) in a preclinical model. The study compared prospective TAE with AR (n = 12) versus TAE without AR, comprising prospective controls (n = 16) and an additional cohort of 15 retrospective cases. A significant 48% reduction in fluoroscopy time was recorded with AR compared with the combined prospective and retrospective controls. A 27% reduction in total catheterization time, from 42.7 minutes to 31.0 minutes, was also observed with AR planning in the prospective comparison.13

AR utilization may help reduce radiation exposure and contrast injection dosage during endovascular procedures in the future. Kuhlemann et al developed a real-time navigation framework that provides a 3D holographic view of the vascular system on a patient phantom, enabling real-time endovascular guidance without radiation exposure.14 Racadio et al compared the navigational accuracy and radiation dose during needle localization of targets for AR with and without motion compensation (MC) versus cone-beam computed tomography (CT) with real-time fluoroscopy navigation in a pig model. The results showed that an AR C-arm system reduces radiation dose by approximately 40%–50% during needle localization of targets in pigs compared with standard CT fluoroscopy while preserving accuracy.15

The clinical feasibility and potential benefits of using true 3D AR with a head-mounted display (HMD) for percutaneous ablation of abdominal soft tissue tumors have been evaluated in 12 patients. In 10 cases, the intraprocedural holographic guidance agreed with the standard imaging guidance. In 5 of the second-stage cases, the mean distance between the electromagnetic sensors and the AR coordinates was 1.2 mm, and the mean distance between the anatomic markers and the AR coordinates was 3.3 mm.16

AR in non-oncologic interventions

AR offers a potential benefit in the diagnosis and treatment planning for various non-oncologic interventions, as well.

Current guidance techniques for transjugular intrahepatic portosystemic shunt (TIPS) creation, including wedge hepatic venography with CO2, intravascular ultrasound, and image fusion guidance, have limitations such as the high cost of intravascular ultrasound equipment and the limited spatial resolution of CO2 venography.17 An AR-based navigation system can holographically project a reconstructed portal venous tree model onto the live scene and guide portal vein puncture. Yang et al17 described a novel AR system to guide portal vein puncture during TIPS creation in liver phantom and canine preclinical models. The success rate of AR-guided portal vein access was 100% in both the liver phantom and the 9 canine subjects, requiring 2 attempts in 7 animals and 1 attempt in the remaining 2 animals.17 The authors described several potential advantages of the AR system over currently used techniques: eliminating the need for real-time fluoroscopy, thereby reducing radiation exposure and contrast agent use; providing 3D images of both the hepatic and portal veins; and the ability to highlight other critical structures such as large hepatic artery branches, the gallbladder, tumors, and widened liver fissures, thereby avoiding potential complications during the puncture.17

Transperineal template biopsy of the prostate has shown superior sensitivity and a lower rate of sepsis compared with the more commonly offered transrectal ultrasound-guided prostate biopsy (TRUS biopsy), both for diagnosis and for minimally invasive prostate interventions.18 A grid template is typically used to guide the needle; however, this method has limited positioning resolution and lacks needle angulation options referenced to ultrasound or TRUS-MRI fusion targets. Li et al described a novel AR system using HoloLens to facilitate free-hand, real-time needle guidance during transperineal procedures. The image overlay error was 1.29 ± 0.57 mm, and the needle targeting error was 2.13 ± 0.52 mm. Planned-path guidance placements showed error similar to free-hand guidance. The authors concluded that AR support for free-hand lesion targeting is feasible and may provide more flexibility than grid-based methods.19

Ergonomics and Workflow

Interventional radiologists often face ergonomic challenges from prolonged neck angulation and constant shifts in focus among the patient, imaging guidance monitors (including the ultrasound screen), fluoroscopy system controls, and/or ablation generators. Ideal positioning of imaging monitors and x-ray system controls in many IR and hybrid OR cases may not be possible because of physical obstruction from the C-arm and ancillary equipment. Poor ergonomics can contribute to musculoskeletal disorders and physician burnout, with the incidence of neck pain reported to be as high as 24% among interventionalists.20

Mixed-reality headsets can deploy virtual 2D monitors anywhere within the operator's environment, ideally directly in front of the operator in line with the shoulders. This allows mirroring of ultrasound or C-arm images to an augmented reality virtual monitor (Figure 2), so the operator can maintain focus on the task at hand while reducing gazes away from the procedural field.

Figure 2. Real-time fluoroscopy feed streamed into a Microsoft HoloLens 2 headset (Microsoft, Redmond, WA).

Park et al described the feasibility of using mixed-reality headsets in a retrospective case series of 3 patients undergoing radiofrequency (RF) ablation of benign solid thyroid nodules. Thyroid RF ablation was performed successfully and uneventfully in all 3 cases. The reported mean overall system latency was 266 ± 43 milliseconds; however, the operator did not notice a clinically significant lag during the procedures.20
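Latency figures like the one above can be approximated in a streaming pipeline by stamping frames at capture and differencing at display. The cited study does not detail its measurement method; the following is a minimal, hypothetical sketch that assumes the capture and display callbacks share a clock:

```python
import time
from collections import deque
from dataclasses import dataclass

@dataclass
class Frame:
    """Hypothetical frame object passed through the streaming pipeline."""
    data: bytes
    capture_ts: float = 0.0

latencies_ms = deque(maxlen=300)  # rolling window of recent samples

def on_capture(data: bytes) -> Frame:
    # Stamp the frame with a monotonic clock on the capture side.
    return Frame(data=data, capture_ts=time.monotonic())

def on_display(frame: Frame) -> None:
    # At the headset render callback, measure capture-to-display delay.
    latencies_ms.append((time.monotonic() - frame.capture_ts) * 1000.0)

# Toy run: simulate a frame spending ~50 ms in encode/transmit/decode.
frame = on_capture(b"\x00" * 1024)
time.sleep(0.05)
on_display(frame)
print(f"pipeline latency ~ {latencies_ms[-1]:.0f} ms")
```

True glass-to-glass latency across separate devices additionally requires synchronized clocks or an external camera observing both displays.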

IV. Challenges and Limitations of Virtual and Augmented Reality in IR

Many AR/VR systems are packaged with a navigation system, so many challenges of AR/VR are inherent to any navigation system, whether 2D or 3D. These include patient and respiratory motion, sterile registration, soft tissue and organ deformation, and needle bending. Research is ongoing to solve some of these issues. Lin et al developed a shape-sensing needle and reduced targeting error by 26% compared to rigid needle assumptions.21 Modeling of organ deformation is possible but computationally intensive.22,23 Additionally, patients are typically scanned preoperatively in the supine position but may be positioned in non-supine orientations on the interventional table, subjecting the preoperative scan to gravitational and positional effects. Respiratory motion can be minimized with respiratory gating or high-frequency jet ventilation. Moreover, navigation systems are expensive, and their use is not currently reimbursable for IR procedures in the United States. However, cost savings could come from time saved through more efficient procedures and fewer complications.
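One common way to approximate soft tissue and organ deformation from sparse landmarks is a smooth interpolant such as a thin-plate spline, whose cost grows quickly with the number of control points (one source of the computational burden noted above). A minimal sketch using SciPy, with hypothetical landmark coordinates:

```python
import numpy as np
from scipy.interpolate import RBFInterpolator

# Hypothetical paired landmarks (mm): positions in the supine preoperative CT
# and the same landmarks localized intraoperatively after repositioning.
preop = np.array([[10, 20, 30], [60, 25, 28], [35, 70, 40],
                  [50, 50, 10], [20, 60, 55]], dtype=float)
intraop = preop + np.array([[2, -3, 1], [1, -2, 2], [3, -4, 0],
                            [2, -3, 1], [2, -2, 2]], dtype=float)

# Thin-plate spline maps any preoperative point to its estimated deformed position.
tps = RBFInterpolator(preop, intraop, kernel="thin_plate_spline")

# Warp a planned target defined on the preoperative scan into table coordinates.
target_preop = np.array([[40.0, 45.0, 35.0]])
print(tps(target_preop))  # estimated intraoperative target position
```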

Another challenge for AR/VR and enhanced 3D visualization is clinical adoption. The majority of studies are proof-of-concept or feasibility analyses, and more robust trials will be needed to advocate for widespread use. IRs also undergo training in diagnostic radiology and routinely interpret 2D cross-sectional imaging. As a result, IRs may be better equipped than other surgical specialists to reconstruct 3D images mentally, and may be less willing to adopt AR/VR technology as they gain experience with mental reconstruction during procedures. As alluded to previously, the benefits of AR/VR technology may therefore be most applicable to less experienced IRs or IRs in training.

Portable AR headsets need to overcome several technical limitations before routine procedural use. Battery life currently limits their use to procedures shorter than 2–3 hours. Brightness and resolution of the images are important for diagnostic purposes and procedural decision-making, yet no standards currently exist outlining technical factors for headset displays in radiology. Real-time latencies should be low enough to accommodate high pulse-rate fluoroscopy and digital subtraction angiography without lag. Leaded lenses will also need to be built into headsets for radiation safety. As the technology advances, however, many of these factors will likely be overcome.
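As a rough latency budget, overlays should ideally refresh within one inter-frame interval of the underlying image stream; the arithmetic for common pulse rates:

```python
# Inter-frame budget (ms) for common fluoroscopy/DSA pulse rates.
for fps in (7.5, 15, 30):
    print(f"{fps:>4} frames/s -> {1000 / fps:.1f} ms per frame")
# 7.5 -> 133.3 ms, 15 -> 66.7 ms, 30 -> 33.3 ms
```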

V. Future Directions for Virtual and Augmented Reality in IR

Augmented reality and virtual reality are emerging as valuable tools in interventional radiology, offering advanced navigational solutions and enhancing various aspects of patient care (Table 1). These technologies provide interactive visualization and navigation of medical images, allowing more precise planning and guidance of interventional procedures. AR combines medical imaging with the physical world, enabling healthcare providers to interact with virtual objects in the real-world environment, while VR creates immersive anatomical environments using computer-generated images or graphics.

Table 1.

Summary of current and potential roles of AR and VR in IR

Role | Description and representative examples

Trainee education
  • Teaching basic IR procedures such as percutaneous nephrostomy and central venous catheter placement.8,9

Patient education
  • Allowing patients to learn about IR procedures through simulation on their own anatomy.6

Improve workflow
  • Providing telemedicine platforms for remote real-time consultation or expert assistance.3

Endovascular planning
  • Preprocedural and intraprocedural planning for endovascular procedures involving complex anatomy, such as splenic artery aneurysm and hepatocellular carcinoma embolization.10

Interventional oncology
  • Reducing radiation exposure and contrast injection dosage.14,15
  • Providing high precision and targeting accuracy in IO procedures such as liver thermal ablation and percutaneous ablation of soft tissue tumors.12,16

Non-oncologic interventions
  • Real-time needle guidance for percutaneous and endovascular procedures, such as transperineal prostate biopsy and TIPS creation.17,19

Improve ergonomics
  • Mirroring ultrasound or C-arm images to an augmented reality virtual monitor, such as using a mixed-reality headset during ultrasound-guided thyroid RF ablation.20

At their core, the fidelity of both AR and VR depends on the accurate delineation of anatomic structures. Intuitively, it follows that high-resolution, accurate organ and vessel segmentation will afford greater precision to users performing interventions with VR and AR. Furthermore, accurately segmented organs and structures can enable higher-quality, landmark-based co-registration with previously acquired diagnostic imaging, thereby better accounting for the organ deformation that currently limits the clinical translatability of AR and VR. The advancement of deep learning (DL) model architectures specifically for medical imaging segmentation (nnU-Net)24 and the advent of open-source, pre-trained models for anatomic segmentation may provide a means to overcome the accuracy and precision challenges currently facing AR and VR. There are also emerging DL models for deformable co-registration.25 If appropriately trained, these DL models can be deployed with edge-computing capabilities to allow highly accurate intraprocedural segmentation and co-registration, where prior non-machine-learning approaches were limited by long run times not amenable to the intraprocedural setting.26
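As a sketch of how a pretrained segmentation network could be run intraprocedurally, the following uses the open-source MONAI framework; the network configuration, weights file, and tensor shapes are illustrative assumptions rather than any cited system's pipeline (nnU-Net, for instance, ships its own training and inference tooling):

```python
import torch
from monai.networks.nets import UNet
from monai.inferers import sliding_window_inference

# Hypothetical 3D U-Net; a real deployment would load a task-specific
# architecture and weights (e.g., an nnU-Net-trained model exported to PyTorch).
model = UNet(
    spatial_dims=3,
    in_channels=1,          # single-channel CT
    out_channels=2,         # background vs. target organ
    channels=(16, 32, 64, 128),
    strides=(2, 2, 2),
)
model.load_state_dict(torch.load("organ_seg.pt"))  # hypothetical weights file
model.eval()

# volume: preprocessed CT as a (batch, channel, D, H, W) tensor.
volume = torch.randn(1, 1, 128, 192, 192)  # stand-in for a real scan

with torch.no_grad():
    # Tile the volume into overlapping patches to fit GPU memory.
    logits = sliding_window_inference(
        volume, roi_size=(96, 96, 96), sw_batch_size=2, predictor=model
    )
mask = logits.argmax(dim=1)  # voxelwise label map for AR/VR rendering
```

On edge hardware, inference like this can complete in seconds per volume, which is what makes intraprocedural segmentation plausible where classical deformable methods were too slow.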

As research continues, the utility and integration of these fields will grow at every step of medical training and practice. Interventional trainees will be able to practice and review procedures before performing them on patients, as simulation has been shown to improve performance and shorten learning curves compared to conventional training methods. Medical schools and residency programs can integrate these tools into their didactic curricula and clinical rotations, enabling better and safer hands-on experience. Experienced clinicians can also benefit from augmented and virtual reality, as these technologies can be used for preprocedural planning and intraprocedural guidance, helping to visualize complex anatomy and assisting in device selection and sizing. An important secondary benefit of this assistance is decreased operative time, which reduces radiation exposure for operators and patients, decreases patient sedation time, and improves overall operational efficiency. With the advancement of AR and VR devices, there will be improvements in user-friendliness and comfort, enabling seamless integration into various educational and practice settings.

AR and VR extend not only to the procedural component of medicine, but also to the patient interaction itself. These tools can facilitate telemedicine by allowing remote experts to provide real-time instruction and anatomic navigation as well as problem solving when needed. During clinic visits, patients can see simulations of the proposed procedures, reducing anxiety and increasing their involvement in their own treatment planning.

AR and VR have emerged as promising tools in interventional radiology. While challenges remain, the continued development and integration of AR and VR technologies show great potential for the future of interventional radiology, ultimately leading to better patient outcomes.

Funding Sources

No relevant funding sources for any author.

References

1. Belmustakov S, Bailey C, Weiss CR. Augmented and virtual reality navigation for interventions in the musculoskeletal system. Curr Radiol Rep 2018; 6:1–10.
2. Park BJ, Hunt SJ, Nadolski GJ, et al. Augmented reality improves procedural efficiency and reduces radiation dose for CT-guided lesion targeting: a phantom study using HoloLens 2. Sci Rep 2020; 10(1):1–8.
3. Park BJ, Hunt SJ, Martin C, et al. Augmented and Mixed Reality: Technologies for Enhancing the Future of IR. J Vasc Interv Radiol 2020; 31(7):1074–1082.
4. Avramov P, Avramov M, Juković M, et al. Virtual simulation as a learning method in interventional radiology. Med Pregl 2013; 66(7–8):335–340.
5. Anderson JH, Raghavan R. A vascular catheterization simulator for training and treatment planning. J Digit Imaging 1998; 11(3 Suppl 1):120–123.
6. Uppot RN, Laguna B, McCarthy CJ, et al. Implementing Virtual and Augmented Reality Tools for Radiology Education and Training, Communication, and Clinical Care. Radiology 2019; 291(3):570–580.
7. Nesbitt CI, Tingle SJ, Williams R, et al. Educational Impact of a Pulsatile Human Cadaver Circulation Model for Endovascular Training. Eur J Vasc Endovasc Surg 2019; 58(4):602–608.
8. Knudsen BE, Matsumoto ED, Chew BH, et al. A randomized, controlled, prospective study validating the acquisition of percutaneous renal collecting system access skills using a computer based hybrid virtual reality surgical simulator: phase I. J Urol 2006; 176(5):2173–2178.
9. Huang CY, Thomas JB, Alismail A, et al. The use of augmented reality glasses in central line simulation: "see one, simulate many, do one competently, and teach everyone". Adv Med Educ Pract 2018; 9:357–363.
10. Mohammed MAA, Khalaf MH, Kesselman A, Wang DS, Kothary N. A Role for Virtual Reality in Planning Endovascular Procedures. J Vasc Interv Radiol 2018; 29(7):971–974.
11. Breuer JA, Ahmed KH, Al-Khouja F, et al. Interventional oncology: new techniques and new devices. Br J Radiol 2022; 95(1138):20211360.
12. Solbiati M, Passera KM, Rotilio A, et al. Augmented reality for interventional oncology: proof-of-concept study of a novel high-end guidance system platform. Eur Radiol Exp 2018; 2:1–9.
13. Park BJ, Perkons NR, Profka E, et al. Three-Dimensional Augmented Reality Visualization Informs Locoregional Therapy in a Translational Model of Hepatocellular Carcinoma. J Vasc Interv Radiol 2020; 31(10):1612–1618.
14. Kuhlemann I, Kleemann M, Jauer P, et al. Towards X-ray free endovascular interventions - using HoloLens for on-line holographic visualisation. Healthc Technol Lett 2017; 4(5):184–187.
15. Racadio JM, Nachabe R, Homan R, et al. Augmented Reality on a C-Arm System: A Preclinical Assessment for Percutaneous Needle Localization. Radiology 2016; 281(1):249–255.
16. Gadodia G, Yanof J, Hanlon A, et al. Early Clinical Feasibility Evaluation of an Augmented Reality Platform for Guidance and Navigation during Percutaneous Tumor Ablation. J Vasc Interv Radiol 2022; 33(3):333–338.
17. Yang J, Zhu J, Sze DY, et al. Feasibility of Augmented Reality-Guided Transjugular Intrahepatic Portosystemic Shunt. J Vasc Interv Radiol 2020; 31(12):2098–2103.
18. Thomson A, Li M, Grummet J, Sengupta S. Transperineal prostate biopsy: a review of technique. Transl Androl Urol 2020; 9(6):3009–3017.
19. Li M, Mehralivand S, Xu S, et al. HoloLens augmented reality system for transperineal free-hand prostate procedures. J Med Imaging (Bellingham) 2023; 10(2):025001.
20. Park BJ, Shah S, Konik D, et al. An Ergonomic Holographic Procedural Monitor for Thyroid Radiofrequency Ablation Using a Mixed-Reality Headset. J Vasc Interv Radiol 2023; 34(2):307–310.
21. Lin MA, Siu AF, Bae JH, et al. HoloNeedle: Augmented Reality Guidance System for Needle Placement Investigating the Advantages of Three-Dimensional Needle Shape Reconstruction. IEEE Robot Autom Lett 2018; 3(4).
22. Tam GK, Cheng ZQ, Lai YK, et al. Registration of 3D point clouds and meshes: a survey from rigid to nonrigid. IEEE Trans Vis Comput Graph 2013; 19(7):1199–1217.
23. Si W, Liao X, Qian Y, et al. Mixed reality guided radiofrequency needle placement: A pilot study. IEEE Access 2018; 6:31493–31502.
24. Isensee F, Jaeger PF, Kohl SAA, et al. nnU-Net: a self-configuring method for deep learning-based biomedical image segmentation. Nat Methods 2021; 18(2):203–211.
25. Tian L, Greer H, Vialard FX, et al. GradICON: Approximate diffeomorphisms via gradient inverse consistency. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition 2023; 18084–18094.
26. Avants BB, Tustison N, Song G. Advanced normalization tools (ANTS). Insight J 2009; 2(365):1–35.
