Author manuscript; available in PMC: 2023 Feb 14.
Published in final edited form as: Proc SPIE Int Soc Opt Eng. 2022 Apr 4;12034:120342G. doi: 10.1117/12.2611590

An Augmented Reality-Assisted Visualization System for Potential Applications in Prostate Biopsy

Patric Bettati 1,2, James D Dormer 1,2, Maysam Shahedi 1,2, Baowei Fei 1,2,3,*
PMCID: PMC9928501  NIHMSID: NIHMS1872590  PMID: 36793657

Abstract

Ultrasound-guided biopsy is widely used for disease detection and diagnosis. We plan to register preoperative imaging, such as positron emission tomography / computed tomography (PET/CT) and/or magnetic resonance imaging (MRI), with real-time intraoperative ultrasound imaging for improved localization of suspicious lesions that may not be visible on ultrasound but are visible on other imaging modalities. Once the image registration is completed, we will combine the images from two or more imaging modalities and use a Microsoft HoloLens 2 augmented reality (AR) headset to display three-dimensional (3D) segmented lesions and organs from previously acquired images alongside real-time ultrasound images. In this work, we are developing a multi-modal, 3D augmented reality system for potential use in ultrasound-guided prostate biopsy. Preliminary results demonstrate the feasibility of combining images from multiple modalities into an AR-guided system.

Keywords: Augmented reality (AR), prostate cancer biopsy, HoloLens, PET/CT, ultrasound imaging

1. INTRODUCTION

Prostate cancer is the second leading cause of cancer death in men in the US1. Transrectal ultrasound (TRUS)-guided biopsy is the current clinical standard for prostate cancer diagnosis2. Twelve-core biopsies are performed systematically across the whole prostate. Due to the low sensitivity of ultrasound imaging for differentiating between healthy and cancerous tissue, we incorporated other imaging modalities, such as positron emission tomography / computed tomography (PET/CT) and/or magnetic resonance imaging (MRI), into TRUS-guided prostate biopsy, an approach called targeted biopsy of the prostate3,4. Still, the small volume and internal location of the prostate make accurate targeting difficult, increasing the possibility of false negatives and potentially requiring multiple biopsy procedures per patient, which increases patient discomfort. In this study, we propose an augmented reality (AR) workflow that could allow physicians to perform prostate biopsies faster and more accurately.

Wearable AR is still a nascent technology that has mainly been used in industrial or novelty applications5,6. AR allows the user to interact with virtual objects superimposed on the real world. Because of the broad potential of AR in clinical procedures, AR systems have been developed within the past few years to aid in orthopedic, oral, and neurological surgery7,8,9. To improve prostate biopsy core-needle accuracy and decrease operation time, we present a system for visualizing multimodal images in an AR-guided biopsy system, which, to our knowledge, will be the first to provide a physician with real-time data through an AR headset for use in prostate biopsies.

The benefits of AR systems over traditional systems are the focus of many current studies10,11,12. Prior work on the potential uses of AR systems includes training simulators and visualization assistance during clinical procedures13,14. While these systems demonstrate the value of AR research, real-time multimodal registration methods are largely absent from the literature, calling for further investigation. Combining an intraoperative imaging modality with preoperative imaging would alleviate many issues that arise when relying on a single modality. For example, registering intraoperative ultrasound images to preoperative images would mitigate misalignment resulting from patient shift, while still providing greater lesion contrast than intraoperative ultrasound alone.

Currently, for prostate biopsies, augmented reality is often researched in conjunction with robotic assistance15. While these studies show promise, the use of robotic systems is extremely expensive, requires specific training, and is unavailable in many parts of the world. Due to these limitations, we believe it is necessary to develop lower cost AR prostate biopsy systems.

In the current work, we plan to incorporate a Microsoft HoloLens 2 AR headset into the biopsy procedure to provide the physician with additional information and spatial awareness. The HoloLens 2 headset can render three-dimensional (3D) holograms superimposed onto the real world and allows the user to interact with the holograms. In this study, we focus on feasibility and phantom studies.

2. METHODS

Our proposed AR system begins with the patient undergoing a routine preoperative PET/CT or MRI. We then manually segment the organs of interest in the male pelvic region. Generally, these include the prostate, bones, lesions, and surrounding skin; additional structures can be segmented if needed. These 3D segmentations are then imported into the Unity game engine. Through Unity, we can affix the holograms to real-world anchors and modify their behavior through user interactions and user interface (UI) elements16. We plan to register the preoperatively segmented organs and lesions to real-time ultrasound images obtained during the biopsy. The registered ultrasound images could then be displayed by the HoloLens 2 in real time, allowing the physician to observe the lesion superimposed on the ultrasound images projected into the headset.

We begin with a patient undergoing a standard PET/CT or MRI imaging session. From this, we export the DICOM files and load them into the 3D Slicer software, which allows us to perform the segmentations required for Unity integration17. We generally use Hounsfield unit (HU) thresholding to obtain the bone and skin segmentations, reserving manual segmentation for the lesion, the prostate, and any other structures of interest.
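For illustration only, the following minimal C# sketch shows the idea behind HU thresholding on a CT volume stored as a flat array of HU values. In our workflow this step is performed interactively in 3D Slicer; the class, method, and threshold value below are illustrative assumptions (the threshold is a common heuristic for bone, not the exact value used here).

```csharp
// Minimal illustration of Hounsfield-unit (HU) thresholding to produce a binary
// bone mask from a CT volume stored as a flat array of HU values.
// The threshold is a common heuristic, not the value used in our pipeline.
public static class HuThreshold
{
    public static bool[] BoneMask(short[] huVolume, short lowerThresholdHu = 300)
    {
        var mask = new bool[huVolume.Length];
        for (int i = 0; i < huVolume.Length; i++)
        {
            // Voxels at or above the threshold are labeled as bone.
            mask[i] = huVolume[i] >= lowerThresholdHu;
        }
        return mask;
    }
}
```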

Upon import into Unity, the segmented organs and suspicious lesions are scaled and placed in their proper positions within the virtual scene. To provide a real-world anchor for the holograms, we use the Vuforia integration for Unity. Vuforia is a software package that attaches holograms to printed 2D tracking patterns (hereafter called ‘markers’), providing a real-world anchor for the virtual holograms18. We can then attach C# scripts to the segmentation models within Unity to control their behavior in various ways.
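As an example of the kind of behavior script referenced above, the following C# sketch toggles the renderers of the segmentation models when the marker is found or lost. It assumes the two public methods are wired to the target-found and target-lost callbacks exposed by Vuforia's event handler on the image target; the class and field names are illustrative, not the deployed scripts.

```csharp
using UnityEngine;

// Sketch of a behavior attached to the segmentation models: show the holograms
// when the printed marker is detected and hide them when tracking is lost.
public class HologramAnchorController : MonoBehaviour
{
    [SerializeField] private Renderer[] segmentationRenderers;  // prostate, lesion, bone, skin

    // Wire to the marker's target-found callback.
    public void ShowHolograms()
    {
        foreach (var r in segmentationRenderers) r.enabled = true;
    }

    // Wire to the marker's target-lost callback.
    public void HideHolograms()
    {
        foreach (var r in segmentationRenderers) r.enabled = false;
    }
}
```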

Examples of these scripted behaviors include moving a hologram with the hands, resizing a hologram, and having the holograms follow the user as they move around. We additionally developed a UI for the headset that lets physicians customize how they view the holograms. The proposed UI allows the user to select a segmentation, change its color or transparency, and choose which segmentations are displayed.
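The sketch below illustrates, under our assumptions about the UI layout, how a slider and a toggle could drive the transparency and visibility of one segmentation (as in Figure 2). The component and field names are hypothetical, and the material must use a transparency-capable shader for the alpha change to take effect.

```csharp
using UnityEngine;
using UnityEngine.UI;

// Illustrative UI behavior: a slider controls the transparency of a segmentation
// (e.g., the skin, so the prostate underneath becomes visible) and a toggle
// shows or hides it entirely.
public class SegmentationVisibilityUI : MonoBehaviour
{
    [SerializeField] private Renderer segmentationRenderer;  // the segmentation to control
    [SerializeField] private Slider alphaSlider;             // 0 = invisible, 1 = opaque
    [SerializeField] private Toggle visibleToggle;

    private void Awake()
    {
        alphaSlider.onValueChanged.AddListener(SetAlpha);
        visibleToggle.onValueChanged.AddListener(SetVisible);
    }

    // Requires a material whose shader supports transparency.
    private void SetAlpha(float alpha)
    {
        Color c = segmentationRenderer.material.color;
        c.a = Mathf.Clamp01(alpha);
        segmentationRenderer.material.color = c;
    }

    private void SetVisible(bool visible)
    {
        segmentationRenderer.gameObject.SetActive(visible);
    }
}
```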

In our lab, we use the Artemis ultrasound and biopsy system (Eigen, Grass Valley, CA) to obtain 3D ultrasound images that can be registered with the PET/CT or MR images. After registration, a lesion mask can be overlaid on the ultrasound images and visualized through the HoloLens 2, allowing the physician to view the lesion superimposed on real-time ultrasound data.
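The registration itself is outside the scope of this sketch, but the following C# example shows one way a registered lesion mask could be blended onto an ultrasound frame for display. It assumes the mask has already been resampled to the ultrasound pixel grid (indexed [x, y]) and that the ultrasound texture is CPU-readable; it is not the display path used by our system.

```csharp
using UnityEngine;

// Blend a binary lesion mask over an ultrasound frame so the lesion appears
// tinted in the displayed image. Assumes the mask is already registered and
// resampled to the ultrasound frame grid.
public static class MaskOverlay
{
    public static Texture2D Overlay(Texture2D ultrasoundFrame, bool[,] lesionMask, Color tint)
    {
        var result = new Texture2D(ultrasoundFrame.width, ultrasoundFrame.height);
        for (int y = 0; y < ultrasoundFrame.height; y++)
        {
            for (int x = 0; x < ultrasoundFrame.width; x++)
            {
                Color px = ultrasoundFrame.GetPixel(x, y);
                // Tint pixels inside the lesion mask; leave the rest unchanged.
                result.SetPixel(x, y, lesionMask[x, y] ? Color.Lerp(px, tint, tint.a) : px);
            }
        }
        result.Apply();
        return result;
    }
}
```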

Additionally, during our tests, we observed that different Vuforia fiducial markers affected the robustness of tracking by the HoloLens 2 cameras to varying degrees. We conducted experiments to examine two factors with a major influence on fiducial marker tracking: the image resolution of the marker and the size of the marker. As a standard, we used a pre-built third-party Vuforia marker19. We then applied transforms to this marker to obtain three variants: a ¾-resolution marker of the original size, a ½-resolution marker of the original size, and a marker of standard resolution but with half the width and height, i.e., a ¼-size marker. After a marker is placed in the field of view, there is an acquisition time before the HoloLens 2 detects it, and we measured the time that elapsed before the hologram appeared above the fiducial. This measurement was repeated ten times (n = 10) per fiducial marker.
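A minimal sketch of the timing measurement follows: a stopwatch is started when the marker is placed in the field of view and stopped when the target-found callback fires and the hologram appears. The class and method names are illustrative; in our experiments the trial start was triggered manually.

```csharp
using System.Diagnostics;
using UnityEngine;

// Illustrative timer for measuring fiducial acquisition time on the HoloLens 2.
public class MarkerAcquisitionTimer : MonoBehaviour
{
    private readonly Stopwatch stopwatch = new Stopwatch();

    // Call when the marker is placed in the headset's field of view.
    public void BeginTrial()
    {
        stopwatch.Restart();
    }

    // Wire to the marker's target-found callback, i.e., when the hologram appears.
    public void OnTargetFound()
    {
        stopwatch.Stop();
        UnityEngine.Debug.Log($"Acquisition time: {stopwatch.Elapsed.TotalSeconds:F2} s");
    }
}
```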

3. RESULTS

There have been several challenges in using the 2D Vuforia markers. The printed markers are large, flexible, and difficult for the HoloLens 2’s cameras to view at oblique angles. To address this, we attempted to use a Vuforia-supported 3D marker in the form of a cylinder. In theory, a 3D tracking marker could be adapted into a belt or strap worn by the patient. This would anchor the registration directly to the patient’s torso, reducing calibration time and potentially increasing registration accuracy. Currently, these 3D markers are only useful for static tracking, where the holograms are permanently affixed to the real-world marker and do not need to move; this is due to a scaling issue between Unity and Vuforia that we are working to resolve.

Despite these challenges, we are able to faithfully visualize patient CT data in AR and attach the holograms to real-world trackers. Through scripting, the user can interact with the holograms in virtual and real space using only their hands, which, combined with our UI, makes the system highly customizable. Moreover, the system is modular with respect to the type of lesion being targeted: with minor changes, it could be repurposed from a prostate biopsy system to many other biopsy or visualization systems. This modularity comes primarily from the choice of segmentations; in principle, given segmentations of the entire body and each organ, we could recreate the entire patient’s body in AR and let the physician choose which structures to view through the headset via the UI.

The results of our fiducial acquisition experiment are shown in Figure 4 and Table 1. In our experiments, the largest effect on acquisition time came from the reduction in size; however, this may be because the size was reduced to a quarter of the original. We saw little difference between the standard and ¾-resolution markers, but from standard to ½ resolution the mean acquisition time nearly doubled and the standard deviation more than doubled.

Figure 4. A graph displaying the effect of different size and resolution markers on fiducial acquisition time.

Table 1: Data showing the effects of changing size and resolution on fiducial acquisition time.

Testing Condition                  | Mean Time to Acquire Marker (s) | Median Time to Acquire Marker (s) | Standard Deviation (s)
Standard size, standard resolution | 2.37  | 2.00 | 0.73
Standard size, 3/4 resolution      | 2.47  | 2.43 | 0.75
Standard size, 1/2 resolution      | 4.02  | 3.43 | 2.05
1/4 size, standard resolution      | 11.09 | 9.56 | 5.66
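For completeness, the following small C# sketch shows how the summary statistics in Table 1 (mean, median, and sample standard deviation over the ten trials per marker) can be computed; the helper name is ours, not part of the deployed system.

```csharp
using System;
using System.Linq;

// Compute mean, median, and sample standard deviation of the per-trial
// acquisition times (in seconds) for one marker condition.
public static class AcquisitionStats
{
    public static (double mean, double median, double std) Summarize(double[] timesSeconds)
    {
        double mean = timesSeconds.Average();

        double[] sorted = timesSeconds.OrderBy(t => t).ToArray();
        int n = sorted.Length;
        double median = n % 2 == 1 ? sorted[n / 2] : (sorted[n / 2 - 1] + sorted[n / 2]) / 2.0;

        // Sample standard deviation (n - 1 in the denominator).
        double std = Math.Sqrt(timesSeconds.Sum(t => (t - mean) * (t - mean)) / (n - 1));

        return (mean, median, std);
    }
}
```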

4. DISCUSSION AND CONCLUSION

Currently, many AR systems in development are 2D systems relying on a screen to provide information to the physician20. Moving forward, however, we believe that the extra dimension provided by a headset would prove invaluable not only for biopsy procedures but also for many other clinical operations, provided that the headset hardware can deliver the precision and accuracy these procedures require16. Moreover, using a headset instead of a screen would allow voice or touch commands from the physician and would eliminate the need to periodically shift attention away from the patient toward a screen, strengthening spatial awareness16.

Current AR headsets have multiple limitations for interventional procedures. Chief among them is that the connection between Unity and the HoloLens 2 is not designed to stream real-time processed data to the headset, particularly because the connection is wireless. This greatly complicates the process; however, we are confident it can be resolved. We are also considering using a motion capture system instead of the real-world Vuforia trackers to increase accuracy and to work around the 3D Vuforia tracking issues.

Additionally, our marker acquisition experiment showed that marker resolution and size can strongly affect acquisition time on the HoloLens 2. Our data show a slight increase in acquisition time at lower resolutions and a much larger increase at the reduced size. Further experiments are needed to determine the ideal complexity, resolution, and size of fiducial markers for use in different projects.

Augmented reality has the potential to change how clinical procedures are performed in the future. Through our proposed AR workflow and visualization system, clinicians would be able to view and interact with personalized patient holograms prior to or during interventional procedures.

Figure 1. An example of CT images of the pelvic region and the segmented organs and tissue of the patient.

Figure 2. Examples of the UI changing the transparency of a segmented part to allow the user to see the pink prostate underneath.

Figure 3. A virtual Unity scene of the 3D Vuforia tracker (left) versus the shifted real-time view through the HoloLens 2 to keep all objects within the field of view (right). The real-world tracker location is outlined in blue and located behind the red cube.

ACKNOWLEDGMENTS

This research was supported in part by the U.S. National Institutes of Health (NIH) grants (R01CA156775, R01CA204254, R01HL140325, and R21CA231911) and by the Cancer Prevention and Research Institute of Texas (CPRIT) grant RP190588.

REFERENCES

  • [1]. Siegel DA, O’Neil ME, Richards TB, Dowling NF, Weir HK, “Prostate cancer incidence and survival, by stage and race/ethnicity — United States, 2001–2017,” MMWR Morbidity and Mortality Weekly Report 69(41), 1473–1480 (2020).
  • [2]. Frantz T, Jansen B, Duerinck J, Vandemeulebroucke J, “Augmenting Microsoft’s HoloLens with Vuforia tracking for neuronavigation,” Healthcare Technology Letters 5(5), 221–225 (2018).
  • [3]. Fei B, Abiodun-Ojo OA, Akintayo AA, Akin-Akintayo O, Tade F, Nieh PT, Master VA, Alemozaffar M, Osunkoya AO, et al., “Feasibility and initial results: Fluciclovine positron emission tomography/ultrasound fusion targeted biopsy of recurrent prostate cancer,” Journal of Urology 202(2), 413–421 (2019).
  • [4]. Fei B, Nieh PT, Master VA, Zhang Y, Osunkoya AO, Schuster DM, “Molecular imaging and fusion targeted biopsy of the prostate,” Clinical and Translational Imaging 5(1), 29–43 (2016).
  • [5]. Helin K, Kuula T, Vizzi C, Karjalainen J, Vovk A, “User experience of augmented reality system for astronaut’s manual work support,” Frontiers in Robotics and AI 5 (2018).
  • [6]. Frantz T, Jansen B, Duerinck J, Vandemeulebroucke J, “Augmenting Microsoft’s HoloLens with Vuforia tracking for neuronavigation,” Healthcare Technology Letters 5(5), 221–225 (2018).
  • [7]. Léger É, Reyes J, Drouin S, Popa T, Hall JA, Collins DL, Kersten-Oertel M, “MARIN: An open-source mobile augmented reality interactive neuronavigation system,” International Journal of Computer Assisted Radiology and Surgery 15(6), 1013–1021 (2020).
  • [8]. Ayoub A, Pulijala Y, “The application of virtual reality and augmented reality in oral & maxillofacial surgery,” BMC Oral Health 19(1) (2019).
  • [9]. Verhey JT, Haglin JM, Verhey EM, Hartigan DE, “Virtual, augmented, and mixed reality applications in orthopedic surgery,” The International Journal of Medical Robotics and Computer Assisted Surgery 16(2) (2020).
  • [10]. Jin Y, Ma M, Zhu Y, “A comparison of natural user interface and graphical user interface for narrative in HMD-based augmented reality,” Multimedia Tools and Applications (2021).
  • [11]. Müller C, Krone M, Huber M, Biener V, Herr D, Koch S, Reina G, Weiskopf D, Ertl T, “Interactive molecular graphics for augmented reality using HoloLens,” Journal of Integrative Bioinformatics 15(2) (2018).
  • [12]. Velazco-Garcia JD, Navkar NV, Balakrishnan S, Younes G, Abi-Nahed J, Al-Rumaihi K, Darweesh A, Elakkad MS, Al-Ansari A, et al., “Evaluation of how users interface with holographic augmented reality surgical scenes: Interactive planning MR-guided prostate biopsies,” The International Journal of Medical Robotics and Computer Assisted Surgery 17(5) (2021).
  • [13]. McKnight RR, Pean CA, Buck JS, Hwang JS, Hsu JR, Pierrie SN, “Virtual reality and augmented reality—translating surgical training into surgical technique,” Current Reviews in Musculoskeletal Medicine 13(6), 663–674 (2020).
  • [14]. Ho S, Liu P, Palombo DJ, Handy TC, Krebs C, “The role of spatial ability in mixed reality learning with the HoloLens,” Anatomical Sciences Education (2022).
  • [15]. Schoeb DS, Rassweiler J, Sigle A, Miernik A, Engels C, Goezen AS, Teber D, “Robotik und intraoperative Navigation” [Robotics and intraoperative navigation], Der Urologe 60(1), 27–38 (2020).
  • [16]. Bettati P, et al., “Virtual reality assisted cardiac catheterization,” Medical Imaging 2021: Image-Guided Procedures, Robotic Interventions, and Modeling (2021). doi:10.1117/12.2582097.
  • [17]. Bettati P, et al., “Augmented reality-assisted biopsy of soft tissue lesions,” Medical Imaging 2020: Image-Guided Procedures, Robotic Interventions, and Modeling (2020). doi:10.1117/12.2549381.
  • [18]. Lee D, Yi JW, Hong J, Chai YJ, Kim HC, Kong H-J, “Augmented reality to localize individual organ in surgical procedure,” Healthcare Informatics Research 24(4), 394 (2018).
  • [19]. Mazzanti D, “AR image target generation (Vuforia),” https://www.dariomazzanti.com/uncategorized/ar-image-target-generation-vuforia/.
  • [20]. Vinci C, et al., “The clinical potential of augmented reality,” Clinical Psychology: Science and Practice 27(3) (2020). doi:10.1111/cpsp.12357.
