Author manuscript; available in PMC: 2024 Mar 29.
Published in final edited form as: Proc SPIE Int Soc Opt Eng. 2023 Apr 3;12466:124661T. doi: 10.1117/12.2653952

An Augmented Reality System with Advanced User Interfaces for Image-Guided Intervention Applications

Patric Bettati 1,2, Baowei Fei 1,2,3,*
PMCID: PMC10978155  NIHMSID: NIHMS1973763  PMID: 38560640

Abstract

Augmented reality (AR) is becoming a more common addition to physicians' repertoire, aiding in resident training and patient interactions. However, the use of AR in clinical settings is still beset with complications, including the lack of physician control over the systems, fixed modes of interaction within the system, and physicians' unfamiliarity with such AR systems. In this paper, we expand on our previous prostate biopsy AR system by adding improved user interface elements within the virtual world, allowing the user to visualize only those parts of the scene they consider useful at a given time. To accomplish this, we have incorporated three-dimensional virtual sliders, built from the ground up in Unity, that afford control over each model's RGB values as well as its transparency. The user can therefore quickly and easily edit the color and transparency of each individual model in real time while remaining immersed in the augmented space. This allows users to view internal holograms without sacrificing the ability to view the external structure. Such flexibility could be invaluable when visualizing a tumor within a prostate: it provides the physician with the ability to view as much or as little of the surrounding virtual models as desired, with the option to reinstate the surrounding models at will. The AR system can provide a new approach for potential uses in image-guided interventions, including targeted biopsy of the prostate.

Keywords: Augmented reality (AR), virtual reality (VR), prostate biopsy, user interface, image-guided intervention, medical imaging

1. INTRODUCTION

The use of augmented reality (AR) in clinical and surgical applications is on the rise [1]. AR overlays virtual objects on the real-world surroundings, either through a headset or by projection onto a screen. An AR system can provide physicians with additional information both before and during procedures. However, AR systems often offer relatively limited physician control, oftentimes due to the difficulty of integrating AR technology with currently used systems [2]. Differences in user interfaces (UI) have been shown to have a greater impact on task completion than headset type [3,4], demonstrating the need for more effective and advanced UI systems. In this project, we expand on our previous AR systems by adding a layer of customizability for the physician: the ability to view 3D models through a HoloLens 2 headset while modifying those models in real time. Currently, this is limited to altering the transparency or color of each individual model, but even so, its uses are manifold. It can be used to observe internal models or structures within another 3D model, to recolor models to represent the underlying anatomy more accurately, or as an aid to physicians unable to distinguish certain colors. In this study, we explore the use of the system and its capabilities in both preoperative and postoperative circumstances, as well as some drawbacks and limitations of the system. In our previous paper, we demonstrated the feasibility and capability of an AR visualization system and its potential uses in prostate biopsies [7]. However, we acknowledged that the system could not be adapted satisfactorily to the individual preferences of physicians in real time. Such adaptability is critical for increasing physician and patient comfort and, therefore, adoption of the system [8].

2. METHODS

To build on our previous work, we added user interface and visualization-aiding elements to help physicians tailor the system to their specific needs within the scope of prostate biopsy procedures. To summarize our previous work: our AR system uses preoperative computed tomography (CT) or magnetic resonance imaging (MRI), followed by manual segmentation in 3D Slicer of all relevant models, such as the skin, bladder, pelvis, and, most importantly, the prostate. We then import these models into Unity, where we modify their behaviors using C# scripts and register them to real-world fiducial markers using the Vuforia system. Finally, we use Visual Studio to deploy the AR application to our headset of choice, the Microsoft HoloLens 2 [9].
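As a minimal sketch of the fiducial registration step, the snippet below drives the root of the segmented anatomy models from the transform that the tracking SDK (here, Vuforia's image target GameObject) updates when it detects the printed marker. The FiducialAnchor class name and the calibration offsets are hypothetical illustrations, not our exact scripts.

```csharp
using UnityEngine;

// Hypothetical sketch: keep the segmented anatomy registered to a tracked
// fiducial. The fiducialTarget is assumed to be the GameObject that the
// tracking SDK (e.g., a Vuforia image target) poses in world space.
public class FiducialAnchor : MonoBehaviour
{
    [Tooltip("Transform driven by the tracking SDK.")]
    public Transform fiducialTarget;

    [Tooltip("Rigid offset from the marker to the anatomy, found during calibration.")]
    public Vector3 positionOffset;
    public Vector3 rotationOffsetEuler;

    void LateUpdate()
    {
        if (fiducialTarget == null) return;

        // Follow the marker each frame so the holograms stay registered
        // to the phantom even as the headset or the marker moves.
        transform.SetPositionAndRotation(
            fiducialTarget.TransformPoint(positionOffset),
            fiducialTarget.rotation * Quaternion.Euler(rotationOffsetEuler));
    }
}
```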

A known issue with AR systems is their lack of flexibility. They are generally built for one specific task, with little regard for further uses that physicians might find for the system [9]. Because users typically lack a software development background, it is difficult or impossible for them to modify the system themselves, prompting the need for a robust UI with multiple options that gives physicians more freedom in how they use the system. To this end, we have increased physician control in our current AR visualization system through dedicated virtual sliders that can alter the color, transparency, and scale of each visualized model within the virtual world. We have also removed the default spatial mapping feature of the HoloLens 2, both to preserve the headset's processing power and to improve the visibility of the scene through the headset.

UI elements in virtual space are generally not well supported by Unity or the HoloLens 2, which led to our decision to build our own interaction methods from the ground up using Unity tools and scripts. We built sliders that control the RGB and transparency values in an easy-to-use, intuitive, real-time manner: the user simply reaches a finger into the slider, pinches, and moves the hand up or down, much as a physical slider functions.
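A minimal sketch of how such a hand-built 3D slider could work is shown below, assuming the handle object is moved by the pinch/grab interaction layer; the class and field names are illustrative, not our exact scripts. The script clamps the handle to its track and reports a normalized value to listeners.

```csharp
using UnityEngine;
using UnityEngine.Events;

// Hypothetical sketch of a hand-built 3D slider: the handle is dragged
// along the slider's local up axis by the pinch/grab layer, and this
// script clamps it to the track and emits a normalized 0..1 value.
public class VirtualSlider3D : MonoBehaviour
{
    public Transform handle;          // moved by the pinch/grab interaction
    public float trackLength = 0.2f;  // track length in meters, centered on this object

    [System.Serializable] public class SliderEvent : UnityEvent<float> { }
    public SliderEvent onValueChanged = new SliderEvent();

    float lastValue = -1f;

    void Update()
    {
        // Project the handle onto the local up axis and clamp to the track.
        Vector3 local = transform.InverseTransformPoint(handle.position);
        float half = trackLength * 0.5f;
        local.x = 0f; local.z = 0f;
        local.y = Mathf.Clamp(local.y, -half, half);
        handle.position = transform.TransformPoint(local);

        // Normalize to 0..1 and notify listeners (e.g., a material updater).
        float value = Mathf.InverseLerp(-half, half, local.y);
        if (!Mathf.Approximately(value, lastValue))
        {
            lastValue = value;
            onValueChanged.Invoke(value);
        }
    }
}
```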

Additionally, we edited certain scripts in the AR software development kit (SDK) to remove spatial mapping from the scene. This resulted in a substantially clearer and more readable scene for the headset user, and it also allowed models to be placed inside real-world objects. Previously, the spatial mapping would cover stationary objects such as phantoms or mannequins, hiding any models placed inside them. With spatial mapping decoupled from the system at the Unity stage, we not only regain this option but also save substantial processing power, allowing the headset to focus on more pressing computational tasks.
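We achieved this by editing SDK scripts directly; as a hedged illustration only, the snippet below shows how a comparable effect could be reached through the Mixed Reality Toolkit (MRTK v2) spatial-awareness API, assuming that SDK is in use.

```csharp
using Microsoft.MixedReality.Toolkit;
using UnityEngine;

// Hedged sketch (assumes MRTK v2): suspend the spatial-awareness observers
// so no room mesh is scanned or rendered, approximating the effect of the
// SDK edits described in the text.
public class SpatialMappingToggle : MonoBehaviour
{
    void Start()
    {
        var spatial = CoreServices.SpatialAwarenessSystem;
        if (spatial != null)
        {
            spatial.SuspendObservers();   // stop scanning the room
            spatial.ClearObservations();  // remove any mesh already built
        }
    }
}
```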

3. RESULTS

To create the 3D sliders, we manually built them within Unity using a few GameObjects and multiple C# scripts, giving us full customizability and the freedom to use them with our current AR system. The sliders are attached to the RGB values of a model, so that each slider edits the red, green, or blue value of the model it is attached to, giving the user an extremely varied set of color options for each model. We also added a fourth slider to modify the transparency of the model. Turning this slider down to its lowest setting makes the object fully transparent, allowing the user to see through the model, either to see something in the real world or to observe another model underneath the external one. This can be very useful in procedures such as prostate biopsies, where the prostate may lie beneath other structures such as the rectum, bladder, or skin, and the physician needs to see through these models at will to best visualize all relevant anatomy in real time. Each model is also directly manipulable: the user can move and scale the holograms by hand as they see fit while retaining the additional control offered by the sliders.
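The binding between the four sliders and a model's material can be sketched as follows; the ModelColorBinder class is hypothetical and assumes the model's shader exposes a standard color property and a transparent rendering mode so that alpha changes are visible.

```csharp
using UnityEngine;

// Hypothetical sketch: four sliders (R, G, B, alpha) drive the color and
// transparency of one model's material. Wire each slider's onValueChanged
// event to the matching setter below.
public class ModelColorBinder : MonoBehaviour
{
    public Renderer targetRenderer;
    Color color = Color.white;

    public void SetRed(float v)   { color.r = v; Apply(); }
    public void SetGreen(float v) { color.g = v; Apply(); }
    public void SetBlue(float v)  { color.b = v; Apply(); }
    public void SetAlpha(float v) { color.a = v; Apply(); }

    void Apply()
    {
        // Use .material (not .sharedMaterial) so each model's instance is
        // edited independently of the others.
        targetRenderer.material.color = color;
    }
}
```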

We also utilize a functionality of the HoloLens SDK that allows physicians to control the sliders from afar. Normally, a HoloLens 2 user is expected to walk to the model, place a finger within the slider, and pinch in order to move it. However, the HoloLens can also trace where a finger is pointing by casting a ray from the user's pointer finger, allowing for distant interaction. This can be very helpful for a physician who cannot move from the patient's side but still wishes to interact with distant sliders or other UI elements.
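The geometry of this far interaction can be illustrated with a short script that reads the index-fingertip pose and casts a physics ray from it. This is a sketch assuming MRTK v2 hand tracking; in practice one would normally rely on MRTK's built-in hand rays rather than a manual raycast.

```csharp
using Microsoft.MixedReality.Toolkit.Input;
using Microsoft.MixedReality.Toolkit.Utilities;
using UnityEngine;

// Hedged sketch (assumes MRTK v2): cast a ray from the right index
// fingertip and report what it hits, illustrating how a distant slider
// handle could be targeted and then driven by pinch gestures.
public class FingerRayCaster : MonoBehaviour
{
    public float maxDistance = 10f;  // meters

    void Update()
    {
        if (HandJointUtils.TryGetJointPose(
                TrackedHandJoint.IndexTip, Handedness.Right, out MixedRealityPose pose))
        {
            Vector3 dir = pose.Rotation * Vector3.forward;
            if (Physics.Raycast(pose.Position, dir, out RaycastHit hit, maxDistance))
            {
                Debug.Log($"Finger ray hit {hit.collider.name} at {hit.distance:F2} m");
            }
        }
    }
}
```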

4. DISCUSSION

While our system shows great promise in increasing physician freedom across many clinical environments, there are still factors that merit additional discussion. Although removing the default spatial mapping system freed up processing power compared to the previous version, the HoloLens 2 still struggles to maintain a stable, high frame rate, forcing us to use lower-resolution models than we would prefer.

As Figure 4 shows, when using the ray cast from the finger to interact with the sliders, the HoloLens often has difficulty following the cast ray over longer distances. This is especially noticeable as the target gets farther away and its apparent size decreases, making it more important for the cast ray to closely follow the trajectory of the pointer finger. In many cases, however, the ray does not follow the finger perfectly, and over longer distances it requires significant correction from the user.

Figure 4: An example of the feature that allows control of the sliders from afar, using a ray cast from the pointer finger.

To determine whether it is feasible to use this ray cast during a time-sensitive operation, we ran an experiment in which a user was placed 0, 1, 5, 10, and 15 feet away from a virtual slider and asked to change the slider value. The time to completion was recorded (n = 8) and the average time was plotted. Interestingly, the time taken was roughly linearly proportional to the distance. However, as the distance approached 10 feet, the time to change a single slider could reach 5 seconds, showing that the ray cast system must be improved before it can be useful in an environment with sliders spread throughout a large room.
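For illustration, the trial timing could be instrumented as below, reusing the hypothetical VirtualSlider3D sketched earlier; this is not the exact script used in the experiment.

```csharp
using System.Diagnostics;
using UnityEngine;

// Hypothetical sketch of the timing trial: a stopwatch starts when the
// trial begins and stops the first time the slider reaches the requested
// target value within a small tolerance.
public class SliderTimingTrial : MonoBehaviour
{
    public VirtualSlider3D slider;    // the slider under test
    public float targetValue = 0.8f;  // value the user is asked to reach
    public float tolerance = 0.02f;

    readonly Stopwatch watch = new Stopwatch();

    public void BeginTrial()
    {
        watch.Restart();
        slider.onValueChanged.AddListener(OnValue);
    }

    void OnValue(float v)
    {
        if (Mathf.Abs(v - targetValue) <= tolerance)
        {
            watch.Stop();
            slider.onValueChanged.RemoveListener(OnValue);
            UnityEngine.Debug.Log($"Trial completed in {watch.Elapsed.TotalSeconds:F2} s");
        }
    }
}
```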

Additionally, there is still the problem of registration. In this paper, we used a fiducial-based registration system with the Vuforia software, which is among the simplest and most stable methods to implement on the HoloLens 2 [6]. Its drawback is the need for a real-world fiducial to anchor the models at the correct location in real-world space [9–11]. This generally results in a lengthy calibration period, during which the real-world models and their software counterparts must be aligned iteratively until the locations are satisfactory. We also face a field-of-view limitation from the hardware itself. While the HoloLens 2 sports an industry-leading field of view, it is still narrow by human standards, and without prior knowledge of the models' relative locations in the real world, it can take an untrained user some time to understand at which points the models will fall outside the headset's field of view.

5. CONCLUSION

In conclusion, we have developed an augmented reality system that includes important features allowing physicians to control what they view within an AR scene. Sliders provide that freedom in a simple and effective way that can be taught to physicians very easily while greatly increasing their autonomy when using the system. The optional removal of spatial mapping further declutters their view when using the HoloLens 2. The AR system can have many applications in image-guided interventions [12,13].

In the future, we plan to add several functionalities, such as an option to turn off the ray cast, as well as a canvas with a dropdown menu from which specific sliders can be pulled up while the rest are hidden, in order to minimize the virtual clutter that can form when too many sub-models have editable RGB and transparency values. This dropdown and its associated sliders would also be controllable via the ray cast, further improving the usability of the system. A sketch of the dropdown idea appears below.
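As a sketch of that plan, assuming a standard Unity UI dropdown on a world-space canvas, the following hypothetical script keeps one slider group per model and hides all the others when a new entry is selected.

```csharp
using System.Collections.Generic;
using UnityEngine;
using UnityEngine.UI;

// Hypothetical sketch of the planned dropdown: each entry corresponds to
// one model's group of four sliders; selecting an entry shows that group
// and hides the rest to reduce virtual clutter.
public class SliderGroupDropdown : MonoBehaviour
{
    public Dropdown dropdown;             // Unity UI dropdown on a canvas
    public List<GameObject> sliderGroups; // one group of 4 sliders per model

    void Start()
    {
        dropdown.onValueChanged.AddListener(ShowOnly);
        ShowOnly(dropdown.value);
    }

    void ShowOnly(int index)
    {
        for (int i = 0; i < sliderGroups.Count; i++)
            sliderGroups[i].SetActive(i == index);
    }
}
```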

Further work will give the physician more freedom in visualizing the models while reducing the number of sliders, since four sliders per model can become overbearing when dealing with many models. Four sliders are currently required per model: one each for the red, green, and blue values and one for transparency. Because these values are linked to each individual model, the four sliders per model quickly clutter the virtual workspace. We also plan to move away from fiducial markers, potentially using external cameras in combination with optical trackers to provide more stable and robust tracking even when the HoloLens 2 cameras cannot see the markers.

Figure 1: A demonstration, using a mannequin, of how our AR system could be used to visualize internal organs within a patient. It also demonstrates how virtual models can occlude internal models: a prostate model lies underneath the pelvis model.

Figure 2: A. An example of a scene in a cluttered area without spatial mapping operational. B. A similar scene in the same cluttered environment with spatial mapping turned on.

Figure 3: A. An example of the use of the RGB sliders, showing the original skin model color. B. The changed color of the outer skin model, accomplished using the RGB sliders.

Table 4: Average time (n = 8) for a user to interact with a virtual slider to adjust RGB or transparency values at different distances.


ACKNOWLEDGMENTS

This research was supported in part by the U.S. National Institutes of Health (NIH) grants R01CA156775, R01CA204254, R01HL140325, and R21CA231911, and by the Cancer Prevention and Research Institute of Texas (CPRIT) grant RP190588.

REFERENCES

[1] Levy JB, Kong E, Johnson N, Khetarpal A, Tomlinson J, Martin GF, and Tanna A, "The mixed reality medical ward round with the MS HoloLens 2: Innovation in reducing COVID-19 transmission and PPE usage," Future Healthc J. 8(1), 127–130 (2021).
[2] Pose-Díez-de-la-Lastra A, Moreta-Martinez R, García-Sevilla M, García-Mato D, Calvo-Haro JA, Mediavilla-Santos L, Pérez-Mañanes R, von Haxthausen F, and Pascau J, "HoloLens 1 vs. HoloLens 2: Improvements in the New Model for Orthopedic Oncological Interventions," Sensors 22(13), 4915 (2022).
[3] Kim S, Nussbaum MA, and Gabbard JL, "Influences of augmented reality head-worn display type and user interface design on performance and usability in simulated warehouse order picking," Appl. Ergon. 74, 186–193 (2019).
[4] Iqbal H, Tatti F, and y Baena FR, "Augmented reality in robotic assisted Orthopaedic Surgery: A pilot study," J. Biomed. Inform. 120, 103841 (2021).
[5] Bettati P, Dormer JD, Shahedi M, and Fei B, "An augmented reality-assisted visualization system for potential applications in prostate biopsy," Proc. SPIE 12034, 120342G (2022).
[6] Prochaska MT, Press VG, Meltzer DO, and Arora VM, "Patient Perceptions of Wearable Face-Mounted Computing Technology and the Effect on the Doctor-Patient Relationship," Appl. Clin. Inform. 7(4), 946–953 (2016).
[7] Bettati P, Chalian M, Huang J, Dormer JD, Shahedi M, and Fei B, "Augmented reality-assisted biopsy of soft tissue lesions," Proc. SPIE 11315, 113150W (2020).
[8] Andrews CM, Henry AB, Soriano IM, Southworth MK, and Silva JR, "Registration Techniques for Clinical Applications of Three-Dimensional Augmented Reality Devices," IEEE J. Transl. Eng. Health. Med. 9, 4900214 (2020).
[9] Pérez-Pachón L, Sharma P, Brech H, Gregory J, Lowe T, Poyade M, and Gröning F, "Effect of marker position and size on the registration accuracy of HoloLens in a non-clinical setting with implications for high-precision surgical tasks," Int. J. Comput. Assist. Radiol. Surg. 16(6), 955–966 (2021).
[10] Pfefferle M, Shahub S, Shahedi M, Gahan J, Johnson B, Le P, Vargas J, Judson BO, Alshara Y, Li Q, and Fei B, "Renal biopsy under augmented reality guidance," Proc. SPIE 11315, 113152W (2020).
[11] Huang J, Halicek M, Shahedi M, and Fei B, "Augmented reality visualization of hyperspectral imaging classifications for image-guided brain tumor phantom resection," Proc. SPIE 11315, 113150U (2020).
[12] Ma L, and Fei B, "Comprehensive review of surgical microscopes: technology development and medical applications," J. Biomed. Opt. 26(1), 010901 (2021).
[13] Fei B, Abiodun-Ojo OA, Akintayo AA, Akin-Akintayo O, Tade F, Nieh PT, Master VA, Alemozaffar M, Osunkoya AO, Goodman MM, and Schuster DM, "Feasibility and Initial Results: Fluciclovine Positron Emission Tomography/Ultrasound Fusion Targeted Biopsy of Recurrent Prostate Cancer," J. Urol. 202(2), 413–421 (2019).
