Abstract
Objective:
Augmented reality devices are increasingly accepted in healthcare, though most applications involve education and preoperative planning. A novel augmented reality ultrasound application, HoloUS, was developed for the Microsoft HoloLens 2 to project real-time ultrasound images directly into the user’s field of view. In this work, we assess the effect of using HoloUS on vascular access procedural outcomes.
Methods:
A single-center user study was completed with participants with (N = 22) and without (N = 12) experience performing ultrasound-guided vascular access. Users completed a venipuncture and aspiration task a total of 4 times: 3 times on study day 1 and once on study day 2, 2–4 weeks later. Users were randomized to use conventional ultrasound during either their first or second task and the HoloUS application at all other times. Task completion time, number of needle redirections, number of head adjustments, and needle visualization rates were recorded.
Results:
For expert users, task completion time was significantly faster using HoloUS (11.5 s; IQR = 6.5–23.5 s vs. 18.5 s; IQR = 11.0–36.5 s; P = 0.04). Number of head adjustments was significantly lower using the HoloUS app (1.0; IQR = 0.0–1.0 vs. 3.0; IQR = 1.0–5.0; P < 0.0001). No significant differences were identified in other measured outcomes.
Conclusion:
This is the first investigation of augmented reality-based ultrasound-guided vascular access using the second-generation HoloLens. It demonstrates procedural efficiency and accuracy at least equivalent to traditional ultrasound techniques, with favorable usability, ergonomics, and user independence.
Keywords: Augmented reality, ultrasound guidance, vascular access, HoloLens, head-mounted display
Introduction
The use of ultrasound guidance for vascular access has become the standard for many common bedside procedures given evidence of superior overall cannulation success rates in central venous, peripheral arterial, and even peripheral venous access for patients with difficult vascular anatomy1–4. Additional advantages include improved first-pass success rates and reduced procedure duration, needle insertion attempts, redirections, and mechanical complication rates when compared with landmark-based approaches 1,5,6. Despite these benefits, ultrasound-guided vascular access depends heavily on a practitioner’s knowledge of sono-anatomy and ability to appropriately adjust the probe’s positioning to reveal relevant structures during a dynamic procedure, while also maintaining hand-eye coordination between the intervention needle, ultrasound probe, and ultrasound machine’s screen, which is often located outside of the procedural field 7.
Augmented and virtual reality (AR and VR) devices have become increasingly recognized as potential adjuncts in both the training and performance of vascular access procedures given the concerns above 8–11. Though AR and VR applications have successfully integrated into several areas of medicine, most currently aid in patient and clinician education 12,13 and preoperative planning 14–19. Augmented reality is distinct from virtual reality in that it incorporates virtual elements onto a user’s real-world surroundings, rather than immersing a user in a completely virtual environment, as in VR 20. Both AR and VR provide different potential advantages for expansion into the realm of procedural use, though obstacles for procedural applications include technical limitations of important features including tracking calibration, small field of view, image quality, and latency 21–23.
Several groups are investigating the use of AR and VR for clinician training and preoperative planning for vascular interventions 8,9,16. Though the potential for AR applications to improve the accuracy, efficiency, and ergonomics of vascular access procedures is clear, current work has been promising but mostly pre-clinical, with mixed results 24–28. A few groups, including ours, have developed custom applications that enable live AR viewing of ultrasound images using the HoloLens (Microsoft, Redmond, Washington, USA), a state-of-the-art smart glasses device (Table 1) 8,12,21,25,29–31. The HoloLens and upgraded HoloLens 2 have multiple favorable features for procedural applications, including transparent lenses, which allow for constant, direct visualization of the patient, and eye and hand movement tracking, which eliminates the need for a handheld controller. A summary of applications that specifically assess vascular access procedures is included in Table 2, of which one utilized the HoloLens device 10,11,24,27,28,32.
Table 1.
Summary of published work investigating the HoloLens device to view augmented reality (AR) ultrasound images.
| Author | HoloLens Version | Tracking | Ultrasound Acquisition | Latency | Image Size/Frame Rate | Model | Voice Commands | Hand Gestures |
|---|---|---|---|---|---|---|---|---|
| Mahmood et al12 (2018) | 1 | Outside-in | Not Specified | Not specified | Not specified | Vimedix TEE/TTE simulator | Yes | Yes |
| Garcia-Vazquez et al8 (2018) | 1 | Inside-out (electromagnetic sensor) | Image data streaming | 259 ms | Not specified | Aortic aneurysm phantom | Yes | No |
| Kuzhagaliyev et al29 (2018) | 1 | Outside-in (optical tracking) | Frame-grabber | Not specified | Not specified | Not specified | Yes | Yes |
| Ruger et al25 (2020) | 1 | Outside-in (optical tracking) | Frame-grabber | Not specified | 600 × 600 pixels/10 fps | Tissue biopsy phantom | No | No |
| Farshad-Amacker et al30 (2020) | 1 | Inside-out (tracking marker on probe) | Not Specified | Not specified | Not specified | Tissue biopsy phantom | No | No |
| Nguyen et al31 (2021) | 1 | Inside-out (tracking marker on probe) | Image data streaming | 80 ms | 100 × 100 pixels/25 fps | Tissue biopsy phantom | Yes | Yes |
| von Haxthausen et al21 (2022) | 2 | Outside-in (optical tracking) | Not Specified | 16 ms | 512 × 512 pixels/62 fps | Not specified | No | Yes |
TEE/TTE = transesophageal echocardiogram/transthoracic echocardiogram, ms = millisecond, fps = frames per second.
Table 2.
Summary of published work investigating augmented reality (AR) use in vascular access procedures.
| Author | AR Device | Procedure | Procedure time | First-pass success | Needle redirections | Head movement | Complications |
|---|---|---|---|---|---|---|---|
| Wang et al32 (2009) | Sonic Flashlight, not head-mounted | Peripherally inserted central catheter | Not assessed | Equivalent | Equivalent | N/A | Not assessed |
| Jeon et al10 (2014) | Microprojector MBP200, not head-mounted | Simulated venous access | Shorter with AR device | N/A | Fewer with AR device | N/A | N/A |
| Wu et al27 (2014) | Google Glass | Simulated central venous catheter placement | Longer with AR device | N/A | Greater with AR device | Fewer with AR device | N/A |
| Lim et al24 (2019) | Moverio BT-300 | Simulated venous access | Equivalent | N/A | Equivalent | Fewer with AR device | N/A |
| Jang et al11 (2021) | Moverio BT-35E | Pediatric radial artery catheterization | Shorter with AR device | Higher with AR device | N/A | N/A | Lower with AR device |
| Suzuki et al28 (2021) | HoloLens | Simulated central venous catheter placement | Equivalent | N/A | N/A | N/A | N/A |
This work continues our previous investigation of a novel AR application, HoloUS, which displays live images from a portable ultrasound machine directly into the user’s field of view 31. Though this initial user study did not demonstrate statistically significant improvements in targeting task time, it did show that HoloUS can be feasibly used in ultrasound-guided procedures. Notably, user study results were impacted by the inherent limitations of the first-generation HoloLens hardware (e.g., small field of view) and suboptimal software implementations (e.g., reduced image quality in order to minimize latency). By upgrading to the more powerful HoloLens 2 and integrating multiple software optimizations, many of these limitations were overcome (Appendix A). HoloUS is programmed to track the ultrasound probe, project the image directly over the target anatomy, and “follow” the transducer as it is adjusted into the ideal procedural position, eliminating the need for a physical screen. From this baseline position, users can modify the relative image location, size, brightness, and contrast based on their preference using voice commands.
In this follow-up study it was hypothesized that use of the upgraded HoloUS application would improve procedural accuracy, task completion time, and numbers of needle and head position adjustments compared with conventional ultrasound guidance. To test these hypotheses, a single-center, randomized user study was conducted in which a portable ultrasound machine and a 3-vessel branched vascular phantom were used to perform a venipuncture and aspiration task series. We believe that this is the first investigation of AR-based ultrasound-guided vascular access using the HoloLens 2.
Materials and Methods
Software Architecture
The upgraded HoloUS application preserves the overall software architecture of the first-generation application 31 and has two main custom software modules. First, the transmit module on the ultrasound machine retrieves ultrasound images and sends them wirelessly to the HoloLens (Fig. 1). Next, the display module on the HoloLens receives the transmitted images, renders them holographically, and processes any voice commands to adjust the rendering mode. Though the HoloUS application was designed to be compatible with any ultrasound machine, the Terason uSmart 3200T (TeraTech Corporation, Boston, MA, USA) was chosen for this user study based on its portability and its built-in Windows-based interface, which enables the simultaneous operation of the ultrasound program and transmit module within a single tablet-based machine. The transmit module was written in C++ using the Terason SDK. The display module was written in C# using Unity 2020.3.25.f1 (Unity Technologies, USA), the Microsoft Mixed Reality Toolkit (Microsoft, Redmond, Washington, USA), and Visual Studio 2019.
Fig. 1.

The two main modules of the HoloUS application: the transmit and display modules. The HoloUS application communicates wirelessly with the ultrasound machine and HoloLens 2 device to render holographic ultrasound images and register them in space with an augmented reality marker on the ultrasound transducer.
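The transmit/display pipeline described above reduces to streaming compressed frames over a socket and delimiting them on the receiving end. The sketch below is an illustrative Python version of that pattern, not the actual C++/C# HoloUS code; the function names are hypothetical, and zlib stands in for the DXT1 codec used by the real application.

```python
import socket
import struct
import zlib


def send_frame(sock: socket.socket, pixels: bytes) -> int:
    """Compress a raw frame and send it as one length-prefixed TCP message."""
    payload = zlib.compress(pixels)           # HoloUS uses DXT1; zlib is a stand-in
    header = struct.pack("!I", len(payload))  # 4-byte big-endian length prefix
    sock.sendall(header + payload)
    return len(payload)


def recv_frame(sock: socket.socket) -> bytes:
    """Read one length-prefixed message and return the decompressed frame."""
    header = _recv_exact(sock, 4)
    (length,) = struct.unpack("!I", header)
    return zlib.decompress(_recv_exact(sock, length))


def _recv_exact(sock: socket.socket, n: int) -> bytes:
    """Block until exactly n bytes have been read from the stream."""
    buf = b""
    while len(buf) < n:
        chunk = sock.recv(n - len(buf))
        if not chunk:
            raise ConnectionError("socket closed mid-frame")
        buf += chunk
    return buf
```

Because TCP delivers a byte stream rather than discrete messages, the 4-byte length prefix is what lets the display side recover frame boundaries before decoding.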
The holographic ultrasound image was registered to the ultrasound transducer using ARToolKit, a commonly used software library for building AR applications. To enable 3D tracking, a case with an AR pattern was designed and 3D-printed for the ultrasound transducer using SolidWorks (Dassault Systèmes, Waltham, Massachusetts, USA). The black-and-white AR pattern is pre-defined to be recognizable by ARToolKit. The display module identifies the AR pattern to determine the transducer’s 3D pose and, in turn, the 3D pose of the ultrasound image (Fig. 1).
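Deriving the image pose from the marker pose amounts to composing homogeneous transforms: the tracker reports the marker's (and hence the transducer's) pose, and the image pose follows from a fixed marker-to-image offset. A minimal sketch, assuming a pure-translation offset that places the image just below the transducer (the specific offset is illustrative, not the application's calibration):

```python
def mat_mul(a, b):
    """Compose two 4x4 homogeneous transforms (nested lists, row-major)."""
    return [[sum(a[i][k] * b[k][j] for k in range(4)) for j in range(4)]
            for i in range(4)]


def translation(x, y, z):
    """Build a 4x4 transform that translates by (x, y, z)."""
    return [[1, 0, 0, x],
            [0, 1, 0, y],
            [0, 0, 1, z],
            [0, 0, 0, 1]]


def image_pose(marker_pose, marker_to_image):
    """Image pose in world space = marker pose composed with the fixed
    marker-to-image offset measured at design time."""
    return mat_mul(marker_pose, marker_to_image)
```

For example, a marker tracked at (0.10, 0.20, 0.50) m with a hypothetical offset 5 cm below the transducer yields an image origin at (0.10, 0.15, 0.50); a rotation of the marker would rotate the offset (and the rendered image plane) with it.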
The main limitation of the first-generation HoloUS application was its narrow field of view. This was addressed by upgrading to the HoloLens 2, which expands the field of view to 54 degrees diagonally, compared with 30 degrees for the first-generation display (Appendix A). Touchless interaction was also expanded and improved: voice commands can move and scale the ultrasound image and adjust its brightness and contrast, all in real time.
Application Performance Specifications
The HoloUS application transmits ultrasound images from the ultrasound machine to the HoloLens using Transmission Control Protocol/Internet Protocol (TCP/IP) and DXT1 compression. The HoloLens 2 natively supports DXT1 image compression, allowing for rapid decoding 33 and improved frame rate and image resolution with comparable image latency. In the first-generation HoloUS application, the ultrasound screen frames were cropped to isolate the ultrasound image and downsized to facilitate transmission. In the second-generation application, the screen frame was similarly cropped, though no downsizing of the image was necessary. Application latency was estimated by adding its two main contributors: (a) image encoding and packet preparation time (i.e., the time to compress a raw image into TCP/IP packets) and (b) image transmission time (i.e., the time to send a complete image from the ultrasound machine to the HoloLens device). The rendering frame rate of the HoloLens 2 was measured in Unity and can be assessed in real time within the HoloUS app. To ensure uniform application performance across users, the ultrasound machine was set at 3 cm depth (47 Hz) during all experiments.
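For intuition, DXT1's fixed rate (8 bytes per 4 × 4 texel block, i.e., 0.5 bytes per pixel) makes the compressed frame size, and hence the transmission term of the latency estimate, straightforward to compute. The sketch below is illustrative only; the encode time and link bandwidth are hypothetical inputs, not measured constants of the system.

```python
def dxt1_size_bytes(width: int, height: int) -> int:
    """DXT1 stores each 4x4 pixel block in 8 bytes (0.5 bytes per pixel)."""
    blocks = ((width + 3) // 4) * ((height + 3) // 4)
    return blocks * 8


def estimated_latency_ms(width: int, height: int,
                         encode_ms: float, bandwidth_mbps: float) -> float:
    """Latency estimate = encoding/packetization time + transmission time,
    mirroring contributors (a) and (b) described in the text."""
    bits = dxt1_size_bytes(width, height) * 8
    transmit_ms = bits / (bandwidth_mbps * 1e6) * 1e3
    return encode_ms + transmit_ms
```

A 600 × 600 frame compresses to 180,000 bytes under DXT1; combined with the reported 29.0 ms encode time, the reported 65.1 ms transmission time would correspond to an effective throughput of roughly 22 Mbps.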
The two modes supported by HoloUS, Tracking Mode and Floating Mode 31, can be activated using voice commands (Fig. 2). In Tracking Mode, HoloUS recognizes the ultrasound probe’s location using a custom 3-D printed ArUco marker, positions the holographic image below the transducer, in its corresponding anatomical position, and tracks the ultrasound probe in real time. This mode was designed for use during anatomical surveys of vascular structures, or examinations that span multiple body regions. In Floating Mode, the holographic ultrasound image is initially positioned based on transducer location but does not track small movements (< 10 cm), and instead remains in a fixed position so as not to distract the proceduralist during needle advancement. In either mode, voice commands are enabled, using speech recognition of predefined keywords.
Fig. 2.

Photos of a HoloUS user’s view taken with the HoloLens 2 camera. The holographic ultrasound image as well as the physical surroundings are shown as they appear to the user. In Tracking Mode (A), the default ultrasound image location is directly beneath the ultrasound transducer in its corresponding anatomical position based on tracking of 3D-printed ArUco markers. HoloUS “tracks” the transducer and automatically adjusts the image location. In Floating Mode (B), the default ultrasound image location is directly in the user’s line of sight. The image position can be adjusted with voice commands based on the user’s preference.
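The behavioral difference between the two modes comes down to when the image pose is refreshed from the tracked probe pose. A minimal sketch of that placement rule follows; the class is illustrative and not the application's actual code, with the 10 cm threshold taken from the Floating Mode description above.

```python
import math
from dataclasses import dataclass


@dataclass
class ImagePlacer:
    """Illustrative placement rule for the two HoloUS display modes."""
    mode: str = "floating"          # "tracking" or "floating"
    threshold_m: float = 0.10       # 10 cm re-anchor rule for Floating Mode
    image_pos: tuple = (0.0, 0.0, 0.0)

    def update(self, probe_pos):
        if self.mode == "tracking":
            # Tracking Mode: the hologram follows every probe movement.
            self.image_pos = tuple(probe_pos)
        elif math.dist(self.image_pos, probe_pos) > self.threshold_m:
            # Floating Mode: ignore small probe movements so the image
            # stays fixed during needle advancement; re-anchor on big moves.
            self.image_pos = tuple(probe_pos)
        return self.image_pos
```

Under this rule, a 2 cm probe adjustment leaves a floating image in place, while repositioning the probe to another site (or switching to Tracking Mode) snaps the image to the new location.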
User Study
A single-center, prospective user study was conducted using volunteers. All study activities were performed with approval of the hospital’s Institutional Review Board. “Experts” were classified as users with training and experience performing ultrasound-guided vascular access procedures meeting their hospital’s standards for proficiency. All others were categorized as “novices.”
All users completed a series of simulated venipuncture tasks using a validated 3-vessel branched vascular access phantom (Fig. 3, VATA, Inc, Canby, OR) using conventional ultrasound guidance visualized on an external portable screen (Terason uSmart 3200T) as well as AR ultrasound guidance using the HoloUS application running on the HoloLens 2. Procedure time, accuracy, number of needle adjustments, and number of visual focus adjustments were recorded. Users were recruited from departments of anesthesia, critical care, surgery, interventional radiology, bioengineering, and research for study participation.
Fig. 3.

The 3-vessel vascular phantom used for the user study (A, B). Users were instructed to target the center of the vessel prior to its bifurcation (C, D). Simultaneous photos taken by an external observer and digital camera (C) and by the user and the HoloLens’ camera (D) are shown. User study timeline (E).
Study day 1: Tasks 1–3
Following an informed consent procedure and a brief training session, participants completed a simulated venipuncture task a total of 3 times on the first study day: once using the conventional ultrasound technique and the ultrasound machine’s included external screen (task C), and twice using the HoloLens and the HoloUS augmented reality app with the holographic ultrasound image displayed directly in their field of view (tasks H1 and H2). For the training, users were shown a prepared slideshow depicting the study task procedure and outcome measurements. Novice users were given an additional presentation on the background and use of ultrasound for vascular access. After training, users practiced the study task a total of three times and were given time to practice with the HoloLens and HoloUS application. To ensure consistent experimental procedures, all users were asked to view the vessel out-of-plane (in cross-section) for task completion. They were then randomly assigned to use the conventional ultrasound setup during either their first or second task, with HoloUS used for the remaining two tasks (Fig. 3E). Though users were invited to experience both Tracking and Floating Mode on HoloUS during training, all experimental task procedures were performed using Floating Mode. To minimize experimental disruptions, users were instructed to customize the ultrasound image position, size, and gain parameters to their preference with voice commands prior to task performance.
For all tasks, users identified the vascular phantom’s middle vessel with the ultrasound, then used a 21-gauge needle attached to a 5-mL syringe to puncture the phantom. To evaluate accuracy, users were instructed to visualize the needle within the vessel using ultrasound, adjusting or redirecting the needle freely. Once satisfied with the needle’s position, they were told to verbally notify study staff and stop all needle adjustments prior to placing negative pressure on the syringe to withdraw fluid. Users then pulled back on the syringe. This was documented as the first attempt. The attempt was counted as successful if it was positive for fluid aspiration. In the case of first-attempt failure, users were instructed to continue by adjusting the positions of the needle and/or ultrasound probe. Once satisfied with the needle’s location, users again notified study staff prior to making attempt #2. This process was repeated until the users made a successful attempt, or until 10 minutes had elapsed, whichever occurred first. After completing task 1, users then performed the task a second time using the other method (conventional vs. HoloUS), followed by a third time using HoloUS. Outcome measurements included task completion time, accuracy, number of attempts, number of needle redirections, and number of head adjustments. These are defined in Table 3.
Table 3.
Task outcome definitions. All task outcomes were measured or counted by observing study staff.
| Outcome | Definition |
|---|---|
| Task completion time | The total time from when the needle first punctured the surface of the phantom to when fluid was successfully aspirated into the syringe, or 10 minutes, whichever occurred first |
| Accuracy | Measured as 0 or 1; 0 = failure on the first attempt, 1 = success on the first attempt |
| Number of attempts | Number of attempts required before fluid was successfully withdrawn from the vessel |
| Number of needle redirections | All needle redirections, including partial withdrawal and re-aiming of the needle, and advancing or withdrawing the needle after vocalization of satisfactory placement |
| Number of head adjustments | Number of times the user moved or turned their head from its baseline position at the start of the procedure until task completion. |
| Needle visualization rate | Number of times the needle was visualized by study staff divided by number of times the user stated that the needle was visualized |
Study day 2: Task 4
Two to four weeks after completing study day 1, volunteers returned to perform the task a fourth time using the HoloUS application (task H3). Practice procedure, task protocol and measurements were the same as for tasks 1–3 on study day 1.
The task series described above is summarized in Fig. 3E. On day 1, task C using conventional ultrasound was completed to establish a baseline for each user on the standard viewing platform for ultrasound guidance. The order of tasks C and H1 was randomized to mitigate any learning effect that may have been present, especially for novice users. The task was repeated with HoloUS (H2) on day 1 to assess for improvement in any outcome parameters with repeated use of the application. Finally, task H3 was delayed for 2–4 weeks to determine whether any improvement elicited on study day 1 could be sustained after a gap in time.
Immediately following completion of task 4 on study day 2, users completed a NASA Task Load Index (TLX) usability survey on the use of HoloUS and the HoloLens 2 device to perform the phantom-based venipuncture task. This method measures workload on six scales: mental, physical, temporal, performance, effort, and frustration 34–37.
Statistical Analysis
Wilcoxon signed-rank tests were used to compare paired outcomes (task completion time and number of head adjustments) with and without use of the HoloUS application after normality testing. The Wilcoxon rank-sum test was used to compare unpaired NASA-TLX survey outcomes between novice and expert users. Fisher’s exact test was used to compare the number of accurate attempts and the number of single-needle-pass attempts with and without use of the application. Pearson’s correlation coefficient was used to examine associations among NASA-TLX survey outcomes. Asterisks indicate statistically significant differences: * = P < 0.05, ** = P < 0.01, *** = P < 0.001, **** = P < 0.0001.
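As a concreteness check on the paired comparison, the Wilcoxon signed-rank statistic can be computed directly: rank the absolute paired differences (discarding zeros, averaging tied ranks) and take the smaller of the positive- and negative-rank sums. The sketch below is a didactic pure-Python implementation with synthetic numbers, not the study's analysis code or data.

```python
def wilcoxon_w(x, y):
    """Wilcoxon signed-rank statistic W for paired samples. Zero
    differences are discarded and tied |differences| receive the
    average of the ranks they span, as is conventional."""
    diffs = [a - b for a, b in zip(x, y) if a != b]
    absd = sorted((abs(d), i) for i, d in enumerate(diffs))
    ranks = [0.0] * len(diffs)
    j = 0
    while j < len(absd):
        k = j
        while k < len(absd) and absd[k][0] == absd[j][0]:
            k += 1                     # extend the run of tied |d| values
        avg = (j + 1 + k) / 2          # average rank for positions j..k-1
        for m in range(j, k):
            ranks[absd[m][1]] = avg
        j = k
    w_pos = sum(r for r, d in zip(ranks, diffs) if d > 0)
    w_neg = sum(r for r, d in zip(ranks, diffs) if d < 0)
    return min(w_pos, w_neg)
```

For example, hypothetical paired times [18, 25, 30, 12, 40] vs. [11, 20, 33, 10, 28] give W = 2.0 (only the one negative difference contributes ranks to the smaller sum); the corresponding P value would then come from the W null distribution or a normal approximation.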
Results
Application Performance Specifications
The frame rate of the second-generation HoloUS application was up to 60 frames per second (fps) (vs. 25 fps in the first-generation application). Mean application latency was 94.1 ms for original-resolution 600 × 600 pixel images (vs. 80 ms for 100 × 100 pixel images in the first generation), comprising a mean image encoding and packet preparation time of 29.0 ms and a mean image transmission time of 65.1 ms. A full comparison of the first- and second-generation HoloUS applications can be found in Appendix A.
User Study
A power calculation based on results from the first-generation HoloUS application 31 indicated that N = 20 users per group were needed to detect a 20-second difference in mean task completion time (SD = 30 s, power = 80%, alpha = 0.05). A total of 34 users (22 experts and 12 novices) were enrolled. Two expert users did not return for study day 2 and were omitted from the final analysis. One expert user did not complete the NASA-TLX survey.
Expert users were all physicians or advanced practice providers credentialed to perform ultrasound-guided vascular access procedures at the study institution. Specialties represented included anesthesia, neonatal and pediatric critical care, surgery and surgical subspecialties, and interventional radiology, with users spanning all levels of training from resident through attending (Table 4). Novice users were mostly engineers from the hospital’s research institute.
Table 4.
User characteristics and specialty.
| All (N = 34) | | Experts (N = 22) | | Novices (N = 12) | |
|---|---|---|---|---|---|
| Male/Female | 20/14 | Male/Female | 13/9 | Male/Female | 7/5 |
| | | Anesthesia | 7 | Engineer | 9 |
| | | CC (NICU/PICU) | 7 (2/5) | Research | 1 |
| | | Surgery/Other | 4 | Student | 1 |
| | | IR | 4 | Other | 1 |
| | | Attending | 11 | | |
| | | Trainee/APP | 11 | | |
CC = critical care, NICU = neonatal intensive care unit, PICU = pediatric ICU, IR = interventional radiology, APP = advanced practice provider.
Overall, task completion time significantly improved on study day 2 during the third use of the HoloUS (H3) (Fig. 4, Appendix B). There was a 10% improvement from mean conventional US (C) time (21 s vs. 24 s, median = 11.5 s vs. 18.5 s, P = 0.04), and a 38% improvement from the average first use of the HoloUS (H1) (21 s vs. 34 s, median = 11.5 s vs. 18.5 s, P = 0.02). Outcomes from day 1 showed mixed mean task completion time results; however, when comparing medians using nonparametric analysis, these were not significant. Number of head adjustments was significantly lower during all tasks completed with the HoloUS app (1, 1, 1, and 3 for H1 (P = 0.0001), H2 (P = 0.01) H3 (P < 0.0001), and C, respectively).
Fig. 4.

Task outcome results for all users (A – E, N = 32) and expert users (F – J, N = 20). Task completion time in seconds (A, F), number of attempts (B, G), number of needle redirections (C, H), number of head adjustments (D, I), and needle visualization rate (E, J) (%). Time is plotted as median ± interquartile range (A, F) (IQR), all others plotted as mean ± SEM (B – E, G – J). * = P < 0.05, ** = P < 0.01, *** = P < 0.001, **** = P < 0.0001.
For expert users, a 30% improvement in task completion time was observed during H3 from C (Fig. 4, Appendix C, 11 s vs. 16 s, median = 8.5 s vs. 13.5 s, P = 0.02). Number of head adjustments was also significantly improved when comparing conventional ultrasound with all uses of the HoloUS (2, 1, 1 and 1 for C, H1 (P = 0.0005), H2 (P = 0.01) and H3 (P = 0.002), respectively).
Though not statistically significant, novice users also trended toward faster median task completion time during task H3 versus task C (Appendix D, 37 s vs. 36 s, median = 23.5 s vs. 37.0 s, P = 0.88). Number of head adjustments was again lower while using the HoloLens (1.5, 1.5, 1, and 4 for H1 (P = 0.04), H2 (P = 0.16), H3 (P = 0.001), and C, respectively).
No significant differences were identified in number of attempts, needle redirections, or needle visualization rate between any two task pairs. Additionally, there were no differences in task completion times when comparing users who used conventional ultrasound first to those who started with the HoloUS application (Appendix E).
NASA-TLX responses showed significant differences in assessment of temporal workload, with expert users scoring the task as less burdensome (Fig. 5, Appendix F, mean = 8.4 vs. 19.2 and median = 5.0 vs. 15.0 (P = 0.03) out of 100 for experts vs. novices). Significant positive correlations were identified between most pairs of scales (Appendix G). However, no significant correlation was found between measures of frustration and physical workload, frustration and temporal workload, frustration and performance workload, or physical and performance workload.
Fig. 5.

NASA-Task Load Index (TLX) survey results by specialty and level of training. Total score (A) as well as mental (B), physical (C), temporal (D), performance (E), effort (F), and frustration (G) outcome measures were analyzed.
Discussion
While the work presented in this manuscript focuses on HoloUS, a HoloLens-based AR application, head-mounted displays (HMDs) represent only a subset of the tools aimed at improving the performance of vascular access procedures. Earlier projection devices that predate smart glasses, including those investigated by Wang et al. 32 and Jeon et al. 10, paved the way for in situ visualization of ultrasound images. With the advent of HMDs, the ability to superimpose ultrasound images over their corresponding anatomy continues to advance 11,24,27. Although the HoloLens 2 was utilized in this iteration of the HoloUS application, the technology can be applied to other HMDs with the goal of integrating seamlessly with multiple AR platforms and clinical ultrasound machines. By combining with increasingly common ultrasound machine features like needle guidance and tracking, and expanded interventional options like echogenic access needles, HoloUS has the potential to improve the procedural experience for clinicians and outcomes for patients.
For the assessment of vascular targeting accuracy, careful consideration was taken into designing a simple, but informative task protocol. For vascular access, success is usually confirmed clinically by positive aspiration of blood and subsequent catheterization of the vessel of interest. Ultrasound is especially useful when multiple vessels are in close proximity to each other or to other sensitive structures and can guide needle trajectory and visually confirm placement within the targeted vessel. However, this often occurs after blood has been aspirated using a syringe placed under negative pressure by the proceduralist, or a “flash” of blood is observed in a vascular catheter’s visualization window. To isolate HoloUS’ effect on needle visualization and placement accuracy, we sought to eliminate these visual and tactile cues. Thus, participants were asked to place the needle without manipulating the syringe plunger, verbally indicate when they visualized the needle within the vessel on ultrasound, and only then confirm successful placement by aspirating blood with the syringe.
Our accuracy assessment was based on each participant’s report of needle visualization, with corroboration from study staff. For each study task, users were required to vocalize when they observed the needle in the vessel based on ultrasound imaging. At this point, the ultrasound image was simultaneously assessed by study staff. Accuracy was defined as either 0 or 1, with 0 indicating that the first aspiration attempt after reported needle placement failed and 1 indicating first-attempt success.
Participants understood the theoretical value of the protocol’s accuracy assessment; however, in practice, the coordination of keeping the needle stationary while pulling back on the syringe plunger was difficult to standardize. Novices struggled to withdraw the plunger while keeping the needle’s tip within the vessel. Experts had difficulty disregarding their muscle memory for the experimental task and were often observed making slight needle adjustments when the needle was intended to be stationary, including while withdrawing on the syringe plunger in some cases where the tip was not initially in the vessel. Additionally, some participants became preoccupied with producing the ideal ultrasound view, possibly prolonging task completion time. A possible alternative preclinical model could use an arterial phantom capable of delivering positive pressure on command, eliminating the need for participant action to confirm intravascular needle placement. Despite this limitation, some user variation in needle control is expected and did not significantly affect our accuracy assessments or outcomes.
High variation of baseline task completion time was observed among users, even within those grouped in the same category (experts vs. novices) or medical specialty and level of training (e.g., critical care fellows). To approach this, each participant’s conventional ultrasound task measures were used as their individual control, and each user was randomized to use the conventional ultrasound setup either first or second. Prior to performing the measured tasks on each study day, each user was required to practice the task 3 times with HoloUS to standardize the preparation procedures and ensure that users were familiar with the task protocol.
Even with this training and practice, task performance time on study day 1 was widely variable for many users, and similar patterns were seen regardless of whether conventional ultrasound was used first or second. In most cases, users were slightly slower during their first use of the HoloUS than with the conventional ultrasound, and even slower during task H2. Possible explanations include fatigue, since users completed the task three times in a row on study day 1, or fluctuating performance in the setting of absorbing new and complex skills. This may be especially true for novices, who were often not only becoming acquainted with the HoloLens and AR but were also learning ultrasound and venipuncture skills for the first time.
The other outcomes of task duration, needle redirections, and head adjustments were unambiguous and therefore simpler to assess. In this study, a head adjustment was defined as any movement of the user’s head from its baseline position at the start of the procedure until task completion, as observed by study staff. This included movement up, down, left, right, back, and front, as well as tilt. Our finding of decreased numbers of head movements with use of AR is consistent with previous work by other groups 24,27. Even with AR, the number of head adjustments was nonzero. In most cases, this was due to the tint or metal frame of the physical smart glasses: some users preferred looking down at the needle prior to withdrawing on the syringe, which required them to look “around” the hologram or glasses frame.
Due to large disparities in baseline understanding of ultrasound and needle-based procedures, and unacceptably high variation in task completion times, recruitment for the novice group was stopped prior to recruiting 20 users. However, it is promising that users across all levels of training performed markedly better by study completion, regardless of time interval between study days. Not only were most objective measures improved, but qualitative feedback and overall subjective confidence were also positive. This suggests that use of this application may be particularly suited for training purposes, and though this study was not designed to evaluate HoloUS as a training tool, follow-up studies to assess this are ongoing.
While the primary aim of this study was to assess the effect of using the HoloUS application on the discussed procedural outcomes, we also sought feedback from the 22 clinicians who generously provided their time to participate. This feedback is being actively incorporated into the next version of the HoloUS application to improve its usability, streamline its physical footprint, integrate it with clinical ultrasound machines, and broaden its functional capabilities and customizable features.
All users were given the opportunity to use both modes of the HoloUS application. In general, most agreed that the Tracking Mode was most useful for surveying the phantom’s vascular anatomy since the image position automatically and seamlessly updates based on transducer position. During the actual procedure, most users preferred using Floating Mode since the image is placed directly in the user’s line of sight and remains in a fixed position while small adjustments to transducer position are made.
The most common feedback from expert users was a request for the addition of midline and depth indicators. Since the AR ultrasound projection is not manipulated or processed by the HoloUS application, this can be addressed simply by enabling these features on the ultrasound machine so that they are included in the transmission. Similarly, any feature shown by an ultrasound machine can theoretically be transmitted using HoloUS, including Doppler, motion, and B-modes. Though some image latency is unavoidable, no users reported notable delays during task completion. Additionally, though this manuscript describes the pairing of HoloUS with a conventional two-dimensional ultrasound machine, future work could integrate the application with three- or four-dimensional ultrasound machines, which would enable multiplanar visualization of vascular structures.
Currently, HoloUS and other similar custom applications are not able to adjust settings like ultrasound mode or scan depth using voice commands or hand gestures. This is because most ultrasound machine vendors do not expose the programmatic interface for third-party software manipulation of these settings. One goal of future work will be to establish partnerships that will allow for tighter integration of the application with ultrasound machine software.
Beyond potential improvements in procedural efficiency, a virtual AR screen eliminates the need for a bulky ultrasound machine at the bedside and gives proceduralists greater autonomy during what can be emergent and chaotic procedures in the sickest and most vulnerable patients. Though this work used a pre-clinical phantom-based model, we received enthusiastic support from the diverse clinician-participants and aim to investigate HoloUS in procedures across multiple medical specialties, including anesthesia, intensive care, interventional radiology, and surgery, with the ultimate goal of bringing this technology to the bedside.
Conclusion/Summary
Augmented reality display of live ultrasound images using the Microsoft HoloLens 2 smart glasses is a useful tool for vascular access procedures. Both expert and novice users can be trained to use this technology, and improvements in procedure completion time and number of head adjustments are observed compared with conventional ultrasound. This effect becomes even more pronounced with repeated use of the AR application.
Supplementary Material
Acknowledgements
Research reported in this publication was supported by the National Institutes of Health (R03EB032513, 2022).
The authors would also like to acknowledge the clinicians and volunteers who generously contributed their time and expertise to complete this study and improve the HoloUS application. We would also like to acknowledge Terason’s loan of the portable ultrasound system used in our work.
Footnotes
Conflict of Interest Statement
Raj Shekhar is the founder of IGI Technologies, Inc. Trong Nguyen is an employee of IGI Technologies. All other authors have no conflicts of interest or financial ties to disclose.
Publisher's Disclaimer: This is a PDF file of an unedited manuscript that has been accepted for publication. As a service to our customers we are providing this early version of the manuscript. The manuscript will undergo copyediting, typesetting, and review of the resulting proof before it is published in its final form. Please note that during the production process errors may be discovered which could affect the content, and all legal disclaimers that apply to the journal pertain.
Data Availability Statement
The data that support the findings of this study are openly available in Mendeley Data at http://doi.org/10.17632/p837h9kwnv.2. The HoloUS application can be accessed free of charge from the Microsoft store at https://www.microsoft.com/en-us/p/holoviewus/9nhlk5n3m32p.
References
- 1. Lamperti M, Biasucci DG, Disma N, et al. European Society of Anaesthesiology guidelines on peri-operative use of ultrasound-guided for vascular access (PERSEUS vascular access). Eur J Anaesthesiol. May 2020;37(5):344–376. doi:10.1097/EJA.0000000000001180
- 2. Gu WJ, Wu XD, Wang F, Ma ZL, Gu XP. Ultrasound Guidance Facilitates Radial Artery Catheterization: A Meta-analysis With Trial Sequential Analysis of Randomized Controlled Trials. Chest. Jan 2016;149(1):166–79. doi:10.1378/chest.15-1784
- 3. Lamperti M, Bodenham AR, Pittiruti M, et al. International evidence-based recommendations on ultrasound-guided vascular access. Intensive Care Med. Jul 2012;38(7):1105–17. doi:10.1007/s00134-012-2597-x
- 4. Troianos CA, Hartman GS, Glas KE, et al. Special articles: guidelines for performing ultrasound guided vascular cannulation: recommendations of the American Society of Echocardiography and the Society of Cardiovascular Anesthesiologists. Anesth Analg. Jan 2012;114(1):46–72. doi:10.1213/ANE.0b013e3182407cd8
- 5. Gu WJ, Tie HT, Liu JC, Zeng XT. Efficacy of ultrasound-guided radial artery catheterization: a systematic review and meta-analysis of randomized controlled trials. Crit Care. May 2014;18(3):R93. doi:10.1186/cc13862
- 6. Zanolla GR, Baldisserotto M, Piva J. How useful is ultrasound guidance for internal jugular venous access in children? J Pediatr Surg. Apr 2018;53(4):789–793. doi:10.1016/j.jpedsurg.2017.08.010
- 7. Pittiruti M. Ultrasound guided central vascular access in neonates, infants and children. Curr Drug Targets. Jun 2012;13(7):961–9. doi:10.2174/138945012800675696
- 8. García-Vázquez V, von Haxthausen F, Jäckle S, et al. Navigation and visualisation with HoloLens in endovascular aortic repair. Innov Surg Sci. Sep 2018;3(3):167–177. doi:10.1515/iss-2018-2001
- 9. Kuhlemann I, Kleemann M, Jauer P, Schweikard A, Ernst F. Towards X-ray free endovascular interventions - using HoloLens for on-line holographic visualisation. Healthc Technol Lett. Oct 2017;4(5):184–187. doi:10.1049/htl.2017.0061
- 10. Jeon Y, Choi S, Kim H. Evaluation of a simplified augmented reality device for ultrasound-guided vascular access in a vascular phantom. J Clin Anesth. Sep 2014;26(6):485–9. doi:10.1016/j.jclinane.2014.02.010
- 11. Jang YE, Cho SA, Ji SH, et al. Smart Glasses for Radial Arterial Catheterization in Pediatric Patients: A Randomized Clinical Trial. Anesthesiology. Oct 2021;135(4):612–620. doi:10.1097/ALN.0000000000003914
- 12. Mahmood F, Mahmood E, Dorfman RG, et al. Augmented Reality and Ultrasound Education: Initial Experience. J Cardiothorac Vasc Anesth. Jun 2018;32(3):1363–1367. doi:10.1053/j.jvca.2017.12.006
- 13. Mill T, Parikh S, Allen A, et al. Live streaming ward rounds using wearable technology to teach medical students: a pilot study. BMJ Simul Technol Enhanc Learn. 2021;7(6):494–500. doi:10.1136/bmjstel-2021-000864
- 14. Sutherland J, Belec J, Sheikh A, et al. Applying Modern Virtual and Augmented Reality Technologies to Medical Images and Models. J Digit Imaging. Feb 2019;32(1):38–53. doi:10.1007/s10278-018-0122-7
- 15. Quero G, Lapergola A, Soler L, et al. Virtual and Augmented Reality in Oncologic Liver Surgery. Surg Oncol Clin N Am. Jan 2019;28(1):31–44. doi:10.1016/j.soc.2018.08.002
- 16. Hatzl J, Böckler D, Hartmann N, et al. Mixed reality for the assessment of aortoiliac anatomy in patients with abdominal aortic aneurysm prior to open and endovascular repair: Feasibility and interobserver agreement. Vascular. Apr 2022:17085381221081324. doi:10.1177/17085381221081324
- 17. Laverdière C, Corban J, Ge S, et al. Augmented-reality-guided insertion of sliding hip screw guidewire: a preclinical investigation. Can J Surg. May-Jun 2022;65(3):E364–E371. doi:10.1503/cjs.025620
- 18. Puladi B, Ooms M, Bellgardt M, et al. Augmented Reality-Based Surgery on the Human Cadaver Using a New Generation of Optical Head-Mounted Displays: Development and Feasibility Study. JMIR Serious Games. Apr 2022;10(2):e34781. doi:10.2196/34781
- 19. Tanzer M, Laverdière C, Barimani B, Hart A. Augmented Reality in Arthroplasty: An Overview of Clinical Applications, Benefits, and Limitations. J Am Acad Orthop Surg. May 2022;30(10):e760–e768. doi:10.5435/JAAOS-D-21-00964
- 20. Vávra P, Roman J, Zonča P, et al. Recent Development of Augmented Reality in Surgery: A Review. J Healthc Eng. 2017;2017:4574172. doi:10.1155/2017/4574172
- 21. von Haxthausen F, Moreta-Martinez R, Pose Díez de la Lastra A, Pascau J, Ernst F. UltrARsound: in situ visualization of live ultrasound images using HoloLens 2. Int J Comput Assist Radiol Surg. Jul 2022. doi:10.1007/s11548-022-02695-z
- 22. Montemurro N, Condino S, Carbone M, et al. Brain Tumor and Augmented Reality: New Technologies for the Future. Int J Environ Res Public Health. May 2022;19(10). doi:10.3390/ijerph19106347
- 23. Qian L, Song T, Unberath M, Kazanzides P. AR-Loupe: Magnified Augmented Reality by Combining an Optical See-Through Head-Mounted Display and a Loupe. IEEE Trans Vis Comput Graph. Jul 2022;28(7):2550–2562. doi:10.1109/TVCG.2020.3037284
- 24. Lim H, Kim MJ, Park JM, et al. Use of smart glasses for ultrasound-guided peripheral venous access: a randomized controlled pilot study. Clin Exp Emerg Med. Dec 2019;6(4):356–361. doi:10.15441/ceem.19.029
- 25. Rüger C, Feufel MA, Moosburner S, Özbek C, Pratschke J, Sauer IM. Ultrasound in augmented reality: a mixed-methods evaluation of head-mounted displays in image-guided interventions. Int J Comput Assist Radiol Surg. Nov 2020;15(11):1895–1905. doi:10.1007/s11548-020-02236-6
- 26. Tanwani J, Alam F, Matava C, et al. Development of a Head-Mounted Holographic Needle Guidance System for Enhanced Ultrasound-Guided Neuraxial Anesthesia: System Development and Observational Evaluation. JMIR Form Res. Jun 2022;6(6):e36931. doi:10.2196/36931
- 27. Wu TS, Dameff CJ, Tully JL. Ultrasound-guided central venous access using Google Glass. J Emerg Med. Dec 2014;47(6):668–75. doi:10.1016/j.jemermed.2014.07.045
- 28. Suzuki K, Morita S, Endo K, et al. Learning effectiveness of using augmented reality technology in central venous access procedure: an experiment using phantom and head-mounted display. Int J Comput Assist Radiol Surg. Jun 2021;16(6):1069–1074. doi:10.1007/s11548-021-02365-6
- 29. Kuzhagaliyev T, Clancy N, Janatka M, et al. Augmented reality needle ablation guidance tool for irreversible electroporation in the pancreas. Proc SPIE Medical Imaging. 2018;10576.
- 30. Farshad-Amacker NA, Bay T, Rosskopf AB, et al. Ultrasound-guided interventions with augmented reality in situ visualisation: a proof-of-mechanism phantom study. Eur Radiol Exp. Feb 2020;4(1):7. doi:10.1186/s41747-019-0129-y
- 31. Nguyen T, Plishker W, Matisoff A, Sharma K, Shekhar R. HoloUS: Augmented reality visualization of live ultrasound images using HoloLens for ultrasound-guided procedures. Int J Comput Assist Radiol Surg. Feb 2022;17(2):385–391. doi:10.1007/s11548-021-02526-7
- 32. Wang D, Amesur N, Shukla G, et al. Peripherally inserted central catheter placement with the sonic flashlight: initial clinical trial by nurses. J Ultrasound Med. May 2009;28(5):651–6. doi:10.7863/jum.2009.28.5.651
- 33. Kraft V, Strehlow J, Jäckle S, et al. A comparison of streaming methods for the Microsoft HoloLens. In: Tagungsband der 18. Jahrestagung der Deutschen Gesellschaft für Computer- und Roboterassistierte Chirurgie (CURAC); 212–216.
- 34. Tubbs-Cooley HL, Mara CA, Carle AC, Gurses AP. The NASA Task Load Index as a measure of overall workload among neonatal, paediatric and adult intensive care nurses. Intensive Crit Care Nurs. 2018;46:64–69.
- 35. Hoonakker P, Carayon P, Gurses AP, et al. Measuring workload of ICU nurses with a questionnaire survey: the NASA Task Load Index (TLX). IIE Trans Healthc Syst Eng. 2011;1(2):131–143.
- 36. Hart SG. NASA-task load index (NASA-TLX); 20 years later. Sage Publications; 2006:904–908.
- 37. Hancock PA, Meshkati N. Human Mental Workload. North-Holland; 1988.