BMC Medical Education. 2026 Feb 7;26:403. doi: 10.1186/s12909-026-08648-3

Delivering remote procedural training on peripheral intravenous cannulation to medical students using the Microsoft Hololens 2: a mixed-methods evaluation

Murray Connolly 1, Gabriella Iohom 1, Niall O’Brien 2, James Volz 2, Aogán O’Muircheartaigh 3, Paschalitsa Serchan 3, George Shorten 4
PMCID: PMC12977549  PMID: 41654883

Abstract

Background

Head-mounted displays (HMDs) with mixed reality capabilities offer a range of functions which may improve medical training and address certain challenges facing universities worldwide. Devices such as the Microsoft HoloLens 2 can facilitate the provision of procedural tutorials from remote locations, provide tutors with a first-person perspective of a student’s field of view, and allow the insertion of holographic artefacts which can replicate in-person tutorials and facilitate vertical integration of curriculum elements. This study aimed to evaluate the feasibility, efficacy and usability of the Microsoft HoloLens 2 in providing tutorials to medical students on peripheral intravenous cannulation from a remote location.

Methods

Medical student volunteers were randomly allocated to an In-Person (IP) group and a HoloLens (HL) group. Students in the IP group received a tutorial on intravenous cannulation in person; the HL group received the tutorial remotely via the HoloLens. Both groups completed an initial metrics-based competency assessment, received a period of metrics-based feedback which informed a period of deliberate practice, and then completed a second competency assessment. Students in the HL group received a familiarisation session with the device, completed a Mental Rotations Test, provided feedback, and completed a System Usability Scale (SUS) questionnaire; three of these students completed interviews. One additional student completed a Think Aloud session.

Results

Seventeen student volunteers were recruited. Sixteen students underwent block randomisation, and one additional student completed the Think Aloud session. Baseline characteristics of the two groups were similar. Mean initial competency scores were similar for the HL and IP groups, both groups showed significant improvements in performance, and there was no significant difference between the groups’ post-feedback scores (p = 0.11). Students’ mean SUS score was 79.4 (standard deviation 13.1). Quantitative and qualitative feedback was relatively positive but identified several usability and technological limitations, including the development of headaches and visual fatigue.

Conclusions

This study demonstrates that tutorials on intravenous cannulation delivered remotely using the HoloLens 2 are feasible, produce an improvement in student performance similar to that of in-person tutorials, and are relatively well received by students and tutors. The ability to deliver procedural tutorials to a remote location has the potential to be a significant asset to medical educators. However, the study also highlights several usability issues, including the development of “cybersickness” symptoms, which may limit widespread adoption by educational institutions and which warrant further investigation.

Supplementary Information

The online version contains supplementary material available at 10.1186/s12909-026-08648-3.

Keywords: Mixed reality, Augmented reality, Medical education, Remote education, Telementorship, Cannulation, Microsoft HoloLens 2

Introduction

In general, medical education facilitates the attainment of scientific knowledge, professionalism, and basic clinical skills. Traditionally the majority of medical training has been delivered in person [1–3]. However, medical schools worldwide are facing common challenges, particularly regarding training in procedural skills. Medical school class sizes are increasing in many jurisdictions, with limited availability of faculty, patients, and teaching facilities [4]. Increasing levels of scrutiny regarding infection control measures may limit student access to clinical areas, and increasingly institutions take account of the environmental impact of student and faculty travel [5–8]. Technological advances, particularly in information technology, have often been incorporated into medical education curricula, and may help to address the current challenges [9]. Recent advances in the areas of Head-Mounted Displays (HMDs) and virtual, augmented, and mixed reality software may offer novel educational options and may facilitate the delivery of procedural training on simulated models, remotely. However, prior to their incorporation into medical school curricula, thorough evaluation of the benefits, limitations and costs associated with these media is warranted.

Simulation training in medical education

There has been a gradual shift towards the use of simulation as an educational tool. Simulation Based Medical Education (SBME) for procedural skills has several advantages over traditional methods. It facilitates the creation of a controlled learning environment, enables repeated deliberate practice of procedural skills, and provides a means for delivery of structured feedback, without the inherent risks to live patients [10–12]. Simulation-based tutorials on IV cannulation have been shown to be as or more effective than bedside teaching on live volunteers [13]. Initial models of simulation-based training involved face-to-face tutorials with tutors who were present in the same location as those learning. However, advances in technology have presented opportunities to provide procedural training remotely. Initially this was facilitated using standard webcam technology, and more recently has involved the use of HMDs, some of which incorporate augmented or mixed reality technology.

Mixed reality, and its use in medical training

Augmented reality (AR) describes an environment in which holographic artefacts are overlaid upon the physical surroundings within the user’s view; mixed reality (MR) is an extension of AR which enables the holographic artefacts to interact with the physical environment [14, 15]. The number of HMDs capable of creating MR environments is steadily increasing, and their use has been evaluated in a wide range of industries, including healthcare, education, and manufacturing [16–21].

Specific to medical education, studies have evaluated the use of HMDs in providing training in such areas as anatomy, virtual ward-rounds, bedside tutorials and procedural training [18, 22–31]. Some studies have evaluated the use of HMDs in providing clinical telementorship, the interactive guidance of students from a remote location [32, 33]. Many other studies evaluating the use of HMDs in procedural training have focused on the development of virtual anatomical or patient models on which to practice [34–38].

The Microsoft HoloLens 2 is a head-mounted-display (HMD) with internet connectivity and MR functions which has specific capabilities that may be of use in procedural training from a remote location. The device can facilitate remote two-way audiovisual communication between tutor and student; it provides the tutor with a first-person perspective of the student’s field of vision, and it enables the tutor to insert holographic artefacts such as pointers, diagrams, models and images into the student’s field of view. Importantly, the insertion of media such as anatomical diagrams and biological models may facilitate vertical integration of basic scientific with clinical content, which aligns with the Constructivist pedagogical theory [39, 40].

Limitations of head-mounted displays in procedural training

The use of HMDs by students has several potential limitations. There is a significant cost associated with the devices and licences, a secure reliable internet connection is required, and both tutors and students require competency in the use of the device and its programme interface. Tutorials delivered remotely via HMDs may not be as effective as in-person tutorials or may only be as effective for specific procedures. There is also the potential that even if HMD-facilitated tutorials are shown to be effective, they may not be deemed acceptably usable by students or tutors. There is mixed evidence regarding the effect of participants’ visuospatial abilities on technical skills acquisition by novice medical trainees [41–43]. Several studies have indicated that users’ visuospatial skills influence both their performance while using HMDs and their perceptions of the sessions, which may be important variables in the generalisability of such a tutorial format [42–45].

Study goals

There is a relative dearth of evidence comparing MR-enhanced procedural tutorials to standard teaching practices, and there is mixed evidence regarding the effects of user visuospatial skills on MR-enhanced education. This study aims to evaluate the use of the Microsoft HoloLens 2 device to deliver procedural tutorials on peripheral intravenous cannula insertion to medical students from a remote location. Peripheral Intravenous Cannula (PIVC) insertion was chosen for evaluation in this study as it is a common invasive procedure and one which is included in most medical school curricula. We hypothesized that HoloLens-based remote tutorials would be non-inferior to in-person training in terms of competency improvement.

Specific objectives

  1. To evaluate the feasibility of delivering PIVC insertion tutorials remotely via the Microsoft HoloLens 2.

  2. To assess the learning efficacy of these tutorials versus in-person tutorials.

  3. To evaluate the usability of the device in this context.

  4. To investigate the effects of students’ visuospatial skills on their performance improvement and perceived usability of the device.

  5. To elicit the perceptions of students and the tutor regarding the MR-enhanced tutorials.

Methods

This study was approved by the Social Research Ethics Committee of University College Cork and the University College Cork Research and Postgraduate Affairs Committee [Log 2021-197]. All participants, including the students, tutor and technical facilitator (the latter two of whom were also investigators), provided written informed consent prior to inclusion in the study.

Study population

Medical students attending an acute tertiary referral University hospital for a clinical attachment with the Department of Anaesthesia and Intensive Care Medicine at UCC were invited to participate in the study. All participants were eighteen years or over and provided information on their age, gender, previous PIVC training and the number of PIVCs they had previously performed on live patients.

Tutorial structure

Tutorials were delivered across a single academic year. Participants were allocated using block randomisation into two groups: in-person (IP) and HoloLens (HL). Students in the IP group received one-to-one training in-person by the tutor. Students in the HL group wore the HoloLens 2 device throughout the tutorial and received one-to-one training from the tutor remotely. The tutor connected to the HoloLens 2 utilising the Microsoft Teams application over secure institutional Wi-Fi (Eduroam).

Prior to delivery of the tutorial, students allocated to the HL group completed a Mental Rotations Test-A (MRT-A) in order to assess their visuospatial skills [Additional File 1] [46]. They also received structured training in the use of the HoloLens device in order to ensure competence in establishing a connection with the tutor and interacting with the holographic artefacts.

The HoloLens familiarisation training was developed by the tutor (MC) and technical facilitator (NOB) who both had significant experience in the use of the device, and trialled with a medical student coinvestigator (JV). The training made use of the Microsoft Tips application and provided the students with competence in device start-up, initiation of a Microsoft Teams call, interaction with holographic MR artefacts and appropriate placement of artefacts in their environment around their simulation workstation [47]. Details of the training structure are available in the Additional File 2.

Students performed cannulation on an artificial cannulation training model. Baseline competence in PIVC was assessed using a 29-point validated assessment tool [Additional File 3] [48]. Students then received metrics-based feedback from the tutor and took part in a supervised practice session, the duration of which was measured. Metrics-based feedback was provided in a stepwise manner in order to inform sequential deliberate practice of the procedure.

During the feedback session for students in the HL group, the tutor utilised various MR capabilities of the HMD. This included the introduction of holographic artefacts such as the Metrics Score Sheet, anatomical diagrams of the venous system and procedural diagrams of cannulation. The tutor was also able to insert holographic pointers and annotations to highlight relevant areas on the simulation arm or on the displayed diagrams. An example of the student’s field of vision, including the holographic metrics sheet, pointer, and Microsoft Teams interface, is illustrated in Fig. 1.

Fig. 1.

Fig. 1

Student view via HoloLens

Students in the IP group received identical resources (Metrics Score Sheet, anatomical diagrams of the venous system and procedural diagrams of cannulation), which were provided as physical printed copies instead of holographic images. The tutor physically pointed out or highlighted relevant material, in place of the holographic pointers and annotations which the HL group received.

Subsequent to the feedback session, students in both groups completed a second PIVC competency assessment.

Resources employed

Resources necessary to provide the tutorials via the HoloLens included capital costs of the HoloLens device (€3500) as well as annual licence costs of €275 per user (n = 4). Human resources employed in developing the tutorials and trialling equipment included approximately 20 h of training, remote assistance (Microsoft) and collaboration between the tutor (MC), Professor (GS) and facilitator (NOB), as well as 8 h input from the Senior Lecturer (GI).

Internet connectivity

An internet connection with at least 1.5 Mbps of bandwidth is recommended by Microsoft for the best audio, visual and content-sharing experience [49]. Secure, password-protected wireless internet access via the University institutional network (Eduroam) was utilised by both tutor and students.

Hardware

Tutorials were hosted by an MSI personal computer running the Windows 10 operating system. The assessments were recorded using an Anker PowerConf C300 camera.

Software

The Dynamics 365 Remote Assist application was used, in tandem with Microsoft Teams, to host each video call. This connection allowed the tutor to see the student’s field of vision and facilitated two-way communication. Hand gestures including the “hand-ray”, “air-tap”, “air-tap and hold” and “start-gesture” were used to control the HMD and manipulate the holographic artefacts. Relevant holographic artefacts were superimposed during the tutorial. This included the insertion of diagrammatic representations of venous anatomy, PIVC insertion and the metrics against which the students were being assessed. The holographic pointer and “drawing” functions were used by the tutor to highlight relevant points and emphasise information on the holographic diagrams.

Assessment of learning efficacy

Pre-tutorial and post-tutorial metrics-based competency assessments for both groups were video recorded and then scored by two tutors. Prior to assessment of student performance, both tutors received training in the assessment tool and carried out a trial assessment on a sample video. Before assessing videos acquired in the study, inter-rater reliability was assessed for these assessors using proportionate agreement, with an acceptance threshold of ≥ 0.8. Scores from the two tutors were then averaged for each individual assessment.
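Proportionate agreement, as used here, is simply the fraction of checklist items on which the two raters gave the same score. The following is a minimal sketch of that calculation; the scores shown are hypothetical, not data from the study.

```python
# Illustrative sketch (not the study's analysis code): proportionate agreement
# between two raters scoring the same 29-item binary checklist.

def proportionate_agreement(rater_a, rater_b):
    """Fraction of items on which the two raters gave the same score."""
    if len(rater_a) != len(rater_b):
        raise ValueError("Raters must score the same number of items")
    agreements = sum(a == b for a, b in zip(rater_a, rater_b))
    return agreements / len(rater_a)

# Hypothetical scores for one assessment (1 = criterion met, 0 = not met)
rater_a = [1, 1, 0, 1, 1] * 5 + [1, 0, 1, 1]   # 29 items
rater_b = [1, 1, 0, 1, 1] * 5 + [1, 1, 1, 1]   # disagrees on one item

pa = proportionate_agreement(rater_a, rater_b)
print(f"Proportionate agreement: {pa:.3f}")  # 28/29, above the 0.8 threshold
```

An agreement of 28/29 items would comfortably exceed the study's ≥ 0.8 acceptance threshold.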

Assessment of student perceptions

Immediately after completion of each tutorial, students in the HL group completed a modified Evaluation of Technology-Enhanced Learning Materials: Learner Perceptions (ETELM-LP) questionnaire in order to assess their perceptions of the tutorial, which incorporated a seven-point Likert Scale and open questions [Additional File 4] [50]. Cronbach’s Alpha was calculated to test for internal consistency, after exclusion of question 1 and reverse scoring of questions 10 and 12.
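The internal-consistency step described above can be sketched as follows. This is a hypothetical illustration, not the study's analysis code: the responses are invented, and on a seven-point scale a reverse-scored item maps response r to 8 − r.

```python
# Illustrative sketch: Cronbach's alpha for Likert-scale responses, with
# reverse scoring of negatively worded items on a 7-point scale.
import statistics

def reverse_score(item_responses, scale_max=7):
    """Reverse a negatively worded item so that high values mean agreement."""
    return [scale_max + 1 - r for r in item_responses]

def cronbach_alpha(items):
    """items: one list of responses per questionnaire item (same respondents)."""
    k = len(items)
    item_vars = [statistics.variance(item) for item in items]
    totals = [sum(resp) for resp in zip(*items)]   # per-respondent total score
    total_var = statistics.variance(totals)
    return (k / (k - 1)) * (1 - sum(item_vars) / total_var)

# Hypothetical responses from 8 students to 4 items; the third item is
# negatively worded, so it is reverse scored before computing alpha.
items = [
    [6, 7, 5, 6, 7, 6, 5, 7],
    [6, 6, 5, 7, 7, 5, 6, 7],
    reverse_score([2, 1, 3, 2, 1, 3, 2, 1]),
    [7, 6, 6, 6, 7, 5, 6, 7],
]
alpha = cronbach_alpha(items)
print(f"Cronbach's alpha: {alpha:.3f}")
```

Values of alpha above roughly 0.7 are conventionally read as acceptable internal consistency, which is how the study interprets its value of 0.776.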

Students in the HL group were invited to take part in interviews after the sessions. The semi-structured interviews were completed via Microsoft Teams. Researchers undertook this study from an interpretive approach [51]. JV conducted each of the three interviews. A template of questions and associated probes was utilised to guide the interviews, and specific points were expanded upon as appropriate [Additional File 5].

One additional student, who was not part of the randomised trial, completed a moderated Think Aloud session. In this session, the student was instructed on the Think Aloud Protocol and proceeded to complete the HoloLens familiarisation training and perform insertion of a PIVC on the simulation model while wearing the HoloLens device. They verbally projected their thoughts and perceptions throughout in line with Think Aloud practices [52].

The interviews and Think Aloud session were recorded, transcribed, and subsequently analysed using Dedoose Qualitative Research Software Version 4.3 [53]. Qualitative data from interviews and feedback questionnaires were coded thematically in line with Clarke and Braun’s guidelines for qualitative analysis [54]. This involved a six-step process of data familiarisation, code generation, theme identification, theme review, theme definition and naming, and report generation. We used a combination of inductive and deductive coding, evaluating the data from both bottom-up and top-down approaches, so that some codes were derived entirely from the qualitative data and others were predetermined. To control for potential bias in qualitative analysis, data were independently coded by JV and MC and subsequently reviewed by GI. Where differences in coding occurred, the relevant data were examined by the three collaborators and a consensus was reached. We then strengthened the interpretation of results through further content analysis, and illustrative quotes were chosen to represent relevant themes and subthemes. Quotes were selected once they were determined to be illustrative of relevant points, reflective of patterns observed in the interviews, and adequately succinct [55].

On completion of the tutorials, students in the HL group completed a System Usability Scale (SUS) questionnaire [56].
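For reference, the standard SUS scoring procedure converts ten 1–5 responses to a 0–100 score: odd-numbered (positively worded) items contribute response − 1, even-numbered (negatively worded) items contribute 5 − response, and the sum is multiplied by 2.5. A minimal sketch, using a hypothetical response set:

```python
# Illustrative sketch of standard System Usability Scale (SUS) scoring.

def sus_score(responses):
    """responses: 10 answers on a 1-5 scale, in questionnaire order."""
    if len(responses) != 10:
        raise ValueError("The SUS has exactly 10 items")
    total = 0
    for i, r in enumerate(responses):
        if i % 2 == 0:          # items 1, 3, 5, 7, 9: positively worded
            total += r - 1
        else:                   # items 2, 4, 6, 8, 10: negatively worded
            total += 5 - r
    return total * 2.5          # scale the 0-40 raw sum to 0-100

# A hypothetical student giving broadly positive responses
print(sus_score([5, 2, 4, 1, 4, 2, 5, 1, 4, 2]))  # 85.0
```

Note that the resulting 0–100 value is not a percentage; it is conventionally interpreted against benchmarks such as the 68-point average discussed in the Results.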

Assessment of tutor perceptions

On completion of the final tutorial, the tutor summarised their perceptions of providing the tutorials remotely via the HMD.

Data processing

A paired two-sample t-test was used to compare student group ages, previous cannulation experience, and duration of feedback and supervised practice. The Shapiro-Wilk test was performed to assess normality of the distribution of student assessment scores. The Mann-Whitney U test was performed to compare differences between pre-tutorial and post-tutorial assessment scores and between the HL and IP group performances. Cohen’s d was calculated for each group to evaluate the effect size of each intervention arm. The influence of students’ MRT-A scores on change in assessment scores and on SUS scores was tested using linear regression and the Pearson correlation test. Post-hoc power analysis was performed using G*Power Version 3.1.9.7.
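The statistical pipeline above can be sketched with SciPy. This is an illustrative example on simulated data only; the study's actual analysis code and raw data are not reproduced here, and the numbers below merely echo the magnitudes reported in the Results.

```python
# Illustrative sketch (hypothetical, simulated data) of the statistical tests
# named above, using SciPy; not the study's actual analysis.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
pre = rng.normal(18.5, 2.7, 8)    # simulated pre-feedback scores, n = 8
post = rng.normal(26.7, 1.9, 8)   # simulated post-feedback scores, n = 8

# Shapiro-Wilk: assess normality before choosing a non-parametric test
print("Shapiro-Wilk p:", stats.shapiro(post).pvalue)

# Mann-Whitney U: compare score distributions
print("Mann-Whitney U p:", stats.mannwhitneyu(pre, post).pvalue)

# Cohen's d with a pooled standard deviation
def cohens_d(a, b):
    pooled = np.sqrt(((len(a) - 1) * np.var(a, ddof=1)
                      + (len(b) - 1) * np.var(b, ddof=1))
                     / (len(a) + len(b) - 2))
    return (np.mean(b) - np.mean(a)) / pooled

print("Cohen's d:", cohens_d(pre, post))

# Pearson correlation, e.g. MRT-A score vs change in assessment score
mrt = rng.normal(13.75, 4.8, 8)   # simulated MRT-A scores
change = post - pre
r, p = stats.pearsonr(mrt, change)
print(f"Pearson r = {r:.2f}, p = {p:.2f}")
```

With a sample of eight per group, the non-parametric Mann-Whitney U test is a conservative choice when normality cannot be assumed, which is consistent with the study's reporting.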

Results

Sixteen undergraduate medical students volunteered to participate in the tutorials and were allocated equally to the HL and IP groups. Baseline characteristics of the two groups were similar and are summarised in Table 1. There were no significant differences in age, number of previous PIVC tutorials or previous PIVCs performed on live patients. One additional student performed a Think Aloud session.

Table 1.

Baseline characteristics of study participants allocated to the in-person and HoloLens groups. SD: standard deviation; PIVC: peripheral intravenous cannula; MRT-A: Mental Rotations Test-A

In-Person Group (n = 8) HoloLens Group (n = 8) p-Value
Gender, male: female 3:5 4:4
Age, years, Mean (SD) 24.1 (2.4) 23.8 (2.3) 0.75
Previous PIVC tutorial, mean 1 1
Previous PIVC completed, mean (SD) 1.5 (1.9) 2.5 (3.3) 0.48
MRT-A Score, mean (SD) 13.75 (4.8)

Feasibility

We found that it was feasible to use the HoloLens 2 to facilitate remote one-to-one delivery of procedural training on PIVC insertion. No tutorials were cancelled or postponed due to technology-related issues. The tutorials were dependent on secure Wi-Fi access for both tutor and students and the delivery of pre-tutorial device familiarisation training to the students.

Total feedback and supervised practice time was similar for the two groups, presented as (mean, SD) in minutes: seconds, (31:08, 04:15) and (30:07, 04:10) for the HL and IP groups respectively, p = 0.64.

Learning efficacy

The combined inter-rater reliability was > 0.8 for each metric scored by the two assessors. The mean inter-rater reliability across all assessments was 0.953.

Student performances in the metrics-based competency assessments are illustrated in Fig. 2. Data were not normally distributed and are presented as (mean, SD). The HL and IP groups were similar in terms of performance in the pre-feedback competency assessment, (18.56, 2.69) and (18.63, 3.1) respectively, p = 0.96. The HL and IP groups showed statistically significant improvements in their post-feedback assessment scores, (26.69, 1.85) and (27.94, 1.37) respectively, p < 0.001 in both groups. Cohen’s d was 3.51 and 3.87 for the HL and IP groups respectively, showing a large effect size. The HL and IP groups were similar in terms of performance in the post-feedback assessment, p = 0.11. Cohen’s d comparing the HL and IP groups, with HL as the intervention group, was − 0.77, with post-hoc analysis displaying a power of 0.41. Details of students’ performances in each metric are provided in Additional File 6.

Fig. 2.

Fig. 2

Student competency assessment scores. HL: HoloLens group; IP: in-person group

Students’ visuospatial skills, as measured by the MRT-A, were not significantly associated with the pre- to post-training change in assessment scores, as illustrated in Fig. 3. The correlation coefficient was 0.29 (p = 0.48) and R-squared was 0.08.

Fig. 3.

Fig. 3

Effect of Mental Rotations Test-A (MRT-A) score on change in assessment score

Student quantitative feedback

Student responses to the modified ETELM-LP questionnaire are summarised in Fig. 4. Results are presented as (mean, SD) and refer to a seven-point Likert scale, where 1 = strongly disagree and 7 = strongly agree. The students had little previous experience of MR (1.63, 1.19), deemed the visual quality to be clear (6.75, 0.46), and found the MR artefacts useful (6.75, 0.46). They agreed that the tutorial replicated an in-person tutorial (5.75, 0.7) and that the technology supported the learning objectives (6.5, 0.75). They did not agree that the MR elements served as a distraction or that the tutorial required inappropriately high technology skills (2.6, 1.6 and 2.3, 1.2 respectively). Students gave mixed responses regarding the audio clarity during the tutorial (4.75, 1.5), with two students scoring it negatively. Cronbach’s Alpha for the student quantitative feedback questionnaire was 0.776, displaying acceptable internal consistency.

Fig. 4.

Fig. 4

Student modified ETELM-LP results. Seven-point Likert scale with 7 as strongly agree and 1 as strongly disagree. ETELM-LP: Evaluation of Technology-Enhanced Learning Materials – Learner Perceptions

System Usability Scale scores are presented in Fig. 5. The mean SUS score was 79.4 (SD = 13.1), with all but one student scoring the system above 68, the value generally regarded as the threshold for above-average usability.

Fig. 5.

Fig. 5

Student System Usability Scale score

Students’ visuospatial skills, as measured by the MRT-A, were not significantly associated with System Usability Scale scores (Fig. 6). The correlation coefficient was − 0.19 (p = 0.64) and R-squared was 0.04.

Fig. 6.

Fig. 6

Effect of student MRT-A score on SUS score. MRT-A: Mental Rotations Test-A; SUS: System Usability Scale

Student qualitative feedback results

All eight students in the HL group provided written feedback, three undertook semi-structured interviews and one additional student completed a Think Aloud session. Analysis of the written feedback, interview transcripts, and Think Aloud session transcript identified 63 specific positive and 45 negative comments. Analysis of the transcripts revealed five primary themes: positive comments on the MR technology (25 comments), the use of the HMD to provide remote supervision (18), engaging features of the MR artefacts (8), usability issues of the Microsoft HoloLens 2 (33), and limitations of the use of MR technology (10) (Table 2).

Table 2.

Student qualitative feedback illustrative quotes

Themes Illustrative quotes
Positive comments on MR technology

“The use of virtual reality is exciting as more senses are incorporated as I learn. Beyond vision and hearing, I could use my body more.”

“Overall, I really enjoyed using it. It’s relatively easy to use and it doesn’t take a lot of training to figure out how to use it.”

The use of the HMD to provide remote supervision

“The course felt good as I felt I had support throughout the process, and the tutor could see exactly what I saw to help me adjust.”

“Real-time feedback I guess would be the best. Yeah, having a one-on-one, having the doctor walk me through the procedure was helpful to solidify what I was trying to learn.”

Engaging features of MR holographic elements

“Pretty unique experience to have a 3D view”

“It was really nice that when I looked down and was just working at it when I looked back up, I could see the Zoom call or just the face of whoever is teaching me, and right away could speak to them and get that almost non-verbal communication.”

Usability issues of the HoloLens 2

“The menu kept popping up when you tried to put on your gloves or something. In a testing situation that could be a point where people might get flustered”

“Whenever I would move my arm in the IV cannula sometimes that would inadvertently return it to the home screen for example. And then I would have to repeat the process of turning that part off, and then proceeding with the rest of the IV cannulation.”

“One other drawback was that the HoloLens inhibited my visual field.”

“I think maybe just it’s the headache. The headache definitely plays a role.”

“I know at some point, I guess later on in the session, I started to have a bit of a headache. I guess from, I’m not sure if it’s from like the light’s kind of flashing around – not constantly but the whole process of wearing and using it started to wear down on me.”

“I didn’t want to look at it and felt like my eyes were straining. It might just be the headache reason, like my eyes were just straining because I was using the HoloLens.”

Limitations of MR technology

“Sometimes I wouldn’t be able to perform the task, and it took me a while to get a grip on how to use it properly.”

“Only issue was connection dropping from time to time”

“Audio cut out on occasion”

Overall, most students expressed excitement about the technology and its use in procedural tutorials. All three interviewees stated that they found the HL-facilitated tutorials equally or more effective than standard in-person tutorials. All three interviewees reported a 6 or 7 out of 7 on a Likert scale when asked if they would recommend that HoloLens-facilitated tutorials be incorporated in medical education programmes, and if they would recommend this MR-facilitated tutorial structure to peers.

There were a significant number of both positive and negative comments regarding the usability of the HoloLens device. Students described a significant learning curve associated with using the HMD. Importantly, students expressed a greater comfort with the device as the sessions progressed and a number stated that their interaction with it became more “fluid”.

The development of a headache or eye strain was reported by some students as the tutorial progressed, noted as “later on” during the tutorial by one student interviewee, and one commented that the HoloLens negatively affected their field of vision.

One of the most common usability issues while performing the cannulation procedure arose when the HoloLens erroneously identified procedural movements, such as glove donning or cannula advancement, as “gestures”, which resulted in the device’s holographic “Home” screen, or other artefacts such as the pointer hologram, appearing in the student’s field of vision. This served as a significant distraction to the students and required a shift in the student’s attention to remedy.

Think aloud session

Analysis of the Think Aloud session focused on the usability of the device in the context of procedural training. The student initially struggled with elements of the HoloLens familiarisation session, including the spatial orientation and manipulation of the holographic artefacts. The student stated that they felt these objects did not initially respond as expected to their movements, and found it challenging to achieve the core competencies required to use the HoloLens, including moving, rotating, and enlarging the virtual elements. These difficulties in completing the tasks in the “Tips” application are illustrated by quotes such as “So there is a ray and I am trying to use it to touch the gems, I’m just trying to figure out how to line it up with the ray that they have, it’s not really working” and “that didn’t work. I’m going to try again, yeah, it’s not working.” They also stated that the HoloLens affected their vision while performing the cannulation process but qualified this by noting that they were still able to perform the task: “It’s a little bit hard, because it kind of like impairs your field of view, but it’s not to the point that I can’t do it at all”, “The headset it was fine honestly. The only annoying thing was that the field of view was kind of obstructed. And it was a little bit annoying to have to wear it, but I can still do it I think”.

Tutor feedback

The tutor (MC) stated that overall the interactions with the students via the HoloLens were relatively smooth, and that the MR capabilities and holographic artefacts were useful. However, they noted that the tutorials delivered via the HMD were subjectively more fatiguing and required greater concentration on the tutor’s part. The need to provide pre-tutorial instruction on the use of the device was noted as an added workload.

The tutor stated that the field of vision of the HoloLens 2’s integrated camera was a significant issue. The students’ hands, and therefore the area of interest during the tutorial, frequently fell below the field of vision projected to the tutor, who had to give relatively frequent prompts to students to lower their heads. This resulted in breaking the student’s and tutor’s focus and attention, as well as potentially forcing the student to adopt a posture which may have been uncomfortable or unergonomic.

The device also frequently misinterpreted students’ hand movements as “gestures”. The most common examples were mistakenly opening the Home menu when students were donning or doffing gloves, or projecting “Air Tap” pointers when students extended a finger during cannulation. These issues served as significant distractions to both student and tutor during the assessments and the feedback and practice sessions.

Discussion

Our results indicate that it is feasible to employ Mixed Reality (using the Microsoft HoloLens 2 device) to deliver tutorials remotely on basic technical skills such as intravenous cannulation. Although such tutorials appear to be relatively well received by students and tutors, several usability issues were identified.

Our study has demonstrated that the delivery of one-to-one tutorials on PIVC insertion using the Microsoft HoloLens 2 device is feasible once the technological resources described in our study are in place. Tutorials delivered remotely via the HoloLens resulted in a statistically significant learning effect, similar to that of in-person tutorials. Students’ visuospatial skills had no significant relationship with SUS scores or with improvements in PIVC competency assessment scores. Quantitative feedback from students was mostly positive, with some reservations expressed regarding the quality of the audio connection. Qualitative feedback contained a large number of positive responses, particularly regarding the MR technology, but also revealed some significant limitations, including audio quality, headaches and eye strain. In particular, the Think Aloud session illustrated usability issues when the device was used by students unfamiliar with HMDs.

Many studies in the literature evaluating the use of MR in procedural training have focused on the use of novel holographic models, often without direct input or supervision from tutors. Rochlen et al. evaluated the Microsoft HoloLens 2 device in a PIVC tutorial, finding the MR-enhanced tutorial well received by participants and conducive to successful learning [57]. However, that study did not compare learning efficacy with standard tutorials, nor did the participants wear the HMD, which precluded an assessment of the device’s usability by students.

Studies of the usability of the HoloLens 2 device in other contexts have found mixed results, with SUS scores ranging from 51.5 to 75 in studies by Escalada-Hernandez, Minty, and Johnston which evaluated the device in various clinical education settings [58–60]. The interpretation of SUS scores has been debated in the literature. Lewis and Sauro created a grading system based on a large dataset of hundreds of SUS-based usability studies [61–63]. In their framework, a mean score of 79.4, which we found in this study, correlates with an A- grade, equivalent to the 80th–85th percentile of the dataset [Additional File 7] [61]. However, the usability and technological issues which were highlighted in our quantitative feedback, qualitative feedback and Think Aloud session were significant. This emphasises the value of a robust mixed methods approach when evaluating new forms of technology in contexts as nuanced as medical education.
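For readers unfamiliar with how raw questionnaire responses map onto the 0–100 SUS scale discussed above, the standard scoring procedure described by Brooke [56] can be sketched as follows. This is a minimal illustration in Python; the function name and the example response pattern are ours, not taken from the study data.

```python
def sus_score(responses):
    """Compute a System Usability Scale score from ten 1-5 Likert responses.

    Odd-numbered items (1, 3, 5, 7, 9) are positively worded and contribute
    (response - 1); even-numbered items (2, 4, 6, 8, 10) are negatively worded
    and contribute (5 - response). The summed contributions (range 0-40) are
    multiplied by 2.5 to yield a score on a 0-100 scale.
    """
    if len(responses) != 10 or not all(1 <= r <= 5 for r in responses):
        raise ValueError("SUS requires ten responses on a 1-5 scale")
    contributions = [
        (r - 1) if i % 2 == 0 else (5 - r)  # even index = odd-numbered item
        for i, r in enumerate(responses)
    ]
    return sum(contributions) * 2.5

# Example: a generally favourable (hypothetical) response pattern
print(sus_score([4, 2, 4, 2, 5, 1, 4, 2, 4, 2]))  # -> 80.0
```

Group-level results, such as the mean of 79.4 reported here, are then obtained by averaging individual scores; letter grades and percentiles are assigned from Lewis and Sauro’s curved grading scale [61] rather than computed from a formula.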

The development of headaches and/or eye strain by several students is an important finding. The phenomenon of “cybersickness” has previously been associated with the use of HMDs and can include a range of symptoms, including headaches, dizziness and nausea [64, 65]. Visual fatigue has also been associated with prolonged use of HMDs and is one of the most frequently reported adverse symptoms associated with use of the HoloLens 2 device [66–68]. The prevalence of these symptoms may be a significant limiting factor in the widespread adoption of HMDs, and overcoming these issues should be a priority for hardware and software developers.

It was noted by both tutor and students that the field of view of the HoloLens 2 integrated camera was inadequate. This was particularly evident when the student’s hands (the focus of interest for many procedural tasks) fell below the field of view broadcast to the tutor, resulting in the tutor frequently requesting that the student lower their head. This firstly disrupted the concentration of both tutor and student, and secondly necessitated that the student adopt an unnatural posture. Changes in head and neck posture when wearing an HMD, which may adversely load the musculoskeletal system, have been described in the literature [69, 70]. These changes have primarily been attributed to the weight of the devices and the resultant change in the wearer’s centre of gravity. The need to further alter posture to accommodate the camera’s limited field of view may compound the problem; manufacturers should therefore strive to accommodate a natural head posture when designing future generations of HMDs.

It has been shown that participants’ visuospatial abilities can influence their performance when using MR devices, but this was not the case in our tutorial format [44]. This may be attributable to the relative simplicity of the task being performed, which did not require a large amount of manipulation of the holographic elements by the students. This may be a useful finding for educators designing sessions which include mixed reality elements.

The frequency with which the device incorrectly interpreted students’ movements as “gestures” is noteworthy. It is particularly relevant to scenarios in which the wearer is performing a practical procedure, as in this study. It would have even greater significance if the wearer were performing a procedure on a live patient, where the consequences of distraction or visual obstruction could lead to patient harm. Possible solutions include altering specific gesture settings in the device options; however, this would significantly affect the device interface and functionality.

Future direction of research in mixed reality in procedural training

A particularly interesting topic for future work will be the integration of these devices into Learning Analytics programmes. The volume of data generated by HMDs, such as audio and video recordings, interaction data, and eye-tracking data, may create large databases of information. We suggest that the acquisition of specific metrics related to the performance of a technical skill (such as those reported by Winkler-Schwartz et al.) could be combined with other data, such as demographics and prior academic performance, to train machine learning models which could inform deliberate practice as well as predict future performance [71–73].

We believe that future research should focus on carrying out more thorough evaluations of the learning efficacy of tutorials delivered using MR, specifically with larger participant numbers, and across a variety of clinically relevant procedures. We also recommend further research into evaluating, predicting, and mitigating the effects of “cybersickness” symptoms.

Limitations

This study has a number of limitations. The number of participants was relatively small. Given the novel format of our intervention, we were unable to perform a valid power calculation during study design; the participants therefore comprised a convenience sample, which limits the generalisability of our findings. All student participants were volunteers and may thus represent an inherent selection bias: academically motivated individuals, or those with a particular interest in the technology, may have been more likely to participate. This may also affect the generalisability of our findings. The tutor feedback must be interpreted in the context that he was also an investigator.

Research involving HMDs such as the HoloLens 2 has a number of inherent limitations; in particular, there are no specific benchmarks or validated evaluation tools with which to test their efficacy in remote procedural training [74–76].

Conclusions

The results of this study demonstrate the feasibility and efficacy of using the HoloLens 2 device to deliver tutorials on intravenous cannulation from a remote location. The device was found to have above-average usability in this context, and the technology was found to be agreeable to students. The provision of effective procedural tutorials from a remote location has a number of potential benefits: it may improve access to training, reduce infection control concerns, and reduce the environmental impact associated with tutor and student travel. However, a significant number of issues were identified, including the development of headaches and visual fatigue by participants, limits to participants’ visual fields, and incorrect interpretation of hand movements as “gestures” by the device. The fixed and licensing costs associated with the device are also significant. The authors believe that MR technology has significant potential in medical education, and that technological advancements are likely necessary to address the usability and technical issues raised in this study. Further research is required to strengthen its evidence base prior to its widespread adoption.

Supplementary Information

12909_2026_8648_MOESM1_ESM.pdf (287.4KB, pdf)

Supplementary Material 1: Mental Rotations Test - A.

12909_2026_8648_MOESM2_ESM.pdf (54.1KB, pdf)

Supplementary Material 2: Student HoloLens Training Procedure Form.

12909_2026_8648_MOESM3_ESM.pdf (129.2KB, pdf)

Supplementary Material 3: Intravenous Cannulation Metrics Score Sheet.

12909_2026_8648_MOESM4_ESM.pdf (56.5KB, pdf)

Supplementary Material 4: Student Tutorial Feedback Form.

12909_2026_8648_MOESM5_ESM.pdf (160.2KB, pdf)

Supplementary Material 5: Student Interview Structure.

12909_2026_8648_MOESM6_ESM.pdf (66.6KB, pdf)

Supplementary Material 6: Successful completion of individual metrics.

12909_2026_8648_MOESM7_ESM.pdf (36.2KB, pdf)

Supplementary Material 7: Lewis and Sauro Curved Grading Scale for the System Usability Scale Score.

Acknowledgements

The authors would like to acknowledge the assistance from members of the UCC College of Medicine and Health, including Dr. Colm O’Tuathaigh, Dr. Pat Henn and Professor Paula O’Leary, as well as Ms Michelle Donovan in the UCC Centre for Digital Education.

Abbreviations

AR

Augmented Reality

ETELM-LP

Evaluation of Technology-Enhanced Learning Materials: Learner Perceptions

HMD

Head-Mounted Display

HL

HoloLens

IP

In-Person

MR

Mixed Reality

MRT-A

Mental Rotations Test – A

SBME

Simulation Based Medical Education

SD

Standard Deviation

SUS

System Usability Scale

Authors’ contributions

MC led the design of the study, carried out the tutorials, analysed both quantitative and qualitative data, and was the primary author of the manuscript. GI guided the design of the study and contributed to writing the manuscript. NOB contributed to the technical and logistical design of the study, acted as technical facilitator for the tutorials, and contributed to manuscript composition. JV designed, completed, and analysed the semi-structured student interviews and contributed to manuscript composition. AOM and PS analysed student assessment data. GS played a central role in study design and completion and contributed to manuscript composition.

Funding

This study received funding and research support through the UCC Learning Analytics LITE programme, which is funded through the Strategic Alignment of Teaching and Learning Enhancement fund. The UCC Learning Analytics LITE programme provided logistical and research support in study design and funds were used to hire assistance in data interpretation.

This study also received funding from the UCC College of Medicine and Health which was utilised to purchase the HoloLens 2 Device and associated licences.

Data availability

The datasets used and/or analysed during the current study are available from the corresponding author on reasonable request.

Declarations

Ethics approval and consent to participate

This study was approved by the Social Research Ethics Committee of the Cork Teaching Hospitals and the University College Cork Research and Postgraduate Affairs Committee [Log 2021-197]. All methods were carried out in accordance with the guidelines and regulations set out by the ethics and research committees. All participants provided informed consent to participate in the study.

Consent for publication

All participants, including students, tutor and technical facilitator, provided written informed consent prior to inclusion in the study. Participants whose identifiable images are included provided informed consent for publication of identifiable information/images in an open access journal.

Competing interests

The authors declare no competing interests.

Footnotes

Publisher’s note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

References

  • 1.Cooke M, Irby DM, Sullivan W, Ludmerer KM. American medical education 100 years after the Flexner report. N Engl J Med. 2006;355(13):1339–44. [DOI] [PubMed] [Google Scholar]
  • 2.Epstein RM, Hundert EM. Defining and assessing professional competence. JAMA. 2002;287(2):226–35. [DOI] [PubMed] [Google Scholar]
  • 3.Kroenke K, Omori DM, Landry FJ, Lucey CR. Bedside teaching. South Med J. 1997;90(11):1069–74. [DOI] [PubMed] [Google Scholar]
  • 4.Hemmer PA, Ibrahim T, Durning SJ. The impact of increasing medical school class size on clinical clerkships: a National survey of internal medicine clerkship directors. Acad Med. 2008;83(5):432–7. [DOI] [PubMed] [Google Scholar]
  • 5.Walsh K. E-learning in medical education: the potential environmental impact. Educ Prim Care. 2018;29(2):104–6. [DOI] [PubMed] [Google Scholar]
  • 6.Sherpa JR, Donahue L, Tsai J, Nguemeni Tiako MJ. The planetary benefit of suspending USMLE step 2 CS: estimating carbon emissions associated with US medical students’ travel to testing centers. Yale J Biol Med. 2023;96(2):185–8. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 7.Sharma D, Rizzo J, Nong Y, Murase LC, Fong S, Lo K, et al. Virtual learning decreases the carbon footprint of medical education. Dermatol Ther (Heidelb). 2024. [DOI] [PMC free article] [PubMed]
  • 8.The expansion of medical student numbers in the United Kingdom [press release]. London; 2021.
  • 9.Oliver R, Herrington J. Exploring Technology-Mediated learning from a pedagogical perspective. ECU Publications. 2003;11.
  • 10.Issenberg SB, McGaghie WC, Petrusa ER, Lee Gordon D, Scalese RJ. Features and uses of high-fidelity medical simulations that lead to effective learning: a BEME systematic review. Med Teach. 2005;27(1):10–28. [DOI] [PubMed] [Google Scholar]
  • 11.McGaghie WC, Issenberg SB, Petrusa ER, Scalese RJ. A critical review of simulation-based medical education research: 2003–2009. Med Educ. 2010;44(1):50–63. [DOI] [PubMed] [Google Scholar]
  • 12.Ziv A, Ben-David S, Ziv M. Simulation based medical education: an opportunity to learn from errors. Med Teach. 2005;27(3):193–9. [DOI] [PubMed] [Google Scholar]
  • 13.Lund F, Schultz JH, Maatouk I, Krautter M, Möltner A, Werner A, et al. Effectiveness of IV cannulation skills laboratory training and its transfer into clinical practice: a randomized, controlled trial. PLoS ONE. 2012;7(3):e32831. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 14.Azuma RT. A survey of augmented reality. Presence: Teleoperators Virtual Environ. 1997;6(4):355–85. [Google Scholar]
  • 15.Hu HZ, Feng XB, Shao ZW, Xie M, Xu S, Wu XH, et al. Application and prospect of mixed reality technology in medical field. Curr Med Sci. 2019;39(1):1–6. [DOI] [PubMed] [Google Scholar]
  • 16.Garzón J, Pavón J, Baldiris S. Systematic review and meta-analysis of augmented reality in educational settings. Virtual Reality. 2019;23(4):447–59. [Google Scholar]
  • 17.Hantono BS, Nugroho LE, Santosa PI, editors. Meta-review of augmented reality in education. 2018 10th International Conference on Information Technology and Electrical Engineering (ICITEE); 2018 Jul 24–26.
  • 18.Eckert M, Volmerg JS, Friedrich CM. Augmented reality in medicine: systematic and bibliographic review. JMIR Mhealth Uhealth. 2019;7(4):e10967. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 19.Wang L, Zhao Z, Wang G, Zhou J, Zhu H, Guo H, et al. Application of a three-dimensional visualization model in intraoperative guidance of percutaneous nephrolithotomy. Int J Urol. 2022;29(8):838–44. [DOI] [PubMed] [Google Scholar]
  • 20.Liu X, Sun J, Zheng M, Cui X. Application of mixed reality using optical See-Through Head-Mounted displays in transforaminal percutaneous endoscopic lumbar discectomy. Biomed Res Int. 2021;2021:9717184. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 21.Kitagawa M, Sugimoto M, Haruta H, Umezawa A, Kurokawa Y. Intraoperative holography navigation using a mixed-reality wearable computer during laparoscopic cholecystectomy. Surgery. 2022;171(4):1006–13. [DOI] [PubMed] [Google Scholar]
  • 22.Mill T, Parikh S, Allen A, Dart G, Lee D, Richardson C, et al. Live streaming ward rounds using wearable technology to teach medical students: a pilot study. BMJ Simul Technol Enhanc Learn. 2021;7(6):494–500. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 23.Silvero Isidre A, Friederichs H, Müther M, Gallus M, Stummer W, Holling M. Mixed reality as a teaching tool for medical students in neurosurgery. Med. 2023; 59(10). [DOI] [PMC free article] [PubMed]
  • 24.Kolecki R, Pręgowska A, Dąbrowa J, Skuciński J, Pulanecki T, Walecki P, et al. Assessment of the utility of mixed reality in medical education. Translational Res Anat. 2022;28:100214. [Google Scholar]
  • 25.Bala L, Kinross J, Martin G, Koizia LJ, Kooner AS, Shimshon GJ, et al. A remote access mixed reality teaching ward round. Clin Teach. 2021;18(4):386–90. [DOI] [PubMed] [Google Scholar]
  • 26.George O, Foster J, Xia Z, Jacobs C. Augmented reality in medical education: A mixed methods feasibility study. Cureus. 2023;15(3):e36927. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 27.Connolly M, Iohom G, O’Brien N, Volz J, O’Muircheartaigh A, Serchan P, et al. Delivering clinical tutorials to medical students using the Microsoft hololens 2: A mixed-methods evaluation. BMC Med Educ. 2024;24(1):498. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 28.Levy JB, Kong E, Johnson N, Khetarpal A, Tomlinson J, Martin GF, et al. The mixed reality medical ward round with the MS hololens 2: innovation in reducing COVID-19 transmission and PPE usage. Future Healthc J. 2021;8(1):e127–30. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 29.Sivananthan A, Gueroult A, Zijlstra G, Martin G, Baheerathan A, Pratt P, et al. Using mixed reality headsets to deliver remote bedside teaching during the COVID-19 pandemic: feasibility trial of hololens 2. JMIR Form Res. 2022;6(5):e35674. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 30.Mentis HM, Avellino I, Seo J, editors. Ar hmd for remote instruction in healthcare. 2022 IEEE Conference on Virtual Reality and 3D User Interfaces Abstracts and Workshops (VRW); 2022: IEEE.
  • 31.Schoeb DS, Schwarz J, Hein S, Schlager D, Pohlmann PF, Frankenschmidt A, et al. Mixed reality for teaching catheter placement to medical students: a randomized single-blinded, prospective trial. BMC Med Educ. 2020;20(1):510. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 32.Prionas ND, Kung TH, Dohn A, Piro N, von Eyben R, Katznelson L, et al. A pilot, randomized controlled trial of telementorship: A useful tool during social distancing. J Clin Transl Res. 2021;7(1):66–71. [PMC free article] [PubMed] [Google Scholar]
  • 33.Bui DT, Barnett T, Hoang H, Chinthammit W. Usability of augmented reality technology in situational telementorship for managing clinical scenarios: Quasi-Experimental study. JMIR Med Educ. 2023;9:e47228. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 34.Casas-Yrurzum S, Gimeno J, Casanova-Salas P, García-Pereira I, García Del Olmo E, Salvador A, et al. A new mixed reality tool for training in minimally invasive robotic-assisted surgery. Health Inf Sci Syst. 2023;11(1):34. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 35.Feifer A, Delisle J, Anidjar M. Hybrid augmented reality simulator: preliminary construct validation of laparoscopic smoothness in a urology residency program. J Urol. 2008;180(4):1455–9. [DOI] [PubMed] [Google Scholar]
  • 36.Neary PC, Boyle E, Delaney CP, Senagore AJ, Keane FBV, Gallagher AG. Construct validation of a novel hybrid virtual-reality simulator for training and assessing laparoscopic colectomy; results from the first course for experienced senior laparoscopic surgeons. Surg Endosc. 2008;22(10):2301–9. [DOI] [PubMed] [Google Scholar]
  • 37.Engum SA, Jeffries P, Fisher L. Intravenous catheter training system: Computer-based education versus traditional learning methods: [1]. Am J Surg. 2003;186(1):67. [DOI] [PubMed] [Google Scholar]
  • 38.Loukas C, Nikiteas N, Kanakis M, Moutsatsos A, Leandros E, Georgiou E. A virtual reality simulation curriculum for intravenous cannulation training. Acad Emerg Med. 2010;17(10):1142–5. [DOI] [PubMed] [Google Scholar]
  • 39.Dennick R. Constructivism: reflections on Twenty five years teaching the constructivist approach in medical education. Int J Med Educ. 2016;7:200–5. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 40.Quarles J, Lampotang S, Fischler I, Fishwick P, Lok B. Scaffolded learning with mixed reality. Computers Graphics. 2009;33(1):34–46. [Google Scholar]
  • 41.Ahmed OMA, Niessen T, O’Donnell BD, Gallagher AG, Breslin DS, DunnGalvin A, et al. The effect of metrics-based feedback on acquisition of sonographic skills relevant to performance of ultrasound-guided axillary brachial plexus block. Anaesthesia. 2017;72(9):1117–24. [DOI] [PubMed] [Google Scholar]
  • 42.McClusky DA 3rd, Ritter EM, Lederman AB, Gallagher AG, Smith CD. Correlation between perceptual, visuo-spatial, and psychomotor aptitude to duration of training required to reach performance goals on the MIST-VR surgical simulator. Am Surg. 2005;71(1):13–20. discussion – 1. [PubMed] [Google Scholar]
  • 43.Keehner MM, Tendick F, Meng MV, Anwar HP, Hegarty M, Stoller ML, et al. Spatial ability, experience, and skill in laparoscopic surgery. Am J Surg. 2004;188(1):71–5. [DOI] [PubMed] [Google Scholar]
  • 44.Brucker B, Pardi G, Uehlin F, Moosmann L, Lachmair M, Halfmann M, et al. How learners’ visuospatial ability and different ways of changing the perspective influence learning about movements in desktop and immersive virtual reality environments. Educational Psychol Rev. 2024;36(3):65. [Google Scholar]
  • 45.Steffe LP, Gale J, editors. Constructivism in education. 1st ed. New York: Routledge; 1995. [Google Scholar]
  • 46.Peters M, Laeng B, Latham K, Jackson M, Zaiyouna R, Richardson C. A redrawn Vandenberg and Kuse mental rotations test: different versions and factors that affect performance. Brain Cogn. 1995;28(1):39–58. [DOI] [PubMed] [Google Scholar]
  • 47.Microsoft Corporation. HoloLens Tips. 2023. Available from: https://www.microsoft.com/en-us/p/hololens-tips/9pd4cxkklc47?activetab=pivot:overviewtab
  • 48.Schuster C, Stahl B, Murray C, Keleekai NL, Glover K. Development and testing of a short peripheral intravenous catheter insertion skills checklist. J Association Vascular Access. 2016;21(4):196–204. [Google Scholar]
  • 49.Microsoft. Infrastructure Guidelines for HoloLens: Microsoft Learn; 2022 [updated 07/07/2022; cited 2022 07/10/2022]. Available from: https://learn.microsoft.com/en-us/hololens/hololens-commercial-infrastructure
  • 50.Cook DA, Ellaway RH. Evaluating technology-enhanced learning: A comprehensive framework. Med Teach. 2015;37(10):961–70. [DOI] [PubMed] [Google Scholar]
  • 51.Thompson Burdine J, Thorne S, Sandhu G. Interpretive description: A flexible qualitative methodology for medical education research. Med Educ. 2021;55(3):336–43. [DOI] [PubMed] [Google Scholar]
  • 52.Baxter K, Courage C, Caine K. Chapter 7 - During Your User Research Activity. In: Baxter K, Courage C, Caine K, editors. Understanding your Users (Second Edition). Boston: Morgan Kaufmann; 2015. pp. 158 – 89.
  • 53.SocioCultural Research Consultants. Dedoose: cloud application for managing, analyzing, and presenting qualitative and mixed method research data. Los Angeles, CA: SocioCultural Research Consultants, LLC; 2021. [Google Scholar]
  • 54.Clarke V, Braun V. Teaching thematic analysis: overcoming challenges and developing strategies for effective learning. Psychol. 2013;26:120–3. [Google Scholar]
  • 55.Lingard L. Beyond the default colon: effective use of quotes in qualitative research. Perspect Med Educ. 2019;8(6):360–4. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 56.Brooke J. SUS – a quick and dirty usability scale. In: Usability evaluation in industry. London: Taylor & Francis; 1996. pp. 189–94.
  • 57.Rochlen LR, Putnam E, Levine R, Tait AR. Mixed reality simulation for peripheral intravenous catheter placement training. BMC Med Educ. 2022;22(1):876. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 58.Escalada-Hernandez P, Soto-Ruiz N, Ballesteros-Egüés T, Larrayoz-Jiménez A, Martín-Rodríguez LS. Usability and user expectations of a HoloLens-based augmented reality application for learning clinical technical skills. Virtual Reality. 2024;28(2):102. [Google Scholar]
  • 59.Minty I, Lawson J, Guha P, Luo X, Malik R, Cerneviciute R, et al. The use of mixed reality technology for the objective assessment of clinical skills: a validation study. BMC Med Educ. 2022;22(1):639. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 60.Johnston M, O’Mahony M, O’Brien N, Connolly M, Iohom G, Kamal M, et al. The feasibility and usability of mixed reality teaching in a hospital setting based on self-reported perceptions of medical students. BMC Med Educ. 2024;24(1):701. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 61.Lewis J, Sauro J. Item benchmarks for the System Usability Scale. J Usability Stud. 2018;13(3):158–67.
  • 62.Bangor A, Kortum PT, Miller JT. An empirical evaluation of the system usability scale. Int J Human–Computer Interact. 2008;24(6):574–94. [Google Scholar]
  • 63.Albert W, Tullis T. Measuring the user experience. 2008.
  • 64.Rebenitsch L, Owen C. Review on cybersickness in applications and visual displays. Virtual Reality. 2016;20(2):101–25. [Google Scholar]
  • 65.Hughes C, Fidopiastis CM, Stanney KM, Bailey PS, Ruiz E, editors. The psychometrics of cybersickness in augmented reality. Frontiers in Virtual Reality; 2020.
  • 66.Iskander J, Hossny M, Nahavandi S. Using biomechanics to investigate the effect of VR on eye vergence system. Appl Ergon. 2019;81:102883. [DOI] [PubMed] [Google Scholar]
  • 67.Fan L, Wang J, Li Q, Song Z, Dong J, Bao F, et al. Eye movement characteristics and visual fatigue assessment of virtual reality games with different interaction modes. Front Neurosci. 2023;17:1173127. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 68.Vovk A, Wild F, Guest W, Kuula T, editors. Simulator sickness in augmented reality training using the Microsoft HoloLens. Proceedings of the 2018 CHI conference on human factors in computing systems; 2018.
  • 69.Knight JF, Baber C. Effect of head-mounted displays on posture. Hum Factors. 2007;49(5):797–807. [DOI] [PubMed] [Google Scholar]
  • 70.Schlussel AT, Maykel JA. Ergonomics and musculoskeletal health of the surgeon. Clin Colon Rectal Surg. 2019;32(6):424–34. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 71.Winkler-Schwartz A, Yilmaz R, Mirchi N, Bissonnette V, Ledwos N, Siyar S, et al. Machine learning identification of surgical and operative factors associated with surgical expertise in virtual reality simulation. JAMA Netw Open. 2019;2(8):e198363. [DOI] [PubMed] [Google Scholar]
  • 72.Palumbo A. Microsoft HoloLens 2 in medical and healthcare context: state of the art and future prospects. Sensors (Basel). 2022;22(20). [DOI] [PMC free article] [PubMed]
  • 73.Shorten G. Artificial intelligence and training physicians to perform technical procedures. JAMA Netw Open. 2019;2(8):e198375. [DOI] [PubMed] [Google Scholar]
  • 74.Herron J. Augmented reality in medical education and training. J Electron Resour Med Libr. 2016;13:1–5. [Google Scholar]
  • 75.Gerup J, Soerensen CB, Dieckmann P. Augmented reality and mixed reality for healthcare education beyond surgery: an integrative review. Int J Med Educ. 2020;11:1–18. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 76.Barsom EZ, Graafland M, Schijven MP. Systematic review on the effectiveness of augmented reality applications in medical training. Surg Endosc. 2016;30(10):4174–83. [DOI] [PMC free article] [PubMed] [Google Scholar]


