Abstract
Background:
Augmented reality (AR) has great potential for improving image-guided neurosurgical procedures, but until recently, hardware was mostly custom-made and difficult to distribute. Low-cost AR devices are now commercially available, but reports on their technical feasibility in neurosurgery are lacking. The goal of this pilot study is to evaluate the feasibility of using a low-cost, commercially available, head-mounted holographic AR device (the Microsoft Hololens) in the operating room. The Hololens is operated by performing specific hand gestures, which are recognized by the built-in camera of the device. This would allow the neurosurgeon to control the device “touch free,” even while wearing a sterile surgical outfit.
Methods:
The Hololens was tested in an operating room under two lighting conditions (general background theatre lighting only; and general background theatre lighting plus operating lights) and while wearing different sterile surgical gloves (both bright and dark). All required hand gestures were performed, and voice recognition was evaluated against background noise consisting of two nurses talking at conversational speech level.
Results:
Wearing comfort was sufficient, with and without regular glasses. All gestures were correctly recognized regardless of lighting conditions or the type of sterile gloves worn. Voice recognition was good. Hologram visibility was good when the device's display brightness was set to high.
Conclusions:
We demonstrate that using a commercially available low-cost head-mounted holographic AR device is feasible in a sterile surgical setting, under different lighting conditions and with different surgical gloves. Combined with freely available software for application development, this opens new opportunities for image-guided neurosurgery.
Keywords: Augmented reality, mobile health, neuronavigation, simulation, virtual reality
INTRODUCTION
Neurosurgical procedures are often complex and require high accuracy. Neuronavigation is available for both cranial and spinal procedures to assist in surgical preparation and execution, but standard approaches require the neurosurgeon to turn the head away from the patient to the navigation screen and back. Moreover, navigation images displayed on a two-dimensional screen can only offer two-dimensional navigation (in three anatomical planes), despite the availability of a three-dimensional reconstruction of the region of interest in the navigation software.
Augmented reality (AR), also referred to as “mixed reality,” allows a virtual environment to be merged into the real physical environment.[3,7,8,9] Integration of surgical information into the operating microscope is a well-known example, but it is obviously limited to procedures that use a microscope. This restricts its value for surgical planning, where AR could offer added value compared with standard neuronavigation. A head-mounted display (HMD) would offer more flexibility and solve this problem.
Only a few studies on using HMDs for AR-based neuronavigation have been published, and all were performed using in-house developed solutions, which limits wider adoption of their particular approaches.[9,16] We are the first to report on the feasibility of a low-cost, commercially available HMD for AR in the operating room.
MATERIALS AND METHODS
Hardware
The Microsoft Hololens (Microsoft Corporation, Redmond, WA) is an HMD with see-through holographic lenses, automatic pupillary distance calibration, spatial sound, gaze tracking, gesture input, and voice support. It runs Windows 10 on 2 GB RAM and 64 GB internal storage. Wireless connectivity is available as Wi-Fi 802.11ac and Bluetooth 4.1 LE. Its weight is 579 g (1.2 lbs), which is mainly front-loaded. Its integrated battery offers 2–3 h of active use and approximately 2 weeks of standby time.[14] Default gesture input consists of “air tapping” (pinching of thumb and index finger) to select holographically displayed objects, and “blooming” (opening the hand), which opens or closes the main menu. Although a “clicker” (a sort of computer mouse) is available, the device can be operated completely “touch free” using gestures. The device is still in the development phase, but it is available for purchase for early testing and software development for 3000 US dollars.
Software
To create the holographic images, we used DICOM files obtained from magnetic resonance imaging (MRI) and computed tomography (CT) head scans and the open-source software 3D Slicer (http://slicer.org) to select the specific area of a scan we wanted to display as a hologram in the Hololens.[2] Once selected and edited, this digital 3D model was exported from 3D Slicer in a 3D modeling format such as .stl or .obj. These files can then be converted into the .fbx format using freely available software such as the Autodesk FBX converter 2013 (https://www.autodesk.com/developer-network/platform-technologies/fbx-converter-archives). Finally, the resulting .fbx file, created from MRI or CT scans, can be imported onto the Hololens via Microsoft Onedrive and displayed using Microsoft's 3D Viewer Beta application.
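As an illustration of the first step of this pipeline, the following Python sketch uses VTK (the visualization toolkit underlying 3D Slicer) to turn a DICOM series into a surface model that can subsequently be converted to .fbx. This is a minimal sketch only: the directory path, output file name, and intensity threshold are illustrative assumptions, and in this study the equivalent steps were performed interactively in the 3D Slicer user interface.

```python
# Minimal sketch: DICOM series -> surface model, using VTK.
# Path, file name, and threshold are illustrative assumptions.
import vtk

# Read a DICOM series (e.g., a CT head scan) into a 3D image volume
reader = vtk.vtkDICOMImageReader()
reader.SetDirectoryName("/path/to/dicom_series")  # hypothetical path
reader.Update()

# Extract an iso-surface; ~300 HU roughly corresponds to bone on CT
surface = vtk.vtkMarchingCubes()
surface.SetInputConnection(reader.GetOutputPort())
surface.SetValue(0, 300)

# Write the surface as .stl (or .obj); conversion to .fbx for the
# Hololens is done afterwards with an external tool such as the
# Autodesk FBX converter.
writer = vtk.vtkSTLWriter()
writer.SetInputConnection(surface.GetOutputPort())
writer.SetFileName("head_model.stl")
writer.Write()
```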
Evaluation
To evaluate usability in the operating room, the first author was trained in a 15-min session on using the Hololens and performing the gestures. He was then fully dressed for surgery and wore the Hololens in an operating room. He performed a standard set of tasks under two lighting conditions (general background theatre lighting only; and general background theatre lighting plus operating lights), and while wearing standard latex (white) and dark latex (brown) sterile gloves. The Hololens was powered on before dressing. The standard task set consisted of opening the main menu [“blooming” gesture, Figure 1], selecting three particular applications from the menu [“air tapping” gesture, Figure 2], and placing holographic screens at specific sites in the operating room (gaze tracking and air tapping). Voice recognition was evaluated against background noise consisting of two nurses talking at conversational speech level.
RESULTS
The Hololens fitted comfortably over a complete surgical outfit, and the see-through lenses offered a complete and accurate view of the surrounding physical environment. The device did not move during task performance. All tasks in the standard set were completed at the first attempt, in both lighting conditions (directly below the surgical lamp and using room lighting only). We did not separately quantify lighting brightness inside the operating room. There was no difference in performance between bright (white) and dark (brown) gloves. Audio output and voice recognition worked without problems.
Additionally, the second author evaluated the Hololens' fitting comfort while wearing regular glasses. This required a small adjustment of the head mounting, after which the combination of glasses and the Hololens, including gesture recognition, worked without problems.
DISCUSSION
Virtual reality (VR) and AR are both computer-aided techniques to display a virtual environment in an immersive way. Whereas VR shows only the virtual environment (either on a screen or using an HMD that completely replaces the physical environment), AR, or “mixed reality,” merges the virtual and physical environments. VR can be an excellent tool for surgical planning or surgical simulation,[4,5,15,16,17] but the lack of integration with the physical world makes it less suited for surgical neuronavigation. AR offers exactly this advantage. Meola et al. described a practical 10-point multiparametric assessment for AR systems in neurosurgery.[9] Applying this scale, the Hololens is suited for all mentioned fields of use (open neurosurgery, endoscopy, and endovascular procedures). Regarding AR system features, the real data source is the integrated video camera using optical tracking, the display type is the HMD, and the perception location is the patient. The registration technique can be any of the techniques mentioned (fiducial markers, skin surface registration, or manual registration). Regarding AR scene parameters, all virtual image sources apply, and visualization occurs by holographic images or overlays.
Previous work clearly suggests added value for AR-guided neurosurgery.[3,7,8,9,11,13] These studies relied on custom-developed AR devices, which limits widespread adoption. In contrast, the Hololens is a low-cost HMD that can be used standalone (i.e., no computer connection required). New applications for the device can be developed using Microsoft Visual Studio (Microsoft Corporation, Redmond, WA, USA) or Unity 3D (Unity Technologies, San Francisco, CA, USA). Both are widely used among software developers and available as free downloads for personal experimentation. The combination of a low-cost, commercially available device with excellent support for software development creates exciting opportunities for AR-guided neurosurgery. To the best of our knowledge, no holographic AR applications for neurosurgery exist yet. According to a press release, two residents from Duke Hospital have started a collaboration with the Duke immersive Virtual Environment lab to create a Hololens application for external ventricular shunt placement.[6]
To date, no reports have been published that examine the technical feasibility of using such an AR device in a surgical setting. The ability to control the device with hand gestures that do not require any physical contact would be ideal for a sterile surgical setting, in which any physical contact (even when performed with a stylus or other device) would pose an infection risk to the patient. However, this requires two conditions. First, the device should be sufficiently stable and comfortable when worn with surgical clothing (hat, mouth mask), even when the surgeon is wearing glasses. Second, gestures need to be recognized when wearing sterile surgical gloves, even under the operating lamp. The latter may reduce recognition of the surgeon's hand through diminished contrast or light reflections, and due to the nature of the device, a team member cannot assist in controlling the AR application. In our evaluation, both conditions were met for both researchers. Although the weight of the device is mainly front-loaded and therefore feels a little heavy, proper fitting and fixation before hand washing and sterilization went smoothly. No readjustment was needed afterwards, as assessed over 30 min of repeated head and body movements in all directions. Wearing glasses in combination with the Hololens was also no problem. The physical environment was clearly visible through the lenses, and the tinted glass did not adversely affect visibility of the environment in either lighting condition. All hand gestures were reliably detected at the first attempt, regardless of lighting conditions (direct light or room light) and regardless of sterile glove color (white or brown). The audio from the device was clearly audible and voice commands worked properly. Communication with the scrub team was not affected by wearing the device. The holographic images were sufficiently visible, although under direct light increasing the brightness improved visibility. This can be done by pressing a button on the device, either by the surgeon before sterilizing the hands or by a nurse while the surgeon is wearing the device. We did not experiment with cleaning the device glasses, although this should not be a problem (a large window on the outside protects the inner lenses that provide the actual images).
Furthermore, creating 3D models from medical scans (in DICOM format) was an easy and affordable task. At the time of writing, imported 3D models can only be displayed if they are in Autodesk Filmbox (.fbx) format, which can be viewed using the Hololens 3D Viewer Beta app (available free of charge from the Microsoft application store).[10] 3D models in .fbx format can be downloaded from various websites, but they can also be created from conventional DICOM files obtained from MRI or CT scans. This can be done using the open-source software 3D Slicer, after which the resulting files can be converted into the .fbx format using freely available software such as the Autodesk FBX converter 2013.
Alternatively, one can use OsiriX on Apple devices to create 3D models (.obj or .stl).[12] Files in .obj format can be used for further editing in Unity 5.0 or Windows 3D Builder, and files in .stl format can be used for 3D printing.[1,18] A problem we encountered in this process was that using high-definition DICOM files to create 3D models sometimes resulted in models that could not be displayed on the Hololens due to an excessive number of vertices and/or meshes. To avoid this, we lowered the polygon count and texture resolution using 3D Slicer and Windows 3D Builder, as sketched below. Whereas this works fine for educational and training purposes, the possible impact on actual surgical applications remains to be evaluated. Newer and more powerful hardware is likely to address this potential shortcoming.
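The following sketch illustrates one programmatic way to lower the polygon count using VTK mesh decimation. The input file name and the target reduction of 75% are illustrative assumptions rather than values validated in this study; in practice, we used the interactive tools in 3D Slicer and Windows 3D Builder.

```python
# Minimal sketch: reduce the triangle count of a surface model before
# display on the Hololens. File names and reduction factor are
# illustrative assumptions.
import vtk

# Load the surface model exported in the previous step
reader = vtk.vtkSTLReader()
reader.SetFileName("head_model.stl")  # hypothetical input file

# Decimate the mesh to roughly 25% of its original triangle count
decimate = vtk.vtkDecimatePro()
decimate.SetInputConnection(reader.GetOutputPort())
decimate.SetTargetReduction(0.75)   # remove 75% of the triangles
decimate.PreserveTopologyOn()       # avoid holes and disconnected pieces

# Write the simplified model for subsequent conversion to .fbx
writer = vtk.vtkSTLWriter()
writer.SetInputConnection(decimate.GetOutputPort())
writer.SetFileName("head_model_decimated.stl")
writer.Write()
```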
A limitation is the relatively small field of view, which requires some head turning if multiple holographic screens or visualizations are used. For neurosurgical applications this should not be a problem, as the region of interest does not span the entire room. Also, the relatively short wearing test (30 min) may not be representative of longer procedures, but in those procedures we expect the device to be taken off at some point (e.g., when reaching the target or when introducing the operating microscope).
CONCLUSIONS
In our evaluation, the Hololens proved to be stable, comfortable, and reliable in a surgical setting while wearing a sterile outfit. In particular, gesture recognition worked flawlessly under different lighting conditions (direct light and room light) and with different colors of sterile gloves (bright and dark). The availability of a low-cost commercial AR device, combined with freely available software for application development, opens exciting opportunities for AR-guided neurosurgery. Our evaluation confirms that such development can take place with the knowledge that the resulting holographic software applications can be used in the operating room under sterile surgical conditions.
Financial support and sponsorship
Nil.
Conflicts of interest
There are no conflicts of interest.
Contributor Information
Pieter L. Kubben, Email: p.kubben@mumc.nl.
Remir S. N. Sinlae, Email: s.sinlae@student.maastrichtuniversity.nl.
REFERENCES
- 1. 3D Builder – Windows Apps on Microsoft Store. [Last accessed on 2017 Mar 03]. Available from: https://www.microsoft.com/en-us/store/p/3d-builder/9wzdncrfj3t6
- 2. 3D Slicer. [Last accessed on 2017 Mar 03]. Available from: https://www.slicer.org
- 3. Besharati Tabrizi L, Mahvash M. Augmented reality-guided neurosurgery: Accuracy and intraoperative application of an image projection technique. J Neurosurg. 2015;123:206–11. doi: 10.3171/2014.9.JNS141001.
- 4. Chan S, Conti F, Salisbury K, Blevins NH. Virtual reality simulation in neurosurgery: Technologies and evolution. Neurosurgery. 2013;72(Suppl 1):154–64. doi: 10.1227/NEU.0b013e3182750d26.
- 5. Cohen AR, Lohani S, Manjila S, Natsupakpong S, Brown N, Cavusoglu MC. Virtual reality simulation: Basic concepts and use in endoscopic neurosurgery training. Childs Nerv Syst. 2013;29:1235–44. doi: 10.1007/s00381-013-2139-z.
- 6. Duke Immersive Virtual Environment. The Hololens' potential impact on neurosurgery. 2016. [Last accessed on 2017 Feb 23]. Available from: http://virtualreality.duke.edu/the-hololens-potential-impact-on-neurosurgery/
- 7. Mahvash M, Besharati Tabrizi L. A novel augmented reality system of image projection for image-guided neurosurgery. Acta Neurochir. 2013;155:943–7. doi: 10.1007/s00701-013-1668-2.
- 8. Masutani Y, Dohi T, Yamane F, Iseki H, Takakura K. Augmented reality visualization system for intravascular neurosurgery. Comput Aided Surg. 1998;3:239–47. doi: 10.1002/(SICI)1097-0150(1998)3:5<239::AID-IGS3>3.0.CO;2-B.
- 9. Meola A, Cutolo F, Carbone M, Cagnazzo F, Ferrari M, Ferrari V. Augmented reality in neurosurgery: A systematic review. Neurosurg Rev. 2017;40:537–48. doi: 10.1007/s10143-016-0732-9.
- 10. Microsoft Inc. Using 3D Viewer Beta on HoloLens. [Last accessed on 2017 Mar 03]. Available from: https://support.microsoft.com/en-us/help/13766/hololens-using-3d-vieweron-hololens
- 11. Mitha AP, Almekhlafi MA, Janjua MJJ, Albuquerque FC, McDougall CG. Simulation and augmented reality in endovascular neurosurgery: Lessons from aviation. Neurosurgery. 2013;72(Suppl 1):107–14. doi: 10.1227/NEU.0b013e31827981fd.
- 12. OsiriX features. [Last accessed on 2017 Mar 03]. Available from: http://www.osirix-viewer.com/resources/technical-sheet/
- 13. Pelargos PE, Nagasawa DT, Lagman C, Tenn S, Demos JV, Lee SJ, et al. Utilizing virtual and augmented reality for educational and clinical enhancements in neurosurgery. J Clin Neurosci. 2017;35:1–4. doi: 10.1016/j.jocn.2016.09.002.
- 14. Rubino D. These are the full hardware specifications of the Microsoft HoloLens. 2016. [Last accessed on 2017 Feb 22]. Available from: http://www.windowscentral.com/hololens-hardware-specs
- 15. Shenai MB, Dillavou M, Shum C, Ross D, Tubbs RS, Shih A, et al. Virtual Interactive Presence and Augmented Reality (VIPAR) for remote surgical assistance. Neurosurgery. 2011;68:ons200–7. doi: 10.1227/NEU.0b013e3182077efd.
- 16. Spicer MA, Apuzzo MLJ. Virtual reality surgery: Neurosurgery and the contemporary landscape. Neurosurgery. 2003;52:489–97; discussion 496–7. doi: 10.1227/01.neu.0000047812.42726.56.
- 17. Stadie AT, Kockro RA, Reisch R, Tropine A, Boor S, Stoeter P, et al. Virtual reality system for planning minimally invasive neurosurgery. Technical note. J Neurosurg. 2008;108:382–94. doi: 10.3171/JNS/2008/108/2/0382.
- 18. Unity – Overview. [Last accessed on 2017 Mar 03]. Available from: https://unity3d.com/unity