Abstract
Recent miniaturization of electronic components and advances in image processing software have facilitated the entry of extended reality technology into clinical practice. In the last several years the number of applications in cardiology has multiplied, with many promising to become standard of care. We review many of these applications in the areas of patient and physician education, cardiac rehabilitation, pre-procedural planning, and intraprocedural use. The rapid integration of these approaches into the many facets of cardiology suggests that extended reality will one day become an everyday part of physician practice.
Introduction
The extended realities represent a continuum that spans from fully immersive, occlusive digital environments to unmodified, unobstructed physical reality, addressing diverse clinical and educational challenges in cardiology. Given the intrinsic advantage of 3-dimensional (3D) visualization and the increased availability of Virtual Reality (VR) consumer devices, many immersive 3D educational applications have emerged and have undergone rapid development. While these higher resolution, fully immersive systems provide an ideal platform for educational applications, there is still reluctance to integrate fully immersive devices into a clinical environment because they completely obstruct the normal field of vision. On the other end of the clinical spectrum, minimally intrusive Augmented Reality (AR) platforms allow physicians to view patient data through a pair of glasses while maintaining eye contact. As display hardware capabilities have matured, a third major cluster of features has emerged: Mixed Reality (MR), which addresses the middle ground between VR and AR. In concert, voice, gesture, and gaze tracking have continued to mature as user input methods, providing more intuitive control. With these developments, users can realize the benefits of 3D visualization and the improved comprehension that comes with direct control, all while remaining in their natural space. This technical leap made the intuitive benefits of immersive 3D more tractable by allowing the user to remain connected to the environment, minimizing additional risk to the patient, and lowering the barrier to more clinically oriented applications. There is growing recognition that the power of these applications lies not only in 3D visualization, but also in the ability to control and manipulate digital images, with the potential to give the physician control over each critical tool in the operating room.
AR, MR and VR each provide unique technical capabilities to create an appropriate integration of digital and physical reality and to address the varied user needs of different cardiology applications (1). AR is generally the least intrusive of the extended realities and is well suited to provide contextually relevant information or notifications to the user. Sometimes referred to as data snacking, this type of AR is well suited for providing information normally accessed on paper or on traditional displays. MR integrates information into the physical environment to provide additional spatial relevance and context to the displayed information. MR anchors or overlays information in physical space and enables the user to interact with digital objects more as if they were present in physical space (2). VR replaces the physical environment entirely with a digital one and can selectively substitute or remove physical information as well as add digital information. VR is capable of supporting the same interactions as MR, but prevents user interaction with the physical space (See Figure 1).
This review article will explore various cardiac applications of the extended realities, including the method of control for the user. A general overview of the landscape with technical details and clinical data, where applicable, will be presented.
Education and Training
Cardiac applications in education and training can target different audiences: for instance, they can be patient-facing or medical personnel-facing (3). Certain applications may do both, with different goals for each end user group (4). Education and training applications translate traditional materials into a 3D format to allow the user to participate in or experience the content in a controlled yet customizable manner.
Patient Education
Project Brave Heart.
Project Brave Heart is one of the 3 aims of the VR program at Lucile Packard Children’s Hospital Stanford (5). This project aims to reduce anxiety and stress in patients who are scheduled for cardiac catheterization procedures (6). Up to 40 patients, ages 8–25, are asked to watch a program multiple times in the week prior to their procedure. The program utilizes a VR headset to provide the patient with a fully immersive experience in which they walk through the hospital, pre-procedural area, cardiac catheterization lab, and the recovery area prior to their procedure (See Figure 1, panel A). The hypothesis is that patients who undergo cardiac procedures experience significant procedure-related anxiety and that, by providing an immersive experience combined with teaching mindfulness techniques, VR can help reduce this anxiety. As part of the study, measures of stress, such as heart rate, blood pressure, and cortisol levels, are collected. The expectation is that data from this project will mirror results from the adjacent field of pediatric pain therapy (7), where VR has been successfully deployed to manage acute, chronic, and periprocedural pain.
Medical Student Education/Training
The Body VR.
This system (https://thebodyvr.com/) utilizes fully immersive VR visualization for 3 specific use cases: 1) a journey inside a cell, 2) an anatomy viewer, and 3) patient pre-procedural teaching for colonoscopy. In the journey inside a cell, the user travels within a blood cell through the bloodstream to understand how cells work together throughout the body. The user can then dive further into the cell, exploring the intracellular architecture and the mechanics of cellular function. The anatomy viewer is a commercially available tool that allows for visualization of patient-specific DICOM data, including computerized tomography (CT), magnetic resonance imaging (MRI), and positron emission tomography (PET) scans. These data are displayed using an Oculus Rift or HTC Vive VR headset (See Figure 1, panel A), which support 1080×1200 pixels per eye and an approximate field of view of 110 degrees. While viewing the virtual models, users can scale, rotate, and crop the models to identify lesions or abnormalities, and can then annotate the model. “Colon Crossing” is the final use case and is a patient-facing tool to improve patient education and compliance prior to colonoscopy.
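To make the anatomy-viewer workflow concrete, the sketch below illustrates the first stage of a generic DICOM-to-3D pipeline: reading a CT series, ordering the slices, and converting them to Hounsfield units for later rendering. This is a minimal illustration using the open-source pydicom library, not The Body VR’s actual implementation, and the directory path is hypothetical.

```python
# Minimal sketch of the first stage of a DICOM-to-3D pipeline (illustrative only):
# read a CT series, order the slices along the patient axis, and build a volume in
# Hounsfield units that a VR/MR renderer could then volume-render or mesh.
from pathlib import Path

import numpy as np
import pydicom


def load_ct_volume(series_dir: str) -> np.ndarray:
    """Load every DICOM slice in a directory and stack them into a 3D array (z, y, x)."""
    slices = [pydicom.dcmread(p) for p in Path(series_dir).glob("*.dcm")]
    # Sort by the z-component of ImagePositionPatient so slices stack in anatomical order.
    slices.sort(key=lambda s: float(s.ImagePositionPatient[2]))

    volume = np.stack([s.pixel_array.astype(np.int16) for s in slices])
    # Convert stored pixel values to Hounsfield units using the rescale tags.
    slope = float(slices[0].RescaleSlope)
    intercept = float(slices[0].RescaleIntercept)
    return volume * slope + intercept


if __name__ == "__main__":
    hu_volume = load_ct_volume("./example_ct_series")  # hypothetical path
    print(hu_volume.shape, hu_volume.min(), hu_volume.max())
```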
Additionally, a system utilizing this technology was developed as an adjunctive tool for teaching anatomic detail to medical students. Teaching cardiac anatomy in medical school coursework is challenging due to the complex 3D nature of the pathology and pathophysiology. In 2018, Maresky et al (8) published their work using a VR simulation model as part of the anatomy education curriculum for medical students. A total of 42 first-year medical students voluntarily enrolled in the study and were randomized to +VR exposure or -VR exposure. The -VR group had traditional cadaveric dissection teaching followed by independent study, whereas the +VR group, during the independent study phase, was given a 5-minute tutorial and 25 minutes of explorative study on a virtual cardiac model using The Body VR (9) within an Oculus Rift VR headset. All students underwent a pre-intervention and post-intervention quiz.
The -VR group demonstrated no significant difference between pre- and post-intervention quizzes, whereas the +VR group demonstrated a significant 28% (p<0.001) overall increase in test scores. Interestingly, the +VR group scored 21.4% (p=0.004) higher on conventional content, 26.4% (p<0.001) higher on visual-spatial content, and 23.9% (p<0.001) higher on the overall post-intervention quiz when compared to the -VR group (8). This study demonstrated not only the feasibility of using VR as an adjunct to traditional medical student education, but also its added value in understanding complex anatomic relationships.
HoloAnatomy.
Developed at Case Western Reserve University (CWRU) under the leadership of Dr. Mark Griswold, the HoloAnatomy application enables students, particularly medical students, to examine the body in its totality, as well as the organ systems and their relationships within the body. HoloAnatomy serves as the first component of what is to become a full-fledged holographic anatomy curriculum scheduled to start in 2019 at CWRU (10) and uses the Microsoft HoloLens MR head-mounted display for hardware (See Figure 1, panel C).
The Virtual Heart.
The Stanford Virtual Heart Project emerged to create and evaluate a VR headset-based interactive virtual heart for training (5). The user can explore the chambers of the heart and experience a fully immersive environment to understand cardiac chamber anatomy and relationships, great vessel anatomy and relationships, and blood circulation through the heart. In addition to normal cardiac models, users also interact with common congenital heart defects. This program is available to medical trainees, cardiologists and cardiothoracic surgeons. At completion, this project aims to have created a virtual atlas of approximately 24 models for trainees to interact with and learn from. The program will then be made available to teens, parents, and caregivers. The goal of the technology is to make the individual patient anatomy easier for the entire care team, including family, to understand.
Anima Res.
Anima Res (https://animares.com/) is a 3-dimensional medical animation company with applications in augmented, mixed, and virtual reality (9, 11). The mission of this team is to make medical education more tangible for medical students, physicians, and patients. Specifically, the “Insight Heart” application allows for virtual exploration of the human heart, with visual effects of myocardial infarction, systemic hypertension, and atrial fibrillation viewable in this immersive experience. This experience is possible on a variety of extended reality hardware platforms.
Simulators.
Some of these standalone education and training applications have been further enhanced through integration with hardware simulators (12). The Vimedix transesophageal echocardiogram (TEE) and transthoracic echocardiogram (TTE) simulator by CAE demonstrates several potential applications of mixed reality (13). The applications of the MR simulator include both standard educational endpoints, such as understanding anatomical relationships, and procedural skills, such as the use and positioning of the probe itself (14).
Cardiac Patient Rehabilitation
MindMaze.
MindMotion PRO and MindMotion GO (https://www.mindmaze.com/) are tools designed to provide intensive physical rehabilitation to adult patients in both inpatient and outpatient settings. These devices use a conventional monitor displaying a virtual 3D environment, coupled with motion tracking cameras, to allow patients to control a virtual avatar and visualize feedback on the monitor. MindMotion uses AR to amplify and project movements during rehabilitative exercises, providing additional guidance and feedback to the participant. As the participant performs movements as directed on screen, their movements are displayed in either a third-person or virtual mirror view to provide amplified feedback based on the sensed movements. A study published in the Journal of NeuroEngineering and Rehabilitation used embodied AR to administer rehabilitation to 10 stroke survivors with chronic (>6 months) upper extremity paresis (15). This embodied AR was well tolerated in the pilot group and may be beneficial for functional recovery, as demonstrated by a median 5.4% improvement in motor function. Although the devices themselves do not use an immersive display, the motion capture allows for intuitive interaction with and control of the 3D avatar, allowing for more intense intervention without an increase in participant stress. Current research in perceived pain and stress (16) suggests this effect could be further enhanced by integrating MindMotion technology with an immersive VR display.
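To illustrate the idea of amplified, mirrored movement feedback, the sketch below reflects tracked joint positions across the body midline and scales their displacement before driving an on-screen avatar. The gain, joint set, and coordinate convention are hypothetical illustrations, not MindMaze parameters.

```python
# Illustrative sketch of "virtual mirror" feedback (not MindMaze's implementation):
# tracked joint positions are reflected across the body midline and the displacement
# from a resting pose is amplified before driving the on-screen avatar.
import numpy as np

AMPLIFICATION_GAIN = 1.5  # hypothetical gain; a real system would tune feedback per patient


def mirrored_amplified_pose(tracked: np.ndarray, rest: np.ndarray) -> np.ndarray:
    """tracked, rest: (n_joints, 3) arrays of x, y, z positions in the camera frame."""
    displacement = tracked - rest
    amplified = rest + AMPLIFICATION_GAIN * displacement
    mirrored = amplified.copy()
    mirrored[:, 0] *= -1.0  # reflect across the sagittal (x = 0) plane for the mirror view
    return mirrored


if __name__ == "__main__":
    rest_pose = np.array([[0.25, 1.2, 0.4], [0.30, 1.0, 0.5]])   # e.g., elbow, wrist
    current = np.array([[0.27, 1.25, 0.42], [0.38, 1.05, 0.55]])
    print(mirrored_amplified_pose(current, rest_pose))
```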
Pre-Procedural Planning
With the advent of AR and MR, several tools have been developed to act as standalone DICOM viewers (see The Body VR above, as an example). These viewers allow the user to import standard radiologic imaging, such as CT, MRI, and PET scans, and display these data sets in true 3D. Open source systems that allow for import and display of 3D echocardiographic (3DE) data (16, 17) have also become available. Additionally, systems have been developed organically at academic institutions and used for pre-procedural planning in patients undergoing surgical repair of congenital heart disease, demonstrating the feasibility of such systems (17, 18).
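As a concrete illustration of how such a viewer might turn an imported data set into a 3D object for planning, the sketch below uses scikit-image’s marching cubes to extract a surface mesh from a thresholded CT volume. This is a generic, assumed workflow rather than any vendor’s pipeline, and the threshold value is illustrative only.

```python
# Minimal sketch of turning a thresholded cardiac CT volume into a surface mesh that an
# AR/MR/VR viewer could load for pre-procedural review (illustrative, not vendor code).
import numpy as np
from skimage import measure


def blood_pool_mesh(hu_volume: np.ndarray, threshold_hu: float = 200.0):
    """Extract a triangle mesh of the contrast-enhanced blood pool from a CT volume
    expressed in Hounsfield units. The threshold is illustrative, not a clinical value."""
    mask = hu_volume > threshold_hu
    verts, faces, normals, _ = measure.marching_cubes(mask.astype(np.float32), level=0.5)
    return verts, faces, normals


if __name__ == "__main__":
    demo = np.zeros((64, 64, 64), dtype=np.float32)
    demo[20:44, 20:44, 20:44] = 400.0  # synthetic "contrast-filled chamber"
    verts, faces, _ = blood_pool_mesh(demo)
    print(f"{len(verts)} vertices, {len(faces)} triangles")
```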
EchoPixel.
The third arm of the Stanford VR program uses 3-dimensional VR imaging for pre-procedural planning, helping surgeons plan a given cardiac operation. This arm allows cardiothoracic surgeons to perform virtual run-throughs of the procedure prior to conducting the actual procedure (5). The system utilizes the True3D technology of EchoPixel, based in Mountain View, CA (https://www.echopixeltech.com/). The technology uses a Hewlett Packard Enterprise Zvr 1080p active 3D VR display and stylus. EchoPixel’s software, used in conjunction with the desktop monitor-style display and special glasses, allows an operator to visualize the cardiac anatomy in 3 dimensions, while the stylus allows the operator to interact with the data. Using the stylus, the user can rotate, cut into, and measure specific parts of the anatomy. Subsequently, there has been an increase in the number of centers using this kind of technology for preprocedural planning (19).
Intra-procedural Use
Early Work in Intraprocedural Augmented Reality.
In 2015, the cardiology group in Warsaw, Poland described a first-in-man use of Google Glass AR technology (See Figure 1, Panel B) in the interventional cardiology laboratory (20). In this case, a 49-year-old hypertensive patient with a history of coronary artery bypass grafting of the left circumflex artery presented to the hospital with new-onset angina on mild exertion. A preprocedural CT angiogram demonstrated an extensive perfusion defect in the distribution of the right coronary artery. Static 3D CT angiogram images were captured at angulations that highlighted the perfusion defect. When the interventional cardiologist took the patient to the catheterization laboratory, these static images were displayed in the field of view of the cardiologist at the corresponding angulations to assist in localization of the stenosis for intervention. This case demonstrated that the use of an AR display could allow for better planning and guidance of interventional procedures, and that wearable devices, in this case Google Glass with its monocular 640×360 display, can improve procedural efficiency. While only a single case report, it demonstrated the use of AR as something more than a preprocedural planning tool, showing true intraprocedural utility, and became a springboard for the technologies discussed below.
RealView Imaging.
In 2016, the pediatric cardiology section at Schneider Children’s Medical Center in Israel, in concert with Philips and RealView Imaging (http://realviewimaging.com/), evaluated the use of computer-generated holography for identifying landmarks from 3D transesophageal echocardiography (3DTEE) and 3D rotational angiography (3DRA) (21). The core technology behind RealView Imaging provides 3-dimensional interference-based holograms based on “Digital Light Shaping™” technology. These images are synthesized by a spatial light modulator to emulate the normal interaction of light with a physical object in 3D space, thereby allowing the user to view and interact with a true hologram within the field of view of the display, making this an MR display. This display medium preserves the focal cues used when observing close objects and resulted in successful identification of anatomical landmarks relevant to the respective procedures. The display is coupled with tools and hand tracking to allow for multiple, natural modes of interaction with the generated image. The current generation of equipment for true hologram creation and interaction requires a HoloScope, which limits the experience to a fixed window on the world.
Bruckheimer et al (21) designed a study to demonstrate the technical feasibility of creating holograms within the cardiac catheterization lab. The primary objective was to demonstrate that the anatomical landmarks identified on standard of care imaging could similarly be identified independently on holographic imagery. The secondary objective was to demonstrate functionality and usability of interacting with the holograms, such as rotation, cutting, and zoom/magnification. Eight patients were enrolled in the study (5 patients undergoing atrial septal defect closure with 3DTEE, 2 patients undergoing right ventricular outflow tract/coronary artery assessment prior to percutaneous pulmonary valve placement using 3DRA, and 1 patient with a Glenn shunt undergoing catheterization using 3DRA). In all cases holograms were easily created, visible, and of good visual characteristics for observation and interaction. All relevant landmarks were identifiable by both standard imaging and by holographic imaging. Additionally, all users were able to perform the functional interactive gestures with the hologram and rated the interactions as very easily performed. This early feasibility study demonstrated the utility for improved 3-dimensional visualization in the cardiac catheterization lab.
EchoPixel.
The EchoPixel system, described previously, has also been utilized intraprocedurally. Ballocca et al investigated the feasibility of the EchoPixel stereoscopic display compared with current 2D measurement toolsets for examination of mitral valve (MV) anatomic structures (22). Although EchoPixel demonstrated feasibility for specific measurements, the system did not improve time to completion of measurements, particularly without the specialized measurement tools and integrations of the existing QLAB Mitral Valve Navigation (MVN) software. The EchoPixel stereoscopic display and stylus provide visualization and control to the clinician but lack specializations for the measurement of MV anatomy. In this MR application, a physician uses the stylus to manipulate the anatomy for annotation and to measure specific structures using the built-in linear, orthogonal, and spline curve measurements. This method of interaction resulted in good intraobserver variability and consistent time to completion of measurement of the anatomic structures. Agreement with MVN was low for annular area and circumference measurements in the pathologic group and for scallop measurements in both groups. This combination of intraobserver consistency and low agreement may be attributable to the limited degrees of freedom for measuring circumference and area in EchoPixel when compared to MVN. Although not specifically discussed, intraobserver and interobserver consistency with EchoPixel may also be attributable to improved ease of interpretation of the datasets in 3 dimensions, as evidenced by the relatively stable time to completion when using the EchoPixel suite. Additionally, the authors suggest that the simple, direct measurements may be easier for non-echocardiographers to use. This evidence suggests that the improved visualization capabilities of the EchoPixel display may allow less specialized users to understand and measure MV anatomy, but the accuracy of these measurements is currently limited by the ability to interact with and measure complex structures using the current interface.
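As an illustration of the kind of spline-based measurement described above, the sketch below fits a closed spline through annotated annulus landmarks and estimates the circumference by densely sampling the curve. This is a generic approach using SciPy, not EchoPixel’s measurement code, and the landmark coordinates are synthetic.

```python
# Illustrative sketch of a spline-based annular circumference measurement:
# fit a closed 3D spline through user-annotated annulus points and approximate its
# arc length by densely sampling the curve (not EchoPixel's implementation).
import numpy as np
from scipy.interpolate import splev, splprep


def closed_spline_length(points: np.ndarray, n_samples: int = 2000) -> float:
    """points: (n, 3) array of annotated annulus landmarks in millimetres, first point
    repeated at the end to close the loop."""
    tck, _ = splprep(points.T, s=0, per=True)          # periodic spline => closed curve
    u = np.linspace(0.0, 1.0, n_samples)
    x, y, z = splev(u, tck)
    curve = np.column_stack([x, y, z])
    return float(np.sum(np.linalg.norm(np.diff(curve, axis=0), axis=1)))


if __name__ == "__main__":
    # Synthetic saddle-shaped "annulus" roughly 30 mm in radius, for demonstration only.
    t = np.linspace(0.0, 2.0 * np.pi, 12, endpoint=False)
    annulus = np.column_stack([30 * np.cos(t), 30 * np.sin(t), 4 * np.cos(2 * t)])
    annulus = np.vstack([annulus, annulus[:1]])  # close the loop explicitly
    print(f"Estimated circumference: {closed_spline_length(annulus):.1f} mm")
```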
SentiAR.
The SentiAR solution (https://www.sentiar.com/) is currently being developed for the electrophysiology laboratory, with potential adjacent future applications in interventional cardiology procedures. Current limitations in the electrophysiology laboratory include the use of multiple pieces of equipment, each with its own interface and display, none of which interface with one another. Electroanatomic mapping data, a complex 3D data set, is compressed onto a 2D screen.
The SentiAR system accepts data from a commercially available electroanatomic mapping system and displays patient-specific cardiac geometries, electroanatomic maps, and catheter locations in real time using a Microsoft HoloLens 720p stereoscopic 3D headset (See Figure 1, Panel C). Additionally, standard DICOM images can be displayed through the SentiAR system once imported and segmented via the mapping systems.
The sterile, hands-free (gaze-dwell) interface allows the operator to control and manipulate the models to best enhance their use during the case, all while maintaining procedural sterility. This system couples true 3D visualization with the ability to control and manipulate the data without requiring the use of the hands, a true MR application for intraprocedural use. This specialization is necessary for applications in which the operator’s hands are occupied with other instruments during use of the system. The model manipulations include the ability to rotate (to both standard and nonstandard angulations used in the laboratory), magnify/zoom, clip into, and alter the transparency of the model (1). Other functionality includes a sharing mode that allows up to 5 users to engage in a single, shared session. In this mode, one user retains active control of the model while the remaining users are bystanders in that session, and the model is anchored in the environment such that each user views it from their own unique perspective. This “Teacher/Student” mode allows multiple users to share a single, unified model from which to describe and discuss the patient’s anatomy and electrophysiologic substrate.
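The logic of a gaze-dwell control can be illustrated with a simple dwell-timer sketch: a selection fires only after the gaze cursor has remained within a target for a set time. The dwell time, target radius, and button layout below are hypothetical and do not reflect SentiAR’s implementation.

```python
# Illustrative sketch of a gaze-dwell selection loop: a gaze cursor must remain inside a
# target region for a fixed dwell time before the associated control is triggered,
# keeping the operator's hands free (not SentiAR's implementation).
import math
import time

DWELL_SECONDS = 1.0        # hypothetical dwell threshold
TARGET_RADIUS_DEG = 2.0    # hypothetical angular radius of a button, in degrees


class GazeDwellButton:
    def __init__(self, center_deg):
        self.center = center_deg      # (azimuth, elevation) of the button in degrees
        self.dwell_start = None       # time the gaze first entered the target

    def update(self, gaze_deg, now=None) -> bool:
        """Feed the current gaze direction; return True once the dwell completes."""
        now = time.monotonic() if now is None else now
        dist = math.hypot(gaze_deg[0] - self.center[0], gaze_deg[1] - self.center[1])
        if dist > TARGET_RADIUS_DEG:
            self.dwell_start = None   # gaze left the target: reset the timer
            return False
        if self.dwell_start is None:
            self.dwell_start = now
        return (now - self.dwell_start) >= DWELL_SECONDS


if __name__ == "__main__":
    rotate_button = GazeDwellButton(center_deg=(10.0, -5.0))
    # Simulate 1.2 s of gaze samples held on the button at 60 Hz.
    for i in range(72):
        if rotate_button.update(gaze_deg=(10.3, -4.8), now=i / 60.0):
            print(f"Rotate control activated after {i / 60.0:.2f} s of dwell")
            break
```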
Data presented in 2018 (23–26) demonstrated feasibility of the system, including acceptable engineering and visualization metrics for intraprocedural use.
Ongoing AR/VR Research in Intraprocedural Electrophysiology
In addition to the companies listed above, several academic centers have ongoing projects in intraprocedural electrophysiology. In 2018, Jang et al (27) published their data on 3D holographic visualization of high-resolution myocardial scar, as defined on cardiac MRI, using the Microsoft HoloLens. The hardware used by the group was the HoloLens mixed reality head-mounted display with a gaze-gesture interface. This type of interface requires the user to combine gaze-dwell to move the “cursor” with a gesture, or “air tap,” to select a menu choice.
In the study, 5 swine underwent controlled surgical infarction and high-resolution cardiac MRI to identify myocardial scar substrate. Subsequently, they underwent endocardial electroanatomic mapping to identify ventricular tachycardia substrate. Using the HoloLens, the generated maps were holographically displayed. This proof of concept study addressed feasibility, and importantly went on to obtain early assessment of usability.
At the conclusion of the animal study, both the operators and the mapping specialist were provided questionnaires addressing usability and usefulness. Both users found the HoloLens display of ventricular scar useful (scale 1 [low] – 7 [high]; operator rating average 5.8, mapping specialist rating average 5.5). Additionally, the authors felt that the ability to have true 3D visualization of the scar, coupled with the ability to interact with it and deeply understand the visual-spatial anatomic relationships, may facilitate MRI-guided, substrate-based VT ablation.
Ongoing AR Research in Interventional Cardiology
Sadri et al (28) presented their data using an augmented reality guidance system during interventional cardiology procedures. Specifically, the system displayed a virtual, patient-specific 3-dimensional anatomic model intraprocedurally during transcatheter aortic valve replacement and cerebral embolic protection (CEP) device placement in 6 patients, utilizing a Microsoft HoloLens mixed reality headset. In this early feasibility study, they found that AR guidance eliminated the need for aortic arch angiograms and additional contrast exposure prior to CEP device placement, as confirmed with fluoroscopy and post-procedure patient interviews. They concluded that AR guidance was feasible, reduced contrast and fluoroscopy exposure, and could make transcatheter interventions faster, safer, and more effective. While there is much ongoing development in this field, this early feasibility testing was quite promising.
Technology Limitations
Current technologies are predominantly designed as head-mounted displays and are fundamentally limited by the number of points of light they can synthesize as well as the total brightness they can emit. VR displays produce the widest field of view with the most accurate color reproduction but will continue to have difficulty reproducing peripheral vision. AR displays must not only make the same tradeoffs between field of view and angular resolution, but some also make compromises to provide multiple focal planes; the Magic Leap, for example, provides 2 focal planes at the expense of overall brightness and sharpness. The Magic Leap device also improves field of view by moving the display closer to the eyes, which limits the use of normal corrective eyewear. Hand and controller tracking in VR is also more forgiving due to the lack of visual reference cues during tracking. In AR, hand and controller tracking must be precise in order to align digital augmentations over hands or controllers in the physical space.
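The field-of-view versus angular-resolution tradeoff can be made concrete with a rough pixels-per-degree calculation. The horizontal pixel counts below come from the headsets discussed in this review; the roughly 30-degree horizontal field of view assumed for the first-generation HoloLens is a commonly cited approximation, not a figure from the cited studies.

```python
# Back-of-the-envelope illustration of the field-of-view vs angular-resolution tradeoff.
# Pixel counts are taken from the headsets discussed above; the ~30 degree HoloLens
# field of view is an assumed approximation rather than a figure from this review.
displays = {
    "Oculus Rift / HTC Vive (per eye)": {"pixels_horizontal": 1080, "fov_deg": 110.0},
    "Microsoft HoloLens (1st gen)":     {"pixels_horizontal": 1280, "fov_deg": 30.0},
}

for name, spec in displays.items():
    ppd = spec["pixels_horizontal"] / spec["fov_deg"]
    print(f"{name}: ~{ppd:.1f} pixels per degree over {spec['fov_deg']:.0f} deg")

# Wider fields of view spread the same pixel budget over more of the visual field, so
# immersive VR optics trade angular sharpness for peripheral coverage, while narrow
# AR/MR displays concentrate their pixels into a smaller, sharper window.
```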
Future Directions
Given the proliferation of applications being created, it is quite likely that by the time of this publication there will be additional applications not covered by this review. Significant development is anticipated not only in software applications but in hardware as well. There will be significant growth in non-FDA-regulated spaces, such as patient education, medical student education, and preprocedural planning, as well as an increase in applications used intraprocedurally in both minimally invasive and open surgical procedures.
A number of hardware solutions are also being created for the extended realities. While many of these solutions are being developed by small businesses, iterative improvement from larger companies, such as Microsoft, will contribute to advancing the field. The most recent descriptions of HoloLens 2, for instance, are simply the beginning of a long path. Additionally, AR/MR displays may evolve beyond head-mounted form factors; this type of hardware may provide additional opportunities not currently targeted.
Conclusion
The evidence for the application and benefit of the extended realities in cardiology is building, spanning from education and training to cardiac patient rehabilitation, and from pre-procedural planning to intra-procedural use, on platforms with relatively modest resolution and hardware specifications. Hardware innovations have increased the fidelity of 3D visualization, enabling greater immersion during patient therapy, improvements in learning comprehension during medical training, and improved understanding of patient-specific anatomy before and during procedures. Improvements in software and hardware that enable more natural interactions will continue to accelerate the completion of complex tasks using 3D data. The applications of these technologies within cardiology, and clinical medicine in general, will continue to expand to provide improved visualization and application-specific controls. The promise of these technologies is to empower the user to control the data to maximize patient benefit, whether through the visualization of 3-dimensional objects in a true 3-dimensional space, the control and manipulation of holographic images to maximize understanding of visual-spatial relationships, or the control of tools that were previously out of reach. The next horizon in cardiology is to realize these promises.
Table 1.
| | Education / Simulation | Patient Therapy | Pre-procedural | Intraprocedural |
| --- | --- | --- | --- | --- |
| Immersion | • Full immersion in simulated environment | • Full immersion allows control of patient environment | • Full immersion allows control of distractions and viewing environment | • View through display allows increased awareness |
| Sterile Control | • Not required | • Not required | • Not required | • Beneficial |
| 3D Visualization | • Accelerated skill acquisition • Improved retention | • Improves embodiment/immersion | • Improved comprehension • Consistent measurement | • Improved comprehension • Reduced mental fatigue |
| Display Fidelity | • Higher field of view enables better immersion | • Higher field of view enables better immersion | • High resolution improves overall image quality | • Display must accommodate physician working environment |
Table 2.
| | Augmented Reality | Mixed Reality | Virtual Reality |
| --- | --- | --- | --- |
| Immersion | • Digital environment visible at a glance | • Digital environment overlaid on physical environment | • Fully immersed in digital environment |
| Mobility | • Mobility unrestricted | • Mobility unrestricted | • Tethered and/or cannot see physical obstacles |
| 3D Visualization | • 2D display | • 3D display | • 3D display |
| Display Fidelity | • Smallest field of view • Lowest resolution | • Limited field of view • Limited resolution | • Wide field of view • High resolution |
Acknowledgments
Funding Source: Funding for research discussed in this manuscript was provided by the Children’s Discovery Institute of Washington University and St. Louis Children’s Hospital (Grant CH-II-2017–575) and a National Institutes of Health Small Business Innovation Research award (SBIR Fast-Track Grant R44 HL140896).
Financial Disclosure: JNAS receives research support from Medtronic, Inc, Abbott, Inc, and AliveCor, Inc. Abbott has provided research support (software) for this project.
Footnotes
Conflict of Interest: JNAS and JRS are co-inventors of the system. JNAS, JRS and MKS are co-founders of SentiAR, Inc. MKS, JNAS, and JRS are SentiAR, Inc. shareholders.
References:
- 1. Silva JNA, Southworth M, Raptis C, Silva J. Emerging Applications of Virtual Reality in Cardiovascular Medicine. JACC Basic Transl Sci 2018;3(3):420–30.
- 2. McJunkin JL, Jiramongkolchai P, Chung W, Southworth M, Durakovic N, Buchman CA, et al. Development of a Mixed Reality Platform for Lateral Skull Base Anatomy. Otol Neurotol 2018;39(10):e1137–e42.
- 3. Craig E, Georgieva M. VR and AR: Driving a Revolution in Medical Education & Patient Care. EDUCAUSE Review; 2017 [August 30, 2017]. Available from: https://er.educause.edu/blogs/2017/8/vr-and-ar-driving-a-revolution-in-medical-education-and-patient-care.
- 4. Breining G. Future or fad? Virtual reality in medical education. Association of American Medical Colleges News; 2018 [August 28, 2018]. Available from: https://news.aamc.org/medical-education/article/future-or-fad-virtual-reality-medical-education/.
- 5. Stanford Children’s Health, Lucile Packard Children’s Hospital Stanford. Lucile Packard Children’s Hospital Stanford pioneers use of VR for patient care, education and experience. 2018. Available from: https://www.stanfordchildrens.org/en/about/news/releases/2017/virtual-reality-program.
- 6. Stanford Children’s Health, Lucile Packard Children’s Hospital Stanford. Project Brave Heart: Studying the impact of virtual reality preparation and relaxation therapy. 2017. Available from: https://www.stanfordchildrens.org/en/innovation/virtual-reality/anxiety-research.
- 7. Won AS, Bailey J, Bailenson J, Tataru C, Yoon IA, Golianu B. Immersive Virtual Reality for Pediatric Pain. Children (Basel) 2017;4(7).
- 8. Maresky HS, Oikonomou A, Ali I, Ditkofsky N, Pakkal M, Ballyk B. Virtual reality and cardiac anatomy: Exploring immersive three-dimensional cardiac imaging, a pilot study in undergraduate medical anatomy education. Clin Anat 2018.
- 9. The Body VR. Available from: https://thebodyvr.com/.
- 10. Digital Trends. Microsoft’s HoloLens gains momentum from award-winning HoloAnatomy app. 2016. Available from: https://www.digitaltrends.com/virtual-reality/hololens-holoanatomy-award-jackson-hole-science-media-awards/.
- 11. Butler JF, Holcomb JB, Shackelford S, Montgomery HR, Anderson S, Cain JS, et al. Management of Suspected Tension Pneumothorax in Tactical Combat Casualty Care: TCCC Guidelines Change 17–02. J Spec Oper Med 2018;18(2):19–35.
- 12. Talbot H, Spadoni F, Duriez C, Sermesant M, O’Neill M, Jais P, et al. Interactive training system for interventional electrocardiology procedures. Med Image Anal 2017;35:225–37.
- 13. Mahmood F, Mahmood E, Dorfman RG, Mitchell J, Mahmood FU, Jones SB, et al. Augmented Reality and Ultrasound Education: Initial Experience. J Cardiothorac Vasc Anesth 2018;32(3):1363–7.
- 14. Pantelidis P, Chorti A, Papagiouvanni I, Paparoidamis G, Drosos C, Panagiotakopoulos T, Lales G, Sideris M. Virtual and Augmented Reality in Medical Education. In: Medical and Surgical Education - Past, Present and Future. 2017. p. 77–97.
- 15. Perez-Marcos D, Chevalley O, Schmidlin T, Garipelli G, Serino A, Vuadens P, et al. Increasing upper limb training intensity in chronic stroke using embodied virtual reality: a pilot study. J Neuroeng Rehabil 2017;14(1):119.
- 16. Tashjian VC, Mosadeghi S, Howard AR, Lopez M, Dupuy T, Reid M, et al. Virtual Reality for Management of Pain in Hospitalized Patients: Results of a Controlled Trial. JMIR Ment Health 2017;4(1):e9.
- 17. Ong CS, Krishnan A, Huang CY, Spevak P, Vricella L, Hibino N, et al. Role of virtual reality in congenital heart disease. Congenit Heart Dis 2018;13(3):357–61.
- 18. Mendez A, Hussain T, Hosseinpour AR, Valverde I. Virtual reality for preoperative planning in large ventricular septal defects. Eur Heart J 2018.
- 19. Marketwired. EchoPixel Announces Progress in the Clinical Adoption of Interactive Virtual Reality for Pediatric Surgery. 2017. Available from: http://www.marketwired.com/press-release/echopixel-announces-progress-clinical-adoption-interactive-virtual-reality-pediatric-2202796.htm.
- 20. Opolski MP, Debski A, Borucki BA, Szpak M, Staruch AD, Kepka C, et al. First-in-Man Computed Tomography-Guided Percutaneous Revascularization of Coronary Chronic Total Occlusion Using a Wearable Computer: Proof of Concept. Can J Cardiol 2016;32(6):829.e11–3.
- 21. Bruckheimer E, Rotschild C, Dagan T, Amir G, Kaufman A, Gelman S, et al. Computer-generated real-time digital holography: first time use in clinical medical imaging. Eur Heart J Cardiovasc Imaging 2016;17(8):845–9.
- 22. Ballocca F, Meier LM, Ladha K, Qua Hiansen J, Horlick EM, Meineri M. Validation of Quantitative 3-Dimensional Transesophageal Echocardiography Mitral Valve Analysis Using Stereoscopic Display. J Cardiothorac Vasc Anesth 2018.
- 23. Silva JNA, Southworth MK, Dalal AS, Van Hare GF, Silva JR. Improving Visualization and Interaction During Transcatheter Ablation Using an Augmented Reality System: First-in-Human Experience. American Heart Association Scientific Sessions; 2018; Anaheim, CA. Circulation.
- 24. Southworth MK, Silva JNA, Silva JR. Using Augmented Reality to Interact with 3D Holographic Images of Intracardiac Geometry and Catheter Positions During Cardiac Ablation Procedures. Biomedical Engineering Society Scientific Sessions; 2017; Phoenix, AZ.
- 25. Silva JNA, Southworth MK, Dalal AS, Van Hare GF, Silva JR. Improved Accuracy Using Mixed Reality Visualization and Interaction During Transcatheter Ablation Procedures. Biomedical Engineering Society Scientific Sessions; 2018; Atlanta, GA.
- 26. Silva JNA, Southworth MK, Van Hare GF, Dalal AS, Silva JR. Improved Visualization and Interaction During Transcatheter Ablation Procedures: Results From Initial Human Experience. Heart Rhythm Society Scientific Sessions; 2018; Boston, MA.
- 27. Jang J, Tschabrunn CM, Barkagan M, Anter E, Menze B, Nezafat R. Three-dimensional holographic visualization of high-resolution myocardial scar on HoloLens. PLoS One 2018;13(10):e0205188.
- 28. Sadri S, Loeb G, Elvezio C, Velagapudi P, Ng VG, Khalique O, Moses JW, Sommer RJ, Patel AJ, George I, Hahn RT, Leon MB, Kirtane AJ, Nazif TM, Kodali SK, Feiner SK, Vahl TP. Augmented Reality Guidance for Cerebral Embolic Protection (CEP) With the Sentinel Device During Transcatheter Aortic Valve Replacement (TAVR): First-In-Human Study. Circulation 2018.