Abstract
Extended reality (XR) solutions are quietly maturing, and novel use cases are already being investigated, particularly in the healthcare industry. The extended reality market is anticipated to be worth $209 billion by 2022. XR-assisted treatments have shown promising results in conditions and applications such as Alzheimer's disease, schizophrenia, and stroke rehabilitation (by stimulating specific areas of the patient's brain), healing brain injuries, surgeon training, realistic 3D visualization, touch-free interfaces, and teaching social skills to children with autism. Similar effects have been used in video game therapies such as Akili Interactive's EndeavorRx, which has already been approved by the Food and Drug Administration (FDA) as a treatment regimen for children with attention deficit hyperactivity disorder (ADHD). While these advances have received positive feedback, the field of XR-assisted patient treatment is still in its infancy. The growth of XR in the healthcare sphere has the potential to transform the delivery of medical services. Imagine an elderly patient in a remote setting having a consultation with a world-renowned expert without ever having to leave their house; a surgical resident performing surgery in a virtual setting at home rather than operating on cadavers in a medical facility; a nurse using a vein finder to insert an IV on the first try; or a war veteran recovering from post-traumatic stress disorder (PTSD) through cognitive treatment in a virtual world. This paper discusses the potential impact of XR in transforming the healthcare industry, its use cases and challenges, XR tools and techniques for intelligent health care, recent developments of XR in intelligent healthcare services, and the potential benefits and future aspects of XR techniques in the medical domain.
Keywords: Extended reality, Smart health care, Healthcare informatics, Pattern recognition
Introduction
People's perceptions of both physical and virtual worlds are beginning to shift due to recent technological breakthroughs. Virtual Reality (VR), Augmented Reality (AR), and Mixed Reality (MR) technologies have transformed their respective fields by changing how people interact with real and virtual items. The much-hyped XR umbrella was created by combining VR, AR, and MR. VR sits at the fully virtual end of the reality–virtuality spectrum, as it gives users the impression of being in a computer-generated virtual environment. Google, Microsoft, and Facebook aim to enter the XR space by developing their own hardware and software and focusing on the medical industry. A sense of immersion can be created using a head-mounted display (HMD) or a cave automatic virtual environment (CAVE) technique (Hu et al. 2021). Interactions in the virtual world are performed with the same bodily movements we use to express emotions and ideas, which have always been our primary means of getting things done. In a nutshell, AR technology blends virtual and real elements to enrich our lives. Overlaying virtual elements such as information and the parameters of an object is widespread, especially among designers and creativity researchers. Because the majority of AR techniques rely on mobile devices, the user's interaction with the AR system is mostly restricted to a specific area or object. MR does a better job of mixing the virtual and real-world aspects. To grasp the concept, many people consider MR to combine two particular attributes of VR and AR: heightened immersion and realism. Like VR, MR environments can fully surround their users, but like AR, they blend real-world and virtual elements (Hu et al. 2021).
The state-of-the-art literature suggests the most considerable benefits of XR technologies will be in industries like retail (Berg and Vance 2016; Bonetti et al. 2018), tourism (Griffin et al. 2017), education (Merchant et al. 2014), health care (Freeman et al. 2017a), entertainment (Lin et al. 2017), and research (Bigné et al. 2016), where Virtual Reality (VR) is poised to have a significant impact. Interest in VR devices has increased astonishingly: sales of VR headsets (HMDs) have exceeded one million in a single quarter for the first time, and VR device sales are projected to grow from $1.5 billion in 2017 to $9.1 billion by 2021 (Canalys Media alert 2017). The decreasing prices of VR headsets, as well as new models coming to market (such as the Oculus Go and HTC Vive Focus; FastCompany), will cause the use of VR to increase drastically in the future. Apart from VR, AR and MR are also included in the top 10 strategic trends for 2018 (Canalys Media alert 2017). The estimated year-over-year revenue increase from these technologies between 2016 and 2020 is enormous, rising from $2.9 billion to $61.3 billion (Canalys Media alert 2017). These figures suggest that these technologies have a bright future.
Definitions
People, places, and information are all connected through XR technology. The breakthrough could also help alleviate many people's frustrations with the limits of health care by allowing them to stay healthy while working remotely. XR refers to a set of experiences that blur the line between the actual and simulated worlds, as given in Table 1. To bring people into the experience, the technology can include visuals, audio, and potentially even scents and physical sensation. XR, a term given by Paul, blurs the lines between physical and simulated worlds. XR technology is making immersive experiences more commonplace; it has done so by reducing the importance of distance, which was a big barrier in the past. The spectrum of realities, the "Reality–Virtuality Continuum," has opened up for researchers to explore and dive deep into these virtual environments (Milgram and Kishino 1994). The continuum ranges from reality at one extreme to virtuality at the other. A Real Environment (RE) captures the whole real world, including both direct views of a real scene and indirect views seen on a video display. "Virtual Environments" (VE) are computer-generated environments built from virtual objects (which do not exist in the real world) and an interface that renders them on the device in real time. Virtual Worlds (VWs) such as Second Life provide open virtual environments in which users, represented by avatars, interact with one another in real time.
Table 1.
The extended reality spectrum
Virtual reality (VR): a digital environment in which a person may move around and interact with their surroundings in real time, resulting in a realistic sensory experience. Figure 1 depicts the VR system at the far-right end of the continuum, which is entirely made up of computer-generated content. Users are wholly engulfed by the virtual world, with no chance of seeing or interacting with the actual world. A strong sense of presence and immersion in VR simulations allows people to play out hypothetical scenarios to their hearts' content. The PlayStation VR is an excellent example. VR system setups can be grouped into three categories. The first is a smartphone-based headset, in which the phone is mounted in a simple enclosure such as a cardboard viewer. The second is the CAVE, where the walls and floors are made of multiple large screens and users are completely immersed. The final configuration uses an HMD linked to a separate computer; this setup has become popular recently because it keeps getting cheaper and better for VR experiences, 360° films, and video mapping.
Fig. 1.
Relation between real and virtual worlds in different proportions
Augmented reality (AR): the term "to augment" refers to the addition of something to something else. In augmented reality, digital additions are applied to real-world scenes with the goal of making them more engaging, and everything is rendered in a split second. Snapchat filters are the most popular AR apps, with practically everyone using them. AR systems employ computers to overlay virtual information on the actual world. AR settings have the ability to create innovative tools in a variety of fields. Pokemon Go, the massively popular smartphone game, is an excellent example of augmented reality (Extended Reality-Brief Determining Needs 2018). The ability of AR to create 3D medical models in real time and project them into remote locations boosts the technology's potential in medical training and simulation. Furthermore, real-time deformable medical models increase the applicability of the simulation. A medical 3D augmented reality app for smartphone-equipped medical professionals exists to quickly train them to administer standard but critical procedures in a safe and rapid sequence. AR systems are capable of overlaying digital content, such as information and objects, onto the real world. The end result is an enhanced experience in which the user can see and interact with the surrounding environment while having access to text, imagery, and animation. Smart glasses and handheld devices both improve the user experience. Most people will never race an Olympic 200 m, ride the wings of an airplane, or travel to Mars, but immersive technology can approximate such experiences. The IKEA Place app is a prime example of an AR application: customers use the app on their smartphones to see how products would look placed in their homes. In 2011, for the first time, Crisis Commons and OpenStreetMap responded to the Haiti earthquake with their respective applications (Extended Reality-Brief Determining Needs 2018).
Mixed reality (MR): variously called mixed, merged, hybrid, or augmented-virtuality settings, MR environments lie between the two extremes of the continuum as a hybrid of virtual and real-world settings. The intersection of physical and virtual objects gave rise to MR. Because they allow users to engage with virtual items as if those items were present in the real world, MR systems go a step beyond AR. Constructing an MR headset requires glass clear enough for the sensors to see through, along with an inbuilt computer. To allow virtual objects to interact with the user's actual environment, the real-world space is usually mapped in real time using integrated sensors. In effect, MR offers a more interactive and immersive form of AR. The Microsoft HoloLens, known for its MR applications, is one example of a widely used MR headset.
The distinctions between AR, VR, and MR can be explained using three primary characteristics: immersion, interaction, and information (Venkatesan et al. 2021). Immersion describes the level of the user's experience: while AR adds virtual information to real-world views, VR gives a totally virtual immersive experience, and MR can translate between the virtual and real worlds in real time, which gives it a spatial mapping capability. Interaction describes the type of interactions the technology makes possible: virtual objects can be manipulated in VR, AR makes it possible to interact with real-world objects, and MR allows people to interact with both virtual and physical things. Information refers to the kind of data processed during visualization: VR keeps track of virtual objects in a virtual 3D space, AR provides interactive annotation while the user performs an activity, and MR operates in 3D space and time, correlated with the user's surroundings. One can argue that every truly immersive XR experience depends on effortless interaction between the real and virtual realms, so it is vitally important to consider the user's context, including his or her physical surroundings. AR applications illustrate this well, thanks to their contextual foundations. A location-based AR experience may be triggered when the customer arrives at a specific store; in the same way, a marker-based AR experience can be triggered when the customer finds a special AR marker on a display, point-of-sale unit, or packaging (a minimal marker-detection sketch is given after Table 2). Table 2 gives a description of various XR tools and techniques on the market with the corresponding price range.
Table 2.
Selected XR technology
| Technology name | Type (VR, AR/MR) | Estimated cost, USD | Description |
|---|---|---|---|
| Microsoft HoloLens | MR | 3000 | Head-mounted wireless computer system including AR display |
| Google Cardboard | VR | 5–10 | Compact, inexpensive cardboard adapter to use smartphones as VR glasses |
| Oculus Rift | VR | 350 | High-fidelity VR headset display; requires a powerful connected computer |
| Oculus Go | VR | 200 | Standalone VR headset; works wirelessly without a computer |
| HTC Vive | VR | 500 | High-fidelity VR headset display; requires a powerful connected computer |
| Magic leap | MR | 2295 | Standalone AR headset; works wirelessly without a computer |
| Google Glass | AR | 1500 | Standalone AR headset; works wirelessly without a computer |
| zSpace | AR | 4000 | AR desktop display monitor; can be viewed by multiple users at once; requires a powerful connected computer |
| Vuzix Blade | AR | 899 | Projects information into both eyes |
| PlayStation VR | VR | 350 | Mostly for gaming; requires a PlayStation console and PlayStation Camera |
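As a concrete illustration of the marker-based triggering described above, the minimal Python sketch below (an illustrative assumption, not drawn from any of the surveyed systems) uses OpenCV's ArUco module (OpenCV ≥ 4.7 API) to watch a camera feed and flag the frame in which a printed marker appears, the point at which an AR experience would be launched; the marker ID 23 is arbitrary.

```python
# Minimal sketch: trigger an AR overlay when a printed ArUco marker is detected.
# Assumes OpenCV >= 4.7 with the ArUco module and a connected camera.
import cv2

dictionary = cv2.aruco.getPredefinedDictionary(cv2.aruco.DICT_4X4_50)
detector = cv2.aruco.ArucoDetector(dictionary, cv2.aruco.DetectorParameters())

cap = cv2.VideoCapture(0)                      # live feed from a handheld device
while True:
    ok, frame = cap.read()
    if not ok:
        break
    corners, ids, _ = detector.detectMarkers(frame)
    if ids is not None and 23 in ids.flatten():    # 23 = hypothetical marker ID
        # Marker found: draw its outline as a stand-in for launching AR content.
        cv2.aruco.drawDetectedMarkers(frame, corners, ids)
        cv2.putText(frame, "AR experience triggered", (20, 40),
                    cv2.FONT_HERSHEY_SIMPLEX, 1.0, (0, 255, 0), 2)
    cv2.imshow("marker-based AR trigger", frame)
    if cv2.waitKey(1) == 27:                   # Esc to quit
        break
cap.release()
cv2.destroyAllWindows()
```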
Extended reality in medical health domain
In recent years, virtual reality and augmented reality have provided major benefits to the healthcare business. Many people in need of assistance are benefiting from virtual reality in a variety of ways, including teaching social skills to children with autism, assisting patients with PTSD and depression, detecting early signs of schizophrenia and Alzheimer's disease, and improving the lives of patients with brain injuries. People are also using new VR therapy programs to learn relaxation and anxiety control, especially during COVID-19 quarantine. The video game sector currently generates the highest revenue from XR apps, but this may not remain the case in the future. AR has made advances in visualization that can help patients and healthcare providers during surgeries and assist clinicians in finding patients' veins. Doctors can now project images onto a patient's body in real time using AR. The healthcare field is innovating in many different ways, from experimenting with new learning styles to developing new medical devices that require no physical contact. By 2025, the healthcare industry alone is expected to exceed USD 5 billion in XR revenue, making it the second-largest and most rapidly expanding sector to use XR (Goldman Sachs Global Investment Research 2016). XR will enable patients to watch surgical procedures before their own surgeries, which will support their postoperative recovery and help surgeons communicate the planned strategy. It will also help surgeons train and practice by providing better surgical guides and experiences that reduce the cost of surgical care (Forbes Inc. 2017). To make things more transparent, XR can let doctors see data about surgeries as they are happening. It can also be used to treat mental illness and to manage the use of pain medication. XR may play into the USD 16 billion patient monitoring device market by providing a new way to monitor patient progress remotely (Grand View Research Inc. 2017).
Medical imaging is a natural fit for the AR paradigm. One of the most common applications is image-guided surgery (Lorensen et al. 1993; Grimson et al. 1995). Imaging examinations such as CT or MRI scans, performed before surgery, give the surgeon an inside view of the patient's body, and a surgical plan is made from these images. To understand the surgical route to the target area, imaging software creates a 3D model by assembling a complete picture from all the relevant views and slices. The AR system can be implemented so that the surgical team can view patient CT or MRI scans correctly positioned in the operating room and visible during the surgery (State et al. 1994). Beyond those examples, AR is also used in ultrasound imaging and optical diagnostics (Argotti et al. 2002). The healthcare industry has eagerly embraced new technological innovations in recent years; immersive virtual reality has finally been brought into the healthcare field, first in surgeries, then in medical training, patient care, and more. The multidimensional uses of XR tools and techniques in smart health care are given below (a minimal surface-reconstruction sketch follows the list):
Doctors currently rely on X-rays, ultrasounds, MRIs, and other sophisticated imaging tools to understand a patient's organs, but for hard-to-diagnose cases even these advanced imaging techniques come up short. Every patient has a distinct body and organ anatomy, which means that doctors have to tailor treatment plans to each individual. XR tools and techniques make this significantly easier: surgeons can observe organs in three dimensions and then make more accurate incisions. A successful conjoined-twin separation procedure has already taken place thanks to extended reality.
Doctors with decades of experience are perpetually time-pressed, and newcomers are left with barely any time to train. Extended reality can do tremendous work here: using holograms, experienced doctors can now train students or new doctors, with guidance, without actually being present.
Stroke patients have trouble moving and are therefore more likely to fall. The patient's recovery is supported by XR's virtual rehabilitation environment, which is motivational, task-oriented, and controlled (Wurst 2020).
To test whether XR supports recovery from surgical trauma and muscle weakness, Stanford is conducting a clinical trial of a physical therapy system that offers treatment following injury and surgery. Patients will be assessed for mobility, and XR's benefits for pain management will also be explored.
A UCLA study revealed that surgeons trained with VR goggles are faster, more accurate, and complete more steps than their non-VR-trained counterparts. The surgeons utilized Osso VR to assist with their training. In the long run, this improvement could result in thousands of lives saved.
AbbVie created a VR environment that simulates the effects of Parkinson's disease as a means of increasing awareness of the condition. Family members of Parkinson's patients may find this useful for gaining a better understanding of the disease and learning how to cope with it (Wurst 2020).
GSK created a migraine simulator to increase migraine awareness and understanding. They came up with this for an advertising project for one of their pharmaceuticals.
Bayer set up XR at a booth at a medical meeting, where attendees spent an average of 10 min at a time instead of the usual 2 min. Similar approaches could be effective in areas like product demonstrations and eDetailing.
Cedars-Sinai ran a study to look into the effects of VR-based pain relief treatments. 21% of patients who received regular immersive VR relaxation therapy were able to perform the treatment in their homes regularly and noticed significant improvements in their pain levels (Safavi and Kalis 2018).
To treat amblyopia, a condition commonly referred to as lazy eye, the Jessenius Faculty of Medicine (Slovakia) has used an Oculus Rift device loaded with a custom game designed to strengthen patients' better eyes and condition their weaker eyes.
Medical animation studio Random is exploring how pharma manufacturing processes could be visualized in XR, providing insight into the processes and helping monitor process adherence (Safavi and Kalis 2018).
Using VR headsets to explore molecular structures, scientists at Novartis practice "molecule walking" to analyze protein structures and functions in detail.
Virtual reality-assisted clinical trials: research suggests that VR experiences increase patient compliance with treatment, supporting XR's broader potential role in clinical trials.
Patients facing surgery fear pain and other potential complications afterward. With extended reality, doctors can easily locate veins for frequent injections, leading to better treatment quality and greater patient satisfaction.
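As noted above, imaging software assembles CT or MRI slices into a 3D model that surgeons can inspect in XR. The following minimal Python sketch, assuming a CT volume already stacked into a NumPy array (in Hounsfield units) and scikit-image installed, extracts a surface mesh with the classic marching cubes algorithm (Lorensen et al. 1993); the 300 HU iso-value for bone and the file name are illustrative assumptions.

```python
# Minimal sketch: turn a CT volume into a triangle mesh that an XR viewer could render.
# Assumes the volume was already assembled from DICOM slices and saved as a NumPy array.
import numpy as np
from skimage import measure

ct_volume = np.load("ct_volume.npy")   # hypothetical volume, shape (slices, rows, cols), in HU
verts, faces, normals, values = measure.marching_cubes(ct_volume, level=300)  # ~bone threshold

# The mesh (vertices + triangles) can now be handed to a 3D/AR/VR rendering engine.
print(f"surface mesh: {verts.shape[0]} vertices, {faces.shape[0]} triangles")
```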
Broadly, the XR applications in transforming healthcare services can be interpreted from the following three categories.
Distance to people
Consider a sick, elderly person in a rural location who can consult with a world-renowned physician without ever leaving home. Rather than operating on cadavers, surgical residents practice procedures using a simulator to prepare for real-world operations. Nurses utilize vein finders to ensure that IVs are inserted correctly on the first attempt. A veteran with PTSD recovers through virtual reality cognitive treatment. One such example is a condition requiring specialist treatment when the patient lives far away and only non-specialist care is available locally; XR can provide a realistic experience with a virtual physician. Eighty-two percent of health executives agree that extended reality is removing the hurdle of distance in access to people, information, and experiences (Florida Hospital Tampa 2016). For patients and their families, Florida Hospital Tampa is using virtual reality models to view the inside of a patient's brain tumor or aneurysm. Neurosurgeons can view the same model during the operation to make fixes on the spot instead of performing last-minute, drastic changes. The ability to model a surgical procedure in a three-dimensional computer simulation, together with an understanding of the medical condition in question, allows patients to make better-informed medical decisions and understand their current medical status (Florida Hospital Tampa 2016). XR can also bring healthcare providers and their students together by giving them tools that work across distance; for example, an internationally recognized sub-specialist could teach a new method to a medical resident in another country. Research thus suggests that extended reality is helping health professionals reach people, information, and experiences that were once out of reach, shrinking the distance between the delivery of value-based care and the patients who need it.
Distance to information
XR bridges the distance between consumers and clinicians and reduces the distance to the information that providers must collect. It allows a clinician to acquire extensive information, removing roadblocks to essential judgments. A surgeon, for example, may utilize augmented reality glasses to look at digital content projected over the patient without shifting attention away from the patient. Doctors can gain greater accuracy and achieve outcomes that were previously impossible by having information appear directly over their physical movements. XR is improving data accessibility and making new discoveries easier to reach. With the advent of new XR tools, information is conveyed in 3D environments, just as humans naturally perceive it, so things look and feel more familiar. This makes new visualizations possible, which can lead to fresh discoveries in health care. Surgical procedures are becoming more precise thanks to 3D mapping and imagery that serve as a "GPS system" for navigating complex anatomy. In a recent application of this technology, doctors treated a patient with minimally invasive sinus surgery; the system can help surgeons learn how to perform this procedure and support surgical planning (MobiHealthNews 2018). To better understand the extent of various diseases, it is important to view medical scans in 3D form, which is what The Body VR enables with interactive 3D scans (The Body VR 2021). Oxford researchers developed virtual reality models of genetic data to improve understanding of what goes on inside living cells (Futurism 2017). Drishti is an AI-powered solution from Accenture that helps the visually impaired enhance their experience of the world around them and better adapt to their working environment. The solution includes an app that can notify users about the number of people in a room, along with their ages, genders, and emotions. In addition, users can have it read text aloud, such as from books and documents, or identify doorways (e.g., glass doors) that could pose a threat to safety. The bulk of heterogeneous medical data makes advances in middleware for integrating data from diverse distributed sources essential. GDIS, a grid-based architecture for data integration, is a distributed processing infrastructure designed by the researchers in Comito and Talia (2004).
Distance to experiences
Perhaps the most significant contribution of XR to the healthcare business is the provision of shared and community experiences. In the past, medical practitioners could not fully relate to their patients' medical concerns because they were not sick themselves. XR has the potential to change this by allowing medical staff to acquire a sense of how it feels to be sick. Embodied Labs, for example, creates virtual reality labs for learning about seniors and services like assisted living. Its "We Are Alfred" lab enables medical students to experience what it is like to be a 74-year-old with various health issues, and the Beatriz lab takes users on a journey through progressive Alzheimer's disease. Using XR, clinicians will be able to see how debilitating mental illness is and recognize the necessity of providing care to people struggling with it. One example researchers have found compelling is the use of VR therapy to help military veterans confront post-traumatic stress disorder, which allows patients to see images or relive experiences while discussing their responses with therapists in real time. Bravemind, a virtual reality-based exposure therapy tool, was developed by the Institute for Creative Technologies at the University of Southern California in collaboration with the US government. It allows psychologically scarred veterans to face their triggers and conquer their PTSD by exposing them to environments that cause stress but are not dangerous. After receiving treatment, 80% of the patients reported a decrease in symptoms, including depression (USC Institute for Creative Technologies 2018). Younger patients can benefit as well: XR is being used in hospitals to help children cope with painful experiences like injections and dressing changes. A child about to receive an IV can visit a virtual ocean right before the procedure (CNET 2018). To teach healthcare professionals CPR techniques, Nicklaus Children's Hospital in Miami has created VR training content (Next Galaxy to Develop Virtual Reality Applications 2015).
Organization
The remainder of the paper is organized as follows: Sect. 2 delves deeply into the state-of-the-art Extended Reality technologies, tools, and strategies used to revolutionize the healthcare sector. The related survey is broken down into seven parts, each of which is further broken down into sub-components, and each component includes a thorough examination as well as interpretation and discussion. Section 4 discusses the obstacles, potentials, and future prospects of Extended Reality in the healthcare area. Section 5 contains a summary of the full work, followed by the References.
Previous related work
Robotic surgery in the surgical workplace, endoscope-assisted microsurgery in neurosurgery, pediatric surgery, and obstetrics and gynecology are just a few of the domains where AR is being used. The following part discusses the most up-to-date XR instruments and procedures in the medical field. The section is divided into seven subsections, each of which contains a complete survey and related discussion of XR in healthcare facilities (Fig. 2).
Fig. 2.
Schematic flowchart of the state-of-the-art works in XR in health care
Extended reality in medical knowledge transformation
Medical students' and trainees' anatomical learning, as well as their invasive procedural training, could be revolutionized by XR technologies. Students can use XR to investigate internal human systems and their processes of action. Additional patient-specific data, such as computed tomography (CT) scans and magnetic resonance imaging (MRI), can be easily accommodated via XR modeling. XR allows trainees to create and repeat practice sessions multiple times in a safe environment. Extended reality offers a wealth of learning and training options. One class of applications uses VR to simulate a complete setting, together with any needed instructional materials. A second class brings medical simulations in VR to trainees as the next available platform; such applications can run on the most popular consumer VR headsets. In addition, several MR applications allow multiple students to collaborate while they interact with each other and discuss educational material in a more natural setting. MR is ideal for this purpose because the software can exploit the headset's freedom to walk around and communicate freely in the augmented environment. For example, patient-facing applications could be developed to educate cardiac patients about how to manage their illness, while clinician-facing applications could help hospital staff administer treatment more effectively; the software may enable different features for each of these user groups. Applying 3D models to learning applications enables users to control the content they are studying and, in effect, their own education. XR in medical science education can thus serve the interests of both patients and students.
Patient education
Project brave heart
The VR program at Lucile Packard Children's Hospital Stanford has three distinct arms (Hospital SCsHLPCs 2018), one of which is Project Brave Heart. The project aims to give additional peace of mind to patients scheduled for cardiac catheterization procedures (Southworth et al. 2020). Patients are asked to watch a preparatory program repeatedly over the course of the week before their procedure. The number of patients varies, but most are in their teens or early twenties. Prior to the procedure, the patient virtually walks through the catheterization laboratory, cardiac surgery suite, recovery ward, and their regular hospital room via virtual reality (VR) to provide an immersive preview. Figure 3 outlines the general framework for the relationship between medical education and XR technology.
Fig. 3.
Outlines medical virtual reality applications by patient involvement, including whether the clinician or patient will be using the virtual reality
Educating and training of medical students
The body VR
The Body VR uses immersive VR for three applications: exploring the inside of a cell, viewing virtual human anatomy, and helping patients understand the colonoscopy procedure. To gain a better understanding of cells, the user takes a trip inside a blood cell, traveling through the bloodstream to learn how cells work together. The user can then dig deeper into the cell, learning about its internal structures and how they relate to cellular functions and actions. The DICOM Viewer can be used to examine medical scans from various imaging machines such as MRI, CT, and PET, and the results are viewed with an Oculus Rift or HTC Vive virtual reality headset. Gong et al. (2021) designed a training program in which medical personnel learn to perform intubations with hand–eye coordination, using AR concepts. A novel adaptive synchronization algorithm (ASA) is used to maintain the shared state of the collaborative AR environment; it increases the sense of presence among participants and thus allows them to interact despite any delays caused by the infrastructure.
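The DICOM Viewer mentioned above works from standard DICOM series. As a minimal, hedged illustration of the underlying step of assembling slices into a volume (not The Body VR's actual code), the following Python sketch uses pydicom and NumPy; the folder name "ct_series" is an illustrative assumption.

```python
# Minimal sketch: stack a folder of axial DICOM slices into a 3D volume in Hounsfield units.
# Assumes a single-series folder of .dcm files; pydicom and numpy are the only dependencies.
from pathlib import Path
import numpy as np
import pydicom

slices = [pydicom.dcmread(p) for p in Path("ct_series").glob("*.dcm")]
slices.sort(key=lambda s: float(s.ImagePositionPatient[2]))        # order along the z axis

volume = np.stack([s.pixel_array for s in slices]).astype(np.int16)
# Convert stored pixel values to Hounsfield units using the DICOM rescale tags (default 1/0).
slope = float(getattr(slices[0], "RescaleSlope", 1.0))
intercept = float(getattr(slices[0], "RescaleIntercept", 0.0))
volume = volume * slope + intercept
print("volume shape (slices, rows, cols):", volume.shape)
```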
Stanford virtual heart
Lighthaus, Inc. partnered with Stanford University to provide education about the human heart with the help of virtual reality, in a project known as the Stanford Virtual Heart Project (Stanford Children's Health 2020). This project has several unique components. The first focuses on educating patients and their families about their child's cardiac anatomy, which at the moment is typically done with plastic models and drawings. Stanford medical students and trainees can see what cardiac anatomy looks like and how abnormalities in it affect the body's functions. The students can walk through, examine, and move the models around to gain a better understanding of internal organs and physiology. The trainees have a library of about two dozen common congenital lesions at their disposal, and the goal of these training exercises is to increase comprehension of these diseases and the changes they produce in body physiology. EchoPixel, a 3D monitor, is used as the final step in the cardiothoracic operating room. 3D equipment in the surgical suite might make it possible to perform detailed assessments of intracardiac anatomy and geometry after the patient has been placed on cardiopulmonary bypass and the heart has been deflated, when the anatomy would otherwise be difficult to see. Even though XR is not used routinely in medical education at the moment, there have been a significant number of experiments and pilots involving anatomy; Table 3 shows several examples of these pilots.
Table 3.
Examples of XR anatomy innovations at medical schools
| Program | XR type, level of learners, and focus | School |
|---|---|---|
| Stanford Neurosurgical Simulation and Virtual Reality Center (Stanford Medicine, Neurosurgery 2019) | XR Type: VR level of learners: medical students, residents, surgeons Focus: neuroanatomy, neurosurgical procedure training | Stanford |
| HoloAnatomy with Microsoft Hololens (Workman 2018) | XR Type: MR level of learners: medical students Focus: general anatomy | Case Western/Cleveland Clinic |
| Immersive Education at CHLA with Oculus Go (Oculus 2018) | XR Type: VR level of learners: all incoming residents, optional for medical students Focus: pediatric trauma procedures, pediatric resuscitation training | Children’s Hospital of Los Angeles |
| Virtual Reality Anatomy at USCF with HTC Vive (UCSF VR 2021) | XR Type: VR level of learners: first-year medical students Focus: general anatomy | University of California, San Francisco |
| Enduvo VR Teaching and Learning Platform using HTC Vive (The University of Illinois 2021) | XR: VR level of learners: medical students, surgeons, faculty Focus: general anatomy | University of Illinois College of Medicine Peoria |
Extended reality for cardiac applications
Various cardiac applications of virtual reality are depicted in this section.
Medical student training cardiac
HoloAnatomy
Microsoft's HoloLens is being used at Case Western Reserve University to enhance anatomy learning among medical students by tailoring instruction and observation (Sarah et al. 2019; Case Western Reserve, Cleveland Clinic 2021). Students who can better comprehend 3D anatomic relationships find that learning becomes less frustrating and more enjoyable because they are able to "think like a doctor." The team has developed HoloAnatomy, a program that allows medical students to perform holographic dissections to better understand the body's organs and systems. The program is a joint effort of the university and the Cleveland Clinic, and a public demo is available.
Anima res
Anima Res is a company that specializes in creating 3D medical animations for AR, MR, and VR (Butler et al. 2018). The team's goal is to make medical education more relevant to doctors, medical students, and patients. In particular, "Insight Heart" gives users a visceral understanding of the human heart by using immersive visual effects to display atrial fibrillation, systemic hypertension, and myocardial infarction in three-dimensional space. The experience is available on many different XR platforms.
Simulators
Standalone training applications have also been developed through hardware integration (Talbot et al. 2017). The Vimedix transesophageal echocardiogram (TEE) and transthoracic echocardiogram (TTE) simulator by CAE demonstrates how mixed reality can be used in a variety of different medical applications. Using the MR simulator gives students insight into their anatomical relationships, and they are able to understand how to position the probe.
Pre-procedural planning
The use of these systems has also been validated; they have been proven both clinically effective and capable of supporting pre-procedural planning.
EchoPixel
True 3D, developed by EchoPixel, is an innovative DICOM workstation that includes the first DICOM-certified 3D system to be approved by the FDA (Chan et al. 2013). 3D visualization is accomplished using a technique similar to that of 3D movie theaters and early 3D consumer televisions, which provides a different image to each eye through glasses equipped with small liquid crystal displays (LCDs). A user wearing polarized glasses can manipulate the image with a handheld wand. EchoPixel, which can show the arteries of people who have pulmonary atresia with significant collateral vessels to the heart, has been utilized in preliminary cardiology research. Doctors using the True 3D display interpreted findings more quickly than those who used a traditional display, finishing in 13 min as opposed to 22 min, and the interpretations were just as accurate when compared to catheter angiography.
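The stereoscopic principle described above, delivering a slightly different image to each eye, can be sketched with a simple two-camera projection. The following minimal Python example (illustrative values only, not EchoPixel's implementation) projects a toy 3D point cloud through left- and right-eye pinhole cameras separated by an inter-pupillary distance and reports the horizontal disparity that creates the impression of depth.

```python
# Minimal sketch of stereoscopic rendering: one pinhole projection per eye.
# All values (focal length, inter-pupillary distance, point cloud) are illustrative.
import numpy as np

def project(points, eye_x, focal=1000.0):
    """Project Nx3 points (mm) through a pinhole camera located at x = eye_x."""
    shifted = points - np.array([eye_x, 0.0, 0.0])
    return focal * shifted[:, :2] / shifted[:, 2:3]      # perspective divide by depth

rng = np.random.default_rng(0)
# A toy "anatomy" point cloud centered 600 mm in front of the viewer.
cloud = rng.normal([0.0, 0.0, 600.0], [40.0, 40.0, 20.0], size=(500, 3))

ipd = 63.0                                  # average inter-pupillary distance, mm
left_image = project(cloud, -ipd / 2)       # image shown to the left eye
right_image = project(cloud, +ipd / 2)      # image shown to the right eye

disparity = left_image[:, 0] - right_image[:, 0]   # horizontal offset encodes depth
print("mean horizontal disparity:", float(disparity.mean()))
```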
Intraprocedural visualization
Enhanced electrophysiology visualization and interaction system (ELVIS)
Though better visualization has been a major research and development push for many years, equal advances in interaction have not been made. With ELVIS, the interventional electrophysiologist can see real-time, patient-specific 3D cardiac geometry and view catheter locations on that display without breaking sterility, which is critical in this field (Silva et al. 2017). ELVIS shows data from an electroanatomic mapping system (EAMS) as well as computed tomography or cardiac magnetic resonance imaging acquired before the procedure. A recently added capability allows viewing past cases as well as live ones from the control room. In addition to making the content easier to see, the system enables sterile control of the display via gestures, gaze, or voice. The new interaction method is well suited to interventions, as it allows the interventionalist to commandeer the unified model in the way that is most advantageous for the task at hand. A shared cardiac holographic model is located in the room and lets each user look at the model from his or her own perspective while they operate on it; control can be passed to another user at any time, with only one person in control at a time.
Realview imaging
In 2016, the pediatric cardiology group at Schneider Children's Medical Center successfully used 3D imaging software from RealView Imaging (Yokneam, Israel) to analyze whether real-time holograms are feasible in a standard cardiac catheterization laboratory (Silva et al. 2018). The RealView CGH is a set of computational holograms created by combining 3D rotational angiography with transesophageal echocardiography. The study included participants with preexisting heart disease as well as post-surgery patients. Using an image-marking process rated "very easy," real-time 3D holograms of high accuracy were generated for all patients and could be cropped, zoomed, rotated, moved, and even sliced.
EchoPixel
The EchoPixel system has also been used intraprocedurally. The third arm of Stanford's VR program includes three-dimensional VR imaging, which aids surgeons in pre-procedural planning for various types of cardiac surgery; before performing the procedure, the cardiothoracic (CT) surgeons use this arm to perform virtual run-throughs (Wired 2017). The True3D technology is made by EchoPixel, a California-based company, and uses a 1080p, active 3D VR display and stylus built by Hewlett Packard Enterprise. EchoPixel's software, paired with the monitor-style display, 3D glasses, and a stylus, allows a user to interact with data in 3D; the stylus lets the user rotate, cut into, and measure the parts of the anatomy they are working on (Wired 2017).
Rehabilitation
MindMaze
The VR company MindMaze develops hardware and software together to create neurorehabilitation apps (Chevalley et al. 2015). MindMotion PRO, its solution approved by the US Food and Drug Administration, is being used in the post-stroke patient population to enhance upper-limb mobility by combining virtual reality, brain imaging, and gaming technologies. One study involved post-stroke patients who received 20–30-min training sessions starting on day 4 of hospitalization, allowing rehabilitation to begin early in recovery. Almost everyone who used the MindMotion PRO reported an improvement in movement capability of about 90 percent (Chevalley et al. 2015).
SentiAR
The SentiAR solution is currently in development for use in electrophysiology laboratories, with the possibility of extending to other cardiac interventional procedures later. Among the electrophysiology laboratory's current restrictions is that each piece of equipment uses its own control panel and display, and no equipment interfaces with any other. The complex 3D electroanatomic mapping data set is compressed onto a 2D screen. The SentiAR system utilizes electroanatomic mapping data from a commercially available system and projects patient-specific, real-time geometries, cardiac and electroanatomic mapping, and catheter locations in stereoscopic 3D on a Microsoft HoloLens (720p).
Presurgical and intraoperative augmented reality in neuro-oncologic surgery
One of the most-studied applications of 3D models in surgical planning is 3D printing, wherein 3D models of anatomy and pathology derived from medical images are manufactured using a 3D printer. One advantage of AR/VR over 3D printing is that it can additionally provide simultaneous display of real and virtual images. Table 4 summarizes clinical studies of XR applications in neuro-oncologic surgery.
Table 4.
Clinical Summary of XR in neuro-oncologic surgery
| Authors | Surgical purpose | Sample size | Article objective | Brain tumor classification | Analysis software | Clinical outcomes |
|---|---|---|---|---|---|---|
| Gerard et al. (2018) | Intraoperative guidance | 8 patients | To evaluate the feasibility of combining intraoperative ultrasound and AR in tumor surgery | Meningioma × 2, glioma × 4, metastases × 2 | IBIS (Drouin and Kochanowska, Montreal, Canada) | Not reported |
| Finger et al. (2017) | Neuroendoscopy and presurgical planning of biopsy and other procedures | 28 total patients with 14 having the underlying oncologic disease | To evaluate AR-enhanced navigated neuroendoscopy system for intraventricular pathologies | Various periventricular tumors | Nova Plan 2.6.10 (Scopis, Berlin, Germany) | Not reported |
| Chen et al. (2017) | Presurgical planning | 16 patients | To assess an AR system using mobile devices for presurgical planning of supratentorial lesions | Parietal, temporal, and frontal lesions (meningioma × 15, glioma × 1) | 3D Slicer 4.0 (Surgical Planning Laboratory, Brigham and Women’s Hospital, Boston, Massachusetts, USA) | Not reported |
| Sun et al. (2016) | Presurgical planning and intraoperative guidance | 79 patients with functional neuronavigation and intraoperative MRI and 55 control patients | To investigate the utility of combined VR and AR for intraoperative MRI and neuronavigation in glioma surgery | Glioma | iPlan 2.6 (BrainLab AG, Munich, Germany) | 69.6% of the study group achieved complete resection, with an average extent of 95.2%, compared with 36.4% and 84.9% in the control group; language, motor, and vision preservation were significantly higher in the study group |
| Watanabe et al. (2016) | Intraoperative guidance | 6 patients | To assess AR-based navigation system with whole operating room tracking | Various tumors | Amira (FEI, Hillsboro, Oregon, USA) | Not reported |
| Rotariu et al. (2017) | Neuroendoscopy and presurgical planning | 22 patients | To evaluate the role and accuracy of virtual endoscopy for presurgical assessment | Pituitary adenomas | OsiriX (Pixmeo SARL, Bernex, Switzerland) | Not reported |
| Tabrizi and Mahvash (2015) | Intraoperative guidance | 5 patients | To intraoperatively evaluate a novel AR neuronavigation system | 3 metastases, 2 glioblastoma | MRIcro (Chris Rorden, Columbia, South Carolina, USA) | All tumors were successfully removed with no complications |
| Inoue et al. (2015) | Presurgical planning | 99 patients | To assess the utility of a 3D CT model for obtaining preoperative information regarding sphenoidal sinus procedures | Pituitary adenomas | 3D Advantage Workstation Volume Share 4 (GE Healthcare, Wauwatosa, Wisconsin, USA) | Not reported |
| Inoue et al. (2013) | Intraoperative guidance | 3 patients | To assess a novel AR neuronavigation system using web cameras | Glioblastoma × 2, meningiomas × 2 | 3D Slicer | No new neurologic deficits occurred; 2 of 3 tumors were successfully removed in their entirety |
| Stadie and Kockro (2013) | Presurgical planning | 208 patients (Dextroscope) and 33 patients (Setred) | To report experiences with 2 different VR systems | Various tumors | Dextroscope (Volume Interactions Pte. Ltd., Singapore, Singapore) and Setred system (Setred, Stockholm, Sweden) | Not reported |
| Wang et al. (2012) | Presurgical planning | 60 patients | To examine the utility of VR in planning sellar region tumor resections | Various tumors in the sellar region | Dextroscope | In a selected group of 30 participants, hormone levels and vision improved; complications including CSF leakage and diabetes insipidus were noted in 5 patients |
| Stadie et al. (2011) | Craniotomy placement | 48 patients | To describe the method of defining the placement of the craniotomy for minimally invasive procedures | Various tumors | Dextroscope | Not reported |
| Low et al. (2010) | Presurgical planning | 5 patients | To assess the utility of AR surgical navigation for resection of meningiomas | Parasagittal, falcine and convexity meningiomas | Dextroscope | 4 of 5 patients had complete resection; 1 patient had near-total excision; all patients had good neurologic recovery |
| Qiu et al. (2010) | Presurgical planning | 45 patients | To assess the utility of VR presurgical planning using DTI tractography for cerebral gliomas with pyramidal tract involvement | Cerebral gliomas involving pyramidal tracts | Dextroscope | Gross tumor resection in 33 of 45 (73%) patients and subtotal resection in 6 (13%) patients; 7 of 45 (16%) patients had improved motor function, and 30 of 45 (67%) patients had no change |
| Ferroli et al. (2013) | Presurgical planning | 64 patients | To assess clinical experience using stereoscopic virtual reality for surgical planning | Various tumors | Dextroscope | Not reported |
| Yang et al. (2009) | Presurgical planning | 42 patients in VR group and in the control group | To evaluate the outcome of presurgical planning using Dextroscope in patients with skull base tumors | Meningioma × 15, schwannoma × 15, other × 12 | Dextroscope | Total resection rate was 83% in VR group compared with 71% in the control group; complication rate, length of postoperative stay, and surgery duration were significantly reduced in VR group |
| Stadie et al. (2008) | Presurgical planning | 106 total cases, including 100 cranial lesions | To report on experiences with 3D virtual reality systems for minimally invasive surgical planning | Various tumors | Dextroscope | Not reported |
| Anil et al. (2007) | Presurgical planning | 1 patient | To report preoperative planning with Dextroscope for fourth ventricular ependymoma | Fourth ventricular ependymoma | Dextroscope | Tumor was entirely removed with patient having no immediate postoperative neurologic deficits |
| Rosahl et al. (2006) | Presurgical planning | 110 patients | To investigate the usefulness of VR in image guidance for skull base procedures | Various tumors | Image Guidance Laboratories (Stanford University, Stanford, California, USA) | Not reported |
Presurgical planning
The advantages of adopting virtual reality and augmented reality for surgical planning have long been acknowledged, with the greatest benefit shown in circumstances where navigation systems have failed to register effectively. In neuro-oncology procedures, targeted techniques are critical, and their effectiveness is strongly reliant on the tumor's precise location (Orringer et al. 2012). Stadie et al. (2011) compared the success rates of cranial surgery planned by simulating each technique with the Dextroscope virtual planning station and with VectorVision, and found the techniques equivalent for localizing the craniotomy. Nonetheless, neuronavigation produced inaccurate or failed results 3% of the time (out of 48 cases); in those scenarios, minimally invasive surgery was completed successfully using preoperative VR planning. Apart from patient factors, such as brain swelling and fluid leakage, 3D imaging systems can be prone to errors in probe tracking and image-to-patient registration (Orringer et al. 2012; Widmann et al. 2012). Through better visualization of anatomic features, VR/AR has the potential to improve neuronavigation processes in novel case-based applications.
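Image-to-patient registration, mentioned above as an error source, is commonly performed by rigidly aligning corresponding fiducial points. As a hedged, minimal illustration (not the method of any specific system cited here), the following Python sketch implements point-based rigid registration with the Kabsch/SVD solution and reports the mean fiducial registration error; the fiducial coordinates are made up.

```python
# Minimal sketch of point-based rigid registration (Kabsch/SVD) between
# image-space and patient-space fiducials; all coordinates are illustrative.
import numpy as np

def rigid_register(image_pts, patient_pts):
    """Return rotation R and translation t such that R @ p + t maps image points onto patient points."""
    ci, cp = image_pts.mean(axis=0), patient_pts.mean(axis=0)
    H = (image_pts - ci).T @ (patient_pts - cp)          # 3x3 cross-covariance
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))               # guard against a reflection
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = cp - R @ ci
    return R, t

image_pts = np.array([[0, 0, 0], [10, 0, 0], [0, 10, 0], [0, 0, 10]], dtype=float)
true_R = np.array([[0, -1, 0], [1, 0, 0], [0, 0, 1]], dtype=float)   # 90° rotation about z
patient_pts = image_pts @ true_R.T + np.array([5.0, 2.0, 1.0])

R, t = rigid_register(image_pts, patient_pts)
fre = np.linalg.norm(image_pts @ R.T + t - patient_pts, axis=1).mean()
print("mean fiducial registration error (should be ~0):", round(float(fre), 6))
```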
Intraoperative image-guided surgical resection
An AR-based surgical system can display anatomic and functional imaging together. The system developed by Besharati Tabrizi and Mahvash (Tabrizi and Mahvash 2015) is being used in many hospitals; it helps surgeons by projecting a virtual image onto the patient's skull or brain so that it is visible in real time. Although all tumors were removed, the imaging system showed more discrepancies than tumor navigation with regard to defining tumor edges and other tumor parameters. A higher rate of complete glioma resection has been achieved by combining VR/AR protocols with functional neuronavigation and intraoperative MRI. Different tools from the Brainlab software suite were used to improve navigation of the tumor through 3D visualization and tracing of critical structures. The neurosurgeon used AR to see a 3D view or 2D image section superimposed on his or her view of the surgical field. Intraoperative MRI made complete tumor removal possible in more patients using this technique (69.6 percent compared with 36.4 percent in the control group). Most significant among the benefits of intraoperative MRI, the technique accounts for brain shift, the deformation of the brain that results from factors such as brain swelling and cerebrospinal fluid loss. Another tool under investigation is intraoperative ultrasound, which may also enable better measurement of brain shift (Nimsky et al. 2001). A different study used the augmented reality surgical navigation platform DEXRay in combination with presurgical planning in the Dextroscope to aid in the successful resection of meningiomas in the falcine, convexity, and parasagittal regions.
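At its simplest, the real-time superimposition described above amounts to blending a registered virtual rendering with the live camera view. The sketch below is a hedged, minimal Python illustration (not the cited system's implementation); it assumes the virtual rendering is already registered to the camera frame, and the file name and blend weights are illustrative.

```python
# Minimal sketch: alpha-blend a pre-rendered, pre-registered virtual image
# (e.g. a tumor rendering) over a live camera view.
import cv2

overlay = cv2.imread("tumor_overlay.png")          # hypothetical registered rendering (BGR)

cap = cv2.VideoCapture(0)
while True:
    ok, frame = cap.read()
    if not ok:
        break
    virt = cv2.resize(overlay, (frame.shape[1], frame.shape[0]))
    fused = cv2.addWeighted(frame, 0.65, virt, 0.35, 0)   # 65% real view, 35% virtual
    cv2.imshow("AR overlay", fused)
    if cv2.waitKey(1) == 27:                               # Esc to quit
        break
cap.release()
cv2.destroyAllWindows()
```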
Augmented reality in neuroendoscopy and skull base neurosurgery
Stereotactic biopsies of brain tumors pose problems, particularly when the tumor is located in or near the ventricles or at the boundary of the ventricles and cortex; neuroendoscopic approaches are best suited for these tumors (Chrastina et al. 2012). Finger et al. (2017) analyzed the accuracy and value of an AR-assisted neuroendoscopy system for identifying intraventricular pathologies. The region of interest was superimposed as AR on the endoscopic field of view after virtual planning in the neuronavigation system, and all biopsies achieved a 100% diagnostic yield. Surgery on the skull base relies heavily on visualization, which is why stereoscopic visualization is important for imaging deep-seated tumors. VR has made it possible to simulate this surgical technique and the associated training, which were difficult to simulate in the past. The accuracy of virtual endoscopy images is especially high for surgical planning of transsphenoidal pituitary adenoma surgery. This method of investigation uncovered a wealth of knowledge about the nasal cavity's structures and landmarks, including the carotid prominence and the sphenoid septa, giving medical professionals the guidance they needed to determine the best surgical plan and strategy (Mikhail et al. 2019). In their prospective, randomized clinical trial, Yang et al. (2009) focused on using VR in presurgical planning of skull base tumor resection procedures. The technology significantly reduced surgical time and postoperative stay, as well as complications from cerebrovascular injury. According to Schwam et al. (2021), augmented reality may also be of use in lateral skull base surgery, and more research is needed into its potential to assist with skull base procedures.
VR/AR uses in functional neuroimaging
The usage of virtual and augmented reality for imaging has expanded, with the goal of successfully completing all surgical tasks while preserving patients' quality of life (Zhao et al. 2012). During brain tumor surgery, lesions in eloquent brain areas, such as the primary motor cortex and subcortical motor pathways, which are difficult to map and diagnose, are more likely to be impacted (Amidei and Kushner 2015). Diffusion tensor imaging tractography can be used to visualize key white matter tracts and to provide information on the tumor's location; T1-weighted MRI, T2-weighted MRI, and diffusion tensor imaging are some of the imaging techniques that could be used to implement this. Understanding the interrelationship between white matter tracts and glioma lesions is aided by fiber tracking and glioma segmentation. This method proved very effective: it increased surgical accuracy, minimizing the potential for damage to motor function, and ensured complete tumor removal without worsening neurovascular function in patients whose gliomas or metastases were close to or touching the pyramidal tract, including one patient with a tumor located in the corticospinal tract. Using 3D Slicer, Inoue et al. (2013) designed a 3D AR neuronavigation system capable of adding tumors, vascular structures, and tractography to images obtained via web cameras.
Extended reality in neurodegenerative disorders
PTSD
Post-traumatic stress disorder (PTSD) is pervasive among veterans of numerous wars, which has prompted the search for successful treatments, including virtual reality immersion therapy (TERV) among other methods (Rothbaum et al. 1999). In that early work, sessions were set in a virtual Vietnam, with jungle, rice fields, and rivers, where the patient could move and act on his own. Details like helicopter sounds, blasts from explosions, gunfire, and flashes of light were added, and fog and soldiers shouting orders made the battle even more real. In a second simulation, taking off from a jungle clearing, the patient had the virtual experience of flying in a Huey helicopter over Vietnam. Improvements with this treatment approach lasted more than 6 months. The application of VR for the treatment of PTSD after the World Trade Center (WTC) terrorist attacks was first studied by Difede and Hoffman (2002) in a case study of a survivor suffering from PTSD who had previously failed conventional exposure therapy (Table 5).
Table 5.
Studies on the treatment of post-traumatic stress disorder (PTSD) using virtual reality
| Authors | Study type | Number of participants | Techniques applied | Results |
|---|---|---|---|---|
| Difede and Hoffman (2002) | Case study | 1 | VRET | 90% reduction in symptoms of PTSD, 83% reduction in symptoms of depression |
| Beck et al. (2007) | Case study | 6 | VRET, relaxation techniques, in vivo exposure and in imago exposure | Reduction of PTSD symptoms; no significant reduction in anxiety and depression symptoms |
| Rothbaum et al. (1999) | Case study | 1 | VRET | 45% reduction in PTSD symptoms; gains retained beyond 6 months |
| McLay et al. (2011) | Controlled randomized trial | 20 | Group A: VRET, cognitive restructuring. Group B: pharmacotherapy and group therapy | 70% of those who underwent VRET showed > 30% improvement of PTSD symptoms compared to 12.5% of group B |
The VR treatment involved short one-hour sessions in which the patient was introduced to different virtual plane crashes, including an approximation of the collapse of the twin towers; after finishing the treatment, the patient's symptoms had significantly diminished. Using virtual reality, Beck et al. (2007) treated six people with PTSD resulting from road traffic accidents; those in the treatment group, who completed 10 sessions in the virtual environment, showed a decrease in PTSD symptoms. McLay et al. (2011) performed the first randomized controlled trial of VR exposure combined with cognitive restructuring, studying the program in active-duty military personnel with post-combat PTSD and comparing it to standard pharmacotherapy and group therapy. After 10 weeks, patients who received the VR-based treatment showed greater improvement than the control group. The successful treatment of 11 September survivors with TERV was described in an academic study (Difede and Hoffman 2002), which was followed by a larger research effort on a pool of 13 survivors; this work also revealed therapeutic effects beyond those of the control group. Recent veterans have benefitted significantly from TERV, which helps reduce their anxiety and PTSD symptoms (Difede and Hoffman 2002). A more recent clinical trial compared classic CBT to TERV in ten subjects with PTSD and showed no significant difference between the therapeutic effects of the two, with a slight preference for TERV (Dayan 2006). Treatment of road accident victims is another major topic: the VR therapeutic system of the Argaman Virtual Reality Software Suite has been reported as an effective treatment option for PTSD, especially after exposure to extreme terror and traumatic experiences (Dayan 2006).
Panic disorder
North et al. (1996) studied the desensitization process of panic disorder with agoraphobia in 30 students using VR. Jang et al. (2000) conducted an open, uncontrolled study to determine the benefits of virtual reality therapy in people with panic disorder with agoraphobia. Panic disorder with agoraphobia can be treated by the Experiential-Cognitive Therapy developed by Vincelli et al. (2003a) (Table 6).
Table 6.
Studies on the treatment of panic disorder using virtual reality
| Authors | Study type | Number of participants | Techniques applied | Results |
|---|---|---|---|---|
| North et al. (1996) | Controlled study | 60 | Group A: VRET. Group B: no treatment | Significant reduction in symptoms in the group receiving VRET; the control group showed no improvement |
| Jang et al. (2000) | Open, uncontrolled study | 7 | VRET | The effectiveness of VRET was not supported |
| Vincelli et al. (2003a) | Controlled randomized study | 12 | Group A: BHT. Group B: GCCS. Group C: Waiting list | Similar reduction in panic attack, anxiety, and depression symptoms with both treatments; 33% fewer sessions with BHT; both superior to the waiting list |
| Choi et al. (2005) | Controlled randomized study | 40 | Group A: BHT, Group B: GCCS | Significant reduction in panic attack symptoms with both treatments |
| Botella et al. (2007a) | Controlled randomized study | 37 | Group A: VRET, Group B: In vivo exposure, Group C: Waiting list | Similar reduction in panic attack symptoms with both treatments and superiority of both over the waiting list; treatment gains maintained 12 months later |
Experiential-cognitive therapy is a complex cognitive-behavioral strategy that uses virtual reality as its point of integration. The virtual environment included a large square, a supermarket, and several other large public spaces. Researchers randomly assigned 12 panic disorder patients with agoraphobia to 8 sessions of experiential-cognitive therapy, 12 sessions of GCCS, or a waiting list. The study found that experiential-cognitive therapy required fewer sessions than CBT and, within a few months, achieved the same results as CBT in reducing anxiety, depression, and panic attacks. Choi et al. (2005) compared the two approaches and found that brief experiential-cognitive therapy was more efficient, while the results showed a significant improvement in symptoms regardless of treatment type. Botella et al. (2007a) selected 37 patients who met the DSM-IV criteria for panic disorder with agoraphobia and randomized them to three 9-session conditions: exposure to virtual reality, exposure in vivo, and no treatment. Subjects treated with virtual reality showed significantly better improvement in their symptoms than subjects on the waiting-list control.
Special phobias
Virtual reality had its first clinical trials in the 1990s and remains in use as a therapeutic tool for anxiety disorders to this day; a string of tests and studies has been conducted to evaluate its effectiveness (Gorini and Riva 2008). Most of this work focuses on anxiety disorders, but some studies extend to eating disorders, substance dependence, and pain control, and a few even cover palliative care and rehabilitation (Tarnanas et al. 2009).
Flight phobia
Many people suffer from a fear of flying that prevents them from traveling or leaves them too anxious to travel; many with severe anxiety are able to fly only by resorting to alcohol and medication to cope with their symptoms (Roberts 1989). It is believed that between 10 and 25% of the general population has this disorder. In 2000, a team led by Rothbaum et al. (2000a) researched the impact of an exposure-based treatment in 49 patients with an extreme fear of flying. The comparison group received standard exposure treatment, which included a session working in an actual airplane at a local airport, and a third group was placed on a waiting list. The treatment was delivered in 8 sessions of 60–90 min over 8 weeks. Participants' willingness to fly and their anxiety were measured as they prepared to board an actual plane. Both virtual reality exposure and real-life exposure proved superior to the waiting list, and the two treatments were equally effective (Rothbaum et al. 2002). Rothbaum and her group found similar results in 2006 when they replicated the study (Rothbaum et al. 2006). The results of these studies are laid out in Table 7.
Table 7.
Studies on the treatment of fear of flying using virtual reality
| Authors | Study type | Number of participants | Techniques applied | Results |
|---|---|---|---|---|
| Rothbaum et al. (2000a) | Controlled randomized study | 45 | Group A: VRET, Group B: GCS, Group C: Waiting list | VRET and GCS were superior to the waiting list, with no significant difference between the two treatments. Gains were maintained 6 months later |
| Rothbaum et al. (2002) (replication of the above study) | Controlled randomized study | 83 | Group A: VRET, Group B: GCS, Group C: Waiting list | 75 completed treatment. Confirmation of the results of the previous study; therapeutic benefits maintained 6 and 12 months later |
| Wiederhold et al. (2001) | Controlled randomized study | 30 | Group A: VRET with physiological feedback, Group B: VRET without physiological feedback, Group C: In imago exposure | VRET with feedback: 100% effectiveness; VRET without feedback: 80% effectiveness; both superior to in imago exposure (20%) |
| Mühlberger et al. (2003) | Controlled randomized study | 45 | Group A: VRET with motion (flight simulation), Group B: VRET without motion, Group C: TTH | Reduction of symptoms was observed only with VRET, with or without motion. Therapeutic benefits were maintained 6 months later |
| Krijn et al. (2007) | Controlled randomized study | 83 | Group A: VRET, Group B: GCS, Group C: Bibliotherapy without therapist contact. All groups received an additional 2 days of group cognitive-behavioral therapy (CrCB) after treatment | 59 completed treatment. VRET and GCS were superior to bibliotherapy; GCS followed by CrCB showed the greatest efficacy |
The first experiment comparing exposure in VR with imaginal exposure was conducted by Wiederhold et al. (2001), who studied 30 people with a fear of flying. One group experienced a VR simulation while receiving feedback on their physiological state (heart rate, sweating, etc.), a second group experienced VR without knowledge of their bodily status, and a third group underwent imaginal exposure. After treatment ended, 20% of those who received imaginal exposure were able to fly without medication, compared with 80% of those treated with VR without physiological recordings and 100% of those treated with VR with physiological recordings. A 2003 study by Mühlberger et al. (2003) of 45 patients with flight phobia found, at the 6-month evaluation, that the virtual environment relieved symptoms only when the patient was presented with both visual and auditory stimuli. The study by Krijn et al. (2007), which examined the effects of VR exposure, real-world exposure, and bibliotherapy, involved 83 people with a phobia of flying; two days of group cognitive therapy were then given to all groups. VR exposure therapy and group cognitive therapy proved more effective than bibliotherapy, with the greatest success in helping participants overcome their fears.
Social phobia
North et al. (1998) were the first to treat social phobia using virtual reality, in 1998. People with social phobia were exposed for 6 weeks to a virtual public-speaking environment, and those who received this exposure were compared with those exposed to a neutral sham environment. The relevant results are presented in Table 8.
Table 8.
Studies on the Treatment of Social Phobia using Virtual Reality
| Authors | Study type | Number of participants | Techniques applied | Results |
|---|---|---|---|---|
| North et al. (1998) | Controlled study | 16 | Group A: VRET public speaking, Group B: VRET speech before a neutral audience | 14 completed treatment. Significant improvement of symptoms only in group A |
| Pertaub et al. (1999) | Pilot study | 10 | Group A: VRET before an emotionally neutral audience, Group B: VRET before a hostile/friendly audience | Stressful reactions were elicited primarily by the hostile virtual audience |
| Harris et al. (2002) | Controlled randomized trial | 14 | Group A: VRET, Group B: Waiting list | The superiority of VRET over waiting list |
| Roy et al. (2003) | Clinical trial (within groups design) | 10 | VRET in 4 conditions: performance, control, familiarity, and confidence | Improvement of symptoms |
| Klinger et al. (2004a) | Clinical trial | 10 | VRET in 4 conditions: performance, control, intimacy and confidence | Improvement of symptoms |
| Klinger et al. (2004b) | Controlled trial | 36 | Group A: VRET, Group B: GCS | Significant improvement of symptoms in both treatments |
Findings revealed that those who completed the virtual public-speaking exposure experienced significant improvements. A pilot study by Pertaub et al. (1999) had 10 participants speak in front of a virtual audience that displayed either overtly positive or overtly hostile reactions; the study set out to determine which type of virtual audience could provoke social phobia symptoms. The results showed that participants' anxious reactions were provoked mainly by the hostile virtual audience. The study by Harris et al. (2002), which exposed eight individuals with a public-speaking phobia to virtual reality and compared the results with a control group, concluded that exposure to virtual reality was effective. Roy et al. (2003) and Klinger et al. (2004a) introduced the treatment of social phobia with virtual reality in 2003; the virtual environments imitated four major situations in the lives of patients with social phobia: performance, intimacy, control, and assertiveness. Following this, the same researchers (Klinger et al. 2004b) published a controlled, non-randomized version of the study in which they found that virtual reality treatment aided recovery as effectively as GCS.
Acrophobia
TERV was first applied to acrophobia in 1993, which laid a platform for collaboration between computer scientists and psychotherapists. The first test of TERV was conducted in 1995 on a nineteen-year-old patient suffering from acrophobia, and the researchers found that the patient's acrophobia improved (Rothbaum et al. 2000b, 1995). The same team conducted another experiment that year with seventeen acrophobic individuals, which also proved fruitful. Further investigations exposed groups of 10 and 33 acrophobes to exposure therapy (Emmelkam et al. 2001). In addition to finding similar outcomes for both treatments, the authors showed that affordable PCs could be used to deliver the treatment.
Arachnophobia
The first research on TERV for arachnophobia took place in 1996, when a woman in her late thirties with crippling arachnophobia received the treatment. After 12 sessions, her arachnophobia had reduced enough that she was able to sleep in a tent. A notable aspect of this experiment was the incorporation of a real object resembling a hairy spider, which gave the participant the chance to practice with pseudo-haptic and tactile feedback (Emmelkam et al. 2001).
Claustrophobia
Botella et al. (1998) used 8 sessions of virtual reality exposure treatment in a patient who met the criteria for claustrophobia. Assessment showed an improvement in symptoms, which persisted a month later. Researchers in the following year investigated the utility of virtual reality exposure in a patient with multiple phobias, specifically claustrophobia, fear of storms, and panic disorder accompanied by agoraphobia (Botella et al. 1999), using a virtual environment in 8 one-on-one sessions. Significant clinical improvement in claustrophobia was found during and after treatment and was maintained at the three-month follow-up. Agoraphobia and storm phobia also improved even though they had not been specifically treated. VR was later used to treat 4 claustrophobic patients over 8 sessions (Botell et al. 2000); evaluation at three intervals (before treatment, after treatment, and three months later) showed a marked reduction in fear and avoidance. Table 9 contains a compilation of the study results.
Table 9.
Studies on the treatment of claustrophobia using virtual reality
| Authors | Study type | Number of participants | Techniques applied | Results |
|---|---|---|---|---|
| Botella et al. (1998) | Case study | 1 | VRET | Reduction of symptoms; therapeutic gains maintained 1 month later |
| Botella et al. (1999) | Case study | 1 | VRET | Significant reduction in claustrophobia symptoms; reduction of agoraphobia and storm phobia symptoms without specific treatment for these phobias; treatment gains maintained 3 months later |
| Botell et al. (2000) | Controlled study | 4 | VRET | Significant reduction in fear and avoidance symptoms; therapeutic gains maintained 3 months later |
Fear of driving
Two studies, one on a single case and one on ten cases, examined the fear of driving (Wald and Taylor 2000). In both, a significant improvement in the patients' clinical condition was seen after TERV treatment.
Agoraphobia
This phobia involves many distinct places (planes, the subway, the cinema, driving, deserted places). Many protocols for agoraphobia therefore combine exposure to virtual reality with cognitive therapy (cognitive restructuring, psychoeducation, self-instruction, etc.) and relaxation techniques, and these experiments have demonstrated clinical efficacy (Vincelli et al. 2003b). A study comparing TERV with classical CBT showed that TERV required no more time while being more effective, results that were further supported by a later study on a larger sample of 37 patients (Difede and Hoffman 2002). The therapeutic efficacy of TERV was observed not only in the cost-effectiveness of the virtual environments for the therapist but also in a measurable therapeutic advantage on all scales (behavioral, physiological, and subjective) in 18 agoraphobic patients.
Anxiety disorders
TERV has been applied to various anxiety disorders. When treating people with anxiety disorders, it is crucial to introduce settings that mimic what they fear most and to expose them to the stressful environment for a sustained period. Treatment typically consists of 5 to 12 individual VR exposure sessions, each lasting 30–60 min (Botella et al. 2007b). Before and after exposure, subjects complete several types of assessments whose results are both subjective and objective: questionnaires, behavioral tests (used rarely), and physiological tests (body temperature, skin conductance, etc.), which are used more frequently (Malbos et al. 2008). Subjective and objective measures were combined, and the notable correlations found between them strengthened the results.
Obsessive–compulsive disorder
Few clinical studies exist in this area; TERV faces a unique challenge in treating this disorder because developing suitable virtual environments is difficult. Despite this, a pilot study of 33 people found that participants could practice resisting compulsions in virtual reality, and many reported anxiety symptoms following VR exposure (Kim et al. 2009). As TERV sessions progressed, psychometric tests revealed decreasing anxiety, supporting the use of virtual reality for the treatment of obsessive–compulsive disorder.
Schizophrenia
Schizophrenia involves abnormal perceptions that can blur the boundaries of reality, so at first glance the illness might seem to discourage therapy involving virtual reality. A 2014 study of four patients with schizophrenia that used VR technology nevertheless concluded that patients could tolerate this immersive technology with little or no cognitive detriment (Kim et al. 2009). Cognitive rehabilitation in virtual reality was then applied to 12 schizophrenic patients aged 60 or older to assist their recovery of simple mental tasks, such as navigating an artificial ocean with their bodies or catching balls (Chan et al. 2010); their cognitive scores and memory abilities improved more than those of the control group (n = 15). Finally, a clinical study employed VR for cognitive rehabilitation of social competence in schizophrenia, enrolling 91 patients split between a group that received virtual reality therapy and a control group that received traditional role-playing therapy (Park et al. 2011). VR proved more engaging to participants than the traditional treatment, helping to offset their initial lack of motivation.
Depression
We found only two studies using immersive VR with depression as an explicit focus. A number of smaller studies involving a single treatment technique were performed to evaluate the efficacy of treatment over time, with the depression levels observed to decrease. A non-immersive VR-type task with a focus on spatial navigation memory assessment was performed in a study on depression (Freeman et al. 2017b; Gould et al. 2007).
Eating disorders
It has been noted that this field is plagued by very few well-conducted studies, despite an early application of VR for eating disorders (Riva 1998). VR can be used to induce hunger, as demonstrated by experiments showing that people experience hunger-related reactions to virtual food just as they would if the food were actually in front of them (Pallavicini et al. 2016). In a notable VR study, Keizer et al. (2016) used virtual reality to let patients with anorexia nervosa experience a body with a healthy BMI, which patients reported was an important first step in overcoming their disease; the intervention also reduced body size overestimation for at least two hours.
Extended reality in dental medicine
This section discusses the development of augmented reality (AR) and virtual reality (VR) in dental medicine and identifies the future research needed to accomplish their clinical translation.
Dental education
In dental education, studies have explored topics ranging from developing trainees' 3D vision during oral health procedures to complex methods for correcting defects of the facial bones. Eve et al. (2014) compared dental undergraduates with prosthodontics residents on a simulated caries removal exercise; both novice and experienced operators achieved significant increases in efficiency, defined as the percentage of the carious lesion removed per unit of drilling time (see the formula sketched below). Al-Saud et al. (2017) used a haptic VR simulator to investigate the impact of feedback on the rate of motor skill acquisition for tooth preparation: in-person instruction combined with performance feedback delivered through the haptic device accelerated the learning of basic manual dexterity skills in inexperienced participants. A study using haptic VR to test manual dexterity in preclinical dental education was carried out by Urbankova et al. (2013); according to this study, VR simulators are one way to identify students with learning challenges at the preclinical stage of dental training. Suebnukarn et al. (2014) performed a similar experiment using a prototype haptic VR dental simulator and assessed its utility for motor skill training; the results validated the simulators' accuracy by demonstrating that they can distinguish between the performance of experts and non-experts. Another study explored the acceptance of XR in teaching preparation design as a new learning objective and found it successful (Espejo-Trung et al. 2015). Boer et al. (2015) investigated how the use of virtual learning settings affected students' appreciation and success. Correa et al. (2017) examined a dental anesthesia training simulator for the inferior alveolar nerve block. Khelemsky et al. (2017) investigated the usefulness of a new virtual reality surgical simulator for orbital floor reconstruction in more complex surgical treatment techniques. Miki et al. (2016) ran a study to evaluate a basic VR training system for removal of the submandibular gland; the endoscope-assisted, surgery-based VR training system successfully trained novice oral surgeons.
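As a compact restatement (our paraphrase of the definition given for Eve et al. 2014, not a formula quoted from that paper), the efficiency metric can be written as:

```latex
\mathrm{Efficiency} \;=\; \frac{\%\ \text{of carious lesion removed}}{\text{drilling time}}
```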
Maxillo-facial surgery
In the oral and maxillofacial surgery field, several types of orthognathic operations (jaw repair or distraction osteogenesis) are performed, as well as reconstructive surgeries of the mandible and operations on the salivary glands. Scolozzi and Bijlenga (2017) reported a case in which a 42-year-old woman's pleomorphic adenoma of the lacrimal gland was successfully removed using a microscope-based AR system. Yamada et al. (2016) studied 21 patients treated with custom-made titanium mesh trays to fill defects created by bone and marrow surgery; the titanium mesh was bent around a 3D-printed skull model, and VR simulation was performed using computer software and preoperative radiographic data. Qu et al. (2015) employed an AR toolkit for distraction osteogenesis to guide mandibular osteotomy line placement and to aid with distractor positioning in 20 patients with hemifacial microsomia. Using AR technology to produce a new imaging and visualization tool, the research team led by Zinser et al. (2013) published a protocol that integrates orthognathic surgical navigation with a computer-assisted technique for displaying 3D anatomy overlaid with simulated surgical instruments; the goal was to investigate in vivo accuracy and flexibility. Fernandez-Alvarez et al. (2014) conducted a study to validate VR software for anthropometric measurements before preoperative planning and facial graft harvesting began. Using VR, the authors were able to match the results of the conventional analog method, and the 3D reconstructions produced by the VR software could help with understanding the donor's face.
Dental phobia
One of the most prevalent phobias today is dental phobia. Raghav et al. (2016) tested the effectiveness of a noninvasive Virtual Reality Exposure Therapy (VRET) on patients with dental phobia when compared to patients who had only received informational pamphlets. The authors suggested that VRET might be an alternative treatment for dental anxiety and phobia, based on the 6-month follow-up results which were reported.
Anatomy
One must study the 3D anatomy of the cavernous sinus to ensure successful skull base surgery for the treatment of lesions in this region. Cadaver dissection, while common, has limited ability to illustrate the region's spatial anatomy. Qian et al. (2018) used VR to model the cavernous sinus in 3D; VR-based procedures are convenient, noninvasive, time-saving, and more accurate than traditional ones.
Extended reality in orthopedics
Since 2000, research on AR in orthopedics has progressed from preclinical studies using cadavers, bone phantoms, and models (Bagwe et al. 2021). These systems are grouped into three categories according to the position of the display: world-based, body-based, and head-based (Bagwe et al. 2021). World space encompasses displays in a stationary location, such as computer monitors and projector-based screens. Head space comprises head-mounted displays (HMDs) such as the HoloLens and AR-augmented microscopes with heads-up displays (HUDs) such as the Pentero 900. Smartphones and tablets, among other devices, fit into the category of body space. Several AR systems have been used in orthopedic surgeries since 2013.
World space
Camera augmented mobile (C-Arm) AR system
The camera-augmented mobile C-arm (CamC), developed by Siemens in 1998, has a camera built into the C-arm device. The image of the patient's non-visible anatomy is superimposed over the visible anatomy, making structures easier to see and the device more effective. Von der Heide et al. (2018) tested CamC in orthopedic trauma surgeries and found it to be promising for orthopedic surgery.
Augmented reality surgical navigation system (ARSN)
ARSN is a novel system for spinal surgery navigation developed by Philips. After a first series of cadaver studies, Elmi-Terander et al. (2018) used ARSN for the surgical placement of 253 pedicle screws, achieving high accuracy with acceptable operative time. The only issue with the system was that it was difficult to use in obese patients.
Augmented reality computer assisted spine surgery (ARCASS)
Wu et al. (2014) used ARCASS for percutaneous vertebroplasty (PVP) in three patients. Before use in humans, it was tested on dummy patients, animal models, and 3D models. Using the Visible Patient tool, this technology takes 3D images from preoperative CT scans and projects them onto the patient intraoperatively via a camera and projector; a simplified sketch of this kind of image-to-view mapping is given below.
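The following is a minimal sketch, not the ARCASS implementation: it registers a preoperative CT-derived rendering to the camera view via a planar homography, the simplest form of the image-to-view mapping described above. The synthetic images and point correspondences are illustrative stand-ins; a real system uses calibrated fiducials and full 3D registration.

```python
# Minimal sketch (not ARCASS): warp a CT-derived rendering into the camera frame
# with a planar homography and blend it as a semi-transparent overlay.
import cv2
import numpy as np

ct_overlay = np.zeros((480, 640, 3), np.uint8)
cv2.circle(ct_overlay, (320, 240), 60, (0, 0, 255), -1)      # stand-in lesion rendering
camera_frame = np.full((480, 640, 3), 90, np.uint8)          # stand-in intraoperative view

# Four corresponding landmark pixels (CT image -> camera image); values are illustrative.
src_pts = np.float32([[100, 120], [480, 110], [470, 380], [90, 400]])
dst_pts = np.float32([[130, 150], [500, 140], [490, 410], [120, 430]])

H, _ = cv2.findHomography(src_pts, dst_pts)                   # planar CT -> camera mapping
warped = cv2.warpPerspective(ct_overlay, H,
                             (camera_frame.shape[1], camera_frame.shape[0]))

augmented = cv2.addWeighted(camera_frame, 0.6, warped, 0.4, 0)  # semi-transparent overlay
cv2.imwrite("augmented_view.png", augmented)
```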
Body space
Depth camera with an optical marker
Shen et al. (2013) used this approach to create a lightweight AR system for patient-specific, plate-based acetabular contouring, combining optical markers on printed pictures with a video camera and a desktop computer. A virtual plate was designed from the patient's fracture pattern and used to guide contouring of the real plate, and the system was then used with patients. They reported less invasive surgeries with greater anatomical precision.
Smartphone camera with QR code
Ogawa et al. (2018) established an AR-based portable navigation system that uses the smartphone display for viewing the functional pelvic plane and placing the acetabular cup during total hip arthroplasty (THA).
Headspace
Augmented reality with heads up display operating microscope
Carl et al. (2020) used the HUD-enabled Pentero 900 operating microscope with AR in 42 spinal surgeries. For automatic registration and data integration, they used a nonlinear registration system based on low-dose intraoperative CT scans. The surgeons reported that the use of AR significantly improved anatomical orientation and surgical accuracy.
Augmented reality using wearable head-mounted display (HMD)
Advances in technology have produced a range of new wearable headsets from firms such as Epson (MOVERIO), Google (Google Glass), Microsoft (HoloLens), and Vuzix (Smart Glasses M400). These devices promise simple and easy use of augmented reality in the surgical field. In 2013, researchers used the Virtual Protractor with Augmented Reality (VIPAR), a system composed of an HMD with a tracking camera and a marker sheet, to perform PVP in 5 patients (Abe et al. 2013). Forty phantom spine models were first used to determine the accuracy and viability of the system; AR improved the accuracy of needle positioning.
Virtual environment of things
Technology is merely a tool for creating a conducive learning environment in which the most effective training can be carried out, and virtual reality (VR) broadens the possibilities for immersive learning technologies. Extended reality devices can display data in full 3D or in 2D as "virtual windows" (Kwok and Koh 2021), giving users more flexibility to move, resize, and hide information so it can be managed graphically. Network-enabled sensors are required to incorporate sensor data into XR devices (Karim et al. 2020); such sensors are becoming increasingly widespread, as evidenced by the rise of the Internet of Things (IoT). Extended reality is already present in medical practice (Koutitas et al. 2018). AR devices use spatial mapping techniques to project visuals on top of physical objects and can recover the camera's 3D path, allowing them to map and localize themselves within the environment. These devices can also anchor virtual objects to the real world, providing a unique experience with greater flexibility and a better sense of location. To merge IoT and AR, a new paradigm called the Virtual Environment of Things (VEoT) is being used (Comito 2021); a minimal sketch of how networked sensor readings might feed such an overlay is given below.
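The following is a minimal, standard-library-only sketch of the VEoT idea: a network-enabled sensor publishes JSON readings over UDP, and a gateway keeps the latest value per sensor so an XR rendering loop can draw them as "virtual windows". The port number and field names are hypothetical, and no specific XR runtime is assumed.

```python
# Minimal VEoT-style sketch: receive networked sensor readings and keep the latest
# value per sensor for an XR overlay to render. Port and JSON fields are hypothetical.
import json
import socket

HOST, PORT = "0.0.0.0", 5005

def run_sensor_gateway():
    latest = {}                                      # sensor_id -> most recent reading
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.bind((HOST, PORT))
    while True:
        data, _addr = sock.recvfrom(4096)
        reading = json.loads(data.decode("utf-8"))   # e.g. {"id": "hr-01", "bpm": 72}
        latest[reading["id"]] = reading
        # An XR render loop would read `latest` here and refresh its 2D/3D panels.
        print("latest readings:", latest)

if __name__ == "__main__":
    run_sensor_gateway()
```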
More advanced mobile computing devices, such as PDAs, smartphones, and wearable sensors, are becoming more widely available, allowing them to be used in more complex mobile applications, including collaborative analysis, information sharing, and data mining. Understanding the requirements of data-intensive apps that can run efficiently on mobile devices requires energy characterization. Research has been carried out to experimentally estimate the energy consumption of representative data mining methods; according to this work, the algorithms can complete quickly on mobile devices by fine-tuning a few parameters (Comito and Talia 2017). Similarly, an energy-aware scheduling strategy that assigns computational tasks over a network of mobile devices while optimizing energy usage is presented in Comito et al. (2011). The scheduler's key design principle is to create a task allocation that balances the energy burden among the devices in order to extend network lifetime (a toy illustration of this balancing idea is sketched below). A prototype of the system, which included smartphones and Android emulators, was used to test the model. Compared with a traditional time-based scheduler, experimental results show that adopting an energy-aware scheduler can save substantial energy while still satisfying the performance requirements. Everyone's lives have been affected by the COVID-19 coronavirus pandemic, including how we work, play, learn, exercise, and socialize. Virtual reality (VR) technology has the potential to ameliorate many of the problems caused by the pandemic, leading to an increase in its use (Shirer and Soohoo 2020). The research in Ball et al. (2021) investigates how COVID-19's perceived impacts may influence VR usage and enjoyment, as well as device ownership and variability. VR appears to have the unique affordances and modalities to handle many of the pandemic's short- and long-term issues (Javaid et al. 2020a) and appears to be one of the most significant "Industry 4.0" technologies that could help address the COVID-19 pandemic (Imperatori et al. 2020). Some industry observers predict that the pandemic will provide VR with the "jump-start" it needs to break into the mainstream (Osterland 2020).
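The toy sketch below illustrates, but does not reproduce, the balancing principle described for the scheduler above: each task is placed on the device with the most remaining energy so that the energy burden stays balanced across the network. Device capacities and task costs are hypothetical numbers, and the real scheduler in Comito et al. (2011) is considerably more sophisticated.

```python
# Toy energy-balancing allocation (illustrative only, not the published algorithm).
def assign_tasks(task_costs, device_energy):
    """Greedy allocation: give each task (largest first) to the device with the
    most remaining energy, keeping the energy burden balanced across devices."""
    remaining = dict(device_energy)                  # device -> remaining energy budget
    assignment = {}
    for task_id in sorted(task_costs, key=task_costs.get, reverse=True):
        device = max(remaining, key=remaining.get)   # least-loaded device so far
        assignment[task_id] = device
        remaining[device] -= task_costs[task_id]
    return assignment, remaining

if __name__ == "__main__":
    tasks = {"t1": 5.0, "t2": 3.0, "t3": 2.0, "t4": 4.0}        # energy cost per task (J)
    devices = {"phone_A": 10.0, "phone_B": 8.0, "tablet_C": 6.0}
    plan, residual = assign_tasks(tasks, devices)
    print(plan)       # which device runs which task
    print(residual)   # residual energies stay relatively balanced
```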
The work in Fang and Huang (2021) discusses the aging population and the promotion of a better lifestyle through a VR application. The authors of Oliveira et al. (2021) investigate the possibilities of virtual field studies as a substitute for real-world field studies; they also discuss how GaitWear, a smartwatch application, may be used to establish a haptic baseline on the go. In Singh et al. (2020), the authors conduct a brief investigation of virtual reality and its applicability to the COVID-19 pandemic. The authors of Matamala-Gomez et al. (2021) propose a telemedicine process flow for the care of people with chronic neurological diseases, in order to meet the new challenges of providing high-quality care as public and private healthcare organizations around the world are transformed by the COVID-19 pandemic contingency. The authors of Asadzadeh et al. (2021) examined the use of virtual reality and augmented reality in the emergency management of viral outbreaks, with a focus on the COVID-19 outbreak. Table 10 gives a detailed list of VR and AR use cases in epidemic control, monitoring, and related applications.
Table 10.
Applications of VR and AR in infectious diseases/pandemic
| Emergency management | Infection type | Technology application |
|---|---|---|
| Response (Javaid et al. 2020b) | COVID-19 | Video calls (potential applications) |
| Preparedness (Real et al. 2017) | Influenza | Improving communication skills of residents under influenza vaccine hesitancy conditions |
| Preparedness (Nowak et al. 2020) | Influenza | Increasing beliefs and perceptions of individuals about the role of vaccination against transmission of the virus |
| Response (Zhou et al. 2006) | SARS | Controlling the spread of the outbreak by simulating human behaviors and interactions |
| Preparedness and response (Gong et al. 2006) | SARS | Teaching methods for controlling transmission |
| Preparedness (Clack et al. 2018) | Infections and microorganisms | Prevention of the transmission of the infection by teaching hand hygiene |
| Preparedness (Monahan et al. 2009) | Emergency pandemic (flu to bioterrorism) | Teaching public health preparedness exercises |
| Preparedness (Klomp et al. 2020) | Ebola and others | Preparation against disease-related disasters by training for improving safety, collaboration, and management |
| Preparedness (Ragazzoni et al. 2015) | H1N1 and others | Realizing training objectives at universities under quarantine |
| Preparedness (Bidaki and Ehteshampour 2019) | Respiratory system pathogenic agents | Providing a tool for learning about infectious diseases |
| Preparedness (Lima et al. 2017) | Dengue virus | Education and epidemiological surveillance |
| Response (Rosenbaum et al. 2007) | Avian influenza | Realizing prevention and training objectives by providing location information and transmission patterns |
Current biomedical trends in XR
XR aids in the visualization and analysis of 3D data via interactive visualizations, providing a tool for viewing 3D models rather than 2D representations; as a result, XR technology greatly enhances the exploration of volumetric data. In biomedical engineering, for example, virtual reality and augmented reality have considerably improved visualization of and interaction with microscopic pictures, molecular data, and anatomical information. In real time, Google's AR Microscope (ARM) identifies malignancies in microscopic images (Chen et al. 2019). The ARM system consists of an augmented bright-field microscope, a computer, and trained deep learning algorithms. The DL model in ARM was trained to detect prostate cancer and lymph node metastases in breast cancer. Using AR, the output of the DL predictions is mapped as contours, heatmaps, or textural information onto the microscopic sample (a minimal sketch of this overlay step is given below), so pathologists using this system can scan large images for cancer more quickly. Expansion microscopy has also been paired with VR to render microscopic structures that were previously undetectable (Duffy 2021). The authors presented ExMicroVR, a tool that can accommodate up to six scientists working remotely on more complex tasks. A key feature of expansion microscopy is that it greatly expands tissue sample volume, which eases the visualization of molecules and interactions between cells. For the 3D VR interface, the project's 2D expansion microscopy images were combined with 360° VR views.
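At its core, the overlay step described above amounts to alpha-blending a per-pixel prediction map onto the live image. The sketch below shows that blending with numpy; the classifier itself is out of scope, and the colormap, threshold, and stand-in data are arbitrary choices, not the ARM implementation.

```python
# Minimal sketch of the overlay step: blend a per-pixel tumor-probability heatmap
# (output of some classifier) onto a grayscale microscope frame.
import numpy as np

def overlay_heatmap(frame_gray, prob_map, alpha=0.4, threshold=0.5):
    """frame_gray: (H, W) uint8 image; prob_map: (H, W) float in [0, 1]."""
    rgb = np.stack([frame_gray] * 3, axis=-1).astype(np.float32)
    heat = np.zeros_like(rgb)
    heat[..., 0] = 255.0 * prob_map                 # red channel encodes probability
    mask = (prob_map >= threshold)[..., None]       # only highlight confident regions
    blended = np.where(mask, (1 - alpha) * rgb + alpha * heat, rgb)
    return blended.astype(np.uint8)

frame = np.random.randint(0, 256, (512, 512), dtype=np.uint8)   # stand-in microscope frame
probs = np.random.rand(512, 512)                                 # stand-in model output
augmented = overlay_heatmap(frame, probs)                        # image to display in AR
```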
ConfocalVR (Stefani et al. 2018), a virtual reality application, is used to investigate cell structure and protein distribution. The software visualizes cells as red–green–blue (RGB) volumes built from cellular images such as confocal microscopy stacks. ConfocalVR users can use controllers to drag and rotate an image into the exact position and size needed for a better view of a region of interest, and can easily control the appearance of the rendering through parameters such as hue, brightness, and opacity (a simple sketch of such adjustments is given below). Like many of the VR programs discussed here, it also allows multiple people to use it at the same time. Engineers and doctors use the Microsoft HoloLens, a holographic MR HMD, to view three-dimensional images, such as anatomical structures, and to interact with them in a way that delivers a much clearer picture. HoloLens-based visualization for medical procedures gives surgeons and medical personnel the ability to view complex organs during operations (Limonte 2021). In one autopsy, the deceased's brain was navigated via HoloLens, which allowed the pathologist to get a closer look and to view whole-slide images (WSIs) in a more immersive way. The pathologist also used HoloLens for telepathology and telepathology–radiology correlation to improve workflow and assess patients in a timely manner.
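The brightness and opacity controls described above correspond to simple per-voxel transforms on an RGBA volume built from the confocal z-stack. The sketch below shows one such transform; it is an illustration of the idea, not ConfocalVR's implementation, and the array shapes and parameter values are hypothetical.

```python
# Sketch (not ConfocalVR's code): build an adjustable RGBA volume from a confocal z-stack.
import numpy as np

def to_rgba_volume(stack_rgb, brightness=1.2, opacity=0.6):
    """stack_rgb: (Z, Y, X, 3) float array in [0, 1] from a confocal stack.
    Returns a (Z, Y, X, 4) RGBA volume ready for a volume renderer."""
    rgb = np.clip(stack_rgb * brightness, 0.0, 1.0)            # brightness scaling
    # Use per-voxel intensity as the alpha base so dim background stays transparent.
    intensity = rgb.mean(axis=-1, keepdims=True)
    alpha = np.clip(intensity * opacity, 0.0, 1.0)
    return np.concatenate([rgb, alpha], axis=-1)

stack = np.random.rand(64, 256, 256, 3)        # stand-in confocal z-stack
rgba = to_rgba_volume(stack)                   # pass to the VR/volume renderer
```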
Neuroscientists also use VR to trace neurons, using TeraVR to annotate neurons in teravoxel-scale brain images (Wang et al. 2019). Mouse neuron reconstructions produced in VR improved upon those obtained from existing representations (e.g., 2D or 360° views). A U-Net-based DL model was trained on these reconstructions so that its output could be adjusted to individual preferences. Another tool in the VR neuron-tracing set discovered and sorted the spatial relations in the neuron data. Theart et al. (2017) created a user interface for 3D microscopy data visualization and colocalization analysis within virtual reality. The arivis application VisionVR provides a VR program for viewing three-dimensional microscopic images and offers a set of tools for virtual manipulation and evaluation (Arivis 2021).
Virtual training for surgeries and biomedical devices
For a variety of reasons, training a physician to perform surgery is difficult: doing it properly requires abilities that can only be acquired through extensive practice and training. Before moving on to the next level of the curriculum, students are given the opportunity to practice an uncomplicated procedure in VR, which can enable medical students to perform surgery on a patient safely. Students who have practiced with medical simulators such as RASimAs, AnatomyX, and SimSurgery are better equipped to handle unexpected medical emergencies. As medical students' use of these simulators increases, they can train their ability to think and solve problems on the fly, respond in high-pressure environments, and perform tasks despite stress. Because VR headsets isolate the user, some simulators employ solutions such as having participants point to elements of the virtual world in order to support collaboration. In the virtual world, students and educators can work together: instead of naming the body part or object they are referring to, they can simply point to it so that their classmates can see it.
Advanced simulators, such as the University of Aachen's RASimAs, let surgeons rehearse surgery with more precise depictions of how the body reacts. These simulators use data from actual surgical cases to teach doctors how to plan their treatments effectively. The RASimAs simulator, for example, helped students develop injection accuracy by letting them work with simulated tissue reactions (Calı et al. 2015). When physicians used RASimAs together with MRI-based visualization of the injection site, the simulator reproduced the reactions physicians look for when inserting a needle toward a specific nerve in a patient, such as the response when the needle pierces the nerve. When mistakes are brought to light immediately after they occur, medical students benefit greatly.
AnatomyX enables a collaboration-focused, hands-on, interactive AR learning experience for students. Multiple students work together in real time on a shared model to improve their surgical skills (ThinkMobiles Team 2016). The program offers many valuable features, including instant access to the latest medical data, which is refreshed with updated information during learning. In addition to virtual learning materials, AnatomyX provides virtual examinations and assessments for educators.
SimSurgery can also adjust exercise difficulty to the trainee's skill level. These tools also make it easier for educators to understand students' experiences, since they allow educators to view students' activities firsthand in real time. For medical students, who need to learn procedures without causing harm, VR provides the opportunity to practice difficult procedures in a safe environment.
Researchers found that VR simulation training is inexpensive and easy to replicate. The largest issue in the diagnosis and treatment of cancer and in radiological surgery is precision, which is why adopting Virtual and Augmented Surgical Intelligence (VASI) is necessary. Surgical training simulators such as LapSim and MIST VR have been around longer (Medivis 2021). VR training will likely become a large part of schools' programs, and in the future, VR and MR technology will probably have a major impact on anatomical education. VR is also a useful alternative to manuals for training people on correct procedures in the biopharma industry.
Telemedicine and telehealth screening
Telemedicine and telehealth, despite being in a different sector than XR, are rapidly gaining ground. Through XR-based telehealth, patients can connect with their doctors in a virtual world and experience remote consultations in a three-dimensional, immersive manner. The XRHealth platform provides neurocognitive, physical, and emotional support through VR and AR, offering training for recovering from injury and illness as well as stress and pain management. The VR platform uses virtual environments, games, and movement-tracking exercises to provide clinicians with an environment in which they can give feedback. XRHealth has begun to treat patients through its new telehealth clinic, which concentrates on rehabilitation (Kamarudin and Zary 2019). The advantage of this form of therapy is that it allows people to exercise at home in peace, which particularly benefits people with paralysis and those who have trouble getting around. The VR platform helps keep a person engaged by immersing them in environments that resemble familiar settings, so that recovery can continue in a realistic context. VR treatments can even be modified based on what the patients see, with physicians able to view the same scenes. Participants receive post-training information on their level of rehabilitation and their progress, making it easier to track improvement. The XRHealth platform, which incorporates AI, targets the individual, tailoring to patient needs and supplementing traditional therapeutic methods such as prescription drugs.
Anesthesia
A system known as AREA, designed to assist with epidural anesthesia, is described (XRHealth, VR Telehealth 2021). The findings indicate that using Micron Tracker with a simple setup can yield a satisfactory level of accuracy for back level and line recognition, all with little training required.
Discussions/potentials, challenges, and future directions
Discussions/potentials: It is undeniable that information technology is rapidly evolving. As a result, virtual reality and augmented reality (VR/AR) have advanced rapidly, becoming essential technologies. The majority of studies on virtual reality and augmented reality applications fall into one of four categories: entertainment, health care, business and industry, and education and training. These technologies are useful in clinical care for providing telehealth services, visualizations, and communication, as well as for assisting healthcare personnel with education, diagnosis, and drug discovery. Moreover, VR-based telemedicine can help physicians in the assessment of patients by allowing remote information visualization. Thus, in the clinical context, making full use of the potential of AR/VR requires examining their applications for emergency management.
In business, VR applications have been used to hold VR-based conferences, teleconferences, and telemeetings and to navigate virtual worlds (Roldán et al. 2019; Hanson et al. 2017), and VR/AR have also been applied in entertainment (Akçayır and Akçayır 2017), tourism (Hamilton et al. 2020), teleconferencing (Radu 2014), business (Barsom et al. 2016), online education (Parmaxi and Demetriou 2020), and other domains. Different types of virtual reality, including immersive virtual reality, computer-assisted reality, augmented reality, and mixed reality, can all be used for educational purposes. XR technologies such as virtual, augmented, and mixed realities have the potential to enhance the quality and delivery of healthcare education. They could provide high-volume, cross-site interactive learning at a lower cost. However, this must not come at the expense of high-quality educational outcomes: XR technologies should offer outcomes that are non-inferior, if not superior, to traditional learning methods. The temptation may be to implement XR without a clear understanding of learning environments and objectives, but technology-driven rather than learner-driven approaches often fail to provide sustainable improvements in learning outcomes and the learning environment. Other drivers for XR technologies in healthcare education should also be evaluated, including cost-effectiveness and accessibility. Education and training play an important role in the emergency management of disasters, and responders need to be cross-trained in multiple roles to be prepared for disaster conditions.
According to the analysis of earlier relevant work, most past studies on infectious diseases were created to educate and teach prevention or preparedness measures for managing crisis conditions. It should be mentioned that VR plays an essential part in decreasing the detrimental consequences of infectious diseases before pandemics occur; in other words, people can learn how to respond to infectious diseases by experiencing a pandemic in a simulated environment. Furthermore, the COVID-19 disease has compelled health officials and governments to urge people into quarantine; thus, information technologies such as virtual reality and augmented reality (VR and AR), as digital solutions, can assist in a variety of areas, including health care, information sharing, communication, business, education, and entertainment. As a result, these technologies have the potential to be applied in any industry, particularly during the COVID-19 epidemic's quarantine period. However, each type's benefits and shortcomings must be taken into account, with special attention paid to how well virtual reality environments emulate the real world. While AR and MR augment the user's view of the surroundings, VR and MeR completely obstruct it: when the screen is powered down, VR and MeR displays are entirely opaque, whereas AR and MR displays are only semitransparent, allowing the user to see beneath the digital additions. Students can use VR to practice in a totally immersive simulation, free from outside distraction. Interventionists, though, can take advantage of AR, MeR, and MR to stay present in a physical room, where they can conduct procedures and stay engaged with the patient while also working with their team. MeR platforms pose a safety risk if power is lost during a procedure, since the entire view is then blocked. To make virtual reality applications that are visually appealing, mobile, and interactive, hardware must balance power consumption, processing capability, and weight while avoiding expensive, bulky equipment.
In neurosurgical oncology, virtual reality and augmented reality are valuable technologies. Their use allows a better grasp of intricate anatomic relationships as well as a more ergonomic operating room setting. When paired with intraoperative imaging for real-time viewing, improved visualization benefits both presurgical planning of the surgical corridor and intraoperative localization of lesions. With similar rates of accuracy, VR might be used as a backup or supplement to surgical neuronavigation systems. Virtual reality also has the ability to reduce pain, owing to the mental distraction provided by visual redirection to colorful, animated children's worlds (accompanied by relaxation music on some VR headsets). Indeed, pain perception requires attention, and because human attention is finite, occupying it slows the response to rising pain signals. The use of entertainment devices to relieve pain has existed for a long time, but owing to its extremely immersive nature, virtual reality appears to be more successful than traditional approaches. The virtual reality headset shields the patient from interaction outside the virtual environment, reducing unwanted visual and auditory stimuli. Indeed, the instruments used in oral surgery are usually metallic, sharp, and bloodied, which heightens patient anxiety; when the headset is placed as the patient is settled in the chair, the syringe used for local or loco-regional anesthesia is not seen by the patient and is therefore not a source of apprehension.
AR is thought to make the development of a mental three-dimensional model easier because the surgeon does not have to extrapolate two-dimensional imaging data into their own three-dimensional construct. This is especially significant in areas such as the lateral skull base and the cerebellopontine angle, which are small and anatomically crowded. A better mental three-dimensional model is expected to yield better situational awareness and a lower risk of critical errors. AR projected through a heads-up display can minimize time spent looking at other inputs, as the surgeon does not have to switch attention to a screen outside the field of view. In parallel, surgical navigation is becoming ever more accurate, with lower target registration errors, auditory alerts when approaching critical structures to avoid, and microsensors at the instrument tips.
According to the literature, virtual and augmented reality are also primed for growing use in a number of applications in radiology, surgery, and other therapeutic procedures. In the past, medical applications primarily focused on patient-as-user scenarios, with VR/AR serving as the intervention; recent developments have enabled the extension of clinician-as-user applications, which had previously been constrained by the technology's ability to provide a satisfactory level of experience. The diagnosis, treatment, and long-term management of cancer can present an individual with a multitude of stressors, as patients may experience financial strain, difficulty maintaining interpersonal relationships, physical symptoms, and emotional distress. A substantial body of research has explored the effects of VR interventions on cancer patients. These studies found that VR improved patients' emotional well-being and diminished cancer-related psychological symptoms, and they explored various relevant variables, including different settings (i.e., during chemotherapy, during painful procedures, during hospitalization). The abatement of distress is central to oncological care, and this is especially true in the specific contexts of the studies analyzed. More interdisciplinary research, grounded in appropriate theoretical frameworks, is needed to explore the inherent complexities of these settings. A more global approach to studying the effects of VR could rely on the latest technological advancements, such as biosensors and electroencephalogram (EEG) monitoring during procedures, which could help to accelerate research in this field.
The key to drug design in the pharmaceutical business is determining the proper shape for a molecule to fit within a specific protein pocket. As a result, virtual reality and augmented reality are considerably assisting scientists in the discovery of new medications by visualizing molecules and detecting new drug-like ligands; during a COVID-19 outbreak, these technologies can be employed to find new drugs. Furthermore, the use of VR and AR in telehealth can help patients and healthcare providers cope with the harmful effects of epidemics by allowing remote delivery of healthcare services. In other words, VR-based telepresence systems can play a beneficial role in reducing face-to-face contact; therefore, this capability can limit the further spread of infectious diseases, increasing the safety of the public and hospital staff.
During the COVID-19 epidemic, research was performed to provide a theoretical framework for examining both user-related factors and device-related variables. The study sheds light on some of the societal demands, goals, and behaviors surrounding virtual reality use during a worldwide crisis, and these findings could be especially valuable to scholars looking into UGT and media consumption in the aftermath of other crises, natural disasters, or periods of global disturbance. The notion of virtual reality has been widely applied in the field of medicine and has been successfully proposed for the treatment of a variety of disorders. Use cases range widely, including defending against and confronting the COVID-19 epidemic by enhancing the skill, confidence, performance, and overall attitude of healthcare workers. It is used as a complementary medical and healthcare education tool that enhances the execution of medical deliverables. Virtual reality brings overall improvements in the accuracy and efficacy of adopted actions, better working proficiency, and better-trained medical staff, all of which can play a vital role in handling COVID-19 cases effectively. Beyond the COVID-19 pandemic, there have been numerous studies of VR and AR applications in telemedicine, indicating the capability of these technologies for telehealth services.
VR features, such as the depiction of 3D models, realistic visualization of molecular structures, and interaction with the target ligand, can be useful tools for new drug development, molecular docking visualization, interactive drug design, and molecular dynamics simulation. Several applications have been developed for these purposes. RealityConvert, for example, was created to convert molecular structures into 3D representations for both AR and VR, and it is a useful tool in bioinformatics and cheminformatics (Yilmaz 2016). These technologies should be considered an effective tool during infectious epidemics: by offering immersion, interactivity, and virtual worlds, they play an influential role in treatment, telehealth, patient assessment, bioinformatics, and education. The literature on infectious diseases also reports applications of these technologies in diseases such as SARS, influenza, and Ebola for education, teleservices, and clinical purposes (Huang et al. 2016; Guo 2015). Moreover, VR can counter the destructive effects of quarantine by simulating the real world and creating a sense of presence in it. In this regard, VR and AR technologies have been used in pursuit of various objectives related to the COVID-19 disease.
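As a minimal, hypothetical sketch of the kind of conversion such tools perform (this is not the RealityConvert implementation), the following Python snippet uses the open-source RDKit toolkit to generate a 3D conformer for a small example molecule and export it as a PDB block, a common intermediate format that molecular VR/AR viewers can typically import.

```python
# Minimal sketch, assuming RDKit is installed; not the RealityConvert pipeline.
from rdkit import Chem
from rdkit.Chem import AllChem

smiles = "CC(=O)OC1=CC=CC=C1C(=O)O"   # aspirin, used purely as an example ligand
mol = Chem.AddHs(Chem.MolFromSmiles(smiles))

# Embed a 3D conformer and relax it with a force field so the geometry
# shown in the headset is physically plausible.
AllChem.EmbedMolecule(mol, randomSeed=42)
AllChem.MMFFOptimizeMolecule(mol)

# Write a PDB block that a molecular XR viewer could load.
with open("ligand_for_xr_viewer.pdb", "w") as fh:
    fh.write(Chem.MolToPDBBlock(mol))
```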
Challenges and future directions
The appearance of images is influenced by resolution, brightness, focal depth, and field of view (FOV). The display, which is often the most expensive and space-constrained part of the system, is one of the most challenging aspects of XR (Ashab et al. 2012). On top of this, 3D systems need two displays to create the illusion of depth through vergence, meaning that each eye sees a different picture. To maximize visual quality, a display must match the capabilities of the human visual system (HVS). Normal human visual acuity is about 6/6 or 20/20, corresponding to a resolution of roughly 1 arcmin per pixel over an elliptical FOV of approximately 150°–170° horizontally by 135°–150° vertically. A display with this angular resolution meets the threshold Apple popularized as a "Retina Display". Fully immersing the HVS at this resolution would therefore require a total of roughly 9000 by 8100 pixels per eye. By comparison, 4K HMDs offer 3840 × 2160 pixels and often already require workstation-class graphics for rendering (Kress and Shin 2013). Clearly, today's optical and display technologies cannot deliver this resolution and FOV affordably, so device manufacturers reduce FOV, pixel density, and display brightness to reach acceptable performance. Newer AR-capable handheld devices, such as the Apple iPhone X, demonstrate progress in miniaturization, with lower power consumption, weight, and dimensions (Strasburger et al. 2011). Stereoscopic displays also have trouble conveying depth at close distances because of accommodation, the eye's adjustment of focus with viewing distance. Accommodation is needed to help users focus on surgical tools and digital objects (such as surgical guides) at different simulated distances, for example on objects within their "personal space" and "action space". HVS accommodation and HMD vergence can come into conflict (the vergence–accommodation conflict, VAC), causing discomfort when working at near distances. Most digital display systems offer only a single, static focal plane for all content, although recent technologies such as adaptive optics can create multiple fixed focal planes.
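As a back-of-the-envelope illustration of the figures above, the following Python sketch computes the per-eye pixel count implied by an assumed 1 arcmin/pixel acuity over a 150° by 135° field of view and compares it with a 4K panel; the inputs are the approximations stated in the text, not measured device specifications.

```python
# Assumed values from the text: ~1 arcmin of visual acuity (60 pixels per degree)
# and a monocular FOV of roughly 150 deg x 135 deg.
PIXELS_PER_DEGREE = 60            # 1 arcmin per pixel
fov_h_deg, fov_v_deg = 150, 135   # horizontal x vertical field of view

width_px = fov_h_deg * PIXELS_PER_DEGREE    # 9000 pixels
height_px = fov_v_deg * PIXELS_PER_DEGREE   # 8100 pixels
total_mpx = width_px * height_px / 1e6      # ~72.9 megapixels per eye

print(f"{width_px} x {height_px} px per eye (~{total_mpx:.0f} MP)")

# For comparison, a 4K panel offers 3840 x 2160 px (~8.3 MP),
# roughly an order of magnitude short of this estimate.
print(f"4K panel: {3840 * 2160 / 1e6:.1f} MP")
```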
AR and VR continue to pose challenges. Headaches, dizziness, and pain are reported side effects for surgeons adopting headsets as their preferred hardware, and learning the fundamentals of AR and VR takes effort. The inability of current virtual reality systems to reproduce physical touch is especially important to surgeons. Skin-marker tracking systems have a significant drawback: the position of the soft tissue can vary relative to the bones. These obstacles can, however, be overcome through new technology and closer collaboration between medical experts and engineers. Despite some constraints, AR and VR offer distinct advantages, low costs, and considerable potential for combination with other technologies, which makes their future in the field of spine surgery promising. Notably, AR and VR can provide information to clinicians, students, trainees, patients, and surgical robots by linking them all together to better facilitate surgical operations.
Because AR and VR are related to so many healthcare technologies, they open up new possibilities and paths for AR research and implementation. Gaming is a powerful AR and VR application area that will be used in health care to assist patients and students with pain management, learning, and teaching. The combination of AI, wearable sensors, gaming, augmented reality, and virtual reality can lead to physical rehabilitation settings in which doctors engage with patients remotely in a virtual space and monitor them in real time. Combining robotic guidance, AI, and AR technologies yields practical, fast, and precise navigation systems, and surgeons can receive live feedback during operations if they use AI, wearables, and AR together. Merging virtual, augmented, and mixed reality with AI and surgical robotics can help advance telemedicine and provide the ability to semi-automate and remotely perform surgeries. Surgeons could use AR glasses for many medical procedures in the near future, and soon doctors will be able to see 3D images during consultations, rehabilitation, training, and surgery. Doctors in training will be exposed to advanced virtual reality and augmented reality simulations. Spine surgery, a significant discipline in orthopedics, should be prepared to accept and implement these new technologies. Clinical teams, industrial designers, and gaming professionals are collaborating on spine surgery and plan to use what they learn from these partnerships to change how all aspects of the surgery are conducted within the next decade (Caelli 2014). The use of these applications in both minimally invasive and open surgical procedures will also grow significantly. Hardware solutions for extended reality are being developed as well; many have been designed by small businesses, but advances in the field will also come from large companies, such as Microsoft, that are constantly improving their products. The latest plans for HoloLens 2 are only the beginning of a long road. In addition, AR/MR displays that go beyond head-mounted displays may be developed, and future hardware may open up new possibilities.
Conclusion
Virtual and augmented reality will see increasing use in radiologic imaging, surgery, and a variety of other therapeutic applications in the coming years. Previous research in medicine has primarily focused on patient-as-user applications, with VR/AR serving as the intervention. New advancements have made it possible to broaden clinician-as-user applications, which had previously been limited by the technology's ability to deliver a satisfactory experience.
Research on the application and success of extended reality in the medical sector is accumulating, encompassing education and training as well as patient rehabilitation. It runs the gamut from planning ahead of a procedure to using VR in the middle of a procedure on systems with modest resolutions and minimal hardware specifications. New hardware advances have improved 3D imaging, which benefits patient therapy, training comprehension, and procedural accuracy. With 3D data, continuing improvements in software and hardware will rapidly increase the ability to complete complicated tasks quickly. The various medical imaging tools will develop further to offer improved visualization and more options for handling specific problems. Patients can be given control over their own data for better health outcomes, which can be accomplished through various applications: for example, allowing patients to visualize three-dimensional objects, understand data visualizations, or learn how to use technology they previously could not. Making these promises a reality is the next big hurdle for today's data-driven twenty-first-century healthcare systems.
Declarations
Conflict of interest
The author(s) declare that there is no competing interest regarding the publication of this paper.
Footnotes
Publisher's Note
Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.
Contributor Information
Tawseef Ayoub Shaikh, Email: Tawseef.Shaikh@pdpu.ac.in.
Tabasum Rasool Dar, Email: tabasumrasool4@gmail.com.
Shabir Sofi, Email: shabir@nitsri.net.
References
- Arivis AG (2021) Arivis VisionVR. https://imaging.arivis.com/en/imagingscience/arivis-invie. Accessed 09 Aug 2021
- Abe Y, Sato S, Kato K. A novel 3D guidance system using augmented reality for percutaneous vertebroplasty. J Neurosurg Spine. 2013;19(4):492–501. doi: 10.3171/2013.7.SPINE12917. [DOI] [PubMed] [Google Scholar]
- Akçayır M, Akçayır G. Advantages and challenges associated with augmented reality for education: a systematic review of the literature. Educ Res Rev. 2017;20:1–11. [Google Scholar]
- Al-Saud LM, Mushtaq F, Allsop MJ. Feedback and motor skill acquisition using a haptic dental simulator. Eur J Dent Educ. 2017;21(4):240–247. doi: 10.1111/eje.12214. [DOI] [PubMed] [Google Scholar]
- Amidei C, Kushner DS. Clinical implications of motor deficits related to brain tumors. Neurooncol Pract. 2015;2:179–184. doi: 10.1093/nop/npv017. [DOI] [PMC free article] [PubMed] [Google Scholar]
- Anil S, Kato Y, Hayakawa M, Yoshida K, Nagahisha S, Kanno T. Virtual 3-dimensional preoperative planning with the dextroscope for excision of a 4th ventricular ependymoma. Minim Invasive Neurosurg. 2007;50:65–70. doi: 10.1055/s-2007-982508. [DOI] [PubMed] [Google Scholar]
- Argotti Y, Davis L, Outters V, Rolland JP. Dynamic superimposition of synthetic objects on rigid and simple-deformable objects. Comput Graph. 2002;26:919–930. [Google Scholar]
- Asadzadeh A, Samad-Soltani T, Rezaei-Hachesu P. Applications of virtual and augmented reality in infectious disease epidemics with a focus on the COVID-19 outbreak. Informat Med Unlocked. 2021;24:1–9. doi: 10.1016/j.imu.2021.100579. [DOI] [PMC free article] [PubMed] [Google Scholar]
- Ashab HAD, Lessoway AV, Khallaghi S, Cheng A, Rohling R, Abolmaesumi P (2012) AREA: an augmented reality system for epidural anaesthesia. In: Proceedings of 34th annual international conference of the IEEE EMBS San Diego, California USA, 28 August–1 September 2012, pp 2659–2663 [DOI] [PubMed]
- Bagwe S, Singh K, Kashyap A, Arora S, Maini L. Evolution of augmented reality applications in Orthopaedics: a systematic review. J Arthrosc Joint Surg. 2021;8:84–90. [Google Scholar]
- Ball C, Huang K-T, Francis J. Virtual reality adoption during the COVID-19 pandemic: a uses and gratifications perspective. Telemat Inform. 2021;65:101728. doi: 10.1016/j.tele.2021.101728. [DOI] [PMC free article] [PubMed] [Google Scholar]
- Barsom EZ, Graafland M, Schijven MP. Systematic review on the effectiveness of augmented reality applications in medical training. Surg Endosc. 2016;30(10):4174–4183. doi: 10.1007/s00464-016-4800-6. [DOI] [PMC free article] [PubMed] [Google Scholar]
- Beck JG, Palyo AS, Winer HE, Schwagler EB, Ang EJ. Virtual reality exposure therapy for PTSD symptoms after a road accident: an uncontrolled case series. Behav Ther. 2007;38(1):39–48. doi: 10.1016/j.beth.2006.04.001. [DOI] [PubMed] [Google Scholar]
- Berg LP, Vance JM. Industry use of virtual reality in product design and manufacturing: a survey. Virt Real. 2016;21(1):1–17. [Google Scholar]
- Bidaki MZ, Ehteshampour A. Designing, producing, application, and evaluation of virtual reality-based multimedia clips for learning purposes of medical and nursing students. Chest. 2019;155(4):166A. doi: 10.1016/j.chest.2019.02.160. [DOI] [Google Scholar]
- Bigné E, Llinares C, Torrecilla C. Elapsed time on first buying triggers brand choices within a category: a virtual reality-based study. J Bus Res. 2016;69(4):1423–1427. [Google Scholar]
- Bonetti F, Warnaby G, Quinn L. Augmented reality and virtual reality in physical and online retailing: a review, synthesis and research agenda. In: Jung T, tom Dieck M, editors. Augmented reality and virtual reality, Progress in IS. Cham: Springer; 2018. pp. 119–132. [Google Scholar]
- Botella C, Baños RM, Villa H, Perpiña C, Garcia-Palacios A. Virtual reality in the treatment of claustrophobic fear: a controlled, multiple-baseline design. Behav Ther. 2000;31(3):583–595. [Google Scholar]
- Botella C, Baños RM, Perpiña C, Villa H, Alnañiz M, Rey A. Virtual reality treatment of claustrophobia: a case report. Behav Res Ther. 1998;36(2):239–246. doi: 10.1016/s0005-7967(97)10006-7. [DOI] [PubMed] [Google Scholar]
- Botella C, Villa H, Baños RM, Perpiña C, Garcia-Palacios A. The treatment of claustrophobia in virtual reality: changes in other phobic behaviors not specifically treated. Cyberpsychol Behav. 1999;2(2):135–141. doi: 10.1089/cpb.1999.2.135. [DOI] [PubMed] [Google Scholar]
- Botella C, García-Palacios A, Villa H, Baños RM, Quer S, Alcañiz M, Riva G. Virtual reality exposure in the treatment of panic disorder and agoraphobia: a controlled study. Clin Psychol Psychother. 2007;14(3):164–175. [Google Scholar]
- Butler JF, Holcomb JB, Shackelford S, Montgomery HR, Anderson S, Cain JS. Management of suspected tension pneumothorax in tactical combat casualty care: TCCC guidelines change. J Spec Oper Med. 2018;18(2):19–35. doi: 10.55460/XB1Z-3BJU. [DOI] [PubMed] [Google Scholar]
- Caelli T (2014) Visual perception: theory and practice: pergamon international library of science. Technology, Engineering, and Social Studies, pp 103–14
- Calı C, Baghabra J, Boges DJ, Holst GR, Kreshuk A, Hamprecht FA, Srinivasan M, Lehva Slaiho H, Magistretti PJ. Three-dimensional immersive virtual reality for studying cellular compartments in 3D models from EM preparations of neural tissues. J Comput Neurol. 2015;524:23–38. doi: 10.1002/cne.23852. [DOI] [PMC free article] [PubMed] [Google Scholar]
- Canalys Media alert: virtual reality headset shipments top 1 million for the first time (2017). https://goo.gl/PvBHjx. Accessed 14 Jun 2021
- Carl B, Bopp M, Sab B, Pojskic M, Voellger B, Nimsky C. Spine surgery supported by augmented reality. Glob Spine J. 2020;10(2):41–55. doi: 10.1177/2192568219868217. [DOI] [PMC free article] [PubMed] [Google Scholar]
- Chan CLF, Ngai EKY, Leung PKH, Wong S. Effect of the adapted virtual reality cognitive training program among Chinese older adults with chronic schizophrenia: a pilot study. Int J Geriatr Psych. 2010;25(6):643–649. doi: 10.1002/gps.2403. [DOI] [PubMed] [Google Scholar]
- Chan F, Aguirre S, Heaton HB, Hanley F, Perry S (2013) Head tracked stereoscopic pre-surgical evaluation of major aortopulmonary collateral arteries in the newborns. In: Proceedings of radiological society of North America 2013 scientific assembly and annual meeting, Chicago
- Chen JG, Han KW, Zhang DF, Li ZX, Li YM, Hou LJ. Presurgical planning for supratentorial lesions with free Slicer software and Sina app. World Neurosug. 2017;106:193–197. doi: 10.1016/j.wneu.2017.06.146. [DOI] [PubMed] [Google Scholar]
- Chen PC, Gadepalli K, MacDonald R, Liu Y, Kadowaki S, Nagpal K, Kohlberger T, Dean J, Corrado GS, Hipp JD. An augmented reality microscope with real-time artificial intelligence integration for cancer diagnosis. Nat Med. 2019;25:1453–1457. doi: 10.1038/s41591-019-0539-7. [DOI] [PubMed] [Google Scholar]
- Chevalley OH, Schmidlin T, Marcos PD (2015) Intensive upper limb neurorehabilitation with virtual reality in chronic stroke: a case report. In: Proceedings of annual meeting of American society of neurorehabilitation, Chicago
- Choi YH, Vincelli F, Riva G, Wiederhold BK, Lee JH, Park KH. Effects of group experiential cognitive therapy for the treatment of panic disorder with agoraphobia. Cyberpsycholy Behav. 2005;8(4):387–393. doi: 10.1089/cpb.2005.8.387. [DOI] [PubMed] [Google Scholar]
- Chrastina J, Novak Z, Riha I, Hermanova M, Feitova V. Diagnostic value of brain tumor neuroendoscopic biopsy and correlation with open tumor resection. J Neurol Surg A Cent Eur Neurosurg. 2012;75:110–115. doi: 10.1055/s-0032-1320032. [DOI] [PubMed] [Google Scholar]
- Clack L, Hirt C, Wenger M, Saleschus D, Kunz A, Sax H (2018) Virtue—a virtual reality trainer for hand hygiene. In: Clack L, Hirt C, Wenger M, Saleschus D, Kunz A, Sax H (eds) 2018 9th international conference on information, intelligence, systems and applications (IISA), pp 23–25
- Case Western Reserve, Cleveland Clinic (2021) Case Western Reserve, Cleveland Clinic collaborate with Microsoft on ‘earth-shattering’ mixed-reality technology for education. http://case.edu/hololens/. Accessed 11 Aug 2021
- Comito C (2021) How COVID-19 information spread in US? The role of twitter as early indicator of epidemics. J Latex Class Files. 10.1109/TSC.2021.3091281
- Comito C, Talia D (2004) GDIS: a service-based architecture for data integration on grids. In: Meersman R, Tari Z, Corsaro A (eds) On the move to meaningful internet systems 2004: OTM 2004 workshops. OTM 2004. Lecture notes in computer science, vol 3292. Springer, Berlin, Heidelberg. 10.1007/978-3-540-30470-8_27
- Comito C, Talia D (2017) Energy consumption of data mining algorithms on mobile phones: evaluation and prediction. Pervas Mob Comput (2017). 10.1016/j.pmcj.2017.10.006.
- Comito C, Falcone D, Talia D, Trunfio P (2011) Energy efficient task allocation over mobile networks. In: 2011 IEEE 9th international conference on dependable, autonomic and secure computing, pp 380–387. 10.1109/DASC.2011.80
- Correa CG, Machado MAAM, Ranzini E. Virtual reality simulator for dental anesthesia training in the inferior alveolar nerve block. J Appl Oral Sci. 2017;25(4):357–366. doi: 10.1590/1678-7757-2016-0386. [DOI] [PMC free article] [PubMed] [Google Scholar]
- Dayan E (2006) ARGAMAN: rapid deployment virtual reality system for PTSD rehabilitation. In: Proceedings of international conference on information technology: research and education, pp 34–38
- de Boer IR, Lagerweij MD, Wesselink PR. Evaluation of the appreciation of virtual teeth with and without pathology. Eur J Dent Educ. 2015;19(2):87–94. doi: 10.1111/eje.12108. [DOI] [PubMed] [Google Scholar]
- de Oliveira A, Khamis M, Esteves A. GaitWear: a smartwatch application for in-the-wild gait normalisation based on a virtual field study assessing the effects of visual and haptic cueing. Behav Inf Technol. 2021 doi: 10.1080/0144929X.2021.1958060. [DOI] [Google Scholar]
- Difede J, Hoffman GH. Virtual reality exposure therapy for world trade center post-traumatic stress disorder: a case report. Cyberpsychol Behav. 2002;5(6):529–535. doi: 10.1089/109493102321018169. [DOI] [PubMed] [Google Scholar]
- Duffy J (2021) Microscopy and VR illuminate new ways to prevent and treat disease. https://www.cmu.edu/news/stories/archives/2019/july/vr-expands-microscopy.html. Accessed 30 Jun 2021
- Elmi-Terander A, Nachabe R, Skulason H. Feasibility and accuracy of thoracolumbar minimally invasive pedicle screw placement with augmented reality navigation technology. Spine. 2018;43(14):1018–1023. doi: 10.1097/BRS.0000000000002502. [DOI] [PMC free article] [PubMed] [Google Scholar]
- Extended reality brief: determining needs, expectations and the future of XR for the ICRC. A study commissioned by the ICRC innovation board, pp 1–7 (2018). Accessed 10 Mar 2021
- Emmelkam P, Bruynzeel M, Drost L. Virtual reality treatment in acrophobia: a comparison with exposure in vivo. Cyberpsychol Behav. 2001;4(3):183–202. doi: 10.1089/109493101300210222. [DOI] [PubMed] [Google Scholar]
- Espejo-Trung LC, Elian SN, Luz MA. Development and application of a new learning object for teaching operative dentistry using augmented reality. J Dent Educ. 2015;79(11):1356–1361. [PubMed] [Google Scholar]
- Eve EJ, Koo S, Alshihri AA. Performance of dental students versus prosthodontics residents on a 3D immersive haptic simulator. J Dent Educ. 2014;78(4):630–637. [PubMed] [Google Scholar]
- Fang Y-M, Huang Y-J. Comparison of the usability and flow experience of an exercise promotion virtual reality programme for different age groups. Behav Inf Technol. 2021 doi: 10.1080/0144929X.2021.1938680. [DOI] [Google Scholar]
- Fernandez-Alvarez JA, Infante-Cossio P, Barrera-Pulido F. Virtual reality AYRA software for preoperative planning in facial allotransplantation. J Craniofac Surg. 2014;25(5):1805–1809. doi: 10.1097/SCS.0000000000000989. [DOI] [PubMed] [Google Scholar]
- Ferroli P, Tringali G, Acerbi F. Advanced 3- dimensional planning in neurosurgery. Neurosurgery. 2013;72:A54–A62. doi: 10.1227/NEU.0b013e3182748ee8. [DOI] [PubMed] [Google Scholar]
- Finger T, Schaumann A, Schulz M, Thomale UW. Augmented reality in intraventricular neuroendoscopy. Acta Neurochir. 2017;159:1033–1041. doi: 10.1007/s00701-017-3152-x. [DOI] [PubMed] [Google Scholar]
- Freeman D, Reeve S, Robinson A, Ehlers A, Clark D, Spanlang B, Slater M. Virtual reality in the assessment, understanding, and treatment of mental health disorders. Psychol Med. 2017;47(14):2393–2400. doi: 10.1017/S003329171700040X. [DOI] [PMC free article] [PubMed] [Google Scholar]
- Forbes Inc (2017) Accessed 15 May 2021
- Florida Hospital Tampa Integrates Virtual Reality Into Surgical Planning and Patient Education. Florida Hospital website (2016). Accessed 9 Apr 2021
- Gerard IJ, Oertel KM, Drouin S. Combining intraoperative ultrasound brain shift correction and augmented reality visualizations: a pilot study of eight cases. J Med Imaging. 2018;5(021210):1–13. doi: 10.1117/1.JMI.5.2.021210. [DOI] [PMC free article] [PubMed] [Google Scholar]
- Goldman Sachs Global Investment Research (2016) Last Accessed 15 Mar 2021
- Gong J, Zhou J, Li W, Lin H (2006) Design and implementation of an intelligent virtual geographic environment for the simulation of SARS transmission. In: Gong J, Zhou J, Li W, Lin H (eds) Proceedings of the 2006 ACM international conference on virtual reality continuum and its applications
- Gong L, Berglund Å, Johansson B. A framework for extended reality system development in manufacturing. IEEE Access. 2021;9:24796–24813. [Google Scholar]
- Gorini A, Riva G. Virtual reality in anxiety disorders: the past and the future. Expert Rev. Neurotherapeut. 2008;8(2):215–233. doi: 10.1586/14737175.8.2.215. [DOI] [PubMed] [Google Scholar]
- Gould NF, Holmes MK, Fantie BD, Luckenbaugh DA, Pine DS, Gould TD, Burgess N, Manji HK, Zarate CA. Performance on a virtual reality spatial memory navigation task in depressed patients. Am J Psych. 2007;164:516–519. doi: 10.1176/ajp.2007.164.3.516. [DOI] [PubMed] [Google Scholar]
- Grand View Research Inc (2017) Accessed 15 May 2021
- Griffin T, Giberson J, Lee SH, Guttentag D, Kandaurova M, Sergueeva K, Dimanche F (2017) Virtual reality and implications for destination marketing. In: Proceedings of the 48th annual travel and tourism research association (TTRA), Quebec City, Canada, pp 547–560
- Grimson WEL, Ettinger GJ, White SJ, Gleason PL, Lozano-Pérez T, Wells WM, Kikinis R (1995) Evaluating and validating an automated registration system for enhanced reality visualization in surgery. In: proceedings of computer vision, virtual reality, and robotics in medicine, nice, France
- Guo Q (2015) Learning in a mixed reality system in the context of Industrie 4.0. J Tech Educ 3(2)
- Hamilton D, McKechnie J, Edgerton E, Wilson C. Immersive virtual reality as a pedagogical tool in education: a systematic literature review of quantitative learning outcomes and experimental design. J Comput Educ. 2020;8(1):1–32. [Google Scholar]
- Hanson R, Falkenström W, Miettinen M. Augmented reality as a means of conveying picking information in kit preparation for mixed-model assembly. Comput Ind Eng. 2017;113:570–575. doi: 10.1016/j.cie.2017.09.048. [DOI] [Google Scholar]
- Hospital SCsHLPCs. Lucile Packard Children’s Hospital Stanford pioneers use of VR for patient care, education and experience (2018). https://www.stanfordchildrens.org/en/about/news/releases/2017/virtual-reality-program. Accessed 9 Apr 2021
- Hu X, Nanjappan V, Georgiev GV. Bursting through the blocks in the human mind: enhancing creativity with extended reality technologies. Interactions. 2021;28(3):57–61. doi: 10.1145/3460114. [DOI] [Google Scholar]
- Huang YC, Backman KF, Backman SJ, Chang LL. Exploring the implications of virtual reality technology in tourism marketing: an integrated research framework. Int J Tour Res. 2016;18(2):116–128. [Google Scholar]
- Imperatori C, Dakanalis A, Farina B, Pallavicini F, Colmegna F, Mantovani F, Clerici M (2020) Global storm of stress-related psychopathological symptoms: a brief overview on the usefulness of virtual reality in facing the mental health impact of COVID-19. Cyberpsychol, Behav, Soc Netw [DOI] [PubMed]
- Inoue D, Cho B, Mori M. Preliminary study on the clinical application of augmented reality neuronavigation. J Neurol Surg A Cent Eur Neurosurg. 2013;74:71–76. doi: 10.1055/s-0032-1333415. [DOI] [PubMed] [Google Scholar]
- Inoue A, Ohnishi T, Kohno S. Utility of three-dimensional computed tomography for anatomical assistance in endoscopic endonasal transsphenoidal surgery. Neurosurg Rev. 2015;38:559–565. doi: 10.1007/s10143-015-0625-3. [DOI] [PubMed] [Google Scholar]
- Jang DP, Ku JH, Shin MB, Young HC, Kim IS. Objective validation of the effectiveness of virtual reality psychotherapy. Cyberpsychol Behav. 2000;3:369–374. [Google Scholar]
- Javaid M, Haleem A, Vaishya R, Bahl S, Suman R, Vaish A (2020a) Industry 4.0 technologies and their applications in fighting COVID-19 pandemic. Diab Metab Syndr: Clin Res Rev J [DOI] [PMC free article] [PubMed]
- Javaid M, Haleem A, Vaishya R, Bahl S, Suman R, Vaish A (2020b) Industry 4.0 technologies and their applications in fighting COVID-19 pandemic. Diab Metab Syndr. 10.1016/j.dsx.2020b.04.032 [DOI] [PMC free article] [PubMed]
- Kamarudin MFB, Zary N (2019) Augmented reality, virtual reality and mixed reality in medical education: a comparative web of science scoping review, pp 1–10. https://www.preprints.org. Accessed 29 Apr 2019
- Karim JS, Hachach-Haram N, Dasgupta P. Bolstering the surgical response to COVID-19: how virtual technology will save lives and safeguard surgical practice. BJU Int. 2020;125:E18–E19. doi: 10.1111/bju.15080. [DOI] [PMC free article] [PubMed] [Google Scholar]
- Keizer A, van Elburg A, Helms R, Dijkerman HC. A virtual reality full body illusion improves body image disturbance in anorexia nervosa. PLoS ONE. 2016;11:e0163921. doi: 10.1371/journal.pone.0163921. [DOI] [PMC free article] [PubMed] [Google Scholar]
- Khelemsky R, Hil B, Buchbinder D. Validation of a novel cognitive simulator for orbital floor reconstruction. J Oral Maxillofac Surg. 2017;75(4):775–785. doi: 10.1016/j.joms.2016.11.027. [DOI] [PubMed] [Google Scholar]
- Kim K, Kim CH, Kim SY, Roh D, Kim SI. Virtual reality for obsessive-compulsive disorders: past and future. Psych Invest. 2009;6:115–121. doi: 10.4306/pi.2009.6.3.115. [DOI] [PMC free article] [PubMed] [Google Scholar]
- Klinger E, Légeron P, Roy S, Chemin I, Lauer F, Nugues P. Virtual reality exposure in the treatment of social phobia. Stud Health Technol Inf. 2004;99:91–119. [PubMed] [Google Scholar]
- Klinger E, Légeron P, Ro S, Chemin I, Lauer F, Nugues P. Virtual reality exposure in the treatment of social phobia. In: Riva G, Botella C, Légeron P, Optale G, editors. Cybertherapy: internet and virtual reality as assessment and rehabilitation tools for clinical psychology and neuroscience. Amsterdam: Ios Press; 2004. pp. 91–119. [Google Scholar]
- Klomp RW, Jones L, Watanabe E, Thompson WW. CDC’s multiple approaches to safeguard the health, safety, and resilience of Ebola responders. Prehosp Disaster Med. 2020;35(1):69–75. doi: 10.1017/s1049023x19005144. [DOI] [PMC free article] [PubMed] [Google Scholar]
- Koutitas G, Jabez J, Grohman C, Radhakrishna C, Siddaraju V, Jadon S (2018) Demo/poster abstract: XReality research lab—augmented reality meets internet of things. In: INFOCOM 2018—IEEE conf. comput. commun. work, pp 1–2. 10.1109/INFCOMW.2018.8406848
- Kress B, Shin M (2013) Diffractive and holographic optics as optical combiners in head-mounted displays. In: Proceedings of the 2013 ACM conference on pervasive and ubiquitous computing adjunct publication, pp 1479–82
- Krijn M, Emmelkamp PMG, Lafsson O, Bouwman RM, van Gerwen LJ, Spinhoven P. Fear of flying treatment methods: exposure vs cognitive behavioral therapy. Aviat Spac Environ Med. 2007;78(2):121–128. [PubMed] [Google Scholar]
- Kwok AOJ, Koh SGM. COVID-19 and extended reality (XR) Curr Issue Tour. 2021;24(14):1935–1940. doi: 10.1080/13683500.2020.1798896. [DOI] [Google Scholar]
- Lima T, Barbosa B, Niquini C, Araújo C, Lana R (2017) Playing against dengue design and development of a serious game to help tackling dengue. In Lima T, Barbosa B, Niquini C, Araújo C, Lana R (eds) IEEE 5th international conference on serious games and applications for health (SeGAH)
- Limonte K (2021) AI in healthcare: HoloLens in surgery. https://cloudblogs.microsoft.com/industry-blog/en-gb/health/2018/12/20/ai-health care-hololens-surgery/. Accessed 22 Aug 2021
- Lin JHT, Wu DY, Tao CC. So scary, yet so fun: the role of self-efficacy in the enjoyment of a virtual reality horror game. New Media Soc. 2017;20(4):1–20. [Google Scholar]
- Lorensen W, Cline H, Nafis C, Kikinis R, Altobelli D, Gleason L (1993) Enhancing reality in the operating room. In: Proceedings of IEEE visualization conference, San Jose, CA, USA
- Low D, Lee CK, Dip LLT, Ng WH, Ang BT, Ng I. Augmented reality neurosurgical planning and navigation for surgical excision of parasagittal, falcine and convexity meningiomas. Br J Neurosurg. 2010;24:69–74. doi: 10.3109/02688690903506093. [DOI] [PubMed] [Google Scholar]
- Malbos E, Mestre DR, Note ID, Gellato C. Virtual reality and claustrophobia: multiple components therapy involving game editor virtual environments exposure. Cyberpsychol Behav. 2008;11(6):695–707. doi: 10.1089/cpb.2007.0246. [DOI] [PubMed] [Google Scholar]
- Matamala-Gomez M, Bottiroli S, Realdon O, Riva G, Galvagni L, Platz T, Sandrini G, De Icco R, Tassorelli C. Telemedicine and virtual reality at time of COVID-19 pandemic: an overview for future perspectives in neurorehabilitation. Front Neurol. 2021;12:1–9. doi: 10.3389/fneur.2021.646902. [DOI] [PMC free article] [PubMed] [Google Scholar]
- McLay RN, Wood DP, Webb-Murphy JA, Spira JL, Wiederhold DM, Pyne JM (2011) A randomized, controlled trial of virtual reality-graded exposure therapy for post-traumatic stress disorder in active duty service members with combat-related posttraumatic stress disorder. Cyberpsychol Behav Soc Netw 14(4):223–229 [DOI] [PubMed]
- Medivis (2021) The augmented reality anatomy lab and learning platform. https://www.medivis.com/anatomyx. Accessed 28 Jun 2021
- Merchant Z, Goetz ET, Cifuentes L, Keeney-Kennicutt W, Davis TJ. Effectiveness of virtual reality-based instruction on students' learning outcomes in K12 and higher education: a meta-analysis. Comput Educ. 2014;70:29–44. [Google Scholar]
- Mikhail M, Mithani K, Ibrahim GM. Presurgical and intraoperative augmented reality in neuro-oncologic surgery: clinical experiences and limitations. World Neurosurg. 2019;128:268–276. doi: 10.1016/j.wneu.2019.04.256. [DOI] [PubMed] [Google Scholar]
- Miki T, Iwai T, Kotani K. Development of a virtual reality training system for endoscope-assisted submandibular gland removal. J Craniomaxillofac Surg. 2016;44(11):1800–1805. doi: 10.1016/j.jcms.2016.08.018. [DOI] [PubMed] [Google Scholar]
- Milgram P, Kishino F. A taxonomy of mixed reality visual displays. IEICE Trans Inf Syst. 1994;77(12):1321–1329. [Google Scholar]
- Mühlberger A, Wiederman G, Pauli P. Efficacy of one-session virtual reality exposure treatment of fear of flying. Psychother Res. 2003;13(3):323–336. doi: 10.1093/ptr/kpg030. [DOI] [PubMed] [Google Scholar]
- Nimsky C, Ganslandt O, Hastreiter P, Fahlbusch R. Intraoperative compensation for brain shift. Surg Neurol. 2001;56:357–364. doi: 10.1016/s0090-3019(01)00628-0. [DOI] [PubMed] [Google Scholar]
- Next galaxy to develop virtual reality applications for Miami Children’s Hospital (2015). Accessed 11 July 2021
- North NM, North SM, Coble JR. Effectiveness of virtual reality environment desensitization in the treatment of agoraphobia. Presence Teleoper Virt Environ. 1996;5:346–352. [Google Scholar]
- North MM, North SM, Coble JR. Virtual reality therapy: an effective treatment for the fear of public speaking. Int J Virt Ther. 1998;3:2–8. [Google Scholar]
- Nowak GJ, Evans NJ, Wojdynski BW, Ahn SJG, Len-Rios ME, Carera K, et al. Using immersive virtual reality to improve the beliefs and intentions of influenza vaccine avoidant 18-to-49-year-olds: considerations, effects, and lessons learned. Vaccine. 2020;38(5):1225–1233. doi: 10.1016/j.vaccine.2019.11.009. [DOI] [PubMed] [Google Scholar]
- Oculus. Immersive Education. CHLA and oculus expand VR medical training program to new institutions (2018) [cited 2019 Jan 29]. https://www.oculus.com/blog/immersive-education-chla-andoculus-expand-vr-medical-training-program-to-new-institutions/. Accessed 9 July 2021
- Ogawa H, Hasegawa S, Tsukada S, Matsubara M. A pilot study of augmented reality technology applied to the acetabular cup placement during total hip arthroplasty. J Arthroplasty. 2018;33(6):1833–1837. doi: 10.1016/j.arth.2018.01.067. [DOI] [PubMed] [Google Scholar]
- Orringer DA, Golby A, Jolesz F. Neuronavigation in the surgical management of brain tumors: current and future trends. Expert Rev Med Dev. 2012;9:491–500. doi: 10.1586/erd.12.42. [DOI] [PMC free article] [PubMed] [Google Scholar]
- Osterland A (2020) https://www.cnbc.com/2020/05/02/coronavirus-could-be-catalyst-to-reinvigorate-virtual-reality-headsets.html
- Pallavicini F, Serino S, Cipresso P, Pedroli E, Chicchi Giglioli IA, Chirico A, Manzoni GM, Castelnuovo G, Molinari E, Riva G. Testing augmented reality for cue exposure in obese patients: an exploratory study. Cyberpsychol Behav, Soc Netw. 2016;19:107–114. doi: 10.1089/cyber.2015.0235. [DOI] [PubMed] [Google Scholar]
- Park KM, Ku J, Choi SH, Jang HJ, Park JY. A Virtual reality application in role-plays of social skills training for schizophrenia: a randomized, controlled trial. Psych Res. 2011;189:166–172. doi: 10.1016/j.psychres.2011.04.003. [DOI] [PubMed] [Google Scholar]
- Parmaxi A, Demetriou AA. Augmented reality in language learning: a state-of-the-art review of 2014–2019. J Comput Assist Learn. 2020;36(6):861–875. [Google Scholar]
- Pertaub DP, Slater M, Barker C. Public speaking in virtual reality: facing an audience of avatars. IEEE Comput Graph Appl. 1999;19(2):6–9. [Google Scholar]
- Qian ZH, Feng X, Li Y. Virtual reality model of the three-dimensional anatomy of the cavernous sinus based on a cadaveric image and dissection. J Craniofac Surg. 2018;29(1):163–166. doi: 10.1097/SCS.0000000000004046. [DOI] [PubMed] [Google Scholar]
- Qiu TM, Zhang Y, Wu JS. Virtual reality presurgical planning for cerebral gliomas adjacent to motor pathways in an integrated 3-D stereoscopic visualization of structural MRI and DTI tractography. Acta Neuroch. 2010;152:1847–1857. doi: 10.1007/s00701-010-0739-x. [DOI] [PubMed] [Google Scholar]
- Qu M, Hou Y, Xu Y. Precise positioning of an intraoral distractor using augmented reality in patients with hemifacial macrosomia. J Craniomaxillofac Surg. 2015;43(1):106–112. doi: 10.1016/j.jcms.2014.10.019. [DOI] [PubMed] [Google Scholar]
- Radu I. Augmented reality in education: a meta-review and cross-media analysis. Pers Ubiquitous Comput. 2014;18(6):1533–1543. [Google Scholar]
- Ragazzoni L, Ingrassia PL, Echeverri L, Maccapani F, Berryman L, Burkle FM, Jr, et al. Virtual reality simulation training for Ebola deployment. Disaster Med Public Health Prep. 2015;9(5):543–546. doi: 10.1017/dmp.2015.36. [DOI] [PubMed] [Google Scholar]
- Raghav K, van Wijk AJ, Abdullah F. Efficacy of virtual reality exposure therapy for the treatment of dental phobia: a randomized control trial. BMC Oral Health. 2016;16:25. doi: 10.1186/s12903-016-0186-z. [DOI] [PMC free article] [PubMed] [Google Scholar]
- Real FJ, DeBlasio D, Ollberding NJ, Davis D, Cruse B, McLinden D, et al. Resident perspectives on communication training that utilizes immersive virtual reality. Educ Health. 2017;30(3):228–231. doi: 10.4103/efh.EfH_9_17. [DOI] [PubMed] [Google Scholar]
- Riva G. Virtual environment for body image modification: virtual reality system for the treatment of body image disturbances. Comput Hum Behav. 1998;14:477–490. [Google Scholar]
- Roberts R. Passenger fear of flying: behavioral treatment with extensive in-vivo exposure and group support. Aviat Spac Environ Med. 1989;60:342–348. [PubMed] [Google Scholar]
- Roldán JJ, Crespo E, Martín-Barrio A, Peña-Tapia E, Barrientos A. A training system for Industry 4.0 operators in complex assemblies based on virtual reality and process mining. Robot Comput Integr Manuf. 2019;59:305–316. [Google Scholar]
- Rosahl S, Gharabaghi A, Hubbe U, Shahidi R, Samii M. Virtual reality augmentation in skull base surgery. Skull Base. 2006;16:59–66. doi: 10.1055/s-2006-931620. [DOI] [PMC free article] [PubMed] [Google Scholar]
- Rosenbaum E, Klopfer E, Perry J. On location learning: authentic applied science with networked augmented realities. J Sci Educ Technol. 2007;16(1):31–45. doi: 10.1007/s10956-006-9036-0. [DOI] [Google Scholar]
- Rothbaum BO, Hodge L, Kooper R. Effectiveness of computer-generated (virtual reality) graded exposure in the treatment of acrophobia. Am J Psychiatr. 1995;152(4):626–628. doi: 10.1176/ajp.152.4.626. [DOI] [PubMed] [Google Scholar]
- Rothbaum BO, Hodges L, Alarcon K, Ready D, Shahar F, Graap K. Virtual reality exposure therapy for PTSD Vietnam veterans: a case study. J Trauma Stress. 1999;12:263–227. doi: 10.1023/A:1024772308758. [DOI] [PubMed] [Google Scholar]
- Rothbaum BO, Hodge L, Smith S, Lee JH, Price L. A controlled study of virtual reality exposure therapy for the fear of flying. J Consult Clin Psychol. 2000;68(6):1020–1026. doi: 10.1037//0022-006x.68.6.1020. [DOI] [PubMed] [Google Scholar]
- Rothbaum BO, Hodges L, Anderson PL, Price L, Smith S. Twelve-month follow up of virtual reality and standard exposure therapies for the fear of flying. J Consult Clin Psychol. 2002;70(2):428–432. doi: 10.1037//0022-006x.70.2.428. [DOI] [PubMed] [Google Scholar]
- Rothbaum BO, Anderso PL, Zimand E, Hodges L, Lang D, Wilson J. Virtual reality exposure therapy and standard (in vivo) exposure therapy in the treatment of fear of flying. Behav Ther. 2006;37(1):80–90. doi: 10.1016/j.beth.2005.04.004. [DOI] [PubMed] [Google Scholar]
- Roy S, Klinger E, Légeron P, Lauer F, Chemin I, Nugues P. Definition of a VR-based protocol to treat social phobia. Cyberpsychol Behav. 2003;6:411–420. doi: 10.1089/109493103322278808. [DOI] [PubMed] [Google Scholar]
- Rotariu DL, Ziyad F, Budu A, Poeata I. The role of OsiriX based virtual endoscopy in planning endoscopic transsphenoidal surgery for pituitary adenoma. Turk Neurosurg. 2017;27:339–345. doi: 10.5137/1019-5149.JTN.16311-15.2. [DOI] [PubMed] [Google Scholar]
- Harris SR, Kemmerling RL, North MM. Brief virtual reality therapy for public speaking anxiety. Cyberpsychol Behav. 2002;5:543–550. doi: 10.1089/109493102321018187. [DOI] [PubMed] [Google Scholar]
- Safavi K, Kalis B (2018) Extended reality trend 2 the end of distance. Digit Health Technol Vis 1–7
- Zweifach SM, Triola MM. Extended reality in medical education: driving adoption through provider-centered design. Digit Biomark. 2019;3:14–21. doi: 10.1159/000498923. [DOI] [PMC free article] [PubMed] [Google Scholar]
- Schwam GZ, Kaul VF, Bu DD, Iloreta AMC, Bederson JB, Perez E, Cosetti MK, Wanna GB. The utility of augmented reality in lateral skull base surgery: a preliminary report. Am J Otolaryngol-Head Neck Med Surg. 2021;42(102942):34–53. doi: 10.1016/j.amjoto.2021.102942. [DOI] [PubMed] [Google Scholar]
- Scolozzi P, Bijlenga P. Removal of recurrent intraorbital tumour using a system of augmented reality. Br J Oral Maxillofac Surg. 2017;55(9):962–964. doi: 10.1016/j.bjoms.2017.08.360. [DOI] [PubMed] [Google Scholar]
- Shen F, Chen B, Guo Q, Qi Y, Shen Y. Augmented reality patient-specific reconstruction plate design for pelvic and acetabular fracture surgery. Int J CARS. 2013;8(2):169–179. doi: 10.1007/s11548-012-0775-5. [DOI] [PubMed] [Google Scholar]
- Shirer M, Soohoo S (2020) Worldwide spending on augmented and virtual reality forecast to deliver strong growth through 2024, According to a New IDC Spending Guide. https://www.idc.com/getdoc.jsp?containerId=prUS47012020
- Silva JN, Southworth MK, Dalal A, Van Hare GF, Silva JR. Improving visualization and interaction during transcatheter ablation using an augmented reality system: first-in-human experience. Circulation. 2017;136:A15358. [Google Scholar]
- Singh RP, Javaid M, Kataria R, Tyagi M, Haleem A, Suman R. Significant applications of virtual reality for COVID-19 pandemic. Diabetes Metab Syndr: Clin Res Rev. 2020;14:661–664. doi: 10.1016/j.dsx.2020.05.011. [DOI] [PMC free article] [PubMed] [Google Scholar]
- Silva NAJ, Southworth M, Raptis C, Silva J. Emerging applications of virtual reality in cardiovascular medicine. JACC Basic Transl Sci. 2018;3(3):420–430. doi: 10.1016/j.jacbts.2017.11.009. [DOI] [PMC free article] [PubMed] [Google Scholar]
- Southworth KM, Silva JR, Silva JNA. Use of extended realities in cardiology. Trends Cardiovasc Med. 2020;30:143–148. doi: 10.1016/j.tcm.2019.04.005. [DOI] [PMC free article] [PubMed] [Google Scholar]
- Stanford Medicine, Neurosurgery. Stanford Neurosurgical Simulation and Virtual Reality Center (2019) [cited 2019 Jan 29]. http://med.stanford.edu/neurosurgery/divisions/vr-lab.html. Accessed 15 Aug 2021
- Stadie AT, Kockro RA. Mono-stereo-auto stereo: the evolution of 3-dimensional neurosurgical planning. Neurosurgery. 2013;72:A63–A77. doi: 10.1227/NEU.0b013e318270d310. [DOI] [PubMed] [Google Scholar]
- Stadie AT, Kockro RA, Reisch R. Virtual reality system for planning minimally invasive neurosurgery. J Neurosurg. 2008;108:382–394. doi: 10.3171/JNS/2008/108/2/0382. [DOI] [PubMed] [Google Scholar]
- Stadie AT, Kockro RA, Serra L. Neurosurgical craniotomy localization using a virtual reality planning system versus intraoperative image-guided navigation. Int J Comput Assist Radiol Surg. 2011;6:565–572. doi: 10.1007/s11548-010-0529-1. [DOI] [PubMed] [Google Scholar]
- Stanford Children’s Health, Stanford LPCsH, Lucile Packard Children’s Hospital Stanford pioneers use of VR for patient care, education, and experience. http://www.stanfordchildrens.org/en/about/news/releases/2017/virtual-reality-program. Accessed 11 Dec 2020
- State A, Chen DT, Tector C, Brandt A, Chen H, Ohbuchi R, Bajura M, Fuchs H (1994) Case study: observing a volume rendered fetus within a pregnant patient. In: Proceedings of the 1994 IEEE visualization conference, Washington
- Stefani C, Lacy-Hulbert A, Skillman T. ConfocalVR: immersive visualization for confocal microscopy. J Mol Biol. 2018;430:4028–4403. doi: 10.1016/j.jmb.2018.06.035. [DOI] [PMC free article] [PubMed] [Google Scholar]
- Strasburger H, Rentschler I, Jüttner M. Peripheral vision and pattern recognition: a review. J vis. 2011;11:13–25. doi: 10.1167/11.5.13. [DOI] [PMC free article] [PubMed] [Google Scholar]
- Suebnukarn S, Chaisombat M, Kongpunwijit T. Construct validity and expert benchmarking of the haptic virtual reality dental simulator. J Dent Educ. 2014;78(10):1442–1450. [PubMed] [Google Scholar]
- Sun GC, Wang F, Chen XL. Impact of virtual and augmented reality based on intraoperative magnetic resonance imaging and functional neuronavigation in glioma surgery involving eloquent areas. World Neurosurg. 2016;96:375–382. doi: 10.1016/j.wneu.2016.07.107. [DOI] [PubMed] [Google Scholar]
- Tabrizi LB, Mahvash M. Augmented reality guided neurosurgery: accuracy and intraoperative application of an image projection technique. J Neurosurg. 2015;123:206–211. doi: 10.3171/2014.9.JNS141001. [DOI] [PubMed] [Google Scholar]
- Texas surgeons perform first sinus surgery using AR. MobiHealthNews (2018). Accessed 9 Apr 2021
- Talbot H, Spadoni F, Duriez C, Sermesant M, O’Neill M, Jais P. Interactive training system for interventional electrocardiology procedures. Med Image Anal. 2017;35:225–237. doi: 10.1016/j.media.2016.06.040. [DOI] [PubMed] [Google Scholar]
- Tarnanas I, Wasserstrom J, Giotakos O. Using virtual reality emotional human agents as a relative—scored personality measure. J Cyberther Rehab. 2009;2(1):155–158. [Google Scholar]
- The Body VR (2021) Website: http://thebodyvr.com/anatomy-viewer/. Accessed 11 Apr 2021
- ThinkMobiles Team (2016) VR apps in medicine transforming healthcare we once knew. https://thinkmobiles.com/blog/virtual-reality-applicationsmedicine/. Accessed 22 July 2021
- The University of Illinois Helps Develop Revolutionary Virtual Reality for Learning. http://www.govtech.com/education/higher-ed/University-of-Illinois-Helps-Develop-Revolutionary-Virtual-Reality-for-Learning.html. Accessed 29 Jan 2021
- Theart RP, Loos B, Niesler TR. Virtual reality assisted microscopy data visualization and colocalization analysis. BMC Bioinformat. 2017;18(2):64–85. doi: 10.1186/s12859-016-1446-2. [DOI] [PMC free article] [PubMed] [Google Scholar]
- Urbankova A, Eber M, Engebretson SP. A complex haptic exercise to predict preclinical operative dentistry performance: a retrospective study. J Dent Educ. 2013;77(11):1443–1450. [PubMed] [Google Scholar]
- Venkatesan M, Mohan H, Ryan JR, Schurch CM, Nolan GP, Frakes DH, Coskun AF. Review virtual and augmented reality for biomedical applications. Cell Rep Med. 2021;2(100348):1–13. doi: 10.1016/j.xcrm.2021.100348. [DOI] [PMC free article] [PubMed] [Google Scholar]
- Vincelli F, Anolli L, Bouchard S, Wiederhold BK, Zurloni V, Riva G. Experiential cognitive therapy in the treatment of panic disorders with agoraphobia: a controlled study. Cyberpsychol Behav. 2003;6(3):321–328. doi: 10.1089/109493103322011632. [DOI] [PubMed] [Google Scholar]
- Monahan C, Ullberg L, Harvey K (2009) Virtual emergency preparedness planning using second life. In: Monahan C, Ullberg L, Harvey K (eds) IEEE/INFORMS international conference on service operations. Logistics and Informatics
- Virtual reality headsets might help cure genetic diseases. Futurism. Accessed 21 Apr 2021
- von der Heide AM, Fallavollita P, Wang L. Camera-augmented mobile Carm (CamC): a feasibility study of augmented reality imaging in the operating room. Int J Med Robot. 2018;14(2):1–8. doi: 10.1002/rcs.1885. [DOI] [PubMed] [Google Scholar]
- UCSF VR (2021) https://www.ucsf.edu/news/2017/09/408301/how-vr-revolutionizing-way-future-doctors-arelearning-about-our-bodies. Accessed 9 May 2021
- USC Institute for Creative Technologies website. Medical Virtual Reality. Accessed 5 Jun 2021
- VR could be your next painkiller. CNET (2018). Accessed 02 Jun 2021
- Wald J, Taylor S. Efficacy of virtual reality exposure therapy to treat driving phobia: a case report. J Behav Ther Exp Ther. 2000;31:249–257. doi: 10.1016/s0005-7916(01)00009-x. [DOI] [PubMed] [Google Scholar]
- Wang SS, Zhang SM, Jing JJ. Stereoscopic virtual reality models for planning tumor resection in the sellar region. BMC Neurol. 2012;12:146–172. doi: 10.1186/1471-2377-12-146. [DOI] [PMC free article] [PubMed] [Google Scholar]
- Wang Y, Li Q, Liu L, Zhou Z, Ruan Z, Kong L, Li Y, Wang Y, Zhong N, Chai R. TeraVR empowers precise reconstruction of complete 3-D neuronal morphology in the whole brain. Nat Commun. 2019;10:3474–3487. doi: 10.1038/s41467-019-11443-y. [DOI] [PMC free article] [PubMed] [Google Scholar]
- Watanabe E, Konno SM, Hirai TM, Yamaguchi T. The trans-visible navigator: a seethrough neuronavigation system using augmented reality. World Neurosurg. 2016;87:399–405. doi: 10.1016/j.wneu.2015.11.084. [DOI] [PubMed] [Google Scholar]
- Widmann G, Schullian P, Ortler M, Bale R. Frameless stereotactic targeting devices: technical features, targeting errors and clinical results. Int J Med Robot Comput Assist Surg. 2012;8:1–16. doi: 10.1002/rcs.441. [DOI] [PubMed] [Google Scholar]
- Wiederhold BK, Gevirtz R, Spira JL (2001) Virtual reality exposure therapy vs imagery desensitization therapy in the treatment of flying phobia. Towards CyberPsychol: Mind, Cogn, Soc 253–272
- Wired M (2017) EchoPixel announces progress in the clinical adoption of interactive virtual reality for pediatric surgery. http://www.marketwired.com/press-release/echopixel-announces-progress-clinical-adoption-interactive-virtual-reality-pediatric-2202796.htm. Accessed 23 Apr 2021
- Workman S (2018) Mixed reality: a revolutionary breakthrough in teaching and learning. EDUCAUSE Rev. https://er.educause.edu/articles/2018/7/mixed-reality-a-revolutionary-breakthrough-inteaching-and-learning
- Wu JR, Wang ML, Liu KC, Hu MH, Lee PY. Real-time advanced spinal surgery via visible patient model and augmented reality system. Comput Methods Progr Biomed. 2014;113(3):869–881. doi: 10.1016/j.cmpb.2013.12.021. [DOI] [PubMed] [Google Scholar]
- Wurst S. Extended reality in life sciences and healthcare. Solut, Mark Moves, Oppor. 2020;1:1–13. [Google Scholar]
- XRHealth, VR Telehealth (2021) https://www.xr.health/. Accessed 23 May 2021
- Yamada H, Nakaoka K, Sonoyama T. Clinical usefulness of mandibular reconstruction using custom-made titanium mesh tray and autogenous particulate cancellous bone and marrow harvested from tibia and/or ilia. J Craniofac Surg. 2016;27(3):586–592. doi: 10.1097/SCS.0000000000002472. [DOI] [PubMed] [Google Scholar]
- Yang DL, Xu QW, Che XM, Wu JS, Sun B. Clinical evaluation and follow-up outcome of the presurgical plan by Dextroscope: a prospective controlled study in patients with skull base tumors. Surg Neurol. 2009;72:682–689. doi: 10.1016/j.surneu.2009.07.040. [DOI] [PubMed] [Google Scholar]
- Yilmaz RM. Educational magic toys developed with augmented reality technology for early childhood education. Comput Human Behav. 2016;54:240–248. [Google Scholar]
- Zhao Y, Chen X, Wang F. Integration of diffusion tensor-based arcuate fasciculus fiber navigation and intraoperative MRI into glioma surgery. J Clin Neurosci. 2012;19:255–261. doi: 10.1016/j.jocn.2011.03.041. [DOI] [PubMed] [Google Scholar]
- Zhou J, Gong J, Li W (eds) (2006) Human daily behavior based simulation for epidemic transmission: a case study of SARS. In: 16th international conference on artificial reality and telexistence-workshops (ICAT’06). IEEE
- Zinser MJ, Mischkowski RA, Dreiseidler T. Computer-assisted orthognathic surgery: Waferless maxillary positioning, versatility, and accuracy of an image-guided visualization display. Br J Oral Maxillofac Surg. 2013;51(8):827–833. doi: 10.1016/j.bjoms.2013.06.014. [DOI] [PubMed] [Google Scholar]




