NPJ Digital Medicine. 2025 May 15;8:283. doi: 10.1038/s41746-025-01575-5

Digital twins for the era of personalized surgery

Yosra Magdi Mekki 1, Gijs Luijten 2,3,4, Elisabet Hagert 1,5,6, Sirajeddin Belkhair 1,7,8, Chris Varghese 9, Junaid Qadir 10, Barry Solaiman 8,11, Muhammad Bilal 12, Jaghtar Dhanda 13, Jan Egger 2,3,4,14, Jun Deng 15, Vikas Khanduja 16, Alejandro F Frangi 17,18,19,20,21,22, Susu M Zughaier 1, Mitchell A Stotland 8,23
PMCID: PMC12081715  PMID: 40374901

Abstract

Digital twins can aid surgeons in training and in performing interventions with greater awareness and precision. The range and variety of digital twins in surgery are described, and their use across perioperative care is discussed. While largely experimental, they are beginning to show promise for the enhancement of personalized, adaptive, and data-driven surgical care. Issues relevant to the broader adoption and deployment of digital twins are also considered.

Subject terms: Clinical trials, Computational models, Health occupations

Introduction

A digital twin in surgery is a dynamic virtual replica of an individual’s physical and physiological state, integrating both bodily systems and healthcare interactions. It combines fundamental scientific principles with data-driven modeling to create a synchronized replica that updates at defined intervals and precision levels1–6. This sophisticated digital counterpart serves as a powerful tool for surgical training and planning, pre- and intra-operative decision-making, and outcome prediction. When employed within the operating room, a high-fidelity digital twin reflects patient anatomy and/or physiology and may be able to integrate real-time data. Note that the term ‘digital twin’ is sometimes used to describe only more advanced or comprehensive models. However, in this work, we use the term to refer to data-driven virtual replicas of real-world entities and processes, synchronized at specified intervals and levels of fidelity, as defined by the Digital Twin Consortium5 and other leading organizations6 (Fig. 1).

Fig. 1. Three components of a true digital twin.


The three components of an ideal digital twin include: (1) Physical patient, (2) Virtual replica, (3) Bidirectional data exchange. This continuous exchange, whilst not applied widely, hypothetically enables a digital twin to be individualized, interconnected, interactive, informative and impactful2.
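To make the bidirectional exchange concrete, the following minimal Python sketch (illustrative only, not drawn from any cited system; all class, field, and parameter names are hypothetical) shows a virtual replica that assimilates patient measurements at a fixed synchronization interval and returns guidance back to the care team.

```python
# Minimal conceptual sketch of the three components in Fig. 1: a physical patient stream,
# a virtual replica, and bidirectional data exchange. All names are hypothetical.
from dataclasses import dataclass, field
import time


@dataclass
class VirtualReplica:
    """Virtual state kept in sync with the physical patient."""
    state: dict = field(default_factory=dict)

    def update(self, measurements: dict) -> None:
        # Physical -> virtual: assimilate the latest measurements.
        self.state.update(measurements)

    def recommend(self) -> dict:
        # Virtual -> physical: return guidance derived from the current state.
        hr = self.state.get("heart_rate_bpm")
        return {"alert": hr is not None and hr > 120}


def sync_loop(read_sensors, replica: VirtualReplica, sync_interval_s: float, cycles: int):
    """Synchronize at a fixed interval and fidelity, as in the consortium-style definition."""
    for _ in range(cycles):
        replica.update(read_sensors())      # physical -> virtual
        guidance = replica.recommend()      # virtual -> physical
        print(guidance)
        time.sleep(sync_interval_s)


if __name__ == "__main__":
    fake_sensors = lambda: {"heart_rate_bpm": 96}   # stand-in for a monitoring feed
    sync_loop(fake_sensors, VirtualReplica(), sync_interval_s=0.1, cycles=3)
```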

Depending on its complexity, a digital twin may require established technologies such as 3D image segmentation and stereolithographic printing, or emerging innovations including computer vision, signal processing, extended reality (XR), and real-time artificial intelligence (AI)-powered decision support.

To fully appreciate the potential of digital twins in surgery, it is essential to understand their evolution from early advancements in computer-assisted surgery (CAS), which laid the foundation for the complex digital twins used today7–9. The development of 3D modeling advanced the field of virtual surgical planning (VSP), enabled by advances in medical imaging technologies like CT and MRI10, as well as software capable of rendering 3D images from these datasets11. The integration of virtual reality (VR) simulations opened new possibilities for surgical planning, allowing surgeons to immerse themselves in a virtual operating room and interact with 3D models of the patient’s anatomy12. Looking ahead, augmented reality (AR) represents the next evolution in this continuum13, offering the potential to further enhance individualized, procedure-specific planning and delivery to the operating table, building on the foundation laid by CAS, VSP, and VR.

According to Katsoulakis et al., there are four levels of digital twins, extending from (1) simple, static twins, to (2) functional (mirror) twins, to (3) shadow (self-adaptive) twins, to (4) intelligent twins2,14,15. We explore the application of these digital twin types in surgical contexts throughout the article (Table 1).

Table 1.

Types of digital twins in surgical applications

Type | Example | Application
Static twin | 3D model of conjoined twins for separation surgery planning | Preoperative planning and patient education
Functional twin | FEA model for tibial fractures and vertebroplasty procedures | Simulating biomechanical properties and predicting surgical outcomes
Shadow twin | Twin-S for skull base surgery and liver tumor ablation | Real-time guidance and dynamic updates during surgery
Intelligent twin | CardioVision for aortic stenosis and DLR MiroSurge robotic system | Predicting adverse events and providing individualized surgical assistance

This table summarizes the four levels of digital twins, their examples, and their applications in surgical contexts.

A static twin is a digital replica of a patient’s anatomy or of a physical object that does not change over time. Static twins are useful for preoperative planning, patient discussions, and as an intraoperative reference. A 3D model of conjoined twins (omphalopagus) used to plan separation surgery is a good example of a static digital twin (Fig. 2).

Fig. 2. 3D static twins for conjoined twin separation.


This 3D printed twin, based on medical imaging data, provides a tangible representation of the twins’ shared anatomy (omphalopagus), facilitating surgical planning and improving communication with the medical team and family (credit: Dr. Mitchell A. Stotland, Sidra Medicine).

A functional twin simulates how anatomical structures behave under various conditions, helping to predict the impact of surgical interventions. For instance, finite element analysis (FEA) can predict tibial fracture risk in osteoporosis patients by simulating stress distribution, fracture stability, and bone healing16. Some functional twins incorporate intelligent capabilities, such as deep learning models that predict vertebral fracture responses17 and link them to procedural parameters.
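As a concrete but deliberately simplified illustration of the stress analysis a functional twin performs, the following Python sketch assembles a one-dimensional finite element model of an axially loaded bone segment. The geometry, material properties, and load are illustrative assumptions, not the validated, patient-specific 3D models used in the cited work.

```python
# Simplified 1D finite element sketch of an axially loaded bone segment.
# All values below are illustrative, not patient-specific.
import numpy as np

n_elems = 10                       # bar discretized into 10 two-node elements
length = 0.40                      # m, approximate segment length (illustrative)
area = 3.0e-4                      # m^2, cross-sectional area (illustrative)
E = 17e9                           # Pa, cortical bone Young's modulus (typical order)
load = 700.0                       # N, axial load (roughly body weight)

le = length / n_elems
k_local = (E * area / le) * np.array([[1.0, -1.0], [-1.0, 1.0]])

# Assemble the global stiffness matrix.
K = np.zeros((n_elems + 1, n_elems + 1))
for e in range(n_elems):
    K[e:e + 2, e:e + 2] += k_local

# Boundary conditions: node 0 fixed, axial load applied at the distal node.
f = np.zeros(n_elems + 1)
f[-1] = load
u = np.zeros(n_elems + 1)
u[1:] = np.linalg.solve(K[1:, 1:], f[1:])   # solve the reduced system

# Element stresses: sigma = E * strain.
strain = np.diff(u) / le
stress = E * strain
print(f"peak axial stress: {stress.max() / 1e6:.2f} MPa")
```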

A shadow twin integrates real-time data from sensors or imaging to dynamically update during surgery, adapting to changes like tissue shifting or bleeding. For example, Twin-S18 for skull base surgery uses optical tracking to create a real-time, physics-based simulation of surgical tools and patient anatomy. This provides surgeons with accurate intraoperative guidance, enhancing precision and safety. Holographic augmented reality (AR) further improves visualization of complex structures.

An intelligent twin uses advanced algorithms and/or machine learning to predict outcomes, assess risks, and guide decision-making. For example, CardioVision (CV)19 analyzes calcification distribution in aortic stenosis patients, predicting adverse events and recommending surgical approaches. These twins go beyond mirroring anatomy, offering predictive insights and real-time decision support.

Depending on the level of surgical digital twin one intends to build, relevant considerations include the ability to do one or more of the following (a schematic sketch of this checklist follows the list):

1. acquire high-resolution imaging of the anatomic region of interest;
2. obtain comprehensive data for the physiologic systems of interest;
3. establish real-time patient monitoring in two-way communication with the digital twin (for shadow and intelligent twins);
4. access high-performance computing systems that can process data instantaneously and run AI models; and
5. simulate the effects of interventions/tasks before they are executed.
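The following Python sketch encodes one possible mapping of these considerations onto the twin levels in Table 1. The mapping is indicative only, inferred from the discussion above, and is not a formal requirement set.

```python
# Hypothetical capability checklist per twin level; field names follow the list above
# and the level names follow Table 1, but the mapping itself is only indicative.
from dataclasses import dataclass


@dataclass(frozen=True)
class TwinRequirements:
    level: str                      # "static", "functional", "shadow", "intelligent"
    high_res_imaging: bool
    physiologic_data: bool
    real_time_monitoring: bool      # two-way link with the patient (shadow/intelligent)
    hpc_and_ai: bool
    pre_execution_simulation: bool


REQUIREMENTS = {
    "static": TwinRequirements("static", True, False, False, False, False),
    "functional": TwinRequirements("functional", True, True, False, False, True),
    "shadow": TwinRequirements("shadow", True, True, True, True, True),
    "intelligent": TwinRequirements("intelligent", True, True, True, True, True),
}

print(REQUIREMENTS["shadow"])
```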

Personalized digital twins across surgical care

Digital twins are evolving as promising tools across various phases of surgical care, from patient education to postoperative monitoring. However, much of their current use is still in the experimental or early development stages, often building upon more established static twins and VR/AR technology. Within each sub-section, we focus on the levels of digital twins most relevant to specific use cases, including patient education, surgical training, preoperative planning, intraoperative guidance, in silico trials, and postoperative care.

Patient education

Static twins

Static twins can help patients become active participants in their healthcare through patient education. At Stanford Medicine’s Neurosurgical Simulation Lab, individualized static twins—from basic overviews to step-by-step walkthroughs—enable patients to understand surgical plans and potential adjustments, empowering them to make informed decisions20. These visualizations are currently demonstrated in the lab setting and have shown promise in improving patient engagement. In a recent neurosurgical pilot study, static VR twins of patient-specific pathologies (e.g., aneurysms, meningiomas) were used to enhance preoperative informed consent, improving patient understanding and satisfaction without requiring real-time data integration21. Similarly, at Aspetar Sports and Orthopaedic Hospital, virtual reality static twins built from patient MRI scans are being trialed to encourage coproduction with athletes22. The metaverse also holds potential for patient education and for creating interactive environments for learning and collaboration, but much of this remains conceptual and in early development23–25.

Surgical training

Static twins

Virtual environments allow trainees to practice complex procedures without risk, especially for rare or challenging cases, building experience and confidence26. Emerging technologies like artificial intelligence (AI), simulation, and extended reality (XR) are reshaping medical education and demonstrably improve learning outcomes27–29.

Generic static twins, based on standard anatomical representations, provide a valuable foundation for learning basic surgical techniques. However, they lack the specificity of patient-specific static twins, which are crucial for planning and executing surgical procedures.

The integration of augmented reality (AR) and virtual reality (VR) technologies is transforming medical education. A review of HoloLens usage in medical education and interventions highlights its promise for AR-enhanced simulators and its potential to improve student learning outcomes28. The recent availability of three-dimensional datasets of medical instruments, scanned for use in medical mixed reality (MMR), can further enhance simulation environments, supporting both training and tool tracking in augmented reality settings30.

The effectiveness of VR simulators in surgical skill acquisition has been well documented. A study of a hip arthroscopy simulator showed significant improvement across all metrics after three training sessions, indicating that virtual environments with visuo-haptic feedback can effectively develop basic arthroscopic skills31. Another study confirmed the simulator’s effectiveness, showing that greater experience led to better performance, supporting its construct validity32. The simulator also demonstrated sufficient face validity, with adequate visual and tactile realism33. In orthopedic surgery, students who used a VR simulator were more likely to engage with further training opportunities, even without direct supervision34.

Despite these advancements, surgical training in lower- and middle-income countries faces significant challenges, including limited resources and access to mentorship. Addressing these barriers is crucial for improving global surgical care, as emphasized by the Lancet Commission on Global Surgery35. Programs like Virtual Reality in Medicine and Surgery (VRiMS) use functional twins and extended reality (XR) to deliver quality training through low-cost devices, helping to bridge the surgical access gap36.

Data integration from multiple sources, such as imaging, monitoring devices, and surgical instruments, is currently being demonstrated in experimental settings but is not yet widely available30,37. Advanced haptic feedback, which could simulate tissue textures to give trainees a realistic sense of surgical conditions, is an area of ongoing research, with some promising prototypes but not yet ready for surgical education38. Educators could eventually use shadow twins to create detailed case studies for independent learning, providing a depth of experience that traditional simulators cannot achieve. However, many of these features are still in the developmental phase, and their full realization may take additional time and validation.

Preoperative phase

The preoperative phase often faces challenges related to standardizing protocols and reducing reliance on experiential intuitions39.

Static twins

Static twins enable surgeons to explore patient anatomy in virtual reality, optimizing surgical approaches and minimizing tissue damage. The integration of static twins into preoperative planning is increasingly common, with certified software available from companies like Axial3D40 and Cella Medical41. In colorectal surgery, static twins integrate CT and MRI imaging to create detailed models of the colon, tumors, and critical structures like blood vessels, helping surgeons plan precise resection strategies42. Similarly, in hepato-pancreatobiliary (HPB) surgery, static twins allow surgeons to simulate resection strategies, maximizing tumor removal while preserving liver function and predicting postoperative tissue viability7. In partial nephrectomy, static twins provide detailed 3D models of the kidney, tumor, and vasculature, enabling surgeons to determine the optimal cutting plane for tumor removal while preserving healthy tissue7.
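For readers interested in how such models are produced, the following Python sketch shows the core imaging-to-mesh step behind a static twin, using a synthetic volume in place of real CT data. Certified clinical pipelines such as those cited add registration, segmentation refinement, and quality-assurance steps not shown here.

```python
# Minimal sketch of the imaging-to-model step behind a static twin: threshold a CT-like
# volume and extract a triangular surface mesh. A synthetic sphere stands in for real
# DICOM data; intensities and the iso-value are illustrative.
import numpy as np
from skimage import measure

# Synthetic "CT" volume: a sphere of high intensity inside a low-intensity background.
grid = np.indices((64, 64, 64)).astype(float)
dist = np.sqrt(((grid - 32.0) ** 2).sum(axis=0))
volume = np.where(dist < 20.0, 400.0, -100.0)      # Hounsfield-like values (illustrative)

# Surface extraction at a chosen iso-value, analogous to bone/soft-tissue thresholding.
verts, faces, normals, values = measure.marching_cubes(volume, level=150.0)
print(f"mesh: {len(verts)} vertices, {len(faces)} triangles")
```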

Functional twins

As of now, functional twins are being tested in orthopedic surgery, allowing virtual examination of implants before operations to determine the best stabilization methods16. They can also simulate procedures like vertebroplasty to predict fracture risks in cancer patients17. However, their use for risk prediction and simulating complications remains primarily in research. Future simulations may incorporate dynamic elements like blood flow and organ movement, providing deeper insights into patient physiology. For complex cardiac procedures like transcatheter aortic valve replacement (TAVR)43, functional twins such as HeartNavigator are used to simulate valve implantation, exploring different prostheses and implant depths to guide surgical planning. Functional twins updated with live imaging data could guide ablation procedures (e.g., radiofrequency or cryoablation) by mapping precise tumor margins and thermal spread, thereby minimizing collateral damage to adjacent tissues while ensuring complete tumor eradication7.

Intelligent twins

Intelligent twins such as CardioVision (CV)19, a fully automated deep learning tool, analyze calcification distribution in patients with severe aortic stenosis, helping clinicians predict adverse events, select the most appropriate surgical approach, and make timely, informed decisions about treatment. Similarly, in thermal ablation therapy for liver tumors, intelligent twins are advancing precision through real-time predictive modeling44. For instance, the integration of holographic augmented reality (AR) with a dynamic digital twin enables 3D navigation of moving tumors and vessels by forecasting internal anatomical shifts (e.g., respiration-induced liver motion). This system employs an external/internal correlation model to estimate target positions with sub-3 mm accuracy while compensating for computational delays, ensuring seamless real-time guidance during ablation. By predicting thermal spread and mapping critical structures like blood vessels, it empowers surgeons to minimize collateral damage and optimize tumor targeting, akin to how CV refines cardiac interventions through risk stratification.
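The external/internal correlation idea can be illustrated with a short Python sketch: an internal target position is regressed against an external surrogate signal, and the surrogate is extrapolated forward by the system latency before being mapped through the fitted model. The signals, amplitudes, and the 150 ms latency below are synthetic assumptions rather than parameters of the cited system.

```python
# Illustrative external/internal correlation model with latency compensation.
import numpy as np

fs = 25.0                                   # Hz, tracking rate (assumed)
latency_s = 0.150                           # s, computational/display delay (assumed)
t = np.arange(0, 30, 1 / fs)
rng = np.random.default_rng(0)

external = 10.0 * np.sin(2 * np.pi * 0.25 * t)                       # mm, chest surrogate
internal = 1.8 * external + 3.0 + rng.normal(0, 0.3, t.size)         # mm, target motion

# 1) Correlation model: least-squares fit internal = a * external + b.
A = np.column_stack([external, np.ones_like(external)])
a, b = np.linalg.lstsq(A, internal, rcond=None)[0]

# 2) Latency compensation: extrapolate the surrogate forward with its local velocity,
#    then map through the fitted model.
velocity = np.gradient(external, 1 / fs)
external_ahead = external + velocity * latency_s
predicted = a * external_ahead + b

# Compare against the true internal position shifted by (approximately) the latency.
shift = int(round(latency_s * fs))
err = np.abs(predicted[:-shift] - internal[shift:])
print(f"mean |error| after latency compensation: {err.mean():.2f} mm")
```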

Intraoperative phase

The intraoperative phase provides valuable data, including physiological parameters, anatomical changes, and environmental factors.

Static twins

Advances in computer vision enable detailed analysis of anatomy and tissue characteristics, supporting the flow of information between static twins and surgeons2,45,46. In 2021, a multidisciplinary Chinese team described the conceptualization and creation of static scan overlays within an AR navigation system used in the contouring procedure for craniofacial fibrous dysplasia (Fig. 3)47.

Fig. 3. A surgical AR navigation system.


The AR navigation system in surgery includes a digital reference frame fixed to the patient’s skull, a surgical drill with a digital reference, an optical tracking system, and a head-mounted display (HMD) (top panel). Surgeons use the HMD for 3D virtual planning and real-time guidance, with the display changing color as the drill approaches the target. Credit: Liu et al.47.

The study involved five patients and showed that AR-assisted surgery was successful, with minimal errors, no complications, and high patient satisfaction. Following rigorous validation and commercialization of platforms like NextAR, these tools are now accessible to surgeons worldwide, enabling cross-regional collaboration and innovation. For example, a Qatari team built on this progress by performing the first AR-navigated spinal surgery in the Middle East48, successfully removing a complex spine tumor (Fig. 4).
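The distance-based color feedback described for the head-mounted display can be sketched as follows. The transforms, thresholds, and frame names are illustrative assumptions rather than parameters of the published systems.

```python
# Sketch of distance-to-target feedback for an AR navigation display: the tracked drill
# tip and the planned target are expressed in the patient reference frame, and the
# displayed color changes as the tip approaches. Values are illustrative.
import numpy as np

def to_reference_frame(T_ref_tool: np.ndarray, p_tool: np.ndarray) -> np.ndarray:
    """Map a point from tool coordinates into the patient reference frame
    using a 4x4 homogeneous transform reported by the optical tracker."""
    return (T_ref_tool @ np.append(p_tool, 1.0))[:3]

def guidance_color(tip_ref: np.ndarray, target_ref: np.ndarray) -> str:
    d = np.linalg.norm(tip_ref - target_ref)          # mm
    if d < 2.0:
        return "green"       # on target
    if d < 10.0:
        return "yellow"      # approaching
    return "red"             # far from target

# Example: drill tip 5 mm along the tool axis, tool pose reported by the tracker.
T_ref_drill = np.eye(4)
T_ref_drill[:3, 3] = [30.0, 12.0, -4.0]               # mm translation (made up)
tip = to_reference_frame(T_ref_drill, np.array([0.0, 0.0, 5.0]))
print(guidance_color(tip, target_ref=np.array([32.0, 12.0, 0.0])))
```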

Fig. 4. Spinal surgery using an augmented reality (AR) tool.


This approach enables neurosurgeons to overlay static CT and MRI scans onto the patient’s spine for increased precision. The procedure led to the successful removal of a cancerous spinal tumor (credit: Prof. Sirajeddin Belkhair, Hamad Medical Corporation).

During deep inferior epigastric perforator (DIEP) flap harvest, augmented reality (AR) has been used to visualize patient-specific anatomy directly, enhancing identification and localization of key structures, though this is not yet routine37. AR also aids in the precise identification, dissection, and execution of vascular flaps in reconstructive surgery45. Much of the data generated during these procedures remains unanalyzed46, presenting an opportunity for improved surgical care through data-driven insights.

Shadow twins

More advanced shadow twins, which can adapt in real time, are still experimental. Some commercial platforms, such as Medivis49, are working towards converting complex 2D imaging into real-time shadow twins superimposed onto the patient via AR. These platforms are commercially available and documented in single-center case experiences, but lack trial-based assessment50.

Unlike commercial platforms, Twin-S18 is purely an experimental tool designed to explore the potential of real-time surgical guidance. As described above, it integrates optical tracking data and calibration techniques to generate a real-time, physics-based digital replica of the surgical field, with holographic AR further improving the visualization of complex structures. The system tracks the positions of the surgical drill (d), phantom (p), and camera (c) relative to the optical tracker (o). By recovering, for each tracked object, the transformation from the tracker to the object’s base coordinates (oF_xb) and from the base to the model coordinates (xbF_x), Twin-S ensures accurate alignment between the real and virtual entities.
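The following Python sketch illustrates the kind of transform chaining this alignment relies on: tracker-reported poses for the camera and an object’s base are composed with a fixed base-to-model calibration to place the virtual model in the camera view. The numeric poses are invented for illustration.

```python
# Sketch of transform chaining for real-to-virtual alignment. Poses reported by the
# optical tracker (o) for the camera (c) and an object's base (xb) are combined with a
# fixed base-to-model calibration (xb -> x). Numbers are made up for illustration.
import numpy as np

def rigid(rz_deg: float, t) -> np.ndarray:
    """Build a 4x4 homogeneous transform: rotation about z plus a translation."""
    a = np.deg2rad(rz_deg)
    T = np.eye(4)
    T[:3, :3] = [[np.cos(a), -np.sin(a), 0], [np.sin(a), np.cos(a), 0], [0, 0, 1]]
    T[:3, 3] = t
    return T

o_F_c  = rigid(10.0, [200.0, 50.0, 0.0])    # camera pose in tracker frame (tracked)
o_F_xb = rigid(-5.0, [180.0, 60.0, 30.0])   # object base pose in tracker frame (tracked)
xb_F_x = rigid(0.0,  [2.0, -1.0, 0.5])      # base-to-model calibration (pre-computed)

# Model pose in the camera frame: c_F_x = (o_F_c)^-1 @ o_F_xb @ xb_F_x.
c_F_x = np.linalg.inv(o_F_c) @ o_F_xb @ xb_F_x
print(np.round(c_F_x, 2))
```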

AR technology presents challenges, such as ergonomic issues with extended headset use, as well as persistent limitations in XR overlay accuracy and latency13. For example, in percutaneous procedures such as k-wire pinning, hand surgeons might use AR headsets to overlay a shadow twin of the patient’s anatomy in real time, eliminating the need for repeated fluoroscopy and reducing radiation exposure in the OR. As this technology evolves and incorporates more predictive capabilities, it could advance towards intelligent surgical twins.

Intelligent twins

Integrating remote monitoring systems to update intelligent twins continuously and leveraging high-performance computing for real-time data processing could further support planning, outcome prediction, and simulations2. For example, predictive modeling may help avoid accidental incisions using in-painting techniques51,52.

An intelligent twin could dynamically map tissue-resection pathways using biomechanical modeling to avoid inadvertent damage to critical structures (e.g., nerves, vasculature). Concurrently, patient-specific hemodynamic models within the twin would simulate blood flow alterations in response to surgical actions, such as vessel clamping or partial organ resection, enabling predictive alerts for anticipated blood loss7,53,54.
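A toy lumped-parameter example conveys the kind of hemodynamic reasoning involved: with flow approximated as Q = dP / R, clamping a feeding vessel changes the resistance of the path supplying an open resection surface and therefore the predicted loss. All values below are illustrative assumptions, not outputs of the cited models.

```python
# Toy lumped-parameter sketch: flow follows Q = dP / R, so clamping a feeding vessel
# (removing its path) changes the predicted loss from an open resection surface.
dP = 80.0                 # mmHg, perfusion pressure across the organ (assumed)
R_branch_b = 12.0         # mmHg per mL/s, feeding branch supplying the surface (assumed)
R_leak = 40.0             # mmHg per mL/s, open resection surface in series (assumed)

def flow(resistance_mmHg_per_mL_s: float) -> float:
    """Flow in mL/s through a resistive path under the perfusion pressure dP."""
    return dP / resistance_mmHg_per_mL_s

def predicted_loss(clamp_branch: bool, minutes: float) -> float:
    """Estimated loss in mL over a period, with or without the feeding branch clamped."""
    if clamp_branch:
        return 0.0                          # path removed; no flow to the leak
    return flow(R_branch_b + R_leak) * 60.0 * minutes

print(f"unclamped, 10 min: {predicted_loss(False, 10):.0f} mL estimated loss")
print(f"clamped,   10 min: {predicted_loss(True, 10):.0f} mL estimated loss")
```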

In the example of percutaneous orthopedic procedures, an intelligent twin could further leverage predictive modeling to continuously refine its recommendations based on intraoperative tool positioning, tissue deformation, and force feedback, creating a closed-loop system that anticipates optimal trajectories while avoiding critical structures (e.g., nerves, vasculature). This would ensure millimetric accuracy by aligning the intelligent twin’s predictions with the patient’s evolving physical state.

In-silico trials

Intelligent twins

In-silico trials use computer simulations to evaluate surgical interventions quickly and cost-effectively. They could provide precise safety risk forecasting of medical devices, reduce the need for human and animal testing, and improve research efficiency, lowering costs while broadening diversity in device testing. Model credibility is crucial, and assessment frameworks are emerging from regulators55,56. Computational modeling is also being incorporated into clinical evidence guidelines, though these frameworks are still being fully established57.

Intelligent twins allow researchers to optimize designs and identify failure modes virtually. By simulating different patient scenarios, researchers can explore clinical contexts and design options for orthopedic implants58. In-silico trials for neurovascular devices have successfully used intelligent twins to evaluate implant effectiveness and identify potential failure modes. Although still experimental, these applications are paving the way for future regulatory adoption59.
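The virtual-cohort principle behind such trials can be sketched in a few lines of Python: sample patient parameters, evaluate a surrogate performance model for each virtual patient, and summarize the failure rate. The parameter distributions and failure criterion below are invented for illustration and are not taken from the cited studies.

```python
# Minimal virtual-cohort sketch for an in-silico trial. Distributions and the failure
# criterion are invented for illustration only.
import numpy as np

rng = np.random.default_rng(0)
n_patients = 10_000

# Virtual cohort: bone mineral density (g/cm^3) and applied implant load (N).
bmd = rng.normal(0.95, 0.15, n_patients).clip(0.5, 1.4)
load = rng.normal(2000.0, 400.0, n_patients)

# Surrogate model: implant anchorage strength scales with density (illustrative).
strength = 3000.0 * bmd

failures = load > strength
print(f"predicted failure rate: {failures.mean() * 100:.1f}% "
      f"({failures.sum()} of {n_patients} virtual patients)")
```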

Functional twins

Functional digital twins have also been utilized to emulate percutaneous coronary revascularization using routinely collected data60. Virtual treatment of aortic stenoses shows potential for predicting post-interventional hemodynamics, but remains in the experimental stages with limited clinical evidence43. Functional twins of the knee have been used to study wear in total knee replacement devices61. While they cannot yet fully replace human trials, they serve as a valuable complementary tool for gathering regulatory evidence62.

Postoperative care

Functional twins

The postoperative phase often lacks data-driven decision-making. Functional twins have the potential to bridge this gap by transforming raw data into actionable insights, which could eventually enable personalized rehabilitation plans. They have been explored for providing tailored postoperative care in tibial fractures16.

Intelligent twins

Intelligent twins take functionality a step further by offering continuous feedback from implantable sensors and biometric devices on recovery and complications, which informs postoperative care and the adjustment of rehabilitation plans, links surgical planning, execution, and outcome, and allows clinicians to intervene promptly. Postoperatively, intelligent twins can also facilitate the generation of patient- and case-specific documentation that can contribute to the development of a virtual surgery database7,63.

“Healthcasts”—conceptual real-time health broadcasts—could further enhance recovery monitoring, though they remain in early development64. When integrated with telemedicine, healthcasts could support remote monitoring after discharge. Intelligent twins could also integrate predictive analytics to forecast potential complications, such as delayed healing or implant failure, as well as provide real-time insights into vascular health and recovery progress after surgical interventions like EVAR, ultimately improving patient outcomes65.
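A minimal sketch of this feedback loop is shown below: sensor readings stream into the twin, and a sustained deviation from baseline triggers an alert for clinician review. The signal, window length, and threshold are assumptions chosen for illustration.

```python
# Illustrative postoperative monitoring loop: readings stream into the twin and a
# sustained deviation from baseline is flagged. Values are illustrative assumptions.
from collections import deque
from statistics import mean

class RecoveryMonitor:
    def __init__(self, baseline_c: float = 36.8, window: int = 6, alert_delta: float = 0.8):
        self.baseline_c = baseline_c
        self.alert_delta = alert_delta
        self.readings = deque(maxlen=window)   # rolling window of temperatures (deg C)

    def ingest(self, temperature_c: float) -> bool:
        """Add a reading; return True if the rolling mean suggests a complication."""
        self.readings.append(temperature_c)
        return mean(self.readings) - self.baseline_c > self.alert_delta

monitor = RecoveryMonitor()
for t in [36.9, 37.1, 37.6, 38.1, 38.3, 38.4]:     # simulated wearable readings
    if monitor.ingest(t):
        print(f"alert clinician: sustained elevation, latest reading {t} C")
```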

Call to action

Digital twins have the potential to transform personalized surgery, but several challenges must be addressed before the technology can be effectively implemented at the bedside:

1. Ensure data quality and fairness: We need to make sure the data used for digital twins is accurate and representative. Underrepresented populations must be included to prevent unequal outcomes. Privacy and security of patient data are also crucial, and we need to address issues like data ownership and accountability for errors43,44.
2. Put patients first: Digital twins promise great precision, but we must not let technology overshadow the patient’s voice. The goal should always be to use digital twins to support human care, not replace it. Doctors and patients need to be at the center of decision-making, with digital twins acting as helpful tools66,67.
3. Set standards and validate effectively: We need clear standards and protocols to ensure digital twins meet implementation requirements. Verification, validation, and uncertainty quantification are critical steps to make sure digital twins are safe and effective3,68.
4. Overcome barriers to adoption: To bring digital twins into everyday healthcare, we need to solve issues like data exchange, interoperability, and high costs. Replacing expensive operating room equipment with more affordable XR headsets and software could be a long-term solution. We need to prove that digital twins can reduce costs and improve patient outcomes, especially in hospitals with limited resources69,70.
Bringing digital twins to the operating room

To bring digital twins into practical use for personalized surgery, consider the following:

1. Conduct rigorous clinical trials and validation studies.
2. Integrate with healthcare systems to demonstrate cost-effectiveness, particularly in resource-limited settings.
3. Obtain regulatory approvals covering safety, privacy, and ethical standards.
4. Provide training for healthcare workers to ensure they are equipped to use digital twins effectively.
5. Develop user-friendly world-building tools to allow clinicians to build and use customized digital twins without extensive technical expertise.

Acknowledgements

Credits to Liu et al.47 for Figure 3. Their article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. M.A.S. acknowledges support from Qatar Research, Development and Innovation Fund (NPRP13S-0119-200108). A.F.F. acknowledges support from the Royal Academy of Engineering under the RAEng Chair in Emerging Technologies (INSILEX CiET1919/19), ERC Advanced Grant – UKRI Frontier Research Guarantee (INSILICO EP/Y030494/1). The research was carried out at the National Institute for Health and Care Research (NIHR) Manchester Biomedical Research Centre (BRC) (NIHR203308). J.Q. acknowledges support from Qatar University HIG 23/24-127.

Author contributions

Conceptualizing the manuscript: all authors, with ideas revisited by Y.M.M., G.L., J.D., A.F.F., S.M.Z., and M.S. Y.M.M. wrote the manuscript and made edits upon review by all authors. Y.M.M. and M.S. made several rounds of edits to enhance the readability and clarity of the manuscript. E.H., V.K., J.D., S.B., and M.S. offered important insights from the perspective of plastic, orthopedic, and maxillofacial surgeons. G.L. and J.E. contributed from the viewpoint of developing VR/AR and AI integration in surgery. J.Q. and M.B. shared their thoughts on the ethical framework for the surgical metaverse. A.F.F. and J.D. focused on the development of in silico trials and digital twins. B.S. provided guidance on ethical and regulatory matters. S.M.Z. and V.K. contributed ideas related to innovation in medical education. Y.M.M. and C.V. shared perspectives from the standpoint of medical students and surgical residents. Y.M.M. and G.L. combined these viewpoints. Y.M.M. created tables and visual materials for this manuscript. M.A.S. provided Fig. 2. S.B. provided Fig. 4. All authors contributed to, reviewed, and approved the final version of the paper. All authors are accountable for all aspects of the work in ensuring that questions related to the accuracy or integrity of any part of the work are appropriately investigated and resolved.

Competing interests

The authors declare no competing interests.

Footnotes

Publisher’s note Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

References

• 1. De Domenico, M. et al. Challenges and opportunities for digital twins in precision medicine from a complex systems perspective. Npj Digital Med. 8, 1–11 (2025).
• 2. Katsoulakis, E. et al. Digital twins for health: a scoping review. Npj Digital Med. 7, 1–11 (2024).
• 3. Committee on Foundational Research Gaps and Future Directions for Digital Twins, et al. Foundational Research Gaps and Future Directions for Digital Twins (National Academies Press, 2024).
• 4. Laubenbacher, R., Mehrad, B., Shmulevich, I. & Trayanova, N. Digital twins in medicine. Nat. Comput. Sci. 4, 184–191 (2024).
• 5. Digital Twin Consortium. http://www.digitaltwinconsortium.org/ (2025).
• 6. European Virtual Human Twin. http://www.edith-csa.eu/ (2025).
• 7. Asciak, L. et al. Digital twin assisted surgery, concept, opportunities, and challenges. Npj Digital Med. 8, 1–8 (2025).
• 8. Ranev, D. & Teixeira, J. History of computer-assisted surgery. Surg. Clin. North Am. 100, 209–218 (2020).
• 9. Manning, W. & Zhijian, S. Surgical navigation. In Computer-Aided Oral and Maxillofacial Surgery (eds Egger, J. & Chen, X.) 161–181 (Academic Press, 2021).
• 10. Portnoy, Y. et al. Three-dimensional technologies in presurgical planning of bone surgeries: current evidence and future perspectives. Int. J. Surg. 109, 3–10 (2023).
• 11. Giménez, M. et al. Definitions of computer-assisted surgery and intervention, image-guided surgery and intervention, hybrid operating room, and guidance systems: Strasbourg International Consensus Study. Ann. Surg. Open 1, e021 (2020).
• 12. Khor, W. S. et al. Augmented and virtual reality in surgery—the digital surgical environment: applications, limitations and legal pitfalls. Ann. Transl. Med. 4, 454 (2016).
• 13. Dennler, C. et al. Augmented reality in the operating room: a clinical feasibility study. BMC Musculoskelet. Disord. 22, 451 (2021).
• 14. Björnsson, B. et al. Digital twins to personalize medicine. Genome Med. 12, 4 (2019).
• 15. Negri, E., Fumagalli, L. & Macchi, M. A review of the roles of digital twin in CPS-based production systems. Procedia Manuf. 11, 939–948 (2017).
• 16. Aubert, K. et al. Development of digital twins to optimize trauma surgery and postoperative management. A case study focusing on tibial plateau fracture. Front. Bioeng. Biotechnol. 9, 722275 (2021).
• 17. Ahmadian, H. et al. A digital twin for simulating the vertebroplasty procedure and its impact on mechanical stability of vertebra in cancer patients. Int. J. Numer. Methods Biomed. Eng. 38, e3600 (2022).
• 18. Shu, H. et al. Twin-S: a digital twin for skull base surgery. Int. J. Comput. Assist. Radiol. Surg. 18, 6 (2023).
• 19. Rouhollahi, A. et al. CardioVision: a fully automated deep learning package for medical image segmentation and reconstruction generating digital twins for patients with aortic stenosis. Comput. Med. Imaging Graph. 109, 102289 (2023).
• 20. Virtual Reality Lab, Stanford Neurosurgery. http://med.stanford.edu/neurosurgery/divisions/vr-lab.html (2025).
• 21. Westarp, E. et al. Virtual reality for patient informed consent in skull base tumors and intracranial vascular pathologies: a pilot study. Acta Neurochir. 166, 455 (2024).
• 22. Mekki, Y. M. et al. Digital twin of a surgical patient with femoroacetabular impingement syndrome and case study of a static twin in virtual reality. 10.13140/RG.2.2.34137.53604 (2024).
• 23. Mekki, Y. M., Simon, L. V., Freeman, W. D. & Qadir, J. Medical Education Metaverses (MedEd Metaverses): opportunities, use case, and guidelines. Computer 58, 60–70 (2025). 10.1109/MC.2024.3474033.
• 24. Nawaz, F. A. et al. Toward a meta-vaccine future: promoting vaccine confidence in the metaverse. Digit. Health 9, 20552076231171477 (2023).
• 25. Qayyum, A. et al. Secure and trustworthy artificial intelligence-extended reality (AI-XR) for metaverses. ACM Comput. Surv. 56, 170 (2024).
• 26. Weeks, J. K. & Amiel, J. M. Enhancing neuroanatomy education with augmented reality. Med. Educ. 53, 516–517 (2019).
• 27. Mekki, Y. M., Mekki, M. M., Hammami, M. A. & Zughaier, S. M. Virtual reality module depicting catheter-associated urinary tract infection as educational tool to reduce antibiotic resistant hospital-acquired bacterial infections. In 2020 IEEE International Conference on Informatics, IoT, and Enabling Technologies (ICIoT) 544–548 (IEEE, 2020).
• 28. Gsaxner, C. et al. The HoloLens in medicine: a systematic review and taxonomy. Med. Image Anal. 85, 102757 (2023).
• 29. Mekki, Y. M. & Zughaier, S. M. Teaching artificial intelligence in medicine. Nat. Rev. Bioeng. 2, 450–451 (2024).
• 30. Luijten, G. et al. 3D surgical instrument collection for computer vision and extended reality. Sci. Data 10, 796 (2023).
• 31. Bartlett, J. D. et al. The learning curves of a validated virtual reality hip arthroscopy simulator. Arch. Orthop. Trauma Surg. 140, 761–767 (2020).
• 32. Khanduja, V., Lawrence, J. E. & Audenaert, E. Testing the construct validity of a virtual reality hip arthroscopy simulator. Arthrosc. J. Arthrosc. Relat. Surg. 33, 566–571 (2017).
• 33. Bartlett, J. D., Lawrence, J. E. & Khanduja, V. Virtual reality hip arthroscopy simulator demonstrates sufficient face validity. Knee Surg. Sports Traumatol. Arthrosc. 27, 3162–3167 (2019).
• 34. Bartlett, J., Kazzazi, F., To, K., Lawrence, J. & Khanduja, V. Virtual reality simulator use stimulates medical students’ interest in orthopaedic surgery. Arthrosc. Sports Med. Rehabil. 3, e1343–e1348 (2021).
• 35. Alkire, B. C. et al. Global access to surgical care: a modelling study. Lancet Glob. Health 3, e316–e323 (2015).
• 36. Please, H. et al. Virtual reality technology for surgical learning: qualitative outcomes of the first virtual reality training course for emergency and essential surgery delivered by a UK–Uganda partnership. BMJ Open Qual. 13, e002477 (2024).
• 37. Wesselius, T. S. et al. Holographic augmented reality for DIEP flap harvest. Plast. Reconstr. Surg. 147, 25e–29e (2021).
• 38. Georgiou, O., Martinez, J., Abdouni, A. & Harwood, A. Mid-air haptic texture exploration in VR. In 2022 IEEE Conference on Virtual Reality and 3D User Interfaces Abstracts and Workshops (VRW) 964–965 (IEEE, 2022).
• 39. Varghese, C., Harrison, E. M., O’Grady, G. & Topol, E. J. Artificial intelligence in surgery. Nat. Med. 30, 1257–1268 (2024).
• 40. Axial3D. Leading the way with the latest 3D imaging, models, and devices. http://axial3d.com/ (2025).
• 41. Solutions | Cella. http://www.cellams.com/en/solutions/ (2025).
• 42. Shen, M.-d., Chen, S.-b. & Ding, X.-d. The effectiveness of digital twins in promoting precision health across the entire population: a systematic review. Npj Digital Med. 7, 145 (2024).
• 43. Franke, B. et al. Comparison of hemodynamics in biological surgical aortic valve replacement and transcatheter aortic valve implantation: an in-silico study. Artif. Organs 47, 352–360 (2023).
• 44. Shi, Y. et al. Synergistic digital twin and holographic augmented-reality-guided percutaneous puncture of respiratory liver tumor. IEEE Trans. Hum. Mach. Syst. 52, 6 (2022).
• 45. Pratt, P. et al. Through the HoloLens™ looking glass: augmented reality for extremity reconstruction surgery using 3D vascular models with perforating vessels. Eur. Radiol. Exp. 2, 2 (2018).
• 46. Hein, J. et al. Creating a digital twin of spinal surgery: a proof of concept. arXiv 10.48550/arXiv.2403.16736 (2024).
• 47. Liu, K. et al. Augmented reality navigation method for recontouring surgery of craniofacial fibrous dysplasia. Sci. Rep. 11, 10043 (2021).
• 48. Cureton, D. NextAR empowers Qatari surgeons with AR spinal guidance tool. XR Today. http://www.xrtoday.com/mixed-reality/nextar-empowers-qatari-surgeons-with-ar-spinal-guidance-tool/ (2025).
• 49. Medivis. A new standard of surgery using augmented reality. http://www.medivis.com/ (2025).
• 50. Berger, A., Choudhry, O. J. & Kondziolka, D. Augmented reality-assisted percutaneous rhizotomy for trigeminal neuralgia. Oper. Neurosurg. 10.1227/ons.0000000000000661 (2023).
• 51. Van Meegdenburg, T. et al. A baseline solution for the ISBI 2024 dreaming challenge. In 2024 IEEE International Symposium on Biomedical Imaging (ISBI) 1–3 (IEEE, 2024).
• 52. Li, P., Liu, L., Schönlieb, C.-B. & Aviles-Rivero, A. I. Optimised ProPainter for video diminished reality inpainting. arXiv 10.48550/ARXIV.2406.02287 (2024).
• 53. Cowley, J., Luo, X., Stewart, G. D., Shu, W. & Kazakidi, A. A mathematical model of blood loss during renal resection. Fluids 8, 12 (2023).
• 54. Cowley, J. et al. Near real-time estimation of blood loss and flow–pressure redistribution during unilateral nephrectomy. Fluids 9, 9 (2024).
• 55. Center for Devices and Radiological Health. Assessing the credibility of computational modeling and simulation in medical device submissions. http://www.fda.gov/regulatory-information/search-fda-guidance-documents/assessing-credibility-computational-modeling-and-simulation-medical-device-submissions (2024).
• 56. Aycock, K. I. et al. Toward trustworthy medical device in silico clinical trials: a hierarchical framework for establishing credibility and strategies for overcoming key challenges. Front. Med. 11, 1433372 (2024).
• 57. Therapeutic Goods Administration (TGA). Clinical evidence guidelines. http://www.tga.gov.au/resources/resource/reference-material/clinical-evidence-guidelines (2024).
• 58. Favre, P. et al. In silico clinical trials in the orthopedic device industry: from fantasy to reality? Ann. Biomed. Eng. 49, 3213–3226 (2021).
• 59. Sarrami-Foroushani, A. et al. In-silico trial of intracranial flow diverters replicates and expands insights from conventional clinical trials. Nat. Commun. 12, 3861 (2021).
• 60. Pathak, S. et al. Surgical or percutaneous coronary revascularization for heart failure: an in silico model using routinely collected health data to emulate a clinical trial. Eur. Heart J. 44, 351–364 (2023).
• 61. Mell, S. P., Wimmer, M. A., Jacobs, J. J. & Lundberg, H. J. Optimal surgical component alignment minimizes TKR wear—an in silico study with nine alignment parameters. J. Mech. Behav. Biomed. Mater. 125, 104939 (2022).
• 62. Frangi, A. et al. Unlocking the power of computational modelling and simulation across the product lifecycle in life sciences: a UK landscape report. Zenodo 10.5281/ZENODO.7723230 (2023).
• 63. Bjelland, Ø., Pedersen, M. D., Steinert, M. & Bye, R. T. Intraoperative data-based haptic feedback for arthroscopic partial meniscectomy punch simulation. IEEE Access 10, 107269–107282 (2022). 10.1109/ACCESS.2022.3212748.
• 64. Coveney, P. & Highfield, R. Virtual You: How Building Your Digital Twin Will Revolutionize Medicine and Change Your Life (Princeton University Press, 2023).
• 65. Kim, D. et al. Transmission line model as a digital twin for abdominal aortic aneurysm patients. Npj Digital Med. 7, 1–10 (2024).
• 66. Huang, P., Kim, K. & Schermer, M. Ethical issues of digital twins for personalized health care service: preliminary mapping study. J. Med. Internet Res. 24, e33081 (2022).
• 67. Bruynseels, K., Santoni De Sio, F. & Van Den Hoven, J. Digital twins in health care: ethical implications of an emerging engineering paradigm. Front. Genet. 9, 31 (2018).
• 68. Raza, M. M., Venkatesh, K. P., Diao, J. A. & Kvedar, J. C. Defining digital surgery for the future. Npj Digital Med. 5, 1–2 (2022).
• 69. Xu, J. et al. Spatial computing: defining the vision for the future. In Extended Abstracts of the CHI Conference on Human Factors in Computing Systems 1–4 (ACM, 2024).
• 70. Gentili, A. et al. The cost-effectiveness of digital health interventions: a systematic review of the literature. Front. Public Health 10, 787135 (2022).
