Abstract
The emergence of augmented reality (AR) in surgical procedures could significantly enhance accuracy and outcomes, particularly in the complex field of orthognathic surgery. This study compares the effectiveness and accuracy of traditional drilling guides with two AR‐based navigation techniques: one utilizing ArUco markers and the other employing small‐workspace infrared tracking cameras for a drilling task. Additionally, an alternative AR visualization paradigm for surgical navigation is proposed that eliminates the potential inaccuracies of image detection using headset cameras. Through a series of controlled experiments designed to assess the accuracy of hole placements in surgical scenarios, the performance of each method was evaluated both quantitatively and qualitatively. The findings reveal that the small‐workspace infrared tracking camera system is on par with the accuracy of conventional drilling guides, hinting at a promising future where such guides could become obsolete. This technology demonstrates a substantial advantage by circumventing the common issues encountered with traditional tracking systems and surpassing the accuracy of ArUco marker‐based navigation. These results underline the potential of this system for enabling more minimally invasive interventions, a crucial step towards enhancing surgical accuracy and, ultimately, patient outcomes. The study resulted in three relevant contributions. First, a new paradigm for AR visualization in the operating room is proposed, relying only on exact tracking information to navigate the surgeon. Second, the comparative analysis marks a critical step forward in the evolution of surgical navigation, paving the way for integrating more sophisticated AR solutions in orthognathic surgery and beyond. Finally, the system is integrated with a robotic arm, and the inaccuracies present in a typical human‐controlled system are evaluated.
Keywords: augmented reality, medical robotics
This study evaluates augmented reality (AR) navigation techniques in orthognathic surgery, comparing traditional drilling guides with two AR‐based methods: ArUco markers and infrared tracking cameras. Findings show that the infrared tracking system matches the accuracy of conventional guides and surpasses ArUco‐based navigation, highlighting its potential to improve surgical precision and reduce invasiveness. Additionally, the study introduces a new AR visualization approach, relying solely on precise tracking data, enhancing future AR applications in surgical navigation.

Abbreviations
- API
application programming interface
- AR
augmented reality
- CAD/CAM
computer‐aided design/computer‐aided manufacturing
- CT
computed tomography
- DICOM
digital imaging and communications in medicine
- FOV
field of view
- FDA
food and drug administration
- IK
inverse kinematics
- LED
light‐emitting diode
- LOS
line of sight
- MIS
minimally invasive surgery
- OS
orthognathic surgery
- PSI
patient‐specific implant
- PoC
point‐of‐care
- PLA
polylactic acid
- RMS
root mean square
- 6DOF
six degrees of freedom
- STL
stereolithography
- TKA
total knee arthroplasty
- VR
virtual reality
- VSP
virtual surgical planning
1. INTRODUCTION
Orthognathic surgery (OS) is a procedure requiring high precision for correcting jaw discrepancy and malocclusion in cases where other treatments are insufficient [1, 2, 3]. OS is crucial for both functional (bite discrepancies, sleep apnea) and aesthetic (facial imbalances) outcomes. In the pre‐surgical phase, the orthodontist and the surgeon prepare a comprehensive treatment plan that includes the surgical adjustments needed to correct skeletal discrepancies. Virtual surgical planning (VSP) is based on multiple modalities: dental impressions, CT scans, and photographs, giving a complete overview of the patient's dental and facial structures [4, 5, 6]. This allows the incorporation of computer‐aided design/computer‐aided manufacturing (CAD/CAM) technologies that facilitate digital preparation [7, 8]. Surgeons must align bone segments accurately to ensure symmetrical results and proper function, making precision critical. This highlights the growing need for advanced navigation and guidance systems to assist surgeons in achieving optimal results.
Patient‐specific surgical guides are custom‐made templates that enhance surgical precision by ensuring alignment with the preoperative plan [9, 10, 11]. These personalized templates have become essential to orthognathic surgery, allowing surgeons to drill holes and perform cuts accurately. Drilling guides are designed to position and stabilize the user's hand, reducing the risk of errors and ensuring interventions follow the preoperative plan faithfully. 3D‐printed guides and patient‐specific implants (PSIs) can reduce operative time, ensure more precise osteotomy cuts, and increase the safety of orthognathic surgery [9]. That said, there are some notable challenges when employing this method. Firstly, the process of designing, fabricating, and verifying the guide can be time‐consuming and requires a multidisciplinary team of experts. Secondly, these 3D‐printed parts pose a manufacturing challenge, where minor errors can result in poor fit and reduced accuracy. In recent years, we have observed a rise in the number of hospitals with point‐of‐care (PoC) 3D printing facilities [12, 13, 14]. While this trend indicates the growing recognition of the benefits of 3D printing, these hospitals still represent only a minority with immediate access to this technology.
Optical tracking systems are essential tools in many medical procedures, especially in minimally invasive surgery (MIS). These systems equip the surgeon with real‐time information about the tool's position relative to the patient, allowing them to navigate to the target location and operate precisely. In principle, infrared‐based optical tracking systems consist of three components: (a) cameras that detect the reflected or emitted light from the markers; (b) markers attached to the tracked object, either reflective spheres (passive) or light‐emitting diodes (LEDs) (active); and (c) tracking software that, based on these data, calculates the position and orientation of the tracked target. Optical tracking systems have revolutionized many surgical procedures by providing enhanced precision and safety, and they are becoming even more integral to modern surgical practice as technology advances.
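The core computation performed by such tracking software, recovering a rigid 6DOF pose from matched marker points, can be illustrated with the classic least‐squares (Kabsch) alignment. The following Python sketch is a generic illustration of this principle, not the implementation of any particular commercial system; the LED geometry is invented for the demo.

```python
import numpy as np

def rigid_pose(model_pts, observed_pts):
    """Estimate rotation R and translation t mapping marker-frame points
    to camera-frame observations (least-squares, Kabsch algorithm)."""
    mc = model_pts.mean(axis=0)                     # marker-frame centroid
    oc = observed_pts.mean(axis=0)                  # camera-frame centroid
    H = (model_pts - mc).T @ (observed_pts - oc)    # cross-covariance matrix
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))          # guard against reflections
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = oc - R @ mc
    return R, t

# Five LED positions in the marker frame (arbitrary demo geometry, mm).
leds = np.array([[0, 0, 0], [30, 0, 0], [0, 30, 0],
                 [30, 30, 5], [15, 15, 10]], float)

# Simulate a camera-frame observation under a known pose, then recover it.
theta = np.deg2rad(20)
R_true = np.array([[np.cos(theta), -np.sin(theta), 0],
                   [np.sin(theta),  np.cos(theta), 0],
                   [0, 0, 1]])
t_true = np.array([100.0, 50.0, 400.0])
R_est, t_est = rigid_pose(leds, leds @ R_true.T + t_true)
assert np.allclose(R_est, R_true, atol=1e-8)
assert np.allclose(t_est, t_true, atol=1e-6)
```

In practice, the observed 3D points are first triangulated from the camera images before this alignment step, and noise makes the solution a least‐squares fit rather than an exact recovery.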
Augmented Reality (AR) technology enhances the real‐world view by superimposing computer‐generated content and information to enhance users' experiences. While lagging behind virtual reality (VR) in the head‐mounted category due to technical limitations, see‐through displays are gaining traction and have become a focus point for industry leaders with devices such as Apple Vision Pro (Apple Inc., USA) and Microsoft Hololens 2 (Microsoft Corp., USA). AR has already been adopted in various surgical applications, such as orthopedics, urology, cardiology, and neurosurgery [15, 16, 17, 18], to display patient data and provide guidance and navigation during interventions. Doing that, AR can improve accuracy, efficiency, and patient safety by enabling minimally invasive approaches, reducing radiation exposure and procedure time [16, 19]. VR and AR have also been used in training and education by equipping surgeons with immersive interactive tools that aim to simulate realistic scenarios [20, 21].
Pose estimation is a core element of many augmented and virtual reality applications. Identifying and tracking visual features in the real environment allows the VR/AR system to estimate the headset's movement, ensuring that the virtual view aligns with the user's physical actions [22]. This process can also be employed to position virtual objects at a specific location in the real world. Binary square fiducial markers are a widespread approach that simplifies the process: a single marker suffices to obtain the camera pose, and the binary codification allows for detecting errors and applying corrections. Many open‐source libraries offer detection modules for fiducial markers; one of the most widely used is the ArUco library, developed by Rafael Muñoz and Sergio Garrido [23].
Despite their availability and ease of use (the tracking camera is integrated into the headset), ArUco markers have not become the gold standard for surgical navigation. The main factor is accuracy, which can vary based on several factors, such as the size of the marker, the quality of the camera calibration, the resolution of the camera, and the distance from and tilt of the marker [24, 25]. The reported accuracy is often in the range of millimeters, which is inferior to other optical tracking methods. Only recently, with the release of AR headsets to the market, did we observe an increased interest in ArUco markers.
In 2022, the U.S. Food and Drug Administration (FDA) cleared Knee+ (Pixee Medical, USA), an augmented reality navigation system for total knee arthroplasty (TKA). The system calculates the 3D coordinates of the instruments by tracking ArUco markers with the headset's built‐in camera [26]. Navigation information is then displayed to the surgeon on a monocular display. Another product offering intraoperative visualization of preoperative plans is VisAR (USA). FDA‐cleared in 2022 for use in precision‐guided intraoperative spine surgery, VisAR converts a patient's imaging data into a 3D hologram, visible through Microsoft's Hololens 2, and superimposes it onto the patient with claimed submillimeter accuracy [27]. The published data, however, indicate slightly worse results [28, 29], with a large contribution to the mean radial error coming from the registration step.
This study evaluates the accuracy of drillings performed according to the preoperative plan for orthognathic surgery. Three navigation methods were used for the assessment: (1) a drilling guide, (2) an optical tracking system using an infrared camera, and (3) an optical tracking solution based on ArUco marker tracking. We also mounted our system on a robotic arm to establish the influence of human factors in the experimental scenarios. We propose an alternative AR navigation paradigm that removes the need for registration and eliminates the associated errors. We demonstrate the promising potential and feasibility of using AR‐based navigation systems for a very demanding procedure, orthognathic surgery, paving the way for future advancements in the field. With improvements to the manufacturing process, this paradigm could successfully replace cutting guides and eliminate the shortcomings of existing AR approaches.
2. METHODS
2.1. Virtual surgical planning (VSP)
The data used in this study was an already existing VSP prepared for orthognathic surgery. The procedure started with scanning the patient's maxillofacial region with a high‐resolution CT. The acquired dataset was imported into the planning software Materialise Mimics V.26 (Materialise, Belgium) as a DICOM dataset and segmented by thresholding to separate the high‐density bone from the other tissues. A composite skull model was created by fusing the CT‐derived model with stereolithography (STL) files acquired using a topographic laser scanner. Osteotomy planes were then defined, and the relevant bones were moved into position to achieve the desired functional and aesthetic outcome.
Following the surgical plan, the models were loaded into Materialise 3‐matic V.18 (Materialise, Belgium) 3D modeling software. There, the fixation plates were designed by first selecting the screw locations on the bone surface and then connecting them to create a plate. In the next step, the osteotomized bone fragments were repositioned to their original position so that the holes and the osteotomy planes could be used to create the cutting/drill guides. All the VSP elements (bones, cutting/drill guides, splint, and osteotomy planes) were then exported as STL files for further processing and 3D printing. The STL files were then imported into Autodesk Fusion (Autodesk Inc., USA) for modeling towards integration with the tracking systems. The VSP model of the splint was modified to incorporate the adapter (see Figure 1) to which we could attach the camera or the ArUco marker.
FIGURE 1.

Virtual surgical plan imported to Autodesk Fusion with the modified splint allowing for an easy tracking element attachment.
2.2. AR navigation systems deployment
The tracking solution used for this study is iVation (Medivation AG, Switzerland), an infrared‐based, single‐use optical tracking system. We chose this six‐degree‐of‐freedom (6DOF) tracking system for its accuracy and its position update rate of 13 frames per second. The setup comprises an infrared camera and a single active marker with five LEDs. The markers are available in two sizes, small and big, each with its own specified tracked volume. In this study, we opted for the big marker, which provides the larger workspace volume at a 120° field of view (FOV). The manufacturer specifies the relative RMS accuracy of this target separately for rotation and translation along each axis. The devices use Bluetooth LE for wireless data transfer to the host PC, which runs a dedicated application. Thanks to access to this application's source code, we could enhance its functionality so that it can serve as a TCP/IP server, effectively relaying the tracking information to the AR headset. The iVation tracking system is a factory‐calibrated, single‐use medical tracking system that should provide more stable and accurate data than AR glasses over their respective lifetimes.
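The relay role of the modified host application can be sketched as a small TCP/IP server streaming pose updates to a client (here standing in for the headset). This is a minimal Python illustration; the real application's message format and interface are not described here, so the newline‐delimited JSON payload is an assumption made for the example.

```python
import json
import socket
import threading

def serve_poses(host="127.0.0.1", port=0):
    """Minimal TCP server streaming newline-delimited JSON pose updates,
    standing in for the modified iVation host application."""
    srv = socket.create_server((host, port))

    def handle():
        conn, _ = srv.accept()
        with conn:
            # In the real system this would be live 6DOF tracking data.
            pose = {"t": [12.5, -3.0, 250.0], "q": [1.0, 0.0, 0.0, 0.0]}
            conn.sendall((json.dumps(pose) + "\n").encode())

    threading.Thread(target=handle, daemon=True).start()
    return srv, srv.getsockname()[1]

# Client side (the AR headset) connects and reads one pose update.
srv, port = serve_poses()
with socket.create_connection(("127.0.0.1", port)) as c:
    line = c.makefile().readline()
pose = json.loads(line)
assert pose["t"] == [12.5, -3.0, 250.0]
srv.close()
```

A streaming protocol of this kind decouples the tracking hardware from the visualization client, which is what enables the modular design described below.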
We designed our AR application to be modular, allowing the navigation system to work with various tracking data sources. This flexibility enables us to seamlessly switch from an infrared optical system to one based on ArUco markers. The Magic Leap 2 (ML2) API provides the MLMarkerTracker class, which can recognize and localize ArUco markers relative to the headset's position. In our application, we chose the DICT_7x7_50 ArUco dictionary and the Large_FOV preset from ML2. One parameter influencing the tracking accuracy that is under our control is the marker size; we chose a marker size (including padding) that was a good compromise between tracking precision and ensuring the marker remained manageable and did not become an obstacle during drilling tasks.
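The modularity described above amounts to hiding each tracking backend behind a common interface that the navigation logic consumes. The actual implementation is in C# inside Unity; the following is a hypothetical Python sketch of the same idea, with all class and method names invented for illustration.

```python
from abc import ABC, abstractmethod

class TrackingSource(ABC):
    """Common interface consumed by the navigation logic, regardless of
    whether poses come from the infrared system or ArUco detection."""

    @abstractmethod
    def get_pose(self, target: str) -> tuple:
        """Return (position_xyz, quaternion_wxyz) of a tracked target."""

class IVationSource(TrackingSource):
    def get_pose(self, target):
        # Would read from the TCP relay of the iVation host application.
        return (0.0, 0.0, 400.0), (1.0, 0.0, 0.0, 0.0)

class ArucoSource(TrackingSource):
    def get_pose(self, target):
        # Would query the headset's marker tracker (MLMarkerTracker on ML2).
        return (1.0, -2.0, 398.0), (1.0, 0.0, 0.0, 0.0)

def drill_tip_position(source: TrackingSource):
    """Navigation code only sees the interface, never the backend."""
    pos, _ = source.get_pose("drill")
    return pos

assert drill_tip_position(IVationSource())[2] == 400.0
assert drill_tip_position(ArucoSource())[2] == 398.0
```

Swapping tracking systems then requires only constructing a different source object, leaving the guidance and visualization code untouched.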
In our setup, one marker tracks the phantom patient skull, while a second marker provides the position of the drill. The preoperative plan for our case required making holes at various positions and at very specific angles to ensure stability and avoid nerve damage. Considering the parameters of the iVation tracking system, we decided to group the holes by their position on the face. We designed the iVation/ArUco holders to cover one side of the skull at a time. Such a configuration allowed us to get the biggest coverage, ensuring the most optimal setup.
After modifying the VSP model, we reassembled the entire skull model and moved the origin point to the origin of the camera. The adjustment was made to simplify the transform calculations later. Once the assembly was complete, we exported it as an FBX file. This format was selected to maintain the meshes' separability, which was crucial for easy import and processing in Unity 2022.3 (Unity Technologies, USA).
2.3. Unity application
Once the models were imported, we loaded them into the scene and generated hole positions and angles at which they should be drilled. We extracted the vertices from each screw and calculated the mesh's centroid and main axis.
We started by determining the centroid of the screw, which gave us the origin of the ray that we would use to find a starting point on the skull. Additionally, it gave us the centered data to construct the covariance matrix, from which we computed the eigenvalues and eigenvectors. The eigenvector associated with the largest eigenvalue is the principal axis, which represents the main axis of the mesh. We then took the main axis as the direction of the ray originating from the centroid and calculated the hit points along its path. We filtered the hit points by their collider tag to identify the entry point on the skull and the end of the screw, which gave us the hole depth. All this data was then saved in a struct and handled by our drilling manager class.
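The centroid and principal-axis extraction described above can be sketched numerically. The following Python/numpy fragment assumes the screw vertices arrive as an N×3 array (in Unity this is done in C# on the mesh vertex buffer); the synthetic point cloud stands in for a screw mesh.

```python
import numpy as np

def screw_axis(vertices):
    """Centroid and principal axis of a screw mesh from its vertices."""
    centroid = vertices.mean(axis=0)
    centered = vertices - centroid
    cov = centered.T @ centered / len(vertices)   # 3x3 covariance matrix
    eigvals, eigvecs = np.linalg.eigh(cov)        # eigenvalues ascending
    axis = eigvecs[:, -1]                         # largest-eigenvalue vector
    return centroid, axis / np.linalg.norm(axis)

# Synthetic elongated point cloud along (1, 1, 0), standing in for a screw.
t = np.linspace(-10, 10, 200)
pts = np.stack([t, t, 0.1 * np.sin(t)], axis=1)
c, a = screw_axis(pts)
expected = np.array([1.0, 1.0, 0.0]) / np.sqrt(2)
assert abs(abs(a @ expected) - 1.0) < 1e-3   # principal axis recovered
```

The recovered axis (sign‐ambiguous, as for any eigenvector) then serves as the ray direction from the centroid for the collider ray cast.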
We generated visual cues for the user, such as the entry and end point of the holes and the cone, which indicates the allowed margin of error for the drilling direction (see Figure 2). Our drilling manager class would monitor the distance to each target point and would navigate the user to the closest hole relative to the drill's tip. The colors of the targets would change when reached to provide real‐time feedback to the user; these error margins can be defined before the deployment. The drilling manager would also keep track of the finished holes and remove them from the active list alongside the visual clues.
FIGURE 2.

A close‐up view of the target location. Three cylinders represent the direction to the target point along the x‐ (red), y‐ (green), and z‐axes (blue), with the height of each cylinder dynamically adjusting based on the distance to the target along the respective axis. Once the drill tip is at the hole's entry point, the sphere marking the entry point turns green. Likewise, once the drill aligns within the allowed target direction (5° off the main axis), the cone turns green. A semitransparent white cylinder indicates the position and orientation of the drill.
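The drilling manager's bookkeeping, finding the closest active hole to the drill tip and retiring completed ones, reduces to a few lines. This Python sketch mirrors the behavior described above with invented names; the actual class is implemented in C# within Unity.

```python
import math

class DrillingManager:
    """Tracks active targets, navigates to the closest one relative to the
    drill tip, and retires finished holes from the active list."""

    def __init__(self, holes):
        self.active = list(holes)   # list of (x, y, z) entry points
        self.done = []

    def closest(self, tip):
        """Entry point nearest to the current drill-tip position."""
        return min(self.active, key=lambda h: math.dist(h, tip))

    def mark_done(self, hole):
        self.active.remove(hole)
        self.done.append(hole)

mgr = DrillingManager([(0, 0, 0), (10, 0, 0), (0, 20, 0)])
tip = (9.0, 1.0, 0.0)
target = mgr.closest(tip)
assert target == (10, 0, 0)          # nearest hole is selected
mgr.mark_done(target)
assert len(mgr.active) == 2          # finished hole removed from active list
assert mgr.closest(tip) == (0, 0, 0) # guidance moves on to the next hole
```

In the application, each state change additionally updates the visual cues (sphere and cone colors) tied to the affected hole.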
2.4. AR visualization
ML2 offers superior computing power and a larger field of view compared to Microsoft's Hololens 2, making it a suitable choice for our needs.
In the case of the iVation system, the user does not have the preoperative plan overlaid on top of the patient, as the glasses are not registered to the skull. As the tracking system only provides a relative position between the camera and the marker, we place the VSP above the patient (see Figure 3). By doing that, we eliminate any potential errors originating from the patient registration process and instead rely only on a calibrated, medical‐grade tracking system. The user can freely move this projection using the controller (or hand gestures) and place it where it is most convenient. Additionally, the VSP can be scaled up, removing the pixel‐size limitations present when overlaying the data and allowing for submillimeter movements, which are possible with the iVation system.
FIGURE 3.

A view from the user's perspective of the 4x scaled‐up 3D holographic model of the VSP with an actively tracked drill using the iVation system. Image taken with the built‐in ML2 screenshot tool.
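Because the tracking system reports only relative poses, mapping them into the floating, scaled hologram is a simple affine transform: a point in the marker frame is scaled and offset to the hologram anchor. A minimal numpy sketch, with the anchor position and scale chosen arbitrarily for the example:

```python
import numpy as np

def to_hologram(p_rel, anchor, scale=4.0):
    """Map a tracked point (marker-frame coordinates) into the floating,
    scaled hologram frame anchored at `anchor` in headset space."""
    return np.asarray(anchor, float) + scale * np.asarray(p_rel, float)

# Drill tip a few millimetres from the planned entry point (marker frame, m).
tip = np.array([0.001, 0.0, 0.002])
entry = np.array([0.0, 0.0, 0.0])
anchor = np.array([0.0, 1.6, 0.5])   # hologram placed roughly at eye level

gap_real = np.linalg.norm(tip - entry)
gap_holo = np.linalg.norm(to_hologram(tip, anchor) - to_hologram(entry, anchor))
assert np.isclose(gap_holo, 4.0 * gap_real)   # scaling magnifies residual error
```

The key consequence, visible in the assertion, is that scaling the projection magnifies the residual tip‐to‐target distance on screen, which is what makes submillimeter adjustments perceivable to the user.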
When using ArUco markers, the headset already knows the patient's position, allowing us to overlay the preoperative plan on top of the skull. ML2 has a fixed minimal near‐plane distance, which prevents the surgeon from looking very closely at the model. By tracking the tool, we can display the same floating, scaled‐up projection of the VSP at a convenient location. To simplify navigation towards the target point and visually guide the user, we visualized the distance and angle errors towards the target location with dynamic cylinders and target cones, as shown and explained in Figure 2.
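The 5° target cone shown in Figure 2 amounts to an angle test between the drill axis and the planned hole axis. A small Python sketch of that check (the threshold matches the margin described above; the function name is illustrative):

```python
import numpy as np

def within_cone(drill_dir, planned_dir, max_deg=5.0):
    """True if the drill axis lies within max_deg of the planned hole axis."""
    a = np.asarray(drill_dir, float)
    b = np.asarray(planned_dir, float)
    cos_angle = np.clip(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)),
                        -1.0, 1.0)
    return np.degrees(np.arccos(cos_angle)) <= max_deg

planned = (0.0, 0.0, 1.0)
assert within_cone((0.0, 0.05, 1.0), planned)      # ~2.9 deg off: inside cone
assert not within_cone((0.0, 0.2, 1.0), planned)   # ~11.3 deg off: outside
```

When the test passes, the application turns the cone green to signal that drilling may proceed at the current orientation.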
2.5. Robotic control
To remove the human contribution to the overall errors and quantify the raw performance of the iVation tracking solution, we used a KUKA LBR iiwa med (KUKA AG, Germany) robotic arm to position and drill the holes. We only tested robotic drilling with the iVation infrared cameras: autonomous drilling was not possible with the navigation‐less drilling guides, while ArUco markers would have required either constant marker visibility to the head‐mounted display, which is impractical for the surgeon, or the integration of an additional vision system dedicated to marker tracking. A TCP/IP connection between the robot and the Unity application was used to send a transform representing the positional and rotational error between the current tip location and the target hole. This simplified the setup, as the robot software only had to move the arm to reduce these task‐space errors without needing to know the drill tip location.
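The error transform sent to the robot can be sketched as a positional difference plus a rotational residual (here expressed as an axis‐angle vector extracted from rotation matrices). This is an illustrative Python version of the computation; the actual message format between Unity and the KUKA controller is not specified here.

```python
import numpy as np

def task_space_error(tip_pos, tip_R, target_pos, target_R):
    """Positional error vector and rotational error (axis * angle, radians)
    that the robot controller would drive to zero."""
    dp = np.asarray(target_pos, float) - np.asarray(tip_pos, float)
    R_err = np.asarray(target_R) @ np.asarray(tip_R).T   # remaining rotation
    angle = np.arccos(np.clip((np.trace(R_err) - 1) / 2, -1.0, 1.0))
    if np.isclose(angle, 0.0):
        return dp, np.zeros(3)
    axis = np.array([R_err[2, 1] - R_err[1, 2],
                     R_err[0, 2] - R_err[2, 0],
                     R_err[1, 0] - R_err[0, 1]]) / (2 * np.sin(angle))
    return dp, axis * angle

# Example: tip at the origin, target offset and rotated 30 deg about z.
I = np.eye(3)
th = np.deg2rad(30)
Rz = np.array([[np.cos(th), -np.sin(th), 0],
               [np.sin(th),  np.cos(th), 0],
               [0, 0, 1]])
dp, rot = task_space_error([0, 0, 0], I, [1.0, 2.0, 3.0], Rz)
assert np.allclose(dp, [1.0, 2.0, 3.0])
assert np.allclose(rot, [0.0, 0.0, th])
```

Driving both residuals to zero moves the drill tip onto the planned entry point at the planned orientation, without the robot ever needing the absolute tip calibration.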
2.6. Experimental setup
We designed the marker holders for the drill to be modular so we could switch between the tracking systems without removing the entire part (see Figure 4). Therefore, we did not need to repeat the pivot calibration procedure between runs. The origin points of the infrared and ArUco markers were extracted from the virtual models and could be swapped during deployment. All models were printed in‐house with polylactic acid (PLA) on various Bambu Lab 3D printers (Bambu Lab, China). The ArUco markers were also made using a multi‐material 3D printer, eliminating errors caused by the misalignment of stickers on the final marker body. For robotic drilling, the drill was attached to the robot flange using a custom‐designed attachment compatible with the same marker adapter used for tracking during manual drilling.
FIGURE 4.

A photograph of the 3D printed skull with the iVation camera attached and the Dremel Drill with the active marker.
We decided not to use a medical drill because the available drill bits melted the phantom material and clogged the grooves. Instead, we opted for a Dremel Fortiflex Heavy Duty Flex Shaft Tool (Dremel, USA) because it offered greater flexibility with drill bits that could be hand‐picked to suit drilling into PLA.
The VSP established 30 holes to be drilled on a 3D‐printed skull model, as seen in Figure 1. To accommodate the tracking systems, we split the holes into left and right sides; after the first 15 holes were drilled, the camera/markers were swapped to face the other side. Due to the nature of optical tracking systems, users were instructed to prioritize hole entry positions when tracking issues arose.
2.7. Evaluation
Using four different navigation setups, we calculated the Euclidean distance and angle deviation between the planned holes and those drilled. Four surgeons (two craniomaxillofacial and two neurosurgeons) performed drillings on sixteen skulls (four per navigation scenario). Skulls were later scanned using a nanotom®m micro‐CT scanner (developed initially by Phoenix|X‐ray, currently under Baker Hughes) for high‐resolution imaging. Measurements were made using SpectoVR (Specto Medical AG, Switzerland), a 3D VR surgical planning software. The statistical analysis was performed using Two‐Way ANOVA and Tukey's multiple comparisons test.
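The two evaluation metrics, Euclidean distance between entry points and angular deviation between hole axes, can be computed directly from the planned and scanned hole geometry. A Python sketch of the measurement, with example values chosen for illustration:

```python
import numpy as np

def hole_errors(planned_entry, planned_axis, drilled_entry, drilled_axis):
    """Euclidean entry-point distance (input units) and axis deviation in
    degrees between a planned and a drilled hole."""
    pos_err = float(np.linalg.norm(np.asarray(drilled_entry, float)
                                   - np.asarray(planned_entry, float)))
    a = np.asarray(planned_axis, float)
    b = np.asarray(drilled_axis, float)
    cos_ang = np.clip(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)),
                      -1.0, 1.0)
    return pos_err, float(np.degrees(np.arccos(cos_ang)))

# Example: drilled hole offset 0.5 mm, axis tilted by atan(0.1) ~ 5.71 deg.
pos, ang = hole_errors([0, 0, 0], [0, 0, 1], [0.3, 0.4, 0.0], [0, 0.1, 1.0])
assert abs(pos - 0.5) < 1e-9
assert abs(ang - 5.71) < 0.01
```

Per‐configuration errors computed this way are then aggregated as mean ± SD for the statistical comparison.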
3. RESULTS
Tracking was lost at some locations while using iVation with hand‐guided navigation, preventing users from executing four holes per skull. Similarly, in the iVation robotic drilling condition, tracking issues prevented two holes per skull from being drilled. Table 1 reports the successful executions as mean ± SD.
TABLE 1.
Positional and angular error statistics for the successfully drilled holes per configuration.
| Configuration | Positional error | Angular error |
|---|---|---|
| Drilling guides | | |
| iVation cameras | | |
| ArUco markers | | |
| iVation robot | | |
Two‐way ANOVA revealed a significant difference (P < 0.0001, F = 39.31) between the configurations in the positional errors between the created drill hole and the planned entry point (Figure 5). The statistical comparisons indicate significant differences between most configurations (P < 0.0001). However, the positional error of the drilling guides configuration was not significantly different (P = 0.0755) from that of the iVation cameras mounted on a robotic arm. Similarly, no significant difference resulted between hand and robotic drilling using the iVation tracking system (P = 0.0588).
FIGURE 5.

The positional errors of the different configurations between the entry points of the hole drilled by the user and the planned locations from the preoperative plan (left). The angular error between the actual and planned hole center axis. The errors were measured to determine the rotational difference (in degrees) (right).
Two‐way ANOVA revealed a significant difference (P < 0.0001, F = 55.62) between the configurations in the angular errors between the created drill hole and the planned hole center axis (Figure 5). The angular error of the drilling guides configuration was statistically different (P < 0.0001) from that of all other tested configurations. The iVation hand‐guidance system also differed from the ArUco marker‐based system (P < 0.05) and from the iVation system on the robotic arm (P < 0.001). The system that used built‐in ArUco marker detection resulted in the largest angular errors, which again were significantly different (P < 0.0001) from those achieved using the iVation system mounted on the robotic arm.
4. DISCUSSION
In this study, we aimed to evaluate the feasibility of using AR technology as the visualization method for navigated surgery. We implemented a navigation system based on small‐workspace infrared cameras and proposed an alternative paradigm for AR visualization that eliminates the errors originating from the intraoperative patient registration process. In addition, we mounted the drill on a robotic arm and guided it using an iVation system to assess the human contribution to the resulting errors.
In our experimental setup, drill guides performed best in terms of positional and angular errors. By establishing this baseline, we highlighted the influence of the material used and the quality of 3D printing on the final accuracy. Using a soft PLA material in combination with an aggressive drill profile made it challenging for the surgeons to maintain the targeted angle (see Figure 5). Surgical guides are typically printed with a more rigid material (nylon, polyether ether ketone, or titanium), providing more stability when drilling. Additionally, it must be noted that all annotations were made manually, which, in combination with our manufacturing process, might have contributed to the errors in the drill guide configuration.
Our navigation application, which relies on the iVation tracking system, performed well and matched the manufacturer's overall system accuracy report. These results could be further improved by machining a custom drill adapter for the attachment of the infrared markers. We noticed that the 3D‐printed material bent when the drill handle was tightened, which affected the attachment. Initial experiments revealed some of the issues that surgeons faced while using the hand‐held drill, such as small target size, issues with depth perception and line of sight (LOS), or slipping of the drill bit on the bone surface.
We addressed some of these issues by displaying a 3D holographic model of the VSP above the phantom anatomy on the level of the user's eyes, which could be scaled for clearer visualization. The testers used a scale factor of 6‐8, which allowed them to position the tip without straining their eyes. Additionally, by not overlaying the VSP on top of the patient, we avoided the errors from the intraoperative patient registration process, whether through landmark detection or an additional ArUco marker. The system was also indifferent to the patient's movement because the camera was rigidly attached to the splint and reported the marker location relative to its origin.
Knowing patient and tool positions allowed us to offer the same 3D holographic model for the ArUco‐based navigation and, at the same time, to overlay it on the actual skull. While the overall results of the system based on ArUco markers were better than expected [24, 25], they were deemed unsuitable as a replacement for drill guides. It is worth noting that the surgeons found the preoperative plan overlay on the patient helpful. They appreciated the comprehensive view of the operating field without needing to perform a mental transformation between the floating projection and the physical skull. That said, we observed that the surgeons, while using the overlaid data for initial guidance, would then transition their view to the enlarged model for finer control.
Another issue we noticed was that the drill bit was slipping off the skull surface when drilled at a low angle. Surgeons had to turn the drill on before touching the surface so that it would bite into it at the desired location. If they started the drill after resting the tip on the surface, it would slip and move, making the user repeat the positioning. This problem does not occur with drill guides because their walls hold the drill shaft in place, preventing slipping.
We integrated our system on the KUKA LBR iiwa med robotic arm and hypothesized that a robotic system's stability and rigidity could overcome the slipping issue. By combining navigation and camera mounting with intelligent robot inverse kinematics (IK), we also aimed to solve the line‐of‐sight issues common with optical tracking systems. While positional errors were almost on par with manual drilling using 3D‐printed guides, we could not fully resolve these problems. Although reduced, slipping could still be observed when drilling at very steep angles to the surface, which stemmed from the flexing of our 3D‐printed mount. Also, because the robotic arm had a redundant degree of freedom, we could mostly maintain LOS with the camera but still encountered some tracking losses during drilling. While some LOS issues could be mitigated by using KUKA's IK control as a fallback, they would remain for manual drilling with the same navigation system.
In orthognathic surgery, the drillings are performed close to nerves, making the accuracy of the hole center line a critical element. With a variation between directions in our set of planned holes, the marker holder design would require thorough consideration and possibly simulations to ensure all the entry points would be reachable at the correct angle.
5. CONCLUSIONS
Our study demonstrates that while AR technology and robotic integration show promise for navigated surgery, significant challenges remain. Using rigid materials for 3D‐printed mounts, improved tracking systems, and enhanced visualization techniques can contribute to better accuracy and usability. Integrating robotic systems provides additional stability but does not entirely eliminate issues such as drill bit slipping and LOS problems. Future work should focus on refining these systems, exploring new materials and designs, and conducting thorough simulations to optimize surgical outcomes.
AUTHOR CONTRIBUTIONS
Marek Żelechowski: Conceptualization; formal analysis; investigation; methodology; software; writing—original draft; writing—review and editing. Jokin Zubizarreta‐Oteiza: Investigation; methodology; writing—review and editing. Murali Karnam: Investigation; methodology; writing—review and editing. Balázs Faludi: Software; writing—review and editing. Norbert Zentai: Software. Nicolas Gerig: Formal analysis; writing—review and editing. Georg Rauter: Writing—review and editing. Florian M. Thieringer: Conceptualization; methodology; writing—review and editing. Philippe C. Cattin: Conceptualization; methodology; writing—review and editing.
CONFLICT OF INTEREST STATEMENT
The authors declare no conflicts of interest.
ACKNOWLEDGEMENTS
This work was financially supported by the Werner‐Siemens Foundation through the MIRACLE project.
Żelechowski, M. , Zubizarreta‐Oteiza, J. , Karnam, M. , Faludi, B. , Zentai, N. , Gerig, N. , Rauter, G. , Thieringer, F.M. , Cattin, P.C. : Augmented reality navigation in orthognathic surgery: Comparative analysis and a paradigm shift. Healthc. Technol. Lett. 12, e12109 (2025). 10.1049/htl2.12109
DATA AVAILABILITY STATEMENT
The data supporting this study's findings are available from the corresponding author upon reasonable request.
