Journal of Neurological Surgery. Part B, Skull Base. 2017 May 3;78(5):385–392. doi: 10.1055/s-0037-1602791

Performance of Robotic Assistance for Skull Base Biopsy: A Phantom Study

Jian-Hua Zhu 1, Jing Wang 1, Yong-Gui Wang 2, Meng Li 2, Yu-Xing Guo 1, Xiao-Jing Liu 1, Chuan-Bin Guo 1
PMCID: PMC5582959  PMID: 28875116

Abstract

Objectives  This study aims to evaluate the feasibility of a custom robot system guided by optical cone beam computed tomography (CBCT)-based navigation for skull base biopsy.

Design  An accuracy study was conducted.

Setting  Platform for navigation and robot-aided surgery technology.

Participants  Phantom skull.

Main Outcome Measures  The primary outcome measure was the accuracy of robot-assisted needle biopsy for skull base tumors. A 14-gauge needle was automatically inserted into the intended target by the five-degree-of-freedom robot under optical navigation guidance. The result was displayed on the graphical user interface after matrix transformation. Postoperative image scanning was performed, and the result was verified by image fusion.

Results  All 20 interventions were successfully performed. The mean deviation of the needle tip was 0.56 ± 0.22 mm (measured by the navigation system) versus 1.73 ± 0.60 mm (measured by image fusion) ( p  < 0.05). The mean insertion depth was 52.3 mm (range: 49.7–55.2 mm). The mean angular deviations off the x-axis, y-axis, and z-axis were 1.51 ± 0.67, 2.33 ± 1.65, and 1.47 ± 1.16 degrees, respectively.

Conclusions  The experimental results show the robot system is efficient, reliable, and safe. The navigation accuracy is a significant factor in robotic procedures.

Keywords: robotics, image-guided biopsy, skull base, navigation

Introduction

With the development of minimally invasive surgery, percutaneous interventions are being used increasingly, to the benefit of both patients and surgeons. Image-guided percutaneous biopsy of suspicious lesions, particularly in deeply situated target areas, is the first step in the clinical management of these lesions. 1 However, the efficiency of the diagnosis depends on the accuracy of the needle insertion. Lesions in the deep lateral facial region, especially in the skull base and infratemporal fossa, still challenge surgeons technically because of the need to locate the skin entry site and adjust the angulation of the needle manually, and because of the obstacles posed by the mandibular bone and the many vital structures (e.g., the internal carotid artery, jugular vein, and cranial nerves). Existing techniques such as image navigation can provide significant assistance, but much still depends on the surgeon's experience and hand–eye–mind coordination. 2 Robot-assisted surgery is another alternative that can assist a surgeon who is subject to tremor, fatigue, and the risk of radiation exposure. The integration of imaging and robotic technology can act as a “third hand and eye” for the surgeon and avoid the need to switch attention repeatedly between the patient and the monitor. 1 3 Percutaneous interventions guided by cone beam computed tomography (CBCT) have been shown to be effective for biopsy of suspicious lesions. 4

We developed a robot system for the biopsy of suspicious lesions, and in the present study we assessed the accuracy and feasibility of this system, guided by optical CBCT-based navigation, in automatically performing needle insertions that simulate percutaneous needle biopsy. Although high-quality visualization can partly compensate for the lack of haptic feedback in robotic surgery, haptics remain important for safety and for reducing postoperative complications, particularly in skull base surgery. 5 In contrast to previously developed robot systems, 6 7 8 force feedback was incorporated into this system. To the best of our knowledge, although robot-assisted techniques have been applied in oral and maxillofacial surgery, 9 their use for needle biopsy of the skull base has not yet been reported.

Materials and Methods

Phantom Model

A synthetic human skull model (A150, Kexin Scientific Equipment, Zhangjiagang, China) was used, with Plasticine (Taoyuan Electronic Technology Co.) placed around the skull base to imitate the soft tissue. Meatballs 2 to 2.5 cm in diameter were placed beneath the skull base to act as the target “tumors.” A total of 20 different approaches were performed with the robotic system on this model.

Robotic Device

As shown in Fig. 1 , the robot device is an arch-like structure with five degrees of freedom (DOF), consisting of a positioning mechanism and an end effector. The robot was fixed rigidly on a custom-made table whose dimensions matched those of a standard operating table. The robot weighed only 15 kg, making it convenient for surgeons to assemble. Taking into consideration the configuration of the operating table and the need to maintain hygiene, the robot device was constructed with dimensions of 55 cm × 20 cm (width × height) in the “home position.” The end effector was connected to the fifth joint by a clamping-slot connection to facilitate easy exchange and sterilization. The design of the robot achieves a 300 × 400 × 400 mm (x, y, and z) three-dimensional workspace. To reach this workspace, the second, third, and fourth joints rotate through ± 50, ± 30, and ± 51 degrees, respectively, while the first and fifth joints translate within ± 125 and ± 110 mm, respectively. In theory, six DOF are necessary to achieve arbitrary orientations for deployment of medical instruments; however, for needle insertion (without rotation of the needle about its axis), this combination of three rotational DOF and two translational DOF is sufficient for positioning and advancing the needle with the end effector. A six-axis force sensor was placed between the fifth joint and the end effector; the axial force on the needle is transferred to the sensor and displayed on the graphical user interface (GUI) after signal conversion.

Fig. 1 The skull phantom and the robot system setup with five degrees of freedom (black arrows).
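As a rough, illustrative check of the kinematic envelope described above, the sketch below tests a requested joint configuration against the stated joint ranges. The joint ordering and the helper names are hypothetical; only the numeric limits come from the description above.

```python
from dataclasses import dataclass

@dataclass
class JointLimits:
    """Joint ranges stated for the five-DOF robot (assumed symmetric about zero)."""
    j1_mm: float = 125.0   # first joint: translation, +/- 125 mm
    j2_deg: float = 50.0   # second joint: rotation, +/- 50 degrees
    j3_deg: float = 30.0   # third joint: rotation, +/- 30 degrees
    j4_deg: float = 51.0   # fourth joint: rotation, +/- 51 degrees
    j5_mm: float = 110.0   # fifth joint: translation (needle advance), +/- 110 mm

def within_limits(q, limits=JointLimits()):
    """q = (j1_mm, j2_deg, j3_deg, j4_deg, j5_mm); True if every joint is in range."""
    bounds = (limits.j1_mm, limits.j2_deg, limits.j3_deg, limits.j4_deg, limits.j5_mm)
    return all(abs(v) <= b for v, b in zip(q, bounds))

print(within_limits((100.0, 45.0, -20.0, 30.0, 60.0)))   # True: inside the envelope
print(within_limits((100.0, 55.0, -20.0, 30.0, 60.0)))   # False: joint 2 exceeds +/- 50 degrees
```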

Registration

To insert the needle into the target, the robot, the skull model, and the CBCT images need to be correlated by matrix transformation. The optical tracking system POLARIS (Northern Digital Inc., Waterloo, Canada) served as an intermediate coordinate system to align the others. For image-to-skull-model registration, titanium screws (1.5 mm × 5 mm) were drilled into the skull model to act as fiducial markers; the recommended collection of paired points included the anterior nasal spine, the nasomaxillary buttresses of the maxilla, the frontozygomatic suture/frontozygomatic processes, the roof of the external auditory canal, the mastoid foramen, and the tip of the mastoid. 10 11 The positions of the titanium screws were identified manually on the CBCT slices to obtain their coordinates in the image space, while their coordinates in the tracking system, acquired with a calibrated tracked pointer, were displayed on the GUI after data exchange. The transformation between the corresponding fiducial markers was computed with the iterative closest point algorithm 12 13 14 to register the CBCT image to the skull model. For registration of the robot to the tracking system, the matrix transformation was likewise obtained by a point-to-point registration process using a least-squares fitting algorithm, 15 and a dynamic reference frame was attached to the fifth joint, with which the coordinates of the needle tip were updated continuously to track the position of the needle in real time. After robot-to-navigation registration, the motion of the robot was guided by the optical navigation system through a closed-loop control strategy ( Fig. 2 ); the resulting motion error, verified with a calibrated standard model and visual feedback, was negligible. The fiducial registration error (FRE) of the CBCT image-to-skull-model registration was checked with the navigation system in order to limit the target registration error (TRE).

Fig. 2 Closed-loop control to orient the robot, guided by the optical tracking system.
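The paired-point step described above can be illustrated with a minimal sketch of a closed-form least-squares rigid fit (the SVD-based Kabsch/Umeyama solution) followed by an FRE computation. This is a generic example with made-up fiducial coordinates, not the authors' implementation.

```python
import numpy as np

def rigid_fit(P, Q):
    """Least-squares rigid transform (R, t) mapping points P -> Q.
    P, Q: (N, 3) arrays of paired fiducial coordinates (e.g., image space and
    tracker space). Closed-form SVD solution (Kabsch/Umeyama, no scaling)."""
    Pc, Qc = P.mean(axis=0), Q.mean(axis=0)
    H = (P - Pc).T @ (Q - Qc)                 # cross-covariance of centered points
    U, _, Vt = np.linalg.svd(H)
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])  # avoid reflection
    R = Vt.T @ D @ U.T
    t = Qc - R @ Pc
    return R, t

def fre(P, Q, R, t):
    """Fiducial registration error: RMS residual distance after registration."""
    residuals = np.linalg.norm((P @ R.T + t) - Q, axis=1)
    return np.sqrt(np.mean(residuals ** 2))

# Hypothetical paired fiducials (mm): image-space vs. tracker-space coordinates.
P = np.array([[0, 0, 0], [60, 5, 0], [30, 70, 10], [10, 40, 55], [55, 60, 30]], float)
rng = np.random.default_rng(0)
t_true = np.array([10.0, -5.0, 20.0])
Q = P + t_true + rng.normal(scale=0.3, size=P.shape)   # 0.3 mm localization noise

R, t = rigid_fit(P, Q)
print(f"FRE = {fre(P, Q, R, t):.2f} mm")
```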

Procedure

Preoperative CBCT data of the phantom were acquired using a NewTom VG CBCT scanner (Quantitative Radiology, Verona, Italy). The CBCT data (110 kV, 13.88 mAs, field of view 15 cm × 15 cm, matrix 512 × 512, slice thickness 0.3 mm) were transferred to the computer console in DICOM format and displayed on a custom GUI, developed in C++, for planning, registration, and navigation. Socket communication over the local area network was used to connect the GUI with the robot controller, which ran on a separate computer console. After segmentation of the “tumor” with a graph-cut algorithm, 16 the target point and the “skin” entry point were selected graphically on the GUI to define the needle trajectory automatically, with allowance for manual adjustment. After the planning process, the relevant data were calculated and sent to the robot controller. To keep the spatial relationship stable, the skull model was fixed firmly to the trial table with a head clamp. After registration, the robot, with a universal 14-gauge diamond-tip needle clamped into the custom end effector, was driven to the target position automatically along the planned straight trajectory. The optical tracking system (20 Hz update rate, 0.35 mm accuracy) updated the needle orientation data in real time, with imaging feedback displayed on the navigation interface. The interactive control unit was designed in a “man-in-closed-loop” mode ( Fig. 3 ), incorporating the tracking system, the robot, the imaging data, and the surgeon, to provide dual feedback for calibrating and monitoring the process. Upon reaching the entry point, the robot remained motionless and advanced the needle further only after the surgeon's confirmation. When the puncture was completed, the instantaneous orientation data acquired by the navigation system were sent back to the GUI for accuracy verification after matrix transformation. Subsequently, after the needle had been released, a CBCT scan was acquired to reverify the position of the needle tip and its trajectory ( Fig. 4 ). The total errors, including the angular and distance deviations, were measured after image fusion of the preoperative and postoperative images ( Fig. 5 ).

Fig. 3 Man-in-closed-loop control.

Fig. 4 Postoperative cone beam computed tomography image of the skull with the needle (white arrows).

Fig. 5 Image fusion of the preoperative skull (silver) and postoperative skull (brown-green) (the pink line is the planned trajectory, and the brownish-green linear object is the needle [white arrows]).
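To make the planning step described in the Procedure concrete, the sketch below derives the needle direction and insertion depth from an entry point and a target point, and computes per-axis angular deviations between a planned and an achieved direction. The coordinates and the per-axis convention are illustrative assumptions; the paper does not specify its exact angular-deviation definition.

```python
import numpy as np

def plan_trajectory(entry, target):
    """Straight-line needle trajectory from a 'skin' entry point to the target.
    Returns the unit direction vector and the insertion depth (mm)."""
    entry, target = np.asarray(entry, float), np.asarray(target, float)
    v = target - entry
    depth = np.linalg.norm(v)
    return v / depth, depth

def axis_deviation_deg(planned_dir, actual_dir):
    """Per-axis angular deviation (degrees) between two unit directions, taken as
    the difference of their angles to the x-, y-, and z-axes (one plausible convention)."""
    a = np.degrees(np.arccos(np.clip(planned_dir, -1.0, 1.0)))
    b = np.degrees(np.arccos(np.clip(actual_dir, -1.0, 1.0)))
    return np.abs(a - b)

# Hypothetical points (mm) in the image coordinate system.
entry, target = (12.0, 40.0, 85.0), (30.0, 10.0, 50.0)
d_planned, depth = plan_trajectory(entry, target)
print(f"insertion depth = {depth:.1f} mm")   # same order as the ~50 mm depths reported later

# Needle tip actually reached a slightly different point:
d_actual, _ = plan_trajectory(entry, (30.5, 11.0, 49.0))
print("angular deviation off x/y/z axes (deg):",
      np.round(axis_deviation_deg(d_planned, d_actual), 2))
```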

Safety Considerations

Safety can be compromised by malfunction of hardware or software. Korb et al 17 categorized surgical robot risks into seven types according to probability of occurrence and severity and described the measures to be taken to minimize them. We incorporated two measures to guarantee safety. First, we included an emergency switch that could cut off power to the system if an accident happened; all components of the system would then stop moving within 0.5 seconds and remain stationary. The surgeon would then manage the risk state and decide whether to allow further movement after reverifying the kinematic parameters. Second, an emergency button was provided in the control unit of the robot controller; once clicked, further movement of all components was forbidden and, if necessary, the robot could be returned to the “home position.” In addition to these measures, we incorporated force feedback in the system to supervise the whole operative process. In view of literature reports on needle insertion forces in different tissue materials and with different needle sizes, 18 19 and of the percutaneous needle insertion forces in the craniofacial region measured on cadavers in our prior work (data not published), we set 30 N of axial force as the safety threshold; if this was exceeded, an alert would be shown on the GUI and an emergency response recommended.
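A minimal sketch of the force-threshold check described here (the function name and return format are illustrative; only the 30 N threshold comes from the text):

```python
AXIAL_FORCE_LIMIT_N = 30.0   # safety threshold stated above

def check_axial_force(force_n, limit=AXIAL_FORCE_LIMIT_N):
    """Return an alert flag for the GUI when the measured axial force exceeds the
    threshold; the calling control loop would then halt needle advance and ask
    the surgeon for an emergency decision. (Sketch only; the sensor-reading and
    robot-halt interfaces are not specified here.)"""
    if force_n > limit:
        return {"alert": True, "message": f"Axial force {force_n:.1f} N exceeds {limit:.0f} N"}
    return {"alert": False, "message": ""}

print(check_axial_force(12.4))   # normal insertion force
print(check_axial_force(33.8))   # triggers the alert shown on the GUI
```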

Statistics

IBM SPSS Statistics 20 (IBM Corp., New York, United States) was used for statistical analysis. The paired t -test was used to compare the differences in the deviation of the needle tip from the target measured by the navigation system and by image fusion.
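The comparison can be reproduced from the per-trial deviations listed in Table 1 (Results), here with SciPy as a stand-in for the SPSS analysis described above:

```python
# Paired t-test on the per-trial needle-tip deviations from Table 1
# (error 1: navigation system; error 2: image fusion).
from scipy import stats

error1 = [0.20, 0.29, 0.57, 0.81, 0.32, 0.38, 0.47, 0.49, 0.55, 0.49,
          0.56, 0.83, 0.61, 0.84, 0.50, 0.63, 0.57, 0.93, 0.23, 0.96]
error2 = [1.81, 2.13, 1.90, 2.53, 1.28, 1.23, 2.71, 1.29, 1.48, 2.06,
          1.04, 1.35, 1.13, 1.96, 2.83, 1.34, 2.81, 0.99, 1.72, 1.02]

res = stats.ttest_rel(error1, error2)
# p is far below 0.05, consistent with the reported p < 0.0001.
print(f"t = {res.statistic:.2f}, p = {res.pvalue:.2g}")
```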

Results

In all 20 procedures ( Table 1 ), the mean deviation of the needle tip from the target was 0.56 ± 0.22 mm (range: 0.20–0.96 mm) as measured by the navigation system versus 1.73 ± 0.60 mm (range: 0.99–2.83 mm) as measured by image fusion ( p  < 0.0001) ( Fig. 6 ). As shown in Table 2 , the registration of the CBCT data to the skull model achieved a mean FRE of 0.52 ± 0.08 mm. The mean insertion depth was 52.3 mm (range: 49.7–55.2 mm) for needle insertions simulating percutaneous biopsy. The mean angular deviations off the x-axis, y-axis, and z-axis were 1.51 ± 0.67, 2.33 ± 1.65, and 1.47 ± 1.16 degrees, respectively.

Table 1. Deviation of the needle tip from the target.

Number  Error 1 a (mm)  Error 2 b (mm)
1 0.20 1.81
2 0.29 2.13
3 0.57 1.90
4 0.81 2.53
5 0.32 1.28
6 0.38 1.23
7 0.47 2.71
8 0.49 1.29
9 0.55 1.48
10 0.49 2.06
11 0.56 1.04
12 0.83 1.35
13 0.61 1.13
14 0.84 1.96
15 0.50 2.83
16 0.63 1.34
17 0.57 2.81
18 0.93 0.99
19 0.23 1.72
20 0.96 1.02
a Error 1 = the error measured by the navigation system.
b Error 2 = the error measured by image fusion.

Fig. 6 The p value, mean deviation of the needle tip from the target, and standard deviation measured by the navigation system (error 1) and by image fusion (error 2).

Table 2. FRE, angular deviation, and insertion depth for each approach.

Number  FRE (mm)  Angular deviation off x-axis (degrees)  Off y-axis (degrees)  Off z-axis (degrees)  Insertion depth (mm)
1 0.53 1.23 6.20 3.87 53.0
2 0.66 2.37 4.79 1.29 54.3
3 0.47 1.90 0.95 1.79 50.2
4 0.50 0.67 1.82 1.02 50.5
5 0.49 1.46 3.88 1.77 55.2
6 0.54 1.04 2.33 2.64 51.0
7 0.52 2.19 1.80 0.01 54.9
8 0.40 1.91 2.54 1.02 53.3
9 0.53 0.81 0.47 0.74 52.8
10 0.45 2.91 5.54 4.75 51.4
11 0.58 0.62 1.30 0.29 52.4
12 0.48 0.99 1.85 0.85 54.9
13 0.59 0.97 3.15 1.40 52.0
14 0.57 0.89 3.09 1.58 53.0
15 0.71 1.99 2.07 0.15 50.7
16 0.61 2.24 0.16 0.96 52.9
17 0.52 2.30 0.87 2.22 50.1
18 0.74 1.79 0.27 1.11 50.1
19 0.47 0.82 2.20 1.70 54.0
20 0.47 1.01 1.36 0.35 49.7
Mean ± SD 0.52 ± 0.08 1.51 ± 0.67 2.33 ± 1.65 1.47 ± 1.16 52.30 ± 1.75

Abbreviations: FRE, fiducial registration error; SD, standard deviation.

Discussion

As a relatively novel technology, medical robots are presently used mainly in minimally invasive surgery for procedures such as biopsy and local therapy. 20 Imprecise insertions can have a significant impact on the efficiency of diagnosis and treatment, besides causing serious complications. The integration of robot systems with image-based navigation is becoming the trend for many medical applications. 3 Imaging data can be integrated with an accurate prediction of needle–tissue interaction for precise needle placement in robotic procedures. 18 Various imaging modalities, including CT/fluoroscopy, 2 7 8 19 21 22 23 24 25 26 27 MRI, 28 29 ultrasound (US), 30 31 and CBCT, 32 have been used in combination with different robot systems. Along with CT fluoroscopy, intraoperative MRI, and US, optical navigation is now an important tracking modality for real-time guidance. Real-time vision feedback can verify the needle's advancement along the planned path without intraoperative X-ray exposure to the patient. In the present preliminary study, the combination of force and vision feedback was used to prevent accidental collision and guarantee efficient needle placement. Accuracy and safety depend on the uninterrupted update of position data acquired by optical tracking, which provides accurate vision feedback and closed-loop control. A faster data acquisition frequency has been advised, 22 but is not yet available for this optical localizer, which is the main hurdle to increasing intervention speed.

No standard for positioning accuracy has been described for routine clinical practice, and surgeons define the error threshold according to the individual case. 1 26 28 Any component of the system can contribute to the overall system error: problems may arise from image distortion, target displacement due to tissue deformation or unexpected movement, registration error, optical localizer error, limitations of the control and compensation algorithms, inherent kinematic limitations, needle deflection, and human error. 18 22 28 Although it is difficult to separate the contribution of each component, needle deflection was primarily responsible for needle displacement in previous studies. 2 19 23 24 One option for correcting needle deflection is to create a model that theoretically estimates needle–tissue interaction so that compensation can be made in advance (during planning). 20 Proper force-interaction modeling for needle insertion is essential to deal with this problem, and we will focus on it in future work. Although many efforts have been made to develop theoretical models, 33 34 35 they have not fully succeeded, because of the complex biomechanical properties of soft tissues. So far, in clinical practice, it has not been possible to distinguish between different tissue types and layers with force feedback data. Manual insertion of the needle, guided by the tactile experience of the surgeon, is an alternative to the robotic system. 22 But as Gerovichev et al 36 have reported, real-time vision feedback has proved more efficient than force feedback for detecting different tissue types. In this study, we therefore used the six-axis force sensor not as a navigation device but as a force-feedback warning mechanism to detect excessive resistance caused by needle slippage or deflection. In fact, in this phantom study, neither the Plasticine nor the meatballs used as targets exhibited significant anisotropy, unlike the complex behavior of soft tissues. The targeting error measured by the navigation system and by image fusion of preoperative and postoperative CBCT data showed a statistically significant difference ( p  < 0.0001), which means that the navigation errors, including registration error and optical localizer error, may contribute considerably to the needle tip location error in this setting, where a relatively stiff needle, a short insertion distance, and nonelastic Plasticine were used. Although we verified the FRE after each registration of the image to the skull model, the FRE does not correlate linearly with the TRE; even if the FRE is small, there can still be a large TRE. 3 Reported mean TREs are 0.79 mm (0.1–1.8 mm) in the lateral skull base 10 and 0.8 mm (0.1–1.6 mm) in the viscerocranium 11 with different paired-point matching schemes. As Bao et al 37 advised, better registration accuracy can be obtained with a noncollinear arrangement and wide distribution of the fiducial markers around the target, and with the centroid of the fiducial markers as close to the target as possible. Therefore, we combined different fiducial markers to improve the registration accuracy around the planned “skin” entry points and the target points close to the skull base. Although the robot motion error is negligible as a consequence of closed-loop control after the registration of the robot to the navigation system, the verification using only visual feedback is a limitation of the study.
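As background for why a small FRE does not guarantee a small TRE, the widely used approximation of Fitzpatrick and colleagues (cited here only as general context, not as part of the present study's analysis) relates the expected TRE at a target position r to the fiducial localization error (FLE), the number of fiducials N, and the fiducial configuration:

```latex
\mathrm{TRE}^2(r) \;\approx\; \frac{\mathrm{FLE}^2}{N}\left(1 + \frac{1}{3}\sum_{k=1}^{3}\frac{d_k^2}{f_k^2}\right),
\qquad
\langle \mathrm{FRE}^2 \rangle \;\approx\; \left(1 - \frac{2}{N}\right)\mathrm{FLE}^2 ,
```

where d_k is the distance of the target from the kth principal axis of the fiducial configuration and f_k is the root-mean-square distance of the fiducials from that axis. This makes explicit the advice of Bao et al that fiducials be spread widely around the target with their centroid as close to the target as possible.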
Human error is another important factor influencing the results. A 0.5 to 0.9 mm deviation of the needle tip can be caused by a deviation of only a few pixels when the target point is selected on the computer screen with a mouse click. 38 Unfortunately, it is hardly practical to separate all the possible influencing factors and to quantify them. The total error is not simply the sum of the component errors; it may be smaller than that sum. 28 A tumor in the skull base or infratemporal region is unlikely to be discovered early unless it causes clinical symptoms such as pain, swelling, or limitation of mouth opening, by which time the size of the tumor usually exceeds 2 cm. Given this, a needle placement accuracy below 3 mm may be acceptable for needle biopsy in clinical practice; however, 3 mm is also the minimum safety distance used to protect the carotid artery in the preoperative planning for this robot system.
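The magnitude of this pixel-selection effect can be checked against the scan parameters given in Methods (15 cm field of view, 512 × 512 matrix), with the caveat that reference 38 may assume a different image resolution:

```latex
\frac{150\ \mathrm{mm}}{512\ \mathrm{pixels}} \approx 0.29\ \mathrm{mm/pixel}
\quad\Longrightarrow\quad
2\text{--}3\ \text{pixels} \approx 0.6\text{--}0.9\ \mathrm{mm},
```

which is of the same order as the 0.5 to 0.9 mm figure cited.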

Although medical robots have not been widely applied in clinical practice, various robot systems are at different stages of development for minimally invasive treatments. The iSYS1 robot system, comprising a seven-DOF multifunctional holding arm and a four-DOF robot positioning unit (RPU), showed a targeting accuracy of 2.3 ± 0.8 mm (range: 0.9–3.7 mm) at a needle insertion depth of 92.8 ± 14.4 mm. 26 However, with only a 40 × 40 mm workspace, manual prepositioning of the RPU with a passive arm is needed to bring the entry point within this area. In comparison, our system allows convenient adjustment of the pose through rotary and linear motion. Whereas the iSYS1 robot system inserts the needle a few millimeters deeper to compensate for the elastic recoil caused by needle–tissue interaction, 26 a closed-loop control strategy was implemented in our study. Another phantom study of the iSYS1 robot system with CBCT-based guidance showed 1.1 mm (range: 0–4.5 mm) accuracy over an average 8.5 cm (range: 4.2–13.5 cm) needle path. 32 This result is better than that reported in the earlier study, and the primary cause may be that a C-arm-mounted CBCT device was used to provide fluoroscopic monitoring so that fine adjustments could be made during the procedure. Differences in the mechanical properties of the phantoms used in the two studies could be another nonnegligible reason. But even if the need for CBCT data is cut to the minimum, radiation exposure to the patient cannot be avoided altogether, and in that respect optical navigation is superior. Song et al 30 implemented the first in vivo trial of a two-stage Cartesian robot for prostate brachytherapy, in which they demonstrated the accuracy of the robotic arm, with only submillimeter errors detected by the POLARIS system, and the ability to make fine adjustments whenever US detected needle deflection. Although locating the needle tip with US is prone to error, the ability to make fine adjustments to compensate for needle deflection without X-ray exposure is an obvious advantage. However, for the infratemporal and skull base areas, which are surrounded by bony structures and have access obstructed by the mandibular bone, US guidance is not feasible. Pollock et al 25 designed a phantom study to compare AcuBot robot-assisted needle insertion with manual needle insertion guided by CT-based optical navigation, and the former showed significantly better accuracy (1.2 vs. 5.8 mm). Phantom studies, 7 19 24 25 28 32 animal trials, 24 27 and human trials 29 30 have all shown that robot-assisted interventional surgery is a promising field offering considerable advantages, particularly in improving precision and efficacy.

The study does have several limitations worth discussing. First, this is a preliminary phantom study without complicating structures such as the carotid artery and nerves; further studies should be performed in animal models and humans to explore robot-assisted needle insertion, including multiangulated trajectories. Second, because the inserted markers used for registration are invasive in clinical use, noninvasive methods such as point-based registration with artificial markers attached to an external reference frame, or surface matching, should be considered. Furthermore, the head should be rigidly fixed with a head clamp during the operation. Because accidental head motion can occur in reality, a dynamic reference frame can be rigidly fixed to the patient's head to track its position in real time, as is done in current routine clinical navigation; this would allow immediate compensation of the planned trajectory without the need for reregistration. Last but not least, although the current robot serves as an auxiliary positioning device for biopsy, it could be fitted with different end effectors and developed for various needle interventions, such as radiofrequency thermocoagulation of the Gasserian ganglion and brachytherapy, which is an interesting direction for our future work.

Conclusions

We developed a geared robot system for percutaneous interventions and verified the accuracy and feasibility of the robotic positioning system guided by CBCT-based optical navigation. The experimental results show the robot system to be efficient, reliable, and safe. The navigation accuracy, including the overall registration error and the optical localizer error, is one of the most significant factors in robotic procedures. The “man-in-closed-loop” control mode and the combination of force and vision feedback are critical to ensuring accuracy and safety in needle placement.

Acknowledgments

This work was funded by the China National High-tech R&D Program (863 Program) under Grant No. 2012AA041606, the Beijing Science and Technology Project under Grant No. Z141100002014003, and the Beijing Municipal Science & Technology Commission under Grant No. Z161100000516043. We thank Dr. Mu-Qing Liu (Department of Radiology, Peking University School and Hospital of Stomatology) and Gen-Yuan Xia (Institute of Computer-Aided-Design, Graphics and Visualization, School of Software, Tsinghua University) for their significant assistance during the preparation of this article.

Conflict of Interest None.

Note

Prof. Chuan-Bin Guo and Dr. Xiao-Jing Liu contributed equally to the article.

References

1. Kronreif G, Fürst M, Kettenbach J, Figl M, Hanel R. Robotic guidance for percutaneous interventions. Adv Robot. 2003;17(06):541–560.
2. Masamune K, Fichtinger G, Patriciu A, et al. System for robotically assisted percutaneous procedures with computed tomography guidance. Comput Aided Surg. 2001;6(06):370–383. doi: 10.1002/igs.10024.
3. Masamune M, Hong J. Advanced imaging and robotics technologies for medical applications. Int J Optomechatronics. 2011;5(04):299–321.
4. Kim H, Park C M, Lee S M, Goo J M. C-arm cone-beam CT virtual navigation-guided percutaneous mediastinal mass biopsy: diagnostic accuracy and complications. Eur Radiol. 2015;25(12):3508–3517. doi: 10.1007/s00330-015-3762-8.
5. Maan Z N, Gibbins N, Al-Jabri T, D'Souza A R. The use of robotics in otolaryngology-head and neck surgery: a systematic review. Am J Otolaryngol. 2012;33(01):137–146. doi: 10.1016/j.amjoto.2011.04.003.
6. Cleary K, Melzer A, Watson V, Kronreif G, Stoianovici D. Interventional robotic systems: applications and technology state-of-the-art. Minim Invasive Ther Allied Technol. 2006;15(02):101–113. doi: 10.1080/13645700600674179.
7. Koethe Y, Xu S, Velusamy G, Wood B J, Venkatesan A M. Accuracy and efficacy of percutaneous biopsy and ablation using robotic assistance under computed tomography guidance: a phantom study. Eur Radiol. 2014;24(03):723–730. doi: 10.1007/s00330-013-3056-y.
8. Solomon S B, Patriciu A, Bohlman M E, Kavoussi L R, Stoianovici D. Robotically driven interventions: a method of using CT fluoroscopy without radiation exposure to the physician. Radiology. 2002;225(01):277–282. doi: 10.1148/radiol.2251011133.
9. De Ceulaer J, De Clercq C, Swennen G R. Robotic surgery in oral and maxillofacial, craniofacial and head and neck surgery: a systematic review of the literature. Int J Oral Maxillofac Surg. 2012;41(11):1311–1324. doi: 10.1016/j.ijom.2012.05.035.
10. Caversaccio M, Zulliger D, Bächler R, Nolte L P, Häusler R. Practical aspects for optimal registration (matching) on the lateral skull base with an optical frameless computer-aided pointer system. Am J Otol. 2000;21(06):863–870.
11. Luebbers H T, Messmer P, Obwegeser J A, et al. Comparison of different registration methods for surgical navigation in cranio-maxillofacial surgery. J Craniomaxillofac Surg. 2008;36(02):109–116. doi: 10.1016/j.jcms.2007.09.002.
12. Besl P J, Mckay H D. A method for registration of 3-D shapes. IEEE Trans Pattern Anal Mach Intell. 1992;14:239–256.
13. Masuda T, Yokoya N. A robust method for registration and segmentation of multiple range images. Comput Vis Image Und. 1995;61:295–307.
14. Bae K H, Lichti D. A method for automated registration of unorganised point clouds. ISPRS J Photogramm. 2008;63(01):36–54.
15. Umeyama S. Least-squares estimation of transformation parameters between two point patterns. IEEE T Pattern Anal. 1991;13(04):376–380.
16. Xu N, Ahuja N, Bansal R. Object segmentation using graph cuts based active contours. Comput Vis Image Und. 2007;107:210–224.
17. Korb W, Kornfeld M, Birkfellner W, et al. Risk analysis and safety assessment in surgical robotics: a case study on a biopsy robot. Minim Invasive Ther Allied Technol. 2005;14(01):23–31. doi: 10.1080/13645700510010827.
18. Abolhassani N, Patel R, Moallem M. Needle insertion into soft tissue: a survey. Med Eng Phys. 2007;29(04):413–431. doi: 10.1016/j.medengphy.2006.07.003.
19. Zhou Y, Thiruvalluvan K, Krzeminski L, Moore W H, Xu Z, Liang Z. CT-guided robotic needle biopsy of lung nodules with respiratory motion - experimental system and preliminary test. Int J Med Robot. 2013;9(03):317–330. doi: 10.1002/rcs.1441.
20. Dogangil G, Davies B L, Rodriguez y Baena F. A review of medical robotics for minimally invasive soft tissue surgery. Proc Inst Mech Eng H. 2010;224(05):653–679. doi: 10.1243/09544119JEIM591.
21. Kettenbach J, Kronreif G, Figl M, et al. Robot-assisted biopsy using computed tomography-guidance: initial results from in vitro tests. Invest Radiol. 2005;40(04):219–228. doi: 10.1097/01.rli.0000155285.05672.cf.
22. Tovar-Arriaga S, Vargas J E, Ramos J M, Aceves M A, Gorrostieta E, Kalender W A. A fully sensorized cooperative robotic system for surgical interventions. Sensors (Basel). 2012;12(07):9423–9447. doi: 10.3390/s120709423.
23. Aghayev E, Ebert L C, Christe A, et al. CT data-based navigation for post-mortem biopsy--a feasibility study. J Forensic Leg Med. 2008;15(06):382–387. doi: 10.1016/j.jflm.2008.02.007.
24. Penzkofer T, Isfort P, Bruners P, et al. Robot arm based flat panel CT-guided electromagnetic tracked spine interventions: phantom and animal model experiments. Eur Radiol. 2010;20(11):2656–2662. doi: 10.1007/s00330-010-1837-0.
25. Pollock R, Mozer P, Guzzo T J, et al. Prospects in percutaneous ablative targeting: comparison of a computer-assisted navigation system and the AcuBot Robotic System. J Endourol. 2010;24(08):1269–1272. doi: 10.1089/end.2009.0482.
26. Kettenbach J, Kara L, Toporek G, Fuerst M, Kronreif G. A robotic needle-positioning and guidance system for CT-guided puncture: ex vivo results. Minim Invasive Ther Allied Technol. 2014;23(05):271–278. doi: 10.3109/13645706.2014.928641.
27. Yanof J, Haaga J, Klahr P, et al. CT-integrated robot for interventional procedures: preliminary experiment and computer-human interfaces. Comput Aided Surg. 2001;6(06):352–359. doi: 10.1002/igs.10022.
28. Seifabadi R, Cho N B, Song S E, et al. Accuracy study of a robotic system for MRI-guided prostate needle placement. Int J Med Robot. 2013;9(03):305–316. doi: 10.1002/rcs.1440.
29. Zangos S, Melzer A, Eichler K, et al. MR-compatible assistance system for biopsy in a high-field-strength system: initial results in patients with suspicious prostate lesions. Radiology. 2011;259(03):903–910. doi: 10.1148/radiol.11101559.
30. Song D Y, Burdette E C, Fiene J, et al. Robotic needle guide for prostate brachytherapy: clinical testing of feasibility and performance. Brachytherapy. 2011;10(01):57–63. doi: 10.1016/j.brachy.2010.01.003.
31. Davies B L, Harris S J, Dibble E. Brachytherapy--an example of a urological minimally invasive robotic procedure. Int J Med Robot. 2004;1(01):88–96. doi: 10.1002/rcs.10.
32. Schulz B, Eichler K, Siebenhandl P, et al. Accuracy and speed of robotic assisted needle interventions using a modern cone beam computed tomography intervention suite: a phantom study. Eur Radiol. 2013;23(01):198–204. doi: 10.1007/s00330-012-2585-0.
33. Simone C, Okamura A M. Modeling of needle insertion forces for robot-assisted percutaneous therapy. International Conference on Robotics & Automation. 2002;2:2085–2091.
34. Kataoka H, Washio T, Audette M, Mizuhara K. A model for relations between needle deflection, force, and thickness on needle penetration. Paper presented at: Medical Image Computing and Computer-Assisted Intervention (MICCAI) 2001; October 14-17, 2001; Utrecht, the Netherlands.
35. Okamura A M, Simone C, O'Leary M D. Force modeling for needle insertion into soft tissue. IEEE Trans Biomed Eng. 2004;51(10):1707–1716. doi: 10.1109/TBME.2004.831542.
36. Gerovichev O, Marayong P, Okamura A M. The effect of visual and haptic feedback on manual and teleoperated needle insertion. Lect Notes Comput Sc. 2002;2488:147–154.
37. Bao N, Chen Y, Yue Y, et al. Fiducial markers configuration optimization in image-guided surgery. Biomed Mater Eng. 2014;24(06):3361–3371. doi: 10.3233/BME-141159.
38. Kettenbach J, Kronreif G. Robotic systems for percutaneous needle-guided interventions. Minim Invasive Ther Allied Technol. 2015;24(01):45–53. doi: 10.3109/13645706.2014.977299.
