. Author manuscript; available in PMC: 2017 Jun 1.
Published in final edited form as: Int J Comput Assist Radiol Surg. 2016 Jun 1;11(6):1163–1171. doi: 10.1007/s11548-016-1406-3

On-Demand Calibration and Evaluation for Electromagnetically Tracked Laparoscope in Augmented Reality Visualization

Xinyang Liu 1, William Plishker 2, George Zaki 2, Sukryool Kang 1, Timothy D Kane 1, Raj Shekhar 1,*
PMCID: PMC5290310  NIHMSID: NIHMS846292  PMID: 27250853

Abstract

Purpose

Common camera calibration methods employed in current laparoscopic augmented reality systems require the acquisition of multiple images of an entire checkerboard pattern from various poses. This lengthy procedure prevents performing laparoscope calibration in the operating room (OR). The purpose of this work was to develop a fast calibration method for electromagnetically (EM) tracked laparoscopes, such that calibration can be performed in the OR on demand.

Methods

We designed a mechanical tracking mount to uniquely and snugly position an EM sensor at an appropriate location on a conventional laparoscope. A tool named fCalib was developed to calibrate intrinsic camera parameters, distortion coefficients, and extrinsic parameters (the transformation between the scope lens coordinate system and the EM sensor coordinate system) using a single image that shows an arbitrary portion of a special target pattern. For quick evaluation of the calibration result in the OR, we integrated a tube phantom with fCalib and overlaid a virtual representation of the tube on the live video scene.

Results

We compared spatial target registration error between the common OpenCV method and the fCalib method in a laboratory setting. In addition, we compared the calibration re-projection error between the EM tracking-based fCalib and the optical tracking-based fCalib in a clinical setting. Our results suggested that the proposed method is comparable to the OpenCV method. However, changing the environment (e.g., inserting or removing surgical tools) affected re-projection accuracy for the EM tracking-based approach. The computational time of the fCalib method averaged 14.0 s (range 3.5 s – 22.7 s).

Conclusions

We developed and validated a prototype for fast calibration and evaluation of EM tracked conventional (forward-viewing) laparoscopes. The calibration method achieved acceptable accuracy and was relatively fast and easy to perform in the OR on demand.

Keywords: augmented reality, electromagnetic tracking, camera calibration, laparoscopic procedure, laparoscopic visualization

1 Introduction

Minimally invasive laparoscopic surgery is an attractive alternative to conventional open surgery and is known to improve outcomes, cause less scarring, and lead to significantly faster patient recovery. In laparoscopic procedures, the primary means of intraoperative visualization is through real-time video of the surgical field acquired by a laparoscopic camera. The laparoscope and other surgical tools are inserted into the patient through trocars, which are mounted at various sites on the abdomen. Compared with open surgery, conventional laparoscopy lacks tactile feedback. In addition, laparoscopic view only provides a surface depiction of the organs and cannot show anatomic structures beneath the exposed organ surfaces. These limitations create a greater need for enhanced intraoperative visualization during laparoscopic procedures.

Laparoscopic augmented reality (AR), a method to overlay tomographic images (e.g., computed tomography [1, 2] and ultrasound [3–5]) with live laparoscopic video, has emerged as a promising technology to enhance visualization. For creating accurate AR visualization, the augmenting images must be rendered using a virtual camera that mimics the optics of the actual laparoscopic camera and is also positioned and oriented exactly like it. This necessitates camera calibration, a well-studied procedure in computer vision to determine the relationships between points in the 3D space and their correspondences in the image space. Specifically, the laparoscope calibration studied in this paper determines (a) intrinsic camera parameters (focal length, principal point, lens distortion, etc.), and (b) extrinsic parameters resulting from the hand-eye calibration [6] that relates the scope lens coordinate system with the coordinate system of the sensor attached on the laparoscope.

The common method for calibrating intrinsic parameters is Bouguet’s implementation [7] of Heikkilä and Silvén’s camera model [8], which can be simplified to the well-accepted Zhang’s camera model [9]. A C++ implementation of the toolbox is included in the freely available OpenCV library (Intel Corp., Santa Clara, CA, USA). Most previous AR systems [1–4], including our previous work [5], used Bouguet’s method. In these calibrations, images of a checkerboard pattern – alternating black and white squares – taken by the laparoscope from various poses are needed as input. The corner points of the checkerboard pattern are detected in the image either manually or automatically (e.g., OpenCV). The size of the squares in the pattern and the number of squares needed are determined by the field of view and the working distance of the laparoscope employed. For example, Feuerstein et al. [1] used a checkerboard pattern of 8×7 corners with a 10-mm distance between 2 adjacent corners for a conventional 2D laparoscope (KARL STORZ GmbH & Co. KG, Tuttlingen, Germany).

Bouguet’s method usually requires acquisition of multiple (typically 15 or more) images to achieve a good calibration result. For AR applications, this lengthy procedure limits the use of Bouguet’s method in the operating room (OR). The current workflow of using the AR system clinically [10] is as follows. The laparoscope is first calibrated before clinical use. Then, the laparoscope and the tracking mount (the device holding the marker/sensor) are removed and sent for sterilization. Finally, a specialist assembles the sterilized items in the OR at the beginning of the procedure. Although feasible in some situations, there are two major drawbacks associated with this approach. First, even though the tracking mount is designed to affix the marker/sensor in exactly the same way each time, some errors are unavoidable when reattaching the tracking mount to the laparoscope in the OR. Second, the workflow is only feasible for laparoscopes with no exchangeable optics, such as the highly integrated stereoscopic laparoscope (Visionsense Corp., New York, NY, USA) used in previous AR applications [4, 5]. This workflow is not feasible for conventional 2D laparoscopes, which contain two detachable parts – a camera head and a telescope shaft. For these laparoscopes, some optical parameters, e.g., the focal length, are adjustable by the surgeon during the surgery. When this happens, pre-calibrated camera parameters cannot be used. Therefore, a fast and easy laparoscope calibration method that can be performed and repeated in the OR is critical to extend AR methods to conventional laparoscopes.

Perceive3D (Coimbra, Portugal) is a start-up company developing image calibration software for improving visualization and guidance during clinical endoscopy. One of their products, the rdCalib algorithm, features automatic calibration of intrinsic camera parameters from a single image acquired in an arbitrary position. The algorithm is based on their previously developed camera calibration method [11, 12], in which the image distortion is described by the so-called division model. In our previous work [13], we compared camera intrinsic parameters calibrated by rdCalib using a single image with those calibrated by OpenCV using multiple images. The results suggested rdCalib was able to provide accurate and stable intrinsic parameters that were comparable to the parameters obtained using the OpenCV method.

To ensure accurate image fusion, a registration method is needed to overlay augmenting images on laparoscopic video. In the current practice, the use of external tracking hardware is necessary for reliable clinical use. Many previous AR prototypes employed optical tracking [1, 2, 5], which uses an infrared camera to track optical markers affixed on the laparoscope. One limitation of this tracking method is the line-of-sight requirement. It does not permit articulation of the flexible-tip laparoscopic ultrasound (LUS) transducer, which is popular among AR applications [3–5, 14]. Another real-time tracking method without the line-of-sight restriction is electromagnetic (EM) tracking, which reports the location and orientation of a wired positional sensor inside a 3D working volume with a magnetic field created by a field generator. Many groups have measured the accuracy of EM tracking in clinical settings [15–17], as well as configurations in which an EM sensor is embedded in imaging probes [18]. However, the use of EM tracking in a clinical laparoscopic AR system has not been reported and remains challenging. This is mainly due to the potential for distortion of the magnetic field caused by the presence of ferrous metals and conductive materials. In order to reduce this potential for distortion, where and how to place EM sensors on surgical tools or imaging devices emerges as a critical question for the application of EM tracking to computer-assisted surgery.

In this work we developed a fast calibration and evaluation tool, called fCalib, for conventional 2D laparoscopes. The tool can be used to perform intrinsic (based on rdCalib) and hand-eye calibrations using a single image. In addition, a tube phantom was integrated with the tool for immediate visual evaluation of the calibration accuracy. To perform hand-eye calibration, we designed a mechanical tracking mount to uniquely and snugly position an EM sensor at an appropriate location on the laparoscope. We compared the spatial target registration error between the common OpenCV method and the fCalib method in a laboratory setting. In addition, we compared the calibration re-projection error between the EM tracking-based fCalib and the optical tracking-based fCalib in a clinical setting.

2 Materials and Methods

2.1 System Overview

As shown in Fig. 1, the system involved in this study includes a conventional 2D laparoscopic camera (1188HD, Stryker Endoscopy, San Jose, CA, USA) with a 0° 5-mm scope, an EM tracking system with a Tabletop Field Generator (Aurora, Northern Digital Inc., Waterloo, ON, Canada), and a laptop computer running calibration and evaluation software. A frame grabber is used to stream live laparoscopic video to the laptop. The Tabletop Field Generator is specially designed for OR applications. The generator is positioned between the patient and the surgical table and incorporates a shield that suppresses distortions caused by any metallic material underneath it. Recent studies found that the tabletop arrangement could reduce EM tracking errors [15, 17]. We also evaluated the employed EM tracking system and the results suggested it was able to provide accurate tracking data for our application [19]. For tracking the laparoscope and the fCalib tool, we used Aurora Cable Tool (2.5 mm diameter), which contains a 6 degree-of-freedom (DOF) sensor at the tip of a flexible shielded cable. To simulate an OR setting, experiments in the laboratory were performed on a surgical table.

Fig. 1. System setup in the laboratory

Let p_EMT be a homogeneous point in the EM tracker coordinate system (EMT), and let p_Lap^u denote the corresponding point of p_EMT in the undistorted video image. If we denote T_{A→B} as the 4 × 4 homogeneous transformation matrix from the coordinate system of A to that of B, the relationship between p_EMT and p_Lap^u can be expressed as the following chain of transforms:

p_Lap^u ≃ K · [I_3 | 0] · T_{LapS→Lens} · T_{EMT→LapS} · p_EMT    (1)

where ‘LapS’ refers to the EM sensor attached to the laparoscope; ‘Lens’ refers to the scope lens coordinate system; I_3 is the 3 × 3 identity matrix; and K is the camera matrix. T_{EMT→LapS} is obtained from tracking data, T_{LapS→Lens} from the hand-eye calibration, and K from camera calibration. p_Lap^u can subsequently be distorted using the lens distortion coefficients, which are also obtained from camera calibration.
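As a sketch of how Eq. 1 maps a tracked 3D point onto the undistorted image, the chain of transforms can be written in a few lines of NumPy. The matrices below (identity sensor poses and a toy camera matrix) are illustrative placeholders, not calibrated values:

```python
import numpy as np

def project_to_image(p_emt, T_emt_laps, T_laps_lens, K):
    """Project a homogeneous point in the EM tracker frame onto the
    undistorted image plane, following the chain in Eq. 1."""
    # Rigid transforms: tracker -> laparoscope sensor -> scope lens
    p_lens = T_laps_lens @ T_emt_laps @ p_emt      # 4-vector in lens frame
    # [I3 | 0] drops the homogeneous row, keeping the first 3 components
    p_cam = p_lens[:3]
    # Apply the camera matrix and normalize by depth
    uv = K @ p_cam
    return uv[:2] / uv[2]

# Illustrative values only: identity poses and a toy camera matrix
K = np.array([[1175.4, 0.0, 586.9],
              [0.0, 1175.4, 341.9],
              [0.0, 0.0, 1.0]])
T_identity = np.eye(4)
p_emt = np.array([0.0, 0.0, 100.0, 1.0])  # point 100 mm in front of the lens
print(project_to_image(p_emt, T_identity, T_identity, K))  # -> principal point
```

A point on the optical axis projects to the principal point, which is a quick sanity check of the chain.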

2.2 EM Tracking Mount

As discussed in Section 1, where and how to place the EM sensor on the laparoscope is critical for our application. In the current practice, it is necessary to attach the sensor externally to the laparoscope. One option is to affix the sensor at the imaging tip and run the sensor wire along the shaft of the scope. However, this solution would inevitably increase the trocar size needed to insert the integrated laparoscope. Moreover, placing the sensor inside the patient’s body would require further safety justification. A more practical solution is to attach the EM tracking mount to the handle of the laparoscope, so that the sensor is kept outside the patient’s body during surgery. Because the handle of the laparoscope contains more metallic and electronic materials than the telescope shaft, there is the potential for larger distortion error in the EM tracker readings in the vicinity of the handle. Based on our empirical study of EM tracking errors, it is necessary to place the sensor sufficiently far from the handle. This and other clinical feasibility concerns led to the design of the tracking mount described below.

We built a mechanical mount that can be uniquely and snugly snapped onto the laparoscope, such that the EM sensor is positioned at a designated location away from the handle. The tracking mount was printed using a 3D printer (Objet500 Connex, Stratasys Ltd., Eden Prairie, MN, USA) with materials that can withstand the commonly used low-temperature sterilization process (e.g., STERRAD®, ASP, Irvine, CA, USA). Figure 2 shows the insertion of the laparoscope, with the mount attached, into a 5-mm plastic trocar. In this design, the sensor is placed at approximately the maximum distance from the handle (along the scope axis) without contacting the patient during the procedure. The design also allows free rotation of the trocar.

Fig. 2. Two views of the snap-on mechanical tracking mount

2.3 fCalib as a Fast Calibration Tool

Figure 3a shows the 3D printed fCalib prototype comprising a post for firmly embedding an EM sensor, a tube phantom for quick evaluation of calibration accuracy, and the rdCalib target pattern, which was printed on paper, cut, and glued onto fCalib. The rdCalib pattern, which originates from [20], has 30×30 alternating black and white squares, each containing a unique marker inside the square. The size of the square used in this study is 3.38 mm but can be adapted to different laparoscopes or various clinical requirements. In contrast to common camera calibration methods, rdCalib can automatically detect corner points in an image containing only a portion of the pattern, as demonstrated in Fig. 3b.

Fig. 3. (a) fCalib prototype. (b) 259 corner points (red crosses) automatically detected by rdCalib from an image showing only a portion of the target pattern

We implemented a fully automated and integrated calibration module. All one needs to do is press a key on the laptop keyboard while pointing the laparoscope at the rdCalib target pattern, and the intrinsic and extrinsic parameters are calculated. When the key is pressed, an image is acquired as the input to rdCalib, and the current pose of the laparoscope (T_{EMT→LapS} in Eq. 1) is recorded. The outputs of rdCalib include the intrinsic camera parameters (focal lengths and principal point) and a distortion metric of the division model [11, 12]. Since we use OpenCV’s function for hand-eye calibration, the distortion coefficients of Bouguet’s camera model were estimated from the distortion metric obtained from rdCalib. For each detected corner point, rdCalib also provides its x-y coordinates in the video image plane and its x-y indexes (integers between 0 and 31) in the target pattern plane.

If we denote the 4 corners of the entire pattern as the outer corner points, the coordinates of any corner point in the EM tracker coordinate system (p_EMT in Eq. 1) can be obtained from its x-y indexes in the target pattern and the coordinates of any 3 of the 4 outer corner points. Once an EM sensor is embedded in fCalib, the geometric relationship between the target pattern and the sensor is fixed. A registration process was developed to obtain the coordinates of the 3 outer corner points in the sensor coordinate system, so that whenever the sensor is tracked, the coordinates of the 3 outer corner points in the tracker coordinate system are known. To register the outer corner points with the sensor, we touched them with a pre-calibrated and tracked stylus (Aurora 6DOF Probe). To reduce EM jitter error, the registration process was carried out multiple times and the results were averaged.
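The index-to-coordinate step can be sketched as a linear interpolation from three registered outer corners. This is a minimal sketch: the grid size n and the index convention below are assumptions for illustration, not the paper's exact implementation:

```python
import numpy as np

def corner_in_tracker(i, j, p00, pN0, p0N, n=30):
    """Interpolate the tracker-frame position of the corner with pattern
    indexes (i, j) from three outer corners of an n x n grid of squares.
    p00, pN0, p0N are the registered outer corners at indexes (0,0),
    (n,0), and (0,n), respectively (an assumed convention)."""
    ex = (pN0 - p00) / n   # one square step along the pattern x axis
    ey = (p0N - p00) / n   # one square step along the pattern y axis
    return p00 + i * ex + j * ey

# Illustrative outer corners of a flat pattern in tracker coordinates (mm):
# 30 squares x 3.38 mm per side = 101.4 mm
p00 = np.array([0.0, 0.0, 0.0])
pN0 = np.array([101.4, 0.0, 0.0])
p0N = np.array([0.0, 101.4, 0.0])
print(corner_in_tracker(15, 15, p00, pN0, p0N))  # pattern center
```

Because the registered outer corners fix both the scale and the pose of the pattern plane, every detected corner gets a tracker-frame coordinate from its integer indexes alone.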

We now have the detected corner points in both the tracker coordinate system (p_EMT in Eq. 1) and the image plane (p_Lap); T_{EMT→LapS} is recorded when the key is pressed; the camera matrix K is obtained from intrinsic calibration; and the extrinsic parameters T_{LapS→Lens} are calculated using OpenCV’s function for solving the Perspective-n-Point (PnP) problem. Points in the EM tracker coordinate system can then be projected onto the video image according to Eq. 1.

2.4 fCalib as a Quick Evaluation Tool

As shown in Fig. 3a, we designed a tube phantom for quick evaluation of laparoscope calibration accuracy. The tube simulates a blood vessel or duct, a typical visualization target of the AR system. We created a virtual representation of the tube with the same diameter and length as the actual tube and superimposed it on the live video scene. A mathematical model of a series of rings in 3D space was developed to represent the 3D tube. Similar to the aforementioned registration process, we registered the tube phantom with the EM sensor by using a pre-calibrated and tracked stylus to touch 2 small divots located at the centers of the 2 circular end faces of the tube. The virtual tube model was projected onto the video image using the calibrated intrinsic and extrinsic parameters, and finally rendered for display.
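A ring-based tube model of this kind can be sketched as points sampled on circles spaced along the tube axis. The dimensions and sampling densities below are illustrative; the actual phantom geometry is not specified here:

```python
import numpy as np

def tube_rings(radius, length, n_rings=10, n_pts=36):
    """Sample a virtual tube as a series of circular rings along the
    z axis, returning an (n_rings * n_pts, 3) array of model points."""
    theta = np.linspace(0.0, 2.0 * np.pi, n_pts, endpoint=False)
    rings = []
    for z in np.linspace(0.0, length, n_rings):
        # One ring: a circle of the given radius at height z
        ring = np.column_stack([radius * np.cos(theta),
                                radius * np.sin(theta),
                                np.full(n_pts, z)])
        rings.append(ring)
    return np.vstack(rings)

pts = tube_rings(radius=2.5, length=40.0)
print(pts.shape)  # (360, 3)
```

Each model point can then be pushed through the projection chain of Eq. 1 and drawn as part of the overlaid rings.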

To improve portability and compactness, we implemented the calibration and evaluation modules on a laptop computer (Precision M4800, Dell; 4-core 2.9 GHz Intel CPU, 8 GB memory). To reduce video processing time, functions involved in the evaluation module were performed using OpenGL (Silicon Graphics, Sunnyvale, CA, USA) on a mobile graphics processing unit (Quadro K2100M, NVidia).

3 Experiments and Results

3.1 Experiments in the Laboratory

In this section, we compare calibration accuracy between the common OpenCV method and the proposed fCalib method in a laboratory setting. We replaced the rdCalib target pattern on fCalib with a standard checkerboard pattern containing 9×7 corners with a 5.6-mm distance between 2 adjacent corners. Similar to the aforementioned registration process, we registered the outer corner points of the checkerboard pattern to the EM sensor coordinate system. We acquired 2 sets of 20 images and 5 sets of 3 images, capturing the entire checkerboard pattern with the laparoscope from various poses. The optical parameters of the laparoscope and the EM tracking mount on the laparoscope were kept unchanged during all experiments. Intrinsic parameters were obtained using OpenCV’s camera calibration function based on the 20 or 3 images. Extrinsic parameters were first obtained for each image using OpenCV’s solvePnP function and then averaged over the 20 or 3 images. To average rotations, we transformed the obtained rotation matrices to quaternions and estimated the mean rotation using the algorithm proposed in [21].
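The rotation-averaging step can be sketched with the standard eigen-decomposition approach to quaternion averaging (shown as a generic method under that assumption; it is not necessarily the exact algorithm of [21]):

```python
import numpy as np

def mean_quaternion(quats):
    """Average unit quaternions as the eigenvector of the largest
    eigenvalue of the summed outer products (a common formulation
    of rotation averaging)."""
    Q = np.asarray(quats)
    M = Q.T @ Q                        # 4x4 sum of outer products q q^T
    vals, vecs = np.linalg.eigh(M)
    mean = vecs[:, np.argmax(vals)]    # dominant eigenvector
    return mean if mean[0] >= 0 else -mean  # fix the sign convention

# Two quaternions (w, x, y, z): small opposite rotations about z
q1 = np.array([np.cos(0.05), 0.0, 0.0, np.sin(0.05)])
q2 = np.array([np.cos(-0.05), 0.0, 0.0, np.sin(-0.05)])
print(mean_quaternion([q1, q2]))  # close to the identity rotation
```

Unlike naive component-wise averaging, this formulation handles the q / −q sign ambiguity and always returns a unit quaternion.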

We compared the results of 5 single-image fCalib calibrations, 5 three-image OpenCV calibrations, and 2 twenty-image OpenCV calibrations using target registration error (TRE), the difference between the ground-truth and observed coordinates of landmarks (points in 3D space). TRE has been used in several AR applications to evaluate individual device calibration as well as overall system registration accuracy [4, 5]. As in the OpenCV calibration process, we used the fCalib plate with the checkerboard pattern for measuring TRE. With the fCalib plate fixed, 2 images of the checkerboard pattern were acquired using the laparoscope from 2 different viewpoints. The locations of the corner points in the EM tracker coordinate system were estimated by triangulating [22] the 2 views based on the detected corner points in the images, the poses of the laparoscope when the images were acquired, and the calibrated intrinsic and extrinsic parameters. The estimated locations were compared with the actual locations of the corner points, which were known because the checkerboard pattern was registered with the EM sensor. TREs were calculated using the different sets of calibration results, but with the same 2 images and tracking data. We performed the experiment 4 times, placing the fCalib plate at different locations in the EM tracking working volume. Table 1 summarizes the resulting TREs, averaged over the 4 locations and the 9×7 corner points. Under the same conditions, fCalib’s single-image TRE is slightly better than OpenCV’s 20-image TRE and much better than OpenCV’s 3-image TRE, demonstrating the accuracy and robustness of fCalib.

Table 1.

Mean and standard deviation of TREs for the three calibration methods

              Single-image fCalib   20-image OpenCV   3-image OpenCV
TRE (mm)      1.48 ± 0.31           1.58 ± 0.33       2.35 ± 0.36
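The two-view estimation underlying the TRE measurement can be sketched with linear (DLT) triangulation followed by the TRE distance itself. The camera poses and target point below are illustrative, not the experimental geometry:

```python
import numpy as np

def triangulate(P1, P2, uv1, uv2):
    """Linear (DLT) triangulation of one point from two views, given
    the 3x4 projection matrices and the pixel observations."""
    A = np.vstack([uv1[0] * P1[2] - P1[0],
                   uv1[1] * P1[2] - P1[1],
                   uv2[0] * P2[2] - P2[0],
                   uv2[1] * P2[2] - P2[1]])
    _, _, vt = np.linalg.svd(A)
    X = vt[-1]                 # null vector of A (homogeneous point)
    return X[:3] / X[3]

def tre(estimated, actual):
    """Target registration error: Euclidean distance between the
    estimated and known landmark positions."""
    return np.linalg.norm(estimated - actual)

# Illustrative setup: two views 50 mm apart observing a point at z = 100 mm
K = np.array([[1000.0, 0.0, 640.0], [0.0, 1000.0, 360.0], [0.0, 0.0, 1.0]])
P1 = K @ np.hstack([np.eye(3), np.zeros((3, 1))])
P2 = K @ np.hstack([np.eye(3), np.array([[-50.0], [0.0], [0.0]])])
X_true = np.array([10.0, 5.0, 100.0])
uv = lambda P, X: (P @ np.append(X, 1.0))[:2] / (P @ np.append(X, 1.0))[2]
X_est = triangulate(P1, P2, uv(P1, X_true), uv(P2, X_true))
print(tre(X_est, X_true))  # near zero for noise-free observations
```

With noise-free synthetic observations the point is recovered exactly; in the real measurement, tracking and calibration errors propagate into a nonzero TRE.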

To demonstrate immediate evaluation of calibration accuracy, a video clip showing the original laparoscopic view and the view overlaid with the virtual tube was supplied as supplementary multimedia material. The video was recorded using fCalib results while holding the laparoscope by hand. We dyed one circular end face of the actual tube blue for better visualization in the video. Two sample snapshots of the video are shown in Fig. 4.

Fig. 4. Two sample snapshots of the submitted video clip. The video shows the original laparoscopic view and the view overlaid with the virtual tube, represented as a series of red rings in 3D space.

3.2 Experiments in the OR

It is known that EM tracking accuracy is susceptible to the presence of metallic and conductive materials. In this section, we validated the calibration accuracy of the proposed EM tracking-based fCalib by comparing it with the optical tracking-based fCalib in a clinical setting. As shown in Fig. 5a, we created a realistic setup in an OR of our hospital to simulate a real laparoscopic surgery. The EM field generator was placed on a surgical table. A plastic laparoscopic trainer used for training surgeons was placed on the field generator. Three common laparoscopic surgical tools – two graspers and one pair of scissors – were inserted into the trainer through trocars. A Nathanson arm, a passive arm used to hold surgical tools during laparoscopic surgery, was used in this experiment to hold a liver retractor, which mobilizes the liver to create surgical space. It should be noted that the Nathanson arm and the liver retractor are not required in many laparoscopic procedures, one example of which is cholecystectomy (removal of the gall bladder), the most common laparoscopic procedure. The setup simulated a situation with relatively high potential for metal interference.

Fig. 5. Experiment setup in the OR. (a) EM tracking-based fCalib. (b) Optical tracking-based fCalib

The fCalib plate is intended to be placed directly on the patient’s abdomen. Figure 5a shows a possible location for the plate; however, the location can vary according to the actual arrangement of trocars and surgical tools. Based on our preliminary experience, it is necessary to keep the EM sensors as close to the field generator as possible. Thus, the fCalib plate was placed slightly tilted so that the sensor attached to the laparoscope could be kept close to the field generator.

We developed an optical tracking-based fCalib tool in the same way as the proposed EM tracking-based fCalib. Figure 5b shows the clinical setup for this tool. Calibration results of optical tracking-based fCalib were used as the reference for comparison with results obtained using the EM tracking-based fCalib.

We first performed 20 free-hand single-image calibrations using fCalib for each of the tracking methods. The images were acquired from various poses of the laparoscope while keeping the sensor attached to the laparoscope relatively close to the field generator. In addition, the distance between the laparoscope lens and the center of the fCalib target pattern was kept between 8 cm and 12 cm, the typical working distance of laparoscopes. This distance was calculated by transforming the coordinates of the target pattern center from the tracker coordinate system to the lens coordinate system. For the EM tracking-based approach, the computational time averaged 14.0 s (range 3.5 s – 22.7 s), and the number of detected corner points averaged 338 (range 213 – 417). Generally, the computational time increases as more corner points are detected. For comparison, it takes about 4 min for an expert to acquire 20 checkerboard images and use OpenCV for calibration.

For the EM tracking-based approach, the root-mean-square (RMS) calibration re-projection error (RPE) averaged 0.89 ± 0.29 pixel for Perceive3D’s rdCalib calibration (no tracking involved), and 1.30 ± 0.46 pixel for the fCalib calibration. For the optical tracking-based approach, the RMS calibration RPE averaged 0.87 ± 0.18 pixel for the rdCalib calibration, and 1.48 ± 0.36 pixel for the fCalib calibration. The results suggest the calibration RPE obtained using the EM tracking-based fCalib is comparable to that obtained using the optical tracking-based fCalib.
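The RMS RPE metric reported above can be computed in a few lines; the corner coordinates below are illustrative values, not measured data:

```python
import numpy as np

def rms_reprojection_error(projected, detected):
    """Root-mean-square re-projection error in pixels between projected
    model corners and detected image corners."""
    d = np.asarray(projected) - np.asarray(detected)
    # Mean of squared 2D distances, then square root
    return np.sqrt(np.mean(np.sum(d * d, axis=1)))

# Illustrative corners: each projection off by exactly 1 pixel in one axis
proj = np.array([[100.0, 200.0], [300.0, 400.0]])
det = np.array([[101.0, 200.0], [300.0, 401.0]])
print(rms_reprojection_error(proj, det))  # 1.0
```

The same function serves for both the calibration RPE and the environment-changed RPE: only the parameters used to produce the projected corners differ.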

Tables 2 and 3 show the uncertainty of the estimated intrinsic and extrinsic parameters over the 20 calibrations using the EM tracking-based fCalib. The image resolution of the employed laparoscope camera is 1280×720. There was no large variation in parameter values among these single-image calibrations.

Table 2.

Mean and standard deviation of intrinsic parameters of the 20 calibrations using EM tracking-based fCalib

f(1)^a          f(2)^a          c(1)^b        c(2)^b        k(1)^c
1175.4 ± 3.2    1175.4 ± 3.4    586.9 ± 6.5   341.9 ± 5.4   −0.356 ± 0.007

^a Focal length in pixels
^b Principal point in pixels
^c First radial distortion coefficient in Bouguet’s camera model [7]

Table 3.

Mean and standard deviation of extrinsic parameters of the 20 calibrations using EM tracking-based fCalib

R(1)^a          R(2)^a           R(3)^a          T(1)^b       T(2)^b        T(3)^b
0.052 ± 0.006   −0.046 ± 0.008   2.112 ± 0.003   −9.7 ± 1.8   −59.8 ± 1.2   −267.9 ± 0.6

^a Rotation vector
^b Translation vector in mm

The above experiment indicates low RPE for the EM tracking-based fCalib in a fixed environment. However, the calibration optimization could be based on sensor poses in a potentially distorted magnetic field caused by the surgical tools. To reflect this distortion, we performed experiments by changing the environment. As shown in Fig. 5a, the laparoscope was first placed stationary in a potentially distorted environment, and a calibration using fCalib was performed. This yields intrinsic and extrinsic parameter values and a calibration RPE. Next, without touching the laparoscope and the fCalib plate, we carefully removed all surgical tools from the EM tracking volume, creating an undistorted environment. The detected target pattern corners in the tracker coordinate system were re-projected to the image plane using previously obtained parameter values, generating an environment-changed RPE. The process was repeated 10 times with various laparoscope poses and distances from the lens to the target pattern. Moreover, we performed another 10 experiments in the reverse order, i.e., first calibrating in an undistorted environment and then obtaining an environment-changed RPE in a potentially distorted environment. As a reference, we performed one more set of experiments using the optical tracking-based fCalib.

Table 4 summarizes the RPEs of these experiments. As can be seen, there is a much greater increase in RPE for the EM tracking-based approach than for the optical tracking-based approach when the environment changes. The small increase in RPE for the optical tracking-based approach could be caused by small movements of the fCalib plate while the environment was being changed.

Table 4.

Mean and standard deviation of re-projection error (pixel) when changing the environment

                          EM Process 1^a   EM Process 2^b   Optical Process 1^a
Calibration RPE           1.80 ± 0.89      1.57 ± 0.66      1.30 ± 0.49
Environment-changed RPE   7.89 ± 1.43      7.76 ± 2.29      2.49 ± 0.64

^a Calibrate with surgical tools in the field and re-project without them
^b Calibrate without surgical tools in the field and re-project with them

4 Discussion

In this work, we presented a fast and easy calibration method for conventional (forward-viewing) laparoscopes that can be performed in the OR on demand. Although intrinsic camera calibration using Perceive3D’s rdCalib algorithm and hand-eye calibration have been studied previously, the proposed work is innovative in efficiently integrating the two calibration steps, EM tracking, and an immediate evaluation mechanism into a compact platform. The pursuit of near-term clinical translation differentiates our reported research from all prior efforts.

Notably, the target registration error obtained using fCalib with a single image was slightly better than that obtained using OpenCV with 20 images, as reported in Table 1. fCalib’s ability to accurately detect corner points in the image periphery may enable better estimation of the lens distortion metric, which in turn could yield better estimation of the extrinsic parameters.

The proposed EM tracking-based fCalib method was demonstrated to achieve a low re-projection error in a relatively stable environment, even with heavy metal interference. However, when there is a significant change in the environment, using a previously calibrated result could lead to large errors. In this situation, re-calibration using fCalib is an option when feasible.

The planned workflow for using the EM tracking-based fCalib in the OR is as follows. The sterilized EM tracking mount, embedded with a sensor, is first attached to the laparoscope. After adjusting the optical parameters of the laparoscope and creating a relatively stable surgical environment, the surgeon places the fCalib plate on the patient’s abdomen, points the laparoscope at the calibration target pattern, and acquires an image. The intrinsic and extrinsic parameters are then automatically calculated, and the virtual tube model is updated in the laparoscopic video according to the calibration result. Finally, the surgeon points the laparoscope at the tube phantom from various poses to assess the calibration accuracy. If not satisfied, the surgeon can repeat the calibration process.

Immediate follow-on work will include manufacturing the fCalib tool with special material so that it can be sterilized for OR use. One solution could be laser marking the calibration target pattern on Radel® polyphenylsulfone.

5 Conclusion

We developed and validated a prototype for fast calibration and evaluation of EM tracked conventional (forward-viewing) laparoscopes. The calibration method achieved acceptable accuracy and was relatively fast and easy to perform in the OR on demand. This work is a critical step toward clinical adoption of AR methods.

Acknowledgments

The authors would like to thank Dr. Joao P. Barreto and Mr. Rui Melo of Perceive3D, SA for providing the rdCalib API and the associated calibration target pattern. The authors would also like to thank James McConnaughey for his assistance in 3D printing the mechanical EM tracking mount. This work was supported partially by the National Institutes of Health grant 1R41CA192504.

Footnotes

Conflict of Interest: The authors declare that they have no conflict of interest.

Informed consent: Informed consent was obtained from all individual participants included in the study.
