Author manuscript; available in PMC: 2013 Nov 6.
Published in final edited form as: Int J Med Robot. 2012 Jul 4;9(2):10.1002/rcs.1446. doi: 10.1002/rcs.1446

A study on the theoretical and practical accuracy of conoscopic holography-based surface measurements: toward image registration in minimally invasive surgery

J Burgner 1,*, A L Simpson 2, J M Fitzpatrick 3, R A Lathrop 1, S D Herrell 2,4, M I Miga 2, R J Webster III 1
PMCID: PMC3819208  NIHMSID: NIHMS523845  PMID: 22761086

Abstract

Background

Registered medical images can assist with surgical navigation and enable image-guided therapy delivery. In soft tissues, surface-based registration is often used and can be facilitated by laser surface scanning. Tracked conoscopic holography (which provides distance measurements) has been recently proposed as a minimally invasive way to obtain surface scans. Moving this technique from concept to clinical use requires a rigorous accuracy evaluation, which is the purpose of our paper.

Methods

We adapt recent non-homogeneous and anisotropic point-based registration results to provide a theoretical framework for predicting the accuracy of tracked distance measurement systems. Experiments are conducted on complex objects of defined geometry, an anthropomorphic kidney phantom and a human cadaver kidney.

Results

Experiments agree with model predictions, producing point RMS errors consistently < 1 mm, surface-based registration with a mean closest-point error of < 1 mm in the phantom, and an RMS target registration error of 0.8 mm in the human cadaver kidney.

Conclusions

Tracked conoscopic holography is clinically viable; it enables minimally invasive surface scan accuracy comparable to current clinical methods that require open surgery.

Keywords: surface measurement, conoscopic holography, image-guided surgery, registration, accuracy

Introduction

Image-guided surgery or therapy delivery typically requires intra-operative measurements of features that have been identified in preoperative images. The process of using these corresponding features to determine the transformation between pre-operative images and the physical patient is known as registration (1). Registered images can be used to provide the surgeon with a view of the position and orientation of instruments with respect to anatomy, so that subsurface features can be visualized before and during incisions (2). Registration also enables surgical tools, including needles, to be inserted accurately to desired targets (3).

It is possible to achieve these objectives with intra-operative imaging. However, in many applications registered pre-operative images are preferred to existing intraoperative imaging methods, for a variety of reasons. For example, preoperative images provide higher resolution than ultrasound, do not expose the patient or physician to additional radiation like that delivered by computed tomography (CT) or fluoroscopy, and can be done significantly less expensively than magnetic resonance imaging (MRI). These considerations have spurred the widespread adoption of surgical navigation based on preoperative images.

Commercial surgical navigation systems increasingly make use of surface-based registration methods. For example, surface points on the patient can be collected using an optically or magnetically tracked pointer (4) and then registered to surfaces identified in preoperative images. Navigation systems are most often used in the head (5) in applications such as craniofacial surgery, neurosurgery and sinus surgery. One source of error in these systems is the tissue deformation that occurs when the pointer is swabbed along the patient’s skin. To eliminate this deformation, laser pointing devices have been introduced for contact-less acquisition of surface points (6–8). For example, Brainlab’s zTouch employs a laser that the user can scan over the desired anatomical features without touching them. Optical cameras simultaneously observe the laser point and the system uses triangulation to determine the three-dimensional (3D) location of each point.

Laser range scanners (LRS) also utilize the principle of triangulation of observed laser light, but automate the process of scanning the laser over the surface. LRS has been used in image-guided surgery, and is particularly useful for soft tissue in the abdomen (9,10). Hand-held laser range scanners that utilize the same principle have also been demonstrated (11,12).

While laser range scanners are applicable for open surgery (13,14), they cannot be applied in laparoscopic surgery because of the line of sight requirements for triangulation. Laser systems based on triangulation have been deployed in a minimally invasive setting, but they currently require complex mechanical systems to aim the laser and two laparoscopic ports, one for the laser and one for the camera (15).

It is highly desirable to provide surgeons with image-guidance for soft tissue through a single port, via robust, off-the-shelf hardware. Perhaps the most straightforward method of accomplishing this is by processing endoscope images. Algorithms such as shape from shading (16), stereoscopic imaging (17,18), or structure from motion (19) are applicable. Furthermore, application of structured light through a laparoscope and simultaneous imaging can be used to reconstruct the anatomy of interest (20,21). While each of these approaches holds promise, until one of them achieves the robustness, speed, accuracy, cost effectiveness and ease of use necessary to spur widespread clinical adoption, there will be ample motivation for developing new single-port image-guidance approaches.

Recently, the first 3D endoscope was proposed that utilizes the time-of-flight (ToF) technique (22). A comparison of the quality of surfaces acquired with this ToF endoscope to those achieved by stereoscopic endoscopes has also been performed (23). The main conclusion was that while ToF is promising, it cannot yet outperform stereoscopic surface reconstruction.

A measurement principle that combines accuracy comparable to laser triangulation with the ease of laparoscopic deployment of endoscopic imaging is conoscopic holography. Conoscopic holography is a low-cost, commercially available technology based on polarized light interference (24,25). The advantages over laser triangulation are that the send and receive paths of the laser light are collinear, and that distance measurements are derived from a solid angle (i.e. a cone of light), rather than from a single ray. This makes conoscopic holography more precise, stable and robust than triangulation-based methods. In view of these advantages, we recently proposed conoscopic holography for image-guided surgery (26), presenting initial feasibility studies that collected points on simple geometric shapes known a priori and illustrated that conoscopic holography information can be used to guide a needle to a desired location in a liver phantom.

Contributions

In this paper we provide several novel contributions (both theoretical and practical) beyond the initial proof-of-concept work presented in (26), which are essential steps toward a clinically deployable system for minimally invasive soft-tissue image-guidance. These contributions are as follows: (1) We present a novel closed-form calibration technique that employs a minimal parameterization. (2) We provide a novel theoretical contribution in showing how one can estimate the overall system error in any tracked distance measurement device subject to anisotropic and non-homogeneous localization errors. This has implications beyond medicine to industrial applications (e.g. manufacturing quality control, etc. – any situation in which it is useful to track a distance measurement device). We also provide a case study applying this new theory to our tracked conoprobe system, which is essential to ensure patient safety during clinical deployment; state-of-the-art clinical tracking systems (e.g. the Polaris Spectra, which we use) are known to exhibit anisotropic and non-homogeneous localization errors. (3) We provide a more rigorous benchtop experimental evaluation of surface scan accuracy than in (26), using a calibration fixture with more complex and more precisely known geometry, as well as an experimental assessment of scan errors on an anthropomorphic kidney phantom (prior work used only liver phantoms). (4) We provide the first human tissue results, assessing system accuracy on a human cadaver kidney.

Materials and methods

The purpose of applying conoscopic holography in image-guided surgery is to enable acquisition of surface points from an organ of interest. In order to enable acquisition of a spatial surface measurement, the conoscopic holography sensor itself has to be located by means of an additional localization sensor. In this paper, we use optical tracking for localization of a one-dimensional (1D) conoscopic holography sensor (see Figure 1). By attaching a rigid body (i.e. a set of optical tracking markers) to this sensor, we can use the spatial location and orientation of that body in conjunction with the distance measurement that the sensor provides to calculate the 3D position of the measurement point. Scanning over a surface with the tracked sensor enables acquisition of multiple surface points, which can then be used for surface registration for image-guided surgery.

Figure 1.

Figure 1

Conoscopic surface measurement through a laparoscopic port. The surgeon sweeps the laser over the tissue to measure the surface for image registration. The conoscopic holography sensor is optically tracked

We utilize a conoscopic holography sensor called a conoprobe (ConoProbe Mark 3.0, Optimet Metrology Inc., USA) with a 250 mm lens. The measurement range for our particular lens is 155–335 mm, and the laser spot size is specified by the manufacturer as 107 μm. The manufacturer offers lenses from 16 mm to > 250 mm, which change the focal length and thus adjust the measurement range, both in its extent and in its distance from the front of the conoprobe lens. The measurement frequency can be up to 3 kHz, with a stated error of the distance measurement ≤ 100 μm for our particular lens. The laser in this device is a class II red diode laser with a wavelength of 655 nm and a maximum output power of 1 mW.

In contrast to our proof-of-concept study, in which we performed benchtop experiments with a Micron tracker (H3-60, Claron Technology Inc.) and a simplistic software interface, we report here a system in which the conoprobe is integrated into the development version of an image-guided surgery system (Pathfinder Therapeutics Inc., USA). A Polaris Spectra (Northern Digital Inc., Canada) is used in this system to optically track the pose of the conoprobe via the attached rigid body with respect to a reference frame. The optical tracking system has an extended pyramidal measurement volume with a stated accuracy of 0.33 mm RMS. Measurements are acquired at 30 Hz.

The rigid body attached to the conoprobe comprises four retroreflective marker spheres, which provide passive localization via infrared light from the Polaris optical tracker (see Figure 4). The rigid body employed in our system is manufactured by NDI (No. 8700449). All measurements throughout this paper were performed with respect to a fixed reference frame, represented by a second rigid body located close to the region of interest (30–50 cm). This fixed reference frame will be referred to as the world coordinate system or patient coordinate system. Hardware interfacing to the conoprobe is accomplished via Ethernet, and synchronization with the optical tracking system is handled in software.

Figure 4.

Figure 4

System components. (a) A 1D conoscopic holography sensor is affixed to an optical bench, using an articulated holding arm. The spatial location of the conoprobe is optically tracked, using the attached rigid body with respect to a fixed reference frame, represented by the rigid body on the left. (b) A prototype attachment for laparoscopic applications. (c) A step phantom with adhesive cross marks (green) and divots (red circles). Divots are measured using a tracked point probe for point-based registration

The collinear measurement characteristics of conoscopic holography allow for straightforward application through a laparoscopic port. We have built a laparoscopic adapter unit for the conoprobe, as depicted in Figure 4c. The adapter allows for the attachment of stainless steel tubes with diameters of 10–12 mm, sizes typically used in laparoscopic abdominal surgery. The length of the tube used in our system is 195 mm. Figure 1 illustrates the use of the conoprobe through a laparoscopic port.

Accuracy analysis

Each point measured by our tracked conoprobe system is subject to localization error inherent in the tracking system and distance error inherent to the conoprobe. The rigid body used for optical tracking contains N marker spheres, referred to as fiducials. These fiducials are subject to localization error, known as fiducial localization error (FLE). After point-based registration is performed, the location of a point of interest (other than the fiducial points) is referred to as a target. Each target is subject to target registration error (TRE), which is the displacement between the transformed target and the true target location (27). In the following, we will estimate the TRE for an optically tracked conoscopic holography sensor as a measure for the achievable accuracy.

TRE for conoscopic holography

For the accuracy estimation, consider a 1D conoscopic holography sensor, which measures the distance to a point on a surface. The optics of such a sensor utilize a lens with focal length f. Conoscopic holography provides a defined measurement range Δ, i.e. distance measurements are valid within [f − Δ/2, f + Δ/2]. Figure 2a illustrates this.

Figure 2.

Figure 2

(a) Anisotropic localization error of an optical tracking system measuring four fiducials. (b) A 1D conoscopic holography sensor is characterized by a focal length f and a working range [−Δ/2, Δ/2] around the focal spot

The conoscopic measurement has specified axial and lateral resolutions, which also depend on the lens used. The error along the z axis (σz) corresponds to the measurement precision, whereas the errors along the x and y axes (σx and σy) correspond to the lateral resolution. The lateral resolution is characterized by the laser spot size and describes the ability of the sensor to resolve closely separated surface points.

Given the variance of the coordinate components of the measurement, the TRE is defined as the root mean squared error (RMSE):

$$\mathrm{RMS}_{\mathrm{FLE}}^{(C)} = \sqrt{\sigma_x^2 + \sigma_y^2 + \sigma_z^2} \qquad (1)$$

where (C) means ‘conoprobe’. Here we are assuming that within the working range, the measurement uncertainty is zero-mean, uncorrelated and homogeneous, where by ‘uncorrelated’ we mean specifically that the error in the x, y and z directions are uncorrelated and by ‘homogeneous’ we mean that the uncertainty does not vary spatially.

With these assumptions, the 3×3 covariance matrix of the conoprobe’s TRE has the form:

$$\Sigma_{\mathrm{TRE}}^{(C)} = \begin{bmatrix} \sigma_x^2 & 0 & 0 \\ 0 & \sigma_y^2 & 0 \\ 0 & 0 & \sigma_z^2 \end{bmatrix} \qquad (2)$$

where the diagonal elements are the variances of the independent components of the FLE along each of the principal axes, which are for the conoprobe coincident with the conoprobe axes. For conoscopic holography, the FLE depends on the focal length of the lens.

We illustrate the determination of the TRE for the conoscopic holography sensor introduced at the beginning of this section. The centre of the working range of our 250 mm lens is f = 245 mm, with Δ = 180 mm. The precision of the device is reported as σz ≤ 100 μm. The lateral resolution is reported as 94 μm at the focal distance and increases only slightly over the working range. These values lead to the approximation that the FLE is nearly homogeneous and isotropic, with the covariance matrix:

$$\Sigma_{\mathrm{TRE}}^{(C)} = \begin{bmatrix} 0.01 & 0 & 0 \\ 0 & 0.01 & 0 \\ 0 & 0 & 0.01 \end{bmatrix}\,\mathrm{mm}^2 \qquad (3)$$

from which we find that RMSTRE(C) = 0.17 mm. We note that for lenses with shorter focal length, the TRE might become anisotropic, with the measurement error substantially smaller than the lateral resolution, but for our lens, homogeneous, isotropic FLE is a good approximation.
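
As a quick numerical check of equations (1)–(3), the short Python sketch below (our illustration, not part of the original paper; the variable names are ours) recovers the 0.17 mm figure from the diagonal covariance:

```python
import numpy as np

# Conoprobe measurement covariance (mm^2): sigma_x = sigma_y = sigma_z = 0.1 mm
cov_conoprobe = np.diag([0.01, 0.01, 0.01])

# The RMS error is the square root of the trace of the covariance matrix (equation 1)
rms_c = np.sqrt(np.trace(cov_conoprobe))
print(f"RMS_TRE(C) = {rms_c:.2f} mm")  # prints 0.17 mm, matching the text
```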

TRE estimation

Danilchenko and Fitzpatrick recently provided a general approach for first-order prediction of the TRE in rigid point-based registration (28). They derived formulae that allow direct calculation of RMSTRE for a given set of N fiducial locations, weightings W, FLE covariance matrices and the target location. For this study, we applied uniform weighting where the diagonal elements of W are all set to 1/N (as is utilized by NDI). We note that using ideal weighting (28), the accuracy would increase, but the increased accuracy would not reflect the actual accuracy achievable with standard NDI optical tracking systems.

We assume that the FLE covariance matrix of the applied optical tracking system, ΣFLE(T), where (T) means ‘tracking’, is known throughout the tracking workspace. Given the (3 × N) configuration of fiducial positions on the rigid body attached to the conoscopic holography sensor, we can calculate the covariance ΣTRE(T) of the component of TRE arising from tracking. Since the tracking errors are uncorrelated with the conoprobe error, the overall TRE covariance is ΣTRE = ΣTRE(T) + ΣTRE(C). From the square root of the trace of this expression, we obtain the overall RMSTRE:

$$\mathrm{RMS}_{\mathrm{TRE}} = \sqrt{\mathrm{RMS}_{\mathrm{TRE}(T)}^2 + \mathrm{RMS}_{\mathrm{TRE}(C)}^2} \qquad (4)$$

In order to estimate RMSTRE for a point measured with an optically tracked conoscopic holography sensor, the fiducial marker locations (i.e. the sphere centre locations) of the rigid body used to localize the sensor must be known. The expected RMSTRE(T) can then be estimated using Danilchenko and Fitzpatrick’s methodology (28). To determine the boundary values for the TRE along the laser axis, the minimum measurement distance f − Δ/2, the focal distance f and the maximum measurement distance f + Δ/2 are considered as target points for the estimation. The TRE values at these points are referred to as TREmin, TREfocus and TREmax.
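
To make the combination step concrete, the following sketch (ours; the function name is hypothetical and the hard-coded RMS_TRE(T) values are simply taken from Table 3 rather than computed from the first-order model of (28)) evaluates equation (4) at the three boundary targets:

```python
import numpy as np

def overall_rms_tre(rms_tre_tracking, rms_tre_conoprobe=0.17):
    """Combine the independent tracking and conoprobe error components (equation 4)."""
    return np.sqrt(rms_tre_tracking**2 + rms_tre_conoprobe**2)

# Target points along the laser axis for our 250 mm lens (f = 245 mm, delta = 180 mm)
f, delta = 245.0, 180.0
targets_mm = [f - delta / 2, f, f + delta / 2]          # 155, 245, 335 mm

# RMS_TRE(T) at these targets: Polaris Spectra estimates reported in Table 3
for dist, rms_t in zip(targets_mm, [0.70, 0.95, 1.21]):
    print(f"target at {dist:.0f} mm: overall RMS_TRE = {overall_rms_tre(rms_t):.2f} mm")
```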

Accuracy estimation case studies

We considered the NDI Polaris Spectra, a state-of-the-art optical tracking system available in most operating rooms, and the NDI Optotrak Certus, an optical tracking system with superior accuracy. For both case studies, the RMSTRE(T) and overall RMSTRE at a point measured with our optically tracked conoprobe were estimated using Danilchenko and Fitzpatrick’s method (28).

Case study I: NDI Polaris Spectra

The NDI Polaris Spectra has RMSFLE = 0.33 mm (extended pyramidal volume) according to the specifications of the manufacturer. The FLE is anisotropic, which means it varies with direction (29,30). In particular, the error along the z axis (σz) is assumed to be three times higher than along the x and y axes (σx and σy). Numerical values are given in Table 1. From these values it follows that, since the conoprobe is more accurate than the tracking system, the overall RMSTRE at a measured point is dominated by the RMSTRE(T) of the tracking system.

Table 1.

FLE characteristics for the NDI Polaris Spectra and Optotrak Certus. All values in mm.

FLE model     Polaris Spectra (anisotropic)    Optotrak Certus (isotropic)
Σ_FLE         diag(1/99, 1/99, 1/11)           diag(1/300, 1/300, 1/300)
RMS_FLE       0.33                             0.1

Case study II: NDI Optotrak Certus

We considered the NDI Optotrak Certus optical tracking system in our second case study. The Optotrak Certus has a stated RMSFLE of 0.1 mm. In comparison to the Polaris Spectra, this system has three cameras with higher resolution, resulting in a larger working volume and better accuracy. We assume the FLE to be isotropic, with the variances stated in Table 1.

Calibration

In order to enable optically tracked surface measurements with any kind of collinear laser distance measurement sensor, the device is equipped with a rigid body, which assigns a coordinate frame C to the device. As shown in Figure 3a, the laser beam has a direction d̂ defined in the frame C, which corresponds to one axis of the lens frame L. The measured distance d is with respect to the origin of L, which is defined by the translational offset l with respect to frame C.

Figure 3.

Figure 3

Calibration of a 1D conoscopic holography sensor. (a) The calibration parameters are the offset vector l to the lens frame L and the laser direction in the local coordinate frame C of the sensor. (b) To determine the calibration parameters, a point p, known with respect to the world coordinate frame W, is measured by the conoscopic holography sensor from different locations. The local coordinate frame Ci of the sensor is measured as Ti with respect to the world coordinate frame W. Additionally, the corresponding laser distance measurements di are recorded

A measured point p can be expressed in the local coordinate frame C of the laser distance sensor, as follows:

$$\mathbf{p}_C = \mathbf{l} + d\,\hat{\mathbf{d}} \qquad (5)$$

where d is the distance reported by the sensor and d̂ is the unit vector along the laser direction in frame C.

Problem formulation

For determination of the calibration parameters, the conoprobe is repeatedly aimed at varying orientations toward points, each of which is measured with a ground-truth measurement system (e.g. a tracked point probe) and recorded: p1, …, pi, …, pn, where n is the number of points. For each point, the conoprobe’s measured distance di and the rigid transformation Ti from the ground-truth coordinate frame to coordinate frame C are also recorded. These quantities are related as follows:

$$\mathbf{p}_{C,i} = \mathbf{l} + d_i\,\hat{\mathbf{d}} = \mathbf{q}_i, \qquad i = 1, \ldots, n \qquad (6)$$

where we have defined qi = Ti pi for notational convenience. This is illustrated in Figure 3b.

Our problem is to use these measurements to determine the vectors l and d̂. This problem is similar to the well-known hand–eye calibration problem, which aims to simultaneously solve for two unknown spatial relationships, i.e. in its classical definition from the camera to the hand and from the robot coordinate system to the calibration pattern. While there are numerous solutions available for that problem, they are needlessly complex for our present problem, because only the origin and one axis of the ‘eye’ system are required. Furthermore, there is no unknown calibration-block pose to contend with in this problem. We present here a calibration that employs a minimal parameterization tailored to our specific problem.

We assume that any difference between $\mathbf{p}_{C,i}$ and $\mathbf{q}_i$ arises from normally distributed random noise with zero mean. Therefore, the vector $\mathbf{l}$ and the unit vector $\hat{\mathbf{d}}$ can be determined by least-squares fitting, i.e. we find the $\mathbf{l}$ and $\hat{\mathbf{d}}$ that together minimize $\sum_{i}^{n} \left|\mathbf{l} + d_i\hat{\mathbf{d}} - \mathbf{q}_i\right|^2$. To find their values, we first note that:

$$\sum_{i}^{n} \left|\mathbf{l} + d_i\hat{\mathbf{d}} - \mathbf{q}_i\right|^2 = \sum_{i}^{n} \left|\tilde{d}_i\hat{\mathbf{d}} - \tilde{\mathbf{q}}_i\right|^2 + n\left|\mathbf{l} + \bar{d}\hat{\mathbf{d}} - \bar{\mathbf{q}}\right|^2 \qquad (7)$$

where $\bar{d}$ and $\bar{\mathbf{q}}$ are the means of $d_i$ and $\mathbf{q}_i$, $\tilde{d}_i = d_i - \bar{d}$, and $\tilde{\mathbf{q}}_i = \mathbf{q}_i - \bar{\mathbf{q}}$. It can be seen that equation (7) can be minimized by minimizing the first term on the right and then setting $\mathbf{l} = \bar{\mathbf{q}} - \bar{d}\hat{\mathbf{d}}$. We note that:

$$\sum_{i}^{n} \left|\tilde{d}_i\hat{\mathbf{d}} - \tilde{\mathbf{q}}_i\right|^2 = \sum_{i}^{n} \tilde{d}_i^2 + \sum_{i}^{n} \left|\tilde{\mathbf{q}}_i\right|^2 - 2\,\hat{\mathbf{d}} \cdot \sum_{i}^{n} \tilde{d}_i\tilde{\mathbf{q}}_i \qquad (8)$$

Only the third term involves $\hat{\mathbf{d}}$, and that term is minimized by:

$$\hat{\mathbf{d}} = \sum_{i}^{n} \tilde{d}_i\tilde{\mathbf{q}}_i \Bigg/ \left|\sum_{i}^{n} \tilde{d}_i\tilde{\mathbf{q}}_i\right| \qquad (9)$$

Thus:

$$\mathbf{l} = \bar{\mathbf{q}} - \bar{d}\,\sum_{i}^{n} \tilde{d}_i\tilde{\mathbf{q}}_i \Bigg/ \left|\sum_{i}^{n} \tilde{d}_i\tilde{\mathbf{q}}_i\right| \qquad (10)$$

Equations (9) and (10), together with the definitions of $\bar{d}$, $\bar{\mathbf{q}}$, $\tilde{d}_i$ and $\tilde{\mathbf{q}}_i$, provide the desired calibration parameters.
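
The closed-form solution translates directly into a few lines of code. The sketch below is our illustration of equations (9) and (10), not the authors' implementation; the function name and array conventions are assumptions:

```python
import numpy as np

def calibrate_conoprobe(d, q):
    """Closed-form calibration (equations 9 and 10).

    d : (n,) array of conoprobe distance readings d_i
    q : (n, 3) array of the known point expressed in sensor frame C, q_i = T_i p_i
    Returns the laser direction d_hat (unit vector) and lens offset l, both in frame C.
    """
    d = np.asarray(d, dtype=float)
    q = np.asarray(q, dtype=float)
    d_bar, q_bar = d.mean(), q.mean(axis=0)
    d_tilde, q_tilde = d - d_bar, q - q_bar

    s = d_tilde @ q_tilde            # sum_i d_tilde_i * q_tilde_i, a 3-vector
    d_hat = s / np.linalg.norm(s)    # equation (9)
    l = q_bar - d_bar * d_hat        # equation (10)
    return d_hat, l
```

Because the estimate is closed form, no iterative optimization or initial guess is required.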

Calibration data acquisition

To acquire calibration data, we chose a fixed fiducial in the shape of a marked cross on a rigid surface. Using the set-up shown in Figure 4a, the conoprobe was pointed at the cross from 30 different poses with varying distances (spanning the measurement range of the device). The conoprobe was attached to a passive articulated holding arm (MA60003, Noga Engineering, Israel), which held it fixed in each pose while the pose of the rigid body was measured using the optical tracker and the distance reported by the conoprobe was recorded. To establish the ground-truth position p of the fiducial, its position was measured using a tracked point probe (depicted in Figure 4b). Since the fiducial was fixed, pi = p = constant.

A total of five calibration datasets were acquired. For each, the calibration error was analysed. The measurements of each dataset were transformed into the world coordinate frame using:

$$\mathbf{q}_i = T_i\left(\mathbf{l} + d_i\,\hat{\mathbf{d}}\right) \qquad (11)$$

with i = 1, …, n, where n is the number of measurements in the dataset and l, d̂ are the calibration parameters determined using the same dataset. The root mean square error (RMSE) was determined as:

$$\mathrm{RMSE} = \sqrt{\frac{1}{n}\sum_{i=1}^{n} \left\|\mathbf{q}_i - \mathbf{p}\right\|^2} \qquad (12)$$
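
For completeness, a matching sketch (again ours, assuming 4 × 4 homogeneous pose matrices for the tracked rigid body) evaluates equations (11) and (12) against the fixed ground-truth point:

```python
import numpy as np

def calibration_rmse(T_list, d, d_hat, l, p):
    """Residual RMSE of a calibration against a known fixed point p (equations 11, 12).

    T_list : list of (4, 4) homogeneous poses of the sensor frame C in world coordinates
    d      : (n,) conoprobe distance readings
    d_hat, l : calibration parameters in frame C
    p      : (3,) ground-truth point in world coordinates
    """
    residuals = []
    for T, di in zip(T_list, d):
        pc = l + di * d_hat                 # point in sensor frame C (equation 5)
        qi = T[:3, :3] @ pc + T[:3, 3]      # transform into world frame (equation 11)
        residuals.append(np.linalg.norm(qi - p))
    return np.sqrt(np.mean(np.square(residuals)))
```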

Experimental evaluation of measurement accuracy

To characterize the accuracy of an optically tracked conoprobe in image-guided surgery, a series of experimental evaluations was performed with the system components described at the beginning of this section. Table 2 summarizes the main characteristics of the experiments. The experiments described in this subsection evaluate the measurement accuracy of optically tracked conoscopic holography in general, whereas the experiments in the following section are specifically designed to evaluate the accuracy of surface-based image registration using surface measurements acquired with an optically tracked conoscopic holography sensor.

Table 2.

Summary of accuracy evaluation experiments performed in this paper. For each experiment type, the material, the number of datasets acquired and the range of measurement points per dataset are given. The acquisition parameters of the conoprobe are also given, i.e. frequency (in Hz), power and average signal-to-noise ratio (both in %)

Type Name Material No. of datasets No. of points Frequency (Hz) Power (%) SNR (%)
Point measurement Calibration Adhesive fiducial 5 30 700 45 88
TRE Adhesive fiducial 5 15–28 700 45 88
Relative point Step phantom with adhesive fiducial 14 9 700 45 88

Surface scan Plane Matte black ceramic planar surface 20 635–2788 600 60 84
Sphere Teflon sphere 6 620–873 700 25 84
Step phantom Step phantom with nine divot fiducials 16 1569–4830 600 40 80
Phantom kidney Silicone rubber 5 477–1865 500 40 85
Human kidney Ex vivo human cadaver kidney 5 1022–1371 700 40 86

Target registration error (TRE)

To characterize the TRE of an optically tracked conoprobe, as theoretically derived in the accuracy analysis section, measurements of one fixed point in space p0 were performed. A marked cross on a rigid surface was used as the measurement point. A total of five datasets were acquired, each containing 22–30 measurements. The pose of the optical tracker was changed between datasets. For each measurement, the position and orientation as well as the distance of the conoprobe were altered. The conoprobe was held by the passive articulated arm described earlier, while the pose determined by the optical tracking system and the distance were recorded. The measurements of each dataset were then transformed into the world coordinate frame using equation (11). The calibration parameters l and d̂ were chosen as described in the Calibration section.

The ground-truth position of p0 was determined using a tracked point probe: the position was averaged over a 20 s sampling period while the point probe was pivoted about p0. The TRE was then determined by:

$$\mathrm{RMS}_{\mathrm{TRE}} = \sqrt{\frac{1}{n}\sum_{i=1}^{n} \left\|\mathbf{q}_i - \mathbf{p}_0\right\|^2} \qquad (13)$$

with n being the number of measurements.

Relative point measurement accuracy

The accuracy of tracked conoscopic holography for relative measurements was determined using a step phantom (depicted in Figure 4b). Each of the nine black platforms of the phantom was equipped with a self-adhesive fiducial point in the form of a cross. The ground truth position of the fiducials was determined using a tracked point probe, giving Pdivot = {p1, …, p9}. For each dataset, the nine fiducials were measured using the tracked conoprobe, giving:

Pmeasured = {q1, …, q9}.

For the measurement of each fiducial, the angle and distance of the conoprobe were adjusted randomly and the probe was then fixed in space using the passive articulated holding arm. All measurements were acquired with respect to the world coordinate frame, represented by a rigid body close to the step phantom (approximately 20 cm). A total of 14 datasets were acquired; the pose of the optical tracking system with respect to the experimental set-up was changed for each dataset.

While altering the optical tracker pose between acquisition of the datasets, the spatial relationship between the step phantom and the world frame remained constant; hence, all measurements are in the same frame, i.e. are registered to each other. That allows direct comparison between the points measured with the conoprobe and the ground-truth points Pdivot.

For evaluation of the relative measurement accuracy, we compared the deviations in the relative positions for all possible pairs of fiducials per dataset (36 combinations) with the ground truth:

$$\mathrm{RMS}_{\mathrm{relative}}^2 = \frac{1}{n}\sum_{i,j} \left\|(\mathbf{q}_i - \mathbf{q}_j) - (\mathbf{p}_i - \mathbf{p}_j)\right\|^2 \qquad (14)$$

where n = 36 and (i, j) ranges over all pairs of the nine fiducials without repetition.
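
Equation (14) can be computed compactly as sketched below (our illustration; the array shapes reflect the nine-fiducial set-up described above):

```python
import numpy as np
from itertools import combinations

def relative_rms(q, p):
    """RMS of pairwise relative position errors (equation 14).

    q : (9, 3) fiducial positions measured with the tracked conoprobe
    p : (9, 3) ground-truth fiducial positions from the tracked point probe
    """
    errors = [np.linalg.norm((q[i] - q[j]) - (p[i] - p[j]))
              for i, j in combinations(range(len(q)), 2)]   # 36 pairs for 9 fiducials
    return np.sqrt(np.mean(np.square(errors)))
```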

Geometric object surface measurements

The main application of tracked conoscopic holography measurements in image-guided surgery is the acquisition of surface points on the organ of interest. While the previously described methods for assessment of the accuracy dealt with single point measurements, the methods in this section deal with the acquisition of surface measurements from geometric, non-medical objects.

Plane

We chose a matte black ceramic planar surface for these experiments. We acquired a total of 20 datasets, each containing a tracked continuous conoprobe measurement of the planar surface. The conoprobe was hand-held, orientated at varying angles and distances with respect to the surface, and manually scanned with varying velocities along the surface. The pose of the optical tracking system was changed between each dataset acquisition to eliminate bias. Twelve datasets were acquired without any laparoscopic tube attached, and eight datasets using the laparoscopic tube.

The measurements were evaluated by applying principal component analysis (PCA) to determine the best-fitting plane for each dataset. For each measurement point in a given dataset, we determined the closest point distance to the fitted plane of that dataset as a measure of the accuracy of the planar surface measurement.
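
A plane fit by PCA and the resulting point-to-plane residuals can be sketched as follows (ours; the paper does not specify an implementation):

```python
import numpy as np

def plane_fit_residuals(points):
    """Fit a plane to a scan by PCA and return point-to-plane distances.

    points : (n, 3) surface points from one tracked conoprobe scan
    """
    centroid = points.mean(axis=0)
    centered = points - centroid
    # The plane normal is the right singular vector with the smallest singular value
    _, _, vt = np.linalg.svd(centered, full_matrices=False)
    normal = vt[-1]
    return np.abs(centered @ normal)   # perpendicular distance of each point

# Example use: rms = np.sqrt(np.mean(plane_fit_residuals(scan)**2))
```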

Sphere

For this evaluation series, we measured white Teflon spheres (Small Parts Inc., Miami Lakes, FL, USA) which were precisely manufactured to 12.7 mm radius with a 25 μm tolerance. We acquired a total of six datasets. For each we manually scanned the Teflon sphere with the tracked conoprobe, varying the scanning angle and distance during the measurements. The pose of the optical tracking system was changed between each dataset acquisition to eliminate bias.

We determined the best-fitting sphere in a least-squares sense for each dataset. We compared the radius of the fitted sphere to the ground-truth radius of the sphere, and we evaluated the distance error of each measurement point from the ground-truth sphere.
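
The paper states only that the best-fitting sphere was determined in a least-squares sense; one common algebraic formulation is sketched below (our illustration, not necessarily the authors' method):

```python
import numpy as np

def fit_sphere(points):
    """Least-squares sphere fit (algebraic formulation).

    Rewrites |x - c|^2 = r^2 as a linear system in (c, r^2 - |c|^2).
    points : (n, 3) surface points; returns centre c and radius r.
    """
    A = np.hstack([2.0 * points, np.ones((len(points), 1))])
    b = np.sum(points**2, axis=1)
    sol, *_ = np.linalg.lstsq(A, b, rcond=None)
    centre = sol[:3]
    radius = np.sqrt(sol[3] + centre @ centre)
    return centre, radius

# Radial error of each point relative to a sphere of known centre and radius:
# errors = np.abs(np.linalg.norm(points - centre, axis=1) - radius)
```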

Evaluation of conoscopic surface measurements for image registration

In order to evaluate the accuracy of surface registrations using conoscopic holography surface measurements, we performed three experimental series. For all experiments, the surface of the subject was measured using the optically tracked conoprobe and the resulting surface point cloud was used for surface-based registration with CT images. The determined transformation was compared to a ground-truth point-based registration in terms of fiducial registration error (FRE) and TRE.

Step phantom

We utilized the step phantom (see relative point measurement subsection) for a series of non-medical surface measurements. A total of 16 datasets were acquired. For each dataset the tracked conoprobe measurement of the step phantom surface was acquired by manually scanning over the surface, varying the scanning angle and distance during the measurements. The pose of the optical tracking system was changed between each dataset acquisition, to eliminate bias.

To establish ground truth for the evaluation of the measured point clouds, a CT scan of the step phantom was acquired. The resolution of the CT dataset was 0.684 mm in the xy plane, with a 1 mm slice thickness. Each platform of the step phantom contains a centred disk with a 3 mm hemispherical divot. These divots were measured using a tracked point probe before acquisition of the conoscopic measurements; the centroid of the measurement tip, located near the centre of the disk, was determined with respect to the world coordinate frame. To determine the ground-truth transformation between the image coordinate system and the physical world coordinate system, each divot position was manually segmented in the CT images. Using rigid point-based registration, the homogeneous transformation T(GT) between the world coordinate frame W and the image coordinate frame CT was determined; it serves as the ground-truth registration. We further generated a surface model of the step phantom from the CT data. Segmentation was performed using the open-source DICOM viewer OsiriX (31).
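
The rigid point-based registration between corresponding divot positions can be computed with the standard SVD-based closed-form solution. The sketch below is our illustration and not necessarily the exact routine used to obtain T(GT):

```python
import numpy as np

def rigid_point_registration(src, dst):
    """Standard SVD-based rigid point registration (one common closed-form solution).

    src, dst : (n, 3) corresponding point sets (e.g. divot positions in world
    coordinates and their segmented counterparts in CT coordinates).
    Returns R, t such that dst ~= src @ R.T + t.
    """
    src_c, dst_c = src.mean(axis=0), dst.mean(axis=0)
    H = (src - src_c).T @ (dst - dst_c)
    U, _, Vt = np.linalg.svd(H)
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])  # avoid reflections
    R = Vt.T @ D @ U.T
    t = dst_c - R @ src_c
    return R, t

# FRE of the registration:
# fre = np.sqrt(np.mean(np.sum((dst - (src @ R.T + t))**2, axis=1)))
```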

The transformation matrix T(ICP) between the world coordinate frame W and the image coordinate frame CT was determined for each conoscopic dataset, using surface matching. A variant of the iterative closest point (ICP) algorithm was employed that uses robust statistics for surface-based registration (32). The ICP algorithm iteratively establishes point correspondences for the current alignment of the two datasets and computes a rigid transformation minimizing the FRE (33). The ICP variant used here further refines the ICP estimate by perturbing the solutions and conducting a local heuristic search through possible registrations, searching for the registration that gives the best least-squares fit for the most points (32). In our experiment, the ground truth estimate was used as the initial estimate for the registration. We note that the ICP algorithm will converge to the global minimum for this optimal initialization.
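
For orientation, a bare-bones ICP loop is sketched below. It omits the robust statistics and perturbation search of the variant actually used (32), reuses rigid_point_registration from the previous sketch, and assumes SciPy is available and that the CT surface model is sampled as a point set:

```python
import numpy as np
from scipy.spatial import cKDTree

def basic_icp(points, model, R, t, iterations=50):
    """Plain ICP: alternate closest-point matching and least-squares alignment.

    points : (n, 3) conoprobe surface points in world coordinates
    model  : (m, 3) points sampled from the CT surface model
    R, t   : initial estimate (here the ground-truth registration)
    """
    tree = cKDTree(model)
    for _ in range(iterations):
        moved = points @ R.T + t
        _, idx = tree.query(moved)                   # closest-point correspondences
        R, t = rigid_point_registration(points, model[idx])
    return R, t
```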

For each derived T(ICP), the FRE and TRE of the registration were calculated. The TRE is calculated by transforming all measured points xi of a dataset into the image coordinate system using T(ICP) and subsequently transforming the image points back to the world coordinate system using T(GT)^−1. This gives a set of target locations yi. The mean TREICP for these targets is defined as:

$$\mathrm{TRE}_{\mathrm{ICP}} = \sqrt{\frac{1}{N}\sum_{i=1}^{N} \left\|\mathbf{y}_i - \mathbf{x}_i\right\|^2} \qquad (15)$$

with N being the number of points in the dataset.
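
Equation (15) can then be evaluated as in the sketch below (ours; rotation/translation pairs stand in for the homogeneous transformations T(ICP) and T(GT)):

```python
import numpy as np

def tre_icp(points, R_icp, t_icp, R_gt, t_gt):
    """Surface-registration TRE (equation 15).

    Each measured point x_i is mapped into image space with T(ICP), then back
    into world space with the inverse of the ground-truth registration T(GT);
    the RMS displacement from the original points is reported.
    """
    x_img = points @ R_icp.T + t_icp          # world -> image via T(ICP)
    y = (x_img - t_gt) @ R_gt                 # image -> world via T(GT)^-1
    return np.sqrt(np.mean(np.sum((y - points) ** 2, axis=1)))
```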

Medical objects

In this series of experiments, the tracked conoprobe was evaluated for surface measurement of medical objects, with application to surface registration in image-guided surgery. In particular, we focused on a human kidney as the experimental target.

We utilized a tissue-mimicking anthropomorphic phantom of a human kidney, made of silicone rubber (Dragon Skin, Smooth-On Inc., Easton, PA, USA), and a fresh ex vivo human cadaver kidney covered in perirenal fat. Since the human kidney consists of soft tissue and is thus subject to morphological changes, we performed the experiment in a flat-panel C-arm operating room (Allura Xper FD20/20, Philips Medical Systems) in order to eliminate morphological changes between CT acquisition and conoscopic surface measurement. Acquired CT datasets had isotropic voxels with 0.491 mm resolution. The image coordinate frame is referred to as I.

The specimen was situated on the operating table, with a rigid body defining a fixed reference frame about 25 mm away (referred to as the patient coordinate system P). A CT dataset was acquired prior to conoscopic surface measurement. The specimen and reference rigid body did not move during the conoscopic surface measurement. For each specimen, a total of five datasets were acquired. The pose of the optical tracking system was changed between the datasets to exclude bias. The conoprobe was hand-held and manually positioned over the specimen. The scanning angle and distance were altered during the measurements. Figure 5 shows the experimental set-up in the OR.

Figure 5.

Figure 5

Conoscopic surface measurement of a human ex vivo kidney covered in perirenal fat

The acquired CT datasets were manually segmented to extract a surface model of the specimen. Figure 6 shows the determined surface model for the anthropomorphic kidney phantom and one acquired dataset before surface registration. We performed surface-based registration using a variant of the ICP algorithm (see previous subsection) with the respective surface models.

Figure 6.

Figure 6

Surface model of the anthropomorphic kidney phantom in the image coordinate system I (left) and the conoscopic surface measurement acquired in the patient coordinate system P (right). After successful surface-based registration, the measured points can be transformed into the image coordinate system using the transformation $^{I}T_{P}$ (bottom)

The ground truth registration from image to physical space was performed, making use of the reference rigid body visible in the CT dataset. The centres of the reflective spheres were manually segmented in the CT dataset. The coordinates of the spheres from the rigid body configuration were used as the corresponding point set. Rigid point-based registration was used to determine T(GT) from the patient coordinate frame to the image coordinate frame. As described in the previous section, this ground truth transformation was used as initial estimate for the ICP algorithm. For each determined T(ICP) the FRE and TRE of the registration were calculated, as well as the corresponding mean errors and standard deviations.

Results

Accuracy analysis

Case study I: NDI Polaris Spectra

The expected RMSTRE(T) for the NDI Polaris Spectra at the minimum measurement distance is 0.70 mm, at the focal distance 0.95 mm and at the maximum measurement distance 1.21 mm. Using equation (4), the combined RMSTRE is expected to be 0.72 mm for the minimum measurement distance, 0.97 mm at the centre and 1.22 mm at the maximum measurement distance. All values are summarized in Table 3 and expressed in the rigid-body frame of the conoprobe, as shown in Figure 2.

Table 3.

Estimated TRE for the NDI Polaris Spectra and Optotrak Certus. The RMSTRE of the tracking system and the corresponding variances are estimated for targets representing the measurement range of the conoprobe (f − Δ/2, f, f + Δ/2). The overall RMSTRE, including the FLE of the conoprobe, is also given. All values in mm

                                        Polaris Spectra       Optotrak Certus
f − Δ/2   RMS_TRE(T)                    0.7                   0.27
          [σx², σy², σz²]_TRE(T)        [0.33, 0.14, 0.02]    [0.05, 0.03, 0.0008]
          Total TRE_min                 0.72                  0.32

f         RMS_TRE(T)                    0.95                  0.37
          [σx², σy², σz²]_TRE(T)        [0.63, 0.26, 0.02]    [0.08, 0.05, 0.0008]
          Total TRE_focus               0.97                  0.40

f + Δ/2   RMS_TRE(T)                    1.21                  0.46
          [σx², σy², σz²]_TRE(T)        [1.03, 0.42, 0.02]    [0.14, 0.08, 0.0008]
          Total TRE_max                 1.22                  0.49

The anisotropic FLE of the Polaris Spectra results in a significantly higher deviation in the x direction, which corresponds to the direction from the Spectra to the conoprobe (the tracker’s depth axis). The FLE of the conoprobe is comparatively small, and its contribution to the overall RMSTRE is therefore insignificant.

Case study II: NDI Optotrak Certus

For the NDI Optotrak Certus, the expected RMSTRE(T) is 0.27 mm at the minimum measurement distance, 0.37 mm at the focal distance and 0.46 mm at the maximum measurement distance. The combined RMSTRE is expected to be 0.32 mm at the minimum measurement distance, 0.40 mm at the centre and 0.49 mm at the maximum measurement distance (see Table 3).

Calibration

For the five calibrations performed, the RMSE was determined and the results are given in Table 4 (calibration datasets are listed in chronological order of collection). The corresponding absolute mean errors and standard deviations (SDs) were also determined, as well as the maximum error. The nominal error is zero mean. Table 4 shows a trend toward decreasing error. We believe that the person performing the experiments was subject to a learning curve, i.e. as more experiments are performed, the user’s ability to accurately aim the conoprobe laser at a fiducial improves.

Table 4.

Results for five calibrations (mm)

Calibration Mean SD RMSE Max
1 0.75 0.34 0.82 1.60
2 0.63 0.48 0.79 2.24
3 0.48 0.28 0.56 1.01
4 0.51 0.29 0.59 1.38
5 0.53 0.25 0.58 0.93

We further evaluated the convergence behaviour of the calibration algorithm. To do this, we determined the RMSE of a calibration calculated with a subset of three to n data points out of each dataset. The root of the sum of squared RMSE for each subset was determined and plotted against the number of data points used for calibration (see Figure 7). The convergence can be approximated by $\mathrm{RMSE} = 2.7\,n^{-1}$, where n is the number of data points.

Figure 7.

Figure 7

Convergence behaviour of the calibration. The RMSE is plotted against the number of data points (n) used for calibration. The dashed line is the convergence function $2.7\,n^{-1}$

Experimental evaluation of measurement accuracy

Target registration error (TRE)

Five datasets were evaluated using the best calibration (see Table 4, row five). Over all datasets we determined a mean absolute error of 0.69 mm, with a SD of 0.33 mm, which results in an RMSTRE of 0.77 mm. The RMSTRE component in the x direction, 0.55 mm, is the highest, followed by the y direction at 0.40 mm and the z direction at 0.33 mm. All results are summarized in Table 5.

Table 5.

Experimentally determined target registration error results for five datasets using the same calibration. The mean absolute error and SD, Cartesian RMSE, RMSE in each direction and maximum error are reported per dataset (1–5), as well as the mean values over all datasets. All values are in mm

Dataset Mean SD RMS||.|| RMSx RMSy RMSz Max
1 0.59 0.29 0.65 0.41 0.43 0.28 1.20
2 0.81 0.50 0.95 0.75 0.48 0.36 2.24
3 0.87 0.41 0.96 0.86 0.30 0.32 1.72
4 0.65 0.20 0.68 0.46 0.39 0.30 1.09
5 0.62 0.29 0.69 0.40 0.42 0.38 1.18
Total 0.69 0.33 0.77 0.55 0.40 0.33 2.24

Relative point measurement accuracy

The relative measurement accuracy was evaluated in 14 datasets. The relative measurement error was determined over all datasets, i.e. 504 points. The mean absolute error, SD, RMSE and observed maximum error are summarized in Table 6. As expected, the error was largest in the x direction at 0.85 mm, which was the optical axis of the optical tracking system in our set-up. The error in the z direction was the smallest, at 0.38 mm.

Table 6.

Average relative point measurement error for 14 datasets. All values in mm

Mean SD RMSE Max
x 0.65 0.55 0.85 3.58
y 0.41 0.31 0.52 1.68
z 0.27 0.27 0.38 1.95
||.|| 0.91 0.56 1.07 3.82

Geometric object surface measurements

Plane

A total of 20 datasets of a planar surface were acquired and evaluated. For the 12 non-laparoscopic planar surface scans, the weighted arithmetic mean error over all datasets was 0.56 mm, with a weighted SD of 0.5 mm. As weights, we used the ratio of the number of measurement points in each dataset to the total number of measurement points over all datasets. The corresponding RMSE was 0.74 mm, and the maximum error observed was 4.98 mm. For the eight scans performed through a laparoscopic port, the weighted arithmetic mean error over all datasets was 0.64 mm, with a weighted SD of 0.58 mm; the RMSE was 0.86 mm and the maximum error 4.82 mm. The laparoscopic tube thus did not substantially impact the measurement accuracy.

Sphere

The error of the fitted sphere radius compared to the ground truth radius of the Teflon sphere was between −1.22 and 0.14 mm. The weighted arithmetic mean of the distance error of all measurement points to the best fitting sphere over all datasets was 0.58 mm, with a weighted SD of 0.36 mm. The RMSE was 0.68 mm. The maximum observed deviation was 2.61 mm.

Evaluation of conoscopic measurements for image registration

Step phantom

The FRE of the point-based registration, which served as ground truth, was 0.46 mm. The weighted arithmetic mean FRE over all registrations was 1.08 mm, with a SD of 1.32 mm. The corresponding RMSFRE was 1.68 mm. For the target registration error, the weighted arithmetic mean error over all registrations was 0.78 mm, with a SD of 0.19 mm. The overall RMSTRE was 0.8 mm. The maximum observed TRE was 1.18 mm. An example of surface registration is depicted in Figure 8a, b.

Figure 8.

Figure 8

Example results of the surface registration. The left column indicates colour-coded FRE values and the right column colour-coded TRE values. (a, b) Step phantom; (c, d) anthropomorphic kidney phantom; (e, f) human ex vivo cadaver kidney covered in perirenal fat

Medical objects

For the anthropomorphic kidney phantom, the FRE of the ground truth point-based registration was 0.16 mm. The weighted arithmetic mean FRE over all five datasets was 0.75 mm, with a SD of 0.61 mm, resulting in an RMSFRE of 0.97 mm. The overall RMSTRE was 1.51 mm. The maximum observed TRE was 2.32 mm.

For the ex vivo human cadaver kidney, the FRE of the ground truth point-based registration was 0.38 mm. The weighted arithmetic mean error over all five datasets was 0.81 mm, with a SD of 0.59 mm, resulting in an RMSFRE of 1 mm. The overall RMSTRE was 0.8 mm. The maximum observed TRE was 1.74 mm. All results are summarized in Table 7.

Table 7.

Anthropomorphic kidney phantom and ex vivo kidney registration results. All values in mm

                      FRE                      TRE
Dataset               Mean    SD     RMS       Mean    SD     RMS    Max
Kidney phantom 1 0.83 0.68 1.08 1.67 0.37 1.71 2.32
2 0.98 0.82 1.28 0.95 0.15 0.96 1.30
3 0.95 0.80 1.24 1.40 0.37 1.45 2.19
4 0.81 0.63 1.03 1.45 0.17 1.46 1.75
5 0.56 0.39 0.68 1.61 0.16 1.61 1.91

Total 0.75 0.61 0.97 1.49 0.25 1.51 2.32

Ex vivo kidney 1 0.79 0.56 0.97 1.22 0.10 1.22 1.46
2 0.82 0.59 1.01 0.60 0.10 0.61 0.87
3 0.83 0.61 1.03 0.83 0.39 0.92 1.74
4 0.76 0.56 0.95 0.50 0.12 0.51 0.78
5 0.84 0.62 1.05 0.72 0.28 0.77 1.32

Total 0.81 0.59 1.00 0.77 0.22 0.80 1.74

Discussion

The accuracy of an optically tracked conoscopic holography sensor depends on the accuracy of the conoscopic distance measurement itself, the accuracy of the tracking system used and the accuracy of calibration. To address the calibration issue, we presented a closed-form calibration method based on point-based registration. Noteworthy conclusions of our calibration work are that it is accurate (0.58 mm RMSE achieved), has a similar convergence curve to standard point-based registration results, and is applicable to any tracked distance measurement device.

Of the other two sources of error, we showed theoretically and experimentally that the optical tracking system is the dominant source of error in the system. Our experimental TRE evaluation of the conoprobe system (RMSTRE 0.77 mm) agreed with the theoretical calculations, where the RMSTRE at the focal distance was estimated as 0.97 mm. Manufacturers’ specifications and the geometry of the laser spot led to an FLE estimate of 0.17 mm for the conoprobe. Our results suggest that the FLE of the conoprobe is probably smaller than this conservative assumption.

Two important conclusions can be drawn from these considerations. First, our results are comparable to prior results achieved in open surgery with laser range scanners, which are considered to be sufficient for surgical applications in abdominal soft tissues (34). Second, if some future clinical scenario requires a higher accuracy threshold, it can be achieved by using a more accurate tracking system. For example, in case study II of the accuracy estimation we calculated that replacing the NDI Polaris Spectra with an NDI Optotrak Certus would decrease the TRE by > 55%. One could also use an encoded physical arm for tracking (e.g. from FARO Technologies Inc., USA), which should be expected to decrease the TRE still further.

One last noteworthy issue in our experiments is that some overhanging points can be observed in the step phantom experiments (see Figure 8a, b). These are a result of synchronization delay between the conoprobe measurement and the optical tracking system. The conoprobe records data at 500–700 Hz, whereas the Polaris Spectra has a frame rate of 30 Hz. Unfortunately, the Polaris software interface does not allow measurements to be time-stamped or triggered at precise time points, nor is the time between measurements precisely repeatable (personal communication with NDI, 21 July 2011). The fact that we achieved good registration results with the kidney phantom and the cadaver kidney (which were scanned fairly rapidly, at a velocity comparable to what we expect surgeons will wish to use clinically) indicates that this small synchronization lag does not degrade overall system performance. However, it is an issue for the engineer to be aware of; the clinician should be advised not to scan at excessive speed until more rapid optical trackers reach widespread clinical use, or until NDI makes the delay deterministic so that the lag can be accounted for when the surgeon desires to scan extremely rapidly.

Conclusions

We have presented a comprehensive accuracy evaluation and experiments with an optically tracked conoscopic holography sensor for application in image-guided surgery. The system acquires 3D surface measurements of objects of medical interest, which can be used to register pre-operative images to anatomy. Our accuracy evaluation experiments show RMS errors that are consistently < 1 mm for point measurements. Surface-based registrations performed using the system show a mean closest point error of < 1 mm after registration. The average RMSTRE in our human ex vivo cadaver kidney trials was 0.8 mm. These results suggest that conoprobe-enabled scans can be useful in image-guided surgery; they enable accuracy comparable to the existing methods used in open surgery.

Minimally invasive laparoscopic surgery, such as partial nephrectomy, can profit from image guidance in order to support the surgeon in removing tumours (35). An interesting future direction is direct coupling of the conoprobe to a robot, so that scanning and re-registration could be carried out continually during the surgery. We also believe there will be many other surgical indications for conoscopic holography beyond the kidney. Examples include liver surgery and other soft-tissue applications in abdominal, cranio-maxillofacial, sinus and neurosurgery. For example, in endonasal surgery, a tracked conoprobe could collect internal points through the nostril to enhance the facial surface-based registration typically used. We are currently performing laparoscopic animal experiments in order to evaluate in vivo accuracy. We are acquiring in vivo human data on tumour resection cavities in neurosurgery in order to perform direct comparisons with laser range scanning (36). Compensating for intraoperative tissue deformation is a challenging task in image-guided surgery. We have had success with using surfaces acquired from laser-range scanning technology to initialize complex mathematical models for compensating for deformation in neurosurgery and liver applications (14,37). We are investigating surface acquisition with the conoprobe in place of the laser range scanner in our deformation correction framework.

In conclusion, conoscopic holography provides a measurement method for extending surface-based registration techniques to the laparoscopic setting. Our experimental results show that it can achieve accuracies comparable to LRS and endoscopic techniques. Its advantages are the robustness and accuracy of conoprobe measurements, and the fact that the system makes use of proven and commercially available modules (the conoprobe and the Polaris, the latter of which is already present in many operating rooms). Thus, we believe that this approach has the potential to bring the advantages of image guidance to minimally invasive human surgeries in the near-term future.

Acknowledgments

This study was supported in part by the National Institute of Neurological Disorders and Stroke Grant No. R01 NS049251 and the National Cancer Institute Grant No. R01 CA162477 of the National Institutes of Health (NIH). The authors would like to thank Valerie D. Waggoner for her help with CT data acquisition and Andrew D. Wiles from NDI for insights on the Polaris Spectra measurement acquisition. We also wish to thank the anonymous reviewers whose suggestions greatly improved this paper.

Footnotes

This article was published online on July 4, 2012. Errors were subsequently identified. This notice is included in the online and print versions to indicate that both have been corrected August 13, 2012.

References

1. Fitzpatrick JM, Hill DLG, Maurer CR. Image registration. In: Sonka M, Fitzpatrick JM, editors. Handbook of Medical Imaging, vol 2: Medical Image Processing and Analysis. SPIE Press; 2000. pp. 449–514.
2. Peters TM, Cleary K. Image-guided Interventions: Technology and Applications. Springer; 2008.
3. Dogangil G, Davies BL, Rodriguez y Baena F. A review of medical robotics for minimally invasive soft tissue surgery. Proceedings of the Institution of Mechanical Engineers, Part H: J Eng Med. 2010;224(5):653–679. doi: 10.1243/09544119JEIM591.
4. Strong EB, Rafii A, Holhweg-Majert B, et al. Comparison of three optical navigation systems for computer-aided maxillofacial surgery. Arch Otolaryngol Head Neck Surg. 2008;134(10):1080–1084. doi: 10.1001/archotol.134.10.1080.
5. Eggers G, Mühling J, Marmulla R. Image-to-patient registration techniques in head surgery. Int J Oral Maxillofac Surg. 2006;35(12):1081–1095. doi: 10.1016/j.ijom.2006.09.015.
6. Krishnan R, Raabe A, Seifert V. Accuracy and applicability of laser surface scanning as new registration technique in image-guided neurosurgery. Int Congr Ser. 2004;1268:678–683.
7. Marmulla R, Hassfeld S, Lueth T, et al. Laser-scan-based navigation in cranio-maxillofacial surgery. J Craniomaxillofac Surg. 2003;31(5):267–277. doi: 10.1016/s1010-5182(03)00056-8.
8. Schicho K, Figl M, Seemann R, et al. Comparison of laser surface scanning and fiducial marker-based registration in frameless stereotaxy. J Neurosurg. 2007;106(4):704–709. doi: 10.3171/jns.2007.106.4.704.
9. Cash DM, Sinha TK, Chapman WC, et al. Incorporation of a laser range scanner into image-guided liver surgery: surface acquisition, registration, and tracking. Med Phys. 2003;30(7):1671. doi: 10.1118/1.1578911.
10. Pheiffer TS, Lennon B, Simpson AL, et al. Development of a novel laser range scanner. Proc SPIE. 2011;7964:796424.
11. Suppa M, Kielhofer S, Langwald J, et al. The 3D-modeller: a multi-purpose vision platform. IEEE International Conference on Robotics and Automation (ICRA). 2007:781–787.
12. Tobergte A, Konietschke R, Hirzinger G. Planning and control of a teleoperation system for research in minimally invasive robotic surgery. IEEE International Conference on Robotics and Automation (ICRA). 2009:4225–4232.
13. Clements LW, Chapman WC, Dawant BM, et al. Robust surface registration using salient anatomical features for image-guided liver surgery: algorithm and validation. Med Phys. 2008;35(6):2528. doi: 10.1118/1.2911920.
14. Dumpuri P, Clements LW, Dawant BM, et al. Model-updated image-guided liver surgery: preliminary results using surface characterization. Progr Biophys Mol Biol. 2010;103(2–3):197–207. doi: 10.1016/j.pbiomolbio.2010.09.014.
15. Hayashibe M, Suzuki N, Nakamura Y. Laser-scan endoscope system for intraoperative geometry acquisition and surgical robot safety management. Med Image Anal. 2006;10(4):509–519. doi: 10.1016/j.media.2006.03.001.
16. Okatani T. Shape reconstruction from an endoscope image by shape from shading technique for a point light source at the projection center. Comput Vision Image Understand. 1997;66(2):119–131.
17. Stoyanov D, Darzi A, Yang GZ. Dense 3D depth recovery for soft tissue deformation during robotically assisted laparoscopic surgery. Medical Image Computing and Computer-Assisted Intervention (MICCAI). Springer; Berlin: 2004. pp. 41–48.
18. Wengert C, Cattin PC, Duff JM, et al. Markerless endoscopic registration and referencing. Medical Image Computing and Computer-Assisted Intervention (MICCAI). 2006:816–823. doi: 10.1007/11866565_100.
19. Hu M, Penney G, Figl M, et al. Reconstruction of a 3D surface from video that is robust to missing data and outliers: application to minimally invasive surgery using stereo and mono endoscopes. Med Image Anal. 2010;16(3):597–611. doi: 10.1016/j.media.2010.11.002.
20. Fuchs H, Livingston MA, Raskar R, et al. Augmented reality visualization for laparoscopic surgery. Medical Image Computing and Computer-Assisted Intervention (MICCAI). 1998:934–943.
21. Ackerman JD. Surface reconstruction of abdominal organs using laparoscopic structured light for augmented reality. Proc SPIE. 2002;4661:39–46.
22. Penne J, Höller K, Stürmer M, et al. Time-of-flight 3D endoscopy. Medical Image Computing and Computer-Assisted Intervention (MICCAI). Springer; Berlin: 2009. pp. 467–474.
23. Groch A, Seitel A, Hempel S, et al. 3D surface reconstruction for laparoscopic computer-assisted interventions: comparison of state-of-the-art methods. Proc SPIE. 2011;7964:796415.
24. Sirat GY. Conoscopic holography. I. Basic principles and physical basis. J Optic Soc Am A. 1992;9(1):70.
25. Sirat GY. Conoscopic holography. II. Rigorous derivation. J Optic Soc Am A. 1992;9(1):84.
26. Lathrop RA, Hackworth DM, Webster RJ III. Minimally invasive holographic surface scanning for soft-tissue image registration. IEEE Trans Biomed Eng. 2010;57(6):1497–1506. doi: 10.1109/TBME.2010.2040736.
27. Fitzpatrick JM, West JB, Maurer CR. Predicting error in rigid-body point-based registration. IEEE Trans Med Imag. 1998;17(5):694–702. doi: 10.1109/42.736021.
28. Danilchenko A, Fitzpatrick JM. General approach to first-order error prediction in rigid point registration. IEEE Trans Med Imag. 2011;30(3):679–693. doi: 10.1109/TMI.2010.2091513.
29. Wiles AD, Thompson DG, Frantz DD. Accuracy assessment and interpretation for optical tracking systems. Proc SPIE. 2004;5367:421–432.
30. Wiles AD, Likholyot A, Frantz DD, Peters TM. A statistical model for point-based target registration error with anisotropic fiducial localizer error. IEEE Trans Med Imag. 2008;27(3):378–390. doi: 10.1109/TMI.2007.908124.
31. Rosset A, Spadola L, Ratib O. OsiriX: an open-source software for navigating in multidimensional DICOM images. J Digital Imag. 2004;17(3):205–216. doi: 10.1007/s10278-004-1014-6.
32. Ma B, Ellis RE. Robust registration for computer-integrated orthopedic surgery: laboratory validation and clinical experience. Med Image Anal. 2003;7(3):237–250. doi: 10.1016/s1361-8415(02)00133-0.
33. Besl PJ, McKay ND. A method for registration of 3D shapes. IEEE Trans Pattern Anal Machine Intell. 1992;14(2):239–256.
34. Miga MI, Dumpuri P, Simpson AL, et al. The sparse data extrapolation problem: strategies for soft-tissue correction for image-guided liver surgery. Proc SPIE. 2011;7964:79640C.
35. Glisson C, Ong R, Simpson A, et al. Registration methods for gross motion correction during image-guided kidney surgery. Int J Comput Aid Radiol Surg. 2011;6(S1):160–161.
36. Simpson AL, Burgner J, Chen I, et al. Intraoperative brain tumor resection cavity characterization with conoscopic holography. Proc SPIE. 2011;8316:831631.
37. Cash DM, Miga MI, Sinha TK, et al. Compensating for intra-operative soft tissue deformations using incomplete surface data and finite elements. IEEE Trans Med Imag. 2005;24(11):1479–1491. doi: 10.1109/TMI.2005.855434.
