Author manuscript; available in PMC: 2017 Oct 1.
Published in final edited form as: IEEE Trans Biomed Eng. 2016 Jun 21;64(4):946–958. doi: 10.1109/TBME.2016.2582734

Development and Phantom Validation of a 3D-Ultrasound-Guided System for Targeting MRI-visible Lesions during Transrectal Prostate Biopsy

Yipeng Hu 1, Veeru Kasivisvanathan 2, Lucy A M Simmons 3, Matthew J Clarkson 4, Stephen A Thompson 5, Taimur T Shah 6, Hashim U Ahmed 7, Shonit Punwani 8, David J Hawkes 9, Mark Emberton 10, Caroline M Moore 11, Dean C Barratt 12

Abstract

Objective

Three- and four-dimensional transrectal ultrasound transducers are now available from most major ultrasound equipment manufacturers, but currently are incorporated into only one commercial prostate biopsy guidance system. Such transducers offer the benefits of rapid volumetric imaging, but can cause substantial measurement distortion in electromagnetic tracking sensors, which are commonly used to enable 3D navigation. In this paper, we describe the design, development and validation of a 3D-ultrasound-guided transrectal prostate biopsy system that employs high-accuracy optical tracking to localize the ultrasound probe and prostate targets in 3D physical space.

Methods

The accuracy of the system was validated by evaluating the targeted needle placement error after inserting a biopsy needle to sample planned targets in a phantom using standard 2D ultrasound guidance versus real-time 3D guidance provided by the new system.

Results

The overall mean needle-segment-to-target distance error was 3.6±4.0 mm and the mean needle-to-target distance was 3.2±2.4 mm.

Conclusion

A significant increase in needle placement accuracy was observed when using the 3D guidance system compared with visual targeting of invisible (virtual) lesions using a standard B-mode ultrasound-guided biopsy technique.

Index Terms: prostate cancer, image-guided interventions, image registration, tracking, transrectal biopsy

I. Introduction

Prostate cancer is the most commonly diagnosed noncutaneous cancer in the Western world and a leading cause of cancer-related death in men in North America, Australasia, and many countries in Western and Northern Europe. In the USA, for example, the estimated numbers of new prostate cancer cases and deaths resulting from the disease in 2015 were 220,800 and 27,540, respectively [1]. Transrectal needle biopsy performed under transrectal ultrasound (TRUS) guidance is the standard clinical method for diagnosing and staging prostate cancer, and has become one of the most common diagnostic procedures performed in hospitals [2], [3]. Although it is very widely practiced, a major limitation of this method is that prostate tumors are in general poorly distinguished in B-mode TRUS images. Therefore, unlike most other solid-organ cancers, tumor-targeted biopsy cannot be performed unless a tumor is clearly visible, and is therefore not standard practice. As a direct result of the subsequent sampling error, a proportion of clinically significant prostate cancers – i.e. cancers that are likely to progress and require treatment – are missed, and approximately 10% of patients require one or more repeat biopsies to establish a correct diagnosis [4]. Risk stratification can also be adversely affected by sampling error because histopathology results may be only poorly representative of the true burden (volume) and grade of the disease: the reported discordance between the biopsy-determined Gleason grade and the pathological Gleason grade determined following surgical excision of the gland ranges from 17% to 43% [5], [6]. In one study that investigated the clinical significance of this discrepancy, biopsy led to a “clinically significant” undergrading in 23% of patients [7], implying that a significant proportion of patients with high-risk disease may not receive appropriate treatment based on biopsy results. Furthermore, since the widespread adoption of elevated prostate specific antigen (PSA) as a screening test for prostate cancer and an indication for prostate biopsy, there is widespread recognition that clinically insignificant disease is over-detected by conventional biopsy methods and, as a consequence, over-treated [8], [9].

In recent years, the increased availability of magnetic resonance imaging (MRI), combined with growing evidence supporting its clinical use in the detection and localization of prostate cancer [10], has led to increasing interest in tumor-targeted biopsy, in which MRI-visible lesions that are deemed to be “suspicious” (i.e. regions that may harbor cancer based on their radiological appearance) are sampled preferentially. This approach is currently being evaluated by a number of clinical research groups internationally and offers the potential to improve the detection of clinically significant tumors whilst avoiding very small, low-risk tumors that are not detected by MRI and are of no clinical consequence. MRI-targeted lesion sampling also has the potential to increase biopsy sampling efficiency, measured as the number of tissue samples required to establish a diagnosis, as well as grading accuracy compared with conventional TRUS-guided biopsy [11]–[13]. A recent large-scale clinical study concluded that computer-assisted targeting of MRI-visible lesions led to an increased detection of high-risk prostate cancer and a decreased detection of low-risk prostate cancer [14].

Targeting lesions identified on MRI during prostate biopsy is usually performed using either visual targeting or a computer guidance system that incorporates MRI-TRUS image registration/fusion software. In the former, the operator’s anatomical knowledge and 3D perception are used to locate the MRI-visible lesions mentally within TRUS images acquired during a biopsy procedure, whereas most computer guidance systems use US to provide real-time feedback on the inserted needle location, combined with MRI-TRUS image registration (i.e. alignment) to present the operator with a graphical representation of an MRI-visible tumor superimposed onto B-mode TRUS images [15], [16]. An alternative approach, currently available only in a relatively small number of centers internationally, is to perform the procedure in an MRI scanner under direct image guidance [17]. This avoids the need for cross-modality image fusion, but has the disadvantage of requiring special-purpose, MRI-compatible equipment and instruments, and is in general a more complex and resource-intensive procedure compared with TRUS-guided biopsy. In particular, performing a biopsy within a closed-bore diagnostic MR scanner limits access to the patient and constrains the patient position, resulting in a more technically demanding and time-consuming procedure, although robotically controlled needle insertion may help to overcome these problems [18], [19].

A number of research and commercial computer-assisted guidance systems for performing MRI-targeted prostate biopsy under real-time TRUS guidance have been developed [16], [20]–[26]. A recent review of commercial systems marketed in the USA can be found in Marks et al. [16]. In order to achieve an accurate registration of MRI and TRUS images, and in particular to allow compensation of prostate deformation between scans, 3D TRUS images are generally required. Various methods have been employed to acquire 3D US images, most commonly freehand 3D acquisition, in which the motion of a curvilinear array transducer is tracked using a mechanical arm or electromagnetic position sensors attached to the probe casing [24], [27], [28]. Specialized 3D/4D transducers suitable for prostate imaging have also been employed [22], [23], [29]. All commercially available 3D/4D transducers for prostate imaging use an internal motor (sometimes called a “wobbler”) to rotate a transducer array within the probe casing as B-mode images are recorded at regular angular intervals. These 2D images are then reconstructed into a 3D volume. This approach has the advantage of rapid volume acquisition, with acquisition rates of 0.2-1.0 volumes per second achievable depending on the 2D image field of view. Compared with freehand 3D US imaging, it also minimizes the risk of artefacts in the reconstructed volume caused by varying probe pressure deforming the prostate as the probe is moved in the rectum, because the probe is held stationary during 3D imaging. Using a 3D/4D transducer also avoids reconstruction artefacts arising from probe tracking measurement errors, which can be difficult to correct (for example, by registering misaligned B-mode images as described by Treece et al. and Solberg et al. [30], [31]). Finally, motorized, swept-volume acquisition has the benefit that reconstructing the US volume is relatively simple from a computational perspective, as the image data samples are regularly spaced.

Although endo-rectal 3D/4D US imaging transducers are available from nearly all major ultrasound scanner manufacturers (including Analogic (BK and Ultrasonix), GE, Hitachi, Philips, Siemens, and Toshiba), they have rarely been employed within computer-assisted prostate biopsy systems. Moreover, in the systems described by Bax et al. and Long et al. [22], [27], 3D/4D transducers are employed, but probe tracking is not; for instance, the system developed by Bax et al. uses a rapid, automatic, non-rigid registration method to register successive 3D TRUS images and determine the relative probe motion with respect to an MRI-identified target. This approach is implemented by the commercial Koelis Urostation (Koelis, La Tronche, France) [32].

In this paper, we describe a computer-assisted transrectal biopsy guidance system that incorporates a 3D/4D TRUS transducer to enable rapid 3D imaging of the prostate for the purposes of MRI-TRUS registration, whilst avoiding the problem of probe pressure artefacts when reconstructing 3D US images. The system software performs non-rigid MRI-TRUS image registration, introduced in our previous publications [33]–[35], to account for organ deformation between MRI and TRUS imaging. Given the substantial 3D tracking errors that can occur when electromagnetic (EM) tracking sensors are attached to a motorized 3D/4D US transducer [36], 3D optical tracking similar to that used in neuronavigation systems was adopted as an alternative. In the remainder of this paper, we describe the system and present the results of phantom validation experiments to assess the accuracy of targeting US-visible and invisible (virtual) lesions, identified in pre-biopsy MRI scans, with and without 3D guidance.

II. Materials and Methods

A. System Overview

The 3D-TRUS-guided biopsy system developed in this research is illustrated in Fig. 1. The system comprises the following main components: a commercial US scanner (Ultrasonix SonixMDP, Ultrasonix Medical Co., British Columbia, Canada) equipped with a motorized 3D/4D transrectal imaging transducer (4DEC9-5/10; 5-9 MHz broadband curvilinear array); an optical 3D position tracking system (Polaris Vicra, Northern Digital Inc., Waterloo, Ontario, Canada); and a PC workstation (Dell Intel(R) Core(TM) i7 2.8 GHz CPU with 16 GB RAM) with custom-written software installed that incorporates a graphical user interface (GUI) for biopsy planning, image registration, and real-time probe/needle navigation (Fig. 2). The TRUS probe is tracked so that the 3D position and orientation of the probe and the predicted biopsy needle trajectory relative to the last acquired 3D TRUS volume can be calculated and presented as a graphical display on the PC. The 3D graphic of the TRUS transducer and needle trajectory (assuming no needle deflection) is updated in real-time and provides visual feedback to the operator, enabling him/her to orientate the probe and needle until the virtual trajectory intersects a predetermined target within the prostate gland. All software was developed using a combination of MATLAB (The MathWorks, Natick, MA) and C++. Details of each system component are provided in the following sections.

Fig. 1. Photographs of the 3D TRUS-guided prostate biopsy system. Top: A stereo tracking camera measures the 3D probe position and orientation in real-time. The tracked attachment comprises 4 retro-reflective spherical markers mounted on a cross-shaped piece that is fixed to the probe casing. Bottom: A needle guide, through which the biopsy needle is inserted, is attached to the shaft of the TRUS probe. The probe is inserted into a tissue-mimicking prostate biopsy training phantom (the synthetic prostate gland within the phantom is coloured blue).

Fig. 2. Screen shots of the GUI developed for biopsy guidance. Top: the registered gland (yellow mesh) and target (yellow sphere) displayed on the monitor after registration. Bottom: when the needle trajectory intersects the target, the target changes color to green.

B. 3D Probe/needle Tracking

As shown in Fig. 1, the tracked object has four spherical, retro-reflective markers attached, each of which is detected by the stereo camera and localized in 3D with respect to the camera coordinate system. The tracking device outputs a rigid-body transformation that specifies the physical 3D position and orientation of the tracked object – and hence the US probe and US images (see also Section II.D) – relative to the camera. Compared with EM tracking, which can be prone to errors caused by field distortions from metallic objects and sources of EM interference within the operating region [37], optical tracking provides relatively high accuracy and reliability in real-world clinical environments [38], [39]. However, an unobstructed line-of-sight between the markers of the tracked probe attachment and the camera must be maintained, and the markers must remain within the operating range of the camera (approx. 0.56-1.34 m for the NDI Vicra), to track the US probe pose continuously. For our system, both of these criteria were met by mounting the camera on a surgical mounting clamp approximately one meter from the TRUS probe, with the camera positioned higher and looking down so that all markers are within the field-of-view. In practice, care also needs to be taken not to rotate the probe about its axis so far that some of the markers are no longer in view, and in exceptional cases the camera may need to be moved to ensure an optimal tracking region during a biopsy. A real-time visual indication of whether the probe is being tracked was incorporated into the software interface to alert the user when tracking is inadequate.

C. 3D TRUS Imaging

The 3D TRUS probe used in this work has no external moving parts and is able to acquire a 3D TRUS image of the prostate within 3 seconds. Software was written in C++ to retrieve and reconstruct data stored by the US scanner that specify the pixel intensity values for each radial B-mode scanline, the physical length of the scanline (in millimeters), the in-plane angle of the field-of-view of the B-mode images, the angular positions of each B-mode image plane, the coordinates of the origins of the scanlines, and the center of rotation of the transducer. Using these data, the position vector of any point identified in a reconstructed US volume can be determined with respect to a local image coordinate system. A 3D US image of the scanned volume is reconstructed by interpolating the US intensity values at measured locations to calculate the intensity values across a rectangular grid. The in-plane field-of-view of the sector-shaped B-mode image was set to 144.8°, which was sufficient to image a phantom prostate with a volume up to 56.4 cc. Each B-mode image comprised 128 scanlines with 780 intensity samples over a physical length of 60.06 mm. During a volume acquisition, 97 B-mode images were recorded, resulting in a sweep angle of 74.16°. The predefined 3D spatial position of each image slice was calibrated and verified by scanning a calibration phantom of known geometry. A brachytherapy ultrasound QA phantom (Model 045, CIRS Inc., Norfolk, VA, USA), containing a set of parallel wires separated by fixed, known distances, was used in this study. As illustrated in Fig. 3, by locating straight wires positioned approximately in the middle of the field-of-view of different frames, the angle, θ_x (x = 1, 2, …, 97), of each frame with respect to the reference plane (indicated by the dotted line perpendicular to the two example wires in Fig. 3) and the distance, R, between the rotation center and the image frames were estimated using the following equations:

θ_x = arccos( D_y / (y_x^A − y_x^B) )

and

R = (y_1^B y_2^A − y_1^A y_2^B) / (y_1^A + y_2^B − y_2^A − y_1^B),

where y_x^A and y_x^B are the y coordinates of the two wires identified in the xth image frame, and D_y is the distance between the two parallel wires.
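As an illustration, a minimal Python sketch of these two estimates is given below (the authors' software was written in MATLAB and C++; all variable names and the example values here are ours, not the published implementation):

```python
# Sketch of the wire-based frame-geometry estimation. Inputs are the y
# coordinates (mm) of wires A and B identified manually in each frame.
import numpy as np

def frame_angle(y_A, y_B, D_y):
    """theta_x = arccos(D_y / (y_A - y_B)), in radians, for one frame."""
    return np.arccos(D_y / (y_A - y_B))

def rotation_radius(y1_A, y1_B, y2_A, y2_B):
    """Estimate R from the wire positions identified in two frames."""
    return (y1_B * y2_A - y1_A * y2_B) / (y1_A + y2_B - y2_A - y1_B)

# Example with fictitious coordinates and a 10 mm wire separation:
theta_deg = np.degrees(frame_angle(y_A=25.0, y_B=14.0, D_y=10.0))
```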

Fig. 3. A B-mode image of a phantom with multiple parallel wires, two of them labelled Wire A and Wire B, was used to determine R and the angle θ_x of each frame (see text). The geometric relationship between these wires is shown on the right. At least two frames are required to determine R, while two points in each frame are required to determine θ_x.

D. System Calibration

For simplicity, a pinhead-based calibration method was adopted in this work, which involved scanning a pinhead in a water bath from multiple view angles as described in [40], [41]. This method was chosen for its simplicity and availability, but many of the methods for calibrating tracked 3D US transducers described in the literature could be used as an alternative (e.g. see Bergmeir et al. [42]). The experimental set-up for calibration is illustrated in Fig. 4 and the 3D coordinate systems for the 3D US guidance system are shown in Fig. 5. The relative position of the probe and attached needle guide is fixed with respect to the US image coordinates during a biopsy procedure. With reference to Fig. 5, calibration is required to calculate the 3D transformation between the US image coordinate system (i) and the local coordinate system of the tracked object (t) attached to the US probe. Once this transformation is determined, target coordinates defined in a US volume can be expressed within the world coordinate system (w) – a fixed global coordinate system defined by the spatial position of the tracking camera in physical space – and related to the predicted biopsy needle trajectory. In the proposed system, the locations of targets are determined after registering an MRI-derived 3D anatomical model of the prostate (including the biopsy plan) to the 3D TRUS image of the prostate, as described in Section II.F.

Fig. 4. Photographs of the calibration experiments: (a) the experimental setup; (b) a pin inserted into a blue slab of ultrasound absorbing material; (c) and (d) 3D imaging of the pinhead in a water bath with the tracked 3D TRUS transducer held in different positions and orientations by a flexible clamp.

Fig. 5. Illustration of the three coordinate systems for different components of the 3D guidance system: the world and tracking coordinate systems are defined by the positions of the tracking camera and the tracked object attached to the US transducer casing, respectively. The tracking system reports the transformation between these coordinate systems. The image coordinate system is defined by the voxel positions of the US volume.

For each pinhead image, the tracking system reports the transformation from tracking coordinates to world (i.e. camera) coordinates, denoted here by the transformation function T_tw. The coordinates of a point location in image, tracking, and world coordinates are specified by the position vectors P_i, P_t and P_w, respectively, each containing x-, y- and z coordinates in millimeters. Because the tracked object is fixed with respect to both the US transducer and the imaged 3D US volume, the transformation from image coordinates to tracking coordinates, T_it, is also fixed. Therefore, P_t = T_it(P_i), P_w = T_tw(P_t) and P_w = T_tw(T_it(P_i)). The pinhead-based calibration exploits the fact that, when the tracking camera and the pin remain fixed in physical space, P_w is invariant as the position and orientation of the US probe (and hence the tracked object) is varied for each scan of the pinhead. For the nth scan, P_w^n = T_tw^n(T_it(P_i^n)), where P_i^n is the position vector that defines the location of the pinhead in US image coordinates. This location was found by manually identifying the pinhead coordinates in the acquired image volume. T_tw^n was obtained directly from the optical tracking system. The calibration transformation, T_it, and the unknown location of the pinhead in world coordinates, given by the position vector P_w, were estimated by solving a set of linear equations from a calibration experiment in the least-squares sense [41]. The sum-of-squares residual, which represents the difference between the estimated pinhead location, P̂_w, and the transformed locations, P_w^n, is minimized.

Although 2D B-mode US images are normally used with this calibration method [41] – and can be used to calibrate a system incorporating a 3D/4D transducer, such as the one described here – we found it more convenient to identify the pinhead within a reconstructed US volume. This avoids the need to orientate the transducer so that the pinhead appears within each B-mode image plane, leading to a much quicker procedure.
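To make the estimation concrete, the sketch below jointly fits T_it (parameterized as a rotation vector plus translation) and P_w by minimizing the residuals over all N scans. This is a generic non-linear least-squares stand-in for the linear formulation of [41], and the input arrays are placeholders:

```python
# Sketch of the pinhead calibration: minimize
# sum_n ||T_tw^n(T_it(P_i^n)) - P_w||^2 over the rigid transformation T_it
# and the fixed world-space pinhead location P_w.
import numpy as np
from scipy.optimize import least_squares
from scipy.spatial.transform import Rotation

def residuals(params, P_i, R_tw, t_tw):
    # params = [rotation vector (3), translation (3) for T_it, P_w (3)]
    R_it = Rotation.from_rotvec(params[:3]).as_matrix()
    t_it, P_w = params[3:6], params[6:9]
    P_t = P_i @ R_it.T + t_it                             # image -> tracking
    P_w_pred = np.einsum('nij,nj->ni', R_tw, P_t) + t_tw  # tracking -> world
    return (P_w_pred - P_w).ravel()

# P_i: (N, 3) pinhead picks in each US volume (mm); R_tw: (N, 3, 3) rotations
# and t_tw: (N, 3) translations reported by the tracker for each scan.
# solution = least_squares(residuals, x0=np.zeros(9), args=(P_i, R_tw, t_tw))
```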

E. Biopsy Planning and 3D Deformable Prostate Model Generation

In this and in our prior work, a model-to-image registration approach is adopted in which a 3D geometric model of the prostate, derived from an MRI scan obtained prior to a biopsy procedure, is aligned to surface features extracted automatically from B-mode TRUS images obtained during the procedure [35]. This effectively transforms the problem into a feature-based registration task. The three-dimensional models take the form of a finite element (FE) mesh computed from the 3D triangulated surface meshes reconstructed from prostate boundary contours manually delineated in a T2-weighted MRI scan on a slice-by-slice basis. MRI-visible lesions are also included in the model by delineating suspicious regions within any image from a multi-parametric MRI sequence in the same way. For this study, the OsiriX DICOM viewing software (Pixmeo SARL, Bernex, Switzerland) was used to define MRI contours, and the 3D surface and volumetric mesh computations were performed automatically using custom-written software as described in [35]. Fig. 6 illustrates the process.

Fig. 6. Generation of the MRI-derived 3D prostate models. (a) Example slices of an MRI volume of the phantom used in this study, with the contours of the prostate capsule (shown in green) and hypoechoic lesions (blue) delineated on each slice to the left of the urethra, which appears as a high-intensity circular region; (b) An example of a virtual, isoechoic lesion contoured in blue to the right of the urethra (see Section II.H); (c) Example slices of an MR volume of a patient with the contours of the prostate capsule (shown in yellow) and lesion (red) delineated on each slice; (d) The gland and lesion contours segmented from (c) are displayed in 3D space and a triangulated surface mesh is fitted.

To initialize the model-to-US-image registration, the coordinates of the following six anatomical landmarks are also identified in the MRI: the apex, the base, and the most anterior, posterior, left and right points on the capsule boundary, defined in the middle of the gland in a transverse view. These points are matched to the six corresponding landmarks defined in the field-of-view of a TRUS volume at the start of a biopsy procedure by maneuvering the US probe, so subsequent landmark selection by hand during a procedure is not required (see Section II.F for details). Further points representing biopsy targets can be included in the 3D model if required and constitute a plan for the biopsy procedure; the locations of tissue samples removed during a biopsy can also be represented in the same way (i.e. by 3D points) and recorded for future clinical use.

During a model-to-image registration, prostate models can deform to account for tissue deformation between MRI and TRUS scans. Model deformation is constrained by the modes of variation of a patient-specific statistical motion model (SMM – a specific type of statistical shape model that captures object motion). SMMs were built using simulated training data generated by performing FE simulations of prostate deformations due to varying TRUS probe position, orientation, and balloon diameter, using the methodology described in [35]. In summary, 500 FE simulations of TRUS-probe-induced gland deformation were performed, each with different, randomly assigned material properties and mechanical loadings. The prostate model used in these simulations was represented by a tetrahedral mesh, which in turn describes the 3D geometry of the prostate in a “reference state” defined by its shape in the T2-weighted MRI scan. To apply this technique in the guidance system presented here, it was assumed that large prostate deformations – defined as deformations where the average surface vertex displacement, excluding rigid movement, exceeds 10 mm – are not encountered, since, with training, the transrectal pressure exerted on the prostate by the US probe can be regulated effectively by the operator so that excessive deformations are avoided. (“Excessive deformations” in this context are defined as having a magnitude approximately equivalent to compressive strains on the order of 20-30%.) This assumption enabled the simplified FE simulations developed in our previous research to be used to approximate the prostate deformations caused by the probe pressures that are typically sufficient during TRUS-guided biopsy to achieve adequate acoustic coupling and hence good quality images [33]. In addition, the ranges of the simulated probe translation and rotation components used to generate training data for the SMM were extended by 50% compared with those specified in our previous work, which focused on modelling the range of TRUS probe motion encountered during transperineal needle biopsy and interventions such as HIFU [43].
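Conceptually, the SMM construction amounts to a principal component analysis of the flattened displacement fields produced by the 500 simulations. The sketch below is a simplified stand-in for the published method [35], with illustrative names:

```python
# Sketch: build an SMM by PCA of FE-simulated node displacements, where each
# of the n_sims simulations contributes one row of 3 * n_nodes values.
import numpy as np

def build_smm(displacements):
    """displacements: (n_sims, 3 * n_nodes) array of node displacements."""
    mean = displacements.mean(axis=0)
    X = displacements - mean
    # Right singular vectors of the centered data are the motion modes.
    _, s, Vt = np.linalg.svd(X, full_matrices=False)
    stdevs = s / np.sqrt(len(X) - 1)  # standard deviation of each mode weight
    return mean, Vt, stdevs
```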

F. Image Registration

During a procedure, the 3D deformable prostate model (with an optional associated biopsy plan) is registered directly to 3D TRUS images using the semi-automatic algorithm developed in our previous research [35]. This algorithm matches the prostate capsule surfaces defined by the model and the 3D US image by iteratively aligning surface normal vectors computed for the 3D model to corresponding vectors automatically extracted from the US volume by applying a Hessian-based boundary surface enhancement algorithm. During the registration, the model is deformed by varying the weights of the principal modes of shape variation within 2 standard deviations, thereby ensuring that the deformation is constrained and the shapes adopted are representative of those predicted by the biomechanical simulations. In this sense, the model deformation is physically constrained. Once matched, one or more MRI-visible tumor targets and any additional targets defined within the biopsy plan are transformed into US image coordinates using the dense displacement field obtained from the model-US registration transformation.
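The sketch below illustrates the ±2 standard-deviation constraint on the mode weights and, as a crude stand-in for the dense displacement field used by the published algorithm [35], carries planned target points along with the displacement of their nearest mesh nodes:

```python
# Sketch: constrained model deformation from SMM modes, plus a simplified
# target transformation (nearest-node displacement; illustrative only).
import numpy as np

def deform(mean, modes, stdevs, weights):
    """Clamp mode weights to +/-2 SD and reconstruct node displacements."""
    w = np.clip(weights, -2.0 * stdevs, 2.0 * stdevs)
    return (mean + w @ modes).reshape(-1, 3)   # (n_nodes, 3) displacements

def transform_targets(targets, nodes, node_disp):
    """Move each target by the displacement of its nearest mesh node."""
    idx = np.argmin(np.linalg.norm(nodes[None] - targets[:, None], axis=2), axis=1)
    return targets + node_disp[idx]
```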

The adopted algorithm has a reported capture range of at least 5 mm in each of the x-, y- and z directions for landmark-based initialization [35], meaning that a robust initialization is still required. The presented system is initialized by setting the US transducer to its central position – i.e., so that the sagittal plane of the TRUS image is centered and parallel with the longitudinal axis of the probe – and maneuvering the probe until a B-mode TRUS image of the sagittal mid-gland is obtained, as illustrated in Fig. 7. At this position, the TRUS view corresponds to the central slice of the reconstructed 3D volume. The six anatomical TRUS landmarks, defined in Section II.E and corresponding to points identified in the pre-biopsy MRI image, do not need to be identified manually, since they are fixed relative to the field-of-view; i.e. apex-base along the x-axis, anterior-posterior along the y-axis, and left-right along the z-axis of the TRUS image. By moving the probe, these landmarks can be roughly aligned with the prostate gland being imaged using US. This initialization procedure ensures that the initial orientation of the prostate mesh model containing the biopsy plan is aligned roughly with the base-apex axis, and provides useful starting estimates for the rigid component parameters of the MRI/model-to-TRUS-volume registration transformation.
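Because the six landmarks are fixed relative to the TRUS field-of-view, the rigid starting estimate reduces to a standard least-squares alignment of two corresponding 6-point sets; a minimal Kabsch-style sketch (our names, not the system's API) is:

```python
# Sketch: least-squares rigid alignment of the six MRI landmarks to the six
# TRUS landmarks (apex, base, and the anterior/posterior/left/right points).
import numpy as np

def rigid_align(P_mri, P_trus):
    """Return R, t such that P_trus ~= P_mri @ R.T + t (both (6, 3) arrays)."""
    c_m, c_t = P_mri.mean(axis=0), P_trus.mean(axis=0)
    H = (P_mri - c_m).T @ (P_trus - c_t)
    U, _, Vt = np.linalg.svd(H)
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])  # no reflection
    R = Vt.T @ D @ U.T
    t = c_t - R @ c_m
    return R, t
```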

Fig. 7. Example of an initial TRUS image of a prostate phantom (left), just before a 3D volume is acquired. Selecting an imaging plane through the center of the prostate helps to ensure that the field of view of the acquired 3D TRUS volume captures the entire gland. A re-sliced TRUS image from the volume, overlaid with a 3D mesh model derived from the MRI image, is shown on the right.

After an initialization volume is obtained, the 3D TRUS data are transferred automatically to a PC hosting the registration and visualization software via custom-written software that communicates over a private local network using the standard TCP/IP protocol. The automatic registration is then performed.
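The transfer software itself is custom and unpublished; as a generic illustration of the idea, the sketch below sends a reconstructed volume over a TCP socket using a simple length-prefixed header (the host, port, and message format are all assumptions):

```python
# Sketch: send a reconstructed 3D US volume to the host PC over TCP using a
# minimal header (payload size + volume dimensions). Illustrative only.
import socket
import struct
import numpy as np

def send_volume(volume, host="192.168.0.2", port=5000):
    payload = volume.astype(np.uint8).tobytes()
    header = struct.pack("!I3H", len(payload), *volume.shape)
    with socket.create_connection((host, port)) as s:
        s.sendall(header + payload)   # the receiver unpacks the same header
```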

G. Clinical Protocol for 3DUS-guided Biopsy

Fig. 8 illustrates the clinical set-up for the system described in this paper. A carefully designed clinical protocol was created to maximize the targeting accuracy when using the system. This is summarized as follows:

  1. Prior to starting a biopsy procedure, the tracking camera is positioned so that the full range of movement of the TRUS probe expected during a TRUS biopsy can be tracked. The visibility of tracked probe markers is indicated by a color-coded indicator on the GUI as shown in Fig. 2.

  2. A mid-gland view is obtained and a 3D US volume is acquired by pushing a button on the scanner, whilst keeping the probe steady for a few seconds during the acquisition. Once the (initialization) acquisition is complete, the B-mode US image data are transferred automatically to the host PC and a 3D image is reconstructed. The 3D prostate model is then registered automatically to the 3D image by the software.

  3. The alignment of the model surface and the displayed (2D) TRUS image is checked visually by the operator. Different 2D views through the 3D image are obtained by moving the TRUS probe and using the tracking data to calculate the plane of intersection of the current B-mode view with respect to the captured 3D image. If significant misalignment is observed – for example, due to unexpected probe or patient movement during a 3D acquisition – Step 2 is repeated. (In the phantom experiments reported in this paper, repeating the TRUS volume acquisition due to probe motion was necessary three times over all 54 trials.) Following initialization and registration, a real-time graphical representation of the tracked US probe and needle trajectory is displayed to help guide each needle to a target, as shown in Fig. 2. The GUI also reports the predicted target depth with respect to the needle exit point of the needle guide as an indication of the required needle insertion depth (the underlying geometry is sketched after this list). To avoid the need for further tracking of the needle, the operator uses the graduated markings on the needle to insert it to the required depth.

  4. The TRUS probe/needle is orientated so that the virtual needle trajectory intersects the chosen target. This is indicated visually by the target color changing from yellow to green, as shown in Fig. 2. The operator can switch freely between different 2D TRUS views obtained by re-slicing the last acquired 3D US image using a GUI control to inspect the relative position of the target and the needle trajectory.

  5. To account for probe-induced prostate motion, particularly in cases when the angle between the initial and the on-target probe orientations is relatively high, Steps 2-4 can be repeated using the new probe orientation and position to initialize the registration. (In the phantom experiments described in the next section, this was not found to be necessary, because the deformation due to probe motion and probe pressure changes subsequent to the image registration was limited. Therefore, the results presented in Section III are based on performing a single registration).

  6. With the TRUS probe held steady, the needle is inserted using the physical depth markers on the needle as a guide. Once the needle has entered the prostate, the real-time US scanner display is used to visualize the needle location (see Fig. 9).

  7. After making sure that the virtual needle trajectory still intersects the target, the biopsy gun is fired to obtain a tissue sample.

  8. At this point, another (optional) TRUS volume, hereafter referred to as the validation TRUS volume, is acquired, from which the final needle position with respect to the prostate can be determined. This step was included for validation purposes in the experiments described in the next section.
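The target-depth readout in Step 3 and the trajectory check in Step 4 reduce to simple vector geometry in world coordinates; a minimal sketch, with hypothetical names, is given below:

```python
# Sketch: predicted insertion depth along the (assumed straight) needle
# trajectory and the trajectory-to-target miss distance used for the
# yellow-to-green "on target" indication.
import numpy as np

def depth_and_miss_distance(exit_pt, direction, target):
    u = direction / np.linalg.norm(direction)
    depth = (target - exit_pt) @ u            # insertion depth to the target
    closest = exit_pt + depth * u             # closest point on the trajectory
    return depth, np.linalg.norm(target - closest)

# e.g. declare a "hit" when the miss distance is below the 5 mm lesion radius.
```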

Fig. 8. Clinical set-up for the 3D-guided biopsy system.

Fig. 9. Example slices before (left) and after (right) the insertion of the biopsy needle into a phantom. The white arrow indicates the needle in the US image.

H. Validation Experiments

Experiments were carried out to validate the proposed 3D-TRUS-guided system when performing a targeted biopsy of MRI-visible lesions. The aims of the validation experiments were to estimate the needle placement error using computer-assisted targeting and using visual (sometimes called cognitive) targeting, in which conventional B-mode TRUS guidance is employed. Tissue-mimicking US/MRI-compatible prostate biopsy training phantoms (Model 053A-EF, CIRS Inc., Norfolk, VA, USA) were used in this study; these are designed for use with end-firing TRUS probes and contain three spherical lesions, each with a diameter of 10 mm (volume: 0.52 cc), that are clearly visible in both US and T2-weighted MRI scans, as shown in Fig. 6. The lesions have a hypoechoic appearance in B-mode US images. A 3D T2-weighted MRI of each phantom was obtained using a turbo spin echo sequence with an axial in-plane resolution of 0.2865×0.2865 mm/pixel and a slice thickness of 2 mm (Philips Achieva 3.0T MR scanner, Guildford, Surrey, UK). Together with the prostate surface, each of the 3 lesions was segmented in this image and used to construct 3 prostate models and 3D biopsy plans, one for each lesion, using the method described in Section II.E. In addition, three virtual lesions of the same size and shape, not present in the images, were defined at arbitrary locations by manually contouring regions in the MRI images of the prostate phantom, creating three planned isoechoic targets. The virtual targets reflect the common situation in which (tumor) targets are isoechoic in B-mode images in relation to the surrounding prostate tissue, and therefore cannot be targeted directly using US imaging alone. Two phantoms were used; the first was replaced after approximately two-thirds of the experiments, once the US image quality had become unacceptable because of the air-filled needle tracks.

Three experienced clinical research fellows (LS, VK and TS), each of whom had performed over a hundred transrectal biopsies in patients, participated as operators in this study. Each operator biopsied the phantom using the same technique as used in clinical practice, i.e. aiming to direct the distal 18 mm section of the needle so that it passed through the lesion, with the needle tip lying beyond the lesion. During these biopsies, each of the 6 lesions (3 visible + 3 virtual) was targeted 6 times, 3 times using computer-assisted targeting and 3 times using conventional visual targeting. This resulted in 108 biopsy samples in total (3 needle deployments × 6 lesions × 2 targeting methods × 3 observers). For each computer-assisted targeting procedure, each operator followed the steps outlined in Section II.G (including Step 7). To assess the feasibility of Step 5 described in Section II.G, an additional 3D US volume was acquired and the registration was updated immediately prior to each needle insertion; this step was found to add only 30 seconds on average to the whole procedure. When targeting US-visible lesions using the computer-assisted system, the monitor of the US scanner was turned away after the initial 3D US volume had been acquired so that the operator was unable to see a real-time 2D B-mode image, thereby eliminating navigation bias. The probe was also then repositioned to force the user to rely entirely on the feedback provided by the 3D guidance system.

Visual targeting was performed using only 2D B-mode images displayed by the US scanner to guide the needle placement. In the case of targeting image-visible lesions using this technique, lesions were visible to the operator in the real-time TRUS images at all times during the procedure and therefore direct targeting was possible. Although this is not generally representative of clinical practice since only a small proportion of MRI-identified prostate lesions are also visible in B-mode US images, the results provide a useful reference that is representative of the optimum targeting accuracy that can be achieved during an idealized procedure in which direct targeting of a lesion is possible using TRUS.

A 3D TRUS image was acquired immediately after each needle deployment to determine the needle location in 3D and quantify the needle placement accuracy. An example is shown in Fig. 10. The final location of each needle was identified manually from these scans by localizing the needle tip and entry points. Visible lesions were also segmented manually, with their centers defined as the centroids of the segmented surfaces. The virtual lesions were transformed into the space of the validation TRUS volume using an independent, manual, non-rigid registration method described in [44]. In this method, the prostate capsule surface and anatomical landmarks, including the urethra, the hypoechoic lesions, the seminal vesicles, and the apex and base points, identified in both the MRI and 3D US images, were all matched. A leave-one-out test of the accuracy of this registration was carried out in which each of the 3 image-visible lesions was excluded as a registration landmark in turn and used as an independent target to estimate a target registration error (TRE), defined as the Euclidean distance between the centers of corresponding lesions defined in the MRI and US images following registration.

Fig. 10. Example slice from a 3D US image of a biopsy training phantom after needle insertion. The needle is indicated by white arrows and the red dotted region is the target lesion.

To assess the overall biopsy targeting accuracy, the following three distance error measures were calculated in the 3D validation TRUS volume (see Fig. 11): d1, defined as the distance between the center of the target lesion and the closest point on the needle trajectory; d2, defined as the distance between the center of the target lesion and the center of the biopsy sample core (i.e. the 18 mm section of the biopsy needle from which tissue samples are obtained; the sample core was estimated as the 18 mm line segment ending at the needle tip, but this may differ for other biopsy needle designs); and d3, defined as the distance between the center of the target lesion and the closest point on the line segment that represents the sample core.
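A minimal sketch of the three measures, computed from the manually identified needle tip and entry points, is given below (names are ours; the core is taken as the 18 mm segment ending at the tip, per the definition above):

```python
# Sketch: compute d1, d2 and d3 for one needle deployment (all inputs are
# 3-vectors in validation-volume coordinates, in mm).
import numpy as np

def distance_errors(tip, entry, target, core_len=18.0):
    u = (tip - entry) / np.linalg.norm(tip - entry)   # insertion direction
    d1 = np.linalg.norm(np.cross(target - entry, u))  # target to trajectory
    core_start = tip - core_len * u
    d2 = np.linalg.norm(target - 0.5 * (core_start + tip))  # to core center
    s = np.clip((target - core_start) @ u, 0.0, core_len)
    d3 = np.linalg.norm(target - (core_start + s * u))      # to core segment
    return d1, d2, d3
```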

Fig. 11. Illustration of the distance error measures, d1, d2, and d3, estimated in this study (see text for details).

Although d1 is commonly adopted in the literature as an overall measure of biopsy needle placement accuracy, d2 and d3 provide more stringent error metrics that take into account the relative location of the tissue core. In addition, the lesion hit rate and the estimated cancer core length (CCL) – i.e. the length of cancerous tissue within the tissue sample – were computed from the length of the needle segment that intersects the target lesion, measured from the final 3D TRUS image (see Fig. 11). The CCL is a well-established clinical measure for classifying medium- versus low-risk disease. When this length was non-zero, a lesion “hit” was recorded. The maximum CCL achievable was 10 mm, corresponding to a direct hit through the center of a spherical lesion.
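Under the spherical-lesion assumption, the CCL is the length of the core segment inside the sphere, i.e. a clipped line-sphere intersection; a sketch with hypothetical names follows:

```python
# Sketch: CCL as the length of the needle-core segment inside a spherical
# lesion (10 mm diameter, so radius = 5.0 mm by default).
import numpy as np

def ccl(core_start, core_end, center, radius=5.0):
    seg = core_end - core_start
    L = np.linalg.norm(seg)
    u = seg / L
    w = core_start - center
    b, c = w @ u, w @ w - radius ** 2
    disc = b * b - c
    if disc <= 0.0:
        return 0.0                                  # trajectory misses the lesion
    t0, t1 = -b - np.sqrt(disc), -b + np.sqrt(disc)
    return max(0.0, min(t1, L) - max(t0, 0.0))      # chord clipped to the core
```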

III. Results

Using the method described in Section II.C to reconstruct the 3D TRUS image, the root-mean-square (RMS) distance error in locating the wires was estimated to be 0.3 mm, based on the manual localization of 30 wires visible in 10 acquired frames (2-8 wires per frame). Further verification of the 3D US image reconstruction accuracy was performed by comparing the volumes of a phantom prostate imaged using US and MRI (details of the MRI acquisition and the phantom are provided in Section II.H). A difference in lesion volume of 3% (~16 mm³) and an average prostate surface distance of 0.4 mm (~1.5 voxels, after removing the rigid translation and rotation) were found. These results compare well with those of similar experiments described by Fenster et al. [28]. For calibrating the tracking system (Section II.D), a total of 36 US volumes with a cubic voxel size of 0.23 mm³ were obtained at different probe positions, as illustrated in Fig. 4. The RMS residual error in the reconstructed 3D pinhead location was 1.67 mm. For determining the virtual lesion positions in the validation TRUS volume (Section II.H), a mean RMS error of 0.2 mm with a range of 0.0-0.4 mm was obtained. This is significantly smaller than both the registration error and the overall targeting error (t-tests, both p<0.0001).

The results from the targeted biopsy experiment are summarized in Table 1 and Fig. 12. Each computer-assisted biopsy took on average ~1.3 minutes per needle placement (Steps 2-7, described in Section II.G). The mean±SD needle placement errors using measures d1, d2 and d3 for all of the biopsies performed using 3D computer-assisted targeting were 3.2±2.4, 8.9±5.6 and 3.6±4.0 mm, respectively. The results in Table 1 suggest that the needle placement accuracy was higher overall when targeting US-visible lesions compared with isoechoic lesions, which is to be expected since direct targeting of a US-visible target was possible, whereas targeting a virtual lesion relies primarily on 3D perception of the anatomy and, in the case of 3D computer-assisted targeting, real-time graphical feedback on the current needle trajectory relative to the target. Applying a non-parametric two-sample Kolmogorov-Smirnov (K-S) test and a Student’s t-test confirmed this finding statistically for 2D visual targeting, but no statistically significant difference was found for 3D computer-assisted targeting (p=0.47, p=0.70, p=0.47 for the K-S test and p=0.44, p=0.43, p=0.32 for the t-test for d1, d2, and d3, respectively; null hypothesis: no difference in accuracy). Furthermore, no statistically significant difference was found between the needle placement errors for 2D visual targeting of US-visible lesions compared with 3D computer-assisted targeting of the same lesions (K-S test: p=0.28, p=0.70, p=0.47; t-test: p=0.60, p=0.51, p=0.80 for d1, d2, d3, respectively). Comparing the needle placement errors for the cases where only virtual lesions were targeted using conventional versus 3D computer-assisted guidance – representative of the most common clinical scenario, in which an MRI-identified target lesion cannot be distinguished independently within B-mode US images – revealed that d1 and d3 were significantly lower (p<0.0001) when computer-assisted targeting was used. However, d2 failed to show a significant improvement at the same confidence level (α=0.05), with p=0.1073.
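For reference, comparisons of this kind can be reproduced with standard statistical tools; in the sketch below the scipy functions are real, but the two error samples are random placeholders rather than the study data:

```python
# Sketch: two-sample K-S test and Student's t-test on two sets of d1 errors.
import numpy as np
from scipy.stats import ks_2samp, ttest_ind

rng = np.random.default_rng(0)
d1_visual = rng.normal(5.9, 3.5, 27)   # placeholder "visual targeting" errors
d1_guided = rng.normal(3.5, 2.2, 27)   # placeholder "3D guidance" errors

print(ks_2samp(d1_visual, d1_guided).pvalue)
print(ttest_ind(d1_visual, d1_guided).pvalue)
```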

Table I.

Summary of Validation Results

Targeting methods Types of target Number of trials (N) d1 (mm) [5%-95%] d2 (mm) [5%-95%] d3 (mm) [5%-95%] Hit rate 1 (%) Hit rate 2 (%) CCL (mm) [5%-95% CI]
2D Visual targeting US-visible lesions 27 2.7 [0.9-6.3] 9.2 [2.8-20.1] 3.3 [1.0-9.6] 85.2 (23/27) 100 6.4 [0.0-9.5]
Virtual lesions 27 5.9 [1.4-13.7] 10.7 [2.4-20.8] 6.4 [1.4-15.2] 37.0 (10/27) 66.7 2.8 [0.0-9.4]
Both lesions 54 4.3 [1.0-11.5] 10.0 [2.5-20.2] 4.8 [1.0-14.4] 61.1 (33/54) 83.3 4.6 [0.0-9.5]
3D Computer-assisted targeting US-visible lesions 27 3.0 [0.5-7.1] 8.2 [1.5-16.9] 3.0 [0.5-7.1] 88.9 (24/27) 100 6.5 [0.0-9.7]
Virtual lesions 27 3.5 [0.5-7.8] 9.5 [1.4-21.0] 4.1 [0.5-10.3] 77.8 (21/27) 100 5.6 [0.0-9.7]
Both lesions 54 3.2 [0.5-6.8] 8.9 [1.5-18.1] 3.6 [0.5-7.0] 83.3 (45/54) 100 6.2 [0.0-9.6]

Mean needle placement errors, lesion hit rates, and CCL calculated for lesion-targeted biopsies of a phantom using MRI-derived biopsy plans. Confidence intervals are provided in square brackets. Hit rate 1 – percentage of lesions sampled during a single biopsy needle deployment per lesion; Hit rate 2 – percentage of lesions sampled during three biopsy needle deployments per lesion.

Fig. 12. Histograms of needle-to-target distance errors, d1, d2 and d3 (left to right, respectively), calculated for biopsies of a phantom performed with and without the 3D guidance (i.e. visual 2D versus 3D computer-assisted targeting; top and middle rows, respectively). The histograms on the bottom row show the difference in errors between these two targeting methods.

Further analysis of the components of the signed distance error vectors reveals that the mean±SD of the d2 components for 3D computer-assisted biopsies were -0.8±1.7, 7.5±6.9 and 0.5±1.7 mm in the x-, y- and z directions, respectively. The corresponding values for conventional 2D targeted biopsy were 0.3±3.6, 7.8±8.0 and 1.2±3.7 mm, respectively. The distance errors, including d2, are therefore dominated by the error in the y-direction, as illustrated in Fig. 13. For the phantom used in this study, the x-, y-, and z-axis directions corresponded anatomically to the apex-base, anterior-posterior and left-right directions of the prostate, respectively, with the direction of (transrectal) needle insertion approximately aligned with the y-axis. The error in this direction is likely to result partly from the imprecision of judging the depth of needle placement and partly from the standard clinical technique, in which the needle is inserted so that the target tissue region (lesion) lies within the range of the tissue sample but is not necessarily centered within that sample. In practice, it is often difficult to determine the needle-tip location accurately by eye using B-mode US images (or using the needle graduations, as in the experiments reported here), and because the length of the sample is typically 15-20 mm, errors in d2 larger than 5 mm are theoretically possible when targeting a 10 mm diameter (spherical) lesion even if a direct hit is achieved and the lesion lies within the tissue sampling range of the needle.

Fig. 13. Orthogonal components of the needle targeting errors (d2) with and without computer-assisted guidance are plotted on the left and right, respectively. The black points are errors plotted in 3D coordinates. The blue, green and red surfaces represent the multivariate Gaussian confidence regions within ±1σ, ±2σ and ±3σ, respectively, where σ is the standard deviation. Based on the experimental set-up, the X, Y and Z coordinate axes correspond roughly to the anatomical apex-base, anterior-posterior and left-right directions, respectively.

Inspection of the results for both lesion types presented in Fig. 13 reveals the following: (i) there is significantly greater variance in the y-direction (paired Chi-squared variance test, all p-values<0.0001) both with and without the 3D guidance system; (ii) computer-assisted 3D guidance significantly reduced the errors in the x- and z-axis directions, orthogonal to the direction of needle insertion (two-sample F-test, p-values of 0.0003 and 0.0001, respectively); and (iii) introducing depth control by providing the user with an estimated depth of insertion may not improve the precision in the y-direction (two-sample F-test, p=0.1819), because the observed uncertainty in needle insertion depth is likely due to a combination of human error (as discussed above) and the difficulty of accurately identifying the needle tip in the 3D US volumes used for validation.

In terms of lesion hit rate, 83% (45 out of 54) of needle deployments performed using the computer-assisted guidance system hit the respective lesion target, versus 61% (33 out of 54) performed using visual targeting. A clinically relevant finding is that a 100% hit rate was achieved using three needle deployments per lesion with 3D computer-assisted guidance. Further analysis revealed that in 2 cases where US-visible lesions were targeted using conventional 2D visual targeting, the needle passed through the lesion but a lesion hit was not registered because no part of the lesion lay within the sample capture region. In another 2 cases, the trajectory of the needle was such that the needle did not pass through the target lesion. In no case was an incorrect lesion sampled (this would in any case be registered as a “miss” because the CCL would be zero).

Using 3D guidance to target virtual lesions yielded a mean (±SD) CCL of 5.6±3.8 mm, which is significantly greater than the CCL of 2.8±3.8 mm obtained using conventional visual targeting (two-sample K-S test, p=0.0003). This finding, if reproduced in vivo, has important clinical implications as the CCL is a well-established factor in prostate cancer risk assessment. Another interesting finding was that the biopsy efficiency, in terms of the percentages of cancer-positive cores for single and triple needle deployments per lesion (hit rates 1 and 2 in Table 1), improved from 37.0% and 66.7% (visual targeting of virtual lesions) to 83.3% and 100% (computer-assisted targeting of all lesions), respectively.

For the purposes of comparison with previously published work [35], the target registration error (TRE) was also calculated for all the lesion targeting experiments by computing the distance between the segmented lesion center, defined in the 3D US volume, and the registered lesion center, calculated after transforming from MRI coordinates to US volume coordinates. Using this definition, the RMS TRE was 2.0±1.0 mm, which compares well with that found in our previous work focusing on transperineal biopsy [35].

IV. Discussion

In this paper, we have described a system for targeted transrectal prostate biopsy that combines 3D US imaging using a motorized 3D/4D transducer, a deformable 3D model-to-image registration algorithm, and optical TRUS probe tracking to provide the user with feedback on the predicted needle trajectory relative to one or more target lesions defined by a surgical plan based on MRI data. The results of experiments to estimate the accuracy of targeting 0.5 cc spherical lesions within a phantom, identified on MRI images, revealed mean (±SD) errors of 3.6±4.0 mm and 3.2±2.4 mm, measured as the distance between the needle sample core section and the target lesion center (d3), and the closest distance between the needle trajectory and the center of the target lesion (d1), respectively. The magnitudes of these errors are comparable with those of other similar studies in the literature, although a number of different error measures have been adopted by different research groups: most studies report what is often referred to as the “overall targeting error”, defined as the minimum distance between a target and the needle (or needle trajectory), as the primary accuracy measure [23], [24], [27], [45]. This measure is generally equivalent to d1 reported in the present study. Xu et al. report an accuracy of 2.4±1.2 mm using this measure [24]. Ukimura et al. [23] report a similar measure for phantom-based validation of the commercially available Koelis system (Koelis, La Tronche, France), but derive the overall targeting errors by combining a procedural targeting error and a registration error, assuming that these errors are independent and additive. The resulting mean errors are 2.35 mm and 2.92 mm for targeting lesions visible and invisible in US images, respectively. Another phantom-based validation of a system that employs a mechanical arm to track the TRUS probe found a mean (±SD) “needle guidance error” of 2.13±1.28 mm [27]. This measure appears to be identical to d1 in our study, but could equally be closer to d3 since the biopsy core is used as a reference. Bax et al. [27] also report a “biopsy localization error” of 3.87±1.81 mm, defined in the same way as d2, indicating superior depth control, which may be attributed to the use of a mechanical arm to stabilize the TRUS probe.

Whilst the needle insertion depth calculated by our guidance software, used in conjunction with the depth markers on a standard biopsy needle, provides a simple and practical means of determining the insertion depth without additional needle tracking, it is evident from the analysis of the relative magnitudes of the needle-tip placement error components that the error was largest in the direction of needle insertion (see Section III and Fig. 13). This is likely to be due to the limited accuracy with which a needle can be inserted manually using the depth markers as a guide, in addition to motion of the target caused by needle insertion and firing of the biopsy gun. Furthermore, the estimated insertion depth is calculated under the assumption that the needle tip follows a straight-line trajectory, so needle deflection is not taken into account. Nevertheless, because the primary aim during a biopsy procedure is to ensure that the extended needle passes through the lesion upon firing the biopsy gun, and the sample core length (15-20 mm) is larger than the lesion diameter when sampling small lesions – a situation where computer-assisted targeting potentially confers most clinical benefit – the needle insertion depth error is arguably not as critical as the errors in the perpendicular directions, which determine the needle trajectory. The results for the predicted tissue CCL and cancer hit rates, summarized in Table 1, suggest that the overall targeting performance was improved significantly by the proposed system despite the lack of a significant change in the y-direction error component of d2, but further studies are required to investigate this issue and possibly to develop a more accurate method for needle-tip depth control.

In common with other accuracy validation studies reported in the literature, as well as those carried out for CE-marking and FDA approval of commercial guidance systems, a US/MRI-compatible prostate phantom was used in this study to investigate targeting errors. Such phantoms have the benefit of providing a well-controlled environment for validation experiments, and contain clearly distinguished structures with a known geometry that are not subject to the imaging artefacts found in clinical practice. However, these characteristics mean that the errors are not necessarily indicative of those encountered in clinical practice, where poorer image quality, and therefore poorer anatomical visualization, can adversely affect biopsy targeting. Furthermore, although the phantoms used in this and most other studies move and deform in response to TRUS transducer pressure and needle insertion, this motion is not representative of in vivo prostate motion, which introduces a further significant source of error. A phantom with mechanical characteristics closer to those of in vivo tissue, such as the one described by Hungr et al. [46], would allow the impact of tissue deformation on needle placement accuracy to be studied under more realistic conditions. Nevertheless, evidence from an earlier study evaluating our plan-to-image registration algorithm using patient MR/3D US data from TRUS-guided transperineal biopsy and HIFU procedures [35] demonstrated a median target registration error of 2.40 mm, which is comparable to the registration accuracy found in this study (a mean of 2.0 mm). It is also important to note that the reported validation results are based on a manual segmentation of the prostate phantom and biopsy needles, which introduces localization errors. To investigate the influence of such errors on d1, d2 and d3, 10,000 Monte-Carlo simulations were run in which independent, isotropic 3D Gaussian noise was added to the manually identified coordinates of the lesion centers and the needle tips. As the error level increased to 2 mm RMS, no statistically significant difference was found between the resulting overall targeting errors and those reported in Table 1, with p-values of 0.11, 0.81 and 0.08 based on K-S tests over all the lesions. For example, for computer-assisted targeting of the virtual lesions (corresponding to the 5th row in Table 1), the mean errors and [5%-95%] CIs were 4.0 [0.9-8.3], 9.8 [2.3-19.7] and 4.6 [0.9-8.8] mm, with p-values of 0.61, 0.96 and 0.85 when testing for a significant difference in d1, d2 and d3, respectively. Given that the reported 3D TRUS reconstruction error was ~0.2 mm, which includes a similar line/boundary segmentation error as well as the TRUS reconstruction error, we can reasonably conclude that, in this case, the 3D US needle-tip and lesion-center localization errors have no significant effect on the overall targeting errors.
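A sketch of this Monte-Carlo robustness check is given below (d1 only, with our own names): isotropic Gaussian noise is added to the identified tip and lesion-center coordinates and the error is recomputed for each run:

```python
# Sketch: perturb the manually identified needle tips and lesion centers with
# isotropic 3D Gaussian noise and recompute d1 for each Monte-Carlo run.
import numpy as np

def monte_carlo_d1(entries, tips, targets, sigma, n_runs=10000, seed=0):
    rng = np.random.default_rng(seed)
    samples = []
    for _ in range(n_runs):
        tips_n = tips + rng.normal(0.0, sigma, tips.shape)
        targets_n = targets + rng.normal(0.0, sigma, targets.shape)
        u = tips_n - entries
        u /= np.linalg.norm(u, axis=1, keepdims=True)
        samples.append(np.linalg.norm(np.cross(targets_n - entries, u), axis=1))
    return np.asarray(samples)   # (n_runs, n_biopsies) perturbed d1 values
```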

A number of commercial biopsy guidance systems employ EM tracking as a means of localizing a US probe in 3D in real time. Examples include the Philips PercuNav system (Philips Healthcare, Best, The Netherlands), the InVivo UroNav (InVivo, Gainesville, FL, USA), the Hitachi HI-RVS (Hitachi Medical Systems, Northamptonshire, UK), and the Ultrasonix SonixGPS (Analogic Corp., MA, USA) devices. A key design decision for the system described in this paper was the adoption of optical tracking. This decision was largely motivated by our experience of attempting to use EM tracking, which resulted in noisy tracking measurements and unacceptably high tracking errors (>5 mm / 5°); using an NDI Aurora EM tracker (Northern Digital Inc., Ontario, Canada), it was not possible to measure the sensor position within 10 cm of the US probe while the transducer motor was active, with a "Bad Fit" error being reported by the system. We attribute these findings largely to the presence of an electric motor in the specific type of transducer used, which generates both static and oscillating EM fields in the vicinity of the EM sensor. Similar observations were reported in [36], and it appears that at least some commercial EM tracking devices can be very sensitive to EM radiation generated by certain types of US probe. Optical probe/instrument tracking has been employed successfully in other surgical navigation applications [39], and is generally accepted to achieve superior accuracy and robustness (in terms of relative insensitivity to environmental factors) in real-world clinical environments compared with EM tracking [37]. Although EM tracking devices can achieve errors within 2 mm / 2° with careful calibration and set-up, the difficulty of identifying and controlling the significant sources of interference present in clinical environments, so that consistently high accuracy is maintained, presents a barrier to the widespread clinical adoption of such technologies, at least for applications where high accuracy is required. Apart from optical occlusion, which is easily detected by the control software, the accuracy of optical tracking devices, such as the one used in this study, is highly stable by comparison. Therefore, although EM tracking provides a low-cost and convenient solution for US-based surgical guidance systems, optical tracking offers the potential advantages of wider compatibility with different types of US transducer, higher accuracy, and greater reliability in clinical use. An important practical issue, however, is maintaining the line of sight between the markers mounted on the US probe and the tracking camera. We found that maintaining a line of sight is feasible for tumor-targeted biopsy, although further research is required to optimize clinical protocols and to identify sub-groups of patients in whom line-of-sight issues are problematic. Added flexibility in maneuvering the US probe may be achieved by employing additional tracking markers and/or cameras.

We found that adequate US image quality during the initial volume acquisition is important to achieve a good initial registration with an MRI-derived 3D model. In particular, the boundary of the prostate should be distinguishable and the prostate should fall within the field of view of the reconstructed volume. The system incorporates a simple initialization step in which the TRUS probe is orientated manually so that the prostate lies within the field of view of the first captured TRUS volume and is aligned approximately with the plan, without requiring capsule segmentation or the identification of anatomical landmark points in US images. This procedure provides an approximate starting estimate for the automatic image registration algorithm and can be performed easily and quickly by a clinical operator. If the prostate does not fit entirely within the acquired TRUS volume, the (plan-to-image) registration accuracy could be compromised. In such cases, the operator has the option either to perform a rigid registration rather than a deformable one, which is likely to be more robust to the missing data, or to register the biopsy plan manually by moving the TRUS probe and using the probe tracking data to reposition the prostate model. Neither measure was required for the phantom procedures reported here.
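For illustration, the tracking-based manual registration fallback mentioned above amounts to composing rigid transforms. The following sketch (Python, 4x4 homogeneous matrices; the function and variable names are hypothetical, not the system's interface) applies the probe's tracked motion since the initial registration to the prostate model.

```python
# Hypothetical sketch (4x4 homogeneous transforms): carry the prostate
# model along with the probe's tracked rigid motion when the operator
# re-registers manually by moving the TRUS probe.
import numpy as np

def reposition_model(T_model_world, T_probe_initial, T_probe_current):
    """All arguments are 4x4 world-space transforms: the model pose at the
    initial registration and the tracked probe poses then and now."""
    # Rigid motion of the probe since the initial registration.
    T_delta = T_probe_current @ np.linalg.inv(T_probe_initial)
    # Carry the model along with that motion.
    return T_delta @ T_model_world

# Example: a 5 mm probe translation along x moves the model by the same amount.
T0, T1 = np.eye(4), np.eye(4)
T1[0, 3] = 5.0
print(reposition_model(np.eye(4), T0, T1)[0, 3])  # 5.0
```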

Following a successful initial registration, the subsequent use of 3D transducer tracking with graphical feedback means that navigation is much less dependent on operator skill and US image quality during a biopsy procedure, as indicated by the significant reduction in the variance of the targeting error when computer-assisted 3D guidance was used. The relative lack of dependence on image quality to visualize the prostate for navigational purposes means that, after the initial volume acquisition, applying increased probe pressure to improve acoustic coupling and image quality as the probe is re-orientated is unnecessary, which is likely to reduce patient discomfort. However, it is still recommended that the operator refer to the real-time 2D TRUS images provided by the US scanner during a 3D-guided procedure, particularly during needle insertion, to provide additional information to verify the probe/needle location and ensure safe usage. The use of navigational feedback in the form of a 3D graphical display of the live position and orientation of the probe and the predicted needle trajectory relative to the current plan/target differs from the approach used in many other systems, such as the Ultrasonix SonixGPS (Analogic Corp., MA, USA), which present this information as a graphical overlay on live 2D US images. Displaying a 3D representation of the probe, needle, gland, and biopsy plan in this way provides an intuitive and easily controlled means of navigation, with the benefit that the need to capture US images in real time and synchronize them with tracking data is avoided.
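As an illustration of the navigational computation underlying such a display, the sketch below (Python; the guide-calibration convention and all names are assumptions, not the system's actual interface) derives the predicted needle trajectory from the live tracked probe pose and a fixed needle-guide calibration, and computes the perpendicular target-to-trajectory distance that could be fed back to the operator.

```python
# Hypothetical sketch of the trajectory-prediction step behind a 3D
# guidance display: map a calibrated needle-guide transform through the
# live probe pose, then measure the target-to-trajectory distance.
import numpy as np

def predicted_trajectory(T_probe_world, T_guide_probe):
    """Return (origin, unit direction) of the needle line in world space.
    T_guide_probe is a 4x4 guide-to-probe calibration whose z-axis is
    assumed to lie along the needle."""
    T_guide_world = T_probe_world @ T_guide_probe
    origin = T_guide_world[:3, 3]
    direction = T_guide_world[:3, 2]
    return origin, direction / np.linalg.norm(direction)

def target_to_trajectory_mm(target, origin, direction):
    """Perpendicular distance (mm) from a planned target to the line."""
    v = target - origin
    return float(np.linalg.norm(v - np.dot(v, direction) * direction))

# Example: a target 3 mm off a needle line running along z from the origin.
o, d = predicted_trajectory(np.eye(4), np.eye(4))
print(target_to_trajectory_mm(np.array([3.0, 0.0, 10.0]), o, d))  # 3.0
```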

V. Conclusion

The proposed guidance system and accompanying workflow were found to be clinically feasible in laboratory phantom studies, but further work is required to evaluate the system fully using patient data and to provide clinical evidence supporting its effectiveness for targeted transrectal prostate biopsy. This will be the focus of future research.

Acknowledgments

This project was funded by a Prostate Cancer UK research project grant (Grant Code PG10-30), with additional financial support from the Health Innovation Challenge (HIC) Fund (Grant Ref. HICF-T4-310). V. Kasivisvanathan is funded by a Doctoral Research Fellowship (Ref. DRF-2014-07-146) from the National Institute for Health Research. This publication presents independent research supported by the HIC Fund, a parallel funding partnership between the Department of Health and the Wellcome Trust. The views expressed in this publication are those of the authors and not necessarily those of the Department of Health, the Wellcome Trust, the NHS, or the National Institute for Health Research. The research was undertaken at UCL/UCLH, which received a proportion of funding from the Department of Health's NIHR Biomedical Research Centres funding scheme. The authors would like to thank Dr. David Atkinson of the UCL Centre for Medical Imaging for his help with MR imaging of the phantom used in this study.

Contributor Information

Yipeng Hu, UCL Centre for Medical Image Computing, Dept. of Medical Physics and Biomedical Engineering, University College London, 2nd Floor, Malet Place Engineering Building, Gower Street, London WC1E 6BT.

Veeru Kasivisvanathan, Division of Surgical and Interventional Sciences, University College London, 4th Floor, 74 Huntley Street, London WC1E 6AU.

Lucy A. M. Simmons, Division of Surgical and Interventional Sciences, University College London, 4th Floor, 74 Huntley Street, London WC1E 6AU.

Matthew J. Clarkson, UCL Centre for Medical Image Computing, Dept. of Medical Physics and Biomedical Engineering, University College London, 2nd Floor, Malet Place Engineering Building, Gower Street, London WC1E 6BT.

Stephen A. Thompson, UCL Centre for Medical Image Computing, Dept. of Medical Physics and Biomedical Engineering, University College London, 2nd Floor, Malet Place Engineering Building, Gower Street, London WC1E 6BT.

Taimur T. Shah, Division of Surgical and Interventional Sciences, University College London, 4th Floor, 74 Huntley Street, London WC1E 6AU.

Hashim U. Ahmed, Division of Surgical and Interventional Sciences, University College London, 4th Floor, 74 Huntley Street, London WC1E 6AU.

Shonit Punwani, UCL Centre for Medical Imaging, Division of Medicine, University College London, 3rd Floor, 250 Euston Road, London NW1 2PG.

David J. Hawkes, UCL Centre for Medical Image Computing, Dept. of Medical Physics and Biomedical Engineering, University College London, 2nd Floor, Malet Place Engineering Building, Gower Street, London WC1E 6BT.

Mark Emberton, Division of Surgical and Interventional Sciences, University College London, 4th Floor, 74 Huntley Street, London WC1E 6AU.

Caroline M. Moore, Division of Surgical and Interventional Sciences, University College London, 4th Floor, 74 Huntley Street, London WC1E 6AU.

Dean C. Barratt, UCL Centre for Medical Image Computing, Dept. of Medical Physics and Biomedical Engineering, University College London, 2nd Floor, Malet Place Engineering Building, Gower Street, London WC1E 6BT.

References

[1] Siegel R, Miller K, Jemal A. Cancer statistics, 2015. CA Cancer J Clin. 2015;65(1):29. doi: 10.3322/caac.21254.
[2] Djavan B, Margreiter M. Biopsy standards for detection of prostate cancer. World J Urol. 2007;25(1):11–17. doi: 10.1007/s00345-007-0151-1.
[3] Mottet N, et al. EAU guidelines on prostate cancer. Part II: Treatment of advanced, relapsing, and castration-resistant prostate cancer. Eur Urol. 2011;59(4):572–83. doi: 10.1016/j.eururo.2011.01.025.
[4] van Nagell JR, et al. Ultrasound and assessment of ovarian cancer risk. Cancer. 2013;37(2):408–14.
[5] Scattoni V, et al. Extended and saturation prostatic biopsy in the diagnosis and characterisation of prostate cancer: a critical analysis of the literature. Eur Urol. 2007 Nov;52(5):1309–1322. doi: 10.1016/j.eururo.2007.08.006.
[6] Cookson MS, et al. Correlation between Gleason score of needle biopsy and radical prostatectomy specimen: Accuracy and clinical implications. J Urol. 1997 Feb;157(2):559–562.
[7] King CR, et al. Extended prostate biopsy scheme improves reliability of Gleason grading: Implications for radiotherapy patients. Int J Radiat Oncol Biol Phys. 2004 Jul;59(2):386–391. doi: 10.1016/j.ijrobp.2003.10.014.
[8] Cooperberg MR, et al. The changing face of low-risk prostate cancer: Trends in clinical presentation and primary management. J Clin Oncol. 2004 Jun;22(11):2141–2149. doi: 10.1200/JCO.2004.10.062.
[9] Moyer VA. Screening for prostate cancer: US Preventive Services Task Force recommendation statement. Ann Intern Med. 2012 Jul;157(2):120–134. doi: 10.7326/0003-4819-157-2-201207170-00459.
[10] Dickinson L, et al. Magnetic resonance imaging for the detection, localisation, and characterisation of prostate cancer: Recommendations from a European consensus meeting. Eur Urol. 2011 Apr;59(4):477–494. doi: 10.1016/j.eururo.2010.12.009.
[11] Robertson NL, Emberton M, Moore CM. MRI-targeted prostate biopsy: a review of technique and results. Nat Rev Urol. 2013 Oct;10(10):589–97. doi: 10.1038/nrurol.2013.196.
[12] Moore CM, et al. Image-guided prostate biopsy using magnetic resonance imaging-derived targets: A systematic review. Eur Urol. 2013;63(1):125–140. doi: 10.1016/j.eururo.2012.06.004.
[13] Baco E, et al. A randomized controlled trial to assess and compare the outcomes of two-core prostate biopsy guided by fused magnetic resonance and transrectal ultrasound images and traditional 12-core systematic biopsy. Eur Urol. 2016;69(1):149–156. doi: 10.1016/j.eururo.2015.03.041.
[14] Siddiqui MM, et al. Comparison of MR/ultrasound fusion-guided biopsy with ultrasound-guided biopsy for the diagnosis of prostate cancer. JAMA. 2015 Jan;313(4):390–7. doi: 10.1001/jama.2014.17942.
[15] Sonn GA, et al. Targeted biopsy in the detection of prostate cancer using an office based magnetic resonance ultrasound fusion device. J Urol. 2013;189(1):86–91. doi: 10.1016/j.juro.2012.08.095.
[16] Marks L, Young S, Natarajan S. MRI-ultrasound fusion for guidance of targeted prostate biopsy. Curr Opin Urol. 2013 Jan;23(1):43–50. doi: 10.1097/MOU.0b013e32835ad3ee.
[17] Pondman KM, et al. MR-guided biopsy of the prostate: An overview of techniques and a systematic review. Eur Urol. 2008;54(3):517–527. doi: 10.1016/j.eururo.2008.06.001.
[18] Mozer PC, Partin AW, Stoianovici D. Robotic image-guided needle interventions of the prostate. Rev Urol. 2009 Jan;11(1):7–15.
[19] Ho H, et al. Robotic transperineal prostate biopsy: Pilot clinical study. Urology. 2011 Nov;78(5):1203–1208. doi: 10.1016/j.urology.2011.07.1389.
[20] Cool D, et al. Design and evaluation of a 3D transrectal ultrasound prostate biopsy system. Med Phys. 2008;35(10):4695–4707. doi: 10.1118/1.2977542.
[21] Guo Y, et al. Image registration accuracy of a 3-dimensional transrectal ultrasound-guided prostate biopsy system. J Ultrasound Med. 2009;28(11):1561–1568. doi: 10.7863/jum.2009.28.11.1561.
[22] Long J-A, et al. Real-time three-dimensional (4D) ultrasound-guided prostatic biopsies on a phantom: comparative study versus 2D guidance. Prog Urol. 2007;17(7):1337–1342. doi: 10.1016/s1166-7087(07)78573-1.
[23] Ukimura O, et al. 3-dimensional elastic registration system of prostate biopsy location by real-time 3-dimensional transrectal ultrasound guidance with magnetic resonance/transrectal ultrasound image fusion. J Urol. 2012;187(3):1080–1086. doi: 10.1016/j.juro.2011.10.124.
[24] Xu S, et al. Real-time MRI-TRUS fusion for guidance of targeted prostate biopsies. Comput Aided Surg. 2008;13(5):255–264. doi: 10.1080/10929080802364645.
[25] Pinto PA, et al. Magnetic resonance imaging/ultrasound fusion guided prostate biopsy improves cancer detection following transrectal ultrasound biopsy and correlates with multiparametric magnetic resonance imaging. J Urol. 2011 Oct;186(4):1281–1285. doi: 10.1016/j.juro.2011.05.078.
[26] Logan JK, et al. Current status of magnetic resonance imaging (MRI) and ultrasonography fusion software platforms for guidance of prostate biopsies. BJU Int. 2014;114(5):641–652. doi: 10.1111/bju.12593.
[27] Bax J, et al. Mechanically assisted 3D ultrasound guided prostate biopsy system. Med Phys. 2008;35(12):5397–5410. doi: 10.1118/1.3002415.
[28] Fenster A, Downey DB, Cardinal HN. Three-dimensional ultrasound imaging. Phys Med Biol. 2001;46(5):R67–R99. doi: 10.1088/0031-9155/46/5/201.
[29] Baumann M, et al. Prostate biopsy assistance system with gland deformation estimation for enhanced precision. Lect Notes Comput Sci. 2009;5761(Part 1):67–74. doi: 10.1007/978-3-642-04268-3_9.
[30] Solberg OV, et al. Freehand 3D ultrasound reconstruction algorithms: a review. Ultrasound Med Biol. 2007;33(7):991–1009. doi: 10.1016/j.ultrasmedbio.2007.02.015.
[31] Treece G, et al. Correction of probe pressure artifacts in freehand 3D ultrasound. Lect Notes Comput Sci. 2001;2208:283–290.
[32] De Gorski A, et al. Accuracy of magnetic resonance imaging/ultrasound fusion targeted biopsies to diagnose clinically significant prostate cancer in enlarged compared to smaller prostates. J Urol. 2015;194(3):669–673. doi: 10.1016/j.juro.2015.03.025.
[33] Hu Y, et al. Modelling prostate motion for data fusion during image-guided interventions. IEEE Trans Med Imaging. 2011;30(11):1887–1900. doi: 10.1109/TMI.2011.2158235.
[34] Hu Y, et al. Population-based prediction of subject-specific prostate deformation for MR-to-ultrasound image registration. Med Image Anal. 2015;26(1):332–344. doi: 10.1016/j.media.2015.10.006.
[35] Hu Y, et al. MR to ultrasound registration for image-guided prostate interventions. Med Image Anal. 2012;16:687–703. doi: 10.1016/j.media.2010.11.003.
[36] Hastenteufel M, et al. Effect of 3D ultrasound probes on the accuracy of electromagnetic tracking systems. Ultrasound Med Biol. 2006;32(9):1359–1368. doi: 10.1016/j.ultrasmedbio.2006.05.013.
[37] Franz AM, et al. Electromagnetic tracking in medicine: a review of technology, validation, and applications. IEEE Trans Med Imaging. 2014;33(8):1702–1725. doi: 10.1109/TMI.2014.2321777.
[38] Wiles AD, Thompson DG, Frantz DD. Accuracy assessment and interpretation for optical tracking systems. Proc SPIE Medical Imaging. 2004;5367:421–432.
[39] Khadem R, et al. Comparative tracking error analysis of five different optical tracking systems. Comput Aided Surg. 2000;5(2):98–107. doi: 10.1002/1097-0150(2000)5:2<98::AID-IGS4>3.0.CO;2-H.
[40] Barratt DC, et al. Accuracy of an electromagnetic three-dimensional ultrasound system for carotid artery imaging. Ultrasound Med Biol. 2001;27(10):1421–1425. doi: 10.1016/s0301-5629(01)00447-1.
[41] Poon TC, Rohling RN. Comparison of calibration methods for spatial tracking of a 3-D ultrasound probe. Ultrasound Med Biol. 2005;31(8):1095–1108. doi: 10.1016/j.ultrasmedbio.2005.04.003.
[42] Bergmeir C, et al. Comparing calibration approaches for 3D ultrasound probes. Int J Comput Assist Radiol Surg. 2009;4(2):203–213. doi: 10.1007/s11548-008-0258-x.
[43] Dickinson L, et al. Image-directed, tissue-preserving focal therapy of prostate cancer: a feasibility study of a novel deformable magnetic resonance-ultrasound (MR-US) registration system. BJU Int. 2013;112(5):594–601. doi: 10.1111/bju.12223.
[44] Hu Y, et al. Deformable vessel-based registration using landmark-guided coherent point drift. Med Imaging Augment Real. 2010:60–69.
[45] De Silva T, et al. Quantification of prostate deformation due to needle insertion during TRUS-guided biopsy. Lect Notes Comput Sci. 2010;6363(Part 3):213–220. doi: 10.1007/978-3-642-15711-0_27.
[46] Hungr N, et al. A realistic deformable prostate phantom for multimodal imaging and needle-insertion procedures. Med Phys. 2012;39(4):2031–2041. doi: 10.1118/1.3692179.