Author manuscript; available in PMC: 2010 Dec 1.
Published in final edited form as: Proc IEEE Int Symp Biomed Imaging. 2008 Jun 13;2008:780–783. doi: 10.1109/ISBI.2008.4541112

INTENSITY-BASED REGISTRATION OF PROSTATE BRACHYTHERAPY IMPLANTS AND ULTRASOUND

Z Karimaghaloo 1, G Fichtinger 2, DG Gobbi 2, EC Burdette 3, RN Rohling 4, P Abolmaesumi 1,2
PMCID: PMC2995282  NIHMSID: NIHMS247358  PMID: 21132062

Abstract

Purpose

In prostate brachytherapy, determining the 3D location of the seeds relative to surrounding structures is necessary for calculating dosimetry. Ultrasound imaging provides the ability to visualize soft tissues, and implanted seeds can be reconstructed from C-arm fluoroscopy. Registration between these two complementary modalities would allow immediate compensation for dosimetric deviations from the optimal implant plan.

Methods

We propose intensity-based registration between ultrasound and a reconstructed model of seeds from fluoroscopy. The ultrasound images are pre-processed with recursive thresholding and phase congruency. Then a 3D ultrasound volume is reconstructed and registered to the implant model using mutual information.

Results

A standard training phantom was implanted with 49 seeds. The average registration error between corresponding seeds, relative to the ground truth, was 0.09 mm. The effect of false positives in ultrasound was investigated by masking seeds in the fluoroscopy-reconstructed model. The mean registration error remained below 0.5 mm with up to 30% false positives.

Conclusion

Our method promises to be clinically adequate, as the clinical requirement for registration accuracy is 1.5 mm.

Index Terms: Prostate brachytherapy, Ultrasound, Fluoroscopy, Registration

1. INTRODUCTION

Prostate cancer is the most commonly diagnosed cancer and the second leading cause of cancer death in North American men [1]. Brachytherapy is a definitive treatment option for early stage prostate cancer. The treatment involves permanent implantation of radioactive sources (seeds) into the prostate where they deliver a highly localized radiation dose to destroy cancer. The effectiveness of brachytherapy primarily relies on our ability to adjust the dosimetry of seeds with respect to the prostate and surrounding tissues.

Ultrasound (US) imaging visualizes soft tissues, but it tends to be noisy and to suffer from artifacts; exact localization of the seeds in US images is therefore challenging. Previous studies have addressed segmentation and registration of seeds directly from ultrasound. Mitri et al. [2] used vibro-acoustography to vibrate and detect the seeds in a gel phantom. Yue et al. [4] utilized distributed constant false alarm rate (CFAR) processors and orientation-sensitive morphological filtering to locate the seeds in B-mode US images. Ding et al. [3] used the needle location to predict approximate seed locations and thereby constrain an elaborate local segmentation. Wen et al. [9] developed a seed detection method using the power spectrum of raw radio-frequency US data. None of these approaches has yielded a clinically viable solution for seed segmentation and reconstruction from US.

C-arm fluoroscopy is routinely used in brachytherapy for qualitative (visual) assessment of the implant. Recently, reliable and accurate reconstruction of seeds from fluoroscopy has also become possible [11, 14, 19]. As fluoroscopy does not show soft-tissue contrast, registration of fluoroscopy to ultrasound is a logical clinical alternative [5, 6, 7, 8, 15, 16].

Zaider et al. [6] suggested affixing radio-opaque fiducials to the US probe, thereby permanently altering standard clinical equipment. Jain et al. [15] proposed a precision-machined fiducial structure calibrated to the needle guide template. Gong et al. [5] used needle tips as fiducials for the registration. Since C-arm and US imaging are non-concurrent and seeds are known to dislocate due to mechanical forces and edema, these approaches are not sufficiently reliable or practical. Su et al. and Tutar et al. suggested point-based registration between two clouds of seeds, one reconstructed from fluoroscopy and one from US [7, 16]. Besides being highly sensitive to false positives and deformations, this approach also depends on exact segmentation of the seeds from ultrasound. However, the generally poor quality of US images (due to noise, speckle, acoustic decoupling, calcifications masquerading as seeds, shadowing, multiple reflections, etc.) makes seed segmentation highly unreliable and prone to error. Segmentation errors propagate to the final registration [16] or may even trap the registration algorithm in local minima [7]. In contrast to prior art, we apply intensity-based registration between the US volume and the seeds reconstructed from fluoroscopy. We pre-process the US data with recursive thresholding and phase congruency, then apply standard mutual information registration.

2. METHODOLOGY

Our processing pipeline is summarized in Figure 1. We preprocess the 2D US images, reconstruct a 3D US volume, and finally apply 3D rigid body mutual information registration to align the US volume with the cloud of seeds previously reconstructed from C-arm fluoroscopy.

Fig. 1. Registration framework.

2.1. Ultrasound Pre-processing

In ultrasound, low signal-to-noise ratio, shadows, reflections, etc. tend to obscure seeds and also produce artifacts that appear to be seeds, so-called false positives. Reducing noise and enhancing the features of true seeds are therefore highly desirable for reliable registration.

2.1.1. Noise Reduction

Based on the intuition that brighter areas in the image are more likely to contain seeds, all pixels below the average intensity of the US volume are set to black. The average intensity is calculated over the pixels within the smallest box containing the prostate boundary. The average is then recalculated and the procedure is repeated. After these successive thresholdings, intensity values are rescaled to lie between 0 and 1, producing a set of intensity-based probability images.
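As a concrete illustration, the recursive thresholding step might be sketched as follows (a minimal numpy sketch; the iteration count is an assumed parameter, and the prostate bounding box is replaced by the whole toy image):

```python
import numpy as np

def recursive_threshold(image, n_iters=2):
    """Recursive mean-thresholding sketch (assumed form of the paper's
    noise-reduction step; `n_iters` is a hypothetical parameter).

    Each pass zeroes pixels below the current mean of the surviving
    (non-zero) region, then recomputes the mean. Finally, intensities
    are rescaled to [0, 1] to act as a probability image.
    """
    out = image.astype(float).copy()
    for _ in range(n_iters):
        nonzero = out[out > 0]
        if nonzero.size == 0:
            break
        out[out < nonzero.mean()] = 0.0
    rng = out.max() - out.min()
    return (out - out.min()) / rng if rng > 0 else out

# Toy B-mode-like image: dim background with two bright "seeds".
img = np.full((32, 32), 20.0)
img[8, 8] = img[20, 24] = 250.0
prob = recursive_threshold(img)
```

On this toy image, the background is suppressed to 0 and the two bright seed pixels survive with probability 1.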

2.1.2. Phase Congruency Method

Hacihaliloglu et al. [12] previously showed that phase congruency is an effective tool for detecting the true bone edge location in US images. Building on this idea, we adapted phase congruency for pre-processing the US images to enhance the features of true seeds, i.e., to suppress artifacts and false-positive appearances. Phase congruency evaluates features based on the phase rather than the amplitude information of an image. Since it assigns each point a measure of significance that is invariant to image brightness or contrast, a constant threshold can be applied to extract feature points from the phase information; hence we applied a uniform threshold to all images [17]. For extracting seed-like regions from a single B-mode image, the phase congruency of pixels is informative: the more symmetric the phase of a region, the more likely it is a seed. To calculate phase congruency, two filters are applied: an even-symmetric and an odd-symmetric filter. A measure of symmetry is computed at each point from the weighted average of the coefficients produced by these two filters. At points of symmetry, the absolute value of the even-symmetric filter response is large while that of the odd-symmetric filter response is small. The measure of symmetry is thus defined as follows [17]:

PC(x) = \frac{\sum_{o}\sum_{n} W_o(x)\,\lfloor A_{no}(x)\,\Delta\Phi_{no}(x) - T_o \rfloor}{\sum_{o}\sum_{n} A_{no}(x) + \varepsilon}, (1)

where \lfloor \cdot \rfloor denotes zeroing of negative values, and

A_n(x)\,\Delta\Phi_n(x) = e_n(x)\,\overline{\phi_e(x)} + o_n(x)\,\overline{\phi_o(x)} - \left| e_n(x)\,\overline{\phi_o(x)} - o_n(x)\,\overline{\phi_e(x)} \right|, (2)

\overline{\phi_e(x)} = \frac{\sum_n e_n(x)}{E(x)}, \qquad \overline{\phi_o(x)} = \frac{\sum_n o_n(x)}{E(x)}. (3)

Here, o and n index the orientations and scales, whose numbers are found empirically; we used 4 scales and 6 orientations. W_o(x) is a weighting function, and T_o is a noise-compensation term calculated from the maximum response that could be generated by noise alone in each orientation, considered independently. The small term ε avoids division by zero. e_n(x) and o_n(x) are the responses of the even- and odd-symmetric filters, respectively, and E(x) is the local energy function. In this paper, we used the MATLAB® implementation of this algorithm provided by P. Kovesi1.
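The even/odd filter-pair idea can be illustrated with a single-scale, one-dimensional phase-symmetry sketch. This is a deliberate simplification of the multi-scale, multi-orientation Kovesi implementation the paper actually uses: here e(x) and o(x) come from the analytic signal rather than log-Gabor quadrature pairs, and the weighting and noise-compensation terms are omitted:

```python
import numpy as np

def analytic_signal(x):
    """Analytic signal via FFT (numpy-only stand-in for scipy.signal.hilbert)."""
    n = len(x)
    X = np.fft.fft(x)
    h = np.zeros(n)
    h[0] = 1
    if n % 2 == 0:
        h[n // 2] = 1
        h[1:n // 2] = 2
    else:
        h[1:(n + 1) // 2] = 2
    return np.fft.ifft(X * h)

def phase_symmetry(signal, eps=1e-8):
    """Single-scale 1D phase-symmetry sketch.

    e(x): even response = the zero-mean signal itself
    o(x): odd response  = its Hilbert transform
    Symmetry is high where |e| dominates |o|, echoing the paper's
    even/odd filter-pair construction with one scale and one
    orientation and no noise term.
    """
    s = signal - signal.mean()
    a = analytic_signal(s)
    e, o = a.real, a.imag
    amplitude = np.sqrt(e ** 2 + o ** 2)
    return np.maximum(np.abs(e) - np.abs(o), 0) / (amplitude + eps)

# A symmetric Gaussian "seed echo": symmetry should peak at its centre,
# where the even response is maximal and the odd response crosses zero.
x = np.arange(256)
sig = np.exp(-0.5 * ((x - 100) / 4.0) ** 2)
sym = phase_symmetry(sig)
```

Because the measure is a ratio of filter responses, it is largely insensitive to the overall brightness of the echo, which is why a constant threshold works across images.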

2.2. Registration

In the operating room, ultrasound imaging and fluoroscopy are performed almost concurrently. After ultrasound data collection, the probe is retracted from the rectum so as not to block the seeds during fluoroscopy, causing the prostate to sag, usually without apparent deformation. Thus, rigid registration should suffice. A 3D rigid body registration is performed between the pre-processed US volume and the 3D model of seeds reconstructed from fluoroscopy. Since exact localization of the seeds in US images cannot be achieved reliably, we chose intensity-based rather than point-based registration. Mutual information is used as the registration metric.
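To make the metric concrete, here is a minimal histogram-based mutual information computation. This is a sketch, not the Mattes formulation the paper uses via ITK, which additionally employs B-spline Parzen windowing and stochastic sampling:

```python
import numpy as np

def mutual_information(a, b, bins=32):
    """Mutual information between two equally-sized images, computed
    from their joint intensity histogram:
        MI = sum_xy p(x,y) * log( p(x,y) / (p(x) p(y)) )
    """
    joint, _, _ = np.histogram2d(a.ravel(), b.ravel(), bins=bins)
    pxy = joint / joint.sum()
    px = pxy.sum(axis=1, keepdims=True)   # marginal over a
    py = pxy.sum(axis=0, keepdims=True)   # marginal over b
    nz = pxy > 0                          # avoid log(0)
    return float(np.sum(pxy[nz] * np.log(pxy[nz] / (px @ py)[nz])))

# MI is maximal at alignment: an image shares more information with
# itself than with a spatially scrambled copy of itself.
rng = np.random.default_rng(0)
img = rng.random((64, 64))
scrambled = rng.permutation(img.ravel()).reshape(img.shape)
```

An intensity-based optimizer exploits exactly this behavior: as the US volume slides into alignment with the seed model, the joint histogram sharpens and MI rises.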

3. EXPERIMENTS

3.1. Experimental Setup

A standard brachytherapy training phantom (CIRS Inc., Virginia) was implanted with 49 non-radioactive seeds according to a clinically realistic implant plan. Para-sagittal US images of the phantom were captured with a linear probe operating at 6.6 MHz on a Sonix RP machine (Ultrasonix, Richmond, Canada). Dynamic reference bodies (DRBs) were attached to the US probe and to the phantom such that both were visible to an OPTOTRAK Certus camera (NDI, Waterloo, Canada), used as the tracking system.

3.2. Ground Truth

Six small metal fiducials were attached to the corners of the phantom box, and their spatial position was measured with respect to the DRB on the phantom using a calibrated stylus. Since these fiducials were also visible in the CT images, the ground truth for the registration was obtained based on these fiducials.

3.3. Ultrasound Volume Reconstruction

During the scanning procedure, the position of the US probe was recorded by the tracking system with respect to the coordinate frame of the DRB on the phantom [18]. Following the pre-processing of the individual US images described earlier, the images are compounded into a 3D volume, based on the tracking information, using the method of Gobbi et al. [13].
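A simplified, pixel-nearest-neighbour version of such tracked compounding might look like the following. This is a sketch under the assumption that each frame's 4×4 pose already folds in the probe calibration; the actual Gobbi et al. method uses splatting with alpha blending rather than this naive averaging:

```python
import numpy as np

def compound_volume(frames, poses, vol_shape, voxel_size=1.0):
    """Freehand compounding sketch: map each tracked 2D frame's pixels
    through its frame-to-volume transform and average into a voxel grid.
    """
    vol = np.zeros(vol_shape)
    cnt = np.zeros(vol_shape)
    for img, T in zip(frames, poses):
        h, w = img.shape
        ys, xs = np.mgrid[0:h, 0:w]
        # Homogeneous pixel coordinates in the image plane (z = 0).
        pts = np.stack([xs.ravel(), ys.ravel(),
                        np.zeros(h * w), np.ones(h * w)])
        # Transform to volume space, snap to nearest voxel.
        vox = np.round((T @ pts)[:3] / voxel_size).astype(int)
        ok = np.all((vox >= 0) & (vox < np.array(vol_shape)[:, None]), axis=0)
        i, j, k = vox[:, ok]
        np.add.at(vol, (i, j, k), img.ravel()[ok])
        np.add.at(cnt, (i, j, k), 1)
    # Average overlapping contributions; empty voxels stay zero.
    return np.where(cnt > 0, vol / np.maximum(cnt, 1), 0.0)

# Two parallel frames one voxel apart along z fill two slices.
frame = np.full((4, 4), 7.0)
T0 = np.eye(4)
T1 = np.eye(4); T1[2, 3] = 1.0
vol = compound_volume([frame, frame], [T0, T1], (4, 4, 4))
```

In the real pipeline the poses come from the OPTOTRAK tracking data expressed in the phantom DRB frame, so the compounded volume is already in a coordinate system suitable for registration.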

3.4. Implant Reconstruction

In the actual clinical setting, implanted brachytherapy seeds are reconstructed intra-operatively from C-arm fluoroscopy. In this study, we assumed that a reconstructed 3D model was available and simulated it with binary CT data. We obtained CT images of the implanted phantom with an in-plane resolution of 0.43 × 0.43 mm and an interpolated slice thickness of 0.625 mm. A constant threshold was applied to all images such that seeds were masked to white and everything else to black.

4. RESULTS AND DISCUSSION

To validate the robustness of our registration, we misaligned the CT and US volumes (relative to the ground truth) by applying a random transformation (±10° rotation, ±5 mm translation) to the US volume, and then registered the US volume back to the CT volume. Registration was performed using the Mattes mutual information method implemented in ITK 3.4 [20]. This process was repeated 100 times with different random transformations.
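The perturbation protocol can be sketched as follows (assumed details not specified in the text: per-axis uniform sampling within the stated bounds and an Rz·Ry·Rx composition order):

```python
import numpy as np

def random_rigid_transform(rng, max_angle_deg=10.0, max_trans_mm=5.0):
    """Draw one random rigid misalignment of the kind used for the
    validation: rotations uniform in +/-10 deg per axis, translations
    uniform in +/-5 mm per axis (composition order is an assumption)."""
    ax, ay, az = np.deg2rad(rng.uniform(-max_angle_deg, max_angle_deg, 3))
    t = rng.uniform(-max_trans_mm, max_trans_mm, 3)
    Rx = np.array([[1, 0, 0],
                   [0, np.cos(ax), -np.sin(ax)],
                   [0, np.sin(ax),  np.cos(ax)]])
    Ry = np.array([[ np.cos(ay), 0, np.sin(ay)],
                   [0, 1, 0],
                   [-np.sin(ay), 0, np.cos(ay)]])
    Rz = np.array([[np.cos(az), -np.sin(az), 0],
                   [np.sin(az),  np.cos(az), 0],
                   [0, 0, 1]])
    T = np.eye(4)
    T[:3, :3] = Rz @ Ry @ Rx
    T[:3, 3] = t
    return T

def mean_seed_error(seeds, T):
    """Mean displacement of the seed positions under transform T,
    i.e. the 'initial average error' for one trial."""
    homo = np.c_[seeds, np.ones(len(seeds))]
    moved = (T @ homo.T).T[:, :3]
    return float(np.linalg.norm(moved - seeds, axis=1).mean())

rng = np.random.default_rng(42)
seeds = rng.uniform(-20, 20, (49, 3))   # 49 hypothetical seeds, ~40 mm cube
errs = [mean_seed_error(seeds, random_rigid_transform(rng))
        for _ in range(100)]
```

After registration, the same distance measure, evaluated between the true and the recovered seed positions, yields the registration error reported below.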

The initial average error is the average distance between the correct positions of the seeds (given by the ground truth) and their perturbed positions. The registration error is the average distance between the correct positions of the seeds and their positions after registration. The average registration error over 100 trials was 0.09 mm. Each registration took approximately 3 minutes on a Dell desktop computer running at 2.8 GHz with 3.5 GB of RAM. To examine the effect of phase congruency on registration, we repeated the same experiments without the phase congruency filter, applying only noise reduction to the ultrasound images. The average registration error jumped to 4.2 mm, clearly attesting to the benefit of phase congruency.

To validate the robustness of our method to false positives in the US images, up to 15 seeds (about 30% of the total) in the CT volume were randomly masked. For each false-positive percentage, registration was repeated 100 times, applying the same random transformations to the US data as in the earlier experiments. As Table 1 shows, registration remained robust up to 30% false positives, with a maximum error of 0.52 mm.

Table 1.

Registration error for up to 30% false positives (FP) in the data set.

# of FP   FP %     Mean (Std) (mm)   Max Error (mm)   # of Failures
1         2.04%    0.08 (0.02)       0.22             1
2         4.08%    0.09 (0.01)       0.22             1
3         6.12%    0.11 (0.01)       0.23             1
4         8.16%    0.08 (0.02)       0.22             0
5         10.20%   0.09 (0.02)       0.22             0
6         12.24%   0.10 (0.02)       0.22             0
7         14.29%   0.11 (0.02)       0.26             0
8         16.33%   0.11 (0.02)       0.24             0
9         18.37%   0.16 (0.02)       0.36             0
10        20.41%   0.15 (0.03)       0.38             0
11        22.45%   0.14 (0.02)       0.30             0
12        24.49%   0.14 (0.02)       0.31             0
13        26.53%   0.18 (0.02)       0.42             0
14        28.57%   0.19 (0.02)       0.44             0
15        30.61%   0.22 (0.01)       0.52             0

In the current implementation, we assumed that the ultrasound and tracking data were accurately synchronized. This assumption was not necessarily valid and led to minor errors in alignment using the ground truth. In our new implementation, we have compensated for this error and experiments are underway to further analyze the accuracy of the registration technique given synchronized data sets.

The calculation of phase congruency is the most computationally intensive step of the proposed method. For each ultrasound image, this calculation currently takes about 13 seconds in MATLAB. Significant improvements in performance can be obtained by careful tuning of the parameters. For example, by recognizing the directionality of ultrasound images, it is possible to remove the phase calculations at orientations near 90° without significantly affecting accuracy. Reducing the orientations to only 0°, 30° and 150° reduces the computation time by half and increases the registration error by only 0.09 mm.

The clinical requirement for registration accuracy is about 1.5 mm, which raises strong hope that our method will perform adequately in actual clinical cases. Ethics board approval has been obtained for clinical validation of our method. We note that clinical data do not contain an apparent ground truth. Although we did not perform explicit segmentation of the seeds in the US images, a byproduct of our registration technique is, in fact, a segmentation of the seeds. In clinical cases, the accuracy of registration will therefore be validated by comparing our automated seed segmentation against manual segmentation by multiple expert clinicians.

Acknowledgments

We would like to thank the Canadian Institutes of Health Research, the Natural Sciences and Engineering Research Council and the National Institutes of Health (NIH 1R01CA111288) for funding this project.

Footnotes

1. Peter Kovesi, The University of Western Australia, http://www.csse.uwa.edu.au

REFERENCES

  • 1.Jemal A, Siegel R, Ward E, Murray T, Xu J, Smigal C, Thun MJ. Cancer Statistics 2006. CA: A Cancer J. Clinicians. 2006;vol. 56(2):106–130. doi: 10.3322/canjclin.56.2.106. [DOI] [PubMed] [Google Scholar]
  • 2.Mitri FG, Trompette P, Chapelon JY. Improving the use of vibro-acoustography for brachytherapy metal seed imaging: a feasibility study. IEEE Trans. Medical Imaging. 2004;vol. 23(1):1–6. doi: 10.1109/TMI.2003.819934. [DOI] [PubMed] [Google Scholar]
  • 3.Ding M, Wei Z, Gardi L, Downey DB, Fenster A. Needle and seed segmentation in intra-operative 3D ultrasound-guided prostate brachytherapy. IEEE Ultrasonics Symp. 2006;vol. 44:331–336. doi: 10.1016/j.ultras.2006.07.003. [DOI] [PubMed] [Google Scholar]
  • 4.Yue Y, Acton ST, Thornton K. Detection of radioactive seeds in ultrasound images of the prostate. IEEE Intl. Conf. Image Processing. 2001:319–322. [Google Scholar]
  • 5.Gong L, Cho PS, Han BH, Wallner KE, Sutlief SG, Pathak SD, Haynor DR, Kim Y. Ultrasonography and fluoroscopic fusion for prostate brachytherapy dosimetry. Intl. J. Radiation Biol. Phys. 2002;vol. 54:1322–1330. doi: 10.1016/s0360-3016(02)03754-9. [DOI] [PubMed] [Google Scholar]
  • 6.Zhang M, Zaider M, Worman MF, Cohen GN. On the question of 3D seed reconstruction in prostate brachytherapy: the determination of x-ray source and film locations. Phys. Med. Biol. 2004;vol. 49:335–345. doi: 10.1088/0031-9155/49/19/n03. [DOI] [PubMed] [Google Scholar]
  • 7.Su Y, Davis BJ, Furutani KM, Herman MG, Robb RA. Seed localization and TRUS-fluoroscopy fusion for intraoperative prostate brachytherapy dosimetry. Computer Aided Surgery. 2007;vol. 12(no. 1):25–34. doi: 10.3109/10929080601168239. [DOI] [PubMed] [Google Scholar]
  • 8.French D, Morris J, Keyes M, Salcudean SE. Real-time dosimetry for prostate brachytherapy using TRUS and fluoroscopy. Medical Image Computing and Computer-Assisted Intervention, LNCS 3217. 2004:983–991. [Google Scholar]
  • 9.Wen X, Salcudean SE, Lawrence PD. Detection of brachytherapy seeds using ultrasound radio frequency signals. SPIE Medical Imaging. 2006;vol. 6147:J1–J8. [Google Scholar]
  • 10.Han BH, Wallner K, Merrick G, Butler W, Sutlief S, Sylvester J. Prostate brachytherapy seed identification on post-implant TRUS images. Med. Phys. 2003;vol. 30:898–900. doi: 10.1118/1.1568976. [DOI] [PubMed] [Google Scholar]
  • 11.Jain AK, Zhou Y, Mustafa T, Burdette EC, Chirikjian GS, Fichtinger G. Matching and reconstruction of brachytherapy seeds using the Hungarian algorithm (MARSHAL). Med. Phys. 2005;vol. 32(no. 11):3475–3492. doi: 10.1118/1.2104087. [DOI] [PubMed] [Google Scholar]
  • 12.Hacihaliloglu I, Abugharbieh R, Hodgson AJ, Rohling RN. Enhancement of bone surface visualization from 3D ultrasound based on local phase information. IEEE Ultrasonics Symp. 2006:21–24. [Google Scholar]
  • 13.Gobbi DG, Peters TM. Interactive intraoperative 3D ultrasound reconstruction and visualization. Medical Image Computing and Computer-Assisted Intervention, LNCS 2489. 2002:156–163. [Google Scholar]
  • 14.Tubic D, Zaccarin A, Pouliot J, Beaulieu L. Automated seed detection and three-dimensional reconstruction. Med. Phys. 2002;vol. 28(no. 11):2265–2271. doi: 10.1118/1.1414308. [DOI] [PubMed] [Google Scholar]
  • 15.Jain AK, Mustafa T, Zhou Y, Burdette EC, Chirikjian GS, Fichtinger G. FTRAC-A robust fluoroscopy tracking fiducial. Med. Phys. 2005;vol. 32:3185–3198. doi: 10.1118/1.2047782. [DOI] [PubMed] [Google Scholar]
  • 16.Tutar IB, Narayanan S, Lenz H, Nurani R, Orio P, Cho PS, Wallner K, Kim Y. Seed-based ultrasound and fluoroscopy registration using iterative optimal assignment for intraoperative prostate brachytherapy dosimetry. SPIE Medical Imaging. 2004:65091–650914. [Google Scholar]
  • 17.Kovesi P. Image features from phase congruency. Videre: Journal of Computer Vision Research. 1999;vol. 1(no. 3):1–27. MIT Press. [Google Scholar]
  • 18.Chen TK, Abolmaesumi P, Thurston AD, Ellis RE. Automated 3D freehand ultrasound calibration with real-time accuracy control. Medical Image Computing and Computer-Assisted Intervention. 2006:899–906. doi: 10.1007/11866565_110. [DOI] [PubMed] [Google Scholar]
  • 19.Todor DA, Cohen GN, Worman MF, Zelefsky MJ. Intraoperative dynamic dosimetry for prostate implants. Phys. Med. Biol. 2003;vol. 48(no. 9):1153–1171. doi: 10.1088/0031-9155/48/9/306. [DOI] [PubMed] [Google Scholar]
  • 20.Mattes D, Haynor DR, Vesselle H, Lewellen TK, Eubank W. PET-CT image registration in the chest using free-form deformations. IEEE Trans. Medical Imaging. 2003;vol. 22(no. 1):120–128. doi: 10.1109/TMI.2003.809072. [DOI] [PubMed] [Google Scholar]
