Author manuscript; available in PMC: 2011 Jan 11.
Published in final edited form as: Ultrason Imaging. 2010 Apr;32(2):118–127. doi: 10.1177/016173461003200205

Simulation of Autonomous Robotic Multiple-Core Biopsy by 3D Ultrasound Guidance

Kaicheng Liang 1, Albert J Rogers 1, Edward D Light 1, Daniel von Allmen 2, Stephen W Smith 1
PMCID: PMC3018680  NIHMSID: NIHMS258972  PMID: 20687279

Abstract

An autonomous multiple-core biopsy system guided by real-time 3D ultrasound and operated by a robotic arm with 6+1 degrees of freedom has been developed. Using a specimen of turkey breast as a tissue phantom, our system was able to first autonomously locate the phantom in the image volume and then perform needle sticks in each of eight sectors in the phantom in a single session, with no human intervention required. Based on the fraction of eight sectors successfully sampled in an experiment of five trials, a success rate of 93% was recorded. This system could have relevance in clinical procedures that involve multiple needle-core sampling such as prostate or breast biopsy.

Keywords: 3D ultrasound, computer-aided diagnosis, surgical robotics

INTRODUCTION

Multiple-core needle biopsy is a common technique used in the investigation of breast lesions1 and is an irreplaceable procedure in the diagnosis of prostate cancer.2 Prostate cancer is the second most common cause of cancer death among American men, and the American Cancer Society has predicted 192,280 newly diagnosed cases of prostate cancer and 27,360 deaths due to the disease in 2009.3 Each year, approximately one million patients undergo a transrectal ultrasound (TRUS)-guided needle biopsy procedure to diagnose the disease; despite significant improvements in technique over the past two decades, the process continues to be associated with discomfort and a number of complications, including bleeding and infection.2, 4

Present-day clinical protocol for prostate biopsy involves manual operation of a TRUS probe producing 2-dimensional images, with an attached biopsy guide.5 Transrectal probes are generally either side-viewing or forward-viewing; the former group consists of either transverse/axial or longitudinal/sagittal scanners, while the latter may produce transverse or longitudinal scans depending on the angle of rotation of the probe.6 These 2D scanners may be radial, producing transverse image slices; have linear arrays that produce a sagittal plane view; or use mechanical sector scanning that can image the prostate in either of these planes.7 Ongoing product development testing the feasibility of 3D US and robotic systems in prostate biopsy has shown promise. The Voluson 3D US endorectal probe (General Electric) has been found effective in prostate-biopsy studies.8 Mechanical systems that demonstrate high repeatability of needle placement have been developed but require a clinician to perform considerable positioning adjustments and, consequently, manual selection of biopsy sites.9–11 Envisioneering Medical Technologies (St. Louis, MO) offers a prostate-biopsy system employing both 3D TRUS and a fixed biopsy carriage that uses a computerized rotating mechanism to reach biopsy sites selected by a human operator.10 Other nonautonomous robots with more degrees of freedom have also been proposed, with the biopsy needle designed to access the prostate either in the usual transrectal fashion or transperineally, in a brachytherapy-type application.9, 11

Previously, we used a real-time 3-dimensional (RT3D) laparoscopic ultrasound (US) probe to measure the coordinates of high-contrast objects and direct a robot in targeting those objects and noted the potential use of this system in surgical applications.12 In a later study, using only 3D US guidance combined with simple image-thresholding algorithms, the robot was successfully directed to both small hyperechoic targets and large anechoic regions, simulating calcifications and cysts, respectively, in an ultrasound phantom, with no need for human intervention.13 Subsequently, the 3D color Doppler system was tested for the autonomous location of magnetically-vibrated ferrous shrapnel.14 Most recently, in excised samples of turkey breast, we tested the feasibility of an RT3D ultrasound-guided autonomous robotic breast biopsy.15

In this paper, we propose an RT3D ultrasound-guided multiple-core biopsy to be autonomously conducted by a robotic arm. The device in our feasibility study produced multiple widely-distributed needle sticks in simulated organ tissue (excised turkey breast) to mimic current clinical practice such as the sextant and other multiple-core schemes for the prostate.16 These results suggest that an image-guided autonomous robot could perform a biopsy procedure commonly performed in clinics and hospitals at potentially much lower cost by reducing the need for human control of the biopsy process.

MATERIALS AND METHODS

The Duke/VMI 3D scanner (Volumetrics Medical Imaging, Durham, NC) produces a 65°–90° pyramidal scan at rates up to 30 volumes per second.17, 18 The scanner simultaneously displays two B-scans and up to three parallel C-scans at any angle and depth, as well as real-time 3D volume rendering, 3-D color and 3-D pulsed Doppler. Each 3D US scan comprises 64 × 64 = 4096 B-mode lines, with 512 samples per image line, yielding 2 Mbytes of data per 3D scan. Echo amplitude is eight bits, ranging from 0 to 255. For this study, we used in tandem with our 3D scanner a forward-viewing endoscopic matrix array transducer with 504 transmit channels and 252 receive channels, operating at 4.5 MHz with an aperture of 6.48 mm, as previously described by Light et al.19 As shown in the photograph of figure 1, the probe includes a 10 mm diameter footprint plus a needle accessory port (2 mm diameter) in parallel alignment with the axis of the 3D scan, constituting an assembly of diameter 12 mm. The axial resolution of the transducer probe was 0.83 mm and the lateral resolution was 2.1 mm at a depth of 40 mm.
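The scan dimensions above fix the data layout and size of each volume. As a quick illustrative sketch in Python (array names are ours, not the scanner's):

```python
import numpy as np

# One 3D scan: 64 x 64 B-mode image lines, 512 eight-bit
# samples per line, as described above.
N_LINES_AZ, N_LINES_EL, N_SAMPLES = 64, 64, 512
volume = np.zeros((N_LINES_AZ, N_LINES_EL, N_SAMPLES), dtype=np.uint8)

# 4096 lines x 512 samples at one byte each = 2,097,152 bytes,
# i.e. the 2 Mbytes per 3D scan quoted in the text.
total_voxels = volume.size
total_bytes = volume.nbytes
```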

FIG. 1. Forward-viewing transducer.

The robot used was an iARM Assistive Robotic Manipulator with 6 + 1 degrees of freedom, manufactured by Exact Dynamics BV (Didam, The Netherlands). The robot arm is primarily intended for mounting on a wheelchair or bed to be controlled by physically-handicapped persons using a keypad or joystick, but it can also accept input commands and coordinates from a computer in a separate software mode. Robot coordinates are read and written in arrays of six values, three of which describe Cartesian coordinates and three the orientation of the robot gripper hand in quantitative measurements of ‘yaw’, ‘pitch’ and ‘roll.’ Figure 2a illustrates the six joints of the robot, numbered for convenience. The seventh degree of freedom corresponds to the gripping ability of the robot. The robot arm may be folded into a compact form or unfolded as required, has a reach of 80 cm, and each of the six joints may rotate a full 360°, except joint 5, which has a 120° arc of movement. The robot can be instructed to move either individual or multiple joints at specified angular velocities, or it may move the center point of the gripper in Cartesian directions given either distance or velocity inputs. Note that a Cartesian movement by the robot would involve some or all of its joints. The robot arm was permanently mounted to a portable table which could be moved to a patient bedside.

FIG. 2. Robot, with schematic (a) and experimental setup (b).

The tip of an echogenic biopsy needle (18G, 15 cm, Cook Medical, Bloomington, IN) was secured within the needle port on the transducer, simulating a clinical end-firing ultrasound probe with attached biopsy-needle guide, such as one used in TRUS biopsy. The transducer was then held in the robot gripper and bound tightly with cable ties to ensure the transducer and its needle did not shift during the biopsy, simulating the firm grip of a human operator. The center axis of the transducer was aligned to run directly through the rotational axis of the gripper, such that rotation of that joint would result in the needle tip tracing out a circle of diameter 14 mm centered on that same axis. It was also important to initially align the transducer face parallel to the x-y plane of the table. A standardized unfolded position and orientation of the gripper, as shown in the photograph of figure 2b, was used as the initial prebiopsy configuration of the robot throughout this study.

The tissue phantom used was a specimen of boneless turkey breast obtained from a butcher. The phantom was 50 mm long, 40 mm wide and 20 mm thick, simulating the flat outer hemisphere of a target organ. The phantom was secured to an acoustically-absorbing rubber base in a water bath with multiple pins to prevent it from moving during the biopsy.

Once the robot had assumed its initial unfolded position during 3D scanning of the phantom, at a user-controlled trigger, a complete 3D image volume of echo data in depth r, azimuth angle θ and elevation angle ϕ was transmitted from the 3D scanner, with no user selection of image slices, to a computer running our MATLAB (Mathworks, Natick, MA) image segmentation algorithm. In order to obtain a clear voxel plot of the phantom surface while eliminating noncontiguous noise, a ‘first-arrival’ thresholding algorithm was applied to the 3D image volume of the entire tissue phantom, such that only the first voxel in each image line of the pyramidal scan exceeding a preset threshold would be retained. The threshold was fixed at half the maximum possible voxel intensity, found to be a convenient cutoff to both eliminate low-intensity noise and retain sufficient information for a surface reconstruction. A 3D voxel plot, such as that of figure 3, was then obtained.
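The first-arrival rule can be expressed in a few lines. The following is a Python/NumPy rendering of the idea (the original implementation was in MATLAB; function and variable names are ours), using a threshold of 128, half the maximum 8-bit intensity:

```python
import numpy as np

def first_arrival_surface(volume, threshold=128):
    """For each image line (last axis = depth along the line), keep
    only the first voxel whose echo amplitude exceeds `threshold`.
    Returns that voxel's depth index per line, or -1 for lines that
    never cross the threshold."""
    above = volume > threshold          # boolean mask of echoes
    first = above.argmax(axis=-1)       # index of first True per line
    hit = above.any(axis=-1)            # lines with any strong echo
    return np.where(hit, first, -1)

# Toy volume: one line crosses the threshold at depth index 3,
# the other line contains no echo above threshold.
vol = np.zeros((1, 2, 8), dtype=np.uint8)
vol[0, 0, 3:] = 200
depths = first_arrival_surface(vol)
# depths[0, 0] == 3, depths[0, 1] == -1
```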

FIG. 3. Voxel plot of phantom surface. The asterisk represents the algorithm’s approximation of the surface center. The large dots are sampled voxels distributed around the center.

The coordinates of the center of the phantom surface (asterisk) were approximated by taking the arithmetic mean of the voxel coordinates in the x and y axes and then finding the voxel whose x and y coordinates were closest to those calculations. These coordinates would later be used for the autonomous centering of the phantom in the transducer’s field of view by the robot. This centering task is not dissimilar to the slight displacements of a TRUS probe made by the clinician such that the prostate is in an appropriate position for scanning and biopsy. The depths of eight equally-distributed voxels (large dots) found 12 mm away from the center, as measured in the x-y plane, were sampled. These eight voxels were not identified for the purpose of specific targeting by the robot but were simply used for depth approximation. The mean of these values was taken as an approximation of the distance between the transducer face and the expected locations of the tissue surface to be sampled. This distance was thus the depth of the vertical robotic movement to be performed between the initial unfolded position, simulating the probe assembly in the scanning, prebiopsy phase, and the penetration of the needle in the tissue, simulating the biopsy sampling phase. This quantity would later be used to determine the extent of the robotic arm rotation.
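The center approximation and depth sampling described above can be sketched as follows. This is a hedged Python rendering (the paper's code was MATLAB; function names and the ring-selection heuristic for picking the eight voxels are ours):

```python
import numpy as np

def estimate_center(xs, ys):
    """Mean of the surface voxels' x and y coordinates, then the
    index of the voxel whose (x, y) lies closest to that mean --
    the surface-center approximation described in the text."""
    cx, cy = xs.mean(), ys.mean()
    return int(np.argmin((xs - cx) ** 2 + (ys - cy) ** 2))

def ring_depth_estimate(xs, ys, zs, center, radius=12.0, n=8):
    """Mean depth of the n surface voxels nearest a ring `radius` mm
    from the center in the x-y plane: an estimate of the
    transducer-to-surface distance used to size the vertical dive."""
    ring_err = np.abs(np.hypot(xs - xs[center], ys - ys[center]) - radius)
    return float(zs[np.argsort(ring_err)[:n]].mean())

# Toy surface: a center voxel at the origin, four voxels on a 12 mm ring.
xs = np.array([0.0, 12.0, -12.0, 0.0, 0.0])
ys = np.array([0.0, 0.0, 0.0, 12.0, -12.0])
zs = np.array([40.0, 42.0, 38.0, 41.0, 39.0])
c = estimate_center(xs, ys)                      # -> 0 (the origin voxel)
depth = ring_depth_estimate(xs, ys, zs, c, n=4)  # -> 40.0 mm
```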

Before proceeding with the biopsy, the robot used slight Cartesian movements of a few mm in the x and y directions to center the transducer and its needle directly over the phantom. The robot moved the transducer a distance (Δx = x′ − x, Δy = y′ − y), where (x′, y′) are the center coordinates of the tissue sample calculated during image segmentation and (x, y) are the center coordinates of the RT3D scan. Figure 4 illustrates the centering of the phantom in the transducer pyramid, showing an RT3D ultrasound volume-rendered view of the tissue surface before (Fig. 4a) and after (Fig. 4b) the centering operation. The arrows show the directions of the slight robot motion for this operation.

FIG. 4. 3D volume-rendered image of the centering of the transducer over the phantom. The white lines are midpoint markers and the arrows indicate the direction of shift. In (a), the transducer has yet to move and in (b), the transducer has centered itself over the phantom.

The robot then proceeded to perform the biopsy procedure. It is important to note that our goal was not to target specific features within the tissue but rather to obtain adequate sampling throughout the tissue volume, as is common in current clinical practice. Tissue depth achieved by each stab was not expected to be consistent, since our emphasis was on achieving sufficient breadth and range rather than a specific depth or entry point. Therefore, the voxels corresponding to the tissue surface were divided into eight equal sectors, and the robot was expected to penetrate the phantom surface at an arbitrary location in each of the eight sectors, while not attempting to pinpoint coordinate sites. The gripper was rotated 45° before each of eight needle sticks, allowing the needle to sample the entire surface.
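The eight-sector bookkeeping amounts to angular binning around the estimated center, with one 45° gripper rotation per stick. A minimal sketch (names ours, not the paper's):

```python
import math

def sector_of(x, y, cx, cy, n_sectors=8):
    """Bin a surface point into one of n equal angular sectors
    around the center (cx, cy); sector 0 starts at the +x axis."""
    angle = math.atan2(y - cy, x - cx) % (2 * math.pi)
    return int(angle // (2 * math.pi / n_sectors))

# One 45-degree gripper rotation per stick, one stick per sector.
stick_angles = [k * 45.0 for k in range(8)]   # 0, 45, ..., 315 degrees
```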

It is known that nearly 80% of prostate cancers are found in the peripheral zone of the prostate.16 Therefore, in order for the needle to reach the peripheral regions of the tissue surface, demonstrating functionality that could be useful in a prostate application, the gripper was tilted slightly as required, depending on which side of the phantom was being sampled. Joint 2 was then employed to perform the dive of the robotic arm, executing the biopsy movement as well as achieving some needle penetration. The time automatically calculated for the rotation of this joint was approximated as a linear function of the mean depth of the sampled points in the voxel plot, with an additional time offset associated with the depth of needle penetration. After each needle stick, joint rotations were reversed, such that the needle was completely withdrawn from the phantom to prevent unnecessary tissue distortion, and then the gripper was rotated further to access the next sector zone of the surface.
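The linear timing rule for the joint-2 dive can be written down directly. Note that the coefficients below are illustrative placeholders, not the paper's calibrated values:

```python
def dive_time(mean_depth_mm, k=0.05, t_offset=0.4):
    """Joint-2 rotation time, approximated as linear in the mean
    sampled surface depth plus a fixed offset for the additional
    needle penetration. k (s/mm) and t_offset (s) are placeholder
    values; the paper calibrated them for its specific robot."""
    return k * mean_depth_mm + t_offset

# With the placeholder coefficients, a 40 mm mean depth gives ~2.4 s.
t = dive_time(40.0)
```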

Five trials of eight needle sticks each were conducted. The phantom was repositioned after each trial. During these feasibility trials, the needle motion was monitored by 3D US. The robot software was paused at the end of each stick and serial photographs were acquired in order to determine success rate.

A stick was judged to be successful if it penetrated the tissue surface. The success rate of a single trial was calculated based on the number of zones out of eight that received a successful stick during that trial. Image overlays delineating the zones were superimposed on photographs of the needle sticks to help determine success rate. If a successful stick landed on a dividing line between zones, the stick was attributed to whichever of the two adjacent zones had not already been counted as having received a successful stick. A repeated stick in a sector already sampled was deemed a failed stick. A mean success percentage was determined based on the success rate of the five trials.
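The scoring rule reduces to counting distinct sectors hit; a small Python sketch of this bookkeeping (our own, for illustration):

```python
def trial_success(stick_sectors, n_sectors=8):
    """Number of distinct sectors (0..n_sectors-1) that received a
    stick; a repeat stick in an already-sampled sector adds nothing,
    since it is scored as a failure under the rule above."""
    return len(set(stick_sectors) & set(range(n_sectors)))

# Eight sticks with one repeated sector cover only 7 of 8 zones,
# matching the kind of trial shown in figure 6.
hit = trial_success([0, 1, 2, 3, 4, 5, 6, 6])   # -> 7
```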

RESULTS

Figure 5 shows RT3D ultrasound volume-rendered images of a typical needle stick, comparing the tissue specimen before (Fig. 5a) versus during a needle penetration approximately 1 cm deep (Fig. 5b). From five trials, the overall success rate was 7.4 ± 0.49 zones out of 8, giving a success percentage of 93% based on 3 misses out of a total of 40 attempts. Figure 6 shows the photographic results of one of the trials. Unsuccessful attempts resulted from inadequate centering of the transducer over the phantom, caused by inaccurate movements of the robot as described in the next section, with needle sticks either not achieving sufficient depth, hitting a zone already covered, or missing the phantom completely. Mean and maximum error measurements typical of robot experiments are not applicable in our feasibility study, due to our objective of region-distributed needle sticks rather than point-directed targeting.

FIG. 5. 3D volume-rendered image of a typical needle stick. In (a), the needle is about to penetrate the tissue surface. In (b), the needle has just penetrated the surface.

FIG. 6. A trial with a success rate of 7 out of 8. Arrows indicate placement of needle tip. In (h), the needle is placed in the correct zone but the needle tip has failed to penetrate the surface.

DISCUSSION AND SUMMARY

We have developed and tested the feasibility of an RT3D US guided robotic system that can autonomously locate a tissue phantom and perform multiple widely-distributed needle sticks in a single procedure. We emphasize that our rudimentary phantom and biopsy needle are not meant to replace more sophisticated rectal phantoms and automatic prostate biopsy guns but are only to test the feasibility of a 3D ultrasound-guided autonomous robot for the new task of a distribution of needle samplings in a tissue volume. These findings are seen as a prelude to more sophisticated ex vivo trials, animal experiments and investigation into the automation of other routine needle procedures.

Errors in centering the transducer and in biopsy needle placement were due to: (1) the limited spatial resolution of the scanner; (2) slight misalignment of the robot’s frame of reference with that of the transducer when calculating voxel coordinates; (3) mechanical backlash and friction in the drive train of the robot joints; and (4) ultrasound velocity miscalibration between the water medium and tissue. Factor (3) was the most significant cause of biopsy failure. The joint backlash led to considerable hysteresis when the robot attempted Cartesian movements, such that significant overshoots or shortfalls were observed when displacements were made, or no movement at all was observed when small displacements were directed. Therefore, measurements of robot accuracy or resolution were not attempted. Consequently, for the biopsy needle placements, we used the mode of individual joint rotation at a calibrated velocity for an automatically-calculated time interval, as described above.

A further inadequacy of the robot was its inability to vary its compliance when penetrating tissue, ideally based on real-time information about the properties of the material being incised, so that the right balance is struck between sufficient robotic force and patient safety. In our simulation, the default compliance of the robot was sufficient for penetrating the turkey tissue; however, in more realistic surgical scenarios, where tissue density and elasticity can vary widely in a single region of tissue, an autonomous system would need either a default compliance setting corresponding to the lowest and safest gear ratio or a mechanism ensuring rapid, near-instantaneous feedback of the medium’s mechanical properties,20, 21 possibly with an imaging modality capable of such feedback, such as acoustic radiation force.22

Our proposed method, while intended only as a simulation, presents several limitations in light of current clinical practice in urology. First and most significantly, some of the physical and imaging constraints of a TRUS prostate biopsy, such as the narrow rectal passage and the consequent need for a novel probe-needle configuration, as well as the unique shape and geometry of an actual prostate surface, are not mimicked in our experimental setup. We used a section of turkey tissue rather than a commercial prostate phantom; our transducer-needle assembly moved together during the actual biopsy of tissue, unlike current physician protocol, which uses a needle-firing biopsy port while the transducer is held steady;6 and the ergonomics of the biopsy motion conducted by the robot are not applicable to typical transrectal approaches in a clinical setting, such as the lithotomy or lateral decubitus position.23

Specifically, the turkey tissue phantom does not adequately reflect the variance of Young’s modulus24 or the penetrative force25 previously noted for the prostate, and will need significant improvement in design sophistication in future studies; this shortfall could lead to excessive tissue deformation and, consequently, poor image acquisition in a clinical scenario. Fichtinger et al26 discuss a wide variety of prostate phantoms implementable in a robotic setting, ranging from a honeydew melon to realistic full-body plaster casts with embedded ultrasound-training phantoms, but propose these in the limited context of a CT-scanner application and transperineal approach, conditions irrelevant to the unique difficulties of a TRUS simulation. Improvements to our needle-insertion method could incorporate a TRUS biopsy gun rather than an elementary needle accessory, allowing for real-time image monitoring of the procedure and a more conventional needle-firing approach, as is current practice. The initial unfolded configuration of the robot arm could be pre-adjusted by a clinician or engineer to the most appropriate physical orientation. While fully recognizing that the lack of a realistic phantom deprives our study of immediate clinical potential, we intend our simulation, which incorporates several initial considerations relevant to a prostate biopsy procedure, only as a lead-up to further autonomous robot investigations that will integrate more existing clinical technologies, as well as the design and use of a prostate phantom able to demonstrate the feasibility of an autonomous clinical procedure while remaining experimentally viable.

Secondly, we used a simple first-arrival image segmentation algorithm to define the surface of the tissue sample; electronic noise in the 3D US volume occasionally appeared in the resulting voxel plot, producing a poor approximation of the organ location. This electronic noise, of variable echo amplitude, randomly distributed in the volume and often occurring in small but significant quantities, also made more sophisticated treatment of the 3D data, such as splining to form a continuous surface, infeasible, although the simple random sampling of voxels for depth approximation proved preliminarily adequate in this study. Elaborate ultrasound image-segmentation techniques capable of eliminating noise while capturing the detailed 3D geometry of the prostate surface have been demonstrated27–30 and could be immediately implemented in an autonomous robot procedure. Thirdly, the calibrated joint motions intended to cover the peripheral regions of the surface were precalculated based on the phantom used in this experiment and not autonomously derived, meaning the needle stabs would not achieve the same peripheral range in prostates of widely-varying sizes, as would be expected in a prostate-specific clinical study.

Results of our simulation could be significantly improved by several approaches. Ultrasound probes conventionally used in prostate biopsies operate at higher frequencies of 7–7.5 MHz,31 which would yield images of improved spatial resolution. A more specialized robot arm with limited backlash, capable of more precise point-to-point control, would significantly reduce error in robotic displacement and permit repeatable targeting by coordinate location rather than inexact regional sampling. Finally, more sophisticated image segmentation algorithms, including detection of suspicious regions, as we previously described for autonomous breast biopsy,15 would improve biopsy accuracy and clinical relevance.

ACKNOWLEDGEMENTS

This study was supported in part by grant HL89507 from the National Institutes of Health.

REFERENCES

1. Fishman JE, Milikowski C, Ramsinghani R, Velasquez MV, Aviram G. US-guided core-needle biopsy of the breast: how many specimens are necessary? Radiology. 2003;226:779–782. doi: 10.1148/radiol.2263011622.
2. Matlaga BR, Eskew LA, McCullough DL. Prostate biopsy: indications and technique. J Urol. 2003;169:12–19. doi: 10.1016/S0022-5347(05)64024-4.
3. Prostate cancer. American Cancer Society; 2008. http://documents.cancer.org/117.00/117.00.pdf [accessed 07/01/2009].
4. Chang SS, Cookson MS. Complications of transrectal ultrasound-guided prostate biopsy. In: Jones JS, editor. Prostate Biopsy: Indications, Techniques and Complications. New Jersey: Humana Press; 2008. pp. 255–268.
5. Shinohara K. Prostate biopsy techniques. In: Jones JS, editor. Prostate Biopsy: Indications, Techniques and Complications. New Jersey: Humana Press; 2008. pp. 255–268.
6. Patel U, Rickards D. Transrectal ultrasound - ultrasound physics and equipment. In: Patel U, Rickards D, editors. Handbook of Transrectal Ultrasound and Biopsy of the Prostate. London: Informa Healthcare; 2002. pp. 1–12.
7. Resnick MI. Transrectal ultrasonography in the detection and staging of prostate cancer. World J Urol. 1989;7:2–6.
8. Long J, Daanen V, Moreau-Gaudry A, et al. Prostate biopsies guided by 3-dimension real-time (4D) transrectal ultrasound on a phantom. Comparative study versus 2D transrectal ultrasound guided biopsies. Eur Urol. 2007;52:1097–1104. doi: 10.1016/j.eururo.2006.11.034.
9. Bax J, Cool D, Gardi L, et al. Mechanically assisted 3D ultrasound guided prostate biopsy system. Med Phys. 2008;35:5397–5410. doi: 10.1118/1.3002415.
10. Taneja SS. Prostate biopsy: targeting cancer for detection and therapy. Rev Urol. 2006;8:173–182.
11. Wei Z, Wan G, Gardi L, et al. Robot-assisted 3D-TRUS guided prostate brachytherapy: system integration and validation. Med Phys. 2004;31:539–548. doi: 10.1118/1.1645680.
12. Pua EC, Fronheiser MP, Noble J, et al. 3-D ultrasound guidance of surgical robotics: a feasibility study. IEEE Trans Ultrason Ferroelec Freq Contr. 2006;53:1999–2008. doi: 10.1109/tuffc.2006.140.
13. Whitman J, Fronheiser MP, Ivancevich NM, Smith SW. Autonomous surgical robotics using 3-D ultrasound guidance: feasibility study. Ultrasonic Imaging. 2007;29:213–219. doi: 10.1177/016173460702900402.
14. Rogers AJ, Light ED, Smith SW. 3D ultrasound guidance of autonomous robot for location of ferrous shrapnel. IEEE Trans Ultrason Ferroelec Freq Contr. 2009;56:1301–1303. doi: 10.1109/TUFFC.2009.1185.
15. Liang K, Rogers AJ, Light ED, von Allmen D, Smith SW. Three-dimensional ultrasound guidance of autonomous robotic breast biopsy: feasibility study. Ultrasound Med Biol. 2010;36:173–177. doi: 10.1016/j.ultrasmedbio.2009.08.014.
16. Presti JC. Biopsy strategies - how many and where? In: Jones JS, editor. Prostate Biopsy: Indications, Techniques and Complications. New Jersey: Humana Press; 2008. pp. 255–268.
17. Smith SW, Pavy HE, von Ramm OT. High speed ultrasound volumetric imaging system part I: transducer design and beam steering. IEEE Trans Ultrason Ferroelec Freq Contr. 1991;38:100–108. doi: 10.1109/58.68466.
18. von Ramm OT, Smith SW, Pavy HE. High speed ultrasound volumetric imaging system part II: parallel processing and display. IEEE Trans Ultrason Ferroelec Freq Contr. 1991;38:109–115. doi: 10.1109/58.68467.
19. Light ED, Mukundan S, Wolf PD, Smith SW. Real-time 3-D intracranial ultrasound with an endoscopic matrix array transducer. Ultrasound Med Biol. 2007;33:1277–1284. doi: 10.1016/j.ultrasmedbio.2007.02.004.
20. Davies B, Harris S, Lin W, et al. Active compliance in robotic surgery—the use of force control as a dynamic constraint. Proc Inst Mech Eng H. 1997;211:285–292. doi: 10.1243/0954411971534403.
21. Davies B, Jakopec M, Harris SJ, et al. Active-constraint robotics for surgery. Proc IEEE. 2006;94:1696–1704.
22. Nightingale KR, Palmeri ML, Nightingale RW, Trahey GE. On the feasibility of remote palpation using acoustic radiation force. J Acoust Soc Amer. 2001;110:625–634. doi: 10.1121/1.1378344.
23. Melchior SW, Brawer MK. Role of transrectal ultrasound and prostate biopsy. J Clin Ultrasound. 1996;24:463–471. doi: 10.1002/(SICI)1097-0096(199610)24:8<463::AID-JCU6>3.0.CO;2-I.
24. Ahn BM, Kim J, Ian L, Rha K-H, Kim H-J. Mechanical property characterization of prostate cancer using a minimally motorized indenter in an ex vivo indentation experiment. Urology. 2010. doi: 10.1016/j.urology.2010.02.025. (in press).
25. Podder T, Clark D, Sherman J, et al. In vivo motion and force measurement of surgical needle intervention during prostate brachytherapy. Med Phys. 2006;33:2915–2922. doi: 10.1118/1.2218061.
26. Fichtinger G, DeWeese TL, Patriciu A, et al. System for robotically assisted prostate biopsy and therapy with intraoperative CT guidance. Acad Radiol. 2002;9:60–74. doi: 10.1016/s1076-6332(03)80297-0.
27. Hodge AC, Fenster A, Downey DB, Ladak HM. Prostate boundary segmentation from ultrasound images using 2D active shape models: optimisation and extension to 3D. Comput Meth Prog Biol. 2006;84:99–113. doi: 10.1016/j.cmpb.2006.07.001.
28. Hu N, Downey DB, Fenster A, Ladak HM. Prostate boundary segmentation from 3D ultrasound images. Med Phys. 2003;30:1648–1659. doi: 10.1118/1.1586267.
29. Nanayakkara ND, Samarabandu J, Fenster A. Prostate segmentation by feature enhancement using domain knowledge and adaptive region based operations. Phys Med Biol. 2006;51:1831. doi: 10.1088/0031-9155/51/7/014.
30. Zhang Y, Sankar R, Qian W. Boundary delineation in transrectal ultrasound image for prostate cancer. Comput Biol Med. 2007;37:1591–1599. doi: 10.1016/j.compbiomed.2007.02.008.
31. Terris MK. Principles of prostate ultrasound. In: Jones JS, editor. Prostate Biopsy: Indications, Techniques and Complications. New Jersey: Humana Press; 2008. pp. 255–268.
