Author manuscript; available in PMC: 2017 Jun 28.
Published in final edited form as: Med Image Comput Comput Assist Interv. 2014;17(Pt 2):684–691. doi: 10.1007/978-3-319-10470-6_85

Needle Guidance using Handheld Stereo Vision and Projection for Ultrasound-based Interventions

Philipp J Stolka 1, Pezhman Foroughi 1, Matthew Rendina 1, Clifford Weiss 2, Gregory D Hager 1,3, Emad M Boctor 1,2,3,a
PMCID: PMC5488249  NIHMSID: NIHMS841124  PMID: 25485439

Abstract

With real-time instrument tracking and in-situ guidance projection directly integrated in a handheld ultrasound imaging probe, needle-based interventions such as biopsies become much simpler to perform than with conventionally-navigated systems. Stereo imaging with needle detection can be made sufficiently robust and accurate to serve as primary navigation input. We describe the low-cost, easy-to-use approach used in the Clear Guide ONE generic navigation accessory for ultrasound machines, outline different available guidance methods, and provide accuracy results from phantom trials.

Keywords: guidance, ultrasound, needle interventions, stereo vision, projection

1 Introduction

Most needle interventions are performed under ultrasound (US) guidance because it is the most flexible, cost-effective, and widely available intraoperative imaging modality. However, it is demanding to use: the operator typically controls the ultrasound imaging settings, the probe, and the needle simultaneously, and keeping target and needle in view at all times requires significant skill acquisition. These technical barriers have limited the efficacy, safety, and applicability of percutaneous interventions such as biopsy, ablation, and sclerotherapy. To utilize such guidance, significant training must be provided, usually in the context of advanced radiology fellowships.

2 State of the Art

Interventional guidance systems mostly use either optical tracking or electromagnetic (EM) sensing technologies, which 1) are cumbersome and interfere with workflow (even in compact versions such as the NDI Aurora Compact), 2) are expensive, and 3) require special needles or probes and extensive calibration before use. Alternatively, mechanical guides (e.g. by CIVCO) constrain needle paths to reduce complexity.

However, another group of tracking solutions exists that is based on visual tracking using sensors local to the US probe, such as with cameras (stereo [1], monocular [2] needle tracking, [3]), contact-based optical ([4], [5] probe tracking; [6] needle tracking), or magnetic ([7] needle tracking) means. Other uses of probe-mounted cameras include active probe tracking (e.g. [8], [9]) or passive alignment guidance with overlaid lines [10]. The latter can also be performed using laser-beam alignment [11].

All of these approaches improve a particular aspect of interventional guidance (needle or probe tracking, visualization, or targeting assistance), but suffer from the drawbacks above, are untested in clinical practice, or both. The system proposed here is unique in that: 1) It is non-intrusive: it provides tracking without external hardware such as EM field generators or optical tracking devices and markers, and offers guidance in several different formats. 2) It is inexpensive, being based on off-the-shelf components. 3) It is always on, obviating the need for the physician to decide mid-procedure that a case is more complicated than anticipated and that guidance must be brought in. 4) It is intuitive and easy to use: beyond the spatial skills needed for hand-eye coordination, the system learning curve is minimal. 5) It allows the ultrasound screen to be shared via web browser with a geographically remote specialist, allowing the specialist to participate in the ultrasound exam and to select desired targets. 6) It integrates with nearly every marketed US system, retrofitting guidance onto existing equipment.

3 Methods

The guidance system variations described in the following are stand-alone accessories to third-party ultrasound devices. They comprise both hardware and software components. An overarching design principle is purely passive interaction with the underlying ultrasound device, i.e. they impose no requirements on the US machine other than the ability to output a real-time stream of 2D ultrasound images.

The device itself consists of a computer base station (a medical-grade 17″ touchscreen panel PC with a quad-core Intel Core i7, running Windows 8 64-bit) and a custom handheld portion (Fig. 1) with an optical sensing head mounted on a standard US probe (Fig. 2). The user interface is presented on-screen on the PC unit; it can also be shown in-situ using a laser projector (PDE2, Microvision Inc.), or viewed on an optional remote station equipped with a web browser, such as a PC, tablet, or smartphone. The original ultrasound device and its user interface are not modified, and remain the central means of interaction with the complete system. Passive interfacing with generic US machines is realized by capturing the real-time US video stream from the machine's video port with a digital/analog frame grabber card.

Fig. 1. Clinical guidance system (left) on Ultrasonix SonixTablet machine; educational version (right) integrated with Interson ultrasound imager and with Apple iPad remote user interface.

Fig. 2. Optical head mounted onto linear and convex US probes using probe-specific adapters (left); standard sterile US probe cover installed over SuperPROBE (center); camera coordinates (“C”), ultrasound coordinates (“US”), ultrasound region of interest (gray) vs. raw frame (blue) (right).

The optical head contains stereo cameras (MU9PC_MH cameras, USB 2.0, IR-sensitive APTINA MT9P031 5-megapixel sensor, by XIMEA GmbH), focused and aligned for an imaging distance representative of normal use on the patient skin (approx. 20 cm). A laser projector for in-situ guidance projection can be integrated (Fig. 3). The head can be mounted to support mainly in- or out-of-plane interventions; the camera field of view (and thus the range of allowed needle insertion angles) covers about ±45°.

Fig. 3. Top row: PRBS needle pattern for tip tracking; on-screen guidance needle overlay (blue) with tip marker (green); in-situ augmented US image projection in-vivo; structured-light projection for surface reconstruction. Bottom row: in-situ active target deviation projection (blue and red dots); in-situ active target deviation projection (aligned green and red circles); in-situ passive guidance line projection (green line, to be shadowed with the needle).

The guidance software provides instrument tracking, US stream visualization, and graphical guidance overlay functionalities. User interaction is minimized to reduce complexity for the operator; only tap-to-target is supported to allow the definition of intervention targets. With a given target, the system continuously computes the alignment deviation and provides both visual and auditory feedback (beep signals at intervals proportional to distance, similar to parking sensors) to assist in achieving and maintaining correct targeting.
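The distance-to-audio mapping can be illustrated with a minimal sketch; the function name, constants, and clamping floor below are assumptions for illustration, not the shipped implementation.

```python
def beep_interval_s(deviation_mm, s_per_mm=0.05, min_interval_s=0.1):
    """Beep repetition interval proportional to the targeting deviation,
    parking-sensor style: closer to the target -> faster beeping.

    All constants are illustrative assumptions.
    """
    return max(deviation_mm * s_per_mm, min_interval_s)

# e.g. 10 mm off target -> beep every 0.5 s; nearly aligned -> every 0.1 s
print(beep_interval_s(10.0))
```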

3.1 Calibrations

Several configurations and geometric calibrations need to be established and maintained throughout the system to allow for correct guidance presentation: Stereo camera calibration (“C”), stereo-to-ultrasound calibration (“C-US”), and the correct interpretation of the incoming US video stream to extract the US image and annotate it with appropriate meta-information (such as imaging depth etc.) (Fig. 2).

Stereo Camera Calibration

The purpose of stereo camera calibration is to identify the extrinsic and intrinsic parameters of the optical head cameras, so environment features can be reconstructed in camera (probe) coordinates. These are recovered by the standard Zhang stereo calibration approach [12]. The resulting calibration is stored together with identifying information that uniquely ties it to the underlying optical head.
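A minimal sketch of this step using OpenCV's implementation of Zhang's method is given below; the checkerboard dimensions, square size, and the acquisition of `image_pairs` are illustrative assumptions.

```python
import cv2
import numpy as np

# Checkerboard geometry (illustrative): 9x6 inner corners, 10 mm squares
PATTERN = (9, 6)
SQUARE_MM = 10.0
objp = np.zeros((PATTERN[0] * PATTERN[1], 3), np.float32)
objp[:, :2] = np.mgrid[0:PATTERN[0], 0:PATTERN[1]].T.reshape(-1, 2) * SQUARE_MM

def calibrate_stereo(image_pairs):
    """image_pairs: list of (left, right) grayscale views of the checkerboard."""
    obj_pts, left_pts, right_pts = [], [], []
    for left, right in image_pairs:
        ok_l, c_l = cv2.findChessboardCorners(left, PATTERN)
        ok_r, c_r = cv2.findChessboardCorners(right, PATTERN)
        if ok_l and ok_r:  # keep only views where both cameras see the board
            obj_pts.append(objp)
            left_pts.append(c_l)
            right_pts.append(c_r)
    size = image_pairs[0][0].shape[::-1]
    # Per-camera intrinsics via Zhang's method...
    _, K1, d1, _, _ = cv2.calibrateCamera(obj_pts, left_pts, size, None, None)
    _, K2, d2, _, _ = cv2.calibrateCamera(obj_pts, right_pts, size, None, None)
    # ...then the fixed rig extrinsics (R, T) between the two cameras
    _, K1, d1, K2, d2, R, T, E, F = cv2.stereoCalibrate(
        obj_pts, left_pts, right_pts, K1, d1, K2, d2, size,
        flags=cv2.CALIB_FIX_INTRINSIC)
    return K1, d1, K2, d2, R, T
```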

Camera-Ultrasound Calibration

After calibration of the stereo cameras, environment features can be reconstructed in camera (probe) coordinates. Several methods to establish the camera-to-US-image transformation ^C M_US were explored. One well-known approach [13, 9] involves presenting linear objects (such as needles) to the cameras and the US imager, and solving the resulting equation system of 3D camera lines and their 2D intersection points with the US plane.

This relies, however, on the accuracy of stereo needle reconstruction, and works best for out-of-plane needles showing up as points in US. To avoid this source of error, we developed a “single-shot” phantom which includes both optical and unique US features and requires only one simple probe motion across the imaging area for calibration. Both feature types are automatically detected during the scan. The generic phantom geometry is adapted to classes of US probes, depending on their imaging geometry (linear, convex, …) and size. Given the phantom geometry, the resulting transform is recovered. With automatic image analysis, the calibration can be performed in less than 20 seconds directly on the deployment system.
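The line-based variant can be sketched as a nonlinear least-squares problem. The sketch below assumes N needle observations, each providing a 3D line (point `p`, unit direction `d`) in camera coordinates and the needle's 2D intersection point with the US plane in millimeters; it estimates only the rigid part of the transform (the US pixel scale is assumed known from the image-mode calibration), and all names are illustrative.

```python
import numpy as np
from scipy.optimize import least_squares
from scipy.spatial.transform import Rotation

def residuals(x, us_pts_mm, line_pts, line_dirs):
    """x = [rx, ry, rz, tx, ty, tz]: rotation vector and translation (camera <- US).

    For each observation, embed the 2D US point at z = 0, map it into camera
    coordinates, and return its distance from the observed 3D needle line.
    """
    R = Rotation.from_rotvec(x[:3]).as_matrix()
    t = x[3:]
    res = []
    for u, p, d in zip(us_pts_mm, line_pts, line_dirs):
        X = R @ np.array([u[0], u[1], 0.0]) + t   # US point in camera frame
        v = X - p
        res.append(np.linalg.norm(v - np.dot(v, d) * d))  # point-to-line distance
    return res

# us_pts_mm: (N,2); line_pts, line_dirs: (N,3), unit directions -- assumed inputs
sol = least_squares(residuals, x0=np.zeros(6), args=(us_pts_mm, line_pts, line_dirs))
R_c_us = Rotation.from_rotvec(sol.x[:3]).as_matrix()
t_c_us = sol.x[3:]
```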

Ultrasound Image Mode Calibration

The raw incoming US video stream is interpreted automatically in real time to extract the US image region of interest (ROI) and to annotate it with metric meta-information to later draw correct guidance overlays. This interpretation is based on a prior one-time, semi-manual analysis of frames from every relevant imaging mode (to parameterize online feature matching), which also serves to reject frames that do not correspond to known modes.
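The sketch below illustrates one way such a mode calibration could be applied at runtime: a stored descriptor pairs a UI signature patch with the ROI and imaging depth, yielding the mm-per-pixel scale needed for metric overlays. The descriptor format, file name, and matching threshold are assumptions for illustration.

```python
import cv2

# Hypothetical per-mode descriptor from the one-time, semi-manual analysis:
# a signature patch from the machine's UI, the US image ROI, and the depth.
MODE = {
    "signature": cv2.imread("linear_5cm_sig.png", cv2.IMREAD_GRAYSCALE),
    "roi": (192, 64, 512, 480),   # x, y, width, height of US image in the frame
    "depth_mm": 50.0,
}

def match_mode(frame_gray, mode, threshold=0.95):
    """Accept the frame only if its UI matches a known imaging mode."""
    score = cv2.matchTemplate(frame_gray, mode["signature"],
                              cv2.TM_CCOEFF_NORMED).max()
    if score < threshold:
        return None  # unknown mode: reject the frame
    x, y, w, h = mode["roi"]
    mm_per_px = mode["depth_mm"] / h   # metric scale for guidance overlays
    return frame_gray[y:y + h, x:x + w], mm_per_px
```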

Camera-Projector Calibration

C-P calibration recovers the projector-to-camera coordinate transformation. The procedure is very similar to stereo camera calibration, with the projector treated as an "inverse camera": a procedure analogous to the C calibration is followed, with the difference that the checkerboard is digitally projected onto a white, flat surface.

3.2 Needle Tracking in Five Degrees of Freedom

Unlike conventional EM-tracked systems, where a stationary base references mobile sensors around it, the described system tracks objects directly from the handheld optical head. Different algorithms specialize in different classes of tracked objects.

The basic method to reconstruct needle poses from the left and right camera images in real time and in 4DoF (degrees of freedom) has been described in [13]. We now track the needle tip as well, by directly measuring insertion depth, leaving only rotation about the needle's long axis untracked.

Fig. 3 depicts an example of a tracking pattern marked onto the needle shaft, where each subsequence of a given minimal length appears only once (here a pseudo-random binary sequence, PRBS [14]). The optical head tracks the needle insertion depth visually, using the discernible part of the pattern to determine its section within the full pattern and thus the position of the needle tip uniquely. The candidate tip location is overlaid on the augmented display, regardless of whether the tip itself is actually visible.

Given a pair of rectified stereo images and the calibration, the first step of tip localization is the detection of straight 2D lines and their 3D reconstruction. The reconstructed line is then correlated against all possible "sub-patterns" (minimal continuous pattern segments that are uniquely identifiable). Since the offset of each sub-pattern with respect to the tip is known, the 3D location of the tip can be estimated. If each distinct element of the pattern can take k different values (such as colors or shades) and the complete pattern is k^n elements long, the system can compute depth whenever at least a fraction n/k^n of the complete pattern (i.e. n consecutive elements) is visible. The tracking pattern must be non-periodic so that each sub-pattern corresponds to a unique position on the medical tool. Processing only those observed objects that exhibit the required pattern also improves the tool detection rate.
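A minimal sketch of this scheme for a binary pattern (k = 2) follows: an LFSR generates a maximal-length sequence in which every window of n consecutive bands is unique, so a lookup from the visible window to its index gives the distance from the pattern start (and hence the tip) along the shaft. The band width, tap choice, and the assumption that the pattern starts at the tip are illustrative.

```python
def prbs(n=5, taps=(5, 3)):
    """Maximal-length binary sequence (period 2^n - 1) from a Fibonacci LFSR."""
    state = [1] * n
    seq = []
    for _ in range(2 ** n - 1):
        seq.append(state[-1])                       # output bit
        fb = state[taps[0] - 1] ^ state[taps[1] - 1]  # feedback (x^5 + x^3 + 1)
        state = [fb] + state[:-1]
    return seq

N = 5
pattern = prbs(N)
# Every window of N consecutive bands is unique -> the window identifies position.
index_of = {tuple(pattern[i:i + N]): i for i in range(len(pattern) - N + 1)}

def tip_from_window(window, window_start_3d, needle_dir, band_mm=2.0):
    """Locate the tip from the visible sub-pattern.

    window_start_3d: stereo-reconstructed 3D point (numpy array) of the
    window's first band; needle_dir: unit vector from tip towards hub.
    Assumes the pattern is printed starting at the tip (illustrative).
    """
    i = index_of.get(tuple(window))
    if i is None:
        return None  # not a valid sub-pattern: reject the detection
    return window_start_3d - i * band_mm * needle_dir
```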

3.3 Guidance Display

The information for needle alignment can be communicated to the operator in several different ways, depending on the type of guidance (active or passive) and its modality (on-screen or in-situ, Fig. 3). Remote or secondary displays (in a web browser or on a tablet/smartphone) can be used in telemedicine and education settings (Fig. 1).

|                  | On-screen display                                                                                   | In-situ projection                                                               |
| Active tracking  | Needle overlay on live US (on PC or remote browser)                                                 | Target deviation markers (dots or circles); or needle overlay on live US         |
| Passive guidance | Direct-alignment lines overlaid on stereo camera views (overlaying the needle when in correct pose) | Direct-alignment lines projection (shadowed by the needle when in correct pose)  |

3.4 Experiment Setup

The described system is designed to be easy to use for ultrasound-guided interventions such as biopsies or ablations. A typical use would be to place a needle within a tumor or lesion in a solid organ such as the kidney or liver. As such, the relevant measure of effectiveness is the accuracy with which a user can perform needle placement. To validate this accuracy, an extensive series of tests was performed on benchtop phantom models (in-vivo animal tests have been completed and will be described in an upcoming publication).

Phantoms were created with transparent porcine skin gelatin (Sigma-Aldrich G2500). To provide a uniform, unambiguous target, a 2.4 mm steel ball was suspended at a depth of 6 cm; the ball is highly visible in ultrasound, removing any ambiguity about the target location. The guidance device was used in conjunction with an Ultrasonix SonixTablet ultrasound machine and two probe types: linear L14-5/38 (abdominal, MSK, nerve block, small parts, and vascular uses) and convex C5-2/60 (abdominal, gynecology, obstetric, and urology uses). The small head (Figs. 1, 2, 4) was used for validation; the projector version (Fig. 3) did not contribute data. Both mount via probe-specific brackets (Figs. 1, 2). The perpendicular 3D distance from the needle trajectory to the target was taken as the "distance to target" accuracy measurement, using standard (non-PRBS) needles. We measured the mean distance at target depths of ~6 cm, which represents the most conservative (difficult) depth of lesions to biopsy (lesions in the liver and kidney discovered by CT are 5–10 cm deep [15], which determined the choice of target placement).
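For reference, this accuracy metric is the standard perpendicular point-to-line distance; a minimal numpy sketch (variable names illustrative):

```python
import numpy as np

def distance_to_target(target, needle_pt, needle_dir):
    """Perpendicular 3D distance from the needle trajectory (a line) to the target."""
    d = needle_dir / np.linalg.norm(needle_dir)
    v = target - needle_pt
    return np.linalg.norm(v - np.dot(v, d) * d)

# e.g. a trajectory along +z passing 3 mm to the side of the target:
print(distance_to_target(np.array([3.0, 0.0, 60.0]),
                         np.array([0.0, 0.0, 0.0]),
                         np.array([0.0, 0.0, 1.0])))  # -> 3.0
```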

Fig. 4. Phantom validation: US imaging and needle insertion using linear US probe (left); orthographic photographs for alignment measurements (center); sketch of orthographic image measurement geometry: observed needle (red arrow), measured points on needle (black diamonds), target BB (red circle) (right).

The subjects were two novice users (physicians) naïve to the device and not experienced in image guidance. They were given ten minutes to familiarize themselves with system operation, without any coaching from the authors. They set up imaging according to their preferences, as the guidance system adapts itself to the US parameters. They were randomly assigned to a phantom and a probe, and were instructed to place the needle tip as close as possible to the steel ball target using the guidance system. The experimental setup made it impossible for the subjects to observe the needle visually. Subjects chose insertion points and angles freely, as physicians regularly optimize probe/needle placements before and even during insertions. We disallowed needle retraction in order to measure true "pre-insertion" accuracy. When the user indicated completion, two orthogonal, near-orthographic digital photographs were taken of the target and the needle as they appeared in the transparent phantom (Fig. 4). These images were then manually processed to compute the true 3D distance of the needle trajectory from the target, by first determining the 2D distance in each image and then combining the two via the Pythagorean theorem.
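Concretely, if d_1 and d_2 denote the needle-to-target distances measured in the two photographs, and assuming the two measured offset components are mutually orthogonal (which the orthogonal camera placement approximates), the 3D distance follows as:

```latex
d_{3D} = \sqrt{d_1^{\,2} + d_2^{\,2}}
```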

4 Results

The study yielded a mean distance to target of 3.27 ± 2.28 mm (n=41); the 95% confidence interval for mean distance to target (in mm) is [2.546, 3.987]. Given the mean and variance of the data, the probability that a single attempt would yield a distance to target of more than 7 mm is ~5.1%, assuming a normal distribution.
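These figures can be verified directly from the reported mean and standard deviation; a brief check under the normality assumption (z-score for the 7 mm threshold, and the t-based confidence interval with n = 41):

```latex
z = \frac{7 - 3.27}{2.28} \approx 1.64, \qquad P(Z > 1.64) \approx 5.1\%
```

```latex
\bar{d} \pm t_{0.975,\,40}\,\frac{s}{\sqrt{n}}
  = 3.27 \pm 2.02 \cdot \frac{2.28}{\sqrt{41}}
  \approx [2.55,\ 3.99]~\text{mm}
```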

The study showed a statistically significant difference between the two users (p < 0.05); the slower subject achieved higher accuracy. The means for the convex and linear probes were not statistically different, which, interestingly, is inconsistent with unguided clinical practice.

5 Conclusion

We present the first clinical-grade camera-based guidance system for needle-shaped instruments in ultrasound-based interventions. This product-level device attaches to standard ultrasound probes and machines, and provides instrument tracking and guidance visualization in multiple modalities, including telemedicine and in-situ projection modes. With minimal training, novice users can place needles with clinically relevant accuracy. Although the system does not eliminate outcome variation between users, interestingly no differences between US probes could be observed under guidance. Clinical practice usually dictates (lower-resolution) convex probes because of their wider field of view; our results with convex probes were equivalent to those with high-resolution linear probes, indicating that the described system compensates for their lower resolution.

Fig. 5. Phantom results: needle/target distance by probe type (left) and subject (right).

References

1. Khosravi, et al. One-step Needle Pose Estimation for Ultrasound Guided Biopsies. IEEE EMBS. 2007. doi: 10.1109/IEMBS.2007.4353046.
2. Najafi, et al. Single camera closed-form real-time needle trajectory tracking for ultrasound. SPIE Med Img. 2011.
3. Wang, et al. The Kinect as an interventional tracking system. SPIE Med Img. 2012.
4. Goldsmith, et al. An Inertial-Optical Tracking System for Portable, Quantitative, 3D Ultrasound. IEEE IUS. 2008.
5. Stolka, et al. Multi-DoF Probe Trajectory Reconstruction with Local Sensors for 2D-to-3D Ultrasound. ISBI. 2010.
6. Palmer, et al. Development and evaluation of optical needle depth sensor for percutaneous diagnosis and therapies. SPIE Med Img. 2014.
7. eZono 4000 Product Page. http://www.ezono.com/products/ezono-4000/
8. Sun, et al. Computer-guided US probe realignment by optical tracking. IEEE ISBI. 2013.
9. Hoßbach, et al. Simplified Stereo-Optical US Plane Calibration. SPIE Med Img. 2013.
10. Sauer, et al. Video-Assistance for Ultrasound Guided Needle Biopsy. US Patent 2003/0120155 A1.
11. Sauer, et al. Method and Apparatus for Ultrasound Guidance of Needle Biopsies. US Patent 6,689,067 B2.
12. Zhang Z. A flexible new technique for camera calibration. IEEE PAMI. 2000;22(11).
13. Stolka, et al. Navigation with local sensors in handheld 3D ultrasound: initial in-vivo experience. SPIE Med Img. 2011.
14. Petriu EM. Absolute Position Measurement Using Pseudo-Random Binary Encoding. IEEE Instrumentation and Measurement Magazine. 1998 Oct. doi: 10.1109/5289.706020.
15. Jones, et al. The frequency and significance of small (less than or equal to 15 mm) hepatic lesions detected by CT. American Journal of Roentgenology. 1992;158(3):535–539. doi: 10.2214/ajr.158.3.1738990.
