Abstract
Ultrasound imaging is widely used to guide minimally invasive procedures, but the visualization of the invasive medical device and the procedure’s target is often challenging. Photoacoustic imaging has shown great promise for guiding minimally invasive procedures, but clinical translation of this technology has often been limited by bulky and expensive excitation sources. In this work, we demonstrate the feasibility of guiding minimally invasive procedures using a dual-mode photoacoustic and ultrasound imaging system with excitation from compact arrays of light-emitting diodes (LEDs) at 850 nm. Three validation experiments were performed. First, clinical metal needles inserted into biological tissue were imaged. Second, the imaging depth of the system was characterized using a blood-vessel-mimicking phantom. Third, the superficial vasculature in human volunteers was imaged. It was found that photoacoustic imaging enabled needle visualization with signal-to-noise ratios that were 1.2 to 2.2 times higher than those obtained with ultrasound imaging, over insertion angles of 26 to 51 degrees. With the blood-vessel-mimicking phantom, the maximum imaging depth was 38 mm. The superficial vasculature of a human middle finger and a human wrist was clearly visualized in real-time. We conclude that the LED-based system is promising for guiding minimally invasive procedures with peripheral tissue targets.
Keywords: photoacoustic imaging, ultrasonography, LED, needle guidance, vasculature, minimally invasive procedures
1. Introduction
Precise and efficient device guidance is critically important for minimally invasive procedures in many clinical fields such as fetal medicine, regional anesthesia and pain management, and interventional oncology [1,2]. Successful procedural outcomes are dependent on the accurate visualization of both the invasive medical device and the procedure target. Ultrasound (US) imaging is commonly used for image guidance, as it provides high-resolution anatomical images in real-time. However, US visualization of invasive medical devices such as metal needles can be challenging [3]. For example, when the needle is inserted into tissue at a steep angle, the US waves may be reflected away from the aperture of the US imaging probe, such that the needle effectively becomes invisible. Additionally, US imaging provides insufficient contrast to robustly differentiate tissue targets from their surroundings in many minimally invasive procedures. A loss of visibility of the needle or the tissue target can result in serious complications [4].
Photoacoustic (PA) imaging is a hybrid imaging modality that offers optical spectroscopic contrast and ultrasonic resolution. In PA imaging, pulsed excitation light is absorbed in tissue, and the resulting US waves are received and processed to reconstruct an image [5,6]. PA imaging has emerged as a powerful biomedical imaging modality in the last decade with tremendous potential for a wide range of clinical and preclinical applications [7,8,9,10,11,12,13,14,15,16,17]. PA imaging can be readily combined with US imaging using a clinical US imaging probe; with the same transducer elements in the probe used for reception, the images are naturally co-registered. These dual-mode photoacoustic and ultrasound (PA/US) imaging systems provide both structural and molecular contrast [17,18,19,20].
The use of dual-mode PA/US imaging with a clinical US imaging probe for guiding minimally invasive procedures has attracted increasing interest [21,22,23,24,25,26,27,28,29,30,31]. Kim et al. reported a PA/US imaging system for guiding sentinel lymph node biopsies, with light delivered to the tissue surface via a bifurcated optical fiber bundle integrated with the US imaging probe [21]. Similarly, with surface-based light illumination, Su et al. demonstrated the visualization of clinical metal needles with PA imaging [22]. Surface-based illumination provides wide-field tissue imaging, but it suffers from a rapid reduction of PA signal amplitude with an increase in tissue depth. Interventional PA imaging addresses this limitation with a complementary light delivery approach: PA excitation light is delivered inside tissue through an optical fiber integrated within an invasive medical device. Interventional PA imaging has a strong potential to guide minimally invasive procedures in different clinical contexts such as fetal surgery [23,24], regional anesthesia and interventional pain management [25,26], breast biopsies [27], prostate brachytherapy [28,29,30], and neurosurgery [31].
Compact laser diodes and light-emitting diodes (LEDs) can be used as PA excitation sources [32,33,34,35,36,37,38,39,40], and their small size and low cost relative to some conventional sources may facilitate the clinical translation of the technology [41]. Whilst their pulse energies are typically much lower than those of conventional excitation sources, their pulse repetition frequencies (PRFs) can be higher, so that frame-to-frame averaging and coded excitation sequences can be used to increase the signal-to-noise ratio (SNR).
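To make this trade-off concrete, the following minimal sketch (Python; assuming uncorrelated noise between frames, with illustrative values rather than system specifications) shows that averaging N sequentially acquired frames improves the SNR by a factor of approximately sqrt(N).

import numpy as np

# Sketch: SNR gain from frame averaging, assuming uncorrelated (white) noise
# between frames, so that averaging N frames reduces the noise standard
# deviation by sqrt(N). Values are illustrative, not system specifications.
rng = np.random.default_rng(0)

signal = 1.0                       # arbitrary signal amplitude
noise_sigma = 5.0                  # per-frame noise standard deviation
single_frame_snr = signal / noise_sigma

for n_frames in (128, 2560):
    frames = signal + noise_sigma * rng.standard_normal((n_frames, 10_000))
    averaged_snr = signal / frames.mean(axis=0).std()
    print(f"N = {n_frames:4d}: SNR gain ~ {averaged_snr / single_frame_snr:5.1f} "
          f"(sqrt(N) = {np.sqrt(n_frames):5.1f})")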
In this work, the use of an LED-based PA/US imaging system for guiding minimally invasive procedures was investigated for the first time. The performance of this system for visualizing clinical metal needles inserted at various angles and tissue targets at different depths was evaluated with ex vivo tissue phantoms. Initial indications of the potential of the system for guiding peripheral procedures were obtained by imaging the superficial vasculature in the fingers and the wrists of human volunteers.
2. Materials and Methods
2.1. System Description
AcousticX (PreXion Corporation, Tokyo, Japan) is a commercially available LED-based PA/US system that can acquire interleaved PA and US images and display them in real-time. Figure 1 shows a block diagram of AcousticX, detailing the data acquisition and processing flow.
2.1.1. Light Illumination
The PA/US imaging system uses LED arrays to deliver PA excitation light with a wavelength of 850 nm. Each array comprises four rows of LEDs, with 36 LEDs in each row; the size of each LED is 1 mm × 1 mm. Each array can deliver a maximum optical energy of 200 µJ per pulse. The peak current for each LED array is 20 A, and the LEDs can be driven at a PRF of 1 kHz to 4 kHz. The LED pulse duration is tunable over the range of 30 ns to 100 ns. In this study, a pulse duration of 70 ns was used to maximize the PA efficiency (signal strength/light pulse energy). The optical energy was dependent only on the driving voltage and the pulse width, and there was no relation between the PRF and the optical output power of the LEDs within the range of PRFs considered in this study (data not shown). One LED array was affixed on each side of a linear-array US imaging probe so that the light beams from the two LED arrays overlapped at the focus depth of this probe. In the absence of optical scattering, the overlap region was approximately 50 mm × 7 mm in the YX-plane, with a maximum fluence of 0.11 mJ/cm².
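As a consistency check, the following short calculation (Python; assuming the full pulse energy of both arrays is spread uniformly over the stated overlap region, with no optical scattering) reproduces the quoted maximum fluence.

# Sketch: maximum surface fluence from the two LED arrays, assuming the full
# pulse energy of both arrays is spread uniformly over the stated overlap
# region (no optical scattering). Dimensions are taken from the text.
pulse_energy_per_array_mJ = 0.2           # 200 uJ per array
n_arrays = 2
overlap_area_cm2 = 5.0 * 0.7              # 50 mm x 7 mm overlap region

fluence_mJ_per_cm2 = n_arrays * pulse_energy_per_array_mJ / overlap_area_cm2
print(f"Estimated maximum fluence: {fluence_mJ_per_cm2:.2f} mJ/cm^2")  # ~0.11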
2.1.2. Image Acquisition and Processing
US reception was performed with a linear-array lead zirconate titanate (PZT)-based US probe that comprises 128 elements over a distance of 38.4 mm. Each element has a transverse length of 5 mm, a pitch of 0.3 mm, a central frequency of 9 MHz, and a measured −6 dB bandwidth of 77%. The array incorporates an acoustic lens to achieve an elevational focus of 15 mm.
For both PA and US imaging, RF data from all US transducer elements were acquired from 128 channels at sampling rates of 40 MHz and 20 MHz, respectively, and the data were then transferred to the graphics processing unit (GPU) board using a USB 3.0 interface. US imaging was performed with single-angle plane-wave US transmissions, so that one B-mode US image was obtained for each plane-wave transmission. Likewise, one PA image was obtained for each pulse of excitation light. The acquired US and PA data were averaged across sequentially acquired frames, reconstructed using an inbuilt GPU-based Fourier-domain reconstruction algorithm, and then displayed in real-time on a high-resolution monitor (Figure 1). Image thresholding was varied manually in real-time. The frame rate was dependent on the number of averages: when averaging was performed across 2560 frames, the frame rate was 1.5 frames per second (FPS); when averaging was performed across 128 frames, the frame rate was 30 FPS. A maximum of 1536 PA frames and 1536 US frames (imaging depth: 4 cm) can be stored in memory at one time. The raw RF data were saved and made available for offline Fourier-domain reconstruction [42].
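The quoted frame rates are consistent with acquisition at the maximum PRF; the following sketch (Python; assuming a 4 kHz PRF and neglecting reconstruction and data-transfer overheads) illustrates the relationship between the number of averaged frames and the display frame rate.

# Sketch: display frame rate as a function of the number of averaged frames,
# assuming acquisition at a 4 kHz PRF and neglecting reconstruction and data
# transfer overheads.
prf_hz = 4000.0

for n_averages in (128, 2560):
    frame_rate_fps = prf_hz / n_averages
    print(f"{n_averages:5d} averages -> {frame_rate_fps:5.2f} FPS")
# 128 averages  -> ~31 FPS (quoted as 30 FPS)
# 2560 averages -> ~1.6 FPS (quoted as 1.5 FPS)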
2.2. Spatial Resolution Measurements
The spatial resolution of the dual-mode PA/US imaging system was measured by imaging a resolution phantom that comprised six light-absorbing wires (diameter: ca. 30 µm) as targets. The wires were fabricated by coating tungsten wires (diameter: ca. 25 µm) with a carbon black paint marker (no. 4610 337, Liquitex, Cincinnati, OH, USA); they were mounted on a U-shaped acrylic frame so that they were parallel to each other (Figure 2a). The distance between each pair of neighboring wires was approximately 5 mm.
Imaging was performed with both the resolution phantom and the imaging probe in water. The phantom (acrylic frame with the light-absorbing wires) was arranged so that the wires were perpendicular to the imaging plane and intersected it at approximately the same lateral (Y) position and at six different depths (Z) (Figure 2a,b). Raw PA and US data were acquired with the wires at two different lateral positions (side and center with respect to the ultrasound imaging probe), and images were reconstructed offline. To calculate the spatial resolution for both PA and US images, axial and lateral profiles across the local maxima that corresponded to the wires were plotted, and the full width at half maximum (FWHM) values yielded the axial and lateral resolutions, respectively.
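The following sketch (Python) illustrates one way the FWHM of a sampled profile can be estimated, by locating the half-maximum crossings with linear interpolation; the exact numerical procedure used for these measurements is not specified beyond taking the FWHM of the plotted profiles.

import numpy as np

def fwhm(positions_mm: np.ndarray, profile: np.ndarray) -> float:
    """Full width at half maximum of a single-peaked profile, using linear
    interpolation to locate the half-maximum crossings on either side of the peak."""
    profile = profile - profile.min()          # remove baseline offset
    half = 0.5 * profile.max()
    peak = int(np.argmax(profile))

    # Walk outward from the peak to the first sample below the half maximum,
    # then interpolate the crossing position between neighbouring samples.
    def crossing(step: int) -> float:
        i = peak
        while 0 < i < len(profile) - 1 and profile[i + step] >= half:
            i += step
        x0, x1 = positions_mm[i], positions_mm[i + step]
        y0, y1 = profile[i], profile[i + step]
        return x0 + (half - y0) * (x1 - x0) / (y1 - y0)

    return abs(crossing(+1) - crossing(-1))

# Example: a Gaussian profile with sigma = 0.1 mm has an FWHM of ~0.235 mm.
x = np.linspace(-1.0, 1.0, 401)
print(fwhm(x, np.exp(-x**2 / (2 * 0.1**2))))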
2.3. PA and US Imaging of Clinical Metal Needle Insertions
To demonstrate the potential of the system for visualizing clinical metal needles, a 14 gauge needle (Terumo, Surrey, UK) was inserted into chicken breast tissue ex vivo at different angles. At each angle, the signal-to-noise ratio (SNR) of the needle shaft was calculated for both the US and the PA images.
The SNR was defined as SNR = S/σ, where S is the mean of the image amplitude in a signal region, and σ is the standard deviation of the image amplitude in a noise region. The signal region was empirically chosen as a rectangular region where the PA signal was apparent; the noise region was manually chosen to correspond to a rectangular region away from the needle. The same signal and noise regions were used to calculate the SNR for both the PA and the US images.
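The following sketch (Python) illustrates this SNR calculation; the rectangular regions shown are placeholders, as the actual regions were chosen manually for each image.

import numpy as np

def region_snr(image: np.ndarray,
               signal_rows: slice, signal_cols: slice,
               noise_rows: slice, noise_cols: slice) -> float:
    """SNR = S / sigma, where S is the mean image amplitude in the signal
    region and sigma is the standard deviation in the noise region."""
    s = image[signal_rows, signal_cols].mean()
    sigma = image[noise_rows, noise_cols].std()
    return s / sigma

# Example with placeholder rectangular regions (pixel indices); in practice the
# regions were chosen manually on each reconstructed PA or US image.
rng = np.random.default_rng(1)
image = rng.standard_normal((512, 128))
image[200:210, 40:80] += 5.0                      # simulated needle signal
snr = region_snr(image,
                 signal_rows=slice(200, 210), signal_cols=slice(40, 80),
                 noise_rows=slice(300, 400), noise_cols=slice(40, 80))
print(f"SNR = {snr:.1f}")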
2.4. PA Imaging of a Blood-Vessel-Mimicking Phantom
To evaluate the sensitivity with which PA imaging can be performed, a blood-vessel-mimicking phantom that was created by embedding cylindrical vessels (5 mm diameter) into chicken breast at different depths was used. The vessels comprised gel wax (Mindsets Online, Waltham Cross, UK), with carbon black ink (carbon black, Cranfield Colours, Cwmbran, UK) and TiO2 particles (13463-67-7, ReagentPlus 99%, Sigma-Aldrich, St. Louis, MO, USA) for optical absorption and scattering, respectively [43,44]. The optical absorption coefficient was 1 mm−1 at 850 nm; the optical reduced scattering coefficient was 0.5 mm−1. The PA/US probe was aligned in such a way that the vessels were perpendicular to the imaging plane. Several measurements were performed by adding multiple layers of chicken tissue and thereby increasing the depth of the vessel relative to the probe surface. The PA and US data of all measurements were saved for further processing. The SNR was calculated for the PA and the US images that were reconstructed offline for seven different depths that ranged from 12 mm to 38 mm and for four frame rates that ranged from 1.5 to 30 FPS. These calculations were performed in the same way as those in Section 2.3, where the signal region was empirically chosen as a rectangular region that enclosed the vessel.
To obtain a preliminary indication of the potential of our system to guide percutaneous procedures, PA/US imaging was performed while a 14 gauge needle was inserted into a blood-vessel-mimicking phantom. This phantom was the same type as that used for the sensitivity evaluation, with the gel wax vessel positioned at a depth of 2 cm. PA and US image pairs were displayed at 10 Hz, and reconstructed images and videos were saved in real-time.
2.5. In Vivo Imaging
To demonstrate the potential of our system to image superficial vasculature in vivo, the middle finger of a healthy volunteer was imaged with the imaging probe held manually and the finger immersed in water. Interleaved PA and US imaging was performed from the side of the finger comprising one of the digital arteries, with images from each modality acquired at ten FPS. The real-time image display was useful for aligning the imaging probe so that the pulsating digital artery was visible.
The same volunteer’s right wrist was scanned in cross-section. The protocol was similar to the protocol used to image the middle finger, except that ultrasound gel was used instead of water as a coupling medium, and a lower frame rate was used (six FPS).
3. Results
3.1. Spatial Resolution
The spatial resolutions of PA and US imaging were similar (Figure 2c,d). The axial resolution was consistent at the center and the side (Y direction), and it was also consistent over the measured depth (Z) range of 13 to 36 mm. For PA imaging, the mean axial resolution was 0.22 mm with a standard deviation of 0.01 mm, with statistics obtained from all measured Y and Z positions; for US imaging, the mean axial resolution was 0.22 ± 0.06 mm. The lateral resolution of both PA and US images was better at the center of the imaging plane than at the side. At the center, the lateral resolution of PA images was relatively consistent (0.46 ± 0.06 mm); with US imaging, it increased with Z from 0.35 mm to 0.57 mm. At the side, the lateral resolution of both PA and US images increased with Z, from 0.64 mm to 0.76 mm and from 0.52 mm to 0.69 mm, respectively.
3.2. Angle Dependence of US and PA Imaging for In-Plane Needle Insertions
When inserted into the chicken breast tissue, the needle was barely visible with US imaging (Figure 3a,d,g), and clearly visible with PA imaging (Figure 3b,e,h). The needle was visible with PA imaging at depths of up to 2 cm. The spatial locations of the needle observed on US images corresponded well to those observed on PA images, as evidenced by the overlays (Figure 3c,f,i). The SNR values for PA images were substantially higher than those for US images. With both US and PA imaging, the SNR decreased as the needle insertion angle increased (Figure 3j).
3.3. Imaging Depth—Phantom Experiment
With the blood-vessel-mimicking phantom, the vessel had a much more prominent appearance with PA imaging than with US imaging. The SNR of the PA images decreased with the depth at which the vessel was positioned in the chicken breast and also with the frame rate (Figure 4). The vessel was apparent with PA imaging at depths of up to 2.8 cm at a frame rate of 30 FPS; when the frame rate was decreased to 10 FPS, it was apparent at depths of up to 3.8 cm. At 1.5 FPS, the SNR of the US images was lower than that of the PA images at all depths. An image artefact of unknown origin (the horizontal line indicated in the images) was also apparent in both US and PA images. With the needle inserted towards the vessel, the visibility of both the needle and the vessel was low with US imaging (Figure 5a) and high with PA imaging (Figure 5b). A video obtained from real-time reconstruction as the needle was inserted into the vessel is provided in the Supplementary Materials (Video S1).
3.4. In Vivo Imaging
While imaging the human finger, high PA signals were observed from a two-layered subsurface structure (Figure 6b). With a real-time image display, pulsations were readily apparent (Supplementary Materials; Video S2), which suggests that the subsurface structure was a digital artery. Additional subsurface signals were observed, which likely corresponded to smaller vessels such as superficial veins (blue arrows). During imaging, the finger was slightly compressed, and this may be the reason why the veins were not visible as two-layered features. Visually, there was a fairly strong correspondence between the locations of the PA signals and the appearances of the US images. However, PA and US images appeared to provide distinct information; PA signals were not present at all locations where there were features in the US images and vice-versa.
With the human wrist, the PA image comprised multiple regions with high signal values (Figure 7b). These regions likely originated from superficial blood vessels. They corresponded well to anechoic regions on the US image (Figure 7c,d). Bulk tissue motion and localized subsurface motion were clearly visible in real-time with both PA and US imaging (Supplementary Materials; Video S3). Despite this motion and the use of signal averaging for PA imaging, there were no visible fluctuations in the PA signal amplitudes.
4. Discussion and Conclusions
For the clinical translation of PA imaging to guide peripheral minimally invasive procedures, it will likely be important to have compact excitation light sources and real-time imaging capabilities for compatibility with current workflow. In this study, for the first time, we demonstrated the feasibility of visualizing clinical needles and a vascular target with real-time PA and US imaging using an LED-based PA system. This paradigm could be useful for guiding peripheral intravenous line placement, particularly in difficult cases that arise from diseases such as diabetes, intravenous drug abuse, and sickle cell disease [45].
There are several distinct advantages of using LEDs as PA excitation sources. First, as LEDs and their drivers are widely used in consumer electronics, they can benefit from cost reductions that arise from high-volume manufacturing. Second, as the frequency content of PA signals is dependent on the pulse width of light used for excitation, the pulse width of the LED light could be tuned to optimize imaging for different tissue depths and for different clinical contexts. Finally, coded excitation could be used to increase the SNR of generated PA signals [35,40]. In future implementations of this PA system, LED arrays could be fabricated on highly curved surfaces for endoscopic or vascular applications.
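To illustrate the second point, the following sketch (Python; assuming an idealized rectangular optical pulse and neglecting the absorber geometry and the transducer response) compares the frequency content of 30 ns, 70 ns, and 100 ns pulses; shorter pulses shift energy towards higher ultrasonic frequencies.

import numpy as np

# Sketch: frequency content of idealized rectangular excitation pulses,
# neglecting the absorber geometry and the transducer response. The magnitude
# spectrum of a rectangular pulse of width T is proportional to |sinc(f*T)|
# (normalized sinc), with its first null at f = 1/T.
f = np.linspace(0.1e6, 40e6, 4000)                    # 0.1-40 MHz
for pulse_ns in (30, 70, 100):
    T = pulse_ns * 1e-9
    spectrum = np.abs(np.sinc(f * T))                 # np.sinc(x) = sin(pi*x)/(pi*x)
    f_6db = f[np.argmax(spectrum < 10 ** (-6 / 20))]  # first frequency below -6 dB
    print(f"{pulse_ns:4d} ns pulse: first null at {1e-6 / T:5.1f} MHz, "
          f"-6 dB point at ~{f_6db / 1e6:4.1f} MHz")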
There are several ways in which the system could be improved. LED arrays could be fabricated with multiple wavelengths to perform multispectral PA imaging and thereby to provide spectroscopic tissue specificity. This functionality could be useful to differentiate arteries and veins based on differences in hemoglobin absorption spectra [23,24] and also to identify nerves based on differences in water and lipid absorption spectra [23,25,26]. Depending on their origin, horizontal PA artefacts could be mitigated by using the PAFUSion technique reported by Kuniyil Ajith Singh et al. [46,47,48] or with singular value decomposition [49]. US image quality could be improved using a sequence of electronically focused transmissions in place of plane-wave transmissions. In this study, the choice of plane-wave US transmissions was made to retain the high frame rate (PA + US) using parallel acquisition and processing. Further improvements in US image quality could be obtained with angular compounding and postprocessing speckle-reduction algorithms. With the LED side arrays positioned at an angle to the imaging plane and extending in depth beyond the surface of the ultrasound probe, it was often challenging to achieve acoustic coupling with ultrasound gel during in vivo human imaging. Future versions could include compact co-axial excitation schemes [50].
It is encouraging that the system used in this study could visualize both needles and vessel targets at cm-scale depths, considering the low pulse energies provided by the LED arrays. These pulse energies (200 µJ for each array) are approximately two orders of magnitude lower than those used in similar studies with Q-switched Nd:YAG lasers (typically >10 mJ) [41,49]. To further increase the depth at which medical devices and tissue targets could be visualized, excitation light could be coupled into an optical fiber to deliver light from within the tissue, in addition to the surface-based illumination provided by the LED arrays [23,24,25,26]. A system in which excitation light is delivered both at the surface and through an interventional device could provide both a large field of view and a large imaging depth.
The shape and the orientation of the needle were clearly visualized with PA imaging; however, an unambiguous determination of the needle tip position with this approach may be challenging. In the future, ultrasonic tracking could be combined with the current system, with a fiber-optic ultrasound receiver or transmitter within the needle being used to communicate with an external ultrasound array [51,52,53,54,55,56] and to deliver excitation light into tissue.
With the capability of providing real-time visualization of clinical metal needles and tissue targets, the LED-based PA/US system used in this study could be useful for guiding minimally invasive procedures in many clinical contexts.
Acknowledgments
The authors gratefully acknowledge the software and hardware team of PreXion Corporation, Japan, for implementing the technical improvements required for these measurements.
Supplementary Materials
The following are available online at http://www.mdpi.com/1424-8220/18/5/1394/s1, Video S1: Real-time photoacoustic and ultrasound visualization of a needle as it is inserted into a vessel phantom, Video S2: Real-time photoacoustic and ultrasound imaging of the finger of a human volunteer, Video S3: Real-time photoacoustic and ultrasound visualization of the wrist of a human volunteer.
Author Contributions
W.X., M.K.A.S. and A.E.D. conceived the study and designed the experiments; M.K.A.S. and W.X. performed the experiments; W.X. and M.K.A.S. analyzed the data; E.M. prepared the phantoms for the experiments; N.S., Y.S. and T.A. contributed to the development of the system and the tools required for data acquisition and processing; W.X., M.K.A.S. and A.E.D. wrote the paper; S.O. provided scientific mentorship; S.J.W. provided clinical feedback throughout the project. All authors commented on the manuscript.
Funding
This research was funded by an Innovative Engineering for Health award by the Wellcome Trust (No. WT101957) and the Engineering and Physical Sciences Research Council (EPSRC) (No. NS/A000027/1), by a Starting Grant from the European Research Council (Grant No. ERC-2012-StG, Proposal 310970 MOPHIM), by an EPSRC First Grant (No. EP/J010952/1), by a Wellcome/EPSRC Centre award (203145Z/16/Z & NS/A000050/1), and by the UCL EPSRC Centre for Doctoral Training in Medical Imaging (EP/L016478/1).
Conflicts of Interest
The authors declare no conflict of interest.
References
- 1. Bluvol M.K., Bluvol N., Shaikh A., Kornecki A., Del Rey Fernandez D., Downey D., Fenster A. A needle guidance system for biopsy and therapy using two-dimensional ultrasound. Med. Phys. 2008;35:307–317. doi: 10.1118/1.2829871.
- 2. Randolph A.G., Cook D.J., Gonzales C.A., Pribble C.G. Ultrasound guidance for placement of central venous catheters: A meta-analysis of the literature. Crit. Care Med. 1996;24:2053–2058. doi: 10.1097/00003246-199612000-00020.
- 3. Narouze S.N., editor. Atlas of Ultrasound-Guided Procedures in Interventional Pain Management. Springer Science & Business Media; New York, NY, USA: 2010.
- 4. Rathmell J.P., Benzon H.T., Dreyfuss P., Huntoon M., Wallace M., Baker R., Riew K.D., Rosenquist R.W., Aprill C., Rost N.S., et al. Safeguards to Prevent Neurologic Complications after Epidural Steroid Injections: Consensus Opinions from a Multidisciplinary Working Group and National Organizations. Surv. Anesthesiol. 2016;60:85–86. doi: 10.1097/01.sa.0000480641.01172.93.
- 5. Zhou Y., Yao J., Wang L.V. Tutorial on photoacoustic tomography. J. Biomed. Opt. 2016;21:061007. doi: 10.1117/1.JBO.21.6.061007.
- 6. Beard P. Biomedical photoacoustic imaging. Interface Focus. 2011;1:602–631. doi: 10.1098/rsfs.2011.0028.
- 7. Kim C., Favazza C., Wang L.V. In vivo photoacoustic tomography of chemicals: High-resolution functional and molecular optical imaging at new depths. Chem. Rev. 2010;110:2756–2782. doi: 10.1021/cr900266s.
- 8. Ntziachristos V. Going deeper than microscopy: The optical imaging frontier in biology. Nat. Methods. 2010;7:603–614. doi: 10.1038/nmeth.1483.
- 9. Heijblom M., Piras D., van den Engh F.M., van der Schaaf M., Klaase J.M., Steenbergen W., Manohar S. The state of the art in breast imaging using the Twente Photoacoustic Mammoscope: Results from 31 measurements on malignancies. Eur. Radiol. 2016;26:3874–3887. doi: 10.1007/s00330-016-4240-7.
- 10. Xia W., Daniele P., Mithun K.A.S., van Hespen J.C.G., van Leeuwen T.G., Steenbergen W., Manohar S. Design and evaluation of a laboratory prototype system for 3D photoacoustic full breast tomography. Biomed. Opt. Express. 2013;4:2555–2569. doi: 10.1364/BOE.4.002555.
- 11. Xia W., Steenbergen W., Manohar S. Photoacoustic mammography: Prospects and promises. Breast Cancer Manag. 2014;3:387–390. doi: 10.2217/bmt.14.32.
- 12. Ermilov S.A., Khamapirad T., Conjusteau A., Leonard M.H., Lacewell R., Mehta K., Miller T., Oraevsky A.A. Laser optoacoustic imaging system for detection of breast cancer. J. Biomed. Opt. 2009;14:024007. doi: 10.1117/1.3086616.
- 13. Van Es P., Biswas S.K., Bernelot Moens H.J., Steenbergen W., Manohar S. Initial results of finger imaging using photoacoustic computed tomography. J. Biomed. Opt. 2014;19:060501. doi: 10.1117/1.JBO.19.6.060501.
- 14. Kruizinga P., van der Steen A.F., de Jong N., Springeling G., Robertus J.L., van der Lugt A., van Soest G. Photoacoustic imaging of carotid artery atherosclerosis. J. Biomed. Opt. 2014;19:110504. doi: 10.1117/1.JBO.19.11.110504.
- 15. Dean-Ben X.L., Razansky D. Adding fifth dimension to optoacoustic imaging: Volumetric time-resolved spectrally enriched tomography. Light Sci. Appl. 2014;3:e137. doi: 10.1038/lsa.2014.18.
- 16. Jathoul A.P., Laufer J., Ogunlade O., Treeby B., Cox B., Zhang E., Johnson P., Pizzey A.R., Philip B., Marafioti T., et al. Deep in vivo photoacoustic imaging of mammalian tissues using a tyrosinase-based genetic reporter. Nat. Photon. 2015;9:239–246. doi: 10.1038/nphoton.2015.22.
- 17. Xu G., Rajian J.R., Girish G., Kaplan M.J., Fowlkes J.B., Carson P.L., Wang X. Photoacoustic and ultrasound dual-modality imaging of human peripheral joints. J. Biomed. Opt. 2012;18:010502. doi: 10.1117/1.JBO.18.1.010502.
- 18. Singh M.K.A., Steenbergen W., Manohar S. Handheld Probe-Based Dual Mode Ultrasound/Photoacoustics for Biomedical Imaging. In: Frontiers in Biophotonics for Translational Medicine. Springer; Singapore: 2016; pp. 209–247.
- 19. Kim J., Park S., Jung Y., Chang S., Park J., Zhang Y., Lovell J.F., Kim C. Programmable real-time clinical photoacoustic and ultrasound imaging system. Sci. Rep. 2016;6:35137. doi: 10.1038/srep35137.
- 20. Park S., Jang J., Kim J., Kim Y.S., Kim C. Real-time Triple-modal Photoacoustic, Ultrasound, and Magnetic Resonance Fusion Imaging of Humans. IEEE Trans. Med. Imaging. 2017;36:1912–1921. doi: 10.1109/TMI.2017.2696038.
- 21. Kim C., Erpelding T.N., Maslov K., Jankovic L., Akers W.J., Song L., Achilefu S., Margenthaler J.A., Pashley M.D., Wang L.V. Handheld array-based photoacoustic probe for guiding needle biopsy of sentinel lymph nodes. J. Biomed. Opt. 2010;15:046010. doi: 10.1117/1.3469829.
- 22. Su J., Karpiouk A., Wang B., Emelianov S. Photoacoustic imaging of clinical metal needles in tissue. J. Biomed. Opt. 2010;15:021309. doi: 10.1117/1.3368686.
- 23. Xia W., Nikitichev D.I., Mari J.M., West S.J., Pratt R., David A.L., Ourselin S., Beard P.C., Desjardins A.E. Performance characteristics of an interventional multispectral photoacoustic imaging system for guiding minimally invasive procedures. J. Biomed. Opt. 2015;20:086005. doi: 10.1117/1.JBO.20.8.086005.
- 24. Xia W., Maneas E., Nikitichev D.I., Mosse C.A., Sato dos Santos G., Vercauteren T., David A.L., Deprest J., Ourselin S., Beard P.C., et al. Interventional photoacoustic imaging of the human placenta with ultrasonic tracking for minimally invasive fetal surgeries. In: Proceedings of the International Conference on Medical Image Computing and Computer-Assisted Intervention; Munich, Germany, 5–9 October 2015; pp. 371–378.
- 25. Mari J.-M., Xia W., West S.J., Desjardins A.E. Interventional multispectral photoacoustic imaging with a clinical ultrasound probe for discriminating nerves and tendons: An ex vivo pilot study. J. Biomed. Opt. 2015;20:110503. doi: 10.1117/1.JBO.20.11.110503.
- 26. Xia W., West S.J., Nikitichev D.I., Ourselin S., Beard P.C., Desjardins A.E. Interventional multispectral photoacoustic imaging with a clinical linear array ultrasound probe for guiding nerve blocks. Proc. SPIE. 2016;9708:97080C1–97080C6.
- 27. Piras D., Grijsen C., Schütte P., Steenbergen W., Manohar S. Photoacoustic needle: Minimally invasive guidance to biopsy. J. Biomed. Opt. 2013;18:070502. doi: 10.1117/1.JBO.18.7.070502.
- 28. Lediju Bell M.A., Kuo N.P., Song D.Y., Kang J.U., Boctor E.M. In vivo visualization of prostate brachytherapy seeds with photoacoustic imaging. J. Biomed. Opt. 2014;19:126011. doi: 10.1117/1.JBO.19.12.126011.
- 29. Lediju Bell M.A., Guo X., Song D.Y., Boctor E.M. Transurethral light delivery for prostate photoacoustic imaging. J. Biomed. Opt. 2015;20:036002. doi: 10.1117/1.JBO.20.3.036002.
- 30. Singh M.K.A., Parameshwarappa V., Hendriksen E., Steenbergen W., Manohar S. Identification and removal of reflection artifacts in minimally-invasive photoacoustic imaging for accurate visualization of brachytherapy seeds. Proc. SPIE. 2017;10064:100640G.
- 31. Eddins B., Lediju Bell M.A. Design of a multifiber light delivery system for photoacoustic-guided surgery. J. Biomed. Opt. 2017;22:041011. doi: 10.1117/1.JBO.22.4.041011.
- 32. Daoudi K., van den Berg P.J., Rabot O., Kohl A., Tisserand S., Brands P., Steenbergen W. Handheld probe integrating laser diode and ultrasound transducer array for ultrasound/photoacoustic dual modality imaging. Opt. Express. 2014;22:26365–26374. doi: 10.1364/OE.22.026365.
- 33. Sivasubramanian K., Pramanik M. High frame rate photoacoustic imaging at 7000 frames per second using clinical ultrasound system. Biomed. Opt. Express. 2016;7:312–323. doi: 10.1364/BOE.7.000312.
- 34. Allen T.J., Beard P.C. Light emitting diodes as an excitation source for biomedical photoacoustics. Proc. SPIE. 2013;8581:85811F.
- 35. Allen T.J., Beard P.C. High power visible light emitting diodes as pulsed excitation sources for biomedical photoacoustics. Biomed. Opt. Express. 2016;7:1260–1270. doi: 10.1364/BOE.7.001260.
- 36. Hansen R.S. Using high-power light emitting diodes for photoacoustic imaging. Proc. SPIE. 2011;7968:79680A.
- 37. Kolkman R.G.M., Steenbergen W., van Leeuwen T.G. In vivo photoacoustic imaging of blood vessels with a pulsed laser diode. Lasers Med. Sci. 2006;21:134–139. doi: 10.1007/s10103-006-0384-z.
- 38. Yoshitaro A., Tsutomu H. Photoacoustic Imaging with Multiple-Wavelength Light-Emitting Diodes. Jpn. J. Appl. Phys. 2013;52:07HB06.
- 39. Dai X., Yang H., Jiang H. In vivo photoacoustic imaging of vasculature with a low-cost miniature light emitting diode excitation. Opt. Lett. 2017;42:1456–1459. doi: 10.1364/OL.42.001456.
- 40. Alles E.J., Colchester R.J., Desjardins A.E. Adaptive Light Modulation for Improved Resolution and Efficiency in All-Optical Pulse-Echo Ultrasound. IEEE Trans. Ultrason. Ferroelectr. Freq. Control. 2016;63:83–90. doi: 10.1109/TUFFC.2015.2497465.
- 41. Wang L.V., Yao J. A practical guide to photoacoustic tomography in the life sciences. Nat. Methods. 2016;13:627–638. doi: 10.1038/nmeth.3925.
- 42. Jaeger M., Schupbach S., Gertsch A., Kitz M., Frenz M. Fourier reconstruction in optoacoustic imaging using truncated regularized inverse k-space interpolation. Inverse Probl. 2007;23:S51–S63. doi: 10.1088/0266-5611/23/6/S05.
- 43. Maneas E., Xia W., Nikitichev D.I., Daher B., Manimaran M., Wong R.Y.J., Chang C.W., Rahmani B., Capelli C., Schievano S., et al. Anatomically realistic ultrasound phantoms using gel wax with 3D printed moulds. Phys. Med. Biol. 2018;63:015033. doi: 10.1088/1361-6560/aa9e2c.
- 44. Maneas E., Xia W., Ogunlade O., Fonseca M., Nikitichev D.I., David A.L., West S.J., Ourselin S., Hebden J.C., Vercauteren T., et al. Gel wax-based tissue-mimicking phantoms for multispectral photoacoustic imaging. Biomed. Opt. Express. 2018;9:1151–1163. doi: 10.1364/BOE.9.001151.
- 45. Gottlieb M., Sundaram T., Holladay D., Nakitende D. Ultrasound-Guided Peripheral Intravenous Line Placement: A Narrative Review of Evidence-based Best Practices. West. J. Emerg. Med. 2017;18:1047–1054. doi: 10.5811/westjem.2017.7.34610.
- 46. Singh M.K.A., Steenbergen W. Photoacoustic-guided focused ultrasound (PAFUSion) for identifying reflection artifacts in photoacoustic imaging. Photoacoustics. 2015;3:123–131. doi: 10.1016/j.pacs.2015.09.001.
- 47. Singh M.K.A., Jaeger M., Frenz M., Steenbergen W. In vivo demonstration of reflection artifact reduction in photoacoustic imaging using synthetic aperture photoacoustic-guided focused ultrasound (PAFUSion). Biomed. Opt. Express. 2016;7:2955–2972. doi: 10.1364/BOE.7.002955.
- 48. Singh M.K.A., Jaeger M., Frenz M., Steenbergen W. Photoacoustic reflection artifact reduction using photoacoustic-guided focused ultrasound: Comparison between plane-wave and element-by-element synthetic backpropagation approach. Biomed. Opt. Express. 2017;8:2245–2260. doi: 10.1364/BOE.8.002245.
- 49. Hill E.R., Xia W., Clarkson M.J., Desjardins A.E. Identification and removal of laser-induced noise in photoacoustic imaging using singular value decomposition. Biomed. Opt. Express. 2017;8:68–77. doi: 10.1364/BOE.8.000068.
- 50. Li M., Liu C., Gong X., Zheng R., Bei Y., Xing M., Du X., Liu X., Zeng J., Lin R., et al. Linear array-based real-time photoacoustic imaging system with a compact coaxial excitation handheld probe for noninvasive sentinel lymph node mapping. Biomed. Opt. Express. 2018;9:1408–1422. doi: 10.1364/BOE.9.001408.
- 51. Xia W., Noimark S., Ourselin S., West S.J., Finlay M.C., David A.L., Desjardins A.E. Ultrasonic Needle Tracking with a Fibre-Optic Ultrasound Transmitter for Guidance of Minimally Invasive Fetal Surgery. In: Proceedings of the International Conference on Medical Image Computing and Computer-Assisted Intervention; Quebec City, QC, Canada, 10–14 September 2017; pp. 637–645.
- 52. Guo X., Guo X., Kang H.-J., Etienne-Cummings R., Boctor E.M. Active ultrasound pattern injection system (AUSPIS) for interventional tool guidance. PLoS ONE. 2014;9:e104262. doi: 10.1371/journal.pone.0104262.
- 53. Xia W., Mari J.M., West S.J., Ginsberg Y., David A.L., Ourselin S., Desjardins A.E. In-plane ultrasonic needle tracking using a fiber-optic hydrophone. Med. Phys. 2015;42:5983–5991. doi: 10.1118/1.4931418.
- 54. Xia W., Ginsberg Y., West S.J., Nikitichev D.I., Ourselin S., David A.L., Desjardins A.E. Coded excitation ultrasonic needle tracking: An in vivo study. Med. Phys. 2016;43:4065–4073. doi: 10.1118/1.4953205.
- 55. Xia W., West S.J., Mari J.M., Ourselin S., David A.L., Desjardins A.E. 3D ultrasonic needle tracking with a 1.5D transducer array for guidance of fetal interventions. In: Proceedings of the International Conference on Medical Image Computing and Computer-Assisted Intervention; Athens, Greece, 17–21 October 2016; pp. 353–361.
- 56. Xia W., Finlay M.C., Mari J.M., West S.J., Ourselin S., David A.L., Desjardins A.E. Looking beyond the imaging plane: 3D needle tracking with a linear array ultrasound probe. Sci. Rep. 2017;7:3674–3682. doi: 10.1038/s41598-017-03886-4.