Applied Physics Letters. 2013 Nov 11;103(20):203702. doi: 10.1063/1.4830045

In vivo virtual intraoperative surgical photoacoustic microscopy

Seunghoon Han,1 Changho Lee,2 Sehui Kim,1 Mansik Jeon,2 Jeehyun Kim,1 and Chulhong Kim2,3
PMCID: PMC3843748  PMID: 24343135

Abstract

We developed a virtual intraoperative surgical photoacoustic microscopy system by combining a commercial surgical microscope with a photoacoustic microscope (PAM). By sharing a common optical path between the microscope and the PAM system, we could acquire PAM and microscope images simultaneously. Moreover, by employing a beam projector to back-project 2D PAM images onto the microscope view plane as augmented reality, the conventional microscopic image and the 2D cross-sectional PAM image are concurrently mapped on the same plane via an ocular lens of the microscope in real time. Further, we guided needle insertion into a phantom ex vivo and mouse skin in vivo.


A surgical microscope has been a vital tool in operations since it was first used in otolaryngology.1, 2 In spite of continuous improvements in surgical microscope performance, it mainly provides enlarged surface images without sub-surface information. Because of this limitation, surgeons require significant experience and extensive training in current clinical practice. Thus, noninvasive visualization of sub-surface information can play an important role during ophthalmic, micro-vascular, and neuro-surgeries.3, 4, 5

Conventional optical microscopic techniques such as optical coherence tomography (OCT) and fluorescence microscopy (FM) have been applied for this purpose.6, 7, 8, 9, 10 OCT mainly provides microstructures of tissues based on optical scattering, while the penetration depth of FM is significantly shallow and FM requires exogenous contrast agents. However, real-time noninvasive mapping of microvasculature is crucial to verify bleeding regions, guide vascular reconnection, or delineate angiogenesis during surgery.11 For this purpose, we were motivated to develop a noninvasive surgical microscopy system that images microvasculature and surgical intervention beneath the skin surface in real time without any contrast agent.

Photoacoustic microscopy (PAM) is an emerging imaging technique that combines optical excitation and ultrasound detection.12, 13, 14 Because PAM can provide label-free optical absorption information in a noninvasive manner with high contrast and resolution, this modality has been widely used to image tumor angiogenesis, tumor metabolism, brain function, ocular structures, molecular information, etc.15, 16, 17, 18, 19, 20

In this letter, we report a near-real-time virtual intraoperative surgical photoacoustic microscope (VISPAM) developed by integrating PAM with a conventional surgical microscope. The VISPAM system can obtain, process, and display both PAM and microscopic images at the same time. The Hilbert-transformed cross-sectional PAM B-scan images were back-projected onto the microscope view plane as augmented reality through a home-made optical beam splitter, and the PAM images were then viewed via the ocular lens mounted on the microscope. Thus, no additional tabletop display is necessary, which is significantly convenient because surgeons do not need to shift their gaze during surgery. We demonstrated the feasibility of VISPAM in phantoms and live animals. Specifically, in the in vivo experiments, we delineated surrounding microvasculature and guided needle insertion simultaneously in near-real time (i.e., two PAM B-scan images per second).
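The Hilbert transform mentioned above extracts the envelope of each bipolar PA A-line before the B-scan is displayed. A minimal sketch of this envelope-detection step in Python (the array sizes are illustrative, not the system's actual sampling grid):

```python
import numpy as np
from scipy.signal import hilbert

def envelope_bscan(raw_bscan):
    """Envelope-detect a PA B-scan; each column is one A-line.

    The magnitude of the analytic signal turns the bipolar RF
    trace into a unipolar envelope suitable for grayscale display.
    """
    analytic = hilbert(raw_bscan, axis=0)  # Hilbert transform along depth
    return np.abs(analytic)

# Example: 200 A-lines of 1000 depth samples each (sizes illustrative)
rng = np.random.default_rng(0)
raw = rng.standard_normal((1000, 200))
env = envelope_bscan(raw)
assert env.shape == raw.shape and (env >= 0).all()
```

The per-column processing is what makes this step cheap enough for the near-real-time display rate reported here.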

First, we implemented an optical-resolution PAM (OR-PAM) system. Fig. 1a shows the schematic of the VISPAM system. A Q-switched diode-pumped solid-state laser (Elforlight, SPOT-10-200-532, 532 nm) with a repetition rate of 5 kHz was utilized as the laser source. The laser light was first attenuated by a neutral density filter (Thorlabs, NE02B) and divided by a beam splitter (Thorlabs, CM1-BP108). Eight percent of the light was detected by a photodiode (Thorlabs, SM05PD5A) to synchronize the galvo scanners and a data acquisition (DAQ) system. The rest of the light was used for PAM. Then, we developed a VISPAM probe by adapting a commercial surgical microscope (Carl Zeiss, OPMI). The VISPAM probe consisted of three main subsystems: (1) a display projector, (2) a beam splitting device, and (3) a PAM scanning unit, labeled "1," "2," and "3," respectively. Fig. 1b shows the corresponding photograph.

Figure 1.

(a) Schematic of the VISPAM system. (b) Photograph of the VISPAM probe. (c) Axial (left) and lateral (right) profiles of a carbon fiber (6 μm). (d) VISPAM B-scan image of a needle inserted obliquely into living mouse leg tissue. COM, computer; PD, photodiode; BS, beam splitter; NF, neutral density filter; AMP, amplifier; BP, beam projector; M, mirror; G, galvo scanner; OL, objective lens; UT, ultrasonic transducer; WT, water tank; and S, sample.

The display projector subsystem 1 consisted of a beam projector (Optoma, PR320) and two mirrors to steer the beam path. Subsystem 2 (i.e., the beam splitting device) projected the acquired PAM images back onto the field of view of the conventional microscope through the microscopic ocular lens. A custom-made mount attached to the commercial microscope held the beam splitter. The PAM scanning subsystem 3 consisted of two galvo scanners (Thorlabs, GVS001), an objective lens (Thorlabs, AC254-060-A), and a beam splitter (Edmund Optics, NT68-395). This beam splitter (40% reflection and 60% transmission) was designed to reflect the 532 nm beam (i.e., the PAM light) and transmit the visible light (i.e., the light for the optical microscope). Thus, 40% of the 532 nm beam was reflected onto the sample, while 60% of the visible light was transmitted through the beam splitter to the ocular lens. The two galvo scanners were used for optical scanning. The PA signals generated from the samples were detected by an unfocused ultrasonic transducer (Olympus NDT, V312, center frequency of 10 MHz). We used a small water tray for ultrasound coupling. The acquired PA signals were amplified by a pulser/receiver (Olympus NDT, 5072PR) and then digitized by a DAQ board. Typically, 200 PA A-line scans were used to form one PA B-scan image, and the image was displayed at 2 frames per second. The B-scan image was back-projected by the display projector subsystem onto the microscopic image plane via the ocular lens. The axial and lateral resolutions, measured by imaging a carbon fiber (6 μm) in water, were 131 and 17 μm, respectively, as shown in Fig. 1c. Fig. 1d indicates the maximum tissue penetration (∼900 μm) in a living mouse thigh, experimentally measured by inserting a needle obliquely.
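The reported parameters also fix the acquisition-limited frame budget: at a 5 kHz pulse repetition rate, 200 A-lines take 40 ms, so pure acquisition would permit up to 25 B-scans/s, well above the 2 frames per second actually displayed (processing and display dominate). A quick check of this arithmetic:

```python
# Back-of-the-envelope acquisition budget for the reported parameters:
# 5 kHz laser repetition rate, 200 A-lines per B-scan.
prf_hz = 5000           # laser pulse repetition frequency
alines_per_bscan = 200

bscan_time_s = alines_per_bscan / prf_hz   # 0.04 s of pure acquisition
max_bscan_rate = 1.0 / bscan_time_s        # 25 B-scans/s upper bound

print(f"acquisition-limited B-scan time: {bscan_time_s * 1e3:.0f} ms")
print(f"acquisition-limited frame rate: {max_bscan_rate:.0f} fps")
```

This gap between the 25 fps acquisition bound and the 2 fps display rate is consistent with the authors' closing remark that GPU processing could improve the display rate.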

To demonstrate the performance of VISPAM, we guided a needle toward four black hairs embedded in a gelatin phantom. The field of view (FOV) of the back-projected PA B-scan was 6 mm × 5 mm along the x and z axes, respectively. As shown in media 1, we guided the needle into the phantom containing the hairs. Figs. 2a, 2c, and 2e show the screen-shots acquired via the left ocular lens during the needle guidance. Figs. 2b, 2d are close-ups of the PAM B-scan images overlaid in Figs. 2a, 2c, respectively. The cross-sections of the hairs and needle are clearly visible in the PAM images. A PAM maximum amplitude projection (MAP) image is shown in Fig. 2f. The FOV of the PAM MAP image is 6 mm × 6 mm along the x and y axes, respectively, and it took approximately 2 min to acquire one MAP image. The needle and hairs are clearly mapped in the MAP image.
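A MAP image such as Fig. 2f collapses the envelope-detected volume along depth, keeping the strongest absorber at each lateral position. A minimal sketch (the volume shape and the synthetic absorber are hypothetical, chosen only to illustrate the projection):

```python
import numpy as np

def map_image(volume):
    """Maximum amplitude projection of a PA volume.

    volume: 3D array of envelope-detected PA amplitudes,
            shape (z, x, y) with z the depth axis.
    Returns a 2D x-y image holding each pixel's depth maximum.
    """
    return volume.max(axis=0)

# Illustrative volume: a bright absorber (e.g., a hair cross-section)
vol = np.zeros((100, 60, 60))
vol[40, 10:20, 30] = 1.0
map_img = map_image(vol)
assert map_img.shape == (60, 60)
assert map_img[15, 30] == 1.0
```

Because the projection discards depth, structures such as the needle and the hairs appear in a single en-face view, which is why the MAP complements the cross-sectional B-scans.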

Figure 2.

Media 1: video file (1.7 MB, avi)

Real-time VISPAM in phantoms. (a), (c) Screen-shots of the overlaid PAM and surgical microscopic images obtained via the left ocular lens during needle intervention toward four black hairs embedded in a gelatin phantom. (b), (d) Enlarged PAM B-scan images extracted from (a) and (c), respectively. (e), (f) The screen-shot and PAM MAP image of the hairs and needle in the phantom, respectively. H, hair and N, needle (enhanced online).

To investigate the feasibility of VISPAM in vivo, we guided needle insertion into mouse skin. All animal experimental procedures followed the laboratory animal protocol approved by the institutional animal care and use committee. A healthy BALB/c mouse weighing ∼20 g was used in the animal experiments. The mouse was anesthetized with an intraperitoneal injection of a mixture of xylazine (15 mg/kg) and ketamine (85 mg/kg body weight). After removing the hair on the left thigh, the mouse was placed on a homemade animal holder. As shown in media 2, we inserted the needle into the mouse thigh while monitoring the procedure with the VISPAM probe. A photograph of the mouse and needle insertion is shown in Fig. 3e. The laser pulse fluence on the mouse skin was less than 17 mJ/cm2, below the American National Standards Institute (ANSI) safety limit (20 mJ/cm2). Figs. 3a, 3c show the screen-shots captured via the left ocular lens during the needle intervention. Figs. 3b, 3d are the magnified PAM B-scan images extracted from Figs. 3a, 3c, respectively. The cross-sections of the blood vessels and needle are clearly visible in the images. The needle and surrounding vasculature are clearly visualized in the PAM MAP image, which was averaged five times to enhance the signal-to-noise ratio, as shown in Fig. 3f.
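The safety comparison above is a per-pulse surface fluence check: pulse energy divided by the illuminated spot area, compared against the ANSI limit. A sketch of that calculation (the pulse energy and spot diameter below are hypothetical, chosen only to illustrate the check; the paper reports the delivered fluence stayed below 17 mJ/cm2):

```python
import math

def fluence_mj_per_cm2(pulse_energy_mj, spot_diameter_cm):
    """Surface fluence of one laser pulse over a circular spot."""
    area_cm2 = math.pi * (spot_diameter_cm / 2) ** 2
    return pulse_energy_mj / area_cm2

ANSI_LIMIT = 20.0  # mJ/cm^2, ANSI single-pulse limit cited for 532 nm

# Hypothetical numbers for illustration only
f = fluence_mj_per_cm2(pulse_energy_mj=0.03, spot_diameter_cm=0.05)
print(f"fluence: {f:.1f} mJ/cm^2, within ANSI limit: {f < ANSI_LIMIT}")
```

In practice the fluence is evaluated at the skin surface, where the focused beam is still comparatively wide, which is why sub-millijoule pulses can remain under the limit.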

Figure 3.

Media 2: video file (1.8 MB, avi)

Real-time VISPAM in vivo. (a), (c) Screen-shots, acquired via the left ocular lens, of the overlaid PAM and surgical microscopy images presenting microvasculature in the left thigh of the mouse and the needle insertion. (b), (d) Magnified PAM B-scan images cut from (a) and (c), respectively. (e) Photographs of the mouse (inset) and the mouse skin with needle insertion. (f) The PAM MAP image showing the needle inserted into the skin and the surrounding blood vessels. V, vessel and N, needle (enhanced online).

In summary, we developed a VISPAM system by fusing PAM and a conventional surgical microscope. Unlike previously developed advanced surgical microscopy, VISPAM can potentially guide surgical procedures and visualize surrounding microvasculature simultaneously in real time without any injection of contrast agents. By employing augmented reality, we could project the PAM B-scan images onto the microscopy view plane. Thus, no extra PAM display is required, which can significantly enhance clinical translation. In the future, three issues need to be addressed: (1) noncontact detection of photoacoustic signals to remove water or ultrasound gel coupling,21 (2) improvement of the image display rate using a graphics processing unit,22 and (3) the use of an invisible near-infrared laser source to minimize surgical disturbance from the visible PA laser light.

Acknowledgments

This work was supported in part by the IT Consilience Creative Program of MKE and NIPA (Grant No. C1515-1121-0003) and an NRF grant of the Korea government (MSIP) (Grant No. 2011-0030075) to C.K. This work was also supported in part by a grant from the Korea Healthcare Technology R&D Project, Ministry for Health, Welfare & Family Affairs (Grant No. A102024-1011-0000200) and the NIH (Grant No. 201225940000) to J.K.

References

  1. Mudry A., Am. J. Otol. 21, 877 (2000); http://www.ncbi.nlm.nih.gov/pubmed/11078079.
  2. Schultheiss D. and Denil J., Andrologia 34, 234 (2002). 10.1046/j.1439-0272.2002.00499.x
  3. Boppart S. A., Brezinski M. E., Pitris C., and Fujimoto J. G., Neurosurgery 43, 834 (1998). 10.1097/00006123-199810000-00068
  4. Huttmann G., Lankenau E., Schulz-Wackerbarth C., Muller M., Steven P., and Birngruber R., Klin. Monatsbl. Augenheilkd. 226, 958 (2009). 10.1055/s-0028-1109939
  5. Geerling G., Muller M., Winter C., Hoerauf H., Oelckers S., Laqua H., and Birngruber R., Arch. Ophthalmol. (Chicago) 123, 253 (2005). 10.1001/archopht.123.2.253
  6. Ehlers J. P., Tao Y. K., Farsiu S., Maldonado R., Izatt J. A., and Toth C. A., Invest. Ophthalmol. Visual Sci. 52, 3153 (2011). 10.1167/iovs.10-6720
  7. van Dam G. M., Themelis G., Crane L. M., Harlaar N. J., Pleijhuis R. G., Kelder W., Sarantopoulos A., de Jong J. S., Arts H. J., van der Zee A. G. et al., Nat. Med. 17, 1315 (2011). 10.1038/nm.2472
  8. Tao Y. K. K., Ehlers J. P., Toth C. A., and Izatt J. A., Opt. Lett. 35, 3315 (2010). 10.1364/OL.35.003315
  9. Matsui A., Lee B. T., Winer J. H., Vooght C. S., Laurence R. G., and Frangioni J. V., Plast. Reconstr. Surg. 123, 125e (2009). 10.1097/PRS.0b013e31819a3617
  10. Ehlers J. P., Tao Y. K., Farsiu S., Maldonado R., Izatt J. A., and Toth C. A., Retina-J. Ret. Vit. Dis. 33, 232 (2013). 10.1097/IAE.0b013e31826e86f5
  11. McDonald D. M. and Choyke P. L., Nat. Med. 9, 713 (2003). 10.1038/nm0603-713
  12. Maslov K., Stoica G., and Wang L. V., Opt. Lett. 30, 625 (2005). 10.1364/OL.30.000625
  13. Hu S., Maslov K., and Wang L. H. V., Med. Phys. 36, 2320 (2009). 10.1118/1.3137572
  14. Kim C., Favazza C., and Wang L. H. V., Chem. Rev. 110, 2756 (2010). 10.1021/cr900266s
  15. Wang X. D., Xie X. Y., Ku G. N., and Wang L. H. V., J. Biomed. Opt. 11, 024015 (2006). 10.1117/1.2192804
  16. de la Zerda A., Paulus Y. M., Teed R., Bodapati S., Dollberg Y., Khuri-Yakub B. T., Blumenkranz M. S., Moshfeghi D. M., and Gambhir S. S., Opt. Lett. 35, 270 (2010). 10.1364/OL.35.000270
  17. Kim C., Cho E. C., Chen J., Song K. H., Au L., Favazza C., Zhang Q., Cobley C. M., Gao F., Xia Y. et al., ACS Nano 4, 4559 (2010). 10.1021/nn100736c
  18. Deng Z., Wang Z., Yang X., Luo Q., and Gong H., J. Biomed. Opt. 17, 081415 (2012). 10.1117/1.JBO.17.8.081415
  19. Liu X., Lee C., Law W., Zhu D., Liu M., Jeon M., Kim J., Prasad P. N., Kim C., and Swihart M. T., Nano Lett. 13, 4333 (2013). 10.1021/nl402124h
  20. Yao J. J., Xia J., Maslov K. I., Nasiriavanaki M., Tsytsarev V., Demchenko A. V., and Wang L. V., NeuroImage 64, 257 (2013). 10.1016/j.neuroimage.2012.08.054
  21. Wang Y., Li C. H., and Wang R. K., Opt. Lett. 36, 3975 (2011). 10.1364/OL.36.003975
  22. Jeong H., Cho N. H., Jung U., Lee C., Kim J. Y., and Kim J., Sensors (Basel) 12, 6920 (2012). 10.3390/s120606920
