Abstract
Existing streak-camera-based two-dimensional (2D) ultrafast imaging techniques are limited by long acquisition times, the trade-off between spatial and temporal resolutions, and a reduced field of view. They also require additional components, customization, or active illumination. Here, we develop compressed ultrafast tomographic imaging (CUTI), which passively records 2D transient events with a standard streak camera. By grafting the concept of computed tomography onto the spatiotemporal domain, the operations of temporal shearing and spatiotemporal integration in a streak camera’s data acquisition can be equivalently expressed as the spatiotemporal projection of an (x, y, t) datacube from a certain angle. Aided by a new compressed-sensing reconstruction algorithm, the 2D transient event can be accurately recovered from a few measurements. CUTI is demonstrated as a new imaging mode universally adaptable to most streak cameras. Implemented in an image-converter streak camera, CUTI captures the sequential arrival of two spatially modulated ultrashort ultraviolet laser pulses at 0.5 trillion frames per second. Applied to a rotating-mirror streak camera, CUTI records an animation of fast-bouncing balls at 5 thousand frames per second.
Ultrafast optical imaging is indispensable to numerous studies in physics, chemistry, and biology [1]. Advances in optoelectronic instrumentation propel ultrafast optical imaging with higher imaging speeds, higher sensitivity, and broader operating spectra [2]. Among existing devices, streak cameras are popularly used to passively record dynamic events [3]. By using a temporal shearing unit to convert the time of arrival to spatial deflection, commercially available streak cameras can directly measure transient optical signals with a temporal resolution of down to hundreds of femtoseconds [4]. Streak cameras have found widespread applications, including inertial confinement fusion [5], characterization of laser filaments [6], and fluorescence lifetime imaging [7].
Technical specifications of streak cameras have been greatly improved in recent years. Multiple sweep ranges with timespans from picoseconds to milliseconds are commonly implemented. Readout cameras with several millions of pixels have become standard configurations. Both advances endow streak cameras with multi-scale imaging speeds from thousands of frames per second (kfps) to trillions of frames per second (Tfps) [2]. Moreover, progress on new materials for photocathodes has expanded the spectral range of streak cameras from the far-infrared region to visible, ultraviolet (UV), and even x-ray wavelengths [7–10]. Finally, to circumvent the space-charge effect in electron imaging and the Coulomb repulsive force in electron-photon conversion, innovative optical and electron imaging approaches have been implemented in designing novel temporal shearing units. The resultant new types of streak cameras have enhanced signal-to-noise ratios in the acquired images [11] and have improved temporal resolutions in recovered movies [12].
Despite this remarkable progress, in their conventional operation, streak cameras are still restricted to one-dimensional (1D) imaging. Due to the time-to-space mapping in the temporal shearing operation, temporal information occupies one spatial axis on the two-dimensional (2D) readout camera. To avoid spatiotemporal ambiguity, spatial information can only be recorded on the other spatial axis. Thus, a narrow entrance slit (typically 50–100 μm wide) is added to restrict ultrafast imaging to a 1D field of view (FOV).
To overcome this limitation, many 2D streak imaging approaches have been developed. Existing methods can be generally divided into the multiple-shot and single-shot categories. For the former, the (x, y, t) information is acquired by combining the conventional operation of streak cameras with a scanning operation in the spatial dimension orthogonal to that of the entrance slit [13]. Although retaining the intrinsic contrast and resolutions of streak cameras, this approach requires a large number of measurements along the scanning direction to synthesize the (x, y, t) datacube. Alternatively, the (x, y, t) information can be obtained in a single measurement by combining streak imaging with other advanced imaging strategies. For example, by implementing a compressed sensing (CS) paradigm in a streak camera, compressed ultrafast photography (CUP) records a transient event compressively into a 2D snapshot and subsequently leverages the prior knowledge of the imaging model and the spatiotemporal sparsity of the scene to retrieve the (x, y, t) datacube [14]. In two other examples, a 2D-to-1D fiber array is used to map a 2D FOV to a line [15], or a tilted lenslet array is implemented to generate several replicas of the dynamic scene at different heights [16]. Both methods reduce the imaging dimension to accommodate the streak camera’s conventional operation so that a 2D ultrafast movie is recovered by allocating pixels in the acquired single streak image to the correct spatiotemporal positions in the (x, y, t) datacube. Despite contributing to many new studies [17–20], these methods reduce the spatial and/or temporal resolutions due to spatial encoding [14] or the FOV due to focal plane division [15,16]. Meanwhile, additional and customized components are either added in front of or inserted into the streak cameras to enable these operations. The increased system complexity may limit the application scope of these techniques [9].
Tomographic imaging is a promising method to surmount the limitations in existing 2D streak imaging techniques. Although initially developed for recording (x, y, z) information, many tomographic imaging techniques have been adapted to record spatiotemporal (i.e., x, y, t) information [21,22]. In a typical configuration, multiple identical ultrashort pulses [21] or a spatially chirped pulse [22] probes transient events. The transmitted light is measured by spectral interferometry to obtain the angular projections, which are fed to reconstruction algorithms to recover a movie with imaging speeds up to Tfps [22]. However, relying on active laser illumination, existing ultrafast tomographic imaging techniques are not applicable to imaging self-luminescent and color-selective dynamic events [23].
Here, we overcome these problems by developing compressed ultrafast tomographic imaging (CUTI). Grafting the principle of computed tomography onto the spatiotemporal domain, CUTI uses temporal shearing and spatiotemporal integration to equivalently perform passive projections of a transient event. By leveraging the multiple sweep ranges readily available in a standard streak camera and a new CS-based reconstruction algorithm, the (x, y, t) datacube of the transient event can be accurately recovered using a few streak images.
The operating principle of CUTI is shown in Fig. 1. A repeatable dynamic scene, I(x, y, t), is directly imaged by a standard streak camera for a total of N times. The entrance port of the streak camera is wide-open to retain 2D spatial (i.e., x, y) information at each time point. In the ith acquisition (i = 1, 2, … , N), a shearing velocity, denoted by vi, is used by the temporal shearing unit. This process is denoted by the temporal shearing operator Si. Afterward, a 2D readout camera records the data by spatially integrating over each pixel and temporally integrating over the exposure time to form a snapshot Ei. This process is denoted by the spatiotemporal integration operator T. Overall, the forward model of CUTI is described as
E = TSI(x, y, t),  (1)
Fig. 1.
Operating principle of compressed ultrafast tomographic imaging (CUTI). TTR, TwIST-based tomographic reconstruction. Inset in the dashed box: Illustration of the equivalent spatiotemporal projections in data acquisition.
where E = [E1, E2, … , EN]T and S = [S1, S2, … , SN]T. This forward model equivalently describes the different passive spatiotemporal projections of I(x, y, t) in the y-t domain (see the inset in the dashed box in Fig. 1). The angle of the ith projection is θi = tan−1(vi/|vmax|), where |vmax| is the maximum shearing speed of the streak camera. Thus, θi ∈ [−45°, +45°] (see detailed derivation in Supplement 1).
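To make the equivalence concrete, the shearing-then-integration acquisition and the projection-angle formula can be sketched in a few lines of Python. This is a toy model with our own function names, assuming the shear advances by a whole number of pixels per frame; it is not the camera's actual electron-optical model:

```python
import numpy as np

def streak_acquisition(I, shear_px_per_frame):
    """Temporal shearing S_i followed by spatiotemporal integration T.

    I: (Nt, Ny, Nx) datacube of the dynamic scene.
    shear_px_per_frame: vertical deflection accumulated per frame (pixels),
        proportional to the shearing velocity v_i.
    Returns the 2D streak image E_i.
    """
    Nt, Ny, Nx = I.shape
    # Pad along y so every sheared frame lands inside the detector.
    pad = int(np.ceil(abs(shear_px_per_frame) * max(Nt - 1, 1)))
    E = np.zeros((Ny + pad, Nx))
    base = 0 if shear_px_per_frame >= 0 else pad
    for t in range(Nt):
        s = base + int(round(shear_px_per_frame * t))
        E[s:s + Ny, :] += I[t]  # deflect frame t, then integrate on the sensor
    return E

def projection_angle_deg(v_i, v_max):
    """Equivalent projection angle theta_i = arctan(v_i / |v_max|) in degrees."""
    return np.degrees(np.arctan(v_i / abs(v_max)))
```

With v_i = ±|v_max|, the projection angle reaches ±45°, matching the range θi ∈ [−45°, +45°] stated above; v_i = 0 corresponds to a plain time-integrated photograph.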
After data acquisition, I(x, y, t) is reconstructed by a new algorithm developed based on the framework of sparse-view computed tomography and the two-step iterative shrinkage/thresholding (TwIST) algorithm [24]. In this TwIST-based tomographic reconstruction (TTR) algorithm, starting from an initialization, I(x, y, t) is recovered by solving the optimization problem
Î = argmin_I { (1/2)‖E − TSI‖²₂ + τΦ_TV(I) },  (2)
where τ is the regularization parameter, and Φ_TV(·) is the three-dimensional total-variation (TV) regularization function [14]. The reconstructed datacube has a sequence depth (i.e., the number of frames) of Nt = r·ts, where ts is the sweep time, r = |vmax|/pc denotes the imaging speed of CUTI, and pc is the pixel size of the readout camera. Each frame in the datacube has an (x, y) frame size of Nx ≤ Nh and Ny ≤ Nv − Nt + 1 pixels. Here, Nh and Nv are the horizontal and vertical pixel counts of the readout camera. Thus, CUTI is conceptually different from existing snapshot compressive imaging techniques [25–28] (see details in Supplement 1 and Visualization S1).
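The published TTR algorithm combines TwIST with the 3D TV prior of Eq. (2). As a rough numerical illustration of the same inverse problem only, the sketch below models S and T with circular shifts and sums, and replaces the non-smooth TV prior with a smooth quadratic penalty minimized by plain gradient descent — a simplified stand-in under our own assumptions, not the authors' algorithm:

```python
import numpy as np

def shear_project(I, v):
    """Toy forward operator for one shearing velocity: circularly shift
    frame t by round(v*t) pixels along y, then sum over t."""
    return sum(np.roll(I[t], int(round(v * t)), axis=0) for t in range(I.shape[0]))

def shear_adjoint(E, v, Nt):
    """Exact adjoint of shear_project: un-shift the streak image to each frame."""
    return np.stack([np.roll(E, -int(round(v * t)), axis=0) for t in range(Nt)])

def reconstruct(Es, vs, Nt, tau=1e-3, iters=200):
    """Gradient descent on 0.5*sum_i ||S_i I - E_i||^2 + tau*||grad I||^2.

    A smooth Tikhonov penalty stands in for the TV prior of the actual
    TTR algorithm, and plain gradient descent stands in for TwIST.
    """
    Ny, Nx = Es[0].shape
    I = np.zeros((Nt, Ny, Nx))
    step = 0.9 / (len(vs) * Nt)  # conservative bound on the data-term Lipschitz constant
    for _ in range(iters):
        grad = np.zeros_like(I)
        for E, v in zip(Es, vs):
            grad += shear_adjoint(shear_project(I, v) - E, v, Nt)
        # Gradient of the smoothness penalty: -2*tau times the 3D Laplacian.
        lap = sum(np.roll(I, 1, a) + np.roll(I, -1, a) - 2 * I for a in range(3))
        I -= step * (grad - 2 * tau * lap)
    return I
```

Even this quadratic surrogate reduces the data residual with a handful of projection angles; the non-smooth TV prior in Eq. (2) is what additionally preserves sharp spatiotemporal edges under sparse-view conditions.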
We demonstrated the feasibility of CUTI by simulating a dynamic jellyfish scene with the size of Nx × Ny × Nt = 512 × 512 × 80 pixels. Five projections (θi = 0°, ±22.5°, and ±45°) were applied to this scene according to the forward model [i.e., Eq. (1)]. All the projected images were input to the TTR algorithm (with τ = 0.0059) for image reconstruction. For comparison, the algorithms of back projection (BP), recursive spatially adaptive filtering (RSAF) [29], and least squares (LSQR) [30] were also used for image reconstruction.
The results from all four algorithms are compared with the ground truth (GT) in Fig. 2 and Visualization 1. Four representative frames and local zoomed-in views, as well as the peak signal-to-noise ratio (PSNR) of each frame, are shown in Figs. 2(a)–(b). These results show the superior performance of TTR. In particular, TTR can recover more spatial details than BP and RSAF and has fewer artifacts than LSQR. In addition, to analyze the relationship between the reconstructed image quality and the number of projections, we reconstructed twelve datacubes with the number of projections ranging from 1 to 35, with the projection angles (i.e., θi) uniformly distributed from −45° to +45° [Fig. 2(c)]. The normalized average correlation coefficient (NACC) between each reconstructed datacube and the GT was calculated. This result verifies that a reconstruction of good quality (i.e., NACC ≥ 0.70) can be achieved in CUTI with ~5 projections.
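These evaluation metrics can be computed generically in a few lines. The exact normalization of the NACC is specified in the paper's supplement, so the frame-averaged Pearson correlation below is only one plausible reading, with function names of our own:

```python
import numpy as np

def psnr(recon, gt, peak=None):
    """Peak signal-to-noise ratio in dB; peak defaults to the GT maximum."""
    peak = gt.max() if peak is None else peak
    mse = np.mean((recon - gt) ** 2)
    return 10.0 * np.log10(peak ** 2 / mse)

def avg_corr(recon, gt):
    """Frame-averaged Pearson correlation between two (Nt, Ny, Nx) datacubes."""
    return float(np.mean([np.corrcoef(r.ravel(), g.ravel())[0, 1]
                          for r, g in zip(recon, gt)]))
```

A per-frame PSNR curve, as plotted in Fig. 2(b), follows by applying psnr to each frame pair instead of the whole cube.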
Fig. 2.
Comparison of the ground truth (GT) with the reconstruction using four algorithms. (a) Four representative frames of the GT and the reconstructions by back projection (BP), recursive spatially adaptive filtering (RSAF), least squares (LSQR), and TTR. Last column: Zoomed-in views of a local feature in the 80th frame (marked by the cyan dashed box). (b) Peak signal-to-noise ratio (PSNR) of the reconstructions with the reconstructed results in (a). (c) Normalized average correlation coefficient (NACC) of TTR’s reconstruction with different numbers of projections. Error bar: standard deviation.
To demonstrate CUTI with an image-converter streak camera, we imaged a dynamic UV scene [Fig. 3(a)]. In particular, a 266-nm, 100-fs laser pulse was split into two arms by a beam splitter. In each arm, the laser pulse was retro-reflected by a mirror. A manual translation stage was added into one arm to generate a 1.6-ns time delay. The mirror M2 was slightly tilted with respect to the normal of the incident beam to generate a lateral shift to the reflected pulse. These two spatially and temporally separated UV pulses then transmitted through a resolution target containing the patterns shown in the inset in Fig. 3(a).
Fig. 3.
Demonstration of CUTI using an image-converter streak camera. (a) Experimental setup. Magenta-boxed inset: The reference image captured without using temporal shearing. (b) Representative frames of the reconstructed scene. (c) Selected cross-sections of the resolution target in the x-direction (at y = 2.2 mm, green dash-dotted line) and in the y-direction (at x = 3.6 mm, blue dashed line) at t = 150 ps. The corresponding curves from the reference image are shown as magenta solid lines. (d) As (c), but showing the profiles in the x-direction (at y = 2.2 mm, blue dash-dotted line) and in the y-direction (at x = 5.5 mm, green dashed line) at t = 1746 ps. (e) Temporal trace of this event. FWHM, full width at half maximum.
We imaged this transient event using a standard UV streak camera (AXIS-2DX-Pd, Axis Photonique), which has |vmax| = 10 μm/ps, ts = 2.8 ns, and pc = 20 μm. The intrinsic spatial and temporal resolutions of this streak camera are 22.5 lp/mm and 6 ps, respectively. In this configuration, CUTI had an imaging speed of r = 0.5 Tfps, a sequence depth of Nt = 1400 frames, and a frame size of Nx × Ny = 1024 × 1024 pixels. Eleven projections were acquired using θi ∈ [−45°, +45°] with a 9° angular step. The regularization parameter was set to τ = 0.0204. The reconstructed movie is shown in Visualization 2. Six representative frames are presented in Fig. 3(b), showing two sequentially arriving laser pulses whose spatial profiles are modulated by the resolution target. To quantitatively analyze the reconstructed image quality, we extracted selected cross-sections in the first pulse (at 150 ps) and the second pulse (at 1746 ps), as shown in Figs. 3(c)–(d). These results were also compared with the reference image captured without introducing temporal shearing. Using the 10% contrast as the criterion, at t = 150 ps, the spatial resolutions were determined to be 15.6 and 14.1 lp/mm in the x- and y-directions, respectively. At t = 1746 ps, the values were 13.2 and 14.1 lp/mm. We also analyzed the reconstructed temporal trace of this event [Fig. 3(e)]. The time delay between the two pulses is 1596 ps, which is consistent with the pre-set value. The full widths at half maximum of these pulses are 8 ps and 10 ps, respectively. This temporal broadening results jointly from the limited range of projection angles, the limited number of projections, and the intrinsic temporal resolution of the streak camera. It is also noted that both the spatial resolution and the temporal accuracy of the second pulse are slightly degraded, which is attributed to the stronger image distortion at the bottom of the streak image (further explained in Supplement 1).
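The quoted acquisition parameters follow directly from r = |vmax|/pc and Nt = r·ts; a short calculation confirms them (the function is ours; the only requirement is that the two arguments share consistent distance and time units):

```python
def cuti_parameters(v_max, p_c, t_s):
    """Imaging speed r = |v_max|/p_c and sequence depth Nt = r*t_s.

    v_max and p_c share a distance unit; t_s uses the time unit of v_max.
    """
    r = abs(v_max) / p_c          # frames per unit time
    Nt = int(round(r * t_s))      # sequence depth (number of frames)
    return r, Nt

# Image-converter camera: |v_max| = 10 um/ps, p_c = 20 um, t_s = 2.8 ns
r, Nt = cuti_parameters(10.0, 20.0, 2800.0)   # r = 0.5 frames/ps (0.5 Tfps), Nt = 1400

# Rotating-mirror camera: |v_max| = 29.0 um/ms, p_c = 5.8 um, t_s = 50 ms
r2, Nt2 = cuti_parameters(29.0, 5.8, 50.0)    # r2 ≈ 5 frames/ms (5 kfps), Nt2 = 250
```

The same two formulas thus scale CUTI across many orders of magnitude in imaging speed with no change to the reconstruction pipeline.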
To demonstrate CUTI with a rotating-mirror streak camera, we imaged fast-moving ball patterns at 5 kfps. The (x, y, t) datacube of this scene was repeatedly displayed by a digital micromirror device (DMD, AJD-4500, Ajile Light Industries). A collimated continuous-wave laser beam shone onto the DMD at an incident angle of ~24°. The light diffracted by the patterns was captured by a rotating-mirror streak camera built in-house, which uses a galvanometer scanner (6220H, Cambridge Technology) for temporal shearing and an electron-multiplying CCD camera (HNü 1024, Nüvü Camēras) for spatiotemporal integration [Fig. 4(a)]. The pre-set parameters were |vmax| = 29.0 μm/ms, ts = 50 ms, and pc = 5.8 μm. Thus, CUTI operated at r = 5 kfps with a sequence depth of Nt = 250 frames and a frame size of Nx × Ny = 512 × 512 pixels. Fifteen projections (θi ∈ [−45°, +45°] with a 6.4° angular step) were recorded in this experiment, and the regularization parameter was set to τ = 0.0463. The reconstructed movie is shown in Visualization 3, and Fig. 4(b) presents selected frames of the GT and the TTR-reconstructed results. To evaluate CUTI’s performance, we calculated the PSNR and structural similarity index measure (SSIM) of TTR’s output [Fig. 4(c)]. Moreover, the centroids of the two balls [labeled B1 and B2 in the first panel of Fig. 4(b)] were traced [Fig. 4(d)]. The root-mean-square errors of the reconstructed centroids along the x- and y-directions were calculated to be 19.18 and 19.16 μm for B1, and 18.36 and 18.03 μm for B2, respectively. These analyses confirm CUTI’s feasibility with rotating-mirror streak cameras.
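The centroid-tracing analysis can be reproduced with an intensity-weighted centroid and a per-axis root-mean-square error. This is a generic sketch with our own function names; the pixel size is passed as an argument:

```python
import numpy as np

def centroid(frame, px):
    """Intensity-weighted centroid of one frame, in physical units.

    px: pixel size (e.g., micrometers per pixel).
    Returns (x, y) coordinates.
    """
    ys, xs = np.indices(frame.shape)
    m = frame.sum()
    return px * (xs * frame).sum() / m, px * (ys * frame).sum() / m

def rmse(recon_coords, gt_coords):
    """Root-mean-square error between two coordinate traces."""
    a, b = np.asarray(recon_coords), np.asarray(gt_coords)
    return float(np.sqrt(np.mean((a - b) ** 2)))
```

Applying rmse separately to the x and y centroid traces of each ball yields the per-axis errors quoted above, which at about 19 μm correspond to roughly three pixels at pc = 5.8 μm.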
Fig. 4.
Demonstration of CUTI using a rotating-mirror streak camera. (a) Experimental setup. Lens 1 (AC508-100, Thorlabs); Lenses 2 and 3 (AC508-075, Thorlabs). (b) Selected frames of the GT (top row) and TTR’s reconstruction with 15 projections (bottom row). (c) PSNR and structural similarity index measure (SSIM) of TTR’s reconstructions. (d) Tracing the centroids of B1 and B2 [marked in the first panel of (b)].
We have developed CUTI, which synergizes streak imaging, tomographic imaging, and CS. By implementing the concept of tomography in the spatiotemporal domain, CUTI passively records spatiotemporal projections over an angular range from −45° to +45°. The acquired projections are processed by the newly developed TTR algorithm to accurately recover the dynamic scene. CUTI is a receive-only tomographic imaging paradigm with scalable imaging speeds. Implemented in an image-converter streak camera, CUTI captured the sequential arrival of two spatially modulated UV pulses at 0.5 Tfps with a datacube of 1024 × 1024 × 1400 (x, y, t) pixels in size. Applied to a rotating-mirror streak camera, CUTI imaged fast-moving ball patterns at 5 kfps with a datacube size of 512 × 512 × 250 (x, y, t) pixels. As a universal scheme, CUTI can be readily applied to streak cameras without any hardware modification. Compared to scanning-based multiple-shot 2D streak imaging approaches, CUTI largely reduces the data acquisition time. Compared to single-shot methods, CUTI eliminates the trade-off between the streak camera’s spatial resolution or FOV and its temporal resolution. In the future, CUTI’s reconstruction quality could be further improved by applying other advanced CS algorithms [9]. Moreover, image rotators [18] could be integrated into the front optics of streak cameras to further increase the number and coverage of projection angles. As a new imaging mode of streak cameras for 2D time-resolved imaging, CUTI will likely find new applications in time-of-flight ranging [31], laser manufacturing [32], and biomedicine [33].
Acknowledgments
Funding.
Natural Sciences and Engineering Research Council [NSERC] of Canada (RGPIN-2017-05959, RGPAS-507845-2017, CRDPJ-532304-18, RTI-2018-00505); Canada Foundation for Innovation and Ministère de l’Économie et de l’Innovation–Gouvernement du Québec (37146); Fonds de recherche du Québec–Nature et technologies (FRQNT) (2019-NC-252960); Fonds de recherche du Québec–Santé (FRQS) (280229, 267406); National Institutes of Health (R21GM137334).
Footnotes
Disclosures. The authors declare no conflict of interest.
Supplemental document. See Supplement 1 for supporting content.
References
- 1. Liang J and Wang LV, “Single-shot ultrafast optical imaging,” Optica 5, 1113–1127 (2018).
- 2. Faccio D and Velten A, “A trillion frames per second: the techniques and applications of light-in-flight photography,” Reports on Progress in Physics 81, 105901 (2018).
- 3. Satat G, Heshmat B, Raviv D, and Raskar R, “All Photons Imaging Through Volumetric Scattering,” Scientific Reports 6, 33946 (2016).
- 4. Gallant P, Forget P, Dorchies F, Jiang Z, Kieffer JC, Jaanimagi PA, Rebuffie JC, Goulmy C, Pelletier JF, and Sutton M, “Characterization of a subpicosecond x-ray streak camera for ultrashort laser-produced plasmas experiments,” Review of Scientific Instruments 71, 3627–3633 (2000).
- 5. Kimbrough JR, Bell PM, Bradley DK, Holder JP, Kalantar DK, MacPhee AG, and Telford S, “Standard design for National Ignition Facility x-ray streak and framing cameras,” Review of Scientific Instruments 81, 10E530 (2010).
- 6. Velten A, Schmitt-Sody A, Diels J-C, Rostami S, Rasoulof A, Feng C, and Arissian L, “Videos of light filamentation in air,” Journal of Physics B: Atomic, Molecular and Optical Physics 48, 094020 (2015).
- 7. Krishnan RV, Saitoh H, Terada H, Centonze VE, and Herman B, “Development of a multiphoton fluorescence lifetime imaging microscopy system using a streak camera,” Review of Scientific Instruments 74, 2714–2721 (2003).
- 8. Drabbels M, Lankhuijzen GM, and Noordam LD, “Demonstration of a far-infrared streak camera,” IEEE Journal of Quantum Electronics 34, 2138–2144 (1998).
- 9. Lai Y, Xue Y, Côté C-Y, Liu X, Laramée A, Jaouen N, Légaré F, Tian L, and Liang J, “Single-Shot Ultraviolet Compressed Ultrafast Photography,” Laser & Photonics Reviews 14, 2000122 (2020).
- 10. Naylor GA, Scheidt K, Larsson J, Wulff M, and Filhol JM, “A sub-picosecond accumulating streak camera for x-rays,” Measurement Science and Technology 12, 1858–1864 (2001).
- 11. Sarantos CH and Heebner JE, “Solid-state ultrafast all-optical streak camera enabling high-dynamic-range picosecond recording,” Optics Letters 35, 1389–1391 (2010).
- 12. Itatani J, Quéré F, Yudin GL, Ivanov MY, Krausz F, and Corkum PB, “Attosecond Streak Camera,” Physical Review Letters 88, 173903 (2002).
- 13. Velten A, Willwacher T, Gupta O, Veeraraghavan A, Bawendi MG, and Raskar R, “Recovering three-dimensional shape around a corner using ultrafast time-of-flight imaging,” Nature Communications 3, 745 (2012).
- 14. Gao L, Liang J, Li C, and Wang LV, “Single-shot compressed ultrafast photography at one hundred billion frames per second,” Nature 516, 74–77 (2014).
- 15. Tsikouras A, Berman R, Andrews DW, and Fang Q, “High-speed multifocal array scanning using refractive window tilting,” Biomedical Optics Express 6, 3737–3747 (2015).
- 16. Heshmat B, Satat G, Barsi C, and Raskar R, “Single-shot ultrafast imaging using parallax-free alignment with a tilted lenslet array,” in CLEO: 2014, OSA Technical Digest (online) (Optical Society of America, 2014), paper STu3E.7.
- 17. Shiraga H, Fujioka S, Jaanimagi PA, Stoeckl C, Stephens RB, Nagatomo H, Tanaka KA, Kodama R, and Azechi H, “Multi-imaging x-ray streak camera for ultrahigh-speed two-dimensional x-ray imaging of imploded core plasmas,” Review of Scientific Instruments 75, 3921–3925 (2004).
- 18. Liang J, Wang P, Zhu L, and Wang LV, “Single-shot stereo-polarimetric compressed ultrafast photography for light-speed observation of high-dimensional optical transients with picosecond resolution,” Nature Communications 11, 5252 (2020).
- 19. Kodama R, Okada K, and Setoguchi H, “Measurements of nonuniformly heated plasmas with a 2D space-resolved high-speed sampling camera,” Proc. SPIE 4183 (2001).
- 20. Liang J, Ma C, Zhu L, Chen Y, Gao L, and Wang LV, “Single-shot real-time video recording of a photonic Mach cone induced by a scattered light pulse,” Science Advances 3, e1601814 (2017).
- 21. Li Z, Zgadzaj R, Wang X, Chang Y-Y, and Downer MC, “Single-shot tomographic movies of evolving light-velocity objects,” Nature Communications 5, 3085 (2014).
- 22. Matlis NH, Axley A, and Leemans WP, “Single-shot ultrafast tomographic imaging by spectral multiplexing,” Nature Communications 3, 1111 (2012).
- 23. Liang J, “Punching holes in light: Recent progress in single-shot coded-aperture optical imaging,” Reports on Progress in Physics 83, 116101 (2020).
- 24. Bioucas-Dias JM and Figueiredo MAT, “A New TwIST: Two-Step Iterative Shrinkage/Thresholding Algorithms for Image Restoration,” IEEE Transactions on Image Processing 16, 2992–3004 (2007).
- 25. Wagadarikar A, John R, Willett R, and Brady D, “Single disperser design for coded aperture snapshot spectral imaging,” Applied Optics 47, B44–B51 (2008).
- 26. Llull P, Liao X, Yuan X, Yang J, Kittle D, Carin L, Sapiro G, and Brady DJ, “Coded aperture compressive temporal imaging,” Optics Express 21, 10526–10545 (2013).
- 27. Jalali S and Yuan X, “Snapshot Compressed Sensing: Performance Bounds and Algorithms,” IEEE Transactions on Information Theory 65, 8005–8024 (2019).
- 28. Liu X, Liu J, Jiang C, Vetrone F, and Liang J, “Single-shot compressed optical-streaking ultra-high-speed photography,” Optics Letters 44, 1387–1390 (2019).
- 29. Egiazarian K, Foi A, and Katkovnik V, “Compressed Sensing Image Reconstruction Via Recursive Spatially Adaptive Filtering,” in 2007 IEEE International Conference on Image Processing (IEEE, 2007), 549–552.
- 30. Paige CC and Saunders MA, “LSQR: An Algorithm for Sparse Linear Equations and Sparse Least Squares,” ACM Transactions on Mathematical Software 8, 43–71 (1982).
- 31. Liang J, Gao L, Hai P, Li C, and Wang LV, “Encrypted Three-dimensional Dynamic Imaging using Snapshot Time-of-flight Compressed Ultrafast Photography,” Scientific Reports 5, 15504 (2015).
- 32. Gautam GD and Pandey AK, “Pulsed Nd:YAG laser beam drilling: A review,” Optics & Laser Technology 100, 183–215 (2018).
- 33. Ntziachristos V, Tung C-H, Bremer C, and Weissleder R, “Fluorescence molecular tomography resolves protease activity in vivo,” Nature Medicine 8, 757–761 (2002).