Abstract
We present software-based methods for automatic phase control and for mosaicing high-speed, Lissajous-scanned images. To achieve imaging speeds fast enough for mosaicing, we first increase the image update rate tenfold from 3 to 30 Hz, then vertically interpolate each sparse image in real-time to eliminate fixed pattern noise. We validate our methods by imaging fluorescent beads and automatically maintaining phase control over the course of one hour. We then image fixed mouse brain tissues at varying update rates and compare the resulting mosaics. Using reconstructed image data as feedback for phase control eliminates the need for phase sensors and feedback controllers, enabling long-term imaging experiments without additional hardware. Mosaicing subsampled images results in video-rate imaging speeds, nearly fully recovered spatial resolution, and millimeter-scale fields of view.
Keywords: Confocal microscopy, Lissajous imaging, mosaicing, real-time, software-based phase control
I. Introduction
MINIATURIZED, high resolution, laser scanning microscopes are usually constrained by small fields of view (FOVs) or slow frame rates due to mechanically scanning across an area, compiling images point-by-point. These devices commonly use microelectromechanical systems (MEMS) scanning mirrors [1], [2], [3] and piezoelectric fiber scanners [4], [5] for beam deflection. Raster scanning, in which one axis is driven at resonance while the other is scanned much more slowly, is simple to implement, but is susceptible to small FOVs, low electromechanical efficiency, and image warping caused by low frequency mirror actuation (Figure 1a). Lissajous scanning, in which both axes are scanned at resonance [6], maximizes FOV while using lower driving voltages, but suffers from inefficient sampling (Figure 1b) and phase drift (changes in phase lag over time). In this work, we aim to address the traditional disadvantages of Lissajous imaging with a miniaturized microscope: inefficient spatial sampling, slow frame rates, small FOVs, and phase drift.
Fig. 1.
Image comparison between (a) raster scanning and (b) Lissajous scanning using the same microscope and MEMS mirror. This mirror exhibits signs of inter-axis coupling, and thus distortion, during raster scanning. The Lissajous scan achieves a larger FOV and is less susceptible to image warping, but suffers from a slight phase misalignment and a few unsampled pixels near the center of the image. Group six of a reflective 1951 USAF resolution target is shown. The FOVs in (a) and (b) are 90 μm × 215 μm (approximate due to warping) and 140 μm × 260 μm, respectively. Scale bars indicate 50 μm.
Lissajous patterns’ inefficient sampling has previously been addressed either by choosing driving frequency combinations resulting in nonrepeating patterns [7], or by updating the image prior to complete image sampling [7], [8]. On its own, increasing update rate results in undersampled, sparse images, reducing spatial resolution and requiring interpolation.
Reconstruction techniques developed to improve on linear interpolation can be categorized into three groups: (1) basic interpolation using local averaging; (2) inpainting [9], [10], which extends image features into unsampled regions; and (3) compressive sensing reconstruction [11], which attempts to find the sparsest description that fits the sampled data. Some of these techniques require minor modification to beam-scanning hardware and additional control and data acquisition electronics, and all of these techniques are computationally complex, requiring on the order of seconds to reconstruct each image on a standard computer.
Lissajous patterns’ slow frame rates have precluded the implementation of image mosaicing due to motion-based warping and blurring. Mosaicing is a fundamental technique in image processing in which consecutive images (frames) are automatically registered, de-warped, overlapped, and blended together to build a single, large image. This process increases both FOV and signal-to-noise ratio (SNR), and enables software-based image stabilization, but depends on fast frame rates. Previous work in this field has focused on applying mosaicing using fixed [12], fibered [13], and handheld [14], [15] microscopes, some in real-time [16], [17], [18].
The lack of phase control in Lissajous scanning has mostly led the field toward hardware-based solutions to measure the phase between driving waveforms [19], [20]. Phase-locked loops (PLLs) are most commonly used for tracking and controlling periodic signals [19], and have been applied in MEMS devices [21], AFM [22], and resonant fiber scanners [23]. Phase control hardware increases both the size and cost of the optical system, inhibiting miniaturization. For cases in which the Q-factor of the scanner is sufficiently low, and the imaging duration relatively short, manual adjustment of phase through a graphical user interface (GUI) is possible [7].
Our solution for tackling inefficient sampling, slow frame rates, small FOVs, and phase drift comprises four steps: First, in Section II, we boost the update rate of Lissajous-scanned images, thereby subsampling the FOV to form a sparse image. Second, in Section III, we interpolate the image in real-time, obtaining a coherent image with which to perform phase measurements and template matching. Third, in Section IV, we analyze every nth image in real-time to maintain phase control for each axis. Lastly, in Section V, we mosaic images together to recover the resolution and SNR that was sacrificed by undersampling. We demonstrate these techniques using a handheld, dual-axis, confocal (DAC) fluorescent microscope utilizing a 2D MEMS scanning mirror [24]. This microscope features a 10 mm outer diameter, with all optical components and beam paths contained within the central 5 mm. Excitation occurs at 660 nm and 785 nm. This solution enables high-speed Lissajous imaging with high resolution, large fields of view, and automatic phase control.
II. Spatial Scanning and Sampling
Pattern repeat rate, f_r, describes the periodicity of a 2D scanning pattern. Update rate, f_u, is the rate at which consecutive frames are displayed and saved, regardless of how completely we sample the image’s FOV. We distinguish f_u from frame rate, a term that usually implies full frame sampling prior to update (i.e., when f_u ≤ f_r). All of these rates are expressed in units of frames per second (fps), or Hz.
A. Scanning Patterns
The complex harmonic motion of a Lissajous curve is described by the parametric equations:
$$x(t) = A_x \sin(\phi_x) \tag{1}$$
$$y(t) = A_y \sin(\phi_y) \tag{2}$$
In these equations, f is the driving frequency, φ is the instantaneous phase, and A is the extent of the FOV (amplitude) for each axis. Phase is a function of both frequency and elapsed time (Δt), and must be calculated in real-time in order to smoothly match driving signals and to properly construct images:
$$\phi = \mathrm{mod}\left(2\pi f \,\Delta t + \phi_0,\ 2\pi\right) \tag{3}$$
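As a concrete illustration, the beam trajectory of Eqs. 1–3 can be sketched in a few lines of NumPy (the paper's simulations were generated in Matlab; function and variable names here are illustrative, amplitudes are taken as half the FOV extents, and the frequency-to-axis assignment follows Section II-C):

```python
import numpy as np

def lissajous(t, fx, fy, Ax, Ay, phix0=np.pi / 2, phiy0=np.pi / 2):
    """Beam position at times t: instantaneous phase per Eq. 3,
    position per Eqs. 1-2. phix0/phiy0 are the starting phases."""
    phix = np.mod(2 * np.pi * fx * t + phix0, 2 * np.pi)
    phiy = np.mod(2 * np.pi * fy * t + phiy0, 2 * np.pi)
    return Ax * np.sin(phix), Ay * np.sin(phiy)

# One 1/3 s pattern period with amplitudes of half the 370 x 650 um FOV
t = np.linspace(0, 1 / 3, 1000, endpoint=False)
x, y = lissajous(t, fx=315, fy=2034, Ax=185e-6, Ay=325e-6)
```

Plotting `x` against `y` reproduces a repeating Lissajous figure like that of Figure 2c.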
In the case of raster scanning, f_x ≫ f_y, and the scanning pattern’s shape is insensitive to frequency (Figure 2a). The scanning paths of Lissajous curves, on the other hand, are highly sensitive to the ratio f_x/f_y [25] (Figures 2b-d). For repeating patterns (Figure 2c), which occur if and only if f_x/f_y is rational, the pattern repeat rate, f_r, is given by
$$f_r = \frac{f_x}{n_x} = \frac{f_y}{n_y} \tag{4}$$
where n_x and n_y are the smallest integer divisors of f_x and f_y, respectively, and determine the number of cycles scanned on each axis during each pattern period [8]. This equation introduces an inherent tradeoff for repeating Lissajous patterns: increasing line density decreases pattern repeat rate. In other words, lateral resolution and frame rate are inversely related.
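For frequencies with a rational ratio, Eq. 4 can be evaluated exactly with Python's `fractions` module. A minimal sketch, using small illustrative frequencies rather than the scanner's actual values:

```python
from fractions import Fraction

def repeat_rate(fx, fy):
    """Eq. 4: reduce fx/fy to lowest terms to get the per-axis cycle
    counts (nx, ny), then return (nx, ny, f_r) with f_r = fx/nx = fy/ny."""
    r = Fraction(fx, fy)                  # fx/fy must be rational
    nx, ny = r.numerator, r.denominator   # smallest integer cycle counts
    return nx, ny, float(Fraction(fx, nx))

print(repeat_rate(8, 6))  # → (4, 3, 2.0): 4 and 3 cycles per 0.5 s period
```

Doubling both cycle counts (denser lines) would halve f_r, which is the resolution-versus-frame-rate tradeoff noted above.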
Fig. 2.
Common laser scanning driving modes. (a) Raster scan, in which f_x ≫ f_y. (b) Lissajous radial scan, in which f_x = f_y, φ_x − φ_y = 90°, and both driving amplitudes ramp at the same linear rate. (c) Repeating Lissajous scan, in which f_x/f_y is rational. (d) Nonrepeating, or sliding, Lissajous scan, in which f_x/f_y is irrational. Less commonly used Lissajous patterns for imaging include a line scan, in which f_x = f_y and φ_x = φ_y, and a parabola scan, in which f_x/f_y = 2 and φ_x − φ_y = 90° (neither shown). Color represents elapsed time during a single frame period. Simulations generated in Matlab.
While this work focuses on repeating Lissajous scanning patterns (Figure 2c), the techniques described later are applicable to all of the scanning patterns shown in Figure 2. We elected to use a repeating pattern so that we could directly compare images with identical scanning patterns (and thus interpolation patterns, discussed later) to determine how image quality is affected by sampling, interpolation, and phase control, especially for undersampled images. If we were to use nonrepeating patterns, each image would be sampled and interpolated at slightly different spatial locations, making it difficult to directly compare images and quantify results, particularly during phase control testing.
B. Spatial Sampling and Update Rate
Mechanical scanning properties (scanning amplitude, frequency, phase, etc.), optical resolution, sample rate, and update rate all affect sampling sparsity. Of these parameters, update rate is the simplest to manipulate. The effect of varying update rate on spatial sampling is shown in Figure 3. White pixels in Figure 3a-d correspond to unsampled “gaps” in the FOV. Gap height is the vertical distance between sampled pixels, which varies as a function of position, and maximum gap height occurs near the center of the FOV. Note that because scanning frequency determines the number of lobes and periods for each axis, gap height is almost always less than gap width, because one axis (the vertical, for our scanner) is driven at a much higher frequency than the other. Although these varying gaps all affect image quality, image resolution is ultimately limited by the maximum gap height.
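Sampling maps like those in Figure 3 can be approximated by quantizing the beam trajectory to pixel coordinates over one update period and counting hits. A NumPy sketch (the paper's simulations used Matlab; pixel half-extents, phases, and names here are illustrative):

```python
import numpy as np

def sampling_map(fx, fy, Ax_px, Ay_px, update_rate, sample_rate):
    """Count how often each pixel is hit during one frame period.
    Zero-count pixels are the unsampled 'gaps' of Figure 3."""
    t = np.arange(int(sample_rate / update_rate)) / sample_rate
    ix = np.round(Ax_px * np.sin(2 * np.pi * fx * t + np.pi / 2) + Ax_px).astype(int)
    iy = np.round(Ay_px * np.sin(2 * np.pi * fy * t + np.pi / 2) + Ay_px).astype(int)
    counts = np.zeros((2 * Ay_px + 1, 2 * Ax_px + 1), dtype=int)
    np.add.at(counts, (iy, ix), 1)   # accumulates repeated hits correctly
    return counts

# 30 Hz update of a ~250 x 450 pixel FOV at the Figure 3 sample rate
counts = sampling_map(315, 2034, 124, 224, update_rate=30, sample_rate=5_770_000)
unsampled_fraction = (counts == 0).mean()
```

Sweeping `update_rate` and tallying oversampled versus unsampled pixels reproduces the qualitative trends of Figures 3e-f.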
Fig. 3.
Sampling efficiency and spatial resolution versus update rate. (a) 3 Hz (f_u = f_r), (b) 10 Hz, (c) 20 Hz, and (d) 30 Hz Lissajous scans. Color represents the number of times each pixel is sampled, with yellow, blue, and white representing greatly oversampled, Nyquist-rate sampled, and unsampled pixels, respectively. (e) Plot showing how sampling is affected by update rate. (f) Plot showing how spatial resolution is affected by update rate. All scanning parameters other than sample rate match those used for our microscope: FOV = 250 pixels × 450 pixels, fx = 2034 Hz, fy = 315 Hz, and starting phases φ_x = φ_y = 90°. Sample rate SR = 5.77 MHz, satisfying Nyquist. Simulations generated in Matlab.
The pattern in Figure 3a naturally closes because f_x/f_y is rational for our scanner (f_x = 2,034 Hz and f_y = 315 Hz). From Equation 4, the pattern has a natural repeat rate of 3 Hz. Reducing the update rate below 3 Hz does not reduce the maximum gap height, regardless of starting phase. Increasing the update rate past the repeat rate increases the maximum gap height for each frame and causes sequential frames to have different scanning trajectories until the pattern reaches the end of its period.
Figure 3e shows that spatial sampling efficiency (how often we sample each pixel just once) improves logarithmically with update rate. For example, as we increase update rate from 3 Hz (Figure 3a) to 30 Hz (Figure 3d), the number of greatly oversampled pixels (those sampled ten times or more per frame) decreases by ~95%, while the number of unsampled pixels increases by just ~ 64%, depending on initial phase. Note that there are plateaus in the red plot of Figure 3e, and in both plots of Figure 3f extending from 1 Hz to 17 Hz, rather than from 1 Hz to 3 Hz as expected. This is due to the chosen starting phase combination, and is not representative of all starting phase combinations.
Figure 3 is calculated using the same initial phase value of 90° for both axes. In Figure 3a, f_u = f_r, meaning each axis’ phase value ends precisely where it starts, at 90°. Thus the scanning patterns for two consecutive frames are identical. In Figures 3b-d, these traits no longer hold, since f_u > f_r. In these cases, ending phase does not match starting phase, and consecutive frames have different scanning patterns, sampling different regions in the FOV. This phenomenon will prove to be vital in Sections III and V.
C. Temporal Sampling
For one-dimensional harmonic oscillation, the scanning point’s maximum velocity is
$$v_{max} = 2\pi f A \tag{5}$$
The Pythagorean theorem gives the maximum velocity for a two-dimensional system:
$$v_{max} = 2\pi\sqrt{(f_x A_x)^2 + (f_y A_y)^2} \tag{6}$$
Maximum velocity occurs at the equilibrium position for our mass-spring system, in the center of our FOV, when acceleration is minimized and the MEMS springs are neither pushing nor pulling on the mirror. This explains why Lissajous images can exhibit poor sampling and resolution near the center of the FOV. Plugging in values for our microscope’s MEMS mirror (FOV extents of 370 μm and 650 μm, so A_x = 185 μm and A_y = 325 μm, with f_x = 315 Hz and f_y = 2.034 kHz) results in v_max ≈ 4.17 m/s.
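Substituting the scanner parameters into Eq. 6 (treating the amplitude as half the FOV extent, which is the reading consistent with the per-sample pitch quoted below):

```python
import math

# Amplitudes = half the 370 um x 650 um FOV extents
Ax, Ay = 370e-6 / 2, 650e-6 / 2      # m
fx, fy = 315.0, 2034.0               # Hz
v_max = 2 * math.pi * math.hypot(fx * Ax, fy * Ay)   # Eq. 6
print(round(v_max, 2))               # → 4.17 (m/s)
```

Dividing this velocity by the 10 MHz sample rate recovers the ≤ 0.417 μm/sample spatial pitch cited in the next paragraph.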
The experimentally measured full width at half maximum (FWHM) spatial resolution of the microscope is 3.0 μm for both horizontal and vertical axes, determined by scanning a mirrored knife-edge target across the beam. This represents the diffraction-limited, transverse confocal response of the system [26]. A data acquisition (DAQ) card samples voltages from the photomultiplier tube (PMT) at a sample rate (SR) of 10 MHz. Using our value for v_max, our spatial sampling rate is v_max/SR ≤ 0.417 μm/pixel, greatly exceeding Nyquist’s sampling criterion.
To form the final image, each sample is integrated with the nearest pixel of a 250 × 450 pixel image using sample locations calculated ahead of time. With a 3 μm resolution and a 370 × 650 μm FOV, this image size gives us a ratio above 2 pixels per resolution element, also satisfying Nyquist. An integration map is incremented by one at its corresponding pixel location for each sample. The final image is then the quotient of the integrated signal divided by the integration map, and unsampled pixels are set to zero.
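The accumulate-count-divide scheme above can be sketched as follows (an illustrative NumPy version, not the paper's implementation; `np.add.at` is used so that repeated hits to the same pixel accumulate correctly):

```python
import numpy as np

def form_image(samples, ix, iy, shape=(450, 250)):
    """Integrate PMT samples into their nearest pixels, divide by the
    per-pixel hit count (the 'integration map'), and leave unsampled
    pixels at zero. ix/iy are precomputed pixel coordinates."""
    acc = np.zeros(shape)
    hits = np.zeros(shape)
    np.add.at(acc, (iy, ix), samples)
    np.add.at(hits, (iy, ix), 1)
    img = np.divide(acc, hits, out=np.zeros_like(acc), where=hits > 0)
    return img, hits
```

For example, two samples of values 1 and 3 landing on the same pixel yield an averaged intensity of 2 there, while pixels with no hits stay at zero.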
III. Image Interpolation
With the goal of broad applicability and real-time, video-rate use in mind, we linearly interpolate along the vertical axis:
$$I(y) = I(y_1) + \frac{y - y_1}{y_2 - y_1}\left[I(y_2) - I(y_1)\right] \tag{7}$$
where y is the location of an unsampled pixel, y_1 and y_2 are the locations of the nearest sampled pixels below and above it, and I is pixel intensity. This process takes less than 1.5 ms for each 250 × 450 pixel image, more than fast enough for real-time use at 30 Hz.
Vertical interpolation has three benefits: First, interpolation occurs along the shortest route between sampled pixels. Second, the horizontal resolution of static mosaics is unaffected, because interpolated pixels are not averaged with sampled pixels during the blending process. Thus, for undersampled, interpolated images at fast update rates (f_u > f_r), maximum gap height is the limiting factor in the spatial resolution of static mosaics, and is a function of update rate alone. For moving samples, this effect is overwhelmed by motion artifacts and registration errors. Third, although interpolation has a low-pass filtering effect, this effect is constrained to unsampled pixels; sampled pixels are unaffected. This is necessary for resolution recovery through mosaicing.
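A column-wise sketch of Eq. 7, assuming a boolean mask marks which pixels were sampled (names are illustrative; `np.interp` performs the piecewise-linear fill and sampled pixels are never modified):

```python
import numpy as np

def vinterp(img, sampled):
    """Linearly interpolate each column between sampled pixels (Eq. 7).
    Gaps beyond the outermost samples take the nearest sampled value
    (np.interp's edge behavior). Sampled pixels pass through unchanged."""
    out = img.copy()
    rows = np.arange(img.shape[0])
    for c in range(img.shape[1]):
        s = sampled[:, c]
        if s.sum() >= 2:
            gaps = ~s
            out[gaps, c] = np.interp(rows[gaps], rows[s], img[s, c])
    return out
```

A gap pixel midway between sampled values of 2 and 6 is filled with 4, while its sampled neighbors keep their original intensities.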
Figure 4 shows the vertical blurring effect for individual, undersampled images (Figures 4b,e) and for static mosaics (Figures 4c,f). In calculating Figures 4c,f, thirty consecutive frames are averaged together, each with image quality comparable to that of Figures 4b,e. However, because each frame adds new spatial information by sampling slightly different scanning patterns, as discussed in Section II-B, the end result is comparable to simple integration (Figures 4a,d). Thus, through static mosaicing, we recover the lateral resolution and SNR lost through subsampling to increase temporal sensitivity. Note that if we were to keep track of sampled locations, avoid interpolating, and then integrate according to the number of times a given pixel is sampled (as opposed to simple averaging), we would obtain results identical to those shown in Figures 4a,d.
Fig. 4.
Image quality versus update rate, and its potential effect on mosaicing. From top to bottom: (a1, d1) a single 1 Hz frame; (b1, e1) a single 30 Hz frame after interpolation; and (c1, f1) 1 second’s worth of 30 Hz images averaged together, representing a static mosaic. (a2-f2) Corresponding 2D Fourier transforms, contrast enhanced to show detail and cropped to conserve space. In comparison to 1 Hz images, 30 Hz images show significant degradation due to their sparse sampling pattern. 30 Hz images integrated for the same amount of time show only a slight decrease in vertical resolution, confirmed by the darker top and bottom regions of the corresponding spectra. (a1-c1) 15 μm diameter fluorescent beads. (d1-f1) Fixed and cleared mouse brain tissue, labeled with LICOR785 (red) and DRAQ5 (green) with dimethyl sulfoxide (DMSO) to increase label penetration. The vertical dark feature is a micro-tear in the tissue sample. Scale bars indicate 100 μm.
Interpolation is crucial for both mosaicing and phase feedback. For mosaicing, image sparsity introduces high contrast corners and patterning that would prohibit feature detection and image registration. For phase feedback, this patterning introduces high-frequency artifacts in the Fourier domain, skewing the signal used for phase alignment.
IV. Phase Control
A. The Phase Control Problem
The MEMS scanner at the heart of our microscope is an underdamped oscillator driven open-loop with a sinusoidal force. The phase lag (or mapping phase) between the oscillation and the driving force is
$$\phi_{lag} = -\tan^{-1}\!\left(\frac{2\zeta\,\omega\,\omega_0}{\omega_0^2 - \omega^2}\right) \tag{8}$$
evaluated on the branch that varies continuously from 0° to −180°,
where ω is the driving frequency, ω_0 is the undamped angular frequency, and ζ is the damping ratio. If instantaneous phase (from Eqs. 1 and 2) defines an input signal’s position relative to one full period, phase lag measures the physical system’s delay relative to that input signal. Both values are needed for accurate image reconstruction. As the driving frequency approaches and surpasses the resonant frequency, phase lag transitions from 0° to −180°. As the damping ratio decreases, quality factor (Q-factor) increases, resulting in higher resonance amplitude (enabling larger FOVs) and greater phase sensitivity to frequency. This destabilizes the scanner, making it more difficult to control.
For raster scanning, phase drift presents as small shifts of the entire image along the resonant axis. For Lissajous scanning, phase drift occurs on both axes, causing interlaced copies of the image to shift along the vertical and horizontal axes, despite the beam path scanning along changing, non-orthogonal directions. These interlaced copies are sometimes referred to as fields, referring to whether the mirror is, for example, scanning left to right (even) or right to left (odd). When these fields aren’t properly aligned, high frequency artifacts are introduced into the image, artificially increasing contrast.
B. Phase Algorithm and Sweep
The algorithm for determining phase lag for Lissajous scanned images is shown in Algorithm 1 and Figure 5. The intent is to generate a signal, over a range of test phase lag values, whose peak corresponds to how “in phase,” or smooth, the image is, indicating properly aligned image fields. Peak signal corresponds to maximizing the central, low-frequency bands of the image’s Fourier spectrum, i.e., an image free from the high-frequency artifacts introduced by misaligned image fields. This is inversely related to classic autofocus contrast-detection schemes, in which maximum contrast occurs at the focal plane [27], [28].
Fig. 5.
Two-dimensional phase sweep using images from a random location in fixed and cleared mouse brain tissue with DRAQ5 and LICOR785 staining. (a-c) Horizontal and (d-f) vertical phase sweeps corresponding to minimum, median, and maximum signal values, respectively. (g) Full plot of horizontal (blue) and vertical (red) phase sweeps, with image locations labeled. The initial x-axis phase value is held constant during the y-axis sweep, and vice versa. Images were taken at 10 Hz and tested every 0.1°, totaling 22 images in all. Scale bars indicate 100 μm.
After sampling the FOV to obtain an image vector, we (1) map the vector into an image according to Equations 1 and 2; (2) vertically interpolate the image according to Equation 7; (3) normalize the image to within [0,1]; (4) Fourier transform the image, shifting the DC term to the center; and (5) sum the central portion (12.0% of each dimension by length, or 1.44% by area) of the absolute value of the image’s spectrum to obtain an output signal. The dimensions of this central portion correspond to sampling most of the center lobe of the Fourier transform, including the DC term, without extending into higher frequencies. Fortunately, this dimension does not need to be precisely calibrated, as the artifacts introduced by misaligned fields are higher in frequency than our optical resolution. This will always be the case for Nyquist sampling or better. The example dimension is the result of roughly minimizing computation while sampling enough pixels to provide a strong signal.
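Steps (3)-(5) of this metric can be sketched as follows (a NumPy approximation of the described signal; the 12% window fraction follows the text, while the function and variable names are illustrative):

```python
import numpy as np

def phase_signal(img, frac=0.12):
    """Normalize to [0,1], FFT with the DC term centered, and sum the
    central `frac` of each dimension of the magnitude spectrum. Higher
    values indicate fewer high-frequency field-misalignment artifacts."""
    img = img.astype(float)
    rng = img.max() - img.min()
    if rng > 0:
        img = (img - img.min()) / rng
    spec = np.abs(np.fft.fftshift(np.fft.fft2(img)))
    h, w = spec.shape
    dh, dw = int(h * frac / 2), int(w * frac / 2)
    cy, cx = h // 2, w // 2
    return spec[cy - dh:cy + dh, cx - dw:cx + dw].sum()
```

A smooth image scores higher than the same image with alternating-row artifacts, since that artifact energy sits near the Nyquist frequency, outside the summed central window.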
Figure 5 shows the resulting signals generated from Algorithm 1, along with representative images for each axis’ phase lag search. The phase sweep is performed separately for each axis. Algorithm 1 shows inputs corresponding to an x-axis sweep, holding the y-axis phase lag constant (Figure 5, a-c and g, blue). A y-axis sweep would then naturally follow, in which the x-axis phase lag would be held constant (Figure 5, d-f and g, red).
Our method for computationally reconstructing images is purely software-based in that we reconstruct images using test phase lag values and display the results in real-time (with the first few seconds of images shown out of phase); the driving phase, however, is kept constant to keep the mirror operating smoothly. This is one of the key differences between hardware- and software-based phase control solutions: the quick feedback from circuitry enables active control of the driving phase, forcing the scanning beam along a predefined path. Because a method like ours calculates phase lag by processing image information, it takes orders of magnitude longer, and thus cannot actively control driving phase. Regardless, both methods are capable of providing “in phase” images in real-time.
This process is completed in ~360 ms per image, the majority of which is spent mapping our sampled data vectors into image form. Unfortunately this step cannot be replaced with less computationally expensive means (by, for example, simply shifting image pixels), since sinusoidal mapping dictates that integration is a function of position in the image. Processing the 22 test images in the example shown in Figure 5 takes ~7.5 s.
Before beginning the 2D phase lag search, initial values are first roughly set with the GUI using the real-time image display as feedback. Once within an acceptable range (±1° – 10°, depending on sample), the software begins a coarse phase sweep, in which image quality is tested over a wide range of phase values using coarse increments. After the initial sweep, the software updates the phase lag with an initial solution, then decreases the scanning range by a factor of ten and adjusts the increments such that the total sweep time remains constant.
Algorithm 1:
Phase Control Algorithm
This process is repeated until the phase lag is determined to within 0.001° (0.1% of 1°). After testing a set of phase values, the software normalizes the responses (from step five, above) to within [0,1] and picks the phase lag value that corresponds to maximum signal.
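The coarse-to-fine sweep can be sketched as a loop that shrinks the span tenfold per pass while keeping the number of tested values (and thus sweep time) constant. Here `score` stands in for the Fourier-domain signal of Algorithm 1, and all names are illustrative:

```python
import numpy as np

def refine_phase(score, center, span=2.0, steps=21, tol=1e-3):
    """Coarse-to-fine 1D phase sweep: test `steps` values across `span`
    degrees, keep the best, shrink the span tenfold, and repeat until
    the increment falls below `tol` degrees."""
    best = center
    while span / (steps - 1) >= tol:
        candidates = best + np.linspace(-span / 2, span / 2, steps)
        scores = [score(p) for p in candidates]
        best = candidates[int(np.argmax(scores))]
        span /= 10.0
    return best

# Toy quadratic score peaking at 0.737 degrees
est = refine_phase(lambda p: -(p - 0.737) ** 2, center=0.0)
```

With the defaults above, three passes (increments of 0.1°, 0.01°, and 0.001°) localize the peak to within the final increment.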
To test the algorithm’s sensitivity to parameters such as SNR and optical resolution, we varied noise, sample rate, and Gaussian filtering on tissue images undergoing simulated phase testing like that shown in Figure 5. In these tests, we define sensitivity as the range between the maximum and minimum observed signal values, corresponding to the best and worst aligned images of the test, respectively. Error is defined as the root-mean-square deviation (RMSD) between each plot’s normalized shape (Equation 9). As sample rate drops from 10⁷ to 10⁶ samples per second, below Nyquist, sensitivity remains relatively constant, but error rises exponentially. To test sensitivity to SNR, we added zero-mean, Gaussian white noise. As noise variance rises to 0.01, sensitivity and accuracy both drop linearly. Together, these results confirm that skipping small numbers of samples during the 2D mapping step can increase algorithm speed with limited effect on performance. To test the algorithm’s sensitivity to optical resolution, we applied Gaussian filters with sigma values ranging from 1,000 (close to original) to 1 (no discernible features). For σ > 20, decreasing sigma has virtually no effect on sensitivity and accuracy. For σ < 20, sensitivity and accuracy begin to drop exponentially, with the algorithm losing effectiveness below σ = 10. These results suggest that low-pass filtering images may be an effective step toward adapting the algorithm for use with images with low SNR.
We have found the Fourier-based approach used here to be more robust than typical edge detection schemes (such as with Sobel or Canny operators), mainly due to noise considerations. Weak fluorescent staining and deep tissue imaging are common issues with fluorescence that cause significant drops in SNR. However, we have a few additional sources of noise unique to our setup that cause variation in SNR within each image. First, due to the microscope’s optical design, our focal plane is parabolic, meaning imaging depth decreases with MEMS mirror deflection, and maximum imaging depth occurs in the center of the image. Second, the excitation and collection beams in our microscope are slightly misaligned, a problem that is exacerbated when the mirror nears maximum deflection.
Fig. 6.
Real-time phase control during one continuous hour of imaging. (a) Sample image of 15 μm diameter fluorescent beads. (b) Percent absolute difference between frames at times t = 0 min and t = 60 min, color-coded and capped at 50% to enhance contrast. Insets in (a) and (b) show the same magnified region. (c) RMSD values (in units of pixel intensity normalized to [0, 1]) for each frame compared to the first image (blue) and for consecutive image pairs (red) over time. (d) X-axis phase lag over time. (e) Y-axis phase lag over time. Scale bars indicate 100 μm.
C. Maintaining Phase Control
To test phase control over a long period of time, we continuously imaged a phantom of 15 μm diameter fluorescent microbeads suspended in polydimethylsiloxane (PDMS) for 60 minutes while saving each image (Figure 6a) and recording phase lag for each axis (Figures 6d,e). Before the test started, we manually adjusted phase to within ±1° for each axis, set the final precision level to 0.1% of a degree, and set the frame rate to 2 Hz. Each time the phase sweep completes, in this case every 16 frames, or 8 s, the phase lag is updated and a new sweep is initiated with the previous test’s phase solutions as starting points.
To evaluate phase control performance, each image is compared to the first image of the test (Figure 6c, blue) as well as to the previous frame (Figure 6c, red) by calculating the RMSD between each such pair of reconstructed images, defined as
$$\mathrm{RMSD} = \sqrt{\frac{1}{N}\sum_{i=1}^{N}\big(\mathrm{Im}_1(i) - \mathrm{Im}_2(i)\big)^2} \tag{9}$$
where N is the number of pixels per image and each image, Im, is converted to double precision and normalized to within [0,1]. When compared to the previous frame, RMSD consistently hovers just below 0.01, with an average value of 8.9 × 10⁻³, indicating that each pairwise image changes by less than 1% on average. When compared to the first frame, RMSD grows linearly with time, suggesting either a consistent accumulation of errors in phase control or gradual phantom shift. Note that this linear RMSD ramp does not correlate with the fluctuations observed in tracked phase (Figures 6d,e). Visually assessing the differences between the first and last images in the test (Figure 6b) confirms that this ramp is due to phantom shift. Recall from Section IV-A that inaccurate phase control results in horizontal and/or vertical translation of the even and odd fields of the image. This kind of error would present itself as doubling (or quadrupling) of the beads along either axis in the grayscale image data, resulting in pairs of mirrored crescent-shaped patterns in Figure 6b. Instead, we observe two major features in the absolute difference image: roughly symmetric rings (Figure 6b, inset), suggesting axial translation of the sample, and crescent-like shapes, without mirroring, suggesting lateral translation. Either of these features could be from contraction or relaxation of the PDMS, or from a shift in imaging depth from the piezoelectric linear actuator. Other possible sources of FOV instability include evaporation of the index matching gel between the microscope’s objective and the sample, fluctuations in room temperature, and inadequate vibration control. Although the microscope’s scanner exhibited phase drift of varying severity for the two axes (maximum observed severity was 0.06 deg/min, or 10⁻³ deg/s; Figures 6d,e), image alignment was consistent over time (Figure 6c, red).
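Equation 9's comparison can be sketched directly (conversion to double precision and normalization to [0,1] per the text; names are illustrative):

```python
import numpy as np

def rmsd(im1, im2):
    """Eq. 9: root-mean-square deviation between two images, each
    converted to float and normalized to [0, 1] before comparison."""
    def norm(im):
        im = im.astype(float)
        rng = im.max() - im.min()
        return (im - im.min()) / rng if rng > 0 else im * 0.0
    a, b = norm(im1), norm(im2)
    return float(np.sqrt(np.mean((a - b) ** 2)))
```

Identical frames score 0, while a frame compared against its intensity-inverted copy scores 1, bounding the metric on normalized images.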
V. Image Mosaicing
With update rates increased and phase lag controlled, mosaicing with Lissajous scanned images is now possible. Our registration method uses normalized cross-correlation [29], a form of template matching in which one finds the areas of an image that best match a template image kernel. In this approach, a response map R is calculated by scanning a template T over an image I, with the brightest locations indicating the best matches:
$$R(x,y) = \frac{\sum_{x',y'} T(x',y')\, I(x+x',\, y+y')}{\sqrt{\sum_{x',y'} T(x',y')^2 \cdot \sum_{x',y'} I(x+x',\, y+y')^2}} \tag{10}$$
To form our template, we crop the second image of each pair down to size w × w pixels, defining the search window, and the amount of overlap, between sequential frames. This constrains panning speed (how quickly the user can slide the microscope across the tissue surface) according to the linear relationship
$$v_{pan} = d_{search}\, f_u \tag{11}$$
where d_search is the per-frame search margin (in μm), but keeps calculations to a minimum. For example, if we set our search window to ±30 pixels (±15 μm) per axis and image at 30 Hz, we can pan the microscope at speeds of up to 450 μm/s. Operating the microscope by hand under this constraint is challenging, but possible, whereas handheld operation at single digit update rates is impractical. Thus, at slower update rates, we depend on linear translation stages to control panning. Stages also facilitate maintaining optical contact between the microscope’s objective and the sample, enabling larger mosaics.
Cross-correlation is simple, computationally efficient when done in Fourier space, and performs well enough to avoid needing secondary fine-tuning. Methods like pyramidal optical flow, SIFT, and SURF were attempted early in development, but their dependence on high SNR, high resolution, and the number of observable features limited them to update rates below ~15 Hz. Although frequency domain-based approaches can also be sensitive to noise, we found this correlation method to be robust to both noise and occlusion from, for example, small pieces of dust or tissue. Through experimentation, we found that normalizing each pair of images prior to registration increased repeatability.
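A direct (non-Fourier) sketch of the registration step: crop a template from the center of the newer frame and exhaustively evaluate Eq. 10 over the search window, taking the peak as the frame-to-frame shift. A production implementation would evaluate this in Fourier space; window sizes and names here are illustrative:

```python
import numpy as np

def register(prev, curr, w=60, search=30):
    """Estimate frame-to-frame shift by normalized cross-correlation
    (Eq. 10). Returns (dy, dx), the offset within `prev` of the region
    best matching the center crop of `curr` (the sample's apparent
    motion is the negative of this offset)."""
    cy, cx = np.array(curr.shape) // 2
    T = curr[cy - w // 2:cy + w // 2, cx - w // 2:cx + w // 2].astype(float)
    tn = np.sqrt((T ** 2).sum())
    best, shift = -1.0, (0, 0)
    for dy in range(-search, search + 1):
        for dx in range(-search, search + 1):
            I = prev[cy - w // 2 + dy:cy + w // 2 + dy,
                     cx - w // 2 + dx:cx + w // 2 + dx].astype(float)
            denom = tn * np.sqrt((I ** 2).sum())
            r = (T * I).sum() / denom if denom > 0 else 0.0
            if r > best:
                best, shift = r, (dy, dx)
    return shift
```

Because Eq. 10's response equals 1 only where the template and image patch are proportional, the peak is exact for a pure translation within the search window.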
Whether the microscope is stage-mounted or handheld, we make no assumptions about direction of travel. All registration is applied to a single channel of image data (the channel imaging DRAQ5-stained cell nuclei). We experimented with using both channels of data, which under different circumstances (e.g., a different modality with better SNR) is likely to be a viable improvement. In our tests, the LICOR785 channel didn’t show enough contrast for consistent template matching across all frames.
Image blending is handled with simple averaging, such that SNR is improved in proportion to the square root of the number of samples (in overlapping regions). Once imaging has ended, the mosaic goes through two post-processing steps to enhance aesthetics: trimming and contrast enhancing. The trimming step removes empty borders around the mosaic that are automatically padded on when the FOV approaches any of the mosaic’s borders. To enhance contrast, we apply contrast-limited adaptive histogram equalization (CLAHE) [30], assuming a linear distribution of image intensity values. This integration. The difference between the static mosaics of and other closely related approaches have been applied to mosaics in micro-angiography [31] and astronomy [32], and works well here to control lighting uniformity without greatly oversaturating cell nuclei.
Regardless of update rate, the algorithms used for image interpolation and integration remain constant. However, template matching of high-update-rate, low-resolution images requires one critical adjustment: low-pass filtering. In preparation for template matching, each pair of images is temporarily smoothed using a median filter with a kernel size that increases with update rate (e.g., 30 Hz images require a kernel size of 11 pixels, or 5.5 μm). Despite the varying initial image resolution, we observed no difference in registration precision or mosaic resolution at update rates of up to 30 Hz.
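This pre-matching smoothing step might look as follows. The text gives only one anchor point (an 11-pixel kernel at 30 Hz), so the linear ramp from 3 pixels at 5 Hz is our own assumption, and the pure-numpy median filter is a stand-in for whatever filtering routine was actually used:

```python
import numpy as np
from numpy.lib.stride_tricks import sliding_window_view

def kernel_for_rate(update_rate_hz):
    """Map update rate to an odd median-kernel size. The 11 px @ 30 Hz
    point comes from the text; the 3 px @ 5 Hz endpoint and the linear
    interpolation between them are assumptions for illustration."""
    k = int(round(np.interp(update_rate_hz, [5, 30], [3, 11])))
    return k | 1  # force odd

def median_smooth(img, k):
    """k x k median filter with edge padding; used only on a temporary
    copy for template matching, per the text."""
    pad = k // 2
    padded = np.pad(img, pad, mode="edge")
    windows = sliding_window_view(padded, (k, k))
    return np.median(windows, axis=(-2, -1))
```

The median filter suppresses the residual interpolation artifacts and shot noise that otherwise dominate the correlation surface at low per-frame sampling densities.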
Mosaicing is commonly employed to accumulate large imaging areas over time from smaller individual FOVs, to provide image stabilization, and to boost SNR in the overlapping regions of consecutive frames. However, using mosaicing to recover the resolution lost through subsampling is novel. Recall from Sections II-B and III that although each individual frame can be greatly degraded, integrating additional high-resolution information along with interpolated information can result in image quality comparable to simple integration. The difference between the static mosaics of Figure 4 and those shown in Figures 7, 8, and 9 is lateral translation between the microscope and the sample (and longer total imaging times). To prevent motion blurring, we increased the update rate. To determine the distance and direction traveled, we perform template matching and overlay data accordingly. Combining fast updating and mosaicing thus removes the assumption that the microscope is static while integrating, without sacrificing resolution. This approach is applicable to all manner of imaging with miniaturized microscopes, whether the device is held static, intentionally panned across tissue, or stabilized in software.
Fig. 7.
Multispectral, real-time 5 Hz mosaic of fixed and cleared mouse brain tissue stained with IR783 (red) and DRAQ5 (green). Top: Single 5 Hz frames before (a) and after (b) interpolation. Bottom (c): Mosaic comprises 400 consecutive images taken over 80 s. White rectangle shows same FOV as in (a) and (b). Sample translated via manual stage controller. Single frame FOV is 370 × 650 μm. Total mosaic FOV is 1.37 × 1.43 mm. Scale bars indicate 100 μm (top) and 500 μm (bottom).
Fig. 8.
Multispectral, real-time 10 Hz mosaic of fixed and cleared mouse brain tissue stained with IR783 (red) and DRAQ5 (green). Top: Single 10 Hz frames before (a) and after (b) interpolation. Bottom (c): Mosaic comprises 6,398 consecutive images taken over 10.7 min. White rectangle shows same FOV as in (a) and (b). Sample translated via manual stage controller. Single frame FOV is 370 × 650 μm. Total mosaic FOV is 4.43 × 4.27 mm. Scale bars indicate 100 μm (top) and 1 mm (bottom).
Fig. 9.
Multispectral, post-processed 30 Hz mosaic of fixed and cleared mouse brain tissue stained with IR783 (red) and DRAQ5 (green). Left (a): Mosaic comprises 385 consecutive images taken over 12.8 s. White rectangle shows same FOV as in (b) and (c). Right: Single 30 Hz frames before (b) and after (c) interpolation, showing severe image degradation due to subsampling. Microscope translated in hand. Single frame FOV is 370 × 650 μm. Total mosaic FOV is 1.27 mm × 679 μm. Scale bars indicate 500 μm (left) and 100 μm (right).
Figures 7, 8, and 9 show multispectral mosaics of fixed, cleared mouse brain tissues imaged at 5 Hz, 10 Hz, and 30 Hz, respectively. Each tissue sample was stained with IR783 to show general tissue morphology (red) and DRAQ5 to counterstain cell nuclei (green). These figures represent the first mosaics composed entirely of Lissajous-scanned microscopy images. In Figures 7 and 8, the microscope was inverted and mounted to a static post. The tissue was mounted to a glass slide and taped down to a manual, linear translation stage to control panning. These figures were mosaiced in real time. In Figure 9, the microscope was detached from its mount and held by hand, keeping the tissue static. Due to limited computational power, this figure was mosaiced in post-processing. In contrast to imaging at slower update rates, a manual stage is not necessary when imaging at video rate. Although each individual image shows low resolution and SNR, integration during the mosaicing process mitigates both issues, with no appreciable difference in resolution between mosaics at different update rates.
VI. Discussion and Conclusions
With the inclusion of this work, there are now two categories of real-time phase control during Lissajous imaging: hardware-based and software-based. The work demonstrated here is not intended to replace physical sensors, but to offer an alternative for certain applications. For tabletop setups, in situations where bandwidth and correction response time are critical, or when frequency control is also needed, physical sensors are ideal. Such methods can also control beam-path deflection in real time, ensuring scanning occurs over a predefined path. However, hardware-based methods add size, complexity, and cost, making them particularly difficult to implement in miniaturized microscopes designed for in vivo imaging. In comparison, software-based methods can enable phase control for any imaging system, but require real-time imaging as part of their feedback loop and time to calculate reconstructions from sampled data. Scanners with high Q-factors, or those exhibiting gyroscopic sensitivity, may be too unstable for phase control methods that take on the order of seconds to respond. Additionally, these methods require constant optical contact with the sample to ensure a steady stream of image information with which to work. Thus, our method is not applicable to non-imaging Lissajous scanning systems, such as video projectors.
Image processing techniques like the one presented here are sensitive to SNR, optical resolution, and sample rate. With that in mind, it is applicable to any modality with comparable image quality. For example, operating a confocal microscope in reflectance mode offers identical resolution and sample rate, but increases SNR greatly. Optical coherence tomography (OCT), which typically displays images in which SNR degrades with imaging depth, is comparable to our fluorescent image data, where portions of the image appear out of focus due to optical misalignment, and image quality degrades with penetration depth. Even relatively low-resolution images (e.g., from magnetic particle imaging) are applicable.
Alternatives to linear image interpolation were considered early in the project. One idea is to update the image prior to complete image sampling, replacing only the newly sampled pixels [7]. However, this method may not work well with the template matching or feature tracking processes needed for mosaicing, as each image contains a mixture of new and old information. In our approach, although each image contains a mixture of high- and low-resolution information, each frame consists entirely of new information in a temporal sense, enabling accurate motion tracking. A second idea is to shift each image half a FOV at a time, sampling the same region once in the low-resolution center of the FOV and then again in the highly sampled edges or corners. This places a major constraint on how the microscope is panned across the FOV, requiring motorized movement between frames and preventing software-based image stabilization. A third option is to amplitude modulate the driving signals, alternating between full and reduced FOVs in an attempt to fully sample the center region every nth image. This method could help with the sampling issues inherent in Lissajous scanning, but at the cost of artificially limiting sampling bandwidth and complicating phase control. Finally, more advanced interpolation methods were considered, and likely would have been implemented if real-time mosaicing had not been the end goal. Choosing a simple interpolation method allowed us to direct more computational load toward mosaicing.
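For reference, the simple per-column linear interpolation we settled on can be sketched as follows. This is a minimal stand-in, not the real-time implementation; it assumes a boolean mask marking which pixels of the sparse frame were actually sampled:

```python
import numpy as np

def interp_columns(img, sampled):
    """Fill unsampled pixels of a sparse Lissajous frame by linear
    interpolation along each column (a sketch of the vertical
    interpolation step; `sampled` is a boolean mask of hit pixels)."""
    out = img.astype(float).copy()
    rows = np.arange(img.shape[0])
    for c in range(img.shape[1]):
        known = sampled[:, c]
        n = int(known.sum())
        if n >= 2:   # need at least two samples to interpolate between
            out[~known, c] = np.interp(rows[~known], rows[known], img[known, c])
        elif n == 1:  # flat fill from the lone sample
            out[:, c] = img[known, c][0]
    return out
```

A vectorized version (interpolating all columns at once) is the natural optimization for real-time use; the per-column loop above is kept for clarity.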
The spatial and temporal sampling advantages of Lissajous scanning, discussed in Section II, can also be thought of in terms of overall information sampling rates. Space-bandwidth product (SBP) is a unitless metric defined as FOV × maximum spatial frequency. To compare the SBP of raster- and Lissajous-scanned microscopes (i.e., to characterize bandwidth independent of optics), we compare three major factors: (1) FOV, (2) rate of travel of the focal point, and (3) sampling efficiency. For fixed-amplitude driving signals, driving both axes at resonance instead of just one results in an imaging area up to 2–3× larger, depending on the Q-factor of the scanning mirror and assuming a lack of other limiting factors such as off-axis aberrations or vignetting.
To compare rates of travel, we use the average scanning speed for a raster scan, approximated as a 1D equation,

$$\bar{v}_{\text{raster}} \approx 4 A_x f_x, \tag{12}$$

and that of a Lissajous scan,

$$\bar{v}_{\text{Lissajous}} \approx 4\sqrt{(A_x f_x)^2 + (A_y f_y)^2}, \tag{13}$$

where $A_x$, $A_y$ are the scan amplitudes and $f_x$, $f_y$ the scan frequencies of the fast and slow axes, respectively.
Switching from raster scanning (fx = 2034 Hz, fy = 5 Hz) to Lissajous scanning (fx = 2034 Hz, fy = 315 Hz) increases the average scanning speed by just 0.4%. However, as shown in Figure 2, raster scanning is unidirectional, meaning only one direction of the fast axis's sinusoidal motion is utilized. Bi-directional raster scanning is possible, but less commonly implemented because it requires phase control without increasing FOV. Thus, Lissajous scanning gains a factor of approximately two in effective scanning speed over raster scanning as a direct tradeoff for requiring phase control.
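The quoted 0.4% figure can be checked numerically by approximating each axis's average speed as 4Af (the mean speed of a sinusoid, which traverses 4A per period) and combining the two axes in quadrature. Assigning the 650 μm FOV dimension to the 2034 Hz fast axis is our assumption:

```python
import numpy as np

# Average speed of A*sin(2*pi*f*t) is exactly 4*A*f (4 amplitudes per period).
fx, fy = 2034.0, 315.0      # resonant scan frequencies, Hz
ax, ay = 325e-6, 185e-6     # scan amplitudes, m (half of the 650 x 370 um FOV)

v_raster = 4 * ax * fx                       # fast axis only (1D approximation)
v_liss = 4 * np.hypot(ax * fx, ay * fy)      # both axes combined in quadrature
increase = v_liss / v_raster - 1
print(f"{100 * increase:.1f}%")
```

Because the slow axis contributes in quadrature, its much lower amplitude-frequency product adds only fractionally to the total speed, consistent with the small increase quoted in the text.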
As discussed in Section II, Lissajous sampling efficiency is a function of update rate. In fact, sampling efficiency relies so heavily upon update rate that these gains in SBP can be nearly negated at slow update rates, or substantially increased with fast updating. For example, let us compare a microscope in either raster (FOV = 150 μm × 650 μm, 100% sampling efficiency) or Lissajous (FOV = 370 μm × 650 μm, 64% sampling efficiency) scanning mode, with an update rate of 5 Hz for both. Since optical resolution, one axis's deflection angle, and update rate cancel out, the increase in SBP is then 0.64 × 370 = 236.8 for Lissajous vs. 150 for raster, a factor of just 1.58×. When we increase the Lissajous update rate to 30 Hz, sampling efficiency drops to 45%, and the Lissajous-to-raster SBP ratio becomes (0.45 × 370 × 30)/(150 × 5) = 6.66×.
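The arithmetic above can be reproduced directly. The shared factors (optical resolution, the common 650 μm axis, one deflection angle) are omitted because they cancel between the two cases:

```python
# SBP comparison terms: sampling efficiency x differing FOV axis (um) x
# update rate (Hz); all factors common to both scan modes cancel out.
liss_5hz = 0.64 * 370 * 5     # Lissajous at 5 Hz: 64% efficiency, 370 um axis
rast_5hz = 1.00 * 150 * 5     # raster at 5 Hz: fully sampled, 150 um axis
liss_30hz = 0.45 * 370 * 30   # Lissajous at 30 Hz: efficiency drops to 45%

print(round(liss_5hz / rast_5hz, 2))    # 5 Hz Lissajous vs. 5 Hz raster
print(round(liss_30hz / rast_5hz, 2))   # 30 Hz Lissajous vs. 5 Hz raster
```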
Our solutions for phase control, boosting update rate, and mosaicing are applicable to most Lissajous scanning imaging modalities and setups. Modern optical microscopy techniques such as confocal [7], [33], [34], [35], [36], multiphoton [8], [37], [38], [39], [40], OCT [41], [42], [43], [44], [45], [46], [47], and resonant fiber scanning [37], [38], [39], [40], [43], [44], [45], [46], [47], [48] have all had recent success with Lissajous scanning, and are immediate candidates for effectively implementing the work shown here due to their similarity to our own microscopy setup. Other applicable modalities include high-speed atomic force microscopy (AFM) [20], [49], [50], [51], [52], magnetic particle imaging (MPI) [53], [54], [55], [56], and light detection and ranging (LIDAR) [57], [58], [59], [60]. Many of these systems are deployed on moving platforms such as cars, drones, and other robots, where mosaicing is already common in autonomous applications and replacing phase-sensing hardware with software would lighten the load.
This work could also be applied to raster scanning in two ways. First, our phase control algorithm (reduced to 1D) could be used to align the even and odd fields of a raster scan, converting traditional raster scanning to bi-directional scanning and increasing frame rate by a factor of two with virtually no alterations to the experimental setup. Second, frame rate could be further improved by increasing the slow-axis scanning speed beyond that dictated by Nyquist, resulting in images subsampled in the vertical direction, much as in this work. Through subsampling, vertical interpolation, and mosaicing to recover lost resolution, frame rate could be increased by at least another factor of two. Combined, these two techniques could improve the effective frame rate of a raster-scanned imaging device by a factor of at least four, potentially boosting single-digit frame rates to video rate.
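The first idea, aligning even and odd fields in 1D, might be sketched as follows. This is a hypothetical reduction using cross-correlation of row-averaged profiles; it assumes the odd (return-stroke) rows have already been mirrored back to left-to-right order:

```python
import numpy as np

def field_offset(even_rows, odd_rows):
    """Estimate the horizontal offset between the even and odd fields of
    a bidirectional raster scan (1D sketch of the phase-control idea).
    Returns the shift k such that np.roll(odd_profile, k) best aligns
    with the even profile."""
    e = even_rows.mean(axis=0)          # collapse each field to a 1D profile
    o = odd_rows.mean(axis=0)
    e = (e - e.mean()) / (e.std() + 1e-9)
    o = (o - o.mean()) / (o.std() + 1e-9)
    corr = np.correlate(e, o, mode="full")   # lags from -(N-1) to N-1
    return int(np.argmax(corr)) - (len(o) - 1)
```

In a closed loop, the estimated offset would feed a small correction to the assumed fast-axis phase each frame, analogous to the 2D version described earlier in the paper.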
Supplementary Material
Acknowledgments
We are grateful to Frank Schonig for his help designing mounting and tethering systems for the microscope.
This work was supported in part by the National Institutes of Health (R01-CA172895), by the National Science Foundation (1918074), and by the Department of Energy (234402).
Contributor Information
Nathan O. Loewke, Edward L. Ginzton Laboratory, Stanford University, Stanford, CA 94305 USA Department of Electrical Engineering, Stanford University, Stanford, CA 94305 USA.
Zhen Qiu, Department of Biomedical Engineering, Michigan State University, East Lansing, MI 48824 USA.
Michael J. Mandella, Department of Biomedical Engineering, Michigan State University, East Lansing, MI 48824 USA
Robert Ertsey, Department of Otolaryngology, Head and Neck Surgery, Stanford, CA, 94305, USA.
Adrienne Loewke, Institute for Neurodegenerative Diseases, UCSF, San Francisco, California, 94143, USA.
Lisa A. Gunaydin, Institute for Neurodegenerative Diseases, UCSF, San Francisco, California, 94143, USA Department of Psychiatry, UCSF, San Francisco, California, 94143, USA.
Eben L. Rosenthal, Department of Otolaryngology, Head and Neck Surgery, Stanford, CA, 94305, USA.
Christopher H. Contag, Department of Biomedical Engineering, Michigan State University, East Lansing, MI 48824 USA
Olav Solgaard, Edward L. Ginzton Laboratory, Stanford University, Stanford, CA 94305 USA; Department of Electrical Engineering, Stanford University, Stanford, CA 94305 USA.
References
- [1].Dickensheets D. and Kino G, “Micromachined scanning confocal optical microscope,” Optics letters, vol. 21, no. 10, pp. 764–766, 1996. [DOI] [PubMed] [Google Scholar]
- [2].Hoy CL, Durr NJ, Chen P, Piyawattanametha W, Ra H, Solgaard O, and Ben-Yakar A, “Miniaturized probe for femtosecond laser microsurgery and two-photon imaging,” Optics express, vol. 16, no. 13, pp. 9996–10005, 2008. [DOI] [PMC free article] [PubMed] [Google Scholar]
- [3].Liu T-M, Chan M-C, Chen I-H, Chia S-H, and Sun C-K, “Miniaturized multiphoton microscope with a 24hz frame-rate,” Optics Express, vol. 16, no. 14, pp. 10501–10506, 2008. [DOI] [PubMed] [Google Scholar]
- [4].Helmchen F, Fee MS, Tank DW, and Denk W, “A miniature headmounted two-photon microscope: high-resolution brain imaging in freely moving animals,” Neuron, vol. 31, no. 6, pp. 903–912, 2001. [DOI] [PubMed] [Google Scholar]
- [5].Flusberg BA, Jung JC, Cocker ED, Anderson EP, and Schnitzer MJ, “In vivo brain imaging using a portable 3.9 gram two-photon fluorescence microendoscope,” Optics letters, vol. 30, no. 17, pp. 2272–2274, 2005. [DOI] [PubMed] [Google Scholar]
- [6].Lissajous J, “Note sur un nouveau moyen de mettre en évidence le mouvement vibratoire des corps,” CR Hebd. Seances Acad. Sci, vol. 41, pp. 93–95, 1855. [Google Scholar]
- [7].Liu JT, Mandella MJ, Loewke NO, Haeberle H, Ra H, Piyawattanametha W, Solgaard OD, Kino GS, and Contag CH, “Micromirror-scanned dual-axis confocal microscope utilizing a gradient-index relay lens for image guidance during brain surgery,” Journal of biomedical optics, vol. 15, no. 2, p. 026029, 2010. [DOI] [PMC free article] [PubMed] [Google Scholar]
- [8].Hoy CL, Durr NJ, and Ben-Yakar A, “Fast-updating and nonrepeating lissajous image reconstruction method for capturing increased dynamic information,” Applied optics, vol. 50, no. 16, pp. 2376–2382, 2011. [DOI] [PubMed] [Google Scholar]
- [9].Sullivan SZ, Muir RD, Newman JA, Carlsen MS, Sreehari S, Doerge C, Begue NJ, Everly RM, Bouman CA, and Simpson GJ, “High frame-rate multichannel beam-scanning microscopy based on lissajous trajectories,” Optics express, vol. 22, no. 20, pp. 24224–24234, 2014. [DOI] [PMC free article] [PubMed] [Google Scholar]
- [10].Geiger AC, Newman JA, Sreehari S, Sullivan SZ, Bouman CA, and Simpson GJ, “Sparse sampling image reconstruction in lissajous trajectory beam-scanning multiphoton microscopy,” in HighSpeed Biomedical Imaging and Spectroscopy: Toward Big Data Instrumentation and Management II, vol. 10076. International Society for Optics and Photonics, 2017, p. 1007606. [Google Scholar]
- [11].Luo Y. and Andersson SB, “A comparison of reconstruction methods for undersampled atomic force microscopy images,” Nanotechnology, vol. 26, no. 50, p. 505703, 2015. [DOI] [PubMed] [Google Scholar]
- [12].Patel YG, Nehal KS, Aranda I, Li Y, Halpern AC, and Rajadhyaksha M, “Confocal reflectance mosaicing of basal cell carcinomas in mohs surgical skin excisions,” Journal of biomedical optics, vol. 12, no. 3, p. 034027, 2007. [DOI] [PubMed] [Google Scholar]
- [13].Vercauteren T, Perchant A, Malandain G, Pennec X, and Ayache N, “Robust mosaicing with correction of motion distortions and tissue deformations for in vivo fibered microscopy,” Medical image analysis, vol. 10, no. 5, pp. 673–692, 2006. [DOI] [PubMed] [Google Scholar]
- [14].Piyawattanametha W, Ra H, Mandella MJ, Loewke K, Wang TD, Kino GS, Solgaard O, and Contag CH, “3-d near-infrared fluorescence imaging using an mems-based miniature dual-axis confocal microscope,” IEEE Journal of Selected Topics in Quantum Electronics, vol. 15, no. 5, pp. 1344–1350, 2009. [Google Scholar]
- [15].Kose K, Gou M, Yelamos O, Cordova M, Rossi AM, Nehal KS, Flores ES, Camps O, Dy JG, Brooks DH et al. , “Automated video-mosaicking approach for confocal microscopic imaging in vivo: an approach to address challenges in imaging living tissue and extend field of view,” Scientific reports, vol. 7, no. 1, p. 10759, 2017. [DOI] [PMC free article] [PubMed] [Google Scholar]
- [16].Loewke KE, Camarillo DB, Piyawattanametha W, Mandella MJ, Contag CH, Thrun S, and Salisbury JK, “In vivo micro-image mosaicing,” IEEE Transactions on Biomedical Engineering, vol. 58, no. 1, pp. 159–171, 2011. [DOI] [PubMed] [Google Scholar]
- [17].Bedard N, Quang T, Schmeler K, Richards-Kortum R, and Tkaczyk TS, “Real-time video mosaicing with a high-resolution microendoscope,” Biomedical optics express, vol. 3, no. 10, pp. 2428–2435, 2012. [DOI] [PMC free article] [PubMed] [Google Scholar]
- [18].Behrens A, Bommes M, Stehle T, Gross S, Leonhardt S, and Aach T, “Real-time image composition of bladder mosaics in fluorescence endoscopy,” Computer Science-Research and Development, vol. 26, no. 1–2, pp. 51–64, 2011. [Google Scholar]
- [19].Csencsics E. and Schitter G, “Design of a phase-locked-loop-based control scheme for lissajous-trajectory scanning of fast steering mirrors,” in American Control Conference (ACC), 2017. IEEE, 2017, pp. 1568–1573. [Google Scholar]
- [20].Bazaei A, Yong YK, and Moheimani SR, “High-speed lissajous-scan atomic force microscopy: Scan pattern planning and control design issues,” Review of Scientific Instruments, vol. 83, no. 6, p. 063701, 2012. [DOI] [PubMed] [Google Scholar]
- [21].Hung AC-L, Lai HY-H, Lin T-W, Fu S-G, and Lu MS-C, “An electrostatically driven 2d micro-scanning mirror with capacitive sensing for projection display,” Sensors and Actuators A: Physical, vol. 222, pp. 122–129, 2015. [Google Scholar]
- [22].Habibullah H, Pota H, and Petersen I, “Phase-locked loop-based proportional integral control for spiral scanning in an atomic force microscope,” IFAC Proceedings Volumes, vol. 47, no. 3, pp. 6563–6568, 2014. [Google Scholar]
- [23].Mokhtar M. and Syms R, “Resonant fiber scanner with optical feedback,” Optics express, vol. 22, no. 21, pp. 25629–25634, 2014. [DOI] [PubMed] [Google Scholar]
- [24].Liu JT, Mandella MJ, Ra H, Wong LK, Solgaard O, Kino GS, Piyawattanametha W, Contag CH, and Wang TD, “Miniature near-infrared dual-axes confocal microscope utilizing a two-dimensional microelectromechanical systems scanner,” Optics letters, vol. 32, no. 3, pp. 256–258, 2007. [DOI] [PMC free article] [PubMed] [Google Scholar]
- [25].Hwang K, Seo Y-H, Ahn J, Kim P, and Jeong K-H, “Frequency selection rule for high definition and high frame rate lissajous scanning,” Scientific reports, vol. 7, no. 1, p. 14075, 2017. [DOI] [PMC free article] [PubMed] [Google Scholar]
- [26].Wang TD, Mandella MJ, Contag CH, and Kino GS, “Dual-axis confocal microscope for high-resolution in vivo imaging,” Optics letters, vol. 28, no. 6, pp. 414–416, 2003. [DOI] [PMC free article] [PubMed] [Google Scholar]
- [27].Groen FC, Young IT, and Ligthart G, “A comparison of different focus functions for use in autofocus algorithms,” Cytometry: The Journal of the International Society for Analytical Cytology, vol. 6, no. 2, pp. 81–91, 1985. [DOI] [PubMed] [Google Scholar]
- [28].Firestone L, Cook K, Culp K, Talsania N, and Preston K Jr, “Comparison of autofocus methods for automated microscopy,” Cytometry: The Journal of the International Society for Analytical Cytology, vol. 12, no. 3, pp. 195–206, 1991. [DOI] [PubMed] [Google Scholar]
- [29].Yoo J-C and Han TH, “Fast normalized cross-correlation,” Circuits, systems and signal processing, vol. 28, no. 6, p. 819, 2009. [Google Scholar]
- [30].Zuiderveld K, “Contrast limited adaptive histogram equalization,” Graphics gems, pp. 474–485, 1994.
- [31].Yousefi S, Qin J, Zhi Z, and Wang RK, “Uniform enhancement of optical micro-angiography images using rayleigh contrast-limited adaptive histogram equalization,” Quantitative imaging in medicine and surgery, vol. 3, no. 1, p. 5, 2013. [DOI] [PMC free article] [PubMed] [Google Scholar]
- [32].Berriman GB and Good J, “The application of the montage image mosaic engine to the visualization of astronomical images,” Publications of the Astronomical Society of the Pacific, vol. 129, no. 975, p. 058006, 2017. [Google Scholar]
- [33].Dickensheets DL and Kino GS, “Silicon-micromachined scanning confocal optical microscope,” Journal of Microelectromechanical Systems, vol. 7, no. 1, pp. 38–47, 1998. [Google Scholar]
- [34].Bechtel C, Knobbe J, Grüger H, and Lakner H, “Large field of view mems-based confocal laser scanning microscope for fluorescence imaging,” Optik-International Journal for Light and Electron Optics, vol. 125, no. 2, pp. 876–882, 2014. [Google Scholar]
- [35].Hwang K, Kim J-B, Seo Y-H, Ahn J, Kim P, and Jeong K-H, “Fully packaged video-rate confocal laser scanning endomicroscope using lissajous fiber scanner,” in Solid-State Sensors, Actuators and Microsystems (TRANSDUCERS), 2017 19th International Conference on. IEEE, 2017, pp. 143–146. [Google Scholar]
- [36].Li G, Li H, Duan X, Zhou Q, Zhou J, Oldham KR, and Wang TD, “Visualizing epithelial expression in vertical and horizontal planes with dual axes confocal endomicroscope using compact distal scanner,” IEEE transactions on medical imaging, vol. 36, no. 7, pp. 1482–1490, 2017. [DOI] [PubMed] [Google Scholar]
- [37].Myaing MT, MacDonald DJ, and Li X, “Fiber-optic scanning two-photon fluorescence endoscope,” Optics letters, vol. 31, no. 8, pp. 1076–1078, 2006. [DOI] [PubMed] [Google Scholar]
- [38].Liang W, Murari K, Zhang Y, Chen Y, Li M-J, and Li X, “Increased illumination uniformity and reduced photodamage offered by the lissajous scanning in fiber-optic two-photon endomicroscopy.” Journal of biomedical optics, vol. 17, no. 2, pp. 021108–021108, 2012. [DOI] [PMC free article] [PubMed] [Google Scholar]
- [39].Rao Y, Lin WY, and Chen L, “Image-based fusion for video enhancement of night-time surveillance,” Optical Engineering, vol. 49, no. 12, p. 120501, 2010. [Google Scholar]
- [40].Zhao Y, Nakamura H, and Gordon RJ, “Development of a versatile two-photon endoscope for biological imaging,” Biomedical optics express, vol. 1, no. 4, pp. 1159–1172, 2010. [DOI] [PMC free article] [PubMed] [Google Scholar]
- [41].Chen Y, Hong Y-J, Makita S, and Yasuno Y, “Three-dimensional eye motion correction by lissajous scan optical coherence tomography,” Biomedical optics express, vol. 8, no. 3, pp. 1783–1802, 2017. [DOI] [PMC free article] [PubMed] [Google Scholar]
- [42].Sun J, Guo S, Wu L, Liu L, Choe S-W, Sorg BS, and Xie H, “3d in vivo optical coherence tomography based on a low-voltage, large-scanrange 2d mems mirror,” Optics Express, vol. 18, no. 12, pp. 12065–12075, 2010. [DOI] [PubMed] [Google Scholar]
- [43].Park H-C, Seo Y-H, and Jeong K-H, “Lissajous fiber scanning for forward viewing optical endomicroscopy using asymmetric stiffness modulation,” Optics express, vol. 22, no. 5, pp. 5818–5825, 2014. [DOI] [PubMed] [Google Scholar]
- [44].Seo Y-H, Hwang K, Park H-C, and Jeong K-H, “Electrothermal mems fiber scanner for optical endomicroscopy,” Optics express, vol. 24, no. 4, pp. 3903–3909, 2016. [DOI] [PubMed] [Google Scholar]
- [45].Moon S, Lee S-W, Rubinstein M, Wong BJ, and Chen Z, “Semiresonant operation of a fiber-cantilever piezotube scanner for stable optical coherence tomography endoscope imaging,” Optics express, vol. 18, no. 20, pp. 21183–21197, 2010. [DOI] [PMC free article] [PubMed] [Google Scholar]
- [46].Wu T, Ding Z, Wang K, Chen M, and Wang C, “Two-dimensional scanning realized by an asymmetry fiber cantilever driven by single piezo bender actuator for optical coherence tomography,” Optics express, vol. 17, no. 16, pp. 13819–13829, 2009. [DOI] [PubMed] [Google Scholar]
- [47].Zhang N, Tsai T-H, Ahsen OO, Liang K, Lee H-C, Xue P, Li X, and Fujimoto JG, “Compact piezoelectric transducer fiber scanning probe for optical coherence tomography,” Optics letters, vol. 39, no. 2, pp. 186–188, 2014. [DOI] [PMC free article] [PubMed] [Google Scholar]
- [48].Seibel EJ, Johnston RS, and Melville CD, “A full-color scanning fiber endoscope,” in Optical Fibers and Sensors for Medical Diagnostics and Treatment Applications VI, vol. 6083. International Society for Optics and Photonics, 2006, p. 608303. [Google Scholar]
- [49].Wu J-W, Lin Y-T, Lo Y-T, Liu W-C, and Fu L-C, “Lissajous hierarchical local scanning to increase the speed of atomic force microscopy,” IEEE Transactions on Nanotechnology, vol. 14, no. 5, pp. 810–819, 2015. [Google Scholar]
- [50].Yong YK, Bazaei A, and Moheimani SR, “Video-rate lissajous-scan atomic force microscopy,” IEEE Transactions on Nanotechnology, vol. 13, no. 1, pp. 85–93, 2014. [Google Scholar]
- [51].Maroufi M, Fowler AG, Bazaei A, and Moheimani SR, “High-stroke silicon-on-insulator mems nanopositioner: Control design for non-raster scan atomic force microscopy,” Review of Scientific Instruments, vol. 86, no. 2, p. 023705, 2015. [DOI] [PubMed] [Google Scholar]
- [52].Bazaei A, Maroufi M, Fowler AG, and Moheimani SR, “Internal model control for spiral trajectory tracking with mems afm scanners,” IEEE Transactions on Control Systems Technology, vol. 24, no. 5, pp. 1717–1728, 2016. [Google Scholar]
- [53].Knopp T, Biederer S, Sattel T, Weizenecker J, Gleich B, Borgert J, and Buzug T, “Trajectory analysis for magnetic particle imaging,” Physics in Medicine & Biology, vol. 54, no. 2, p. 385, 2008. [DOI] [PubMed] [Google Scholar]
- [54].Panagiotopoulos N, Duschka RL, Ahlborg M, Bringout G, Debbeler C, Graeser M, Kaethner C, Lüdtke-Buzug K, Medimagh H, Stelzner J. et al. , “Magnetic particle imaging: current developments and future directions,” International journal of nanomedicine, vol. 10, p. 3097, 2015. [DOI] [PMC free article] [PubMed] [Google Scholar]
- [55].Kaethner C, Erb W, Ahlborg M, Szwargulski P, Knopp T, and Buzug TM, “Non-equispaced system matrix acquisition for magnetic particle imaging based on lissajous node points,” IEEE transactions on medical imaging, vol. 35, no. 11, pp. 2476–2485, 2016. [DOI] [PubMed] [Google Scholar]
- [56].Werner F, Gdaniec N, and Knopp T, “First experimental comparison between the cartesian and the lissajous trajectory for magnetic particle imaging,” Physics in Medicine & Biology, vol. 62, no. 9, p. 3407, 2017. [DOI] [PubMed] [Google Scholar]
- [57].Weiss U. and Biber P, “Plant detection and mapping for agricultural robots using a 3d lidar sensor,” Robotics and autonomous systems, vol. 59, no. 5, pp. 265–273, 2011. [Google Scholar]
- [58].Anderson JW and Clayton GM, “Lissajous-like scan pattern for a gimballed lidar,” in Advanced Intelligent Mechatronics (AIM), 2014 IEEE/ASME International Conference on. IEEE, 2014, pp. 1171–1176. [Google Scholar]
- [59].Giese T. and Janes J, “2d mems scanning for lidar with sub-nyquist sampling, electronics, and measurement procedure,” in Three-Dimensional Imaging, Visualization, and Display 2015, vol. 9495. International Society for Optics and Photonics, 2015, p. 94950F. [Google Scholar]
- [60].Kasturi A, Milanovic V, Atwood BH, and Yang J, “Uav-borne lidar with mems mirror-based scanning capability,” in Laser Radar Technology and Applications XXI, vol. 9832. International Society for Optics and Photonics, 2016, p. 98320M. [Google Scholar]
- [61].Ra H, Piyawattanametha W, Mandella MJ, Hsiung P-L, Hardy J, Wang TD, Contag CH, Kino GS, and Solgaard O, “Three-dimensional in vivo imaging by a handheld dual-axes confocal microscope,” Optics express, vol. 16, no. 10, pp. 7224–7232, 2008. [DOI] [PMC free article] [PubMed] [Google Scholar]
- [62].Ra H, Piyawattanametha W, Gonzalez-Gonzalez E, Mandella MJ, Kino GS, Solgaard O, Leake D, Kaspar RL, Oro A, and Contag CH, “In vivo imaging of human and mouse skin with a handheld dual-axis confocal fluorescence microscope,” Journal of Investigative Dermatology, vol. 131, no. 5, pp. 1061–1066, 2011. [DOI] [PMC free article] [PubMed] [Google Scholar]
- [63].Ra H, Piyawattanametha W, Taguchi Y, Lee D, Mandella MJ, and Solgaard O, “Two-dimensional mems scanner for dual-axes confocal microscopy,” Journal of Microelectromechanical systems, vol. 16, no. 4, pp. 969–976, 2007. [Google Scholar]