The Review of Scientific Instruments
. 2012 Apr 6;83(4):043701. doi: 10.1063/1.3697747

Stereoscopic high-speed imaging using additive colors

Georgy N Sankin 1,a), David Piech 1, Pei Zhong 1,a)
PMCID: PMC3331865  PMID: 22559533

Abstract

An experimental system for digital stereoscopic imaging using a high-speed color camera is described. Two bright-field projections of a three-dimensional object are captured utilizing additive-color backlighting (blue and red). The two images are combined simultaneously on a single two-dimensional image sensor by a set of dichroic mirrors and stored for off-line separation of each projection. The method is demonstrated by analyzing cavitation bubble dynamics near boundaries, and may also be useful for flow visualization and machine vision applications.


Recent advances in digital imaging have significantly reduced data acquisition time, making it an indispensable technique in scientific research. Although the technique is often used for flow visualization in transparent fluids containing bubbles, drops, or biological cells,1 understanding the complex shape and motion of an object in three-dimensional (3D) space is often limited by observation from a single view. For example, the collapse of a bubble near a boundary reveals dissimilar shape changes and flow patterns when observed from directions parallel versus perpendicular to the boundary.2 Hence, it is important to observe the interaction simultaneously in two orthogonal directions. Such efforts have been made via stereoscopic imaging using a mirror system3 or multiple synchronized video cameras.4 However, a mirror system does not allow the individual optical paths to be trimmed or separated because the illumination spectrum is uniform, while the use of multiple cameras clutters the setup and increases cost.

In this paper we introduce a method for simultaneous imaging of fluid dynamics from two orthogonal projection planes using a color digital camera combined with additive color illumination. This method is illustrated by examining the effect of two boundaries on the dynamics of cavitation bubbles produced between them.

Figure 1 shows schematically the experimental setup for producing two orthogonal projection images of a 3D object, which are superimposed and captured simultaneously by using a high-speed 14-bit color CMOS camera (Vision Research Inc., Phantom v.7.3, 1 μs exposure). In general, color imaging in a digital camera is implemented by depositing a photoresist color filter array (CFA) in a mosaic pattern on top of the sensor array. Specifically, the Phantom camera utilizes the Bayer filter pattern,5 which alternates a row of red and green filters with a row of blue and green filters. The filters in the CFA are not evenly divided: the partition is 50% green, 25% red, and 25% blue, known as the RGBG array. This method has the advantage that only one image sensor is required, with all the color information (red, green, and blue) recorded simultaneously. The true color of a single pixel can be determined by averaging the values from the closest surrounding pixels.6
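The Bayer sampling and neighbor-averaging reconstruction described above can be illustrated with a minimal sketch in Python. This is not the camera's proprietary demosaicing algorithm; it assumes an RGGB tiling and simple averaging of the nearest same-color pixels within a 3 × 3 neighborhood:

```python
import numpy as np

def make_bayer_mosaic(rgb):
    """Sample a full-color image through an RGGB Bayer filter array:
    rows alternate red/green and green/blue filters, giving the
    50% green, 25% red, 25% blue partition described in the text."""
    h, w, _ = rgb.shape
    mosaic = np.zeros((h, w))
    mosaic[0::2, 0::2] = rgb[0::2, 0::2, 0]  # red pixels
    mosaic[0::2, 1::2] = rgb[0::2, 1::2, 1]  # green pixels (red rows)
    mosaic[1::2, 0::2] = rgb[1::2, 0::2, 1]  # green pixels (blue rows)
    mosaic[1::2, 1::2] = rgb[1::2, 1::2, 2]  # blue pixels
    return mosaic

def demosaic_bilinear(mosaic):
    """Recover per-pixel RGB by averaging the closest same-color
    pixels in a 3x3 window around each sensor pixel."""
    h, w = mosaic.shape
    masks = np.zeros((h, w, 3), dtype=bool)
    masks[0::2, 0::2, 0] = True   # red locations
    masks[0::2, 1::2, 1] = True   # green locations
    masks[1::2, 0::2, 1] = True
    masks[1::2, 1::2, 2] = True   # blue locations
    pad = np.pad(mosaic, 1)
    padm = np.pad(masks, ((1, 1), (1, 1), (0, 0)))
    out = np.zeros((h, w, 3))
    for c in range(3):
        num = np.zeros((h, w))
        den = np.zeros((h, w))
        for dy in (-1, 0, 1):
            for dx in (-1, 0, 1):
                num += pad[1 + dy:1 + dy + h, 1 + dx:1 + dx + w] \
                    * padm[1 + dy:1 + dy + h, 1 + dx:1 + dx + w, c]
                den += padm[1 + dy:1 + dy + h, 1 + dx:1 + dx + w, c]
        out[..., c] = num / np.maximum(den, 1)
    return out
```

For a uniformly colored scene, sampling through the mosaic and averaging back recovers the original color exactly, since every 3 × 3 neighborhood of the RGGB pattern contains all three filter colors.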

Figure 1.


A schematic diagram of the optical setup for stereoscopic imaging of an object in 3D (i.e., an oscillating bubble in a fluid layer between two elastic boundaries, see inset), backlit by separate colors generated using a fiber-coupled halogen light source (1) combined with stepped neutral density (2), red (2), and blue (3) filters. The y-z and x-y projection images of the object, in red and blue, are reflected by a dichroic mirror (4) and a metallic mirror (5), respectively, and combined through a dichroic beamsplitter (6) before being recorded simultaneously by a digital color camera (7). The microfluidic assembly consists of a gold-coated glass plate (11) and PDMS (10), which are positioned within a glass water tank and form the channel (9) between them. The laser beam (8) passes through the dichroic mirror (4) and is focused onto the gold layer to produce a cavitation bubble within the camera's field of view (FOV).

To compose a stereoscopic image, we used two matched pairs of beamsplitters and colored backlights. As each pixel in the camera has a peak sensitivity to red, green, or blue light [Fig. 2], there are six possible combinations of colors that match the color response of the CFA. We used red and blue backlights placed in two orthogonal directions to illuminate the object. In Fig. 1, color backlighting was formed by filtering light from a 150-W halogen light source (1) (Dolan-Jenner Industries, Fiber-Lite Model 180) with two gooseneck light guides using additive filter sets (Edmund Optics, C46-140). The y-z projection (“X” image) was illuminated by red light (2), which was reflected by a dichroic mirror (4) (Thorlabs, DMLP900) before transmitting through a long-pass dichroic beamsplitter (6) (Thorlabs, DMLP567R) into the camera (7). In comparison, the x-y projection (“Z” image) was backlit by blue light (3), which was reflected by a metallic mirror (5) (Edmund Optics, NT43-873) and subsequently by the dichroic beamsplitter (6). The mirrors were aligned so that the x-y and y-z projections were superimposed and recorded simultaneously by the camera through a long-distance microscope (Infinity K2 with CF-4 lens) at the same magnification. Brightness was balanced between the channels using a stepped neutral density filter (Edmund Optics, R32-599) placed in front of the red filter. The maximum framing rate of the camera is inversely related to the number of pixels used; for example, at 88 888 frames/s, the image resolution is 128 × 128 px.
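The quoted trade-off between framing rate and pixel count can be expressed as a first-order estimate. The fixed pixel-throughput model below is an assumption calibrated from the 88 888 frames/s at 128 × 128 px figure in the text, not a published camera specification; real sensors also carry per-frame readout overhead:

```python
# Sensor pixel throughput (px/s) inferred from 88 888 frames/s at 128 x 128 px.
PIXEL_THROUGHPUT = 88_888 * 128 * 128

def max_frame_rate(width, height, throughput=PIXEL_THROUGHPUT):
    """Estimate the maximum framing rate for a given image resolution,
    assuming readout sustains a fixed number of pixels per second, so
    the frame rate scales inversely with the pixel count."""
    return throughput / (width * height)
```

Under this model, doubling both image dimensions (quadrupling the pixel count) cuts the maximum framing rate by a factor of four.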

Figure 2.


Image acquisition and post-processing are based on the spectral characteristics of the camera (dashed lines) and the backlight filter sets (solid lines). The “X” image is formed using light from the red source (defined by the “red” backlight curve SX) and registered by the “red” channel in the camera (characterized by spectral response curve DX in the 580–700 nm spectral range), while the “Z” image is created using light from the blue source (defined by the “blue” backlight curve SZ) and registered by the “blue” channel in the camera (characterized by spectral response curve DZ in the 400–510 nm spectral range).

The y-z projection, detected by the red pixels, and the x-y projection, detected by the blue pixels, were captured by the camera sensor and converted into an equal-sized true-color image through a specialized demosaicing algorithm (Phantom Camera Control Software v675.2, Vision Research Inc.). Subsequently, post-processing of the images in MATLAB (The MathWorks, R2010b) allowed decomposition of the two images by isolating the “red” or “blue” value of each pixel, thus reproducing separate grayscale “X” and “Z” images, respectively. Considering the 5.33× magnification of the microscope and the 22 μm pixel size of the camera sensor, the resolution and depth of field of the imaging system were estimated to be 4.1 μm/pixel and 0.5 mm, respectively. Using a grid target (Edmund Optics, R36-121), the image resolution after color decomposition (which halves the resolution relative to the original image) was found to be about 10 μm, consistent with our estimation.
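The color-decomposition step and the pixel-scale arithmetic above can be sketched in Python (the actual processing used the Phantom software and MATLAB; the function and variable names here are illustrative):

```python
import numpy as np

def split_projections(rgb_frame):
    """Separate a demosaiced RGB frame (h x w x 3 float array) into the
    two grayscale projections: the red channel carries the red-backlit
    y-z ("X") view, the blue channel the blue-backlit x-y ("Z") view.
    The green channel is discarded."""
    x_view = rgb_frame[..., 0]  # y-z projection
    z_view = rgb_frame[..., 2]  # x-y projection
    return x_view, z_view

# Object-plane scale from the quoted numbers: 22 um sensor pixels
# imaged at 5.33x magnification give ~4.1 um per pixel.
pixel_size_um = 22 / 5.33
```

A frame illuminated only by the red backlight would appear entirely in the “X” view and leave the “Z” view dark, which is the basis for separating the two superimposed projections.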

A wedge-shaped channel was constructed from polydimethylsiloxane (PDMS) using a master mold made of a 1-mm pitch Fresnel lens (Edmund Optics, P32-682) via soft lithography. The PDMS channel was mechanically clamped onto a soda-lime glass plate (1 mm thick, 6 mm wide). Before assembly, the glass surface was coated with a semi-transparent 15-nm gold layer using an electron beam evaporator (Kurt Lesker, PVD 75). The PDMS-glass assembly was connected to a 3-axis positioning stage and suspended in a transparent glass cuvette filled with a glycerol/water mixture (42:58 ratio by weight, dynamic viscosity μ = 3.5 cP at 25 °C).

Using a Q-switched Nd:YAG laser (New Wave Research, Tempest 10; 1064 nm wavelength, 5 ns pulse duration, 1–200 mJ pulse energy), a single cavitation bubble can be produced through localized vaporization of the fluid.2, 7 The laser beam was focused onto the gold-coated glass surface to enhance laser light absorption via metal ablation.8 While the gold coating absorbed 70% of the incident laser energy, it still transmitted sufficient visible light for imaging.

This method for generating well-controlled cavitation activity has been used in microfluidics to create unique flow fields that include vortices and liquid jets in the channel for single cell manipulation, such as cell displacement9 and membrane poration.10 The characteristics of this flow field can be widely varied by the size, position, and oscillation phase of the bubbles.11

The complex bubble dynamics between tilted boundaries are influenced by the narrowing gap and wall elasticity, and are thus difficult to capture from a single projection. Figure 3 shows an example of the original video [Fig. 3a], together with frames obtained after color decomposition corresponding to the front view [Fig. 3b] and side view [Fig. 3c] of the object, respectively. From the side view, the bubble can be seen emerging on the glass plate with a hemispherical shape during expansion, while from the front view it appears circular, with uncertainty about the contact area between the bubble and each boundary. Moreover, deformation of the PDMS wall is noticeable in the side-view projection after the maximum bubble expansion at t = 22 μs. Subsequently, the bubble starts to collapse asymmetrically, forming a jet toward the PDMS wall and the thinner section of the gap. The flow induced inside the gap causes the bubble to split into two fragments at t = 56 μs, which are visible from both the side and front projections. Interestingly, compared to the top fragment, which continues to rotate around the Z-axis (side view), the bottom fragment translates while deforming and jetting toward the narrower gap before finally breaking into two smaller fragments. All three bubble fragments can be seen from the front view; however, PDMS deformation and fluid rotation can only be resolved from the side view [see supplementary movie, Fig. 3].

Figure 3.


Representative image series (image size 0.6 × 0.6 mm) of bubble dynamics between tilted boundaries before (a) and after (b and c) color separation, showing the bubble-wall interaction in two perpendicular projections acquired in the same laser shot. Side view (c) shows PDMS wall deformation (second frame) and jetting toward the PDMS wall (fourth frame). Front view (b) shows all three bubble fragments resulting from jetting and translation of the bubble (enhanced online).

We have described an optical method for digital stereoscopic image acquisition using a single color camera. To fully exploit the color capability and speed of the camera, we overlapped additive images from two orthogonal directions on a single 2D image sensor and recorded them simultaneously to resolve 3D bubble dynamics. This approach decreases clutter within the experimental setup, increases versatility and processing efficiency, and ensures a precise temporal match between corresponding frames of the two orthogonal projections. The image quality achievable with this technique can be further improved using image sensors with better color separation characteristics and narrow band-pass interference filters.

The cavitation experiment shows that both projections can be acquired from the same video after color separation and analyzed to reveal the cavitation bubble dynamics in detail. As such, this approach may provide a cost-effective method for stereoscopic imaging. Combined with particle image velocimetry, it may help to better define the cavitation-induced flow field and the associated stresses in channels, which are critical to the development of new methods in hydrodynamics and microfluidics.

Acknowledgments

The authors acknowledge Fang Yuan and Vision Research Inc. for technical support. This work was supported in part by the National Institutes of Health (NIH) through Grant Nos. R37-DK052985 and S10-RR16802.

References

  1. Thoroddsen S. T., Etoh T. G., and Takehara K., Ann. Rev. Fluid Mech. 40, 257 (2008). 10.1146/annurev.fluid.40.111406.102215 [DOI] [Google Scholar]
  2. Lindau O. and Lauterborn W., J. Fluid Mech. 479, 327 (2003); 10.1017/S0022112002003695 [DOI] [Google Scholar]; Tomita Y. and Shima A., Acustica 71(3), 161 (1990). [Google Scholar]
  3. Ohl C. D., Tijink A., and Prosperetti A., J. Fluid Mech. 482, 271 (2003); 10.1017/S0022112003004117 [DOI] [Google Scholar]; Luther S., Rensen J., and Guet S., Exp. Fluids 36(2), 326 (2004); 10.1007/s00348-003-0725-7 [DOI] [Google Scholar]; Choi S., Kim S., and Park J., Lab Chip 10(3), 335 (2010). 10.1039/b915047a [DOI] [PubMed] [Google Scholar]
  4. Appel J., Koch P., Mettin R., Krefting D., and Lauterborn W., Ultrason. Sonochem. 11(1), 39 (2004); 10.1016/S1350-4177(03)00111-1 [DOI] [PubMed] [Google Scholar]; Kim J. and Longmire E. K., Exp. Fluids 47(2), 263 (2009); 10.1007/s00348-009-0659-9 [DOI] [Google Scholar]; Ortiz-Duenas C., Kim J., and Longmire E. K., Exp. Fluids 49(1), 111 (2010). 10.1007/s00348-009-0810-7 [DOI] [Google Scholar]
  5. Bayer B. E., U.S. patent 3,971,065 (July 20, 1976).
  6. Sakamoto T., Nakanishi C., and Hase T., IEEE Trans. Consum. Electron. 44(4), 1342 (1998). 10.1109/30.735836 [DOI] [Google Scholar]
  7. Sankin G. N., Zhou Y. F., and Zhong P., J. Acoust. Soc. Am. 123(6), 4071 (2008). 10.1121/1.2903865 [DOI] [PMC free article] [PubMed] [Google Scholar]
  8. Yang G. W., Prog. Mater. Sci. 52(4), 648 (2007). 10.1016/j.pmatsci.2006.10.016 [DOI] [Google Scholar]
  9. Lautz J., Sankin G. N., Yuan F., and Zhong P., Appl. Phys. Lett. 97(18), 183701 (2010). 10.1063/1.3511538 [DOI] [PMC free article] [PubMed] [Google Scholar]
  10. Sankin G. N., Yuan F., and Zhong P., Phys. Rev. Lett. 105(7), 078101 (2010). 10.1103/PhysRevLett.105.078101 [DOI] [PMC free article] [PubMed] [Google Scholar]
  11. Zwaan E., Le Gac S., Tsuji K., and Ohl C. D., Phys. Rev. Lett. 98(25), 254501 (2007); 10.1103/PhysRevLett.98.254501 [DOI] [PubMed] [Google Scholar]; Yuan F., Sankin G., and Zhong P., J. Acoust. Soc. Am. 130(5), 3339 (2011). 10.1121/1.3626134 [DOI] [PMC free article] [PubMed] [Google Scholar]
