Author manuscript; available in PMC: 2018 Oct 10.
Published in final edited form as: Opt Commun. 2017 Jul 25;404:51–54. doi: 10.1016/j.optcom.2017.07.025

Spatial and spectral imaging of point-spread functions using a spatial light modulator

Sravan Munagavalasa a,b,#, Bryce Schroeder a,c,#, Xuanwen Hua a, Shu Jia a,c,*
PMCID: PMC6179356  NIHMSID: NIHMS983614  PMID: 30319153

Abstract

We develop a point-spread function (PSF) engineering approach to imaging the spatial and spectral information of molecular emissions using a spatial light modulator (SLM). We show that a dispersive grating pattern imposed upon the emission reveals spectral information. We also propose a deconvolution model that allows the decoupling of the spectral and 3D spatial information in engineered PSFs. The work is readily applicable to single-molecule measurements and fluorescence microscopy.

1. Introduction

Super-resolution fluorescence imaging techniques have overcome the optical diffraction limit (~ λ/2) of conventional fluorescence microscopy, allowing visualization of biological structures and processes with near-molecular-scale resolution [1,2]. These include methods based on patterned illumination, such as stimulated emission depletion (STED and related RESOLFT) microscopy [1,3,4] and saturated structured illumination microscopy ((s)SIM) [5,6] as well as methods based on single-molecule switching and localization, such as stochastic optical reconstruction microscopy (STORM) [7] and (fluorescence) photoactivated localization microscopy ((F)PALM) [8,9]. To date, the rapid development of these techniques has significantly advanced biological research in many unexplored regimes.

Among these techniques, single-molecule based methods rely on precise localization of the point spread functions (PSFs) of individual fluorophores [7–9]. For three-dimensional (3D) localization, various PSF engineering strategies have been developed using axial information encoded in the PSF, including astigmatic [10,11], biplane [12], double-helix [13,14], interferometric [15,16], virtual-volume [17], cork-screw [18], self-bending [19], and tetrapod [20,21] PSFs. These methods have advanced single-molecule-based super-resolution microscopy, achieving significantly improved 3D resolving power and imaging depth. However, the spectral dimension of molecular emissions has long been overlooked in super-resolution microscopy. Addressing this demand, several emerging techniques have successfully extracted the spectral information from the PSFs, implementing new PSF engineering concepts in the spectral domain using various dispersive optical elements, such as a dual-objective system consisting of a prism in one channel [22], spectral photon localization using a diffraction grating [23,24], mechanical slits [25], and spectrally-dependent multi-color PSF engineering [26]. These methods enable single-molecule spectroscopy without compromising the 3D localization, allowing the simultaneous identification of both spatial and spectral dimensions of molecular emissions. In this paper, we demonstrate the principle of simultaneous spatial and spectral measurement of molecular emissions using a spatial light modulator (SLM)-based diffraction grating, i.e., a pattern imposed on the SLM that disperses the incoming spectrum over a range of deflection angles.

2. Experimental setup and results

In our experimental realization (Fig. 1(a)), we placed an SLM (PLUTO-VIS, Holoeye) at the Fourier plane to implement phase modulation in the detection path of the microscope. Sub-diffraction-limit 100-nm fluorescent beads (TetraSpeck, Thermo Fisher) were used as point emitters, and their images were recorded using a 100×, 1.45-NA oil-immersion objective lens (CFI Plan Apo Lambda, Nikon). The fluorescent images represent the PSF of the imaging system, the wavefront of which can be engineered using a phase pattern applied with the SLM. Here, we adopted a sawtooth diffraction grating on the SLM. The zeroth-order diffraction is the original, unmodulated PSF, whereas in the first-order diffraction, the spectral components of the fluorescent emission are dispersed into different diffraction angles. This angular discrimination thus converts spectral information into a wavelength-dependent spatial displacement in the image plane on the camera (Fig. 1(b)). Theoretically, the diffraction grating together with the Fourier-transform lens produces a displacement Δx = fmλ/d, where m is the diffraction order (m = 1), d is the period of the grating, λ is the wavelength, and f is the focal length of the Fourier-transform lens.
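As a quick numerical sketch of this relation (the function name and unit conventions are ours; the parameter defaults are the grating period and focal length quoted in Section 2):

```python
def first_order_displacement(wavelength_m, d_m=48.0e-6, f_m=0.20, m=1):
    """Lateral displacement dx = f*m*lambda/d of the m-th diffraction
    order at the camera plane; all lengths in meters."""
    return f_m * m * wavelength_m / d_m

# For the two emission peaks used in the experiment:
dx_580 = first_order_displacement(580e-9)  # ~2.42 mm
dx_680 = first_order_displacement(680e-9)  # ~2.83 mm
```

The displacement is linear in wavelength, so a denser grating (smaller d) or a longer focal length both increase the spectral dispersion at the camera.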

Fig. 1.

(a) Experimental setup. The objective lens (OBJ) and tube lens (TL) form an image of the sample at the intermediate plane (black dashed line), which is relayed to the EMCCD camera by two relay lenses, L1 and L2. The spatial light modulator (SLM), situated at the focal plane of the relay lenses, imparts phase modulation that disperses fluorescent emissions. Laser, two lines at 561 nm and 647 nm; DM, dichroic mirror. (b) Principle of the dispersive diffraction grating. Each spectral component of the 1st-order beam propagates away at an angle that depends on its wavelength. The 0th-order beam remains unmodulated.

We experimentally demonstrated this principle with two fluorescent emission spectra, peaked at 580 nm and 680 nm. As shown in Fig. 2(a), the PSFs of the two spectra were initially aligned in their unmodulated zeroth-order diffraction. Their first-order diffraction images exhibit different displacements due to the wavelength-dependent dispersion through the grating. Numerically, we performed simulations of the dispersive profiles using vendor-provided spectral data of the fluorophores, which agree well with the experimentally measured PSFs and cross-sectional profiles (Fig. 2(a), (b)). The position of the zeroth-order diffraction can be determined by 2D Gaussian fitting, and the first-order position by measuring the centroid of intensity. The statistical distribution of the centroids of the first-order images for each spectrum exhibits an easily discernible difference in separation distance between the orders, providing a means of resolving spectral components (Fig. 2(c)). Inserting the experimental settings (d = 48.0 μm, λ = 580 nm or 680 nm, f = 20 cm) into the above equation, we derived the displacement Δx of the first-order diffraction as 2.42 mm or 2.83 mm (equivalently, 151.3 or 176.9 pixels on the camera), consistent with the experimental observations of 163.3 and 184.3 pixels, respectively (Fig. 2(c)).
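Because the displacement is linear in wavelength, the calibration can be inverted to estimate an emission wavelength from a measured 0th-to-1st-order separation. A minimal sketch, assuming a 16-μm camera pixel pitch (not stated in the text, but consistent with the quoted millimeter-to-pixel conversion, e.g. 2.42 mm ≈ 151.3 pixels):

```python
PIXEL_M = 16e-6  # assumed EMCCD pixel pitch (m), inferred from 2.42 mm ~ 151.3 px

def wavelength_from_pixels(sep_px, d_m=48.0e-6, f_m=0.20, m=1):
    """Invert dx = f*m*lambda/d: lambda = dx*d/(f*m), with dx measured in pixels."""
    return sep_px * PIXEL_M * d_m / (f_m * m)

lam_580 = wavelength_from_pixels(151.3)  # ~580 nm
lam_680 = wavelength_from_pixels(176.9)  # ~680 nm
```

In practice the measured separations (163.3 and 184.3 pixels) differ from the theoretical values by a roughly constant offset, which a pre-calibration step of this mapping would absorb.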

Fig. 2.

(a) Experimental data and numerical simulations of the dispersive pattern in two spectra (peaked at 580 nm and 680 nm). Left, images of the 0th-order (unmodulated) beam; right, images of the 1st-order (dispersed) beam. Simulation uses vendor-supplied data for the fluorescence emission spectrum. Scale bar, 1 μm. (b) Normalized cross-sectional intensity profiles of the 0th (left) and 1st (right) -order beams for the fluorescent emissions peaked at 580 nm (green solid line) and 680 nm (red solid line). Corresponding red and green dashed lines show the result expected from the vendor-provided fluorescence emission spectrum. Good agreement is noted. (c) Localizations of fluorescent emitters show that the distance between the 0th and 1st-order images reliably differentiates the two spectra (mean = 163.3 and 184.3 pixels, s.d. = 5.0 and 1.2 pixels, for the 580-nm and 680-nm spectra, respectively; n = 21). (For interpretation of the references to color in this figure legend, the reader is referred to the web version of this article.)

3. Decoupling spatial and spectral information

Furthermore, PSF engineering in the spectral domain provides a path to simultaneous spectral and 3D spatial imaging of molecular emissions. However, such an engineered PSF contains multiplexed spectral and spatial information. Methods for decoupling and recovering each type of information have been reported, such as utilizing dual channels [22] and integrating multi-spectral phase modulation [26]. The SLM-based grating method can perform PSF engineering with merged grating and axial-information-encoding patterns, which leads to a first-order image in which spatial and spectral information are coupled. Here we develop a deconvolution model to address this problem. As a proof of principle, we utilized the astigmatic PSF [10] for 3D localization and thus combined an astigmatic phase mask with the diffraction grating on the SLM.
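The merging of the two patterns can be sketched as a simple phase addition on the SLM. The sketch below is illustrative, not the original code: the pixel count and pitch are the nominal PLUTO-VIS specifications, and the astigmatism strength `astig` is an arbitrary assumed value.

```python
import numpy as np

def combined_phase(nx=1920, ny=1080, pitch=8.0e-6, period=48.0e-6, astig=2.0):
    """Merged SLM pattern: sawtooth grating plus an astigmatic phase term."""
    x = (np.arange(nx) - nx / 2) * pitch              # SLM pixel coordinates (m)
    y = (np.arange(ny) - ny / 2) * pitch
    X, Y = np.meshgrid(x, y)
    grating = 2 * np.pi * X / period                  # linear ramp -> sawtooth after wrapping
    r_max = max(np.abs(x).max(), np.abs(y).max())
    astigmatism = astig * (X**2 - Y**2) / r_max**2    # astigmatic (x^2 - y^2) phase
    return np.mod(grating + astigmatism, 2 * np.pi)   # phase wrapped to [0, 2*pi)
```

Wrapping the summed phase modulo 2π turns the linear ramp into the sawtooth grating while preserving the slowly varying astigmatic term, which is what produces the coupled (dispersed and elliptical) first-order PSF discussed next.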

The propagation of the astigmatic PSF was simulated using a beam-propagation method [27], and its axial-stack images are shown in Fig. 3(a). With astigmatism, the ellipticity of the PSF varies during propagation, which can be used to determine the axial position of the emitter. When the diffraction grating is added, the astigmatic PSF profiles are dispersed horizontally, along the orientation of the grating, and become unsuitable for axial localization due to the smeared ellipticity (Fig. 3(b)). Our proposed model for recovering the spatial and spectral information is as follows. First, we determine the spectral information of the PSFs by measuring the positions of the centroids of the images in Fig. 3(b), i.e., by obtaining the displacement of the first-order from the zeroth-order images, which obeys a pre-calibrated relationship as exemplified in Fig. 2. Next, by deconvolving the PSF with the corresponding spectral profile using the scikit-image unsupervised Wiener–Hunt deconvolution algorithm in Python [28], the spatial-only astigmatic PSF, and thus the 3D spatial information, can be revealed. Following this two-step process, the spectral and spatial information are decoupled and can be individually identified. The dispersed astigmatic PSF can thus be largely recovered, as shown in Fig. 3(c). It should also be noted that this deconvolution concept may offer a solution to a broader issue in PSF-engineered imaging using SLMs in the detection path: the non-monochromatic nature of a fluorescence emission inevitably creates a dispersive effect that tends to smear the first-order image on the camera. Applying the deconvolution with the known emission spectrum of the fluorophore, this imperfection could be effectively reversed, improving the image quality correspondingly. The source code and mathematical derivations can be found in the Supplementary Material and online (https://sites.google.com/site/thejialab).
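The second step can be illustrated with a minimal classical Wiener filter in plain NumPy. This is a sketch, not the paper's implementation: the paper uses scikit-image's unsupervised Wiener–Hunt routine (`skimage.restoration.unsupervised_wiener`), which estimates the regularization from the data, whereas here `balance` is a fixed assumed constant.

```python
import numpy as np

def wiener_deconvolve(blurred, kernel, balance=1e-3):
    """Classical Wiener filter via FFT: conj(K)*B / (|K|^2 + balance)."""
    K = np.fft.fft2(kernel, s=blurred.shape)
    B = np.fft.fft2(blurred)
    est = np.conj(K) * B / (np.abs(K) ** 2 + balance)
    return np.real(np.fft.ifft2(est))

# Toy check: smear a point image with a 1D "spectral" kernel (circular
# convolution via FFT), then recover the point by deconvolution.
img = np.zeros((32, 32)); img[16, 16] = 1.0
kern = np.zeros((32, 32)); kern[0, :5] = [0.1, 0.2, 0.4, 0.2, 0.1]
blurred = np.real(np.fft.ifft2(np.fft.fft2(img) * np.fft.fft2(kern)))
recovered = wiener_deconvolve(blurred, kern)   # peak restored at (16, 16)
```

In the actual pipeline the 1D kernel would be the pre-calibrated spectral dispersion profile determined in the first step, and the image would be the dispersed astigmatic PSF.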

Fig. 3.

Numerical simulation of simultaneous spatial and spectral information recovery. (a) Axial image stacks of an astigmatic PSF at different axial positions, z = −250, −125, 0, 125, 250 nm. The ellipticity of the images can be used to determine the axial position. (b) The corresponding axial image stacks of the PSF when the grating pattern is combined. A horizontally oriented dispersion is noted, resulting in smeared PSF profiles. (c) The corresponding axial image stacks of the PSF after spectral deconvolution, restoring the astigmatic PSF for 3D localization. Nominal axial positions shown are calculated for 580 nm. Scale bar, 1 μm.

4. Discussion and conclusions

In summary, we demonstrate an SLM-based PSF engineering approach to simultaneous imaging of the spatial and spectral information of molecular emissions. We show that the dispersive grating imposed upon the emission readily allows fluorescent spectra to be distinguished. The presented SLM-based system is flexible and adjustable for various experimental conditions and compatible with many existing PSF-engineered optical imaging methods [29]. In particular, the method provides an accessible and alternative system design for the reported spectrally-resolved techniques for single-molecule measurement and super-resolution fluorescence microscopy [22,23]. Furthermore, the use of the SLM allows the incorporation of phase patterns that encode axial information for 3D localization. Here we proposed a model that allows the decoupling of the spatial and spectral information, realizing simultaneous 3D imaging of molecules using an astigmatic PSF. This model can be readily extended to other types of PSF-engineering methods to further improve 3D imaging capability [13,19,26].

However, several considerations for practical use should be noted. The field of view (FOV) is moderately reduced due to the simultaneous use of the zeroth and first diffraction orders. The FOV can be enlarged by using a denser diffraction grating, albeit at the expense of the number of photons detected. Indeed, such SLM-imposed photon loss has been a crucial consideration for single-molecule imaging [13,19], where the image resolution depends on the number of photons detected from individual fluorophores. The photon loss can be largely mitigated by using a continuous phase mask fabricated by gray-scale photolithography after the mask has been determined using the SLM [30]. Moreover, the approach can be combined with the previously reported dual-objective detection scheme [11] or with ultrabright photoactivatable fluorophores [31], which should further increase the number of photons detected and thus allow for better image resolution. Taking these factors into account, the method is well positioned for further demonstration in single-molecule measurement and fluorescence microscopy.

Acknowledgments

This work is supported by Stony Brook University, the National Science Foundation (1604565), and the Defense Advanced Research Projects Agency (1135385).

Appendix A. Supplementary data

Supplementary material related to this article can be found online at http://dx.doi.org/10.1016/j.optcom.2017.07.025.

References

  • [1] Hell SW, Far-field optical nanoscopy, Science 316 (2007) 1153–1158. doi:10.1126/science.1137395.
  • [2] Huang B, Babcock H, Zhuang X, Breaking the diffraction barrier: Super-resolution imaging of cells, Cell 143 (2010) 1047–1058. doi:10.1016/j.cell.2010.12.002.
  • [3] Willig KI, Rizzoli SO, Westphal V, Jahn R, Hell SW, STED microscopy reveals that synaptotagmin remains clustered after synaptic vesicle exocytosis, Nature 440 (2006) 935–939. doi:10.1038/nature04592.
  • [4] Klar TA, Hell SW, Subdiffraction resolution in far-field fluorescence microscopy, Opt. Lett. 24 (1999) 954. doi:10.1364/OL.24.000954.
  • [5] Gustafsson MGL, Surpassing the lateral resolution limit by a factor of two using structured illumination microscopy, J. Microsc. 198 (2000) 82–87. doi:10.1046/j.1365-2818.2000.00710.x.
  • [6] Gustafsson MGL, Nonlinear structured-illumination microscopy: Wide-field fluorescence imaging with theoretically unlimited resolution, Proc. Natl. Acad. Sci. USA 102 (2005) 13081–13086. doi:10.1073/pnas.0406877102.
  • [7] Rust MJ, Bates M, Zhuang X, Sub-diffraction-limit imaging by stochastic optical reconstruction microscopy (STORM), Nat. Methods 3 (2006) 793–795. doi:10.1038/nmeth929.
  • [8] Betzig E, Patterson GH, Sougrat R, Lindwasser OW, Olenych S, Bonifacino JS, Davidson MW, Lippincott-Schwartz J, Hess HF, Imaging intracellular fluorescent proteins at nanometer resolution, Science 313 (2006) 1642–1645. doi:10.1126/science.1127344.
  • [9] Hess ST, Girirajan TPK, Mason MD, Ultra-high resolution imaging by fluorescence photoactivation localization microscopy, Biophys. J. 91 (2006) 4258–4272. doi:10.1529/biophysj.106.091116.
  • [10] Huang B, Wang W, Bates M, Zhuang X, Three-dimensional super-resolution imaging by stochastic optical reconstruction microscopy, Science 319 (2008) 810–813. doi:10.1126/science.1153529.
  • [11] Xu K, Babcock HP, Zhuang X, Dual-objective STORM reveals three-dimensional filament organization in the actin cytoskeleton, Nat. Methods 9 (2012) 185–188. doi:10.1038/nmeth.1841.
  • [12] Juette MF, Gould TJ, Lessard MD, Mlodzianoski MJ, Nagpure BS, Bennett BT, Hess ST, Bewersdorf J, Three-dimensional sub-100 nm resolution fluorescence microscopy of thick samples, Nat. Methods 5 (2008) 527–529. doi:10.1038/nmeth.1211.
  • [13] Pavani SRP, Thompson MA, Biteen JS, Lord SJ, Liu N, Twieg RJ, Piestun R, Moerner WE, Three-dimensional, single-molecule fluorescence imaging beyond the diffraction limit by using a double-helix point spread function, Proc. Natl. Acad. Sci. USA 106 (2009) 2995–2999. doi:10.1073/pnas.0900245106.
  • [14] Backlund MP, Lew MD, Backer AS, Sahl SJ, Grover G, Agrawal A, Piestun R, Moerner WE, Simultaneous, accurate measurement of the 3D position and orientation of single molecules, Proc. Natl. Acad. Sci. USA 109 (2012) 19087–19092. doi:10.1073/pnas.1216687109.
  • [15] Shtengel G, Galbraith JA, Galbraith CG, Lippincott-Schwartz J, Gillette JM, Manley S, Sougrat R, Waterman CM, Kanchanawong P, Davidson MW, Fetter RD, Hess HF, Interferometric fluorescent super-resolution microscopy resolves 3D cellular ultrastructure, Proc. Natl. Acad. Sci. USA 106 (2009) 3125–3130. doi:10.1073/pnas.0813131106.
  • [16] Aquino D, Schönle A, Geisler C, Middendorff CV, Wurm CA, Okamura Y, Lang T, Hell SW, Egner A, Two-color nanoscopy of three-dimensional volumes by 4Pi detection of stochastically switched fluorophores, Nat. Methods 8 (2011) 353–359. doi:10.1038/nmeth.1583.
  • [17] Tang J, Akerboom J, Vaziri A, Looger LL, Shank CV, Near-isotropic 3D optical nanoscopy with photon-limited chromophores, Proc. Natl. Acad. Sci. USA 107 (2010) 10068–10073. doi:10.1073/pnas.1004899107.
  • [18] Lew MD, Lee SF, Badieirostami M, Moerner WE, Corkscrew point spread function for far-field three-dimensional nanoscale localization of pointlike objects, Opt. Lett. 36 (2011) 202–204. doi:10.1364/OL.36.000202.
  • [19] Jia S, Vaughan JC, Zhuang X, Isotropic three-dimensional super-resolution imaging with a self-bending point spread function, Nat. Photonics 8 (2014) 302–306. doi:10.1038/nphoton.2014.13.
  • [20] Shechtman Y, Sahl SJ, Backer AS, Moerner WE, Optimal point spread function design for 3D imaging, Phys. Rev. Lett. 113 (2014) 133902. doi:10.1103/PhysRevLett.113.133902.
  • [21] Shechtman Y, Weiss LE, Backer AS, Sahl SJ, Moerner WE, Precise 3D scan-free multiple-particle tracking over large axial ranges with Tetrapod point spread functions, Nano Lett. 15 (2015) 4194. doi:10.1021/acs.nanolett.5b01396.
  • [22] Zhang Z, Kenny SJ, Hauser M, Li W, Xu K, Ultrahigh-throughput single-molecule spectroscopy and spectrally resolved super-resolution microscopy, Nat. Methods 12 (2015) 935–938. doi:10.1038/nmeth.3528.
  • [23] Dong B, Almassalha L, Urban BE, Nguyen T-Q, Khuon S, Chew T-L, Backman V, Sun C, Zhang HF, Super-resolution spectroscopic microscopy via photon localization, Nature Commun. 7 (2016) 12290. doi:10.1038/ncomms12290.
  • [24] Urban BE, Dong B, Nguyen T-Q, Backman V, Sun C, Zhang HF, Subsurface super-resolution imaging of unstained polymer nanostructures, Sci. Rep. 6 (2016) 28156. doi:10.1038/srep28156.
  • [25] Kakizuka T, Ikezaki K, Kaneshiro J, Fujita H, Watanabe TM, Ichimura T, Simultaneous nano-tracking of multiple motor proteins via spectral discrimination of quantum dots, Biomed. Opt. Express 7 (2016) 2475–2493. doi:10.1364/BOE.7.002475.
  • [26] Shechtman Y, Weiss LE, Backer AS, Lee MY, Moerner WE, Multicolour localization microscopy by point-spread-function engineering, Nat. Photonics 10 (2016) 590–594. doi:10.1038/nphoton.2016.137.
  • [27] Okamoto K, Fundamentals of Optical Waveguides, Academic Press, 2005. doi:10.1016/B978-012525096-2/50001-5.
  • [28] Orieux F, Giovannelli J-F, Rodet T, Bayesian estimation of regularization and point spread function parameters for Wiener–Hunt deconvolution, J. Opt. Soc. Amer. A 27 (2010) 1593. doi:10.1364/JOSAA.27.001593.
  • [29] Backer AS, Moerner WE, Extending single-molecule microscopy using optical Fourier processing, J. Phys. Chem. B 118 (2014) 8313–8329. doi:10.1021/jp501778z.
  • [30] Grover G, Quirin S, Fiedler C, Piestun R, Photon efficient double-helix PSF microscopy with application to 3D photo-activation localization imaging, Biomed. Opt. Express 2 (2011) 3010–3020. doi:10.1364/BOE.2.003010.
  • [31] Vaughan JC, Jia S, Zhuang X, Ultrabright photoactivatable fluorophores created by reductive caging, Nat. Methods 9 (2012) 1181–1184. doi:10.1038/nmeth.2214.
