Abstract
Single-molecule super-resolution fluorescence microscopy and single-particle tracking are two imaging modalities that illuminate the properties of cells and materials on spatial scales down to tens of nanometers, or with dynamical information about nanoscale particle motion in the millisecond range, respectively. These methods generally use wide-field microscopes and two-dimensional camera detectors to localize molecules to much higher precision than the diffraction limit. Given the limited total photons available from each single-molecule label, both modalities require careful mathematical analysis and image processing. Much more information can be obtained about the system under study by extending to three-dimensional (3D) single-molecule localization: without this capability, visualization of structures or motions extending in the axial direction can easily be missed or confused, compromising scientific understanding. A variety of methods for obtaining both 3D super-resolution images and 3D tracking information have been devised, each with their own strengths and weaknesses. These include imaging of multiple focal planes, point-spread-function engineering, and interferometric detection. These methods may be compared based on their ability to provide accurate and precise position information of single-molecule emitters with limited photons. To successfully apply and further develop these methods, it is essential to consider many practical concerns, including the effects of optical aberrations, field-dependence in the imaging system, fluorophore labeling density, and registration between different color channels. Selected examples of 3D super-resolution imaging and tracking are described for illustration from a variety of biological contexts and with a variety of methods, demonstrating the power of 3D localization for understanding complex systems.
1. Introduction
Fluorescence microscopy has been a mainstay of biological and biomedical laboratories since its inception.(1) Labeling specific proteins or oligonucleotides with a small fluorophore (2) or an autofluorescent protein (3, 4) lights up those molecules of interest against a dark background in a relatively non-invasive way, painting a picture of subcellular structures and intermolecular interactions. Imaging of fluorescent labels has also been applied to the study of non-biological materials such as polymers and glasses,(5) mesoporous materials,(6) and directly to natively-fluorescent antenna complexes.(7) The resulting fluorescence images have provided useful information about structures, dynamics, and interactions for decades, first with wide-field epifluorescence microscopes.
A major advance occurred more than 27 years ago, specifically the advent of the ability to optically detect and spectrally characterize single molecules in a condensed phase by direct detection of the absorption,(8) followed a year later by detection of single-molecule absorption by sensing emitted fluorescence in solids (9) and in solution.(10) Single-molecule spectroscopy (SMS) and imaging allow exactly one molecule hidden deep within a crystal, polymer, or cell to be observed via optical excitation of the molecule of interest. This represents the ultimate sensitivity level of ~1.66 × 10−24 moles of the molecule of interest (1.66 yoctomole), but the detection must be achieved in the presence of billions to trillions of solvent or host molecules. Successful experiments must meet the requirements of (a) guaranteeing that only one molecule is in resonance in the volume probed by the laser, and (b) providing a signal-to-noise ratio (SNR) for the single-molecule signal that is greater than unity for a reasonable averaging time. The first of these two requirements means that at room temperature, the single molecules need to be farther apart than about 500 nm, so that the diffraction-limited spots from each do not overlap, and this is generally achieved by dilution. The power of the method primarily rests upon the removal of ensemble averaging: in contrast to traditional spectroscopy, it is no longer necessary to average over billions to trillions of molecules to measure optical quantities such as brightness, lifetime, emission spectrum, polarization, and so on. Therefore, it becomes possible to directly measure distributions of behavior to explore hidden heterogeneity, a property that would be expected in complex environments such as in cells, polymers, or other materials. In the time domain, the ability to optically sense internal states of one molecule and the transitions among them allows measurement of hidden kinetic pathways and the detection of rare intermediates. Because typical single-molecule labels behave like tiny light sources roughly 1–3 nm in size and can report on their immediate local environment, single-molecule studies provide a new window into the nanoscale with intrinsic access to time-dependent changes. The basic principles of single-molecule optical spectroscopy and imaging have been the subject of many reviews (11–21) and books (22–25).
This article considers two applications of single-molecule spectroscopy as extended to three spatial dimensions, “single-particle tracking” (SPT) of individual molecules in motion, and “super-resolution” (SR) microscopy of extended structures to provide details of the shape of an object on a very fine scale. Both of these methods rest on imaging of single molecules, where the x–y position is extracted from the image to localize the position of the molecule. (For concision, we will generally refer to point emitters as single fluorescent molecules, since quantum dot nanocrystals, metallic nanoparticles, etc. can be localized similarly.) Many types of microscopes can be used,(18) including wide-field, confocal, total-internal-reflection, two-photon, etc., to measure the positions of isolated single molecules, but a key point is: How precisely can the position of a single molecule be determined? Unfortunately, fundamental diffraction effects (26) blur the molecule’s detected emission profile, limiting its smallest width on the camera in sample units to roughly the optical wavelength λ divided by two times the numerical aperture (NA) of the imaging system (Figure 1A). Since the largest values of NA for state-of-the-art, highly corrected microscope objectives are in the range of about 1.3–1.5, the minimum spatial size of single-molecule spots is limited to about 200 nm for visible light of 500 nm wavelength. Nevertheless, the measured shape of the spot provides useful information, in that a model function can be fit to the pixelated image to obtain the molecular position. The precision of this localization can be far smaller than the diffraction-limited width of the spot, and is limited primarily by the number of photons detected from the molecule. Both the precision and accuracy of this localization process are described in detail in Sections 2 and 3 of this paper, not only for 2D, but also for 3D determinations of the molecular position.
Single-particle tracking (SPT) is one method that applies single-emitter localization to answer key questions in both biology and materials science. In an SPT experiment, the spatial trajectory of a given molecule is determined by repeatedly detecting and localizing it at many sequential time points (Figure 1B).(28) Analysis of the resulting single-molecule tracks provides information on the mode of motion of the set of labelled molecules, which may be diffusive, motor-directed, confined, or a mixture of these modes.(29–34) For materials science applications and in vitro reconstitutions of biological systems, dilution of the fluorescent probe used is typically sufficient to achieve single-molecule concentrations. For fluorescently tagged biomolecules in living cells, it is sometimes necessary to use other means to lower the emitter concentration. These include lowering the expression levels of genetically encoded labels,(35) quenching or photobleaching an initially large number of emitters, chemical generation of emitters, or sparse activation of only a few photoactivatable molecules at a time.(36, 37) While in vitro measurements are simpler, intracellular crowding and binding interactions with cellular structures influence measured trajectories, so it is more representative to observe the motion of the molecule of interest as it performs its function in its native biological environment, whether in the cytoplasm or the membrane.(38)
Turning now to the second method, let us pose a problem: how can one extract images of structures if the power of optical microscopy to resolve two closely spaced emitters is limited by the fundamental effects of diffraction? This problem has been overcome by the invention of “super-resolution” optical microscopy, recognized by the Nobel Prize in Chemistry in 2014, awarded to one of us (W.E.M.)(39), Eric Betzig(40) and Stefan Hell(41), an achievement which has continued to stimulate a revolution in biological and materials microscopy. Hell pursued Stimulated Emission Depletion Microscopy (STED) and its variants, while the focus of this paper is on approaches to overcome the diffraction limit using single molecules, which rely on a clever modification of standard wide-field single-molecule fluorescence microscopy,(42–44) described in more detail in other papers in this special issue.
These single-molecule super-resolution (SR) imaging approaches are summarized in Figure 1C. As opposed to tracking of the same molecule, SR determines the positions of different molecule labels at different times to resolve a static structure. The essential requirements for this process are (a) sufficient sensitivity to enable imaging of single-molecule labels, (b) determination of the position of a single molecule with a precision better than the diffraction limit, and (c) the addition of some form of on/off control of the molecular emission to maintain concentrations at very low levels in each imaging frame. This last point is crucial: the experimenter must actively pick some photophysical, photochemical, or other mechanism which forces most of the emitters to be off while only a very small, non-overlapping subset is on. If these labelled copies are incorporated into a larger structure, such as a polymeric protein filament, then their positions recorded from different imaging frames randomly sample this structure. A point-by-point reconstruction can then be assembled by combining the localized positions of all detected molecules in a computational post-processing step. Importantly, because all molecules are localized with a precision of tens of nanometers, this approach circumvents the diffraction limit that otherwise limits image resolution to 200–300 nm in conventional fluorescence microscopy.
The required sparsity in the concentrations of emitting molecules can be achieved by a variety of methods. The PAINT method(45) (Points Accumulation for Imaging in Nanoscale Topography) relies upon the photophysical behavior of certain molecules that light up when bound or constrained, and this idea was initially demonstrated with the twisted intramolecular charge transfer turn-on behavior of Nile Red.(46) PAINT has the advantages that the object to be imaged need not be labeled and that many individual fluorophores are used for the imaging, thus relaxing the requirement on the total number of photons detected from each single molecule. In the STORM approach (44) (Stochastic Optical Reconstruction Microscopy), cyanine dye molecules (e.g. Cy5) are forced into a dark state by photoinduced reaction with nearby thiols. A small subset of the emitters returns from the dark form either thermally or by pumping of a nearby auxiliary molecule like Cy3. In the (f)PALM approach (42, 43) (fluorescence PhotoActivated Localization Microscopy), a photoactivatable fluorescent protein label is used, and a weak activation beam creates a small concentration of emitters which are localized until they are bleached, and the process is repeated many times.
Other active control mechanisms have been demonstrated giving rise to a menagerie of acronyms and names, including dSTORM (direct STORM) (47), GSDIM (Ground-State Depletion with Individual Molecule return)(48), “Blink Microscopy” (49), SPDM (Spectral Precision Determination Microscopy) (50), and many others. Photoactivation methods have been extended to organic dyes,(51, 52) photorecovery and/or photoinduced blinking can be used for SR with fluorescent proteins such as eYFP,(53, 54) and even enzymatic methods produce turn-on which may be controlled by the concentration of substrate and the enzymatic rate.(55)
Between the large improvement in resolution (a factor of 5 or more, typically down to the 20–40 nm range) of SR microscopy, and the dynamical information and understanding of heterogeneity available from SPT, single-molecule localization has revealed structures and processes that were previously unresolvable. As treated in previous reviews,(56–65) and discussed throughout this special issue, these improvements have allowed scientists to address many long-standing questions spanning the fields of biology and materials science. The methodological challenge that we focus on in this review, of how to best obtain 3D information in SPT and SR microscopy, touches all of these subject areas. The reason 3D information is needed is simple: the world is three-dimensional, so that if we can only make 2D measurements, we will likely miss critical information. For example, as shown in Figure 2, motion on a 3D object such as the surface of a spherical cell appears contracted and warped when only 2D information is available. In the 3D trajectory, it is easy to observe that the molecule diffuses freely over the surface, while in the 2D projection, it is not even clear whether the molecule is surface-associated.
Precise and accurate single-molecule localization is the foundation for all quantitative SR and SPT studies, and we begin in Section 2 by introducing the key principles of single-molecule detection and position estimation for the general case of 2D localization. In Section 3, we catalog and compare many of the creative and elegant means by which these measurements have been extended into the third dimension. In Section 4, we treat the physical and theoretical limits of these techniques: the possible resolution that can be attained, the corrupting influence of aberrations and other systematic errors, and how these limits have been pushed. Finally, in Section 5, we show the power of 3D localization microscopy by examining several case studies where 3D SR and SPT have offered new insights.
2. Basic Principles of Single-Molecule Localization
Many of the experimental and analytical considerations in 3D single-molecule microscopy can be seen as an extension of those in “traditional” 2D single-molecule localization experiments. To introduce vocabulary and context that is shared between many 3D localization modalities, we first give a brief overview of instrumentation and statistical theory used to localize single molecules in two dimensions. We encourage readers new to localization microscopy to peruse reviews that specifically treat these topics in detail for further background (18, 66, 67).
2.1 Detecting Single Molecules
The most common imaging geometry used in 2D single-molecule wide-field imaging experiments is sketched in Figure 3A, and provides the basis for 3D single-molecule microscopes. In most details, this is equivalent to the epifluorescence microscopes used for diffraction-limited microscopy. Here, we briefly review the foundational instrumental principles true of both 2D and 3D before discussing the microscope modifications allowing 3D localization in later sections.
In contrast to confocal microscopes, which scan a diffraction-limited focused laser spot across the sample, epifluorescence microscopes often produce a larger Gaussian illumination spot ranging from 1 to 100 μm in diameter in the object plane. In this way, all actively emitting fluorescent molecules in the sample can be detected simultaneously without scanning. To produce a broad illumination profile, the light source (typically a laser beam) is focused at the back focal plane of the microscope objective using a Köhler lens positioned externally to the inverted microscope body. The fluorescence from excited molecules is then collected back through the objective and imaged onto an array detector (i.e. a camera) with a separate tube lens. The objective lens is the most important detection optic in the system, and its most important parameter is numerical aperture, NA = n × sin(θ) with n the refractive index of the medium between the sample and objective and θ the objective’s maximum collection half-angle. The highest θ available from conventional objective lenses is roughly 72°, and by using high-index media such as immersion oil (n = 1.51), typical objectives used for single-molecule imaging achieve NA = 1.3–1.5.
The objective NA is a major determinant of image quality. As discussed in the introduction, optical resolution is limited by blurring of image features finer than the diffraction limit. The lateral resolution of wide-field fluorescence microscopy is thus limited to roughly 0.61 λem/NA, or at best ~200–250 nm for emission wavelengths λem in the visible range. Axial resolution is even more dependent on NA, and is at best roughly equal to 2λem/NA² (~550–700 nm). Much of the behavior of the microscope is defined by the “point spread function” (PSF), which describes how light collected from a point emitter is transformed into an image. Mathematically, diffraction blur is equivalent to a convolution of the true labeled structure with the microscope’s three-dimensional PSF, shown in Figure 3B for an NA 1.4 objective and an emitter near the coverslip. This blur is directly responsible for the inability of conventional diffraction-limited microscopy to record fine details. The image formed on the camera by an ideal point emitter is a cross-section of the 3D PSF, dependent on the position of the emitter relative to the focal plane. The ~250 nm transverse width of the in-focus PSF is clearly apparent (Figure 3B, top x–y slice). By contrast, the PSF at 300 nm defocus is more diffuse and appears dimmer (bottom x–y slice). As can be seen in the x–z profile, within the scalar approximation, defocused emission profiles from above and below the focal plane are indistinguishable. As suggested by the resolution criteria above, higher NA objectives provide more compact PSFs; they also improve total signal, as light collection efficiency scales with the square of the NA.
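As a rough numerical check of these criteria, the expressions above can be evaluated directly; the short Python sketch below uses an assumed emission wavelength of 580 nm and an NA of 1.4 purely for illustration.

```python
def diffraction_limits(wavelength_nm, na):
    """Approximate lateral (0.61*lambda/NA) and axial (2*lambda/NA^2)
    resolution limits of a wide-field fluorescence microscope."""
    lateral = 0.61 * wavelength_nm / na
    axial = 2.0 * wavelength_nm / na**2
    return lateral, axial

# Assumed example values: 580 nm emission, NA 1.4 oil-immersion objective.
lat, ax = diffraction_limits(580.0, 1.4)
print(f"lateral ~ {lat:.0f} nm, axial ~ {ax:.0f} nm")  # ~253 nm, ~592 nm
```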
However, even the best collection optics and brightest fluorophores available will not be enough if the signal from single molecules is drowned out by background light, and filtering this background while keeping as many signal photons as possible is a well-known requirement for every single-molecule experiment (18, 70). Carefully selected spectral filters are absolutely essential to reduce background from autofluorescence and scattering, and it is common to image at relatively red wavelengths (500–700 nm) to avoid spectral regions of particularly high cellular autofluorescence. Yet when imaging samples such as thick cells, even optimal spectral filtering may not be sufficient, as autofluorescence, scattering, and fluorescence from out-of-focus fluorophores all increase with the illuminated sample volume. In these cases, it is necessary to consider illumination geometries other than epi-illumination. One way to reduce background is excitation by total internal reflection (thus TIR fluorescence, or TIRF), which makes use of the 100–200 nm thick evanescent field generated from the excitation beam at incidence angles greater than the critical angle of the interface between the coverslip and the sample.(71–73) However, while TIRF greatly reduces out-of-focus background fluorescence above the 100–200 nm illumination region, yielding excellent results for thin imaging volumes,(74–77) its limited axial reach away from the coverslip makes TIRF less useful for 3D imaging inside cells.
Detection is the last part of the imaging process, and the development of new, more sensitive cameras has been crucial to the progress of single-molecule microscopy. To maximize collected signal from the molecule, a single-molecule detector must have high quantum efficiency in the conversion of photons to photoelectrons. Further, all detectors introduce noise due to thermally generated current (“dark noise,” dependent on temperature and integration time) and noise from the electronics that convert the photocurrent of each pixel to a signal voltage (“read noise,” roughly constant for a given detector). For conventional detectors used in diffraction limited fluorescence microscopy such as Charge Coupled Device (CCD) cameras, this added noise can overwhelm the small signal from single molecules, and must be reduced. State-of-the-art cameras that address these needs include Electron-Multiplying CCD (EMCCD) and scientific Complementary Metal Oxide Semiconductor (sCMOS) cameras. EMCCDs have long achieved quantum efficiencies on the order of 95%, making them a common choice for single-molecule imaging, while sCMOS cameras have become increasingly popular in recent years thanks to rapid development in detector technology, and have recently reached quantum efficiencies of 95% as well (Photometrics Prime 95B; Tucson, AZ, USA). These cameras each overcome read noise in their own way, either by charge multiplication before readout (EMCCD) or by engineering extremely low read noise in the first place (sCMOS).
The contributions to the image formed on the camera from single-molecule fluorescence and background are summarized in Figure 3C,D. To design effective single-molecule experiments, whether in 2D or 3D, it is important to consider what determines the quality of such an image. Signal-to-background ratio (SBR) or contrast is a common metric in many imaging modalities, but is less informative in single-molecule imaging. As the difference between Figure 3C and Figure 3D shows, even when maintaining the same SBR, increasing the total detected photons (as is done by increasing the integration time or illumination intensity) improves the signal-to-noise ratio given shot noise statistics. That is, background is fundamentally a problem because it adds noise, rather than because it decreases contrast.
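This distinction between contrast and noise can be made concrete with a simple shot-noise estimate; the sketch below assumes Poisson statistics only and ignores detector noise, and the photon numbers are arbitrary illustrations.

```python
import math

def peak_pixel_snr(n_sig, n_bg):
    """Shot-noise-limited SNR for n_sig signal photons sitting on n_bg
    background photons in the same pixel (Poisson statistics only)."""
    return n_sig / math.sqrt(n_sig + n_bg)

# Identical signal-to-background ratio (10:1), different total photons:
print(peak_pixel_snr(100, 10))    # ~9.5
print(peak_pixel_snr(1000, 100))  # ~30: more photons, better SNR at equal SBR
```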
While increasing the pumping irradiance can greatly improve signal-to-noise (SNR), there is an upper limit to the signal available from single molecules. On a fundamental photophysical level, the emission rate from the molecule might saturate while that from the background might not. In most cases, a more serious limit is photochemical stability, i.e. the maximum possible emitted photons from single-molecule fluorophores before photobleaching. This quantity is often more important than brightness per se. For modern detectors and under most imaging conditions, these photon statistics are overwhelmingly the largest noise source, though the detector noise floor (i.e. read noise, dark noise, and other factors such as clock induced charge in EMCCDs) becomes highly significant under certain conditions with low signal in each pixel.(78) With such noise statistics in mind, we now turn to the question of how exactly we can infer accurate and precise molecular positions from the images collected by our microscope.
2.2 Estimating Single-Molecule Position
After acquiring single-molecule data, the task remains to implement algorithms that faithfully extract molecule positions from noisy, complicated images. While this task poses challenges, the reward – nanoscale position estimates of potentially millions of individually resolved molecules within a sample – is a worthy one. As with instrumentation, many of the principles of image analysis in 2D and 3D single-molecule localization are analogous. Here, we describe how molecular positions are extracted from image data in 2D, and introduce a general framework for evaluating and comparing localization methods in 3D.
Most fitting algorithms used for SR and SPT assume spatiotemporal sparsity, such that regions of interest (ROIs) can be defined in each image frame that each contain the intensity pattern of only one single-molecule emitter. As discussed in Section 2.1, under ideal conditions, the average intensity produced by a point emitter is proportional to a cross-section of the microscope’s 3D PSF, scaled by the total signal photons Nsig plus a constant background per unit area Nbg, giving us the imaging model
I(u, v; θ) = H(θ) = Nsig PSF(u − x, v − y; z) + Nbg    (1)
defined for position u,v in the image plane when the emitter is at position x, y, z in the sample space, where θ represents these position and photon parameters, plus any other parameterizations of the PSF. The actual image from the detector is subject to noise and binned into pixels, producing a signal nk at each of the pixels wk within the ROI. Phrased this way, single-molecule localization is revealed to be an inverse problem: given the pixelated and noisy image vector n and our knowledge of H, how can we accurately and precisely estimate the molecule’s position x = (x, y, z)?
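To make eq. (1) concrete, the sketch below builds a pixelated expected image for a simple Gaussian stand-in for the PSF and draws a Poisson-noised frame from it; the pixel size, width, and photon numbers are assumptions chosen only for illustration.

```python
import numpy as np

def expected_image(x0, y0, n_sig, n_bg, roi=11, pixel_nm=100.0, sigma_nm=130.0):
    """Expected photons per pixel for an emitter at (x0, y0) nm in an roi x roi
    region: N_sig times a normalized Gaussian PSF cross-section, plus N_bg."""
    centers = (np.arange(roi) - roi // 2) * pixel_nm
    u, v = np.meshgrid(centers, centers, indexing="ij")
    psf = np.exp(-((u - x0)**2 + (v - y0)**2) / (2.0 * sigma_nm**2))
    psf /= psf.sum()                       # PSF sums to 1 over the ROI
    return n_sig * psf + n_bg

mu = expected_image(x0=40.0, y0=-25.0, n_sig=1000, n_bg=5)
frame = np.random.poisson(mu)              # simulated camera frame (shot noise only)
```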
Accuracy means that our chosen position estimator must not be systematically biased away from the true value, while precision means that the estimator makes effective use of all the available data to minimize random error. Estimators can be very simple, but not necessarily very accurate or precise. For example, since the 2D microscope PSF is brightest close to the molecule’s position, we could take the position wk of the brightest pixel as the position estimate, x̂ = wk with k = argmaxk nk, a “max-value” estimator. Or, applying the same logic slightly differently, we could use a weighted average of all the N pixels in the ROI,
x̂ = (Σk nk wk) / (Σk nk)    (2)
i.e. a centroid estimator. Such estimators are computationally convenient, but limited: the max-value estimate is sensitive to noise and limited by pixel size, and so has low precision, while the centroid estimate is biased towards the center of the ROI by the background in outlying pixels, making it inaccurate (Figure 4).
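For illustration, both simple estimators can be written in a few lines and applied to the simulated frame from the previous sketch; the pixel size is again an assumed value, and the centroid’s pull toward the ROI center with increasing background is easy to reproduce this way.

```python
import numpy as np

def max_value_estimate(img, pixel_nm=100.0):
    """Position of the brightest pixel: coarse and limited by the pixel size."""
    roi = img.shape[0]
    i, j = np.unravel_index(np.argmax(img), img.shape)
    return ((i - roi // 2) * pixel_nm, (j - roi // 2) * pixel_nm)

def centroid_estimate(img, pixel_nm=100.0):
    """Intensity-weighted centroid: biased toward the ROI center by background."""
    roi = img.shape[0]
    coords = (np.arange(roi) - roi // 2) * pixel_nm
    u, v = np.meshgrid(coords, coords, indexing="ij")
    w = img.astype(float)
    return ((u * w).sum() / w.sum(), (v * w).sum() / w.sum())
```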
How can one do better? While it is possible to create effective, unbiased estimators that use only basic assumptions about the PSF, such as radial symmetry,(79) the most powerful estimators implement a more detailed PSF model to solve the inverse problem of eq. (1). Least-squares (LS) fitting and maximum likelihood estimation (MLE) are two such methods that are adaptable to many models.
LS and MLE fit the image data to a model function. As implied by eq. (1), the imaging model H(θ) produces an expected value of the image, I(u,v; θ), and the best guess of θ is found by optimizing I to match the observed PSF. While tracking of the positions of large biological objects has a long history,(29, 30) deep considerations of the fitting procedure are relatively recent. LS was one of the first relatively unbiased localization algorithms used,(80, 81) and is familiar from its ubiquitous application to scientific regression problems and curve fitting. The scoring method of LS is to minimize the square error between the PSF model μ(θ) and the observed data n,
θ̂ = argminθ Σk [nk − μk(θ)]²    (3)
where x̂ is included in the optimized parameter estimate θ̂. This optimization strategy follows from the statistical assumption that each pixel value nk is an observation with normally, independently and identically distributed noise, which is generally only approximately true.(82, 83)
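A minimal least-squares fit of this kind, anticipating the symmetric Gaussian-plus-background approximation of eq. (5) below and using scipy.optimize.least_squares, is sketched here; the parameterization, starting guesses, and pixel size are illustrative assumptions rather than a prescription.

```python
import numpy as np
from scipy.optimize import least_squares

def gaussian_model(theta, u, v):
    """theta = (x0, y0, amp, n_bg, sigma): peak amplitude and background in
    photons/pixel, position and width in nm; returns expected photons/pixel."""
    x0, y0, amp, n_bg, sigma = theta
    return amp * np.exp(-((u - x0)**2 + (v - y0)**2) / (2.0 * sigma**2)) + n_bg

def fit_least_squares(img, pixel_nm=100.0):
    roi = img.shape[0]
    coords = (np.arange(roi) - roi // 2) * pixel_nm
    u, v = np.meshgrid(coords, coords, indexing="ij")
    theta0 = np.array([0.0, 0.0, float(img.max() - img.min()), float(img.min()), 130.0])
    res = least_squares(lambda th: (gaussian_model(th, u, v) - img).ravel(), theta0)
    return res.x   # fitted (x0, y0, amp, n_bg, sigma)
```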
We must still define the imaging model H that produces the parametrized image I(u, v; θ). After more than a century of study, there is a sophisticated understanding of the standard microscope PSF, and this quantitatively describes the way light from a molecule is collected by the microscope.(26, 68, 84–86) When molecules emit light isotropically, without net polarization, the scalar approximation allows for a comprehensive treatment of the imaging system.(68) As shown in Figure 3, the image of an ideal, in-focus emitter at the coverslip is an Airy disk,
I(ρ) = C [2 J1(k NA ρ) / (k NA ρ)]²    (4)
with J1 the first-order Bessel function of the first kind, k the wavenumber 2π/λ, ρ the distance from the point source in the image plane, i.e. ρ = √[(u − x)² + (v − y)²], and C a constant for a given number of total detected photons. While the Airy disk has several relatively dim rings, most of the PSF’s intensity is concentrated in the center. For this reason, a Gaussian function is a tractable and reasonably accurate approximation, and it is extremely common to fit single-molecule data with a symmetric Gaussian plus a constant background, with the width σ set by the diffraction-limited spot size(82, 87):
I(u, v) = [Nsig / (2πσ²)] exp{−[(u − x)² + (v − y)²] / (2σ²)} + Nbg    (5)
As shown in Figure 4 for a simulation with Nsig = 250 photons and Nbg = 5 photons/pixel, performing nonlinear LS fitting with a Gaussian model retrieves a position estimate close to the true position, and avoids the bias of the simpler estimators. We must next ask how precise an individual localization is. To empirically test the random error in our measurement, we can either simulate or measure the spread of position estimates obtained from the same molecule emitting over many frames. As shown in Figure 4B, repeated least-squares localizations on simulated data give a range of estimates normally distributed around the true values, with standard deviation, or “localization precision,” σ ≈ 10 nm. The localization precision is dependent on the estimator: for example, max-value estimation is much less precise, resulting in a localization precision of 34 nm for the exact same data. A well-known analytical expression was developed for the localization precision of LS Gaussian fitting by Thompson et al., with later corrections by Mortensen et al.:(85, 88)
σx² = (σPSF² + a²/12)/N + 8π σPSF⁴ Nbg / (a² N²)    (6)
This equation assumes the PSF is a Gaussian with standard deviation σPSF detected with pixels of diameter a, and gives the correct result within a few percent even if the true PSF is an Airy disk (such as simulated in Figure 4). For low background and ignoring pixelation, eq. (6) reduces to the approximate relationship
σx ≈ σPSF / √N    (7)
highlighting the importance of high photon counts. Eq. (6) has been widely applied to Gaussian LS estimation, but it only holds for the special case of aberration-free, in-focus, 2D PSFs, and does not reveal whether it might be possible to achieve a better precision with a different estimator. The question of how to define, and how to achieve, the best possible precision for an arbitrary 3D PSF is addressed by maximum likelihood estimation (MLE) methods. MLE is rooted in a statistical framework that attempts to estimate a set of underlying parameters yielding a given measurement, accounting for the signal-generating model (image formation) and noise statistics. Given image formation and noise models, one attempts to find the set of underlying parameters θ that maximizes the likelihood function, or, equivalently, the log-likelihood.(89, 90) Assuming shot (Poisson) noise dominates, the log-likelihood function is(82, 91)
ln ℒ(θ) = Σk [nk ln μk(θ) − μk(θ)]    (8)
where we drop terms not dependent on θ for clarity. Equivalently, MLE means seeking the set of parameters for which the measurement is most probable.
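A corresponding MLE sketch maximizes the Poisson log-likelihood of eq. (8) numerically, here by minimizing its negative with a generic optimizer and reusing gaussian_model from the least-squares sketch above; a practical implementation would add analytical gradients, bounds, and a more complete noise model.

```python
import numpy as np
from scipy.optimize import minimize

def negative_log_likelihood(theta, img, u, v):
    """Negative Poisson log-likelihood (eq. 8, constant terms dropped) for the
    Gaussian-plus-background model."""
    mu = np.clip(gaussian_model(theta, u, v), 1e-9, None)   # guard against log(0)
    return float(np.sum(mu - img * np.log(mu)))

def fit_mle(img, theta0, pixel_nm=100.0):
    roi = img.shape[0]
    coords = (np.arange(roi) - roi // 2) * pixel_nm
    u, v = np.meshgrid(coords, coords, indexing="ij")
    res = minimize(negative_log_likelihood, theta0, args=(img, u, v),
                   method="Nelder-Mead")
    return res.x
```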
The preceding has discussed several estimators for determining molecule position. This leaves the question as to how to define the best possible precision one can achieve. This is of interest both for practical reasons, such as estimating the resolution of an image or designing a better imaging system, and for fundamental theoretical reasons. A useful and elegant framework for the quantification of localization precision stems from recognizing that emitter localization is a parameter estimation problem. As such, tools from estimation theory can be used to analyze it. Specifically, Fisher information is a mathematical measure of the sensitivity of an observable quantity (image) to changes in its underlying parameters (e.g. emitter position).(91–93) Intuitively, an image that is more sensitive to the emitter’s position contains more information about the position. Mathematically, Fisher information is given by:
ℐ(θ) = E[(∂ ln f(s; θ)/∂θ)(∂ ln f(s; θ)/∂θ)ᵀ]    (9)
where f(s; θ) is the probability density, i.e. the probability of measuring a signal s given the underlying parameter vector θ, T denotes the transpose operation, and E denotes expectation over all possible values of s. In the special case where Poisson noise is dominant and there is spatially uniform background, eq. (9) becomes:
ℐ(θ) = Σk [1/(μ(k) + β)] (∂μ(k)/∂θ)(∂μ(k)/∂θ)ᵀ    (10)
where μ(k) is the model of the PSF in pixel k and β is the number of background photons per pixel. More sophisticated models can be used that account for the noise characteristics of EMCCD(94) and sCMOS detectors,(95) and calibration of the pixel-to-pixel variation of sCMOS detectors is necessary for accurate localization.
The use of Fisher information in localization analysis is motivated by its relation to precision. The inverse of the Fisher information is the Cramér-Rao lower bound (CRLB), which is the lower bound on the variance with which the set of parameters θ can be estimated using any unbiased estimator.(96) For the position coordinates (x,y,z), this translates to the optimal precision with which the source can be localized for a given imaging modality, signal, background, and noise model. The CRLB is not a quantity of merely theoretical importance; it has been shown that the CRLB can be approached in practice.(82, 85, 97) As long as the imaging model matches the experiment, the precision of MLE will always be superior to that of LS methods, and will generally achieve the CRLB. However, LS estimators also have merits: they do not require a comprehensive noise model and are simpler to implement, and can achieve precisions close to the CRLB for high background levels, where noise statistics are similar between pixels.(82, 83)
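The CRLB of eq. (10) can be evaluated numerically for any PSF model by finite-difference derivatives of the expected signal with respect to the parameters; the sketch below assumes Poisson noise, a uniform background β, and a signal-only model callable, and is meant only as an illustration of the bookkeeping.

```python
import numpy as np

def crlb(model, theta, *args, beta=0.0, step=1e-3):
    """Diagonal of the CRLB (variance bounds) for parameters theta, given a
    callable model(theta, *args) returning expected signal photons per pixel,
    Poisson noise, and uniform background beta photons/pixel (eq. 10)."""
    theta = np.asarray(theta, dtype=float)
    mu = model(theta, *args) + beta
    grads = []
    for i in range(len(theta)):
        d = np.zeros_like(theta)
        d[i] = step * max(abs(theta[i]), 1.0)
        g = (model(theta + d, *args) - model(theta - d, *args)) / (2.0 * d[i])
        grads.append(g.ravel())
    grads = np.array(grads)                   # (n_params, n_pixels)
    fisher = (grads / mu.ravel()) @ grads.T   # eq. (10), summed over pixels
    return np.diag(np.linalg.inv(fisher))     # lower bounds on the variances
```

For a pixelated Gaussian signal model at zero background, the x and y entries returned this way approach σPSF²/N, consistent with eq. (7) up to pixelation effects.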
3. Methods for 3D Localization
The 3D PSF shape of a standard fluorescence microscope changes relatively slowly with changing axial position relative to the focal plane (Figure 3). This is equivalent to having low Fisher information, or poor precision for determining z (see Figures 5 and 7 for comparisons of CRLB analysis of several PSFs including the standard PSF). Further, in the absence of aberrations, the PSF is symmetric above and below the focal plane, making it difficult to assign a unique z position to the emitter. While some tracking experiments have achieved nanometer axial precision with the standard PSF, these required extremely bright nanoparticle emitters and working away from focus to remove ambiguity.(98, 99) In most cases, the low photon budget of probes used in single-molecule microscopy severely limits axial localization precision with a conventional fluorescence microscope, especially over extended axial ranges.
Several microscope modifications have emerged that obtain high Fisher information for both axial and transverse localization, allowing precise 3D imaging for a wide range of conditions. These modifications can be classified based on the optical principle they employ: these include multifocus methods, where multiple focal planes are imaged simultaneously; PSF engineering methods, where the PSF of the microscope is altered to encode axial position in its shape; and interference-, interface-, or intensity-sensing methods, which measure the position of the molecule relative to features in the imaging geometry instead of fitting the shape of the PSF. All of these methods are compatible with a wide range of emitting labels, and are sometimes creatively combined to take advantage of the strengths of multiple approaches.
3.1 Multifocus Methods to Extend the Standard PSF
As summarized above, the standard PSF has clear limitations for 3D imaging. These limitations follow from the need to estimate molecule position from a single 2D slice of the PSF; if we were to measure the entire 3D PSF, it would be possible to fit the axial position just as we fit the lateral position. Such volumetric imaging is common in confocal microscopy, where scanning the focal spot to acquire a stack of z-slices provides axial information with sectioning at diffraction-limited resolution, even though the larger extent of the PSF in the axial direction reduces attainable precision. Since single-molecule imaging is inherently dynamic due to e.g. the movement of molecules in a tracking experiment or the blinking process in a super-resolution experiment, scanning through a focal stack is an unattractive approach.
Wide-field multifocal imaging employs a microscope modification that avoids scanning and makes multiple simultaneous measurements of the 3D PSF at different focal planes. The simplest implementation of multifocal imaging, also known as biplane or bifocal imaging, splits fluorescence equally into two channels that are imaged either onto two cameras(100) or onto different regions of the same camera,(99, 101) where shifting the camera placement (or adding another lens) in one channel causes different focal planes to be detected in each channel (Figure 5A). The PSFs with focal planes at 0 nm (channel 1) and 500 nm (channel 2) are shown in Figure 5B. Near focus, each PSF is bright and changes slowly as a function of z position (good lateral precision, poor axial precision), and away from focus, each PSF is spread out but changes shape more quickly (poor lateral precision, better axial precision). When imaged together, the two channels complement each other, allowing good lateral and axial localization precision over a large axial range, as well as removing ambiguity about which side of the focal plane the emitter is on. This is described quantitatively by CRLB analysis: biplane imaging gives significantly more precise axial localization than single-plane imaging, and has an axial range of ~1.5 μm (Figure 5C). (The best choice of spacing distance between focal planes for a given application has been investigated using CRLB analysis.(102)) Biplane data is usually fit following the approach of eq. (1), where a model of how the PSF of each channel changes as a function of the 3D emitter position (x0, y0, z0) is used to simultaneously fit both channels. The fitting task can be defined by LS or MLE with either theoretically derived or empirically measured 3D PSFs.(103, 104)
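Conceptually, biplane fitting is a joint fit in which both channels share one set of emitter parameters but use PSF models evaluated at focal planes offset by the plane spacing. The sketch below illustrates only the joint objective; psf_model is an assumed callable (in practice a calibrated theoretical or experimental 3D PSF), and the 500 nm plane spacing is an arbitrary example.

```python
import numpy as np

def biplane_residuals(theta, img1, img2, psf_model, plane_offset_nm=500.0):
    """Joint least-squares residuals for two focal planes sharing one emitter.
    theta = (x0, y0, z0, n_sig, n_bg); psf_model(x, y, z, n_sig, n_bg, shape) is
    an assumed callable returning the expected image at effective defocus z (nm)."""
    x0, y0, z0, n_sig, n_bg = theta
    mu1 = psf_model(x0, y0, z0, n_sig, n_bg, img1.shape)                     # plane 1
    mu2 = psf_model(x0, y0, z0 - plane_offset_nm, n_sig, n_bg, img2.shape)   # plane 2
    return np.concatenate([(mu1 - img1).ravel(), (mu2 - img2).ravel()])

# This residual function can be handed to, e.g., scipy.optimize.least_squares.
```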
Multifocal/biplane imaging has been applied to several systems, including single-particle tracking using quantum dots(103, 105) and super-resolution imaging with synthetic dyes and fluorescent proteins(101, 106, 107) in cells. By adding additional planes,(103, 105, 108) it is possible to capture a larger depth of field at the cost of either additional cameras or lateral field of view on a single camera. For example, multifocal imaging with four planes has yielded 3D trajectories of quantum dots in cells with an axial range up to 10 μm.(105)
For high-NA optics, refocusing by displacing the camera is not exactly equivalent to displacing the emitter position within the sample, and can introduce spherical aberration for larger displacements.(109) Aberration-free refocusing can be realized in multifocal microscopy by the application of diffractive optics analogous to those used in PSF engineering (Section 3.2).(108, 110–112) By placing a custom-fabricated optical element into the detection path, the image is diffracted into many orders, each of which has a different phase term that closely mimics the effect of axial displacement within the sample, avoiding aberrations. Such an optical element effectively replaces the functions of the beamsplitter and camera displacement shown in Figure 5. While diffractive optics may incur some loss of photons, this approach has achieved up to ~84% diffraction efficiency for the case of nine planes ranging over 4 μm (113) and is compatible with single-molecule tracking(111) and super-resolution imaging with small molecules.(112) A drawback of multifocal methods that span a large axial range is inefficient use of signal, especially at the extremes of the focal stack, where much of the signal is split into planes that are too out of focus to provide useful information. To improve the 3D precision attainable from each plane, multifocal imaging has been combined with astigmatism (see section 3.2),(114, 115) and we anticipate the possibility of further interesting modifications to multifocus techniques.
Another set of microscope designs related to multifocal imaging achieve high axial resolution by viewing the emitter from multiple angles. One such approach, termed “Parallax”,(116) splits the angular distribution of detected light into two channels by placing a mirror into a plane conjugate to the back focal plane. The axial position is measured from the relative displacement between the PSF in each channel, while the lateral position is read out from the average position. This method, and an earlier version using a prism to split the light, have been applied to the 3D single-molecule tracking of motor proteins.(117, 118) Alternatively, a virtual side view of the sample can be created by mounting the sample onto a fabricated micromirror sitting at a roughly 45° angle to the optical axis. The PSFs in this virtual image can be fit to directly extract axial position. While it complicates sample preparation, this approach has been successfully demonstrated for particle tracking and for super-resolution imaging.(119, 120)
3.2 Accessing the Third Dimension via PSF Engineering
The limitations of the standard PSF make it difficult to extract emitter z position from a single image, but this need not be true for all microscope PSFs. An alternate set of methods attain precise axial localization of a point source by encoding the emitter’s depth in the shape of the PSF. This can be achieved with simple phase aberrations, such as by inserting a weak cylindrical lens before the tube lens to induce astigmatism, where the eccentricity of the PSF changes as a function of axial position (Figure 6A).(121–123) A more general-purpose way to induce phase aberrations is to place a programmable phase-modulator in a plane conjugate to the objective back focal plane, i.e. the Fourier plane of the microscope (Figure 6). (While one could modulate amplitude in addition to phase using absorptive optics, this would cause the loss of precious photons.) Performing phase modulation in this specific geometry provides a useful mathematical framework for synthesizing new PSFs, and is necessary in order to impart the same modulation throughout the field of view.(124) The relation between the electromagnetic fields in the back focal plane and the image plane is given by
I(u, v) = |ℱ{E(x′, y′) exp[iP(x′, y′)]}|²    (11)
where E(x′,y′) is the electric field in the back focal plane caused by a point source at (x,y,z), the 3D position relative to the focal plane and the optical axis. The function P(x′,y′) is a phase pattern imposed by optics in the Fourier plane. We denote by ℱ the 2D Fourier transform. To broadly summarize the implications of eq. (11) for PSF engineering, an axial (z) displacement of the emitter from the focal plane is equivalent to curvature in the phase of E(x′,y′), and the phase pattern P(x′,y′) changes how this manifests in the image I(u,v).
The z-dependent curvature term of the electromagnetic field at the back focal plane can be approximated by
exp[ik(z1 √(n2² − NA²ρ²) + z2 √(n1² − NA²ρ²))]    (12)
derived from the scalar approximation in the Gibson-Lanni model.(68) This model assumes a two-layer experimental system consisting of a sample of refractive index n2 (e.g. water), and glass/immersion oil, which have a matched refractive index n1. The distance between the emitter and the interface separating layer 1 and layer 2 (namely, the distance from the surface of the glass coverslip) is given by z1, and the distance between the microscope focal plane and the interface is denoted by z2, with k the wavenumber 2π/λ, and (ρ, φ) denoting normalized polar pupil-plane coordinates. Phase curvature is added to the phase pattern P(x′,y′), and the measured PSF I(u,v) reflects both P(x′,y′) and z.(86) In the absence of a phase mask (i.e., if P(x′,y′) is a constant), z displacements appear as defocus blur. The goal of PSF engineering is to choose a pattern P(x′,y′) that makes the amount of phase curvature, and thus z position, easier to detect from the image.
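Numerically, eqs. (11) and (12) amount to a short Fourier-optics calculation: build the pupil field with the defocus phase, add the mask phase P, and take the squared magnitude of the Fourier transform. The scalar sketch below is illustrative only; the sampling, wavelength, refractive indices, and the quadratic astigmatism-like mask at the end are all assumptions, and the supercritical region is handled crudely.

```python
import numpy as np

def engineered_psf(phase_mask, z1_nm, z2_nm, wavelength_nm=580.0,
                   na=1.4, n1=1.518, n2=1.33, npix=256):
    """Scalar PSF from a pupil-plane phase mask (eq. 11) with the two-layer
    defocus phase of eq. (12). phase_mask: (npix, npix) array of phases in rad."""
    k = 2.0 * np.pi / wavelength_nm
    x = np.linspace(-1.0, 1.0, npix)                 # normalized pupil coordinates
    xx, yy = np.meshgrid(x, x)
    rho = np.sqrt(xx**2 + yy**2)
    pupil = rho <= 1.0                               # clear aperture out to the NA
    kz1 = np.sqrt(np.maximum(n2**2 - (na * rho)**2, 0.0))   # emitter-depth term
    kz2 = np.sqrt(np.maximum(n1**2 - (na * rho)**2, 0.0))   # focal-plane term
    field = pupil * np.exp(1j * (k * (z1_nm * kz1 + z2_nm * kz2) + phase_mask))
    psf = np.abs(np.fft.fftshift(np.fft.fft2(np.fft.ifftshift(field))))**2
    return psf / psf.sum()

# Hypothetical astigmatism-like mask: a quadratic phase difference between x and y.
x = np.linspace(-1.0, 1.0, 256)
xx, yy = np.meshgrid(x, x)
psf_defocused = engineered_psf(2.0 * (xx**2 - yy**2), z1_nm=300.0, z2_nm=-300.0)
```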
There are multiple ways in which a phase mask can encode the axial position of an emitter in the shape of the PSF, and various PSF designs have been used in recent years. These include an elliptical (astigmatic) PSF,(121–123) the rotating double-helix PSF (DH-PSF),(125, 126) the corkscrew PSF,(127) the phase ramp PSF,(128) the self-bending PSF(129) and the saddle-point/Tetrapods(130, 131) (Figure 6). These engineered PSFs span different axial ranges and exhibit a wide variety of shapes, but all have distinct features that change rapidly as a function of z position, such as changing eccentricity (Figure 6A), relative motion of two lobes (Figure 6B–D), lateral translation (Figure 6E), or relative motion, splitting, and recombining of multiple lobes (Figure 6F–H).
As mentioned above, these PSFs can be generated using several approaches. Simpler phase patterns can be induced using optics such as a cylindrical lens(123) or a glass wedge,(128) while other patterns require control over the phase pattern, as can be attained with a deformable mirror (DM),(132) a lithographically etched dielectric (e.g. quartz),(62) or a liquid crystal spatial light modulator (SLM).(126, 129–131) SLMs and DMs can be flexibly programmed to generate different phase patterns, and have advantages and disadvantages. SLMs have many pixels, but are designed to work for a specific linear polarization of light, halving the photons collected. This limitation can be avoided if necessary by using a “pyramid geometry” that collects both polarizations.(133) DMs have negligible loss, but generally have a limited number of actuators and cannot generate patterns with phase singularities or other fine features. Glass or quartz optics can generate features limited in fineness only by the fabrication method and have minimal loss, but are fabricated into a set pattern and cannot be changed.
There are multiple ways to determine z positions from image features in engineered PSFs. An intuitive and widely-applied approach is to choose an easily identifiable parameter in the image and generate a calibration curve relating that parameter and z. Such calibration curves can be derived from a theoretical model of the PSF or from empirical measurement of fluorescent beads or other bright fiducials, where the focal plane is translated by changing the distance between the objective and sample. For the case of the DH-PSF (Figure 6C), the angle of the line connecting the two revolving PSF lobes is such a calibration parameter (Figure 7A). For the astigmatic PSF (Figure 6A), the relevant parameters are the PSF widths along the x and y axes (Figure 7B), such that z is estimated from the calibration value that best matches the observed pair (wx,wy). These parameters are measured by fitting image data to an approximation of the PSF, such as a pair of 2D Gaussian functions fit to the DH-PSF lobes, or an asymmetric 2D Gaussian fit to the astigmatic PSF.
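In practice, the calibration-curve approach reduces to a nearest-match lookup: measure the parameters from a fit, then report the z whose calibrated values are closest. A minimal sketch for the astigmatic case is shown below; the linear calibration table is purely hypothetical (real calibrations are nonlinear and come from bead scans).

```python
import numpy as np

def z_from_widths(wx, wy, calib_z, calib_wx, calib_wy):
    """Return the calibrated z whose (wx, wy) pair best matches the measured
    widths; calib_z, calib_wx, calib_wy are 1D arrays from a calibration scan."""
    dist = (calib_wx - wx)**2 + (calib_wy - wy)**2
    return calib_z[np.argmin(dist)]

# Hypothetical linear calibration over +/- 400 nm (for illustration only).
z_cal = np.linspace(-400.0, 400.0, 81)
wx_cal = 200.0 + 0.10 * z_cal
wy_cal = 200.0 - 0.10 * z_cal
print(z_from_widths(220.0, 180.0, z_cal, wx_cal, wy_cal))   # -> 200.0 nm
```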
While intuitive, such parametrizations do not use all the data in the image. Most 3D PSFs have features that are not accurately fit by a Gaussian function; for example, it is difficult to imagine a simple parameter that uses all the information present in complicated-looking PSFs such as the Tetrapod family (Figure 6F–H). To make use of this information, one can instead fit the observed PSF to a 3D model function of the PSF. Such a model function can be numerically generated or interpolated from an experimental calibration. In this way, fine features of the PSF contribute to the estimate, improving the precision. (MLE, rather than LS fitting, is particularly desirable when fitting a complete PSF model, as MLE uses information from dim pixels more correctly.) Because the PSF need not have a single, easily-interpretable feature, this broadens the range of possible 3D PSFs.
Engineered PSFs exhibit a wide range of behavior in terms of size, applicable z-range, and attainable precision. For example, the astigmatic PSF has a small footprint on the detector, allowing for potentially more emitters per unit volume to emit simultaneously while avoiding PSF overlap, but a relatively small z-range, close to that of the standard PSF. On the other hand, the DH-PSF has a larger z-range, over which it exhibits precision (smaller CRLB) that is superior to and more uniform than that of astigmatism,(135) and the saddle-point/Tetrapod PSFs(130, 131) have an even larger applicable z-range at the cost of a potentially larger footprint. Figure 6 shows various PSFs along with their applicable z-ranges, and Figure 8A–C shows the phase patterns for the astigmatic, DH-PSF, and Tetrapod PSFs.
As mentioned earlier, Fisher information is a useful measure to assess the attainable precision of an imaging system, and can be used to compare different PSF designs.(135) Such a comparison is shown in Figure 8D, displaying the calculated CRLB of four PSFs – the standard PSF, astigmatism, Double-Helix, and a 3 μm range Tetrapod. The CRLB bounds the attainable variance, and hence lower CRLB is indicative of superior (smaller) precision. Key observations from Figure 8D include: 1) The standard PSF contains very little information about the axial (z) position of an emitter near the focal plane (z = 0), and only maintains good lateral (x,y) localization performance over a < 1 μm z range. Clearly, a standard PSF is a poor choice for 3D localization. 2) For all PSFs, axial precision is worse than lateral precision. 3) The Tetrapod PSF exhibits the best mean CRLB in 3 dimensions over the 3 μm range. This is because it was algorithmically designed to maximize Fisher information, and therefore to be optimally precise under the conditions simulated.(130) It is worth noting that the design algorithm assumed uniform Poisson background; in certain situations (such as in cells) this may be hard to achieve.
When selecting a PSF for 3D localization, a major consideration is its applicable z-range, which is the axial region where the CRLB is low. Figure 9 shows a comparison between a Tetrapod PSF designed for a 3 μm z-range and a Tetrapod PSF designed for a 6 μm z-range.(131) It is evident that the 6 μm Tetrapod exhibits a larger applicable z-range, albeit at the cost of poorer precision within the shorter (3 μm) z-range. This trade-off in performance means that the experimenter should pick a PSF with a range appropriate to the sample under consideration. If imaging time and drifts are not issues, a short-range PSF can be translated successively in z to build up an image.(101)
While the focus of this review is on 3D localization, the type of information that one can encode in a PSF is not limited to depth. PSF engineering has been used to encode the 3D orientation of an emitter(136, 137) and even its emission wavelength,(138, 139) adding new ways to extract information about a molecule’s identity and local environment while simultaneously obtaining a nanoscale position estimate.
3.3 Intensity, Interfaces, and Interference
The methods described so far all extract axial position by measuring the shape of the PSF in the far field. However, this is not the only information available from an emitter. Several methods have emerged that perform axial localization using properties such as the intensity gradient of the excitation field, the near-field coupling of the emission to the coverslip, or the phase information of the emitted light. While these are not directly related, the three examples we discuss below highlight the power of carefully considering the physical process in which light couples to and from the emitter, without relying on information from the shape of the standard PSF.
The first class of methods uses the profile of excitation light. If the fluorescence quantum yield and absorption cross-section of an emitter are constant, the fluorescence intensity from a single molecule will depend only on the excitation irradiance. Assuming this, relative axial position can be inferred from fluorescence intensity in the presence of a well-defined axial intensity gradient. One early example made use of the exponentially-decaying evanescent wave produced at the coverslip/sample interface by TIRF microscopy.(74, 140) The lateral position of fluorophores in a polyacrylamide gel was measured as in a standard 2D microscope, while changes in detected fluorescence intensity specified relative axial position. Given the decrease in intensity and thus localization precision away from the coverslip, 3D measurement using TIRF is limited by a short effective detection range. This has been employed indirectly to allow axial “sectioning” of 2D SR data using sequential imaging at different penetration depths.(141) The steep intensity gradient of a confocal illumination volume can be employed in an analogous fashion to that of the TIR illumination profile. While not a wide-field detection method, this has been used to track the motion of single nanoparticles, one at a time, by using an xyz translation stage in closed-loop operation, permitting the observation of complex cellular transport processes with a large range and high resolution both spatially and temporally.(142–145)
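Under the stated assumptions of constant quantum yield and cross-section, the TIR-based estimate amounts to inverting an exponential intensity decay; in the sketch below, the penetration depth and the reference intensity at the coverslip are assumed calibration inputs.

```python
import math

def z_from_tirf_intensity(intensity, intensity_at_coverslip, penetration_depth_nm):
    """Relative axial position assuming I(z) = I0 * exp(-z / d) for an evanescent
    excitation field with penetration depth d (nm)."""
    return -penetration_depth_nm * math.log(intensity / intensity_at_coverslip)

print(z_from_tirf_intensity(500.0, 1000.0, 150.0))   # ~104 nm above the coverslip
```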
A second related set of methods uses the effects of the sample-coverslip interface on emission, rather than excitation, to extract relative axial position. This can be done with the refractive index mismatch between sample and coverslip, measuring the coupling of emitted light into the coverslip rather than the evanescent excitation profile of TIR. For emitters more than ~λ distant from the coverslip, any light emitted at more than the critical angle (θc~61° relative to vertical for a water/glass interface) will be totally internally reflected back into the sample. However, as the emitter gets closer to the surface, “supercritical” light above this angle will be coupled into the coverslip and can be collected with a high-NA objective; the axial dependence of the coupling efficiency is comparable to that of TIR excitation with highly inclined illumination (Figure 10A).(146) (This supercritical, or “forbidden,” light was used to sense the position of a near-field pumping source near a single molecule in the early days of single-molecule spectroscopy.(147)) The supercritical and “undercritical” components of the fluorescence can be discriminated by placing an annular filter in the back focal plane, and comparison of the total fluorescence relative to the undercritical-only fluorescence allows determination of axial position without a large loss in signal as in the TIRF case (Figure 10B). This method has been demonstrated for in vitro experiments with 3D origami nanostructures and for single-molecule imaging very near the interface in cells.(76, 77)
A distinct interface-based axial localization method has been demonstrated that uses a coverslip coated in a thin gold film.(148) Fluorophores close to the film (< 100 nm separation) exhibit lowered fluorescence lifetimes due to coupling to surface plasmons, and the lifetime changes steeply and monotonically with distance from the film.(149) While the quenching lowers the total number of photons detected before photobleaching, lowering lateral precision, it is possible to achieve axial precisions of a few nanometers with standard fluorescent dyes by measuring single-molecule fluorescence lifetimes using confocal detection and a single-photon-counting detector.(148)
A third approach uses the phase of the single-molecule fluorescence, rather than intensity, to interferometrically estimate axial position. While this class of methods can be more experimentally demanding than others discussed in this review, it has high potential precision. Since the light emitted from a single molecule is self-coherent, it can be interfered with itself to sense the relative phase delay between different imaging paths. For a microscope, this can be accomplished using opposed parfocal objectives to collect light from above and below the sample (Figure 11A). A displacement of the emitter δ along the optical axis in a sample of refractive index n will advance the phase of light in one path and delay the other by approximately 2πnδ/λ = δk, causing a total phase shift of 2δk. Interfering the two paths adds axial interference fringes to the PSF: while a given x–y slice of the interfered PSF has roughly the same shape as it would in a conventional microscope (Figure 3), the total intensities of the PSF slices vary sinusoidally with z position, with period ~2π/2k = λ/2n. Before being applied to single-molecule studies, this effect was used in bulk fluorescence microscopies using confocal scanning (4Pi)(150) and wide-field (I5M/I5S)(151, 152) illumination. By axially scanning the sample and deconvolving the interferometric PSF, an image is produced with ≥5x improved axial resolution relative to standard confocal or wide-field microscopy.(153)
As discussed in Section 3.1, scanning is not generally feasible for dynamic single-molecule measurements, and as with multifocal microscopy, it is preferable to simultaneously sample several channels that mimic axial displacement of the sample. In the case of interference-based microscopy, this amounts to a different relative phase from the top and bottom objective. Rather than scanning the sample to create an interferogram, different effective displacements are sampled in each channel i by introducing a relative phase delay φi between the light from the two objectives. In the limiting case of low-NA collection, the total intensities detected from the molecule in each channel, Ii, have the relationship
Ii ∝ 1 + cos(2kz + φi)    (13)
for z the axial position relative to focus. (High NA lowers the effective value of k.(75, 154, 155)) A two-channel interferometric setup, i ∈ (1, 2), with relative phase delays φ = (0, π) can be generated by recombining the signal from the two objectives at a single 50:50 beamsplitter, since the π/2 phase retardation upon reflection produces a π difference between the two output ports (Figure 11). From eq. (13),
z = (1/(2k)) cos−1[(I1 − I2)/(I1 + I2)]    (14)
letting us extract the molecule’s position z from its measured intensity in the channels I1, I2. As the range of cos−1 is [0, π], eq. (14) demonstrates that phase aliasing limits the z values that can be detected in a two-channel interferometric microscope to a range of π/2k = λ/4n ≈ 120 nm. The Fisher information vanishes when the two beams perfectly interfere (at phase delays of 0 and π/2), as the cosine response of (13) reaches a point of zero derivative simultaneously for both channels, further limiting the practical range of such a microscope.
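As a concrete illustration of eq. (14), the short sketch below converts the two measured channel intensities into an axial position under the idealized low-NA model of eq. (13); the wavelength and refractive index are example values and background is neglected.

```python
import numpy as np

wavelength = 670.0   # emission wavelength in nm (example value)
n_medium = 1.38      # sample refractive index (example value)
k = 2 * np.pi * n_medium / wavelength

def z_two_channel(I1, I2):
    """Axial position from two interferometric channels with phase delays (0, pi),
    following eq. (14); single-valued only over the half-fringe 0 <= z <= pi/(2k)."""
    contrast = (I1 - I2) / (I1 + I2)
    return np.arccos(np.clip(contrast, -1.0, 1.0)) / (2 * k)

# Example photon counts detected in the two channels for one molecule
print(z_two_channel(900.0, 300.0))   # z in nm, relative to the zero-phase plane
```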
Sampling more than two phase delays resolves this ambiguity to achieve an axial range of λ/2n and provides more uniformly high Fisher information. Multiple names including “iPALM”(75) and “4Pi-SMS”(156) have been used for setups that use three or four detection paths with relative phase delays that are multiples of 2π/3 or π/2, respectively. These paths are created using a specially-designed beamsplitter or combination of beamsplitters and phase-shifting elements, respectively, with an approach generating four channels sketched in Figure 11A. The intensities in each detection path are analyzed following eq. (13) to extract z information: as shown in Figure 11B, the four detected intensities change in a well-defined fashion with z, repeating with period ~λ/2n. It was shown early on both by theoretical studies of the CRLB(156) and by experimental measurement(75) that such approaches provide high precision laterally (due to the increased number of photons from two objectives) and even better precision axially, on the order of σxy < 10 nm, σz < 5 nm for photon counts typical of fluorescent proteins when imaged using TIRF.
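To make explicit how additional phase-shifted channels remove this ambiguity, the sketch below performs a simple quadrature demodulation for four channels with delays (0, π/2, π, 3π/2) under the same idealized low-NA model of eq. (13). It is an illustration only, not the fitting procedure used in the iPALM or 4Pi-SMS implementations.

```python
import numpy as np

wavelength = 670.0   # nm, example emission wavelength
n_medium = 1.38      # example refractive index
k = 2 * np.pi * n_medium / wavelength

def z_four_channel(intensities):
    """Axial position (modulo lambda/2n) from four channels with phase delays
    0, pi/2, pi, 3pi/2, assuming I_i proportional to 1 + cos(2kz + phi_i)."""
    I0, I90, I180, I270 = intensities
    phase = np.arctan2(I270 - I90, I0 - I180)     # equals 2kz (mod 2*pi)
    return (phase % (2 * np.pi)) / (2 * k)

# Example photon counts in the four channels -> z in nm (repeats every lambda/2n)
print(z_four_channel([1100.0, 400.0, 300.0, 1000.0]))
```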
While effective for 3D SR imaging near the cell membrane, the first studies using single-molecule interference microscopy could not go beyond the λ/2n limit imposed by phase wrapping. Later implementations used additional features of the image to break this symmetry. For example, for high-NA optics and sufficiently high SNR, the phase oscillation is measurably different at the edges of the PSF, allowing fringes to be disambiguated by spatial filtering.(155) Alternatively, a feature such as astigmatism can be added to the PSF to coarsely localize the molecule to one of the fringes before extracting the precise “intra-fringe” position.(157) By avoiding phase aliasing, these methods achieve an axial range of interference localization microscopy limited primarily by the focal range of the PSF, ~700–1000 nm for the standard or astigmatic PSFs.
The high precision afforded by interference methods comes with practical drawbacks. One is the complexity of alignment and maintenance. Since any difference in the path length between the two imaging arms is directly reflected in the measurement, these paths must be stabilized to nanoscale tolerances over a path length on the order of a meter. Such stabilization can be performed with additional interferometric feedback systems.(151, 158) Alternatively, by placing a mirror directly above the sample to collect both paths through the same objective and recombining them using an SLM, it is possible to reduce the amount of stabilization needed, with a trade-off of somewhat poorer localization precision.(159) A separate difficulty arises when imaging thick samples, due to the phase aberrations produced by a heterogeneous refractive index. In a comprehensive study, it was recently demonstrated that by using adaptive optics, it is possible to compensate for both the aberrations of thick cells and the impact of sample drift to achieve accurate imaging at multiple positions within thick cells.(158) While this did not extend the range of the PSF beyond the limit of its focal range, the stability of the setup allowed whole-cell interferometric imaging over a total range of up to 10 μm by combining multiple sequentially acquired axial sections, where aberrations within each section were separately corrected.
Taken together, the multiple new methods for 3D localization described in this section give the experimenter many choices. Since the goal in all such studies is optimal utilization of the available emitted photons from single molecules, it continues to be essential to balance complexity and performance as needed for the application.
4. Challenges and New Developments in 3D Localization
As we have seen, there exist a wide variety of optical enhancements of the standard wide-field fluorescence microscope that significantly improve the precision of 3D localization, enabling otherwise impossible measurements. Along with new possibilities, this advance brings with it new challenges to overcome. Systematic errors can be introduced in 3D imaging by optical aberrations and by dipole emission effects. Similarly, the need to accurately preserve 3D information between imaging channels (such as in multicolor imaging or polarization imaging) requires a careful approach to registration. And in general, adding a new dimension can affect the resolution, sampling rate, and field of view of localization microscopy.
In this section, we highlight some technical implications of moving to 3D imaging and the approaches that have been developed to improve its accuracy and spatiotemporal resolution. This description of possible errors is not exhaustive, and other errors, such as those induced by sample drift over the course of acquisition,(106, 160–162) must also be corrected in quantitative single-molecule microscopy. (In the case of drift, correction can be achieved using fiducial markers, interferometry, or cross-correlation with post-processing or active feedback.) We close by offering a general prospectus of methodological improvements that have either just begun to be made, or that would be a desirable future addition to the field.
4.1 Spatially Dependent Optical Aberrations and Multichannel Registration
Nonidealities of the imaging system that would be minor for a diffraction-limited microscope can cause errors that overwhelm the much finer resolution of a single-molecule microscope if not corrected. Such aberrations can be considered as an additional undesirable phase pattern added to the system, i.e. to the function P(x′,y′) of eq. (11). Since this changes the 3D PSF and alters the expected phase of the emitter, these potential errors are relevant for most 3D microscopy methods. Careful optical alignment is the first step in minimizing aberrations, but even a perfectly aligned setup with index-matched media will have imperfect optics and deviate from the paraxial ideal.(163, 164) As long as the microscope PSF is spatially invariant across the transverse field of the image and at any depth within the sample, minor aberrations can be accounted for with a simple calibration, such as an axial scan of a fluorescent bead adhered to the coverslip. Such calibrations are used in all categories of 3D localization methods,(75, 101, 123, 126) and can be used to extract the exact wavefront aberration using phase retrieval.(165) However, several classes of aberrations undermine the assumption that there is a single PSF over the entire 3D volume, and require more comprehensive consideration and correction. These can be categorized as either depth-dependent or field-dependent aberrations.
Depth-dependent aberrations are due to a refractive index mismatch between the sample and immersion medium, as is common for biological samples (with index close to water, n = 1.33) imaged with high NA oil immersion (n = 1.51) objectives. In addition to axial distortions of the PSF due to spherical aberration, index mismatch induces a well-known rescaling of the effective focal plane depth (~30% for an oil-water mismatch).(68, 166, 167) Index mismatch can be greatly reduced by switching to water immersion objectives (and carefully aligning the sample and correction collar),(168) but because 3D single-molecule microscopy benefits from the highest possible NA, the lower NA available with low-index immersion media such as water may be undesirable. Two alternative solutions are calibration and correction via adaptive optics.
Calibrations of depth-dependent errors require a different approach than a simple axial translation of the sample, as this does not actually change the depth of the fiducial (which is typically adhered to the coverslip and does not suffer significant index mismatch effects). One powerful approach, demonstrated for the astigmatic PSF, is to move fiducials through a range of well-defined axial positions within the sample using an optical trap while keeping the focal plane fixed.(169) While it does not require any specific PSF, this approach has not been applied more generally, perhaps owing to its relative complexity. An alternative method computationally synthesizes the 3D PSF at different depths by adding appropriate amounts of aberration to the empirical phase retrieved from a standard calibration. This has been shown to be effective for multifocal(170) and astigmatic(171) localization microscopy.
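The idea of computationally synthesizing depth-dependent PSFs from a pupil-plane model can be sketched with scalar diffraction: add the index-mismatch phase for the desired depth to a (retrieved or ideal) pupil function, then Fourier transform to the image plane. The simplified sketch below uses an ideal circular pupil, drops supercritical components, and omits the vectorial and apodization factors included in rigorous treatments; all parameters are example values.

```python
import numpy as np

wavelength = 0.67                 # um, example emission wavelength
NA = 1.4                          # example objective NA
n_imm, n_sample = 1.518, 1.33     # immersion oil and (watery) sample indices

N = 256                           # pupil grid size
k0 = 2 * np.pi / wavelength
kmax = NA * k0                    # largest transverse wavevector passed by the objective
kx = np.linspace(-kmax, kmax, N)
KX, KY = np.meshgrid(kx, kx)
KR2 = KX ** 2 + KY ** 2

# Drop supercritical components (evanescent in the sample) in this simple sketch
k_cut = min(kmax, n_sample * k0)
pupil = (KR2 <= k_cut ** 2).astype(float)

# Axial wavevector components in the immersion medium and in the sample
kz_imm = np.sqrt(np.maximum((n_imm * k0) ** 2 - KR2, 0.0))
kz_sam = np.sqrt(np.maximum((n_sample * k0) ** 2 - KR2, 0.0))

def psf_slice(z_focus_um, depth_um):
    """Intensity PSF for a defocus z_focus (in the immersion medium) of an emitter
    located depth_um into the index-mismatched sample (simplified scalar model)."""
    phase = kz_imm * z_focus_um + (kz_sam - kz_imm) * depth_um   # defocus + mismatch aberration
    field = np.fft.fftshift(np.fft.fft2(np.fft.ifftshift(pupil * np.exp(1j * phase))))
    psf = np.abs(field) ** 2
    return psf / psf.sum()

deep_psf = psf_slice(z_focus_um=0.5, depth_um=5.0)   # PSF slice for an emitter 5 um deep
```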
Adaptive optics (AO) is a powerful means to correct aberrations, and works by measuring or otherwise inferring the aberration phase, then using a programmable modulator to cancel it out.(172, 173) The aberration phase is typically defined and corrected in a plane conjugate to the back focal plane (also referred to as the pupil plane for AO applications), and is modulated by a programmable SLM or DM in this conjugate plane. This is accomplished with a 4f system similar to that described in Section 3.2, and most AO modulation schemes are straightforward to combine with other PSF engineering methods. Such a geometry is chosen in order to add the same correction to all points within the field of view. Multiple approaches have been reported to determine the appropriate correction phase for a given focal plane depth within the sample. Applications of AO to confocal and 2-photon microscopy often use a bright fluorescent point source or “guide star” to measure the 3D PSF in situ. The aberration phase can be explicitly detected, such as with a Shack–Hartmann wavefront sensor conjugate to the back focal plane,(174) or inferred (as is useful for scattering samples and/or thick tissue) indirectly from image-based readouts such as pupil segmentation.(175) In the context of single-molecule imaging, aberration correction has been informed by direct wavefront measurement as well as iterative image optimization with guide stars.(132) Another approach iteratively optimizes the PSFs detected from single molecules observed in the first frames of the SR acquisition.(176) These methods result in improved 3D SR reconstructions in cells, and have permitted 2D SR imaging in tissue at depths of 50 μm.(177) AO was also recently introduced for super-resolution interference microscopy by using one DM in each imaging arm, where system aberrations were corrected by iteratively optimizing the PSF for a guide star at the coverslip.(158)
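A “sensorless” correction loop of the kind used for iterative image optimization can be sketched as follows: scan the amplitude of one corrective mode at a time, score a trial image with a sharpness metric, and keep the best amplitude before moving to the next mode. The acquisition and modulator interfaces (acquire_image, set_zernike) are hypothetical placeholders for the experiment-control software, and the metric shown is only one of many reasonable choices.

```python
import numpy as np

def sharpness(image):
    """Simple image-quality metric: sum of squared intensities of a
    background-subtracted frame (larger when the PSF is more concentrated)."""
    img = image - np.median(image)
    return float(np.sum(img ** 2))

def optimize_modes(acquire_image, set_zernike, modes=(3, 4, 5, 6, 7),
                   amplitudes=np.linspace(-1.0, 1.0, 9)):
    """Sensorless AO sketch. 'acquire_image()' returns a camera frame and
    'set_zernike(mode, amp)' programs the SLM/DM; both are hypothetical callables.
    'modes' are low-order Zernike indices (convention left to the control software)."""
    best = {}
    for mode in modes:
        scores = []
        for amp in amplitudes:
            set_zernike(mode, amp)             # apply trial correction for this mode
            scores.append(sharpness(acquire_image()))
        best_amp = amplitudes[int(np.argmax(scores))]
        set_zernike(mode, best_amp)            # keep the best amplitude before moving on
        best[mode] = float(best_amp)
    return best
```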
Aberrations can also depend on position within the sample, such as for cells or tissue with heterogeneous index, as well as for several classes of system aberrations. For many experiments near the coverslip, sample-induced aberrations may not cause a problem, but for precise imaging deep into a biological sample, they should be addressed. For confocal and other scanning microscopies, specimen-induced field-dependent aberrations are relatively tractable, and can be accounted for by performing separate interferometry measurements and AO correction at each pixel of the image. For diffraction-limited confocal imaging, correcting field-dependent system aberrations is more complicated and often unnecessary.(178) Field-dependent AO becomes more difficult for wide-field single-molecule techniques, both in measurement (emitters from across the field of view contribute to the same pupil phase) and in correction (phase applied at the pupil alters the PSF for the entire field). It is possible to perform AO corrections conjugate to planes other than the back focal plane, such as at the sample plane.(179) While such corrections could potentially address field-dependent aberrations in single-molecule imaging, this has not been demonstrated.
The high precision of single-molecule methods makes them more susceptible to any system imperfections. Since it is difficult to sense or correct the wavefront of wide-field microscopes in a field-dependent manner, it is expedient to instead directly measure and correct the effect of these aberrations by field-dependent PSF calibration. Such calibrations can be performed by using regularly-spaced point emitters such as subdiffraction apertures (nanoholes) in a metal film filled with fluorescent dye (Figure 12A).(180) Both double-helix and astigmatic PSFs have been tested in this manner by calibrating the PSF over a 30 μm field of view (Figure 12B), and both the rate of change of the PSF with z and the apparent focal position were observed to vary over the field of view relative to an arbitrary reference calibration. This work showed that using a single calibration could potentially lead to errors in 3D measurements of 20% or more over the field of view (Figure 12C). The observed errors depended on the choice of engineered PSF phase pattern, with the DH-PSF showing less error than astigmatism. (In this case, both PSFs were induced at the back focal plane; it can be expected that inducing astigmatism by placing a cylindrical lens in an intermediate plane would couple field and Fourier space, resulting in worse field-dependent errors.) Using a spatially varying calibration function derived from all the nanoholes, axial localization errors of ~50–100 nm were reduced to less than 25 nm (including random localization error) over a ~25 μm × 25 μm × 1.4 μm volume. Given that errors increased away from the reference calibration, larger fields of view are likely to be more susceptible to such errors if not calibrated by local sampling of the 3D PSF. The high NA objective is one likely contributor to these aberrations, as supported by recent studies of (global) aberrations in orientational imaging(181) and comparative modeling of Fourier microscopy configurations.(182)
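One simple way to exploit such nanohole data is to keep a separate axial calibration curve for each nanohole and, for a molecule at an arbitrary field position, invert the curves of the nearest nanoholes and combine the results with distance-dependent weights. The sketch below uses synthetic positions and calibration curves as placeholders and is a simplified stand-in for the spatially varying calibration function of the cited work.

```python
import numpy as np

# Hypothetical field-dependent calibration: each nanohole i at field position
# (x_i, y_i) has its own measured curve signal(z) (e.g., the astigmatic width
# difference wx - wy) from an axial scan of the nanohole array. Placeholder data:
rng = np.random.default_rng(0)
nanohole_xy = rng.uniform(0, 25_000, size=(100, 2))       # nm, placeholder positions
z_cal = np.linspace(-700.0, 700.0, 29)                     # nm
slopes = 1e-3 * (1 + 0.2 * nanohole_xy[:, 0] / 25_000)     # slowly varying across the field
curves = slopes[:, None] * z_cal[None, :]                  # signal(z) per nanohole

def local_z(signal, xy, n_neighbors=4):
    """Estimate z for a molecule at field position xy by inverting the calibration
    curves of its nearest nanoholes and averaging with inverse-distance weights."""
    d = np.linalg.norm(nanohole_xy - np.asarray(xy, dtype=float), axis=1)
    idx = np.argsort(d)[:n_neighbors]
    weights = 1.0 / (d[idx] + 1.0)                          # +1 nm avoids divide-by-zero
    z_estimates = [np.interp(signal, curves[i], z_cal) for i in idx]
    return float(np.average(z_estimates, weights=weights))

print(local_z(signal=0.35, xy=(12_000, 8_000)))             # z estimate in nm
```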
Channel registration can be seen as a special case of compensating for field-dependent aberrations. When images of the same molecule in multiple channels are used together to generate a localization (e.g., for multifocal, interferometric, and some PSF engineering methods, as well as polarization imaging), the different field-dependent aberrations, translations, and magnifications of each channel contribute to the final reconstruction in a complicated fashion. In multicolor imaging, where different molecules appear in each channel, mismatched aberrations or poor channel alignment are even more problematic, and can introduce a false displacement between two labels of different colors. In this case, channels must be registered together to an accuracy at least as good as the image resolution to avoid artifacts in measurements of colocalization or correlation between different labels. (Here, “channels” may refer to imaging paths that impinge on different cameras or regions on the same camera, or simply to the differences due to chromatic aberrations and/or filters that vary between labels.)
A powerful way to register localizations within the same coordinate system is to derive a transformation function from control points (CP), i.e. localizations of fiducials visible in both channels. Registration accuracy depends on the number of CP sampled and the type of transformation used.(183) For example, an affine transformation (rotation, translation, scaling, and/or reflection) in 2D requires only three CP pairs, while finer distortions require commensurate increases in control point sampling density. When discussing registration accuracy, it is important to clarify the multiple error metrics that have been used, including the Fiducial Registration Error (FRE) and Target Registration Error (TRE): the FRE is defined for CP pairs that were used when generating the transformation function, while the TRE is defined for a set of CP pairs that were set aside and did not contribute to the transformation. (183) The FRE is a useful measure of CP localization errors, but can give extremely overoptimistic estimates of registration accuracy over the field of view. For example, a 2D affine transformation estimated from three CP pairs will overlay the CPs perfectly and give a FRE of zero, yet this clearly does not mean that all aberrations within the field of view have been corrected! The TRE, on the other hand, reports the registration accuracy at intermediate positions not used in constructing the transformation, and is therefore a more representative measure of the error added during registration.
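The distinction between FRE and TRE is easy to make concrete: fit a transformation on one subset of control-point pairs and evaluate the residuals both on that subset (FRE) and on held-out pairs (TRE). A minimal 2D affine example with synthetic control points (placeholder coordinates and noise) is sketched below.

```python
import numpy as np

def fit_affine(src, dst):
    """Least-squares 2D affine transform mapping src -> dst.
    src, dst: (N, 2) arrays of matched control-point coordinates (nm)."""
    A = np.hstack([src, np.ones((len(src), 1))])          # rows [x, y, 1]
    params, *_ = np.linalg.lstsq(A, dst, rcond=None)      # (3, 2) parameter matrix
    return params

def apply_affine(params, pts):
    return np.hstack([pts, np.ones((len(pts), 1))]) @ params

def rms_error(params, src, dst):
    resid = apply_affine(params, src) - dst
    return float(np.sqrt(np.mean(np.sum(resid ** 2, axis=1))))

# Synthetic control points: channel 1 positions mapped to channel 2 with a small
# linear distortion, an offset, and 5 nm localization noise (placeholder values).
rng = np.random.default_rng(1)
src = rng.uniform(0, 20_000, size=(40, 2))
linear = np.array([[1.001, 0.002], [-0.002, 0.999]])
dst = src @ linear + np.array([35.0, -12.0]) + rng.normal(0, 5.0, src.shape)

fit_idx, test_idx = np.arange(0, 30), np.arange(30, 40)   # split the CP pairs
params = fit_affine(src[fit_idx], dst[fit_idx])
print("FRE:", rms_error(params, src[fit_idx], dst[fit_idx]))    # points used in the fit
print("TRE:", rms_error(params, src[test_idx], dst[test_idx]))  # held-out points
```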
It is common to use ~10 CPs scattered throughout the field of view to generate an affine or low-order global polynomial transformation function for multicolor 3D SR microscopy.(111, 155, 184, 185) This is typically sufficient for registration in applications that do not require fine correlative measurements. However, it was recognized early in 2D colocalization studies that much finer sampling and more flexible local transformation functions were necessary for accuracy on the order of 10 nm.(186, 187) This approach has been extended to 3D SR using a locally weighted transformation function capable of achieving < 10 nm registration accuracy and accurately overlaying super-resolved 3D structures in bacteria using many fluorescent beads sequentially scanned through the 3D imaging volume as control points.(62) As shown in Figure 13, the same protocol can be applied using nanohole arrays. By localizing thousands of control points in both color channels throughout the 3D volume, a high sampling density is achieved, giving a mean TRE < 8 nm. While CP localization precision can contribute slightly to registration errors, this accuracy is still primarily limited by spatial sampling, and it was shown that mean 3D TREs below 6 nm can be achieved by imaging > 25 CPs/μm3.(62) More fundamental limits such as the pixel-to-pixel-nonuniformity of the detector camera have been described for 2D imaging,(188) which should be possible to calibrate in the same manner for 3D imaging.
4.2 Dipole Effects Which Can Lead to Localization Bias
One specific feature of single-molecule localization microscopy arises from the fact that individual molecules are not isotropic emitters: the dipole nature of the emission from a single fluorophore can be non-negligible, and even informative.(189–194) A practical implication of the dipole nature of fluorophore emission is that the emitter’s orientation may need to be taken into account when localizing single molecules in microscope images. Ignoring this effect can lead to localization errors: out-of-focus molecules can appear to be shifted in the lateral (x–y) plane, simply due to their non-isotropic emission profile.(133, 195, 196) This localization error can be appreciable (~100 nm or more) for molecules far from the nominal focal plane of the microscope. Since this is the regime of interest for 3D imaging, the potential x–y localization bias must be considered.
The x–y localization bias would not occur if the emitters were highly rotationally mobile, because then any anisotropy from the dipole emission pattern would be removed in a time average. To quantify this effect, Lew and Backlund et al. considered the situation for fluorophore labels whose rotational flexibility follows the well-known “wobble in a cone” model.(197) This model assumes that the tip of the emission dipole moves randomly in a 3D cone with half-angle α, and the results of this simulation are shown in Figure 14A. The tilted ellipsoids show the effective intensity distribution near the detector in an x–z projection: the recorded image will be the intersection of this ellipsoid with the x–y oriented camera. As shown in the narrow panels, molecules at different z positions in the sample will produce laterally shifted images. The two cases (A) and (B) refer to α = 15° (a small cone angle indicative of relatively fixed dipoles) and α = 60° (a highly mobile dipole orientation). The conclusion of this modeling study was that the lateral error can be bounded to less than ~20 nm for the larger cone angle, on the order of the localization precision. Notably, for fluorophores attached to an antibody with a floppy saturated carbon linker, this situation is likely to be realized for most emitters. Polarized measurements with a bisected-pupil PSF confirmed that the rotational flexibility is high in this situation(137). Nevertheless, not all attachment strategies or fluorescent labels allow free emitter rotation, requiring a different solution.
A brute-force approach to handling dipole-induced localization bias is to measure the 3D orientation of a fluorophore, and later correct for the localization error by post processing. The orientation can be determined by a variety of methods, ranging from the use of polarization optics(190, 191, 198) to analysis of the defocused emitter’s image.(192, 199) One method to determine orientation combines polarization microscopy with PSF engineering, which has the advantage of being simple, precise, and robust to minor defocus errors.(86) This can be achieved, for example, by using the DH-PSF,(133) or by PSFs with a quadrated pupil,(136) or by a bisected pupil.(137) To directly show that orientation measurements can be performed simultaneously with 3D localization microscopy, Backlund and Lew et al. took advantage of the fact that polarized DH-PSF images show varying degrees of lobe asymmetry depending upon z position (Figure 14B, left three columns) (200). By measuring lobe asymmetry, linear dichroism, and the z position of each emitter, the shift error can be estimated from a model and removed from the final determination of the x–y position of the emitter. This approach was illustrated for the case of relatively photostable single molecules immobilized in a polymer, and the right column of Figure 14B shows the x–y localizations found as one single molecule is translated in z, both without (u) and with (c) the correction.
In some experiments, it is quite useful to measure orientations of single molecules along with positions,(202) but this extra step requires more detected photons. Alternatively, there is a direct, purely optical solution to the mislocalization problem, proposed recently by Lew et al. (201) and illustrated in Figure 14C. This solution recognizes that the asymmetry giving rise to the shifted spot on the detector arises only from the radially polarized light in the back focal plane, and so utilizes an azimuthal polarizer to give unbiased lateral localization determination. To demonstrate this experimentally, Backlund et al. used a “y-phi” nanopost mask which converts the azimuthally (i.e. φ) polarized light into y-polarized light,(203) so that the addition of a polarizer to reject x-polarized light restores a final centrosymmetric image on the detector. Analyzing the final image with a centrosymmetric estimator extracts the molecular position without any dipole shift bias. Thus with the analyses and experimental demonstrations in this section, the problem of localization biases in 3D localization microscopy has been addressed in several ways.
4.3 Precision, Sampling Density, and Resolution
As in 2D microscopy, the concept of resolution of an image is very important in the 3D case. As a starting place, many investigators work to quantify the precision of the localizations, and use this as an estimate of resolution. The process of assessing precision can be done in several ways, with increasing degrees of accuracy and sophistication. As described in Section 2.2, eq. (6) provides an analytical expression for the localization precision, σ, in 2D for least-squares fitting of the standard PSF.(85, 88) Similarly, the attainable precision of 3D estimation methods can be calculated with the CRLB,(82, 135, 156) but this may be overoptimistic. Not every estimator will reach the CRLB, and CRLB analysis can be blind to experimental imperfections including aberrations and uneven background. An empirical solution is to localize the same stationary emitter many times,(204) using the standard deviation of the position estimates in all three dimensions as an estimate of the localization precision. To go further, in a study by Gahlmann et al. using the DH-PSF,(62) a fluorescent bead was localized in 3D for many frames, while the detected signal and background were varied by changing the pumping power and illuminating the sample with small amounts of white light. The resulting calibration surface for precision was used to define the precision of each localization based on the observed number of detected photons and background.
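The empirical approach amounts to repeatedly localizing a stationary emitter and taking the per-axis standard deviation of the position estimates; the few lines below sketch this with placeholder data.

```python
import numpy as np

def empirical_precision(localizations):
    """localizations: (N_frames, 3) array of repeated (x, y, z) estimates of one
    stationary emitter, in nm. The per-axis standard deviation serves as an
    empirical estimate of the localization precision."""
    return np.std(localizations, axis=0, ddof=1)

# Placeholder data: 500 repeated localizations of a fixed fluorescent bead
rng = np.random.default_rng(2)
locs = rng.normal(loc=[1000.0, 2000.0, 150.0], scale=[12.0, 11.0, 25.0], size=(500, 3))
print(empirical_precision(locs))   # approximately (12, 11, 25) nm
```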
However, the true resolution of an image depends upon the ability to distinguish two objects close together, which can be different from simply knowing the localization precision of one isolated object. An estimation-theoretic approach to the case of discriminating the distance between two single emitters was presented by Ram et al.,(205, 206) carefully treating the difference between simultaneous and sequential localization. In all cases, the best-case precision for the estimate of the distance between two emitters is √(σ1² + σ2²), the sum in quadrature of the two emitters’ localization precisions. In order to capture the true resolution of extended structures, including other effects such as label size, it is common to experimentally demonstrate resolution in an SR reconstruction by showing a cross-section of two close-by, well-defined structures such as crossing microtubules.(207)
Regardless of the PSF, the localization precision and fundamental resolution limits for SR scale approximately as the inverse square root of the number of photons detected from each emitter, eq. (7). For this reason, it is important to acquire as many photons as possible. Strategies that improve photon counts include the use of high-yield dyes, high objective NA, and optimizing the experiment to avoid losses. For example, using more focal planes than necessary in multifocal microscopy can result in photons wasted in out-of-focus planes, while using SLMs for PSF engineering can result in a loss of half the signal due to polarization requirements. One of the advantages of interferometric microscopy is the doubled collection efficiency due to using two objectives, and dual-objective setups can benefit other methods as well.(208, 209) Since background worsens localization precision, it must be limited using appropriate filters. Concentrating the PSF using aberration correction can also limit contamination by background by reducing the number of pixels over which the signal is spread.
A critical concern for resolving extended structures is sampling density: if the density of labels on a structure is too low, or if too few labels are detected (e.g. due to emitters prematurely bleaching or not activating), fine structural details can be lost. Early on, the need for Nyquist sampling was clearly noted,(210) requiring a density of labels such that at least two molecules are sampled over an interval of the quoted resolution. A more sophisticated approach pursued by two groups defined a Fourier Ring Correlation (FRC) resolution metric for single-molecule imaging, based on similar approaches in cryo-electron microscopy.(211, 212) From a set of localizations, two reconstructed images are generated from disjoint subsets, and the correlation of the two images’ Fourier transforms along a radial coordinate (i.e. rings, or in 3D, shells) gives an FRC (in 3D, FSC) curve. The effective resolution is defined as the inverse of the highest spatial frequency at which the correlation remains above a given threshold. This metric effectively senses the degree to which high spatial frequencies in the underlying structure are sampled, taking into account both localization precision and label density. This provides a simple way to gauge approximate resolution and analyze drift correction effectiveness. However, the measured resolution from FRC is sample-dependent: samples (or regions within a given sample) with more high-frequency content will seem to have a higher resolution even if the sampling density and localization precision are the same.(211, 213)
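A simplified version of the FRC computation for 2D localization data is sketched below: split the localizations into two random halves, render each as a histogram image, and correlate the two Fourier transforms over rings of spatial frequency; the resolution is then read off where the curve first crosses a chosen threshold (1/7 is a common choice). Pixel and image sizes are example values, and practical implementations add corrections for spurious correlations and averaging over repeated splits.

```python
import numpy as np

def render(locs, pixel_nm, size_px):
    """2D histogram image from (N, 2) localizations in nm."""
    edges = np.arange(size_px + 1) * pixel_nm
    img, _, _ = np.histogram2d(locs[:, 0], locs[:, 1], bins=(edges, edges))
    return img

def frc_curve(locs, pixel_nm=10.0, size_px=512, rng=None):
    """Fourier Ring Correlation between two random half-data reconstructions."""
    if rng is None:
        rng = np.random.default_rng()
    half = rng.permutation(len(locs)) < len(locs) // 2
    F1 = np.fft.fftshift(np.fft.fft2(render(locs[half], pixel_nm, size_px)))
    F2 = np.fft.fftshift(np.fft.fft2(render(locs[~half], pixel_nm, size_px)))
    f = np.fft.fftshift(np.fft.fftfreq(size_px, d=pixel_nm))   # spatial frequency, 1/nm
    FX, FY = np.meshgrid(f, f)
    df = f[1] - f[0]
    ring = np.round(np.sqrt(FX ** 2 + FY ** 2) / df).astype(int)
    num = np.bincount(ring.ravel(), weights=np.real(F1 * np.conj(F2)).ravel())
    den = np.sqrt(np.bincount(ring.ravel(), weights=(np.abs(F1) ** 2).ravel())
                  * np.bincount(ring.ravel(), weights=(np.abs(F2) ** 2).ravel()))
    freqs = np.arange(len(num)) * df
    return freqs, num / np.maximum(den, 1e-12)

# Placeholder example: a "filament" along x with 15 nm localization spread in y
rng = np.random.default_rng(4)
locs = np.column_stack([rng.uniform(500, 4500, 50_000),
                        2000 + rng.normal(0, 15.0, 50_000)])
freqs, frc = frc_curve(locs, rng=rng)
crossing = np.nonzero((frc < 1.0 / 7.0) & (freqs > 0))[0][0]
print("FRC resolution ~", 1.0 / freqs[crossing], "nm")
```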
It is worthwhile to realize that the number of fluorophore labels required to sample a space grows rapidly with its dimensionality. On the one hand, this can be helpful, as the increased spatial sparsity can help differentiate separate regions, such as in a recent study of target search kinetics of regulatory noncoding RNA.(214) However, if one wishes to image a full 3D object in 3D, the requirements for label density can be extreme. Fortunately, in many real-world cases, structures imaged in 3D have a lower dimensionality (1 for linear structures like microtubules, 2 for sheet-like structures like membranes), and the sampling density requirements are no worse than for 2D SR techniques.
Extracting the position of an emitter from its image becomes more challenging in the presence of neighboring emitters in the ROI. This is because the PSFs of adjacent emitters overlap, which degrades the performance of any fitting method. The overlap problem already arises in the case of 2D imaging, limiting the attainable temporal resolution of single-molecule super-resolution imaging methods by forcing a low emitter density per frame. The problem of fitting multiple emitters within the ROI has been tackled by various methods, such as using maximum likelihood estimation to find the most probable number (and position) of emitters,(215, 216) using a prior assumption of sparsity in the ROI (i.e. that only a relatively small number of emitters overlap in each frame),(217) or employing a spatiotemporal statistical model for blinking and overlap.(218, 219)
In the case of 3D imaging, the overlap problem becomes more severe. This is because PSFs designed to encode depth are typically larger than the standard PSF (with some exceptions(127)). Figure 15 illustrates this problem, along with a naïve attempt at a solution.(139) A region of interest consisting of 5 emitters imaged using a 6 μm range Tetrapod PSF is simulated. Each emitter is then fit using MLE, ignoring the other emitters and assuming (wrongly) a spatially constant background with Poisson noise distribution. Once an emitter is fit, its model is subtracted from the ROI, and the next emitter is fit. This basic multiemitter fitting method, while possibly providing reasonable results, is clearly sub-optimal. Indeed, more sophisticated methods have been suggested to overcome the overlapping PSF problem in 3D, such as simultaneous multi-emitter fitting with an astigmatic PSF,(220) employing sparsity,(221, 222) combining astigmatism with biplane imaging,(223) or 3D deconvolution based on the alternating direction method of multipliers algorithm.(224)
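The naive fit-and-subtract procedure described above can be written schematically as follows. Here psf_model stands for whichever engineered-PSF forward model is in use (a hypothetical callable), and a least-squares fit is used as a simple stand-in for the MLE of the cited example.

```python
import numpy as np
from scipy.optimize import least_squares

def fit_and_subtract(roi, psf_model, initial_guesses, background=10.0):
    """Naive multi-emitter fitting: fit one emitter at a time, subtract its model
    image from the ROI, then fit the next. 'psf_model(x, y, z, photons, shape)' is
    a hypothetical forward model returning an image of the engineered PSF;
    'initial_guesses' is a list of (x, y, z, photons) tuples from a detection step."""
    residual_img = roi.astype(float) - background        # assume constant background
    fits = []
    for guess in initial_guesses:
        def residuals(p):
            return (psf_model(*p, shape=roi.shape) - residual_img).ravel()
        sol = least_squares(residuals, x0=np.asarray(guess, dtype=float))
        fits.append(sol.x)
        residual_img = residual_img - psf_model(*sol.x, shape=roi.shape)  # peel off this emitter
    return fits
```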
4.4 Future Directions for 3D Localization Microscopy Methods
So far, we have touched on many relatively mature technical developments of 3D localization microscopy, but there is much work left to do. Of course, there are many areas of active development that can be expected to be fruitful for SR and SPT in general, including the design of fluorophores with higher photon yields, less-perturbative labeling techniques, lower-noise detectors, and improved sample buffers. Here, we discuss several research trends specifically pertinent to 3D SR and SPT, highlighting those now beginning to emerge and that we believe will be of increasing importance as these fields develop.
An emerging area for single-molecule and other super-resolution methods has been to address the imaging of large volumes throughout thick, preferably living, cells and tissue. This presents several challenges for 3D localization techniques: thicker cells increase the total amount of background from autofluorescence and out-of-focus emitters, as well as the total number of emitters that must be localized, and the requirement that cells be alive raises concerns about phototoxicity and imaging speed. TIRF is a common method to reduce background by reducing the illumination volume, but is not useful away from the coverslip. Lightsheet microscopy, by contrast, sweeps a ~1–3 μm thick beam through any slice of the sample, sequentially scanning thin volumes within the cell. The development of lightsheet imaging for single-molecule imaging is an area of active research, with only a few current examples of applications to 3D localization.(213, 225, 226) A major source of phototoxicity is DNA damage due to UV or near-UV illumination used in photoactivation or photoswitching. Using lightsheet or two-photon(227) illumination for photoactivation can reduce the total amount of irradiation (and reduce activation of out-of-focus emitters), and engineering dyes that are activatable with relatively red light(228, 229) and/or with higher photoactivation quantum yield(51) can further reduce the need for damaging UV light. Given that imaging a larger volume increases the number of labels that must be imaged for a full reconstruction, acquisition rates will become increasingly important, and it will be necessary to ensure high fluorophore switching rates and image acquisition rates. In part, this may depend on multiemitter fitting algorithms as described above. Since even nominally “dark” or nonemissive labels can still emit some light, it will also be important to ensure the contrast between emissive and nonemissive labels is as high as possible. It will also be necessary to further develop methods for AO, potentially correcting at a plane within the sample to accommodate specimen-induced aberrations.(179)
Another area of interest is quantitative microscopy, including methods that measure molecule stoichiometries and clustering,(230) and that generate particle averages between many identical particles in the sample.(231) 3D localization will improve both of these developing techniques. In clustering studies, adding another dimension helps improve sparsity, allowing better resolution of oligomers or assemblies within the sample.(214) Particle averaging techniques, adapted from those used in cryo-electron microscopy, aim to reconstruct an averaged particle from many presumed-identical particles within the sample. (These techniques can be viewed as an extension of using SR to measure average spatial distributions and distances within a sample, examples of which are described in Section 5.) Particle averaging with single-molecule localization data is capable of reaching Å-scale precision, but has only been applied to a few systems, including the nuclear pore complex.(232, 233) Applying 3D localization will help ensure correct alignment of randomly oriented particles for averaging, and we expect forthcoming developments in registration algorithms for particle averaging in 3D microscopy to enable this technique to be used more broadly.
Finally, as is evident from the many topics within this review, the field of 3D localization faces the organizational challenge of having a large number of details for the experimenter to understand. This makes it difficult for nonspecialists to enter the field and appropriately select between a wide field of options. We anticipate the need for simple and universal selection criteria that apply to all the techniques we have described, ensuring that the correct one can be applied to a given problem. Comparisons of CRLBs have been a useful step in this direction.(130, 135) We believe that the formalism of estimation theory would allow a more comprehensive approach to defining overall imaging resolution available with a given technique, including the conserved product of spatiotemporal bandwidth, field of view and depth of field, and SNR. We are encouraged to see two recent reviews introduce ways to comprehensively describe these quantities in single-molecule microscopy, including a treatment for SR imaging(234) and one for SPT.(34) Along with theoretical comparisons, it will be necessary to characterize how robust various imaging methodologies are to common aberrations, including the effect aberrations have on precision and resolution.(235, 236) Another practical concern is the performance of different localization algorithms, and it will be necessary to continue current efforts to compare existing software packages for SR(237) and SPT(238) as our theoretical understanding and frameworks develop.
5. Selected Applications
The development of single-molecule imaging techniques and their subsequent extension to three dimensions have benefited many domains of cell biology.(27, 63, 239, 240) There is also an interest in applications of single-molecule methods to bottom-up studies in polymer and materials science.(241–243) Here, we present selected studies that demonstrate the utility of accurate and precise 3D single-molecule localization for quantitative structural and dynamical measurements.
5.1 Super-Resolution Imaging
Focal adhesions (FAs) are one of the best-characterized examples of how cells mechanically interact with their external environment, and ~150 known components and hundreds of inter-component interactions of the integrin adhesome have been genetically characterized.(244) FAs of isolated cells are axially aligned, reaching from the cell onto the coverslip. This makes them a particularly attractive system for 3D SR, and difficult to study with other modalities. While measurements with other methods such as electron microscopy and atomic force microscopy showed that FAs form <200 nm thick plaques, researchers had difficulty determining 3D protein organization due to the limitations of specific labeling in those modalities.(245, 246) Early 2D multicolor super-resolution studies revealed that different molecular species were not homogeneously distributed within FAs, but were unable to define any overall organization due to the lack of 3D resolution(210). The need for both molecular specificity and fine axial resolution within the thin FA domain was thus a good fit for early iPALM given its < 10 nm 3D localization precision and < 225 nm range.(75) In a systematic study, Kanchanawong et al. used iPALM to measure the axial distribution of many FA proteins labeled with photoswitchable fluorescent proteins in separate single-color experiments (Figure 16A,B). Because FAs have roughly the same dimensions and are inherently aligned, it was possible to use the plane of the coverslip as a reference position between many separate experiments; this allowed the same label and optical channel to be used to image each protein, avoiding registration errors. Measurements of representative proteins belonging to different functional modules revealed a consistent axial organization of these proteins, such that each could be assigned to distinct multilaminar subdomains within the FA (Figure 16C). Extensions of this work include the addition of FRET sensors to add readout of molecular conformation to 3D SR of FAs.(247) The utility of 2D SPT measurements in FAs(248, 249) suggests a possible place for 3D SPT in future studies, and 3D SR of FAs still has room to expand: for example, it may be possible to image the ≤ 25 nm scale particles at the membrane-cytoskeleton interface that have so far only been observed in EM.(250)
The protein organization of chemical synapses is another example of how nanoscale spatial position is coupled to function. The presynaptic active zone (AZ) and postsynaptic density (PSD) of excitatory synapses comprise specialized assortments of proteins that control vesicle release, and receptor organization and signaling, respectively. Electron microscopy is routinely employed to study synapse architecture; in EM, the PSD appears as a clearly-defined electron-dense region with thickness ~30–60 nm, opposite the AZ (itself visible as an electron-dense section of membrane) and separated by the ~20–24 nm synaptic cleft.(252–254) Molecular specificity can be obtained in EM with antibodies conjugated to gold nanoparticles, and such “immunogold” micrographs have shown that key AZ and PSD proteins are concentrated within distinct layers away from the membrane.(255–257) However, immunogold labeling has low labeling density except in certain special situations, limiting throughput.(258)
The specificity and relatively simple sample preparation of super-resolution microscopy complement EM’s limitations, and a study by Dani, Huang et al. demonstrates how 3D multicolor SR can be used for (relatively) high-throughput measurements of synaptic protein organization within mouse brain tissue.(259) The random 3D orientations of synapses within tissue were defined by aligning the protein distributions along the trans-synaptic axis, and without 3D information, it would have been impossible to accurately define this axis. This was achieved with three-color measurements, where the protein of interest was localized relative to the AZ and PSD proteins Bassoon and Homer1 (Figure 17A–D) and placed in absolute coordinates based on EM measurements of Bassoon’s position and the width of the synaptic cleft(257, 260) (Figure 17E). Multicolor labeling was performed using secondary antibodies linked to photoswitchable dye pairs with variable activation wavelengths (Alexa 405, Cy2, and Cy3) and a single emitting species (Alexa 647, a dye with relatively high photon yield). While this labeling scheme produced some activation cross-talk, complicating analysis, it obviated the need for channel registration and ensured high total photon counts in each channel, providing localization precisions of 14 nm laterally and 35 nm axially with an astigmatic PSF. This good precision was obtained in spite of the difficulties of imaging in 10 μm thick cryosections of mouse brain tissue, including autofluorescence, scattering of signal, and fluorescence from labels out of the imaging range; the extra sources of background were reduced in part by using highly inclined illumination, probing only a ~2–3 μm axial section of the sample. The 10 proteins studied could thus be localized to a precision of 5 nm or less (SEM of many synapses), demonstrating different localization patterns (Figure 17E). For example, antibodies labeling membrane receptor subunits (GABABR1, NR2B, and GluR1) were observed approximately 10–15 nm from the membrane, roughly consistent with known structural data of AMPA receptors. Bassoon and Piccolo, giant (420 kD and 550 kD) scaffolding proteins of the AZ, were systematically oriented: labels on the N-termini of these two proteins appeared ~30 nm and ~40 nm farther from the membrane than C-terminal labels, respectively, a result later confirmed and extended via EM.(261) Extending such orientational measurements to smaller proteins is limited by localization precision and the size of the label, but as described in earlier sections, both of these areas are constantly being improved.
As evidenced by these studies of focal adhesions and synapses, super-resolution imaging is a powerful tool for understanding the structural organization of domains smaller than the wavelength of light within eukaryotic cells. Yet for prokaryotic organisms such as bacteria that have sizes on the order of the wavelength of light, the entire living cell is such a subdiffraction domain, and super-resolution is critical for quantitative microscopy. SR (and SPT) have been productively applied across bacterial phyla, both to probe universal processes such as DNA transcription and repair in vivo, and to resolve the nanoscale features of prokaryotic anatomy.(27, 262, 263) Here, we show a few examples where extending SR to 3D has been especially useful for studying structural features of bacteria.
As the radius of curvature of almost all bacteria is on the order of the diffraction limit, it is impossible to optically section bacteria. By contrast, 3D SR microscopy can clearly resolve the bacterial cell surface, as demonstrated in a study by Lee et al. that imaged the surface geometry of Caulobacter crescentus, a model organism for developmental biology that exhibits a dimorphic life cycle.(228) Rhodamine spirolactams are typically photoswitchable from a dark to emissive form with UV illumination. Adding highly conjugated substituents to the lactam nitrogen generated new derivatives capable of photoswitching with low-intensity visible light, avoiding the need for potentially phototoxic UV irradiation without significantly changing the photophysics of the emissive form. Adding an N-hydroxysuccinimidyl linker to the dye enabled labeling of surface amines on the proteinaceous surface layer of Caulobacter, while the dye’s positive charge inhibited passage into the cell. The high photon yield of this dye provided localization precisions σx,y,z = (15, 13, 17) nm for 3D super-resolution imaging using the double-helix PSF in live cells, and the NHS labeling method provided high and uniform sampling (~4300 localizations per cell), together giving high resolution reconstructions of the Caulobacter surface (Figure 18A). From these data, it was possible to precisely measure features such as the ~100 nm wide (FWHM) stalks present in mature cells, with results consistent with those from earlier electron microscopy, and to clearly resolve geometric details such as the pinching at the division plane in late predivisional cells.
Bacterial cell division is also difficult to study without 3D resolution, as the bacterial divisome is membrane-associated. By imaging several E. coli divisome components with 3D SR, Buss et al. were able to partially elucidate the radial organization of this multiprotein complex.(264) FtsZ, a central component, forms a ring at the division plane and is necessary to recruit other divisome proteins.(265, 266) Its localization is patchy under wild-type conditions, and becomes even more fragmented in the absence of ZapA or ZapB. To understand the organization of this complex, Buss et al. performed separate iPALM measurements for these three proteins and an inner membrane targeting sequence (MTS), each labeled with the photoswitchable protein mEos2 (Figure 18B), obtaining localization precisions σx,y,z = (21, 23, 17) nm. Under these conditions, FtsZ and ZapA appeared relatively punctate, while ZapB and the MTS were more uniformly distributed. With 3D measurements, it was possible to compare the axial distributions of proteins using the coverslip as a position reference. To remain within the 225 nm axial range of this iPALM implementation, only the bottommost part of each protein ring was analyzed. FtsZ and ZapA resided in roughly the same layer of the divisome (80±2 and 74±1 nm away from the coverslip, respectively; ± SEM), slightly offset from the inner membrane (67±1 nm) and distinctly separated from ZapB (117±2 nm). Multicolor imaging in 2D in the same study was used to show that patches of FtsZ and ZapA do not necessarily colocalize despite their similar radial positioning. Altogether, this demonstrated that the divisome is not a simple stoichiometric complex, supporting a model where each protein interacts with other components in a complex manner. Extending the 3D SR approach demonstrated here to multiple colors could add further understanding of divisome organization in future studies.
One example of how multicolor SR measurements can be used to reveal the spatial extent of protein-protein interactions comes from a study of chromosome segregation in Caulobacter. Ptacin et al. examined the localization patterns of PopZ, the “polar organizing protein,” relative to the ParA and ParB chromosome partitioning proteins.(267) ParB attaches to the chromosome, forming an extended complex, while ParA dimers form a distribution along the cell. As the ParB-chromosome complex moves from the old cell pole towards the new cell pole, the ParA dimers it contacts are disassembled, and this interaction creates the motive force for translocation. Ptacin et al. biochemically demonstrated that ParA and ParB molecules specifically bind to PopZ, which was known to form an extended space-filling network at both poles in predivisional Caulobacter cells. To probe the nature of this interaction in vivo, they performed live-cell, two-color 3D SR imaging of PAmCherry-labeled PopZ and of eYFP labeling either ParB or an obligate monomer form of ParA, the ParAG16V point mutant, which localized only to the poles (Figure 18C). Using a finely sampled registration function to accurately combine the two data channels, it was found that the ParB-centromere complex at the new pole was consistently displaced from the PopZ region toward the interior of the cell (53.1 ± 4 nm SEM, N = 57), while on average the ParAG16V distribution overlaid exactly with that of PopZ (0.0 ± 3 nm, N = 83). This indicated that the centromere was tethered to the edge of the domain filled by PopZ, presumably by ParB binding, and that ParA monomers were recruited within the PopZ region. The recruitment of ParA monomers to a region of high concentration may aid their recycling into dimers, enforcing the polarity of the ParA distribution. In this case, 3D imaging clearly demonstrated that ParA was concentrated within the PopZ network rather than e.g. recruited over its external surface.
5.2 Single-Particle-Tracking
Figure 19 illustrates the results of a recent study which employs PSF engineering to perform 3D SPT with high spatiotemporal resolution to observe dynamical motions present in the nucleus of budding yeast.(268) The goal of the study was to measure the relative motions of the two copies of the GAL locus in diploid cells under activating (galactose as the carbon source) and repressive (dextrose) conditions to quantify the change in correlation between the measured loci as a function of gene activation state. The two chromosomal DNA loci were labeled with two different colors using the LacO/LacI-GFP and the TetO/TetR-mCherry labeling strategies, and imaging was performed on nondividing G1 phase cells with the DH-PSF microscope in two color channels at a frame rate of 10 Hz. The many fluorescent protein copies at each locus provided high signal over track lengths of many seconds, with approximate values of localization precision, σx,y,z = (13, 12, 23) nm in the GFP channel and (27, 27, 44) nm in the dimmer mCherry channel. It is worth noting that a previous 3D tracking study of the GAL locus using confocal scanning microscopy (269), while useful, was limited to far lower time resolution (>5 s to acquire a full volume) with significantly poorer localization precision. Figure 19A shows one example of the 3D motion extracted from the DH-PSF data, where the variable of interest chosen is the time-dependent velocity of each locus along all three Cartesian axes. The velocities fluctuate due to thermal excitations, but in addition, this particular pair of loci shows correlations in the velocities, displayed as a velocity cross-correlation in the fourth panel. The findings suggest there is a time lag between the motions of the two loci, possibly arising due to viscoelastic behavior of the chromatin. The two trajectories are shown in space with time coloring of the trajectories in Figure 19B; their apparent motion on a curved surface suggests that these two loci are located near the nuclear periphery. This result was not universal, and as a result of the heterogeneity from cell to cell, correlations were not always observed between pairs of loci.
With quantitative 3D data such as these for many single cells, it was possible to explore the correlations between the two chromosomal loci in considerable detail, providing a clear example of how to quantitatively and simultaneously track two differently colored objects in the same cell.(268) Backlund et al. found that under repressive conditions the GAL loci exhibited significantly higher velocity cross-correlations on average than they did under activating conditions. This relative increase has potentially important biological implications, as it might suggest coupling via shared silencing factors or association with decoupled machinery upon activation. The researchers also found that on the time scale studied (~0.1–30 s), the loci moved subdiffusively, with the time scaling of the mean-squared displacement (MSD) lower than expected for a Brownian diffuser. They reported a higher subdiffusive exponent than previously reported from the less-precise confocal z-slice approach, which has implications for the application of polymer theory to chromatin motion in eukaryotes.
Ensuring the accuracy of these measurements of subdiffusive motion requires careful analysis of the MSD with attention to the effect of measurement errors,(270) including the effect of localization precision (a static error) and of motion blur caused by the motion of the tracked locus during the camera acquisition time (a dynamic error). The GAL locus two-color DH-PSF imaging study included a control that illustrates these errors: in a cell line with only the LacO/LacI-GFP label, removal of a filter in the two-color imaging apparatus allowed light from the same single locus to be imaged in both the green and red imaging channels, but with different signal levels. Figure 19C shows the MSD vs. time on a log-log plot for these two cases in each channel, as well as the “cross-MSD” calculated from the products of displacements in each channel (black). In the error-free case, the MSD scales as τ^α, and would appear linear on a log-log plot. Tellingly, the experimental cases diverge from this ideal. The red channel receives fewer photons, giving it a larger static error than the green channel and causing it to be sharply curved upward in a log-log plot. The cross-MSD averages out static error, but remains sensitive to dynamic error, causing it to curve slightly downward. Finally, the green channel is serendipitously relatively linear, as the errors cancel out. A simple fit assuming τ^α scaling would give completely different results for each case, despite measuring the same underlying motion. Clearly, a more sophisticated approach is needed. For the special case of Brownian motion (α = 1) and continuous illumination, MSD data can be fit to extract a single diffusion coefficient D and correctly account for localization error σ and finite frame acquisition time tE with the expression(271)
MSD(τ) = 2D(τ − tE/3) + 2σ²    (per spatial dimension)
An analogous expression was derived by Savin et al. for the case of fractional Brownian motion (271), and Backlund et al. derived appropriate expressions for the velocity cross-correlations.(270) When the correct expression for the MSD is used to extract estimates of the subdiffusive exponent α on different time scales during the trajectory, a single value of α that is independent of time and of localization errors is recovered, as shown in Figure 19D. This example highlights how analysis of diffusive motion should always take into account static and dynamic errors, using a model appropriate to the type of diffusion present.
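As a minimal illustration of fitting the Brownian-motion expression above, the sketch below computes a time-averaged MSD from a simulated one-dimensional trajectory that includes both motion blur and static localization error, then fits D and σ with the exposure time held fixed; all numerical values are placeholders.

```python
import numpy as np
from scipy.optimize import curve_fit

def time_averaged_msd(x, max_lag):
    """Time-averaged 1D MSD of one trajectory for lags 1..max_lag (in frames)."""
    return np.array([np.mean((x[lag:] - x[:-lag]) ** 2) for lag in range(1, max_lag + 1)])

def msd_model(tau, D, sigma, t_exposure):
    """Brownian MSD per dimension with static (sigma) and dynamic (t_exposure) errors:
    MSD(tau) = 2 D (tau - t_exposure/3) + 2 sigma^2."""
    return 2.0 * D * (tau - t_exposure / 3.0) + 2.0 * sigma ** 2

# Placeholder data: a blurred, noisy Brownian trajectory sampled every 0.1 s
dt, n_frames, substeps = 0.1, 2000, 10         # s, frames, sub-steps per exposure
D_true, sigma_true = 0.01, 0.02                # um^2/s, um
rng = np.random.default_rng(3)
fine = np.cumsum(rng.normal(0.0, np.sqrt(2 * D_true * dt / substeps), n_frames * substeps))
x = fine.reshape(n_frames, substeps).mean(axis=1)   # motion blur over the full exposure
x = x + rng.normal(0.0, sigma_true, n_frames)       # static localization error

lags = np.arange(1, 21) * dt
msd = time_averaged_msd(x, 20)
popt, _ = curve_fit(lambda tau, D, s: msd_model(tau, D, s, t_exposure=dt),
                    lags, msd, p0=[0.005, 0.01])
print("D =", popt[0], "um^2/s;  sigma =", abs(popt[1]), "um")
```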
From these studies of chromatin dynamics, we can see that correlative tracking of two mobile objects is a useful approach. Correlating single-particle trajectories with super-resolved structures provides a similar advantage. Bálint and coworkers used such a two-color SPT/SR measurement to study the transport of lysosomes along microtubules (MTs) and at MT intersections.(272) Kinesin and dynein motors act to generate directed motion of cargos on MTs. Previous in vitro SPT studies showed that crossed MTs can result in cargo dissociation, switching, or simply passing at the point of intersection, while in vivo measurements demonstrated track switching, but did not super-resolve the MT network. The focus of this study was to measure the functional impact of MT intersection geometry in vivo using SPT of stained lysosomes in live cells followed by in situ fixation, immunolabeling, and SR of MTs using Alexa Fluor 405/647 dye pairs. Because of the fast (~seconds) timescale of MT network rearrangement in vivo and the relatively slow (< 1 μm/s) speed of motor transport, it was necessary to stabilize the MT network so that it did not move between the beginning of the tracking measurement and the onset of cell fixation. This could be achieved by maintaining reduced temperatures and treatment with drugs inhibiting turnover of actin polymers, with only slight effects on observed behavior.
Given the large size of lysosomes (~600 nm diameter), it is inappropriate to treat them as point emitters, so they were localized and tracked in 2D using a centroid estimator. After tracking and fixation, MTs were imaged in 3D using astigmatic imaging, achieving an axial resolution of ~100 nm. Sample drift, differential aberrations between channels, and distortion from the fixation process were accounted for using a 2D global transformation between fiducial beads imaged during the SPT and SR experiments, which was sufficiently accurate given the desired resolution. Localization in 3D was necessary to quantify the spacing between MTs at intersections: most MTs are oriented roughly in the xy plane, so crossed MTs are primarily separated in z. This structural information was then correlated with observed lysosome pausing, switching, and reversal events. Figure 20 shows overlaid SR reconstructions with a single lysosome track, highlighting pausing events. Lysosomes were seen to pause only for MT separations less than 200 nm, demonstrating that close proximity is necessary to create a substantial obstruction. However, passing or pause-then-pass events were still observed for small separations, implying that lysosomes are able to rotate, squeeze, or otherwise move past closely spaced MTs at intersections. Future studies extending this approach to 3D tracking of lysosomes or other cargo in the context of 3D SR measurements remain an exciting possibility for discriminating between the possible mechanisms of cargo passage and switching at intersections.
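As a rough sketch of such a registration step (not the code used in the study), a global 2D affine transformation mapping one channel's fiducial bead positions onto the other's can be estimated by linear least squares; all coordinates and the distortion below are hypothetical:

```python
import numpy as np

def fit_affine_2d(src, dst):
    """Least-squares fit of dst ~ A @ src + b from matched (N, 2) point sets;
    returns the 2x2 matrix A and the offset vector b."""
    design = np.hstack([src, np.ones((len(src), 1))])      # (N, 3)
    params, *_ = np.linalg.lstsq(design, dst, rcond=None)  # (3, 2)
    return params[:2].T, params[2]

def apply_affine_2d(A, b, pts):
    return pts @ A.T + b

# Hypothetical fiducial beads observed during tracking (src) and after
# fixation/immunolabeling (dst), related by a slight distortion plus noise.
rng = np.random.default_rng(2)
src = rng.uniform(0.0, 40.0, size=(12, 2))                  # um
A_true = np.array([[1.001, 0.002], [-0.002, 0.999]])
dst = src @ A_true.T + np.array([0.15, -0.08]) + rng.normal(0.0, 0.01, src.shape)

A, b = fit_affine_2d(src, dst)
residuals = np.linalg.norm(apply_affine_2d(A, b, src) - dst, axis=1)
print(f"median fiducial registration error: {np.median(residuals) * 1e3:.1f} nm")
```

The residuals at the fiducials provide a direct check that the registration accuracy is sufficient for the resolution sought.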
While most of the applications discussed in this review are biologically oriented, there are many additional uses for 3D localization. One example is flow profiling in microfluidic channels. Specifically, 3D micro-particle image velocimetry (μPIV)(273–275) is a technique in which emitters flowing through a microfluidic channel are localized over time, and the flow profile is then obtained by analyzing their trajectories. Figure 21 shows such an experiment, in which the long axial range afforded by PSF engineering enables fast, precise, and scan-free 3D tracking. Fluorescent beads 200 nm in diameter are imaged using a 20 μm z-range Tetrapod PSF as they flow through a ~20 μm deep microfluidic channel and are thereby localized in 3D.(131) The laminar flow profile in the channel can then be obtained by analyzing the trajectories of many beads.
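One simple way to recover such a profile, sketched below under the assumption that flow is along x and varies mainly with depth z, is to pool frame-to-frame velocities from many bead trajectories, bin them by z, and fit a parabola, as expected for pressure-driven laminar flow between the channel walls (the function and variable names are illustrative):

```python
import numpy as np

def flow_profile(tracks, dt, n_bins=20):
    """Estimate v_x(z) from a list of 3D trajectories, each an (T, 3) array of
    (x, y, z) positions in um sampled every dt seconds."""
    z_all, vx_all = [], []
    for tr in tracks:
        vx_all.append(np.diff(tr[:, 0]) / dt)         # frame-to-frame x velocity
        z_all.append(0.5 * (tr[1:, 2] + tr[:-1, 2]))  # midpoint depth of each step
    z_all, vx_all = np.concatenate(z_all), np.concatenate(vx_all)

    edges = np.linspace(z_all.min(), z_all.max(), n_bins + 1)
    centers, v_mean = [], []
    for lo, hi in zip(edges[:-1], edges[1:]):
        sel = (z_all >= lo) & (z_all < hi)
        if sel.any():
            centers.append(0.5 * (lo + hi))
            v_mean.append(vx_all[sel].mean())
    centers, v_mean = np.array(centers), np.array(v_mean)

    # Pressure-driven laminar flow between parallel walls is parabolic in z,
    # so a quadratic fit summarizes the profile and its maximum.
    coeffs = np.polyfit(centers, v_mean, 2)
    return centers, v_mean, coeffs
```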
6. Conclusions
With the emergence of methods capable of acquiring imaging data with resolution far below the diffraction limit and of obtaining trajectories of single particles with sub-second sampling and nanoscale precision, a new and highly quantitative view of the nanoscale structures and dynamic behavior within biological cells and other complex systems is now becoming available. The extension of super-resolution imaging and single-particle tracking to 3D reveals the true geometry of these systems, and in many ways the limitations of optical fluorescence microscopy in terms of resolution, dimensionality, and imaging speed are being removed. It is tempting and understandable for the experimenter to rush out and buy a complex machine to enjoy this new regime of nanoscale science. However, as illustrated multiple times in this review, exciting new levels of quantitation must be accompanied by intense attention to both the precision and the accuracy of the imaging methods that enable that quantitation. In addition, the various methods for 3D imaging come with a range of trade-offs, advantages, and disadvantages. Given the intricacies of single-molecule experiments, data analysis, and interpretation, especially when extended into three dimensions, we cannot overemphasize the need for scrutiny when validating new results. Moreover, as spatial resolution improves beyond the diffraction limit, it becomes important to test for artifacts on ever-smaller scales and to perform comprehensive controls of the labels, imaging method, and analysis algorithms used.
One can expect newer imaging modalities to appear, adding to the variables already being extracted from precise and accurate measurements on single molecules. The molecular orientation of labels can be used to infer information about local environments,(202, 276, 277) and such measurements can be combined with 3D position measurements to powerful effect. Additional well-known local single-molecule variables such as lifetime, energy transfer, and spectra can certainly be imaged in 3D. Fluorescent probes continue to improve in photon yield while shrinking in size, and the emergence of other imaging approaches based on scattering may alleviate the need for fluorescent labeling altogether.(278) Without question, the future of 3D imaging on the nanoscale is bright.
Acknowledgments
We gratefully acknowledge the collaborations with many members of the Moerner Laboratory over the years, including Michael Thompson, Matthew Lew, Mikael Backlund, Steffen Sahl, Andreas Gahlmann, Marissa Lee, Adam Backer, Lucien Weiss, Maurice Lee, Petar Petrov, and Anna-Karin Gustavsson. This work was supported in part by the National Institute of General Medical Sciences Grant No. R35-GM118067.
Biographies
Alex von Diezmann received his B.A. in Chemistry from Reed College in 2011. His Ph.D. research in the Moerner lab has focused on developing 3D single-molecule imaging techniques to measure the spatial organization and dynamics of proteins within biological structures in vivo. He is fascinated by how quantitative single-molecule experiments can bridge molecular and cellular size scales in our understanding of cellular anatomy and signaling. His current work examines how subcellularly localized protein microdomains generate polarity in the model organism Caulobacter crescentus.
Yoav Shechtman received his B.Sc. degree in electrical engineering and physics in 2007 and his Ph.D. degree in physics in 2013, both from the Technion–Israel Institute of Technology, Haifa, Israel. In 2013 Yoav became a Postdoctoral Fellow in the Department of Chemistry at Stanford University, and in 2016 he joined the Department of Biomedical Engineering at the Technion as an assistant professor. Yoav is interested in applying signal processing methods to practical biological/physical optical imaging challenges.
W. E. (William Esco) Moerner, the Harry S. Mosher Professor of Chemistry and Professor, by courtesy, of Applied Physics, at Stanford University, has conducted research in the areas of physical chemistry and biophysics of single molecules, development of 2D and 3D super-resolution imaging and tracking for cell biology, nanophotonics, photorefractive polymers, and trapping of single biomolecules in solution.
References
- 1.Lakowicz JR. Principles of Fluorescence Spectroscopy; Springer Science; New York: 2006. [Google Scholar]
- 2.O’Hare HM, Johnsson K, Gautier A. Chemical probes shed light on protein function. Curr Opin Struct Biol. 2007;17:488–494. doi: 10.1016/j.sbi.2007.07.005. [DOI] [PubMed] [Google Scholar]
- 3.Shaner NC, Steinbach PA, Tsien RY. A guide to choosing fluorescent proteins. Nat Methods. 2005;2:905–909. doi: 10.1038/nmeth819. [DOI] [PubMed] [Google Scholar]
- 4.Giepmans BNG, Adams SR, Ellisman MH, Tsien RY. The fluorescent toolbox for assessing protein location and function. Science. 2006;312:217–224. doi: 10.1126/science.1124618. [DOI] [PubMed] [Google Scholar]
- 5.Friedrich J, Haarer D. Structural Relaxation Processes in Polymers and Glasses as Studied by High Resolution Optical Spectroscopy. In: Zschokke I, editor. Optical Spectroscopy of Glasses. Reidel; Dordrecht: 1986. pp. 149–198. [Google Scholar]
- 6.Zurner A, Kirstein J, Doblinger M, Brauchle C, Bein T. Visualizing single-molecule diffusion in mesoporous materials. Nature. 2007;450:705–708. doi: 10.1038/nature06398. [DOI] [PubMed] [Google Scholar]
- 7.Scholes GD, Fleming GR, Olaya-Castro A, van Grondelle R. Lessons from nature about solar light harvesting. Nat Chem. 2011;3:763–774. doi: 10.1038/nchem.1145. [DOI] [PubMed] [Google Scholar]
- 8.Moerner WE, Kador L. Optical detection and spectroscopy of single molecules in a solid. Phys Rev Lett. 1989;62:2535–2538. doi: 10.1103/PhysRevLett.62.2535. [DOI] [PubMed] [Google Scholar]
- 9.Orrit M, Bernard J. Single pentacene molecules detected by fluorescence excitation in a p-terphenyl crystal. Phys Rev Lett. 1990;65:2716–2719. doi: 10.1103/PhysRevLett.65.2716. [DOI] [PubMed] [Google Scholar]
- 10.Shera EB, Seitzinger NK, Davis LM, Keller RA, Soper SA. Detection of Single Fluorescent Molecules. Chem Phys Lett. 1990;174:553–557. [Google Scholar]
- 11.Moerner WE, Basché T. Optical Spectroscopy of Single Impurity Molecules in Solids. Angew Chem Int Ed. 1993;32:457–476. [Google Scholar]
- 12.Moerner WE. Examining Nanoenvironments in Solids on the Scale of a Single, Isolated Molecule. Science. 1994;265:46–53. doi: 10.1126/science.265.5168.46. [DOI] [PubMed] [Google Scholar]
- 13.Orrit M, Bernard J, Brown R, Lounis B. Optical Spectroscopy of Single Molecules in Solids. Prog Optics. 1996;35:61–144. [Google Scholar]
- 14.Plakhotnik T, Donley EA, Wild UP. Single-Molecule Spectroscopy. Annu Rev Phys Chem. 1997;48:181–212. doi: 10.1146/annurev.physchem.48.1.181. [DOI] [PubMed] [Google Scholar]
- 15.Nie S, Zare RN. Optical detection of single molecules. Annu Rev Biophys Biomol Struct. 1997;26:567–596. doi: 10.1146/annurev.biophys.26.1.567. [DOI] [PubMed] [Google Scholar]
- 16.Moerner WE, Orrit M. Illuminating Single Molecules in Condensed Matter. Science. 1999;283:1670–1676. doi: 10.1126/science.283.5408.1670. [DOI] [PubMed] [Google Scholar]
- 17.Moerner WE. A Dozen Years of Single-Molecule Spectroscopy in Physics, Chemistry, and Biophysics. J Phys Chem B. 2002;106:910–927. [Google Scholar]
- 18.Moerner WE, Fromm DP. Methods of Single-Molecule Fluorescence Spectroscopy and Microscopy. Rev Sci Instrum. 2003;74:3597–3619. [Google Scholar]
- 19.Tinnefeld P, Sauer M. Branching Out of Single-Molecule Fluorescence Spectroscopy: Challenges for Chemistry and Influence on Biology. Angew Chem Int Ed. 2005;44:2642–2671. doi: 10.1002/anie.200300647. [DOI] [PubMed] [Google Scholar]
- 20.Cornish PV, Ha T. A Survey of Single-Molecule Techniques in Chemical Biology. ACS Chem Biol. 2007;2:53–61. doi: 10.1021/cb600342a. [DOI] [PubMed] [Google Scholar]
- 21.Moerner WE, Schuck PJ, Fromm DP, Kinkhabwala A, Lord SJ, Nishimura SY, Willets KA, Sundaramurthy A, Kino GS, He M, Lu Z, Twieg RJ. Nanophotonics and Single Molecules. In: Rigler R, Vogel H, editors. Single Molecules and Nanotechnology. Vol. 12. Springer-Verlag; Berlin: 2008. pp. 1–23. [Google Scholar]
- 22.Basché T, Moerner WE, Orrit M, Wild UP, editors. Single Molecule Optical Detection, Imaging, and Spectroscopy. Verlag-Chemie; Munich: 1997. [Google Scholar]
- 23.Zander C, Enderlein J, Keller RA, editors. Single-Molecule Detection in Solution: Methods and Applications; Wiley-VCH; Berlin: 2002. [Google Scholar]
- 24.Gell C, Brockwell DJ, Smith A. Handbook of Single Molecule Fluorescence Spectroscopy. Oxford Univ. Press; Oxford: 2006. [Google Scholar]
- 25.Selvin PR, Ha T, editors. Single-Molecule Techniques: A Laboratory Manual. Cold Spring Harbor Laboratory Press; Cold Spring Harbor, NY: 2008. [Google Scholar]
- 26.Abbe E. Contributions to the theory of the microscope and microscopic detection (Translated from German) Arch Mikroskop Anat. 1873;9:413–468. [Google Scholar]
- 27.Gahlmann A, Moerner WE. Exploring bacterial cell biology with single-molecule tracking and super-resolution imaging. Nat Rev Micro. 2014;12:9–22. doi: 10.1038/nrmicro3154. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 28.Braeuchle C, Lamb DC, Michaelis J, editors. Single particle tracking and single molecule energy transfer; Wiley-VCH; Weinheim: 2010. [Google Scholar]
- 29.Barak LS, Webb WW. Diffusion of Low Density Lipoprotein-Receptor Complex on Human Fibroblasts. J Cell Biol. 1982;95:846–852. doi: 10.1083/jcb.95.3.846. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 30.Gelles J, Schnapp BJ, Sheetz MP. Tracking kinesin-driven movements with nanometre-scale precision. Nature. 1988;331:450–453. doi: 10.1038/331450a0. [DOI] [PubMed] [Google Scholar]
- 31.Schmidt T, Schutz GJ, Baumgartner W, Gruber HJ, Schindler H. Imaging of Single Molecule Diffusion. Proc Natl Acad Sci U S A. 1996;93:2926–2929. doi: 10.1073/pnas.93.7.2926. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 32.Saxton MJ, Jacobson K. Single-particle tracking: applications to membrane dynamics. Annu Rev Biophys Biomol Struct. 1997;26:373–399. doi: 10.1146/annurev.biophys.26.1.373. [DOI] [PubMed] [Google Scholar]
- 33.Kusumi A, Tsunoyama TA, Hirosawa KM, Kasai RS, Fujiwara TK. Tracking single molecules at work in living cells. Nat Chem Biol. 2014;10:524–532. doi: 10.1038/nchembio.1558. [DOI] [PubMed] [Google Scholar]
- 34.Mathai PP, Liddle JA, Stavis SM. Optical tracking of nanoscale particles in microscale environments. Applied Physics Reviews. 2016;3:011105. doi: 10.1063/1.4941675. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 35.Deich J, Judd EM, McAdams HH, Moerner WE. Visualization of the movement of single histidine kinase molecules in live Caulobacter cells. Proc Natl Acad Sci U S A. 2004;101:15921–15926. doi: 10.1073/pnas.0404200101. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 36.Niu L, Yu P. Investigating Intracellular Dynamics of FtsZ Cytoskeleton with Photoactivation Single-Molecule Tracking. Biophys J. 2008;95:2009–2016. doi: 10.1529/biophysj.108.128751. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 37.Manley S, Gillette JM, Patterson GH, Shroff H, Hess HF, Betzig E, Lippincott-Schwartz J. High-density mapping of single-molecule trajectories with photoactivated localization microscopy. Nat Methods. 2008;5:155–157. doi: 10.1038/nmeth.1176. [DOI] [PubMed] [Google Scholar]
- 38.Vrljic M, Nishimura SY, Moerner WE. Single-molecule tracking. In: McIntosh TJ, editor. Lipid Rafts. Humana Press; New Jersey: 2009. pp. 193–219. [Google Scholar]
- 39.Moerner WE. Single-Molecule Spectroscopy, Imaging, and Photocontrol: Foundations for Super-Resolution Microscopy (Nobel Lecture) Angewandte Chemie International Edition. 2015;54:8067–8093. doi: 10.1002/anie.201501949. [DOI] [PubMed] [Google Scholar]
- 40.Betzig E. Single Molecules, Cells, and Super-Resolution Optics (Nobel Lecture) Angewandte Chemie International Edition. 2015;54:8034–8053. doi: 10.1002/anie.201501003. [DOI] [PubMed] [Google Scholar]
- 41.Hell SW. Nanoscopy with Focused Light (Nobel Lecture) Angewandte Chemie International Edition. 2015;54:8054–8066. doi: 10.1002/anie.201504181. [DOI] [PubMed] [Google Scholar]
- 42.Betzig E, Patterson GH, Sougrat R, Lindwasser OW, Olenych S, Bonifacino JS, Davidson MW, Lippincott-Schwartz J, Hess HF. Imaging intracellular fluorescent proteins at nanometer resolution. Science. 2006;313:1642–1645. doi: 10.1126/science.1127344. [DOI] [PubMed] [Google Scholar]
- 43.Hess ST, Girirajan TPK, Mason MD. Ultra-high resolution imaging by fluorescence photoactivation localization microscopy. Biophys J. 2006;91:4258–4272. doi: 10.1529/biophysj.106.091116. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 44.Rust MJ, Bates M, Zhuang X. Sub-diffraction-limit imaging by stochastic optical reconstruction microscopy (STORM) Nat Methods. 2006;3:793–796. doi: 10.1038/nmeth929. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 45.Sharonov A, Hochstrasser RM. Wide-field subdiffraction imaging by accumulated binding of diffusing probes. Proc Natl Acad Sci U S A. 2006;103:18911–18916. doi: 10.1073/pnas.0609643104. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 46.Mei E, Gao F, Hochstrasser RM. Controlled biomolecular collisions allow sub-diffraction limited microscopy of lipid vesicles. Phys Chem Chem Phys. 2006;8:2077–2082. doi: 10.1039/b601670g. [DOI] [PubMed] [Google Scholar]
- 47.Heilemann M, van de Linde S, Schüttpelz M, Kasper R, Seefeldt B, Mukherjee A, Tinnefeld P, Sauer M. Subdiffraction-Resolution Fluorescence Imaging with Conventional Fluorescent Probes. Angew Chem Int Ed. 2008;47:6172–6176. doi: 10.1002/anie.200802376. [DOI] [PubMed] [Google Scholar]
- 48.Testa I, Wurm CA, Medda R, Rothermel E, von Middendorf C, Foelling J, Jakobs S, Schoenle A, Hell SW, Eggeling C. Multicolor Fluorescence Nanoscopy in Fixed and Living Cells by Exciting Conventional Fluorophores with a Single Wavelength. Biophys J. 2010;99:2686–2694. doi: 10.1016/j.bpj.2010.08.012. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 49.Cordes T, Strackharn M, Stahl SW, Summerer W, Steinhauer C, Forthmann C, Puchner EM, Vogelsang J, Gaub HE, Tinnefeld P. Resolving Single-Molecule Assembled Patterns with Superresolution Blink-Microscopy. Nano Lett. 2010;10:645–651. doi: 10.1021/nl903730r. [DOI] [PubMed] [Google Scholar]
- 50.Lemmer P, Gunkel M, Baddeley D, Kaufmann R, Urich A, Weiland Y, Reymann J, Mueller P, Hausmann M, Cremer C. SPDM: light microscopy with single-molecule resolution at the nanoscale. Appl Phys B. 2008;93:1–12. [Google Scholar]
- 51.Lee HD, Lord SJ, Iwanaga S, Zhan K, Xie H, Williams JC, Wang H, Bowman GR, Goley ED, Shapiro L, et al. Superresolution Imaging of Targeted Proteins in Fixed and Living Cells Using Photoactivatable Organic Fluorophores. J Am Chem Soc. 2010;132:15099–15101. doi: 10.1021/ja1044192. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 52.Vaughan JC, Jia S, Zhuang X. Ultrabright photoactivatable fluorophores created by reductive caging. Nat Methods. 2012;9:1181–1184. doi: 10.1038/nmeth.2214. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 53.Dickson RM, Cubitt AB, Tsien RY, Moerner WE. On/Off Blinking and Switching Behavior of Single Molecules of Green Fluorescent Protein. Nature. 1997;388:355–358. doi: 10.1038/41048. [DOI] [PubMed] [Google Scholar]
- 54.Biteen JS, Thompson MA, Tselentis NK, Bowman GR, Shapiro L, Moerner WE. Super-resolution imaging in live Caulobacter crescentus cells using photoswitchable EYFP. Nat Methods. 2008;5:947–949. doi: 10.1038/NMETH.1258. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 55.Lee MK, Williams J, Twieg RJ, Rao J, Moerner WE. Enzymatic activation of nitro-aryl fluorogens in live bacterial cells for enzymatic turnover-activated localization microscopy. Chem Sci. 2013;4:220–225. doi: 10.1039/C2SC21074F. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 56.Huang B, Babcock H, Zhuang X. Breaking the diffraction barrier: super-resolution imaging of cells. Cell. 2010;143:1047–1058. doi: 10.1016/j.cell.2010.12.002. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 57.Thompson MA, Lew MD, Moerner WE. Extending Microscopic Resolution with Single-Molecule Imaging and Active Control. Annu Rev Biophys. 2012;41:321–342. doi: 10.1146/annurev-biophys-050511-102250. [DOI] [PubMed] [Google Scholar]
- 58.Thompson MA, Biteen JS, Lord SJ, Conley NR, Moerner WE. Molecules and Methods for Super-Resolution Imaging. Meth Enzymol. 2010;475:27–59. doi: 10.1016/S0076-6879(10)75002-3. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 59.Biteen JS, Moerner WE. Single-Molecule and Superresolution Imaging in Live Bacteria Cells. Cold Spring Harb Perspect Biol. 2010;2:a000448. doi: 10.1101/cshperspect.a000448. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 60.Lew MD, Lee SF, Thompson MA, Lee HD, Moerner WE. Single-Molecule Photocontrol and Nanoscopy. In: Tinnefeld P, Eggeling C, Hell SW, editors. Far-Field Optical Nanoscopy. Springer; Berlin Heidelberg: 2012. pp. 1–24. [Google Scholar]
- 61.Moerner WE. Microscopy beyond the diffraction limit using actively controlled single molecules. J Microsc. 2012;246:213–220. doi: 10.1111/j.1365-2818.2012.03600.x. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 62.Gahlmann A, Ptacin JL, Grover G, Quirin S, von Diezmann ARS, Lee MK, Backlund MP, Shapiro L, Piestun R, Moerner WE. Quantitative multicolor subdiffraction imaging of bacterial protein ultrastructures in 3D. Nano Lett. 2013;13:987–993. doi: 10.1021/nl304071h. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 63.Sahl SJ, Moerner WE. Super-resolution fluorescence imaging with single molecules. Curr Opin Struct Biol. 2013;23:778–787. doi: 10.1016/j.sbi.2013.07.010. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 64.Godin A, Lounis B, Cognet L. Super-resolution Microscopy Approaches for Live Cell Imaging. Biophys J. 2014;107:1777–1784. doi: 10.1016/j.bpj.2014.08.028. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 65.Fischer RS, Wu Y, Kanchanawong P, Shroff H, Waterman CM. Microscopy in 3D: a biologist’s toolbox. Trends in Cell Biology. 2011;21:682–691. doi: 10.1016/j.tcb.2011.09.008. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 66.Deschout H, Zanacchi FC, Mlodzianoski M, Diaspro A, Bewersdorf J, Hess ST, Braeckmans K. Precisely and accurately localizing single emitters in fluorescence microscopy. Nat Meth. 2014;11:253–266. doi: 10.1038/nmeth.2843. [DOI] [PubMed] [Google Scholar]
- 67.Small A, Stahlheber S. Fluorophore localization algorithms for super-resolution microscopy. Nat Methods. 2014;11:267–279. doi: 10.1038/nmeth.2844. [DOI] [PubMed] [Google Scholar]
- 68.Gibson SF, Lanni F. Experimental test of an analytical model of aberration in an oil-immersion objective lens used in three-dimensional light microscopy. J Opt Soc Am A. 1991;8:1601–1613. doi: 10.1364/josaa.9.000154. [DOI] [PubMed] [Google Scholar]
- 69.Kirshner H, Aguet F, Sage D, Unser M. 3-D PSF fitting for fluorescence microscopy: implementation and localization application. J Microsc. 2013;249:13–25. doi: 10.1111/j.1365-2818.2012.03675.x. [DOI] [PubMed] [Google Scholar]
- 70.Ambrose WP, Basché T, Moerner WE. Detection and Spectroscopy of Single Pentacene Molecules in a p-Terphenyl Crystal by Means of Fluorescence Excitation. J Chem Phys. 1991;95:7150–7163. [Google Scholar]
- 71.Axelrod D. Total Internal Reflection Fluorescence Microscopy. In: Taylor D, Wang Y, editors. Methods in cell biology: Fluorescence microscopy of living cells in culture. Vol. 30. Academic Press, Inc; San Diego: 1989. pp. 245–270. [DOI] [PubMed] [Google Scholar]
- 72.Funatsu T, Harada Y, Tokunaga M, Saito K, Yanagida T. Imaging of single fluorescent molecules and individual ATP turnovers by single myosin molecules in aqueous solution. Nature. 1995;374:555–559. doi: 10.1038/374555a0. [DOI] [PubMed] [Google Scholar]
- 73.Paige MF, Bjerneld EJ, Moerner WE. A comparison of through-the-objective total internal reflection microscopy and epi-fluorescence microscopy for single-molecule fluorescence imaging. Single Molecules. 2001;2:191–201. [Google Scholar]
- 74.Dickson RM, Norris DJ, Tzeng YL, Moerner WE. Three-dimensional imaging of single molecules solvated in pores of poly(acrylamide) gels. Science. 1996;274:966–969. doi: 10.1126/science.274.5289.966. [DOI] [PubMed] [Google Scholar]
- 75.Shtengel G, Galbraith JA, Galbraith CG, Lippincott-Schwartz J, Gillette JM, Manley S, Sougrat R, Waterman CM, Kanchanawong P, Davidson MW, et al. Interferometric fluorescent super-resolution microscopy resolves 3D cellular ultrastructure. Proc Natl Acad Sci U S A. 2009;106:3125–3130. doi: 10.1073/pnas.0813131106. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 76.Deschamps J, Mund M, Ries J. 3D superresolution microscopy by supercritical angle detection. Opt Express. 2014;22:29081–29091. doi: 10.1364/OE.22.029081. [DOI] [PubMed] [Google Scholar]
- 77.Bourg N, Mayet C, Dupuis G, Barroca T, Bon P, Lécart S, Fort E, Lévêque-Fort S. Direct optical nanoscopy with axially localized detection. Nat Photon. 2015;9:587–593. [Google Scholar]
- 78.Chao J, Sripad R, Ward ES, Ober RJ. Ultrahigh accuracy imaging modality for super-localization microscopy. Nat Methods. 2013;10:335–338. doi: 10.1038/nmeth.2396. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 79.Parthasarathy R. Rapid, accurate particle tracking by calculation of radial symmetry centers. Nat Meth. 2012;9:724–726. doi: 10.1038/nmeth.2071. [DOI] [PubMed] [Google Scholar]
- 80.Anderson CM, Georgiou GN, Morrison IE, Stevenson GV, Cherry RJ. Tracking of cell surface receptors by fluorescence digital imaging microscopy using a charge-coupled device camera. Low-density lipoprotein and influenza virus receptor mobility at 4 degrees C. J Cell Sci. 1992;101:415–425. doi: 10.1242/jcs.101.2.415. [DOI] [PubMed] [Google Scholar]
- 81.Cheezum MK, Walker WF, Guilford WH. Quantitative Comparison of Algorithms for Tracking Single Fluorescent Particles. Biophys J. 2001;81:2378–2388. doi: 10.1016/S0006-3495(01)75884-5. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 82.Abraham AV, Ram S, Chao J, Ward ES, Ober RJ. Quantitative study of single molecule location estimation techniques. Opt Express. 2009;17:23352–23373. doi: 10.1364/OE.17.023352. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 83.Rieger B, Stallinga S. The Lateral and Axial Localization Uncertainty in Super-Resolution Light Microscopy. ChemPhysChem. 2014;15:664–670. doi: 10.1002/cphc.201300711. [DOI] [PubMed] [Google Scholar]
- 84.Richards B, Wolf E. Electromagnetic Diffraction in optical systems. II. Structure of the image field in an aplanatic system. Proc R Soc London Ser A. 1959;253:358–379. [Google Scholar]
- 85.Mortensen KI, Churchman LS, Spudich JA, Flyvbjerg H. Optimized localization analysis for single-molecule tracking and super-resolution microscopy. Nat Methods. 2010;7:377–381. doi: 10.1038/nmeth.1447. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 86.Backer AS, Moerner WE. Extending Single-Molecule Microscopy Using Optical Fourier Processing. J Phys Chem B. 2014;118:8313–8329. doi: 10.1021/jp501778z. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 87.Zhang B, Zerubia J, Olivo-Marin J. Gaussian approximations of fluorescence microscope point-spread function models. Appl Opt. 2007;46:1819–1829. doi: 10.1364/ao.46.001819. [DOI] [PubMed] [Google Scholar]
- 88.Thompson RE, Larson DR, Webb WW. Precise Nanometer Localization Analysis for Individual Fluorescent Probes. Biophys J. 2002;82:2775–2783. doi: 10.1016/S0006-3495(02)75618-X. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 89.Enderlein J. Maximum-Likelihood Criterion and Single-Molecule Detection. Appl Opt. 1995;34:514–526. doi: 10.1364/AO.34.000514. [DOI] [PubMed] [Google Scholar]
- 90.Pawitan Y. In All Likelihood: Statistical Modeling and Inference Using Likelihood. Clarendon Press; Oxford: 2001. [Google Scholar]
- 91.Ober RJ, Ram S, Ward ES. Localization accuracy in single-molecule microscopy. Biophys J. 2004;86:1185–1200. doi: 10.1016/S0006-3495(04)74193-4. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 92.Ram S, Sally Ward E, Ober R. A Stochastic Analysis of Performance Limits for Optical Microscopes. Multidim Syst Sig Process. 2006;17:27–57. [Google Scholar]
- 93.Chao J, Ward ES, Ober RJ. Fisher information theory for parameter estimation in single molecule microscopy: tutorial. JOSA A. 2016;33:B36–B57. doi: 10.1364/JOSAA.33.000B36. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 94.Chao J, Ward ES, Ober RJ. Fisher information matrix for branching processes with application to electron-multiplying charge-coupled devices. Multidim Syst Sig Process. 2012;23:349–379. doi: 10.1007/s11045-011-0150-7. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 95.Huang F, Hartwich TMP, Rivera-Molina FE, Lin Y, Duim WC, Long JJ, Uchil PD, Myers JR, Baird MA, Mothes W, et al. Video-rate nanoscopy using sCMOS camera-specific single-molecule localization algorithms. Nat Methods. 2013;10:653–658. doi: 10.1038/nmeth.2488. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 96.Kay SM. Fundamentals of Statistical Signal Processing: Estimation Theory. Prentice-Hall PTR; Englewood Cliffs, NJ: 1993. [Google Scholar]
- 97.Smith CS, Joseph N, Rieger B, Lidke KA. Fast, single-molecule localization that achieves theoretically minimum uncertainty. Nat Methods. 2010;7:373–375. doi: 10.1038/nmeth.1449. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 98.Speidel M, Jonas A, Florin E. Three-dimensional tracking of fluorescent nanoparticels with subnanometer precision by use of off-focus imaging. Opt Lett. 2003;28:69–71. doi: 10.1364/ol.28.000069. [DOI] [PubMed] [Google Scholar]
- 99.Toprak E, Balci H, Blehm BH, Selvin PR. Three-dimensional particle tracking via bifocal imaging. Nano Lett. 2007;7:2043–2045. doi: 10.1021/nl0709120. [DOI] [PubMed] [Google Scholar]
- 100.Prabhat P, Ram S, Ward ES, Ober RJ. Simultaneous imaging of different focal planes in fluorescence microscopy for the study of cellular dynamics in three dimensions. IEEE Trans Nanobioscience. 2004;3:237–242. doi: 10.1109/tnb.2004.837899. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 101.Juette MF, Gould TJ, Lessard MD, Mlodzianoski MJ, Nagpure BS, Bennett BT, Hess ST, Bewersdorf J. Three-dimensional sub-100 nm resolution fluorescence microscopy of thick samples. Nat Methods. 2008;5:527–529. doi: 10.1038/nmeth.1211. [DOI] [PubMed] [Google Scholar]
- 102.Tahmasbi A, Ram S, Chao J, Abraham AV, Tang FW, Sally Ward E, Ober RJ. Designing the focal plane spacing for multifocal plane microscopy. Opt Express. 2014;22:16706–16721. doi: 10.1364/OE.22.016706. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 103.Ram S, Prabhat P, Chao J, Ward ES, Ober RJ. High accuracy 3D quantum dot tracking with multifocal plane microscopy for the study of fast intracellular dynamics in live cells. Biophys J. 2008;95:6025–6043. doi: 10.1529/biophysj.108.140392. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 104.Mlodzianoski MJ, Juette MF, Beane GL, Bewersdorf J. Experimental characterization of 3D localization techniques for particle-tracking and super-resolution microscopy. Opt Express. 2009;17:8264–8277. doi: 10.1364/oe.17.008264. [DOI] [PubMed] [Google Scholar]
- 105.Ram S, Kim D, Ober R, Ward E. 3D Single Molecule Tracking with Multifocal Plane Microscopy Reveals Rapid Intercellular Transferrin Transport at Epithelial Cell Barriers. Biophys J. 2012;103:1594–1603. doi: 10.1016/j.bpj.2012.08.054. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 106.Mlodzianoski MJ, Schreiner JM, Callahan SP, Smolková K, Dlasková A, Šantorová J, Ježek P, Bewersdorf J. Sample drift correction in 3D fluorescence photoactivation localization microscopy. Opt Express. 2011;19:15009–15019. doi: 10.1364/OE.19.015009. [DOI] [PubMed] [Google Scholar]
- 107.Ries J, Kaplan C, Platonova E, Eghlidi H, Ewers H. A simple, versatile method for GFP-based super-resolution microscopy via nanobodies. Nat Methods. 2012;9:582–584. doi: 10.1038/nmeth.1991. [DOI] [PubMed] [Google Scholar]
- 108.Dalgarno PA, Dalgarno HIC, Putoud A, Lambert R, Paterson L, Logan DC, Towers DP, Warburton RJ, Greenaway AH. Multiplane imaging and three dimensional nanoscale particle tracking in biological microscopy. Opt Express. 2010;18:877–884. doi: 10.1364/OE.18.000877. [DOI] [PubMed] [Google Scholar]
- 109.Botcherby EJ, Juskaitis R, Booth MJ, Wilson T. Aberration-free optical refocusing in high numerical aperture microscopy. Opt Lett. 2007;32:2007–2009. doi: 10.1364/ol.32.002007. [DOI] [PubMed] [Google Scholar]
- 110.Blanchard PM, Greenaway AH. Simultaneous multiplane imaging with a distorted diffraction grating. Appl Opt. 1999;38:6692–6699. doi: 10.1364/ao.38.006692. [DOI] [PubMed] [Google Scholar]
- 111.Abrahamsson S, Chen J, Hajj B, Stallinga S, Katsov AY, Wisniewski J, Mizuguchi G, Soule P, Mueller F, Darzacq CD, et al. Fast multicolor 3D imaging using aberration-corrected multifocus microscopy. Nat Methods. 2013;10:60–63. doi: 10.1038/nmeth.2277. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 112.Hajj B, Wisniewski J, Beheiry ME, Chen J, Ravyakin A, Wu C, Dahan M. Whole-cell, multicolor super resolution imaging using volumetric multi focus microscopy. Proc Natl Acad Sci U S A. 2014;111:17480–17485. doi: 10.1073/pnas.1412396111. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 113.Abrahamsson S, Ilic R, Wisniewski J, Mehl B, Yu L, Chen L, Davanco M, Oudjedi L, Fiche J, Hajj B, et al. Multifocus microscopy with precise color multi-phase diffractive optics applied in functional neuronal imaging. Biomed Opt Express. 2016;7:855–869. doi: 10.1364/BOE.7.000855. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 114.Hajj B, El Beheiry M, Dahan M. PSF engineering in multifocus microscopy for increased depth volumetric imaging. Biomed Opt Express. 2016;7:726–731. doi: 10.1364/BOE.7.000726. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 115.Oudjedi L, Fiche J, Abrahamsson S, Mazenq L, Lecestre A, Calmon P, Cerf A, Nöllmann M. Astigmatic multifocus microscopy enables deep 3D super-resolved imaging. Biomed Opt Express. 2016;7:2163–2173. doi: 10.1364/BOE.7.002163. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 116.Sun Y, McKenna JD, Murray JM, Ostap EM, Goldman YE. Parallax: High Accuracy Three-Dimensional Single Molecule Tracking Using Split Images. Nano Lett. 2009;9:2676–2682. doi: 10.1021/nl901129j. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 117.Sun Y, Sato O, Ruhnow F, Arsenault ME, Ikebe M, Goldman YE. Single-molecule stepping and structural dynamics of myosin X. Nat Struct Mol Biol. 2010;17:485–491. doi: 10.1038/nsmb.1785. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 118.Yajima J, Mizutani K, Nishizaka T. A torque component present in mitotic kinesin Eg5 revealed by three-dimensional tracking. Nat Struct Mol Biol. 2008;15:1119–1121. doi: 10.1038/nsmb.1491. [DOI] [PubMed] [Google Scholar]
- 119.McMahon MD, Berglund AJ, Carmichael P, McClelland JJ, Liddle JA. 3D Particle Trajectories Observed by Orthogonal Tracking Microscopy. ACS Nano. 2009;3:609–614. doi: 10.1021/nn8008036. [DOI] [PubMed] [Google Scholar]
- 120.Tang J, Akerboom J, Vaziri A, Looger LL, Shank CV. Near-isotropic 3D optical nanoscopy with photon-limited chromophores. Proc Natl Acad Sci U S A. 2010;107:10068–10073. doi: 10.1073/pnas.1004899107. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 121.Kao HP, Verkman AS. Tracking of single fluorescent particles in three dimensions: use of cylindrical optics to encode particle position. Biophys J. 1994;67:1291–1300. doi: 10.1016/S0006-3495(94)80601-0. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 122.Holtzer L, Meckel T, Schmidt T. Nanometric three-dimensional tracking of individual quantum dots in cells. Appl Phys Lett. 2007;90:053902. [Google Scholar]
- 123.Huang B, Wang W, Bates M, Zhuang X. Three-dimensional super-resolution imaging by stochastic optical reconstruction microscopy. Science. 2008;319:810–813. doi: 10.1126/science.1153529. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 124.Goodman JW. Introduction to Fourier Optics. Roberts & Company Publishers; Greenwood Village, CO: 2005. [Google Scholar]
- 125.Piestun R, Schechner YY, Shamir J. Propagation-invariant wave fields with finite energy. J Opt Soc Am A. 2000;17:294–303. doi: 10.1364/josaa.17.000294. [DOI] [PubMed] [Google Scholar]
- 126.Pavani SRP, Thompson MA, Biteen JS, Lord SJ, Liu N, Twieg RJ, Piestun R, Moerner WE. Three-dimensional, single-molecule fluorescence imaging beyond the diffraction limit by using a double-helix point spread function. Proc Natl Acad Sci U S A. 2009;106:2995–2999. doi: 10.1073/pnas.0900245106. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 127.Lew MD, Lee SF, Badieirostami M, Moerner WE. Corkscrew point spread function for far-field three-dimensional nanoscale localization of pointlike objects. Opt Lett. 2011;36:202–204. doi: 10.1364/OL.36.000202. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 128.Baddeley D, Cannell MB, Soeller C. Three-dimensional sub-100 nm super-resolution imaging of biological samples using a phase ramp in the objective pupil. Nano Research. 2011;4:589–598. [Google Scholar]
- 129.Jia S, Vaughan JC, Zhuang X. Isotropic three-dimensional super-resolution imaging with a self-bending point spread function. Nat Photonics. 2014;8:302–306. doi: 10.1038/nphoton.2014.13. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 130.Shechtman Y, Sahl SJ, Backer AS, Moerner WE. Optimal Point Spread Function Design for 3D Imaging. Phys Rev Lett. 2014;113:133902. doi: 10.1103/PhysRevLett.113.133902. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 131.Shechtman Y, Weiss LE, Backer AS, Sahl SJ, Moerner WE. Precise 3D scan-free multiple-particle tracking over large axial ranges with Tetrapod point spread functions. Nano Lett. 2015;15:4194–4199. doi: 10.1021/acs.nanolett.5b01396. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 132.Izeddin I, El Beheiry M, Andilla J, Ciepielewski D, Darzacq X, Dahan M. PSF shaping using adaptive optics for three-dimensional single-molecule super-resolution imaging and tracking. Opt Express. 2012;20:4957–4967. doi: 10.1364/OE.20.004957. [DOI] [PubMed] [Google Scholar]
- 133.Backlund MP, Lew MD, Backer AS, Sahl SJ, Grover G, Agrawal A, Piestun R, Moerner WE. Simultaneous, accurate measurement of the 3D position and orientation of single molecules. Proc Natl Acad Sci U S A. 2012;109:19087–19092. doi: 10.1073/pnas.1216687109. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 134.Moerner W, Shechtman Y, Wang Q. Single-molecule spectroscopy and imaging over the decades. Faraday Discuss. 2015;184:9–36. doi: 10.1039/c5fd00149h. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 135.Badieirostami M, Lew MD, Thompson MA, Moerner WE. Three-dimensional localization precision of the double-helix point spread function versus astigmatism and biplane. Appl Phys Lett. 2010;97:161103. doi: 10.1063/1.3499652. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 136.Backer AS, Backlund MP, Lew MD, Moerner WE. Single-molecule orientation measurements with a quadrated pupil. Opt Lett. 2013;38:1521–1523. doi: 10.1364/OL.38.001521. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 137.Backer AS, Backlund MP, Diezmann AR, Sahl SJ, Moerner WE. A bisected pupil for studying single-molecule orientational dynamics and its application to 3D super-resolution microscopy. Appl Phys Lett. 2014;104:193701–193705. doi: 10.1063/1.4876440. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 138.Smith C, Huisman M, Siemons M, Grünwald D, Stallinga S. Simultaneous measurement of emission color and 3D position of single molecules. Opt Express. 2016;24:4996–5013. doi: 10.1364/OE.24.004996. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 139.Shechtman Y, Weiss LE, Backer AS, Lee ML, Moerner WE. Multicolor localization microscopy by point-spread-function engineering. Nat Photonics. 2016;10:590–594. doi: 10.1038/nphoton.2016.137. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 140.Kummer S, Dickson RM, Moerner WE. Probing Single Molecules in Poly(acrylamide) Gels. Proc Soc Photo-Opt Instrum Engr. 1998;3273:165–173. [Google Scholar]
- 141.Fu Y, Winter PW, Rojas R, Wang V, McAuliffe M, Patterson GH. Axial superresolution via multiangle TIRF microscopy with sequential imaging and photobleaching. Proceedings of the National Academy of Sciences. 2016;113:4368–4373. doi: 10.1073/pnas.1516715113. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 142.Cang H, Wong CM, Xu CS, Rizvi AH, Yang H. Confocal three dimensional tracking of a single nanoparticle with concurrent spectroscopic readouts. Appl Phys Lett. 2006;88:223901. [Google Scholar]
- 143.Wells NP, Lessard GA, Goodwin PM, Phipps ME, Cutler PJ, Lidke DS, Wilson BS, Werner JH. Time-Resolved Three-Dimensional Molecular Tracking in Live Cells. Nano Lett. 2010;10:4732–4737. doi: 10.1021/nl103247v. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 144.Welsher K, Yang H. Multi-resolution 3D visualization of the early stages of cellular uptake of peptide-coated nanoparticles. Nature nanotechnology. 2014;9:198–203. doi: 10.1038/nnano.2014.12. [DOI] [PubMed] [Google Scholar]
- 145.Jung Y, Riven I, Feigelson SW, Kartvelishvily E, Tohya K, Miyasaka M, Alon R, Haran G. Three-dimensional localization of T-cell receptors in relation to microvilli using a combination of superresolution microscopies. PNAS. 2016;113:E5916–E5924. doi: 10.1073/pnas.1605399113. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 146.Ruckstuhl T, Verdes D. Supercritical angle fluorescence (SAF) microscopy. Opt Express. 2004;12:4246–4254. doi: 10.1364/opex.12.004246. [DOI] [PubMed] [Google Scholar]
- 147.Moerner WE, Plakhotnik T, Irngartinger T, Wild UP, Pohl D, Hecht B. Near-Field Optical Spectroscopy of Individual Molecules in Solids. Phys Rev Lett. 1994;73:2764. doi: 10.1103/PhysRevLett.73.2764. [DOI] [PubMed] [Google Scholar]
- 148.Karedla N, Chizhik AI, Gregor I, Chizhik AM, Schulz O, Enderlein J. Single-Molecule Metal-Induced Energy Transfer (smMIET): Resolving Nanometer Distances at the Single-Molecule Level. ChemPhysChem. 2014;15:705–711. doi: 10.1002/cphc.201300760. [DOI] [PubMed] [Google Scholar]
- 149.Chance RR, Prock A, Silbey RJ. Molecular Fluorescence and Energy Transfer Near Interfaces. Adv Chem Phys. 1978;37:1–65. [Google Scholar]
- 150.Hell SW, Stelzer EHK. Properties of a 4Pi-confocal fluorescence microscope. J Opt Soc Am A. 1992;9:2159–2166. [Google Scholar]
- 151.Gustafsson MGL, Agard DA, Sedat JW. I5M: 3D widefield light microscopy with better than100nm axial resolution. J Microsc. 1999;195:10–16. doi: 10.1046/j.1365-2818.1999.00576.x. [DOI] [PubMed] [Google Scholar]
- 152.Shao L, Isaac B, Uzawa S, Agard DA, Sedat JW, Gustafsson MGL. I5S: Wide-Field Light Microscopy with 100-nm-Scale Resolution in Three Dimensions. Biophys J. 2008;94:4971–4983. doi: 10.1529/biophysj.107.120352. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 153.Bewersdorf J, Schmidt R, Hell SW. Comparison of I5M and 4Pi-microscopy. J Microsc. 2006;222:105–117. doi: 10.1111/j.1365-2818.2006.01578.x. [DOI] [PubMed] [Google Scholar]
- 154.Linfoot EH, Wolf E. Phase Distribution near Focus in an Aberration-free Diffraction Image. Proceedings of the Physical Society Section B. 1956;69:823–832. [Google Scholar]
- 155.Aquino D, Schönle A, Geisler C, Middendorff CV, Wurm CA, Okamura Y, Lang T, Hell SW, Egner A. Two-color nanoscopy of three-dimensional volumes by 4Pi detection of stochastically switched fluorophores. Nat Methods. 2011;8:353–359. doi: 10.1038/nmeth.1583. [DOI] [PubMed] [Google Scholar]
- 156.Middendorff vC, Egner A, Geisler C, Hell SW, Schönle A. Isotropic 3D Nanoscopy based on single emitter switching. Opt Express. 2008;16:20774–20788. doi: 10.1364/oe.16.020774. [DOI] [PubMed] [Google Scholar]
- 157.Brown TA, Tkachuk AN, Shtengel G, Kopek BG, Bogenhagen DF, Hess HF, Clayton DA. Superresolution Fluorescence Imaging of Mitochondrial Nucleoids Reveals Their Spatial Range, Limits, and Membrane Interaction. Molecular and Cellular Biology. 2011;31:4994–5010. doi: 10.1128/MCB.05694-11. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 158.Huang F, Sirinakis G, Allgeyer E, Schroeder L, Duim W, Kromann E, Phan T, Rivera-Molina F, Myers J, Irnov I, et al. Ultra-High Resolution 3D Imaging of Whole Cells. Cell. 2016;166:1028–1040. doi: 10.1016/j.cell.2016.06.016. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 159.Schnitzbauer J, McGorty R, Huang B. 4Pi fluorescence detection and 3D particle localization with a single objective. Optics express. 2013;21:19701–19708. doi: 10.1364/OE.21.019701. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 160.Lew MD, Lee SF, Ptacin JL, Lee MK, Twieg RJ, Shapiro L, Moerner WE. Three-dimensional superresolution colocalization of intracellular protein superstructures and the cell surface in live Caulobacter crescentus. Proc Natl Acad Sci U S A. 2011;108:E1102–E1110. doi: 10.1073/pnas.1114444108. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 161.Bon P, Bourg N, Lécart S, Monneret S, Fort E, Wenger J, Lévêque-Fort S. Three-dimensional nanometre localization of nanoparticles to enhance super-resolution microscopy. Nat Commun. 2015;6:7764. doi: 10.1038/ncomms8764. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 162.Dai M, Jungmann R, Yin P. Optical imaging of individual biomolecules in densely packed clusters. Nat Nano. 2016;11:798–807. doi: 10.1038/nnano.2016.95. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 163.Born M, Wolf E. Principles of Optics. Cambridge University Press; Cambridge: 1999. [Google Scholar]
- 164.Welford WT. Aberrations of Optical Systems. Adam Hilger; Bristol, England: 1986. [Google Scholar]
- 165.Hanser B, Gustafsson M, Agard D, Sedat JW. Phase-retrieved pupil functions in wide-field fluorescence microscopy. J Microsc. 2004;216:32–48. doi: 10.1111/j.0022-2720.2004.01393.x. [DOI] [PubMed] [Google Scholar]
- 166.Hell SW, Reiner G, Cremer C, Stelzer EHK. Aberrations in confocal fluorescence microscopy induced by mismatches in refractive index. J Microsc. 1993;169:391–405. [Google Scholar]
- 167.Sheppard CJR, Torok P. Effects of specimen refractive index on confocal imaging. J Microsc. 1997;185:366–374. [Google Scholar]
- 168.Arimoto R, Murray JM. A common aberration with water-immersion objective lenses. J Microsc. 2004;216:49–51. doi: 10.1111/j.0022-2720.2004.01383.x. [DOI] [PubMed] [Google Scholar]
- 169.Deng Y, Shaevitz JW. Effect of aberration on height calibration in three-dimensional localization-based microscopy and particle tracking. Appl Opt. 2009;48:1886–1890. doi: 10.1364/ao.48.001886. [DOI] [PubMed] [Google Scholar]
- 170.Liu S, Kromann EB, Krueger WD, Bewersdorf J, Lidke KA. Three dimensional single molecule localization using a phase retrieved pupil function. Opt Express. 2013;21:29462–29487. doi: 10.1364/OE.21.029462. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 171.McGorty R, Schnitzbauer J, Zhang W, Huang B. Correction of depth-dependent aberrations in 3D single-molecule localization and super-resolution microscopy. Opt Lett. 2014;39:275–278. doi: 10.1364/OL.39.000275. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 172.Booth MJ. Adaptive optics in microscopy. Phil Trans R Soc Lond A. 2007;365:2829–2843. doi: 10.1098/rsta.2007.0013. [DOI] [PubMed] [Google Scholar]
- 173.Kubby JA, editor. Adaptive Optics for Biological Imaging. CRC Press; Boca Raton, FL: 2013. [Google Scholar]
- 174.Azucena O, Tao X. Adaptive Optical Microscopy Using Direct Wavefront Measurements. CRC Press; 2013. pp. 315–343. [Google Scholar]
- 175.Ji N, Betzig E. Pupil-Segmentation-Based Adaptive Optics for Microscopy. CRC Press; 2013. pp. 231–250. [Google Scholar]
- 176.Burke D, Patton B, Huang F, Bewersdorf J, Booth M. Adaptive optics correction of specimen-induced aberrations in single-molecule switching microscopy. Optica. 2015;2:177–185. [Google Scholar]
- 177.Tehrani KF, Xu J, Zhang Y, Shen P, Kner P. Adaptive optics stochastic optical reconstruction microscopy (AO-STORM) using a genetic algorithm. Opt Express. 2015;23:13677–13692. doi: 10.1364/OE.23.013677. [DOI] [PubMed] [Google Scholar]
- 178.Schwertner M. Aberrations and the Benefit of Their Correction in Confocal Microscopy. CRC Press; 2013. pp. 51–74. [Google Scholar]
- 179.Mertz J, Paudel H, Bifano TG. Field of view advantage of conjugate adaptive optics in microscopy applications. Appl Opt. 2015;54:3498–3506. doi: 10.1364/AO.54.003498. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 180.von Diezmann A, Lee MY, Lew MD, Moerner W. Correcting field-dependent aberrations with nanoscale accuracy in three-dimensional single-molecule localization microscopy. Optica. 2015;2:985–993. doi: 10.1364/OPTICA.2.000985. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 181.Blythe KL, Titus EJ, Willets KA. Objective-induced Point Spread Function Aberrations and their Impact on Super-resolution Microscopy. Anal Chem. 2015;87:6419–6424. doi: 10.1021/acs.analchem.5b01848. [DOI] [PubMed] [Google Scholar]
- 182.Kurvits JA, Jiang M, Zia R. Comparative analysis of imaging configurations and objectives for Fourier microscopy. JOSA A. 2015;32:2082–2092. doi: 10.1364/JOSAA.32.002082. [DOI] [PubMed] [Google Scholar]
- 183.Goshtasby AA. Image registration. Springer-Verlag; London: 2012. [Google Scholar]
- 184.Bates M, Dempsey GT, Chen KH, Zhuang X. Multicolor Super-Resolution Fluorescence Imaging via Multi-Parameter Fluorophore Detection. ChemPhysChem. 2012;13:99–107. doi: 10.1002/cphc.201100735. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 185.Baddeley D, Crossman D, Rossberger S, Cheyne JE, Montgomery JM, Jayasinghe ID, Cremer C, Cannell MB, Soeller C. 4D Super-Resolution Microscopy with Conventional Fluorophores and Single Wavelength Excitation in Optically Thick Cells and Tissues. PLoS ONE. 2011;6:e20645. doi: 10.1371/journal.pone.0020645. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 186.Churchman LS, Oekten Z, Rock RS, Dawson JF, Spudich JA. Single molecule high-resolution colocalization of Cy3 and Cy5 attached to macromolecules measures intramolecular distances through time. Proc Natl Acad Sci U S A. 2005;102:1419–1423. doi: 10.1073/pnas.0409487102. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 187.Churchman LS, Spudich JA. Colocalization of Fluorescent Probes: Accurate and Precise Registration with Nanometer Resolution. Cold Spring Harbor Protocols. 2012;2:141–149. doi: 10.1101/pdb.top067918. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 188.Pertsinidis A, Zhang Y, Chu S. Subnanometre single-molecule localization, registration and distance measurements. Nature. 2010;466:647–651. doi: 10.1038/nature09163. [DOI] [PubMed] [Google Scholar]
- 189.Ha T, Enderle T, Chemla DS, Selvin PR, Weiss S. Single molecule dynamics studied by polarization modulation. Phys Rev Lett. 1996;77:3979–3982. doi: 10.1103/PhysRevLett.77.3979. [DOI] [PubMed] [Google Scholar]
- 190.Forkey JN, Quinlan ME, Goldman YE. Measurement of Single Macromolecule Orientation by Total Internal Reflection Fluorescence Polarization Microscopy. Biophys J. 2005;89:1261–1271. doi: 10.1529/biophysj.104.053470. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 191.Rosenberg SA, Quinlan ME, Forkey JN, Goldman YE. Rotational motions of macro-molecules by single-molecule fluorescence microscopy. Acc Chem Res. 2005;38:583–593. doi: 10.1021/ar040137k. [DOI] [PubMed] [Google Scholar]
- 192.Toprak E, Enderlein J, Syed S, McKinney SA, Petschek RG, Ha T, Goldman YE, Selvin PR. Defocused orientation and position imaging (DOPI) of myosin V. Proc Natl Acad Sci U S A. 2006;103:6495–6499. doi: 10.1073/pnas.0507134103. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 193.Axelrod D. Fluorescence polarization microscopy. Methods Cell Biol. 1989;30:333–352. [PubMed] [Google Scholar]
- 194.Peterman EJ, Sosa H, Moerner WE. Single-Molecule Fluorescence Spectroscopy and Microscopy of Biomolecular Motors. Annu Rev Phys Chem. 2004;55:79–96. doi: 10.1146/annurev.physchem.55.091602.094340. [DOI] [PubMed] [Google Scholar]
- 195.Enderlein J, Toprak E, Selvin PR. Polarization effect on position accuracy of fluorophore localization. Opt Express. 2006;14:8111–8120. doi: 10.1364/oe.14.008111. [DOI] [PubMed] [Google Scholar]
- 196.Engelhardt J, Keller J, Hoyer P, Reuss M, Staudt T, Hell SW. Molecular orientation affects localization accuracy in superresolution far-field fluorescence microscopy. Nano Lett. 2011;11:209–213. doi: 10.1021/nl103472b. [DOI] [PubMed] [Google Scholar]
- 197.Lew MD, Backlund MP, Moerner WE. Rotational mobility of single molecules affects localization accuracy in super-resolution fluorescence microscopy. Nano Lett. 2013;13:3967–3972. doi: 10.1021/nl304359p. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 198.Fourkas JT. Rapid determination of the three-dimensional orientation of single molecules. Opt Lett. 2001;26:211–213. doi: 10.1364/ol.26.000211. [DOI] [PubMed] [Google Scholar]
- 199.Böhmer M, Enderlein J. Orientation imaging of single molecules by wide-field epifluorescence microscopy. J Opt Soc Am B. 2003;20:554–559. [Google Scholar]
- 200.Backlund MP, Lew MD, Backer AS, Sahl SJ, Grover G, Agrawal A, Piestun R, Moerner WE. Simultaneous, accurate measurement of the 3D position and orientation of single molecules. Proc Nat Acad Sci USA. 2013;109:19087–19092. doi: 10.1073/pnas.1216687109. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 201.Lew MD, Moerner WE. Azimuthal polarization filtering for accurate, precise, and robust single-molecule localization microscopy. Nano Lett. 2014;14:6407–6413. doi: 10.1021/nl502914k. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 202.Backer AS, Lee MY, Moerner WE. Enhanced DNA imaging using super-resolution microscopy and simultaneous single-molecule orientation measurements. Optica. 2016;3:659–666. doi: 10.1364/OPTICA.3.000659. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 203.Backlund MP, Arabi A, Petrov PN, Arabi E, Saurabh S, Faraon A, Moerner WE. Removing orientation-induced localization biases in single-molecule microscopy using a broadband metasurface mask. Nat Photonics. 2016;10:459–462. doi: 10.1038/nphoton.2016.93. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 204.Lee HD, Sahl SJ, Lew MD, Moerner WE. The double-helix microscope super-resolves extended biological structures by localizing single blinking molecules in three dimensions with nanoscale precision. Appl Phys Lett. 2012;100:153701. doi: 10.1063/1.3700446. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 205.Ram S, Ward ES, Ober RJ. Beyond Rayleigh’s criterion: A resolution measure with application to single-molecule microscopy. Proc Natl Acad Sci U S A. 2006;103:4457–4462. doi: 10.1073/pnas.0508047103. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 206.Ram S, Ward ES, Ober RJ. A stochastic analysis of distance estimation approaches in single molecule microscopy: quantifying the resolution limits of photon-limited imaging systems. Multidim Syst Sign Process. 2013;24:503–542. doi: 10.1007/s11045-012-0175-6. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 207.Bates M, Huang B, Dempsey GT, Zhuang X. Multicolor super-resolution imaging with photo-switchable fluorescent probes. Science. 2007;317:1749–1753. doi: 10.1126/science.1146598. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 208.Ram S, Prabhat P, Ward ES, Ober RJ. Improved single particle localization accuracy with dual objective multifocal plane microscopy. Opt Express. 2009;17:6881–6898. doi: 10.1364/oe.17.006881. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 209.Xu K, Babcock HP, Zhuang X. Dual-objective STORM reveals three-dimensional filament organization in the actin cytoskeleton. Nat Methods. 2012;9:185–188. doi: 10.1038/nmeth.1841. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 210.Shroff H, Galbraith CG, Galbraith JA, White H, Gillette J, Olenych S, Davidson MW, Betzig E. Dual-color superresolution imaging of genetically expressed probes within individual adhesion complexes. Proc Natl Acad Sci U S A. 2007;104:20308–20313. doi: 10.1073/pnas.0710517105. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 211.Nieuwenhuizen RPJ, Lidke KA, Bates M, Puig DL, Grunwald D, Stallinga S, Rieger B. Measuring image resolution in optical nanoscopy. Nat Methods. 2013;10:557–562. doi: 10.1038/nmeth.2448. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 212.Banterle N, Bui KH, Lemke EA, Beck M. Fourier ring correlation as a resolution criterion for super-resolution microscopy. J Struct Biol. 2013;183:363–367. doi: 10.1016/j.jsb.2013.05.004. [DOI] [PubMed] [Google Scholar]
- 213.Legant WR, Shao L, Grimm JB, Brown TA, Milkie DE, Avants BB, Lavis LD, Betzig E. High-density three-dimensional localization microscopy across large volumes. Nat Methods. 2016;13:359–365. doi: 10.1038/nmeth.3797. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 214.Fei J, Singh D, Zhang Q, Park S, Balasubramanian D, Golding I, Vanderpool CK, Ha T. Determination of in vivo target search kinetics of regulatory noncoding RNA. Science. 2015;347:1371–1374. doi: 10.1126/science.1258849. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 215.Huang F, Schwartz SL, Byars JM, Lidke KA. Simultaneous multiple-emitter fitting for single molecule super-resolution imaging. Biomed Opt Express. 2011;2:1377–1393. doi: 10.1364/BOE.2.001377. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 216.Holden SJ, Uphoff S, Kapanidis AN. DAOSTORM: an algorithm for high- density super-resolution microscopy. Nat Methods. 2011;8:279–280. doi: 10.1038/nmeth0411-279. [DOI] [PubMed] [Google Scholar]
- 217.Zhu L, Zhang W, Elnatan D, Huang B. Faster STORM using compressed sensing. Nat Methods. 2012;9:721–723. doi: 10.1038/nmeth.1978.
- 218.Wang Y, Quan T, Zeng S, Huang Z. PALMER: a method capable of parallel localization of multiple emitters for high-density localization microscopy. Opt Express. 2012;20:16039–16049. doi: 10.1364/OE.20.016039.
- 219.Cox S, Rosten E, Monypenny J, Jovanovic-Talisman T, Burnette DT, Lippincott-Schwartz J, Jones GE, Heintzmann R. Bayesian localization microscopy reveals nanoscale podosome dynamics. Nat Methods. 2012;9:195–200. doi: 10.1038/nmeth.1812.
- 220.Babcock H, Sigal Y, Zhuang X. A high-density 3D localization algorithm for stochastic optical reconstruction microscopy. Optical Nanoscopy. 2012;1:6. doi: 10.1186/2192-2853-1-6.
- 221.Babcock HP, Moffitt JR, Cao Y, Zhuang X. Fast compressed sensing analysis for super-resolution imaging using L1-homotopy. Opt Express. 2013;21:28583–28596. doi: 10.1364/OE.21.028583.
- 222.Barsic A, Grover G, Piestun R. Three-dimensional super-resolution and localization of dense clusters of single molecules. Sci Rep. 2014;4:5388. doi: 10.1038/srep05388.
- 223.Min J, Holden SJ, Carlini L, Unser M, Manley S, Ye JC. 3D high-density localization microscopy using hybrid astigmatic/biplane imaging and sparse image reconstruction. Biomed Opt Express. 2014;5:3935–3948. doi: 10.1364/BOE.5.003935.
- 224.Shuang B, Wang W, Shen H, Tauzin LJ, Flatebo C, Chen J, Moringo NA, Bishop LDC, Kelly KF, Landes CF. Generalized recovery algorithm for 3D super-resolution microscopy using rotating point spread functions. Sci Rep. 2016;6:30826. doi: 10.1038/srep30826.
- 225.Hu YS, Zimmerley M, Li Y, Watters R, Cang H. Single-molecule super-resolution light-sheet microscopy. ChemPhysChem. 2014;15:577–586. doi: 10.1002/cphc.201300732.
- 226.Galland R, Grenci G, Aravind A, Viasnoff V, Studer V, Sibarita J. 3D high- and super-resolution imaging using single-objective SPIM. Nat Methods. 2015;12:641–644. doi: 10.1038/nmeth.3402.
- 227.Vaziri A, Tang J, Shroff H, Shank CV. Multilayer three-dimensional super resolution imaging of thick biological samples. Proc Natl Acad Sci U S A. 2008;105:20221–20226. doi: 10.1073/pnas.0810636105.
- 228.Lee MK, Rai P, Williams J, Twieg RJ, Moerner WE. Small-molecule labeling of live cell surfaces for three-dimensional super-resolution microscopy. J Am Chem Soc. 2014;136:14003–14006. doi: 10.1021/ja508028h.
- 229.Grimm JB, Klein T, Kopek BG, Shtengel G, Hess HF, Sauer M, Lavis LD. Synthesis of a far-red photoactivatable silicon-containing rhodamine for super-resolution microscopy. Angew Chem Int Ed. 2016;55:1723–1727. doi: 10.1002/anie.201509649.
- 230.Durisic N, Cuervo LL, Lakadamyali M. Quantitative super-resolution microscopy: pitfalls and strategies for image analysis. Curr Opin Chem Biol. 2014;20:22–28. doi: 10.1016/j.cbpa.2014.04.005.
- 231.Sydor AM, Czymmek KJ, Puchner EM, Mennella V. Super-resolution microscopy: from single molecules to supramolecular assemblies. Trends Cell Biol. 2015;25:730–748. doi: 10.1016/j.tcb.2015.10.004.
- 232.Szymborska A, de Marco A, Daigle N, Cordes VC, Briggs JAG, Ellenberg J. Nuclear pore scaffold structure analyzed by super-resolution microscopy and particle averaging. Science. 2013;341:655–658. doi: 10.1126/science.1240672.
- 233.Broeken J, Johnson H, Lidke DS, Liu S, Nieuwenhuizen RPJ, Stallinga S, Lidke KA, Rieger B. Resolution improvement by 3D particle averaging in localization microscopy. Methods Appl Fluoresc. 2015;3:014003. doi: 10.1088/2050-6120/3/1/014003.
- 234.Passon O, Grebe-Ellis J. Note on the classification of super-resolution in far-field microscopy and information theory. J Opt Soc Am A. 2016;33:B31–B35. doi: 10.1364/JOSAA.33.000B31.
- 235.Tahmasbi A, Ward ES, Ober RJ. Determination of localization accuracy based on experimentally acquired image sets: applications to single molecule microscopy. Opt Express. 2015;23:7630–7652. doi: 10.1364/OE.23.007630.
- 236.Coles BC, Webb SED, Schwartz N, Rolfe DJ, Martin-Fernandez M, Lo Schiavo V. Characterisation of the effects of optical aberrations in single molecule techniques. Biomed Opt Express. 2016;7:1755–1767. doi: 10.1364/BOE.7.001755.
- 237.Sage D, Kirshner H, Pengo T, Stuurman N, Min J, Manley S, Unser M. Quantitative evaluation of software packages for single-molecule localization microscopy. Nat Methods. 2015;12:717–724. doi: 10.1038/nmeth.3442.
- 238.Chenouard N, Smal I, de Chaumont F, Maška M, Sbalzarini IF, Gong Y, Cardinale J, Carthel C, Coraluppi S, Winter M, et al. Objective comparison of particle tracking methods. Nat Methods. 2014;11:281–289. doi: 10.1038/nmeth.2808.
- 239.Maglione M, Sigrist SJ. Seeing the forest tree by tree: super-resolution light microscopy meets the neurosciences. Nat Neurosci. 2013;16:790–797. doi: 10.1038/nn.3403.
- 240.Jakobs S, Wurm CA. Super-resolution microscopy of mitochondria. Curr Opin Chem Biol. 2014;20:9–15. doi: 10.1016/j.cbpa.2014.03.019.
- 241.Kisley L, Landes CF. Molecular approaches to chromatography using single molecule spectroscopy. Anal Chem. 2015;87:83–98. doi: 10.1021/ac5039225.
- 242.Kastantin M, Langdon BB, Schwartz DK. A bottom-up approach to understanding protein layer formation at solid–liquid interfaces. Adv Colloid Interface Sci. 2014;207:240–252. doi: 10.1016/j.cis.2013.12.006.
- 243.Rühle B, Melari D, Thomas B, Bräuchle C. Fluorescence microscopy studies of porous silica materials. Z Naturforsch B. 2014;68:423–444.
- 244.Winograd-Katz S, Fässler R, Geiger B, Legate KR. The integrin adhesome: from genes and proteins to human disease. Nat Rev Mol Cell Biol. 2014;15:273–288. doi: 10.1038/nrm3769.
- 245.Chen WT, Singer SJ. Immunoelectron microscopic studies of the sites of cell-substratum and cell-cell contacts in cultured fibroblasts. J Cell Biol. 1982;95:205–222. doi: 10.1083/jcb.95.1.205.
- 246.Franz CM, Müller DJ. Analyzing focal adhesion structure by atomic force microscopy. J Cell Sci. 2005;118:5315–5323. doi: 10.1242/jcs.02653.
- 247.Case LB, Baird MA, Shtengel G, Campbell SL, Hess HF, Davidson MW, Waterman CM. Molecular mechanism of vinculin activation and nanoscale spatial organization in focal adhesions. Nat Cell Biol. 2015;17:880–892. doi: 10.1038/ncb3180.
- 248.Rossier O, Octeau V, Sibarita J, Leduc C, Tessier B, Nair D, Gatterdam V, Destaing O, Albiges-Rizo C, Tampe R, et al. Integrins β1 and β3 exhibit distinct dynamic nanoscale organizations inside focal adhesions. Nat Cell Biol. 2012;14:1231–1231. doi: 10.1038/ncb2588.
- 249.Jaqaman K, Galbraith JA, Davidson MW, Galbraith CG. Changes in single-molecule integrin dynamics linked to local cellular behavior. Mol Biol Cell. 2016;27:1561–1569. doi: 10.1091/mbc.E16-01-0018.
- 250.Patla I, Volberg T, Elad N, Hirschfeld-Warneken V, Grashoff C, Fassler R, Spatz JP, Geiger B, Medalia O. Dissecting the molecular architecture of integrin adhesion sites by cryo-electron tomography. Nat Cell Biol. 2010;12:909–915. doi: 10.1038/ncb2095.
- 251.Kanchanawong P, Shtengel G, Pasapera AM, Ramko EB, Davidson MW, Hess HF, Waterman CM. Nanoscale architecture of integrin-based cell adhesions. Nature. 2010;468:580–584. doi: 10.1038/nature09621.
- 252.Sheng M, Kim E. The postsynaptic organization of synapses. Cold Spring Harb Perspect Biol. 2011;3:a005678. doi: 10.1101/cshperspect.a005678.
- 253.Südhof T. The presynaptic active zone. Neuron. 2012;75:11–25. doi: 10.1016/j.neuron.2012.06.012.
- 254.Harris KM, Weinberg RJ. Ultrastructure of synapses in the mammalian brain. Cold Spring Harb Perspect Biol. 2012;4:a005587. doi: 10.1101/cshperspect.a005587.
- 255.Valtschanoff JG, Weinberg RJ. Laminar organization of the NMDA receptor complex within the postsynaptic density. J Neurosci. 2001;21:1211–1217. doi: 10.1523/JNEUROSCI.21-04-01211.2001.
- 256.Petersen JD, Chen X, Vinade L, Dosemeci A, Lisman JE, Reese TS. Distribution of postsynaptic density (PSD)-95 and Ca2+/calmodulin-dependent protein kinase II at the PSD. J Neurosci. 2003;23:11270–11278. doi: 10.1523/JNEUROSCI.23-35-11270.2003.
- 257.Siksou L, Rostaing P, Lechaire J, Boudier T, Ohtsuka T, Fejtova A, Kao H, Greengard P, Gundelfinger ED, Triller A, Marty S. Three-dimensional architecture of presynaptic terminal cytomatrix. J Neurosci. 2007;27:6868–6877. doi: 10.1523/JNEUROSCI.1773-07.2007.
- 258.Masugi-Tokita M, Tarusawa E, Watanabe M, Molnar E, Fujimoto K, Shigemoto R. Number and density of AMPA receptors in individual synapses in the rat cerebellum as revealed by SDS-digested freeze-fracture replica labeling. J Neurosci. 2007;27:2135–2144. doi: 10.1523/JNEUROSCI.2861-06.2007.
- 259.Dani A, Huang B, Bergan J, Dulac C, Zhuang X. Superresolution imaging of chemical synapses in the brain. Neuron. 2010;68:843–856. doi: 10.1016/j.neuron.2010.11.021.
- 260.Lučić V, Yang T, Schweikert G, Förster F, Baumeister W. Morphological characterization of molecular complexes present in the synaptic cleft. Structure. 2005;13:423–434. doi: 10.1016/j.str.2005.02.005.
- 261.Limbach C, Laue MM, Wang X, Hu B, Thiede N, Hultqvist G, Kilimann MW. Molecular in situ topology of Aczonin/Piccolo and associated proteins at the mammalian neurotransmitter release site. Proc Natl Acad Sci U S A. 2011;108:E392–E401. doi: 10.1073/pnas.1101707108.
- 262.Coltharp C, Xiao J. Superresolution microscopy for microbiology. Cell Microbiol. 2012;14:1808–1818. doi: 10.1111/cmi.12024.
- 263.Tuson HH, Biteen JS. Unveiling the inner workings of live bacteria using super-resolution microscopy. Anal Chem. 2015;87:42–63. doi: 10.1021/ac5041346.
- 264.Buss J, Coltharp C, Shtengel G, Yang X, Hess H, Xiao J. A multi-layered protein network stabilizes the Escherichia coli FtsZ-ring and modulates constriction dynamics. PLoS Genet. 2015;11:e1005128. doi: 10.1371/journal.pgen.1005128.
- 265.Biteen JS, Goley ED, Shapiro L, Moerner WE. Three-dimensional super-resolution imaging of the midplane protein FtsZ in live Caulobacter crescentus cells using astigmatism. ChemPhysChem. 2012;13:1007–1012. doi: 10.1002/cphc.201100686.
- 266.Holden SJ, Pengo T, Meibom KL, Fernandez Fernandez C, Collier J, Manley S. High throughput 3D super-resolution microscopy reveals Caulobacter crescentus in vivo Z-ring organization. Proc Natl Acad Sci U S A. 2014;111:4566–4571. doi: 10.1073/pnas.1313368111.
- 267.Ptacin JL, Gahlmann A, Bowman GR, Perez AM, von Diezmann ARS, Eckart MR, Moerner WE, Shapiro L. Bacterial scaffold directs pole-specific centromere segregation. Proc Natl Acad Sci U S A. 2014;111:E2046–E2055. doi: 10.1073/pnas.1405188111.
- 268.Backlund MP, Joyner R, Weis K, Moerner WE. Correlations of three-dimensional motion of chromosomal loci in yeast revealed by the Double-Helix Point Spread Function microscope. Mol Biol Cell. 2014;25:3619–3629. doi: 10.1091/mbc.E14-06-1127.
- 269.Cabal GG, Genovesio A, Rodriguez-Navarro S, Zimmer C, Gadal O, Lesne A, Buc H, Feuerbach-Fournier F, Olivo-Marin J, Hurt ED. SAGA interacting factors confine sub-diffusion of transcribed genes to the nuclear envelope. Nature. 2006;441:770–773. doi: 10.1038/nature04752.
- 270.Backlund MP, Joyner R, Moerner WE. Chromosomal locus tracking with proper accounting of static and dynamic errors. Phys Rev E. 2015;91:062716. doi: 10.1103/PhysRevE.91.062716.
- 271.Savin T, Doyle PS. Static and dynamic errors in particle tracking microrheology. Biophys J. 2005;88:623–638. doi: 10.1529/biophysj.104.042457.
- 272.Bálint Š, Verdeny Vilanova I, Sandoval Álvarez Á, Lakadamyali M. Correlative live-cell and superresolution microscopy reveals cargo transport dynamics at microtubule intersections. Proc Natl Acad Sci U S A. 2013;110:3375–3380. doi: 10.1073/pnas.1219206110.
- 273.Adrian RJ, Westerweel J. Particle image velocimetry. Cambridge University Press; New York: 2011.
- 274.Cierpka C, Rossi M, Segura R, Kähler C. On the calibration of astigmatism particle tracking velocimetry for microflows. Meas Sci Technol. 2010;22:015401.
- 275.Cierpka C, Kähler C. Particle imaging techniques for volumetric three-component (3D3C) velocity measurements in microfluidics. J Vis. 2012;15:1–31.
- 276.Backlund MP, Lew MD, Backer AS, Sahl SJ, Moerner WE. The role of molecular dipole orientation in single-molecule fluorescence microscopy and implications for super-resolution imaging. ChemPhysChem. 2014;15:587–599. doi: 10.1002/cphc.201300880.
- 277.Backer AS, Moerner WE. Determining the rotational mobility of a single molecule from a single image: a numerical study. Opt Express. 2015;23:4255–4276. doi: 10.1364/OE.23.004255.
- 278.Ortega-Arroyo J, Kukura P. Interferometric scattering microscopy (iSCAT): new frontiers in ultrafast and ultrasensitive optical microscopy. Phys Chem Chem Phys. 2012;14:15625–15636. doi: 10.1039/c2cp41013c.