eLife. 2025 Oct 28;14:e102144. doi: 10.7554/eLife.102144

Deep3DSIM: Super-resolution imaging of thick tissue using 3D structured illumination with adaptive optics

Jingyu Wang 1,2, Danail Stoychev 1,2,†, Mick A Phillips 1, David Miguel Susano Pinto 1,2, Richard M Parton 1, Nicholas James Hall 1, Joshua S Titlow 1, Ana Rita Faria 1, Matthew Wincott 2, Dalia Gala 1, Andreas Gerondopoulos 1, Niloufer Irani 1, Ian Dobbie 1,§, Lothar Schermelleh 1, Martin J Booth 2, Ilan Davis 1,‡
Editors: K VijayRaghavan3, K VijayRaghavan4
PMCID: PMC12563541  PMID: 41150055

Abstract

Three-dimensional structured illumination microscopy (3D-SIM) doubles the resolution of fluorescence imaging in all directions and enables optical sectioning with increased image contrast. However, 3D-SIM has not been widely applied to imaging deep in thick tissues due to its sensitivity to sample-induced aberrations, making the method difficult to apply beyond 10 µm in depth. Furthermore, 3D-SIM has not been available in an upright configuration, limiting its use for live imaging while manipulating the specimen, for example, with electrophysiology. Here, we have overcome these barriers by developing a novel upright 3D-SIM system (termed Deep3DSIM) that incorporates adaptive optics for aberration correction and remote focusing, reducing artefacts, improving contrast, restoring resolution, and eliminating the need to move the specimen or the objective lens in volume imaging. These advantages are equally applicable to inverted 3D-SIM systems. We demonstrate high-quality 3D-SIM imaging in various samples, including imaging more than 130 µm into the Drosophila brain.

Research organism: D. melanogaster

Introduction

Fluorescence microscopy has greatly advanced with the development of super-resolution microscopy techniques that enable biological imaging with spatial resolutions well below the optical diffraction limit (Huang et al., 2009; Sahl et al., 2017; Schermelleh et al., 2019; Schermelleh et al., 2010). One of these approaches is 3D-SIM, which features twofold increased resolution in each spatial direction and enables fast imaging with optical sectioning for enhanced contrast. 3D-SIM recovers missing high spatial frequency information by illuminating the specimen with a fine sinusoidal pattern and computationally reconstructing an image with twice the resolution (Gustafsson et al., 2008; Schermelleh et al., 2008). 3D-SIM has gained increasing popularity, not least due to its compatibility with standard fluorophores and labelling protocols commonly used in widefield or confocal microscopy.

Over the past decade, several off-the-shelf 3D-SIM instruments have become commercially available and have been used successfully for various biological applications. Examples include the OMX, based on the original design by John Sedat (Dobbie et al., 2011), and a bespoke implementation for correlative 3D soft X-ray cryo-imaging on a synchrotron beamline (Kounatidis et al., 2020). However, they all share similar limitations that restrict their usage to thin specimens on inverted setups. Optical aberrations are one of the main contributors to reconstruction artefacts in 3D-SIM imaging (Demmerle et al., 2017). These include spherical aberrations induced by refractive index (RI) mismatches between the immersion media, mounting media, and sample, as well as sample-induced aberrations caused by RI inhomogeneities within the sample itself. The effect of these aberrations increases with imaging depth, making it difficult to obtain good 3D-SIM images at depths greater than around 10 µm. It is also difficult to capture good images over larger z-ranges in most specimens when using oil-immersion objective lenses with high numerical aperture (NA) and limited working distance. Silicone oil and water immersion, or water-dipping, objective lenses have lower NA but longer working distances and reduced spherical aberration, because their immersion media better match the RI of biological tissues. Currently, there are no upright off-the-shelf 3D-SIM systems that use water-dipping objective lenses, because such lenses are intended for imaging at depths far greater than 10 µm and because building 3D-SIM optics in an upright configuration presents complex engineering challenges. However, upright microscopy is essential for certain biological experiments, so 3D-SIM has remained inapplicable to a variety of biological questions. For example, live imaging without a cover slip, with access to the sample for manipulations such as microinjection and electrophysiology, has not been done with 3D-SIM.

 One way to overcome the limitations described above is to incorporate adaptive optics (AO), using a deformable mirror (DM) in the optical path of a 3D-SIM system. AO works by changing the shape of the light’s wavefronts, which are surfaces formed by points of equal phase, as the light wave propagates through space. AO can enable the correction of spherical and sample-induced aberrations and allow the rapid movement of the focus axially for volume imaging in a purely optical way (remote focusing), without the need to move the specimen or the objective lens. These functionalities have been described as prototype methods for widefield (Kam et al., 2007), 3D-SIM (Lin et al., 2021; Thomas et al., 2015; Žurauskas et al., 2019), and multiphoton imaging (Žurauskas et al., 2017). However, no off-the-shelf 3D-SIM instruments are designed to allow the inclusion of AO. Moreover, AO hardware and software controls are currently not available as simple modules that can be added to the optical path of commercial systems. While some exciting bespoke-built AO-SIM systems have previously been described (Li et al., 2017; Lin et al., 2021; Thomas et al., 2015; Turcotte et al., 2019), these proof-of-principle prototypes use AO for aberration correction only, and they are not suitable for applications requiring an upright optical configuration with multiple channels. To our knowledge, although DM-driven remote focusing has been used in simulation (Kner et al., 2011) and confocal (Poland et al., 2008) or multiphoton imaging (Žurauskas et al., 2017), it has not been implemented in any 3D-SIM systems. Image scanning methods have been developed to enhance the spatial resolution in imaging of thick samples using multi-point scanning (York et al., 2013; York et al., 2012), multiphoton excitation (Ingaramo et al., 2014), or AO (Zheng et al., 2017). 
These methods use physical or digital pinholes together with photon reassignment to improve the resolution by a factor of 2 or greater when deconvolution is applied. 3D-SIM achieves higher resolution, but manages out-of-focus background less efficiently than image scanning methods. The main reasons for this are the retention of the shot noise of the background light after reconstruction and the reduced modulation contrast of the structured illumination (SI), both of which can lead to reconstruction artefacts. These problems worsen with imaging depth and fluorophore density.

Here, we describe our bespoke upright Deep3DSIM prototype system with integrated AO, enabling deep imaging in live wholemount biological specimens with direct access for their manipulation (Figure 1). The system is based around a 60×/1.1 NA water-immersion objective lens, with a correction collar that allows its use in a water-dipping configuration without a cover slip. We demonstrate high-quality 3D-SIM with nearly twofold spatial resolution extension in three dimensions at depths ranging from a few micrometres to 130 µm. We use AO not only for sample-induced aberration correction in deep tissue imaging but also for remote focusing. The latter enables fast transitions of the imaging plane and prevents the pressure waves caused by moving the specimen or the objective lens along the optical axis. Our novel approach and instrument design enable simultaneous multichannel imaging in conventional and super-resolution modes. The control of Deep3DSIM is based on our previously published, user-friendly, open-source Python software Cockpit (see Methods and materials), which controls the high-performance hardware devices required for 3D-SIM while achieving highly accurate and precise timing at fast rates. We first test the system's performance using beads and cells in culture. We then apply the full range of novel imaging modalities available on Deep3DSIM to a wide range of specimen types, from mammalian tissue culture cells to Drosophila larval brains and embryos. Finally, although our use of AO for aberration correction and remote focusing is demonstrated on an upright microscope, the principles we have established apply equally to an inverted 3D-SIM system, where AO would likewise provide considerably extended imaging depth and rapid acquisition of 3D volumes without moving the specimen or the objective lens.

Figure 1. Simplified overview of the Deep3DSIM setup.

Figure 1.

(A) Optical arrangement of Deep3DSIM and the conceptual use of micromanipulators for applications such as microinjection and electrophysiology. The excitation light hits the spatial light modulator (SLM), which is conjugated to the object/image plane, and then reflects off a deformable mirror (DM) before being focused on the sample with an objective lens. In the imaging path, the reflected fluorescence is coupled off via a dichroic beam splitter and then collected by separate cameras for each channel. (B) Different imaging modes are enabled in parallel on Deep3DSIM. Deep imaging without adaptive optics (AO) usually leads to an aberrated point spread function (PSF) due to refractive index (RI) mismatch and sample inhomogeneity. Sensorless AO correction compensates for sample-induced aberrations. Remote focusing enables fast focusing at various depths without moving the specimen/objective. Combination of AO and remote focusing produces aberration-corrected imaging at different depths, without mechanical movement. (C) Deep3DSIM is controlled by Cockpit, which consists of three packages: python-microscope is responsible for the control of all hardware devices; microscope-cockpit provides a user-friendly GUI; microscope-aotools provides the AO functionality. The system uses a real-time controller, in this case, the Red Pitaya STEMlab 125–14, to coordinate different instruments for image acquisition with structured or widefield illumination.

Results

To assess the baseline performance of our Deep3DSIM system, we first imaged green fluorescent latex microspheres (‘beads’) with an average diameter of 100 nm, attached to a glass cover slip. We used the objective lens in a water-immersion configuration, imaging through the cover slip, to make the assessment comparable to common experimental routines. 3D images of the beads were used to estimate the point spread function (PSF) of the system in widefield (WF) and 3D-SIM modalities; the PSF provides a complete characterisation of the optical system. We fitted the intensities of these images to Gaussian curves, both laterally and axially, and then used the full width at half-maximum (FWHM) of the Gaussian curves to estimate the lateral and axial resolution. We repeated these calculations for multiple beads for each modality (Figure 2A). The mean lateral resolution was 185 nm (standard deviation (SD) = 13 nm) for 3D-SIM and 333 nm (SD = 9 nm) for widefield, whereas the mean axial resolution was 547 nm (SD = 34 nm) and 893 nm (SD = 41 nm), respectively. Thus, Deep3DSIM produced a resolution improvement that approaches, but does not quite reach, the theoretically possible doubling of the widefield resolution. This limitation is partly due to our choice of conservative line widths for the SI patterns (see Appendix 1 for details).
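The FWHM estimation above can be sketched in a few lines. This is an illustrative stand-in, not the analysis code used for the paper: a proper least-squares Gaussian fit was presumably used there, whereas here a simple moment-based estimate of the Gaussian width takes its place. The relation FWHM = 2√(2 ln 2)·σ ≈ 2.355·σ is standard for a Gaussian.

```python
import math

def fwhm_from_sigma(sigma_nm: float) -> float:
    """FWHM of a Gaussian with standard deviation sigma: 2*sqrt(2*ln 2)*sigma."""
    return 2.0 * math.sqrt(2.0 * math.log(2.0)) * sigma_nm  # ~= 2.355 * sigma

def fit_gaussian_fwhm(profile, pixel_nm):
    """Estimate the FWHM (nm) of a 1D bead intensity profile.

    The Gaussian width is taken from the second moment of the
    background-subtracted profile (a least-squares fit would be used
    in practice; this moment estimate stands in for it here).
    """
    baseline = min(profile)
    weights = [v - baseline for v in profile]
    total = sum(weights)
    mean = sum(i * w for i, w in enumerate(weights)) / total
    var = sum(w * (i - mean) ** 2 for i, w in enumerate(weights)) / total
    return fwhm_from_sigma(math.sqrt(var) * pixel_nm)
```

For example, a profile sampled at 40 nm pixels with an underlying σ of 80 nm should yield an FWHM close to 188 nm, in the same way the bead images yield the 185 nm and 333 nm figures quoted above.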

Figure 2. Resolution estimations of the Deep3DSIM system.

Figure 2.

(A) Full width at half-maximum (FWHM) measurements of 100 nm green fluorescent beads, deposited on a glass cover slip and imaged in water-immersion configuration. The images show lateral (XY) and axial (XZ) views of one bead from each sample, indicated with a cross on the distribution plots. SD: standard deviation. Scale bar: 1 µm. (B) Frequency space analysis of images of fixed COS-7 cells, with labelled microtubules in the green channel and endoplasmic reticulum in the red channel. The images were acquired in water-immersion configuration, imaging through a cover slip. Scale bars: 10 µm. The XY images are average projections along Z, and the XZ images are orthogonal projections along the middle of the Y axis. The power spectra are in logarithmic scale and centred on the zero frequency; they show a lateral kxy view (top panel) and an axial kxz view (bottom panel) of the 3D DFT, both taken along the middle of the corresponding third axis. Scale bar: 3 µm⁻¹. The power spectra were thresholded to remove noise. The power plots show the attenuation of the frequency response with increasing resolution. The lateral power plot was created by radial averaging of the frequency amplitudes in the kxy view, while the axial plot shows the frequency amplitudes of the kxz view after averaging the two directions 0 → ±kz and then averaging along the kx axis. The resolution thresholds (dashed lines) were chosen as the points at which the normalised logarithmic power reached 0.01 (i.e. 1%). The resolution thresholds were converted to the spatial domain by taking the inverse of the spatial frequency.

3D-SIM, like other super-resolution methods, is sensitive to even the low levels of system and sample-induced aberrations typically encountered in thin samples, such as single cells in culture. Such minor aberrations are often overlooked in conventional microscopy modalities such as widefield and confocal (Wang and Zhang, 2021). Therefore, in addition to the FWHM measurements of beads, we also estimated the resolution enhancement by 3D-SIM in mammalian tissue culture COS-7 cells in two channels. We used immunofluorescence to label microtubules with an antibody conjugated to Alexa Fluor (AF) 488 (Figure 2B, left) and, in another example, the endoplasmic reticulum with antibodies against Rtn4 conjugated to AF 555 (Figure 2B, right). Both structures consist of tubules that are below the diffraction limit and the resolving power of the objective lens, which makes them highly suitable for testing the performance of super-resolution microscopy. We compared the 3D-SIM images to their pseudo-widefield (PWF) equivalents, which were obtained using SIMcheck (Ball et al., 2015) by averaging the raw SI images. Compared to the corresponding PWF images, the optical sectioning capability of 3D-SIM was easily noticeable from the large difference in out-of-focus signal, especially in the cross-section views. We estimated the resolution by analysis in frequency space, finding the support of the optical transfer function (OTF). The OTF is the Fourier transform of the PSF and, like it, provides a model of the optical system. We used the contours of the OTF support (dashed lines in Figure 2B) as estimates of the resolution. In the green channel, the resolutions were 356 nm and 190 nm laterally, and 1009 nm and 568 nm axially, for PWF and 3D-SIM, respectively. These resolution values were close to, and agreed with, the FWHM values measured from beads (Figure 2A).
The performance of the system in the red channel was similar, but the spatial frequency response was attenuated because of the longer wavelength. Here, the resolution values estimated from the power plots in PWF and SIM modes were 386 nm and 213 nm laterally, and 1276 nm and 644 nm axially, respectively. We performed a similar analysis for most of the subsequent experiments (supplements to Figure 3, Figure 4 and Figure 5).
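The frequency-space recipe described above — normalise the logarithmic power spectrum, radially average it, find where it falls below 1%, and invert that spatial frequency — can be sketched in a simplified 2D analogue (the paper works on the 3D DFT; the function name, the 2D reduction, and the synthetic test below are ours):

```python
import numpy as np

def lateral_resolution_nm(image, pixel_nm, threshold=0.01):
    """Estimate lateral resolution from the radially averaged power spectrum.

    2D analogue of the analysis in the paper: the normalised logarithmic
    power is followed outward from the zero frequency until it first
    drops below `threshold` (1%); the inverse of that spatial frequency
    is returned as the resolution.
    """
    n = image.shape[0]  # assume a square image
    power = np.abs(np.fft.fftshift(np.fft.fft2(image))) ** 2
    log_power = np.log10(power + 1e-12)
    log_power = (log_power - log_power.min()) / (log_power.max() - log_power.min())
    # radially average the normalised log power around the zero frequency
    yy, xx = np.indices(power.shape)
    r = np.hypot(yy - n // 2, xx - n // 2).astype(int)
    radial = np.bincount(r.ravel(), log_power.ravel()) / np.bincount(r.ravel())
    below = np.where(radial[1:] < threshold)[0] + 1  # skip the DC bin
    k_cut = below[0] if below.size else n // 2       # cycles per image width
    return (n * pixel_nm) / k_cut                    # nm per cycle
```

On synthetic data, a more strongly blurred image yields a larger (worse) resolution estimate, which is the behaviour exploited in Figure 2B to compare PWF and 3D-SIM.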

Figure 3. Adaptive optics (AO) correction removes artefacts and improves the contrast and resolution of 3D structured illumination microscopy (3D-SIM) in tissue imaging.

(A, B) 3D-SIM images of neuromuscular junction in a fixed L3 Drosophila larva expressing Nrx-IV::GFP (green) and membrane labelled with anti-HRP antibodies (magenta) acquired without (A) and with AO (B), respectively. The data was acquired in water-immersion mode, imaging through a cover slip. The XZ views on the left are maximum projections along the Y axis. The boxed regions are shown scaled up on the right, with their own orthogonal XZ views along the middle of the Y axis. The arrow in (A) shows an example of ghosting artefacts in the red channel. The scale bars are 10 μm. (C) Intensity profiles along the dashed lines in the XZ views on the right of (A) and (B).

Figure 3.

Figure 3—figure supplement 1. Frequency-space analysis and aberration correction.

Figure 3—figure supplement 1.

(A) Power spectra of the green (Nrx-IV) and red (HRP) channels, and their associated plots. Scale bar: 3 µm⁻¹. (B) Example of aberration correction mode amplitudes for this type of sample.

Figure 4. 3D structured illumination microscopy (3D-SIM) in deep tissue samples enabled by adaptive optics (AO) aberration correction.

(A) 3D-SIM image stack of fixed Drosophila L3 larval brain expressing Cno::YFP and imaged from top downwards at a depth of ~3 μm from the cover slip surface without AO (left) and with AO (right). Both the lateral views (XY) and orthogonal views (XZ) show maximum projections. Scale bar: 10 μm. (B) 3D-SIM images of the same fixed Drosophila L3 larval brain, acquired through the entire volume of a single brain lobe at ~130 μm from the top surface, with 100 nm diameter red fluorescent beads attached to the surface of the glass slide. Images are displayed as in (A) but magnified. The insets show further 3× magnification of the regions indicated with dashed lines. Scale bar: 10 μm. (C) Intensity profiles of the red channel along the lines in the insets. (D) Schematic illustration of the specimen mounting and imaging. The larval brain was mounted in PBS between a glass cover slip (top) and a glass slide (bottom); the slide was coated with red fluorescent beads at medium density. Image stacks in (A) and (B) were recorded at the proximal (P) and distal (D) sites indicated with arrowheads on the right, respectively.

Figure 4.

Figure 4—figure supplement 1. Frequency-space analysis and aberration correction.

Figure 4—figure supplement 1.

(A) Power spectra and plots for the proximal site images. Scale bar: 3 μm⁻¹. (B) Power spectra and plots for the distal site images. The left column is the green channel, the right column the red channel. The lateral resolution is not improved with adaptive optics (AO) in the green channel, because the prominent reconstruction artefacts in bypass (BP) mode misrepresent the actual optical performance of the system. Scale bar: 3 μm⁻¹. (C) Example of aberration correction mode amplitudes obtained at the distal site.

Figure 5. Remote focusing for fast live 3D structured illumination microscopy (3D-SIM) time-lapse imaging and for large volume imaging using multi-position aberration correction (MPAC).

(A) Mitotic embryonic divisions in Drosophila syncytial blastoderm embryos, expressing transgenic Jupiter::GFP labelling microtubules (green) and transgenic Histone H2A::RFP labelling chromosomes (magenta). The imaging was carried out in water-dipping configuration, without the use of a cover slip. Images show maximum projection of volumes with about 1 µm thickness. Brightness and contrast of each image were adjusted independently. Scale bar: 10 µm. (B) COS-7 cell in metaphase with microtubules immunostained with AF 488, visualising the mitotic spindle. The volumes were acquired with remote focusing, imaging through a cover slip in water-immersion configuration. The lateral (XY) views are maximum projections. Aberration correction was performed at two imaging planes, indicated by arrows in the adaptive optics (AO) YZ view; dynamic correction was applied for all other planes. Insets show magnifications of the regions indicated with dashed boxes. Scale bars: 10 μm in the lateral bypass (BP) view and 0.5 μm in the insets. (C) Intensity plots in the lateral and axial directions, along the dashed lines in the insets, showing increased resolution in the AO (MPAC) case.

Figure 5.

Figure 5—figure supplement 1. Multi-position aberration correction (MPAC) mode amplitudes.

Figure 5—figure supplement 1.

The two correction planes, used for the MPAC, are indicated with arrows in the inset image. The scale bar is 5 μm.

Deep3DSIM is an upright system with a long working distance objective lens, in contrast to commercial 3D-SIM systems, which are built around inverted microscopes and are usually restricted to imaging thin samples, such as cultured cells. Deep3DSIM is therefore particularly suitable for tissue-based imaging while manipulating the specimen with methods such as electrophysiology. We evaluated these unique features of the instrument by imaging the Drosophila neuromuscular junction (NMJ), a sample well suited for electrophysiological experiments. We prepared fixed NMJ samples by dissecting open the larvae, removing the guts, and pinning out the muscle layer with its associated motor neuron system to produce a so-called NMJ fillet preparation. Such samples typically exhibit substantial RI inhomogeneity and mismatch, and they often require imaging deep into the specimen to examine specific sites of muscle innervation. These conditions led to significant optical aberrations, further accentuated by our use of a glycerol-based mounting medium (Vectashield). We used 3D-SIM with AO to image the dissected larva fillet preparation from a transgenic line expressing a bright GFP protein trap of the cell-cell ‘septate’ junction protein Neurexin IV (Nrx-IV), at a depth of 5–15 μm (Figure 3). Our implementation of AO did not use a dedicated wavefront sensor, instead relying on an indirect, sensorless approach. As a control, we imaged in a bypass (BP) mode, in which the DM was exchanged for a flat mirror, to compare imaging with and without AO. The images of the AO-corrected volume showed significant improvements in both intensity and contrast, compared to the images of the uncorrected volume (Figure 3C). The improvement was especially noticeable in the XZ views, as the major effect of aberrations is to distort the PSF along the optical axis.
This was most notable for spherical aberrations, which elongate the focus along the optical axis.
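The indirect, sensorless approach mentioned above can be illustrated as a modal optimisation: each aberration mode is biased by a few known amounts, an image-quality metric is measured at each bias, and the optimum is taken from a parabolic fit. The sketch below is schematic only — `apply_modes` and `measure_metric` are hypothetical stand-ins for the DM and camera interfaces, and the real routine (implemented in microscope-aotools) differs in its details and choice of metric.

```python
import numpy as np

def parabolic_peak(biases, metrics):
    """Vertex of a least-squares parabola through (bias, metric) samples.

    Assumes the metric has a maximum (negative curvature) near the samples.
    """
    a, b, _ = np.polyfit(biases, metrics, 2)
    return -b / (2.0 * a)

def sensorless_correct(apply_modes, measure_metric, n_modes,
                       biases=(-1.0, 0.0, 1.0)):
    """Sensorless AO: optimise one mode at a time without a wavefront sensor.

    apply_modes(amplitudes) shapes the DM; measure_metric() returns an
    image-quality score. Both are placeholders for real hardware calls.
    """
    correction = np.zeros(n_modes)
    for i in range(n_modes):
        metrics = []
        for b in biases:
            trial = correction.copy()
            trial[i] += b                 # bias this mode by a known amount
            apply_modes(trial)
            metrics.append(measure_metric())
        correction[i] += parabolic_peak(np.array(biases), np.array(metrics))
        apply_modes(correction)           # leave the best shape applied
    return correction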

To further test the microscope’s performance at greater depth, we imaged dissected and subsequently formaldehyde-fixed whole Drosophila larval brains. We imaged both the top and bottom layers of the thickest part of the brain lobes, mounted between a standard microscope cover slip (#1.5) and a slide, providing a total imaging depth of ~130 μm (Figure 4C). To help visualise the effects of aberrations and to evaluate the ability of AO to correct them at depth, we placed 100 nm diameter red fluorescent beads (580/605 nm) on the glass slide underneath the tissue (Figure 4D). We acquired two image stacks centred on two sites – one proximal site ~3 μm away from the cover slip and another distal site ~130 μm away from it, which required focusing through the entire volume of the mounted brain (Figure 4D). The larvae used to prepare the samples came from a fly line with a YFP protein trap for canoe (cno), which encodes a protein expressed in the adherens junctions of the monolayer of neuroepithelial stem cells in the outer proliferation centres of the optic lobes. The protein expression marked the outlines of the cells and showed the characteristic geometrical mosaic pattern that emerges from the packing of epithelial cells, covering large parts of each brain lobe and thus providing a well-defined structure that could be imaged at various depths. We found that AO aberration correction made a small but distinct improvement to the contrast in the images of the proximal side of the brain lobe (Figure 4A). On the distal side, control images bypassing the AO suffered from severe aberration-induced reconstruction artefacts, with the cell outlines not resolved as continuous structures (Figure 4B, left). In contrast, imaging at the same depth with AO correction enabled 3D-SIM reconstructions with reduced artefacts and enhanced contrast, in which large areas of the cell outlines could be visualised (Figure 4B, right).
The effect of aberrations and their correction is most evident in the red channel: with AO, the images of the individual beads closely mimic the shape of the system’s PSF, whereas without AO they appear as large, unrecognisable fuzzy patches instead of sharp spots. While the AO-corrected images from the distal site showed minor residual aberrations (see Discussion for details) and reconstruction artefacts (see Methods and materials for further information), the overall PSF shapes were still dramatically improved with AO compared to the equivalent without AO. Moreover, the peak intensity and signal-to-noise ratio (SNR) with AO were also significantly higher than without AO, and the lateral FWHM resolution was preserved even at this depth (Figure 4C).

A key feature of Deep3DSIM is the ability to perform remote focusing. However, the reconstruction process of 3D-SIM places stringent demands on image quality and on acquisition parameters, such as the accuracy and precision of the Z steps in the acquisition of Z-stacks. We therefore set out to show that remote focusing can meet the strict requirements of 3D-SIM reconstruction, and that it can be used as a direct replacement for a mechanical Z stage over a moderate range. We demonstrate this with two examples, starting with live imaging. Living specimens present several challenges for 3D-SIM imaging on their own (Gao et al., 2012). Particularly important are the acquisition speed, which can lead to motion artefacts if not sufficiently high, and photobleaching and phototoxicity, which occur when the rate of exposure and the intensity of the excitation light are too high. The Deep3DSIM approach has the potential to resolve both of these challenges by using remote focusing for fast image acquisition while preserving the high image quality of the system. To show this, we imaged Drosophila embryos undergoing rapid mitosis, where each division cycle lasts just a few minutes (Figure 5A). We collected eggs from transgenic animals expressing the microtubule-binding protein Jupiter fused to GFP and the histone H2A protein fused to RFP. The Jupiter::GFP H2A::RFP eggs were dechorionated and mounted in aqueous solution in order to image chromosomes and microtubules simultaneously during mitotic divisions in the early syncytial blastoderm stages. Using the objective lens in a water-dipping configuration, we applied remote focusing to repeatedly acquire two-colour image volumes at a depth of approximately 5 μm under the surface of the egg. We performed this imaging at a rate of 20 fps, equivalent to about 10 s per 3D volume, at room temperature for several minutes.
The image stack of each volume required 210 raw images (5 phases × 3 angles × 2 channels × 7 sections) for the 3D-SIM reconstruction. We selected four time points within the first two minutes of the time series (Figure 5A), showing the synchronous transition from metaphase to telophase. Despite minor reconstruction artefacts (see Methods and materials for details), these results demonstrate that the remote focusing functionality of the system can be successfully applied for live 3D-SIM experiments, allowing four-dimensional acquisition while keeping the specimen stationary, thus avoiding the usual agitation and perturbations associated with mechanical actuation.
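As a sanity check on the acquisition timing quoted above, the frame budget works out as follows (the 20 fps rate and the phase/angle/channel/section counts are those stated in the text):

```python
# Raw-frame budget for one live two-colour 3D-SIM volume.
phases, angles, channels, sections = 5, 3, 2, 7
raw_frames = phases * angles * channels * sections  # raw images per volume
fps = 20                                            # acquisition rate (from text)
volume_time_s = raw_frames / fps
print(raw_frames, volume_time_s)  # 210 frames, 10.5 s, i.e. about 10 s per volume
```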

In the second example, we combined remote focusing with an advanced form of aberration correction. Thick biological samples, and even adherent cells in culture, can induce optical aberrations of variable type and magnitude throughout the volume of the sample, owing to RI inhomogeneities. Aberration correction methods can, in principle, be applied at every imaging plane in the volume; however, this is a time-consuming process that can lead to unnecessary photobleaching and phototoxicity. A faster method is to perform the correction routine at a few planes and then to estimate the correction modes for all other planes, an approach that we call ‘multi-position aberration correction’ (MPAC). We demonstrate MPAC in its simplest form, a linear estimation from two planes, which we found to work well, partly because some of the dominant aberrations, such as spherical aberration, scale approximately linearly with imaging depth. By combining MPAC with remote focusing, we were able to demonstrate the full capability of Deep3DSIM’s AO over a large Z range of more than 16 μm (129 sections, 125 nm apart). We imaged fixed COS-7 cells in culture, stained for microtubules, to visualise the spindles during mitosis (Figure 5B, C). We measured the aberrations at two distinct positions, marked with arrows in Figure 5B, and then estimated the corrections at all other Z positions within the volume. Our results show conclusively that remote focusing with MPAC produced images with improved lateral and axial resolution compared to those without AO, as evident in both the images (Figure 5B) and the intensity profiles (Figure 5C). We conclude that MPAC-based estimation, correcting aberrations across an entire imaging volume, together with remote focusing offers significant advantages for 3D-SIM imaging in an upright configuration.
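In its simplest two-plane form, the MPAC estimate is just a linear inter-/extrapolation of the measured mode amplitudes against depth. A minimal sketch (the function name and the numbers in the example are illustrative, not taken from the instrument software):

```python
import numpy as np

def mpac_estimate(z, z1, modes1, z2, modes2):
    """Estimate aberration-correction mode amplitudes at depth z.

    Linear inter-/extrapolation between corrections measured at two
    planes z1 and z2 — the simplest form of multi-position aberration
    correction (MPAC). Works because dominant aberrations such as
    spherical scale roughly linearly with imaging depth.
    """
    modes1 = np.asarray(modes1, dtype=float)
    modes2 = np.asarray(modes2, dtype=float)
    t = (z - z1) / (z2 - z1)
    return (1.0 - t) * modes1 + t * modes2
```

For example, with corrections measured at 0 and 10 µm, the estimate at 5 µm is the average of the two mode vectors, and the estimate at 20 µm continues the linear trend, so depth-proportional modes keep growing beyond the measured planes.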

Discussion

We have demonstrated that the design of Deep3DSIM, a prototype upright super-resolution microscope with integrated AO, enables effective 3D super-resolution imaging within thick, complex tissues to a depth of at least 130 µm. Its modular design and Python-based open-source software for both system control and the GUI (see details in Methods and materials) make it comparatively user-friendly and versatile. These design concepts allow the adaptation of the instrument to suit specific research applications and enable protocols for experiments at scale. The upright optical design, combined with aberration correction and remote focusing, makes the instrument particularly suited for applications that require super-resolution volumetric imaging of live (or fixed) thick tissue, organs, or whole organisms. Deep3DSIM is also uniquely positioned for imaging whole samples in expansion microscopy without mechanical sectioning, which is otherwise a challenge because of the softness and fragility of commonly used hydrogels (Cahoon et al., 2017). Importantly, the design is compatible with specimen manipulation, such as microinjection, electrophysiology measurements, or liquid handling and exchange, while imaging experiments are in progress. In this study, we have established a new paradigm for 3D-SIM: imaging at depth while acquiring volumes without moving the specimen or the objective lens, and while varying the aberration corrections across the acquired volume in a depth-specific manner. This general principle can now be applied to any SIM configuration, including an inverted one, which can make use of the highest possible NA objective lenses to obtain volumetric SIM datasets at depth with minimal artefacts and maximum possible spatial resolution.

 Current commercially available 3D-SIM instruments are mostly based around inverted setups and high NA oil immersion objectives that are susceptible to aberrations when focusing into any specimens with RI mismatch. Such systems frequently only produce good images up to depths of a few µm. Silicone immersion objectives can improve the situation by allowing high-quality SIM up to a depth of 20 µm, despite having a slightly lower NA (Ogushi et al., 2021; Richter et al., 2019). In all cases, objective correction collars can only effectively minimise spherical aberrations for SIM imaging at a single depth in a single channel. For multi-colour imaging, the RI of the mounting medium and the immersion medium, and the objective lens’ correction collar setting, must be carefully tuned together to match the optimal wavelength-specific OTFs. Frequently, a compromise is made across different channels and imaging depths to accommodate multiple colours (Demmerle et al., 2017; Wang et al., 2018). Therefore, for most practical purposes, these microscopes can only effectively be applied to thin single layers of tissue culture cells on microscope slides or only to the superficial surface of complex tissues. Furthermore, they are not readily adapted to imaging regimes which require the specimen and the objective to remain motionless while imaging. Hence, Deep3DSIM fills an important application gap for deep imaging in live multi-cellular tissue, as well as fixed material. The method creates new possibilities for fast live super-resolution imaging while carrying out specimen manipulations. Such manipulations include micro-injection into Drosophila embryos and electrophysiology measurements on living mouse brain sections or Drosophila NMJ preparations.

An important consideration in 3D-SIM reconstruction is the precise configuration of the algorithm, such as its filtering parameters. We chose to use a well-established reconstruction method based on Wiener filtering, as implemented in the softWoRx software suite, without advanced pre-processing of the image data for reduction of artefacts. This design decision enabled us to demonstrate the operating principles of Deep3DSIM and to adopt existing tools, such as SIMcheck for quality control of the raw and reconstructed data, and the overall characterisation of the imaging system. However, recently, several novel methods for 3D-SIM reconstruction have been developed (Cai et al., 2022; Cao et al., 2023). Such open-source software implementations should be compatible with Deep3DSIM, potentially achieving higher fidelity reconstructed data with reduced artefacts.

The resolution achieved with Deep3DSIM was 186 nm FWHM in the green channel using a water-based objective lens with an NA of 1.1. In comparison, inverted commercial 3D-SIM instruments routinely achieve ~120 nm FWHM resolution at the same wavelength using a high-NA oil immersion objective. While the Deep3DSIM resolution is, therefore, lower in absolute terms, this was a design trade-off chosen to make the microscope compatible with live imaging in an upright configuration and with a large working distance. Hence, it is more pertinent to compare the super-resolution of Deep3DSIM to that achievable for live cell imaging in widefield mode, at the same wavelength and with the same 1.1 NA water-based objective lens, which was 333 nm FWHM. Nevertheless, the fundamental design of Deep3DSIM could be readily adapted, if necessary, to work in upright or inverted mode with other high-NA objective lenses, for example, oil or silicone immersion lenses, which would still fully benefit from the use of AO for aberration correction and motionless deep imaging. In these cases, the working distance would be compromised, making it much more difficult to manipulate the sample, for example, by microinjection or with electrophysiology tools. Although the design of Deep3DSIM was mostly optimised for upright microscopy with tissue specimens, it can also benefit other types of microscopy and applications. Notably, CryoSIM (Kounatidis et al., 2020) would hugely benefit from the inclusion of AO to correct for spherical aberrations induced by increased RI mismatch and instabilities at cryogenic temperatures, together with sample-induced aberrations, particularly with non-flat overlying ice at the specimen-air interface.

Our AO methods provide only one average correction for the entire field of view (FoV). However, aberrations can differ across the FoV, and areas of higher image metric may bias the correction to better correct the aberrations contained within those areas. In practice, we found that field-dependent aberrations are usually not a problem for modest FoVs, such as the one in Deep3DSIM with a maximum diameter of about 96 μm. Nevertheless, we did experience residual aberrations when we imaged through the complex tissue of an entire Drosophila larval brain, approximately 130 μm deep, as measured with beads (Figure 4B). Zonal aberration correction can help with the compensation of field-dependent aberrations by splitting the FoV into multiple zones and then correcting for aberrations zone by zone, but it requires another level of instrumentation and control methods (Rajaeipour et al., 2020). Although the current Deep3DSIM design is not intended for this type of correction, it could be added to the instrument if required. Furthermore, the spatial light modulator (SLM), used for the generation of the SI, could also be used to compensate for field-dependent aberrations of the SI in the imaging plane (Gong and Scherer, 2023), in addition to the correction already provided by the DM.

The range of remote focusing shown in this work was limited to about ±8 µm, which is smaller than some previous demonstrations. There are two reasons why we did not use wider ranges in our results. First, the DM can only reliably produce PSFs with minimal aberration repeatedly up to a range of about ±10 µm (see Appendix 2). Second, for 3D-SIM acquisition, ±5 µm is already a wide and commonly used Z range. Especially in multi-colour imaging, a rather large number of images, about 1200 frames (3 angles × 5 phases × 80 Z steps) per channel, would be required. While larger volume acquisition is possible, it would lead to considerable photobleaching and phototoxicity, as well as longer acquisition and reconstruction times. Nevertheless, a similar DM device has been used to achieve an impressive range of remote focusing (Cui et al., 2021). Although their approach is different from the high-NA super-resolution imaging presented here, a stable DM should increase the practical range of our remote focusing approach twofold or even greater. In terms of speed, refocusing with the DM instead of using the piezo stage reduced the duration of the image acquisition significantly, because the settling time of the surface of the DM was around an order of magnitude faster than the settling time of the piezo stage, which highlights the potential of AO-based refocusing for fast imaging (see Appendix 3 for further details).

There are various routes through which the system performance could be enhanced in future versions of Deep3DSIM. First, the speed of acquisition in the reported demonstration was limited by a few key system components, which could be improved in future developments. A key limiting factor in widefield mode was the EM-CCD camera. The best overall performance in terms of SI modulation was achieved when using conventional CCD modes. However, due to the data transfer time (3 MHz pixel readout rate), the frame rate was limited to about 100 fps for a reasonably sized FoV. This acquisition speed could be increased by using modern sCMOS cameras. Second, in SIM experiments, the imaging speed was also limited by the update rate of the SLM, determined by the settling time of the nematic liquid crystal technology (>20 ms), giving a maximum raw-data frame rate of 50 fps. Fast SLMs using ferroelectric liquid crystal materials have been demonstrated for high-speed SIM imaging (Lin et al., 2021), though it is more challenging to use such SLMs for multi-colour imaging, as the phase modulation is binary and hence can only be optimised for discrete wavelengths. Third, future improvements could also include the incorporation of denoising algorithms into the reconstruction process to allow lower light dosage, which would enable faster imaging, reduced photobleaching, and reduced specimen photodamage (Huang et al., 2018; Smith et al., 2021). Finally, the Deep3DSIM prototype was custom-built on a large optical table, instead of basing the design around a commercial upright microscope stand, because the latter would have imposed unnecessary restrictions at this prototyping stage. In future, the optical path of Deep3DSIM could be simplified and reduced in size. Such a future design would make it easier to adopt or commercialise and to modify Deep3DSIM for bespoke applications.

Materials and methods

Microscope

We based the Deep3DSIM system on our previous system, Cryo-SIM (Phillips et al., 2020), which was developed from earlier work in John Sedat’s group at UCSF (Dobbie et al., 2011; Gustafsson et al., 2008). The microscope body was built in an upright configuration (Figure 1) around a bespoke-engineered scaffolding arch, which we designed to ensure sufficient mechanical stability for super-resolution microscopy while accommodating a pair of micromanipulators for electrophysiological experiments. We used a water-immersion objective lens (Olympus LUMFLN60XW), capable of working in a water-dipping configuration by adjusting a correction collar, because its combination of high NA and long working distance made it most suitable for deep imaging of tissues, particularly Drosophila brain and NMJ, and for electrophysiological experiments. More details about the optical setup can be found in Appendix 4.

Structured illumination

We used a nematic liquid-crystal SLM (Boulder Nonlinear Systems HSP512-488-800) to create SI with optimal contrast on the sample (modulation contrast). The SLM was positioned in the sample-conjugated plane and linear sinusoidal patterns were generated for three orientation angles. The ±1st order and zero-order beams were used to interfere at the sample plane to generate the 3D SI. To achieve optimal interference contrast, the linear polarisation of the excitation beam from the SLM was rotated using a liquid crystal device (Meadowlarks Optics LPR-100-λ) to maintain S polarisation at the incident plane at the back focal plane (BFP) of the objective lens for each SI orientation. For widefield acquisitions, the SLM was switched off and used as a single reflector so that only the zero-order beam provided epi-illumination, while for SIM acquisition, the SLM was switched on and it was preloaded with the patterns for the specific wavelength and line width. During image acquisition, the SI phase and angle were rotated in a fixed sequence, synchronised with the polarisation rotators, the light sources, and the cameras.
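As an illustration, the generation of such phase-stepped sinusoidal patterns for each SI angle can be sketched as follows. This is a minimal NumPy sketch, not the actual control code: the function name, SLM size, and pattern period are assumptions, and the real patterns depend on the target stripe width at the sample plane and on the SLM calibration for each wavelength.

```python
import numpy as np

def slm_stripe_patterns(shape=(512, 512), period_px=8.0,
                        n_angles=3, n_phases=5):
    """Generate sinusoidal stripe patterns for every SI angle/phase pair.

    Illustrative values only: an 8-pixel period on a 512x512 SLM.
    Returns a dict keyed by (angle_index, phase_index) with patterns
    normalised to [0, 1].
    """
    yy, xx = np.mgrid[0:shape[0], 0:shape[1]]
    patterns = {}
    for a in range(n_angles):
        theta = a * np.pi / n_angles          # three angles, 60 deg apart
        k = 2 * np.pi / period_px             # spatial frequency of the stripes
        proj = xx * np.cos(theta) + yy * np.sin(theta)
        for p in range(n_phases):
            phase = p * 2 * np.pi / n_phases  # five phase steps of 2*pi/5
            patterns[(a, p)] = 0.5 * (1 + np.sin(k * proj + phase))
    return patterns
```

In the actual system, the full set of patterns for a given wavelength and line width is preloaded onto the SLM and stepped through in a fixed, hardware-synchronised sequence.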

Deformable mirror

We used a DM (Alpao DM69-15) which was positioned between the main dichroic filter and the microscope body, conjugated to the BFP of the objective lens. The DM was used for the correction of aberrations in both the excitation and the emission paths. The DM aperture was matched to the BFP by an optical relay system. We included a pair of flip mirrors to create an optical path that bypassed the DM (dotted path in Appendix 4—figure 1), allowing easy comparison between imaging without and with AO.

Software

The system control software package was composed of three parts (Figure 1C): python-microscope for low-level hardware control of the microscope system (Susano Pinto et al., 2021), microscope-cockpit for a user interface and additional functionality such as experiments (Phillips et al., 2021), and microscope-aotools for performing DM calibration and AO aberration correction (Hall et al., 2020). All three were further developed and improved to match the requirements of Deep3DSIM. Most notably, remote focusing was added to the microscope-aotools package. All changes were tracked and are described in more detail in the respective branch repositories.

The complex orchestration and timing of experiments, carried out by Cockpit, was implemented with a specialised embedded computer (Red Pitaya STEMlab 125–14 v1.0).

For SIM data reconstruction, we used the commercial software softWoRx (Cytiva), which uses Wiener deconvolution, based on the approach by Gustafsson et al., 2008. OTFs required for reconstruction for each imaging channel were generated from experimental data acquired from a single fluorescent bead.

Multi-colour image alignments were performed in the open-source software Chromagnon (Matsuda et al., 2020). SIM characterisation was done with SIMcheck (Ball et al., 2015). General image processing was done in Fiji (Schindelin et al., 2012).

Sensorless AO aberration correction

We used standard Zernike-based modal sensorless AO, as described previously (Antonello et al., 2020; Hall, 2020). We calibrated the DM with an integrated interferometer (Appendix 4—figure 1, dashed path) to obtain an accurate Zernike mode control matrix, which we used for both aberration correction and remote focusing. The duration of the calibration routine was around 30–60 min, mostly spent on phase unwrapping and other intensive computations. The resulting control matrix could be reused for weeks at a time, before recalibration was necessary, and potentially much longer on devices which are not subject to creep effect.

Our base set of correction modes included 8 Zernike modes: #5 (primary oblique astigmatism) to #11 (primary spherical) and #22 (secondary spherical), following Noll indexing. Example correction modes and their amplitudes for each sample are given in the figure supplements of Figure 3 to Figure 5. Our aberration correction method used a standard optimisation algorithm where each mode was scanned individually by applying several different mode biases to the DM, e.g., {-1, -0.5, 0, 0.5, 1} rad, and acquiring widefield images each time. For our base set of correction modes, this scanning arrangement resulted in 40 images in total. Metric values were calculated for each image and a Gaussian curve was fitted to the amplitude-metric points. The mean of the Gaussian was used as the optimal correction for the Zernike mode. We used a Fourier-based metric which is described in Appendix 5. We observed that ISOsense (Žurauskas et al., 2019) achieved the most robust correction (data not shown) when the images did not have prominent high spatial frequency content, i.e., few or no sub-diffraction features.
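The per-mode scan described above can be sketched as follows. This is a simplified illustration rather than the actual microscope-aotools implementation: `apply_mode`, `acquire`, and `metric` are hypothetical callables standing in for the DM control, the camera, and the image-quality metric.

```python
import numpy as np
from scipy.optimize import curve_fit

def gaussian(x, a, mu, sigma, offset):
    return a * np.exp(-((x - mu) ** 2) / (2 * sigma ** 2)) + offset

def optimise_mode(apply_mode, acquire, metric,
                  biases=(-1.0, -0.5, 0.0, 0.5, 1.0)):
    """Scan one Zernike mode: apply each bias amplitude to the DM,
    acquire a widefield image, evaluate the metric, then fit a Gaussian
    to the amplitude-metric points and return its mean as the optimum."""
    values = []
    for b in biases:
        apply_mode(b)                  # set the DM to this mode amplitude
        values.append(metric(acquire()))
    # Initial guess: peak height, location of best metric, width, baseline.
    p0 = [max(values) - min(values), biases[int(np.argmax(values))],
          0.5, min(values)]
    popt, _ = curve_fit(gaussian, list(biases), values, p0=p0)
    return popt[1]                     # mu: optimal correction amplitude
```

Repeating this scan over the 8 base modes with 5 biases each gives the 40 images mentioned above.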

In our approach to aberration correction, we always corrected the system-induced aberrations first, by using a sparse bead sample and by tuning the objective lens collar. Any subsequent correction of sample-induced aberrations was done on top of this ‘system flat’ correction. When imaging, we always carried out the aberration correction routine first and then held the DM in the corrective shape during the actual image acquisition. For relatively thin volumes, e.g., 1–2 µm thickness, we mostly corrected just a single plane (usually the middle of the stack), but for thicker volumes, we developed our MPAC approach. The duration of the correction routine for a single plane was proportional to the number of modes involved, the number of scanning points for each mode, and the camera exposure configuration, but usually in the order of seconds. The earlier example of 40 images for our base set of corrections would normally take around 10 s (less than 4 s of actual light exposure). The correction was independent of the subsequent image acquisition. This independence between the two processes let us configure separately the exposure settings for the correction routine, allowing us to tune parameters, such as exposure time and laser power, in a way that minimises issues such as low contrast and noise. We further accounted for noise in our Fourier-based image metric, as described in Appendix 5.

Multi-position aberration correction

We used MPAC to compensate for the optical distortions in thick volumes. We applied our standard sensorless AO correction routine at fixed positions, e.g., at the top and the bottom of the volume, and then we did a linear fitting of the resulting correction Zernike modes. This simple approach allowed us to calculate in advance the corrections for every section of the volume. All corrections required for an imaging session were stored in the DM driver beforehand and then rapidly iterated via electrical (TTL) signals in real time during the session, in the same way as the remote focusing function.
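The linear fitting at the heart of MPAC can be illustrated with a short sketch. The function name and argument layout are assumptions for illustration; the actual implementation lives in our modified microscope-aotools.

```python
import numpy as np

def mpac_corrections(z_measured, coeffs_measured, z_planes):
    """Linearly fit Zernike coefficients measured at a few depths
    (e.g. the top and bottom of the volume) and interpolate or
    extrapolate a correction for every Z section of the stack.

    z_measured:      depths where sensorless AO was run, shape (P,)
    coeffs_measured: Zernike amplitudes at those depths, shape (P, M)
    z_planes:        depths of all sections to be imaged, shape (N,)
    returns:         per-plane corrections, shape (N, M)
    """
    coeffs_measured = np.asarray(coeffs_measured, dtype=float)
    z_planes = np.asarray(z_planes, dtype=float)
    out = np.empty((len(z_planes), coeffs_measured.shape[1]))
    for m in range(coeffs_measured.shape[1]):
        # One straight line per Zernike mode as a function of depth.
        slope, intercept = np.polyfit(z_measured, coeffs_measured[:, m], 1)
        out[:, m] = slope * z_planes + intercept
    return out
```

The resulting per-plane coefficient vectors are what gets converted to DM actuator patterns and preloaded for TTL-triggered playback.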

Remote focusing

We controlled the remote focusing and the aberration correction independently, each one with its own set of corrections, which were then summed together if both functionalities were to be used simultaneously. To calculate the required remote focusing DM patterns, we first performed a calibration procedure, which was based on our aberration correction method and hence used the same type of optimisation algorithm. Our approach achieved a linear response with high precision, within a range of ±5 µm. Because of an instability problem with the DM, we then had to perform a second calibration step to achieve high accuracy as well. We describe all this tuning in Appendix 2, and we elaborate on the instability problem in Appendix 6. As with MPAC, all remote focusing patterns were computed in advance of the image acquisition and then iterated by TTL triggers. When remote focusing was combined with 3D-SIM imaging, the focal plane and the SI pattern were synchronously moved together during axial scanning, keeping constant the phase relationship between the two. The change in effective NA due to focus shift was negligible with our choice of focusing range (up to ~16 µm) and objective lens.
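The summing of the two independent sets of Zernike corrections, and the precomputation of one DM pattern per plane, can be sketched as follows. This is a schematic illustration with hypothetical names; in the real system, the control matrix comes from the interferometric DM calibration described above.

```python
import numpy as np

def dm_actuator_pattern(control_matrix, focus_modes, correction_modes):
    """Sum the remote-focusing and aberration-correction Zernike
    coefficient vectors, then map the combined vector to DM actuator
    values via the calibrated Zernike control matrix."""
    combined = np.asarray(focus_modes) + np.asarray(correction_modes)
    return control_matrix @ combined        # shape: (n_actuators,)

def precompute_stack_patterns(control_matrix, focus_per_plane,
                              corr_per_plane):
    """Precompute one actuator pattern per Z plane so they can be
    uploaded to the DM driver and stepped through by TTL triggers."""
    return [dm_actuator_pattern(control_matrix, f, c)
            for f, c in zip(focus_per_plane, corr_per_plane)]
```

Keeping the two coefficient sets separate until this final summation is what allows remote focusing and MPAC to be calibrated and updated independently.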

SIM reconstruction artefacts

Super-resolved images in 3D-SIM are obtained with a reconstruction process which can sometimes lead to artefacts (Demmerle et al., 2017). In this study, we intentionally chose difficult-to-image samples, to demonstrate the full capability of the Deep3DSIM prototype and to show realistic examples.

 Optical aberrations are a major source of reconstruction artefacts, and this is very well illustrated in Figure 3A. Without AO, the weak signal in the green channel resulted in high spatial frequency noise (hammerstroke) and stripe (hatching) artefacts. Furthermore, the enhanced RI mismatch led to increased spherical aberrations, and subsequently to ghosting artefacts, manifesting as faint ripples in some places in the red channel (arrow in Figure 3A). As expected, all these artefacts were removed in the AO corrected images (Figure 3B).

However, in some situations, AO correction cannot completely remove all artefacts. We demonstrate this in Figure 4B, where the corrected volumes still have visible artefacts. There are two main reasons for this. First, high levels of optical scattering are experienced at such extreme imaging depth without optical clearing of the tissue. The scattering is a problem not only on the excitation side, leading to poor modulation contrast of the SI, but also on the emission side where it leads to weak fluorescence signal with poor contrast. These problems resulted in hatching artefacts, most noticeable in the green channel (Figure 4B, right). Second, such an extreme imaging depth also leads to large and complex aberrations which may not be entirely corrected. For example, some of the aberrations may have a magnitude that cannot be adequately compensated by the wavefront corrector, such as a DM, or likewise, they may contain high-order components which cannot be faithfully reproduced. In our case, such residual spherical aberrations resulted in ghosting artefacts in both channels (Figure 4B, right).

 Finally, we also observed pronounced hatching artefacts in the green channel of our live imaging of a Drosophila embryo (Figure 5A), which was caused by poor modulation contrast at one of the SI angles.

Cell culture

COS-7 (ATCC, CRL-1651) cells were grown on #1.5 glass cover slips in DMEM containing 10% foetal bovine serum (Sigma-Aldrich), washed twice with 2 ml of 100 mM sodium phosphate and fixed for 2 h in 2 ml PLP (2% [wt/vol] paraformaldehyde in 87.5 mM lysine, 87.5 mM sodium phosphate at pH 7.4, and 10 mM sodium periodate). Cover slips were washed three times in 2 ml (100 mM) sodium phosphate, pH 7.4, before permeabilisation in 1 mg/ml BSA, 0.05% saponin, and 100 mM sodium phosphate, pH 7.4, for 30 min. In all cases, primary antibody (α-Tubulin (mouse; DM1A Santa Cruz) AF 488 and Rtn4 (rabbit; AbD Serotec) AF 555) staining was performed in 1 mg/ml BSA, 0.05% saponin, and 100 mM sodium phosphate, pH 7.4 for 60 min at room temperature. Affinity-purified antibodies were used at 1 µg/ml; commercial antibodies were used as directed by the manufacturers. DAPI was added to the secondary antibody staining solution at 0.3 µg/ml. Cover slips were mounted in Mowiol 4–88 mounting medium (EMD Millipore).

 The microtubules stained in this way occasionally appeared fragmented, more noticeable in the SIM images in Figure 2B. This fragmentation is normal for a sample preparation protocol, such as ours, based on 2% paraformaldehyde fixation. More advanced protocols, specifically designed for the preservation of structures such as microtubules, can result in more continuous filaments.

Fly strains

The following fly lines were used in this study. Nrx-IV::GFP (CA06597, kindly gifted by the Rita Teodoro lab) for the NMJ samples, cno::YFP (CPTI000590, DGRC 115111) for the brain samples, Jupiter::GFP, his2A::RFP (Hailstone et al., 2020) for embryos. All stocks were raised on standard cornmeal-based medium at 25°C.

Fixed Drosophila larval fillet for imaging of neuromuscular junctions

The NMJ samples were prepared by following the protocol in Brent et al., 2009, apart from the fixation, which was done in 4% paraformaldehyde (PFA) in PBSTX (phosphate-buffered saline with 0.1% Triton X-100) for 30 min and then followed by two washes for 20 min each in PBSTX. The samples were then incubated with Cy3-conjugated α-HRP antibody (1:500, Jackson ImmunoResearch 123-165-021) for 1 hr. Finally, the samples were washed in PBSTX for 45 min and then mounted in Vectashield. All steps were done at room temperature.

Fixed whole Drosophila brains

Third instar Drosophila larvae were dissected in PBS. Brains were removed and fixed in 4% PFA in PBSTX (0.3% Triton X-100) for 25 min. Afterwards, they were rinsed three times in PBSTX, further permeabilised with two 20 min washes, and then mounted in PBS.

Drosophila embryos for live imaging

The live embryo samples were prepared as described in Parton et al., 2010, except that they were mounted in PBS.

Acknowledgements

John Sedat for the original optical design; Antonia Göhler and Mantas Žurauskas for the initial optical characterisation, work which was published previously; Martin Hailstone, Francesca Robertson, and Jeff Lee for preparing specimens during various phases of using the Deep3DSIM instruments with biology, imaging that was not shown in the manuscript. JW thanks Jacopo Antonello, Chao He, and Jiahe Cui for insightful discussions and advice for AO devices. We are grateful to Micron Oxford and its numerous partners and staff for discussions and providing the environment required for the success of this complex interdisciplinary technology development project.

Appendix 1

Structured illumination pattern generation

The stripe width of the sinusoidal SI was 317 nm for the green channel and 367 nm for the red channel. We used an SLM to diffract the incoming excitation beam into three paths, consisting of the zero-order beam and the ±1 order diffraction beams. The three beams interfered at the sample plane to create an axially varying stripe pattern. An example of the pattern is shown in Appendix 1—figure 1.

The width of the stripe pattern was chosen to be larger than the theoretical minimum, e.g., 271 nm in the green channel according to the Rayleigh criterion for resolution. There were three reasons for this choice. First, in the frequency domain, we observed that the highest spatial frequency amplitudes decreased as the resolution approached this theoretical limit. Using a slightly larger stripe width drastically improved the intensity of these amplitudes, resulting in improved SNR of the high-resolution features in the reconstructed images. Second, the radial distance of the first-order beams in the BFP, and hence on the DM aperture, increases as the stripe width decreases (i.e., as the target resolution increases). However, the actuation of the reflective membrane of the DM was less effective at the periphery because of boundary conditions. Third, higher spatial frequency stripes meant a higher incidence angle of the outer beams on the sample and the effect of spatially varying aberrations was further enhanced at these higher angles. The change in angle could be corrected in principle, because the beams were focused on the DM, but the field-dependent aberrations that affected each individual beam differently could not be fully corrected in this way.
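For reference, the quoted theoretical limit follows directly from the Rayleigh criterion; a simple worked calculation (the function name is illustrative):

```python
def rayleigh_stripe_width(wavelength_nm, na):
    """Minimum resolvable stripe width by the Rayleigh criterion:
    0.61 * lambda / NA."""
    return 0.61 * wavelength_nm / na

# Green channel (488 nm) with the 1.1 NA water objective:
# 0.61 * 488 / 1.1 is approximately 271 nm, matching the theoretical
# limit quoted above; the pattern actually used a wider 317 nm stripe.
```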

Appendix 1—figure 1. Structured illumination microscopy (SIM) pattern generation.

Appendix 1—figure 1.

(A) Example of the structured illumination (SI) pattern at one of the angle orientations and phase shifts, visualised by imaging a monolayer of 100 nm diameter fluorescent beads in the green (left) and in the red (middle) channels. The power spectrum (right) shows combined frequency response in all three angles of the green channel, with the centre masked with a black circle to create a better contrast. Scale bars: 5 μm. (B) Fluorescence signal modulated with structured illumination, taken from the centre of a single red fluorescent bead. The plot shows three 50 μm scans through the bead, one for each of the SI angles.

Appendix 2

Remote focusing calibration and range

We implemented the remote focusing by reshaping the wavefront phase with the DM, using the same type of Zernike-based modal control as for the aberration correction. Effectively, it was a mapping between Zernike mode coefficients and Z positions. We created this mapping with a sample of a single layer of sparse 100 nm fluorescent beads; we first introduced an axial displacement with the piezo stage and then we used the DM to bring the beads back into focus with our sensorless aberration correction by also including the defocus mode.

We validated the remote focusing by displacing the beads sample by four offsets, {-5, -2.5, 2.5, 5} µm, and acquiring a Z-stack for each offset. We then checked if the displacements of the beads within the Z-stack matched the displacement of the stage. We repeated the same procedure with just the piezo stage, using the data as a ground truth. Each volume was acquired 10 times for statistical analysis, and likewise, we analysed the same 10 beads from each volume. We used Gaussian fitting to find the centre Z position of each bead. In terms of precision, we calculated the pooled variance of each of the offsets to be {6.45, 4.15, 20.58, 11.83} nm, respectively. This demonstrated the good repeatability of the DM refocusing. However, the images showed mismatches in the position measurement (Appendix 2—figure 1A) and the image of a single bead was elongated (Appendix 2—figure 1B). We interpreted this to be caused by the creep effect of the DM. The findings from Appendix 6 demonstrated two important observations: (1) most of the initial change in wavefront phase due to the creep effect was within the first minute; (2) the exponential rate of change meant that the creep effect was effectively linear during this initial period. The rate and the way in which the shape of the DM was changed during the calibration of the remote focusing was different from those used during subsequent image acquisitions. Therefore, the calibration compensated for more creep than was typically encountered during actual imaging experiments. However, the rates of both the calibration and the imaging were well below the 1 min threshold, typically in the order of milliseconds, and therefore the contribution of the creep effect was only different up to a scale, because of the linearity in this region. We calculated this scaling factor as the ratio of the slopes of the lines in Appendix 2—figure 1A, in this case yielding a value of around 0.7.
This scaling approach allowed us to get accurate remote Z displacements that matched the ground truth displacements obtained with the piezo stage, as shown in Appendix 2—figure 1C.
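The scaling factor can be computed directly from the two measured displacement curves; a sketch of that calculation (the function name and input arrangement are assumptions for illustration):

```python
import numpy as np

def creep_scaling_factor(z_commanded, z_piezo, z_dm):
    """Estimate the creep compensation factor as the ratio of the
    slopes of measured vs commanded displacement for DM remote
    focusing and for the piezo stage (the ground truth)."""
    slope_piezo = np.polyfit(z_commanded, z_piezo, 1)[0]
    slope_dm = np.polyfit(z_commanded, z_dm, 1)[0]
    return slope_dm / slope_piezo
```

Commanded remote-focus displacements are then divided by (or multiplied by the inverse of) this factor so that the DM-produced Z displacements match the piezo ground truth.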

We usually performed remote focusing with a range of ±5 µm or less, because this is already a large volume, requiring thousands of images to be acquired for multi-colour 3D-SIM, and because the image quality was optimal in this range. We sometimes extended this range, e.g., to the ±8 µm range shown in Figure 5B, but we noticed that the quality of the PSF decreased beyond the ±10 µm range. We attribute this to the larger actuator control voltages, applied over longer periods of time, which meant that the impact of the creep effect on the DM was even stronger.

Appendix 2—figure 1. Effect of rescaling on remote focusing.

Appendix 2—figure 1.

(A) Comparison of remote focusing to conventional refocusing with piezo stage. (B) Demonstration of how pixel rescaling restores the axial profile of bead images acquired with remote focusing. (C) Displacement comparison between piezo (left) and remote focusing (right). Scale bars: 1 μm.

Appendix 3

Axial scanning speed

The remote focusing was desired not only for its ability to scan the sample axially while keeping it stationary, but also for the faster scanning speed compared to piezo stages. In our case, this was already indicated from the manufacturers’ datasheets, which listed the settling times of the DM and the piezo stage as 0.4 ms and 7 ms (10% step width), respectively. The settling time is the time it takes for the device to reach a new position and maintain it without oscillation. In the case of the piezo stage, this happens every time the device moves up or down, whereas for the DM, it is a change in the shape of the reflective surface. Clever execution of imaging experiments can negate a lot of this ‘dead time’ by, for example, reading out image data from the detectors in the meantime (Cockpit already does this). However, fast image acquisition with low readout times can lead to a situation where the settling time needs to be waited out. We measured the step responses of the DM and the piezo stage, which allowed us to estimate the settling times. The values we obtained were substantially higher than the reported values from the respective datasheets. The response of the DM was measured by checking the fringe patterns of the integrated interferometer. Limited by the interferometer camera readout speed, we estimated the settling time to be between 1 and 2 ms. The response of the piezo stage was measured both with digital (using the manufacturer’s software PIMikroMove) and with analogue (using an oscilloscope) methods, with different step sizes. Using default PID parameters, we found the 10% step width settling time to be around 25 ms. However, for experiments, we used a step of size 125 nm, for which the settling time was around 10 ms. In conclusion, the settling time of the DM was close to an order of magnitude faster than that of the piezo stage.

Appendix 4

Optical design

An optical diagram of Deep3DSIM is shown in Appendix 4—figure 1. The laser module was composed of four commonly used laser lines: 405, 488, 561, and 642 nm. The 405 nm laser was incompatible with the SLM and thus was only used in widefield mode. We ended up doing all experiments with the 488 nm and 561 nm channels, so there are only two detectors in the diagram, but two more can be added for 4-channel simultaneous imaging. The system could easily be scaled to use more than four channels. By using a half-wave plate in each path, all laser beams had their polarisation rotated to be parallel to the slow axis of the SLM (Boulder Nonlinear Systems HSP512-488-800). The four beams were then combined by mirrors and dichroic filters into the same path before being guided to the SLM. The flip mirrors FM5 and FM6, together with the FBS beamsplitter, were used to bypass the SLM and create an integrated interferometer (dashed path) for the calibration of the DM. The excitation beam was deflected by the SLM towards the DM. By applying a sinusoidal stripe pattern to the SLM at a sample conjugated plane, the excitation path was diffracted into three beams consisting of ±1st order and zero-order beams. The three beams interfered at the sample plane below the objective lens to form the 3D SI. The foci of the three beams were filtered by a 7-point spatial filter (A2). A polarisation rotator (Meadowlarks Optics LPR-100) was used to change the linear polarisation angle of the structured illumination pattern during experiments. The excitation light and the emission light were separated by a dichroic filter (D4, Chroma Technology ZT405/488/561/640rpc) that was aligned at 22.5° from the optical axis to achieve the optimal sharpness of the transitions between the transmission (excitation) and reflection (emission) for all wavelengths. The combined excitation beam was deflected by a DM (Alpao DM69-15) which was conjugated to the BFP of the 60× objective lens (Olympus LUMFLN60XW). 
A 10× air objective lens (Olympus LMPLFLN10X) was used for widefield imaging with a large FoV (dash-dotted path). Flip mirror FM2 was used to switch between the two objective lenses, whereas flip mirror FM1 was used to direct the white LED light LL to one of the objective lenses. Note that only the 60× objective lens was optimised and intended for SIM and AO. The pair of flip mirrors FM3 and FM4 was used to bypass the DM (dotted path) for comparison between AO and no AO. The sample was moved in 3D with an assembly of two motorised piezo stages, PI M-687 (XY) and PI P-736 (Z). This entire assembly was further mounted on a heavy-duty Z-stage (Aerotech PRO115), which was used for coarse large-range Z control. The emission light was collected simultaneously from multiple channels by EMCCD cameras (Andor iXon Ultra 897). The separation of green and red fluorescence was done with a dichroic filter (D5, Chroma ZT561rdc-xr), and each channel was further filtered by individual band-pass filters: ET525/50 m (Chroma) for the green channel and ET600/50 m (Chroma) for the red channel.

Appendix 4—figure 1. Optical arrangement.


405 nm, 488 nm, 561 nm, and 640 nm: laser sources. S: mechanical shutter. BX: beam expander. F1 and F2: optical filters. M: mirrors. λ/2: half-wave plates. L1 to L13: lenses, whose focal lengths are given in the table at the top of the figure. P: polariser. SLM: spatial light modulator. A1 to A3: apertures. FM1 to FM6: motorised flip mirrors. PR: polarisation rotator. BS: beamsplitter. FBS: 50/50 beamsplitter on a flip mount. D1 to D5: dichroic beamsplitters. DM: deformable mirror. C1 and C2: EMCCD cameras; C3: CCD camera for the wavefront-sensing interferometer. 10× and 60×: objective lenses. ST: assembly of stages (X, Y, and Z movement). LL: LED light used for brightfield imaging.

Appendix 5

Image metric

We used an image metric based on Fourier-domain analysis to quantify the image quality during the aberration correction routine. The algorithm, as well as other metrics, was previously developed as part of the microscope-aotools software package. An image I(x,y) was transformed to frequency space and a logarithmic power spectrum S(u,v) was obtained:

S(u,v) = \ln\left( \left| \mathcal{F}\{ I(x,y) \} \right|^2 \right)

The zero-frequency component was shifted to the centre of the spectrum. The Rayleigh criterion was used to derive the cutoff frequency ωc, which was used to define two filters, one high-pass and one band-pass:

F_H(u,v) = \begin{cases} 1, & \text{if } \lVert (u,v) \rVert_2 > 1.1\,\omega_c \\ 0, & \text{otherwise} \end{cases}

F_B(u,v) = \begin{cases} 1, & \text{if } 0.1\,\omega_c < \lVert (u,v) \rVert_2 < \omega_c \\ 0, & \text{otherwise} \end{cases}

The high-pass filter was used to isolate the high-frequency noise and then to define a noise threshold S̄n as the mean of the noise pixels:

\bar{S}_n = \frac{1}{UV} \sum_{u=1}^{U} \sum_{v=1}^{V} S(u,v)\, F_H(u,v)

The band-pass filter was used to further remove the contributions of the 0-frequency and low-frequency regions, often marked by strong artefacts (e.g. border effect) and containing signal of little interest, such as background light and blur. This created the final binary set of pixels associated with the metric:

S_m(u,v) = \begin{cases} 1, & \text{if } S(u,v)\, F_B(u,v) > \bar{S}_n \\ 0, & \text{otherwise} \end{cases}

Finally, the numeric value of the metric was derived by simply counting the number of non-zero pixels in the subset:

|{ssSm(u,v) and s=1}|

In summary, the metric quantified the spread of the spatial frequency content in the image, and the metric was at its maximum in the absence of aberrations.
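As a concrete illustration, the metric can be sketched in a few lines of NumPy. This is a simplified reimplementation, not the actual microscope-aotools code; the function name is our own, and we follow the textual description in averaging the log power over the noise (high-pass) pixels to obtain the threshold.

```python
import numpy as np

def fourier_metric(image, cutoff):
    """Image-quality metric based on the spread of spatial frequency content.

    `image` is a 2D array; `cutoff` is the cutoff frequency omega_c
    expressed as a radius (in pixels) in the centred spectrum, derived
    elsewhere from the Rayleigh criterion.  Returns the number of
    band-pass frequency pixels whose log power exceeds the noise floor.
    """
    # Logarithmic power spectrum with the zero-frequency component centred.
    spectrum = np.log(np.abs(np.fft.fftshift(np.fft.fft2(image))) ** 2 + 1e-12)

    # Radial frequency coordinate ||(u, v)||_2 for every pixel.
    v, u = np.indices(spectrum.shape)
    r = np.hypot(u - spectrum.shape[1] / 2, v - spectrum.shape[0] / 2)

    high_pass = r > 1.1 * cutoff                   # high-frequency noise region
    band_pass = (r > 0.1 * cutoff) & (r < cutoff)  # signal region of interest

    # Noise threshold: mean log power over the high-pass region.
    noise_floor = spectrum[high_pass].mean()

    # Metric: count of band-pass pixels above the noise floor.
    return int(np.count_nonzero(spectrum[band_pass] > noise_floor))
```

In an aberration-correction loop, this value would be maximised over trial Zernike coefficients applied to the DM.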

Appendix 6

DM creep effect

We observed that our DM (Alpao DM69-15) exhibited a certain instability, in which the surface of the mirror deviated from its intended shape over time. We note that this particular DM was manufactured in 2014 and that, to our knowledge, later revisions of the same model have been improved to eliminate such effects. Unlike the well-known hysteresis effect, this temporal behaviour, known as 'creep', has been reported in the literature, e.g., (Bitenc, 2017; Bitenc et al., 2014), and has been attributed to a combination of drift and temperature effects. To better understand the dynamic behaviour of the device, we performed a series of experiments.

The first experiment aimed to test the long-term stability of the device. The total duration of the experiment was 16 hr, divided into 1 min cycles. For the first 50 s of each cycle, random offsets were added to the shape at a frequency of several hundred hertz, to simulate normal operation of the device. The offsets were uniformly distributed in the range ±5% of the total range, so on average the shape remained the same. In the final 10 s of each cycle, the original shape was re-applied and kept for the remainder of the cycle. Interferograms were captured at the end of each cycle, and the root mean square (RMS) wavefront phase error from the original shape was calculated (Appendix 6—figure 1A). This experiment confirmed our observation about the change in shape: it showed significant changes within the first hour, as well as a steady increase in error even after 16 hr.
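One cycle of this protocol can be sketched as follows. This is an illustrative sketch only: `apply_shape` and `measure_wavefront` are hypothetical stand-ins for the real DM and interferometer interfaces, and for simplicity the measured wavefront is treated as directly comparable to the commanded shape when computing the RMS error.

```python
import numpy as np

rng = np.random.default_rng()

def drift_test_cycle(shape, apply_shape, measure_wavefront,
                     active_s=50.0, rate_hz=300):
    """One 1-minute cycle of the long-term stability test.

    For `active_s` seconds, random offsets uniformly distributed in
    +/-5% of the total range are added to `shape` to simulate normal
    operation; the original shape is then re-applied and held, and the
    RMS error of the measured wavefront is returned.
    """
    for _ in range(int(active_s * rate_hz)):
        offsets = rng.uniform(-0.05, 0.05, size=shape.shape)
        apply_shape(np.clip(shape + offsets, 0.0, 1.0))
    apply_shape(shape)  # re-apply and hold for the rest of the cycle
    measured = measure_wavefront()
    return float(np.sqrt(np.mean((measured - shape) ** 2)))
```

Running this cycle 960 times (16 hr) and logging the returned RMS errors reproduces the structure of the measurement in Appendix 6—figure 1A.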

To further distinguish between the effects of drift and temperature, we performed a second set of experiments. The device was warmed up and initialised by holding defocus for 4 hr. Six different shapes were then applied and monitored for 20 min each, followed by a final shape held for 60 min. In all these experiments, the DM was undisturbed until the next change in shape. All the shapes were individual Zernike modes with a coefficient of 1 radian; the shapes applied are listed in Appendix 6—table 1. The RMS wavefront phase errors were calculated in the same way as before, and the results are shown in Appendix 6—figure 1B. They show that the device had already reached its steady state by the first data point at the 1 min mark. This indicated that, in addition to the long-term component observed in the previous experiments, the creep effect also had a significant short-term component acting on a timescale of seconds or shorter. The nature of the creep depended on the current and the previously applied shapes; it therefore acted as a memory effect, in which the final shape was a time-variant combination of the current and previous shapes.

Appendix 6—table 1. Experimental conditions.

Shape Zernike mode (Noll index) Holding time [min]
S0 Z4 (defocus) 240
S1 Z5 (primary oblique astigmatism) 20
S2 Z6 (primary vertical astigmatism) 20
S3 Z7 (primary vertical coma) 20
S4 Z8 (primary horizontal coma) 20
S5 Z11 (primary spherical) 20
S6 Z22 (secondary spherical) 60

To mitigate the creep effects, we adopted a set of exercise routines which helped to warm up the device and bring its shape closer to its neutral, stable position. We used a combination of three main exercises to diversify the way the reflective membrane was agitated. The first exercise consisted of alternating checkerboard patterns (Appendix 6—figure 2A), each held for 3 s, for a total of 50 repeats. The two inverted patterns typically covered 80% of the total actuator range, with the two values set at 10% and 90% of the full range. This exercise was especially helpful for removing large errors in Zernike modes that arose when large control voltages had been applied to actuators for extended periods. The second exercise cycled through varying levels of defocus (Appendix 6—figure 2B), with coefficients linearly distributed in the range [−5, 5] rad, typically in 11 steps, each held for 100 ms, for a total duration of about 10 min. This exercise helped to maintain the optimal PSF when using remote focusing, and could also restore the optimal PSF after an extremely large remote-focusing amplitude had degraded it. The third exercise (Appendix 6—figure 2C) consisted of the same type of actuator poking as in the calibration procedure: each actuator was set individually to several values linearly distributed in the range [10, 90]% (80% of the total range), typically in 11–21 steps, with one or two repeats. Because this routine was identical to that used for Zernike mode calibration, it was best suited for restoring the optimal Zernike modes required for AO correction.
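The three exercise routines can be expressed as simple generators of actuator control values normalised to [0, 1]. This is an illustrative sketch, not the actual control software: the function names, the square actuator grid used for the checkerboard, and the mid-range resting value in the poking routine are our own assumptions.

```python
import numpy as np

def checkerboard_cycle(n_side=9, lo=0.1, hi=0.9, repeats=50):
    """Alternating checkerboard patterns covering 80% of actuator range,
    yielding 50 repeats of the inverted pair (each held ~3 s)."""
    base = np.indices((n_side, n_side)).sum(axis=0) % 2
    for _ in range(repeats):
        yield lo + (hi - lo) * base          # pattern A
        yield lo + (hi - lo) * (1 - base)    # inverted pattern B

def defocus_cycle(defocus_mode, coeff_range=(-5.0, 5.0), steps=11):
    """Cycle through defocus amplitudes linearly spaced in [-5, 5] rad,
    each shape held ~100 ms.  `defocus_mode` is the actuator pattern
    producing 1 rad of defocus."""
    for c in np.linspace(coeff_range[0], coeff_range[1], steps):
        yield c * defocus_mode

def poke_cycle(n_actuators=69, lo=0.1, hi=0.9, steps=11):
    """Poke each actuator individually through 80% of its range, with the
    remaining actuators held at mid-range."""
    for i in range(n_actuators):
        for v in np.linspace(lo, hi, steps):
            shape = np.full(n_actuators, 0.5)
            shape[i] = v
            yield shape
```

Each yielded array would be sent to the DM and held for the dwell time given in the text before the next pattern is applied.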

Appendix 6—figure 1. Deformable mirror (DM) drift and temperature characterisation.


Wavefront phase root mean square error (RMSE) measurements for (A) long-term drift without warm-up and (B) short-term drift with warm-up (4 hr of 1 rad defocus), following the sequences listed in Appendix 6—table 1. The two plots in (B) show the shapes held for 20 min (top) and the final shape held for 60 min (bottom).

Appendix 6—figure 2. Actuator control signal patterns used in the deformable mirror (DM) exercise procedures.


(A) Checkerboard. (B) Refocusing. (C) Individual poking.

Funding Statement

The funders had no role in study design, data collection and interpretation, or the decision to submit the work for publication. For the purpose of Open Access, the authors have applied a CC BY public copyright license to any Author Accepted Manuscript version arising from this submission.

Contributor Information

Ian Dobbie, Email: ian.dobbie@jhu.edu.

Lothar Schermelleh, Email: lothar.schermelleh@bioch.ox.ac.uk.

Martin J Booth, Email: martin.booth@eng.ox.ac.uk.

Ilan Davis, Email: ilan.davis@glasgow.ac.uk.

K VijayRaghavan, National Centre for Biological Sciences, Tata Institute of Fundamental Research, Mumbai, India.


Funding Information

This paper was supported by the following grants:

  • Wellcome Trust 10.35802/091911 to Mick A Phillips, David Miguel Susano Pinto, Richard M Parton, Nicholas James Hall, Ian Dobbie, Ilan Davis.

  • Medical Research Council [MR/K01577X/1] to Ilan Davis.

  • Wellcome Trust 10.35802/096144 to Mick A Phillips, David Miguel Susano Pinto, Richard M Parton, Nicholas James Hall, Ian Dobbie, Ilan Davis.

  • Wellcome Trust 10.35802/105605 to Mick A Phillips, David Miguel Susano Pinto, Richard M Parton, Nicholas James Hall, Ian Dobbie, Ilan Davis.

  • Wellcome Trust 10.35802/107457 to Mick A Phillips, David Miguel Susano Pinto, Richard M Parton, Nicholas James Hall, Ian Dobbie, Ilan Davis.

  • Wellcome Trust 10.35802/203141 to Mick A Phillips, David Miguel Susano Pinto, Richard M Parton, Nicholas James Hall, Ian Dobbie, Ilan Davis.

  • Wellcome Trust 10.35802/209412 to Mick A Phillips, David Miguel Susano Pinto, Richard M Parton, Nicholas James Hall, Ian Dobbie, Ilan Davis.

  • European Research Council 10.3030/695140 to Jingyu Wang, Matthew Wincott, Martin J Booth.

  • HORIZON EUROPE Marie Sklodowska-Curie Actions 10.3030/766181 to Lothar Schermelleh.

  • Biotechnology and Biological Sciences Research Council [BB/M011224/1] to Danail Stoychev.

Additional information

Competing interests

No competing interests declared.

Author contributions

Development of control software; developed adaptive optics methods, algorithms and calibration; Implemented remote focusing method; Refined and reimplemented optical system and adaptive optics control; Refined experimental processes; Experimental strategy and specimen preparation; Imaging experiments, calibration, and data processing; Article conceptualisation; Initial draft of article and figure preparation; Revision of article.

Implemented electronics; Development of control software; Developed adaptive optics methods, algorithms and calibration; Implemented remote focusing method; Refined and reimplemented optical system and adaptive optics control; Refined experimental processes; Experimental strategy and specimen preparation; Imaging experiments, calibration, and data processing; Article conceptualisation; Initial draft of article and figure preparation; Revision of article.

Designed the microscope optics; Designed and implemented the optomechanics; Implemented electronics; Implemented initial optical system; Development of control software.

Development of control software.

Original vision and concept development; Acquisition of initial data; Refined experimental processes; Experimental strategy and specimen preparation; Article conceptualisation.

Implemented initial optical system; Development of control software; Developed adaptive optics methods, algorithms and calibration; Acquisition of initial data.

Acquisition of initial data.

Refined experimental processes; Experimental strategy and specimen preparation.

Development of control software.

Experimental strategy and specimen preparation.

Experimental strategy and specimen preparation.

Experimental strategy and specimen preparation.

Original vision and concept development; Designed the microscope optics; Implemented electronics; Implemented initial optical system; Development of control software; Acquisition of initial data; Revision of article; Supervision.

Refined experimental processes; Experimental strategy and specimen preparation; Imaging experiments, calibration, and data processing; Article conceptualisation; Revision of article; Obtaining funding; Supervision.

Developed adaptive optics methods, algorithms and calibration; Article conceptualisation; Revision of article; Obtaining funding; Supervision.

Original vision and concept development; Designed the microscope optics; Article conceptualisation; Initial draft of article and figure preparation; Revision of article; Obtaining funding; Supervision.

Additional files

MDAR checklist

Data availability

The Python software used to control the microscope is available on GitHub: python-microscope (copy archived at Stoychev, 2022a); microscope-cockpit (copy archived at Stoychev, 2022b); microscope-aotools (copy archived at Stoychev, 2023).

References

  1. Antonello J, Wang J, He C, Phillips M, Booth M. Interferometric calibration of a deformable mirror. Zenodo. 2020. doi: 10.5281/zenodo.3714951.
  2. Ball G, Demmerle J, Kaufmann R, Davis I, Dobbie IM, Schermelleh L. SIMcheck: a toolbox for successful super-resolution structured illumination microscopy. Scientific Reports. 2015;5:15915. doi: 10.1038/srep15915.
  3. Bitenc U, Bharmal NA, Morris TJ, Myers RM. Assessing the stability of an ALPAO deformable mirror for feed-forward operation. Optics Express. 2014;22:12438–12451. doi: 10.1364/OE.22.012438.
  4. Bitenc U. Software compensation method for achieving high stability of Alpao deformable mirrors. Optics Express. 2017;25:4368–4381. doi: 10.1364/OE.25.004368.
  5. Brent JR, Werner KM, McCabe BD. Drosophila larval NMJ dissection. Journal of Visualized Experiments. 2009;24:1107. doi: 10.3791/1107.
  6. Cahoon CK, Yu Z, Wang Y, Guo F, Unruh JR, Slaughter BD, Hawley RS. Superresolution expansion microscopy reveals the three-dimensional organization of the Drosophila synaptonemal complex. PNAS. 2017;114:E6857–E6866. doi: 10.1073/pnas.1705623114.
  7. Cai M, Zhu H, Sun Y, Yin L, Xu F, Wu H, Hao X, Zhou R, Kuang C, Liu X. Total variation and spatial iteration-based 3D structured illumination microscopy. Optics Express. 2022;30:7938–7953. doi: 10.1364/OE.451190.
  8. Cao R, Li Y, Chen X, Ge X, Li M, Guan M, Hou Y, Fu Y, Xu X, Leterrier C, Jiang S, Gao B, Xi P. Open-3DSIM: an open-source three-dimensional structured illumination microscopy reconstruction platform. Nature Methods. 2023;20:1183–1186. doi: 10.1038/s41592-023-01958-0.
  9. Cui J, Turcotte R, Emptage NJ, Booth MJ. Extended range and aberration-free autofocusing via remote focusing and sequence-dependent learning. Optics Express. 2021;29:36660. doi: 10.1364/OE.442025.
  10. Demmerle J, Innocent C, North AJ, Ball G, Müller M, Miron E, Matsuda A, Dobbie IM, Markaki Y, Schermelleh L. Strategic and practical guidelines for successful structured illumination microscopy. Nature Protocols. 2017;12:988–1010. doi: 10.1038/nprot.2017.019.
  11. Dobbie IM, King E, Parton RM, Carlton PM, Sedat JW, Swedlow JR, Davis I. OMX: a new platform for multimodal, multichannel wide-field imaging. Cold Spring Harbor Protocols. 2011;2011:899–909. doi: 10.1101/pdb.top121.
  12. Gao L, Shao L, Higgins CD, Poulton JS, Peifer M, Davidson MW, Wu X, Goldstein B, Betzig E. Noninvasive imaging beyond the diffraction limit of 3D dynamics in thickly fluorescent specimens. Cell. 2012;151:1370–1385. doi: 10.1016/j.cell.2012.10.008.
  13. Gong D, Scherer NF. Tandem aberration correction optics (TACO) in wide-field structured illumination microscopy. Biomedical Optics Express. 2023;14:6381–6396. doi: 10.1364/BOE.503801.
  14. Gustafsson MGL, Shao L, Carlton PM, Wang CJR, Golubovskaya IN, Cande WZ, Agard DA, Sedat JW. Three-dimensional resolution doubling in wide-field fluorescence microscopy by structured illumination. Biophysical Journal. 2008;94:4957–4970. doi: 10.1529/biophysj.107.120345.
  15. Hailstone M, Waithe D, Samuels TJ, Yang L, Costello I, Arava Y, Robertson E, Parton RM, Davis I. CytoCensus, mapping cell identity and division in tissues and organs using machine learning. eLife. 2020;9:e51085. doi: 10.7554/eLife.51085.
  16. Hall N. Sample Correction Methods. In: Accessible adaptive optics and super-resolution microscopy to enable improved imaging. University of Oxford; 2020.
  17. Hall N, Titlow J, Booth MJ, Dobbie IM. Microscope-AOtools: a generalised adaptive optics implementation. Optics Express. 2020;28:28987–29003. doi: 10.1364/OE.401117.
  18. Huang B, Bates M, Zhuang X. Super-resolution fluorescence microscopy. Annual Review of Biochemistry. 2009;78:993–1016. doi: 10.1146/annurev.biochem.77.061906.092014.
  19. Huang X, Fan J, Li L, Liu H, Wu R, Wu Y, Wei L, Mao H, Lal A, Xi P, Tang L, Zhang Y, Liu Y, Tan S, Chen L. Fast, long-term, super-resolution imaging with Hessian structured illumination microscopy. Nature Biotechnology. 2018;36:451–459. doi: 10.1038/nbt.4115.
  20. Ingaramo M, York AG, Wawrzusin P, Milberg O, Hong A, Weigert R, Shroff H, Patterson GH. Two-photon excitation improves multifocal structured illumination microscopy in thick scattering tissue. PNAS. 2014;111:5254–5259. doi: 10.1073/pnas.1314447111.
  21. Kam Z, Kner P, Agard D, Sedat JW. Modelling the application of adaptive optics to wide-field microscope live imaging. Journal of Microscopy. 2007;226:33–42. doi: 10.1111/j.1365-2818.2007.01751.x.
  22. Kner P, Kam Z, Agard D, Sedat J. Adaptive optics in wide-field microscopy. In: SPIE MOEMS-MEMS, San Francisco, California. SPIE; 2011.
  23. Kounatidis I, Stanifer ML, Phillips MA, Paul-Gilloteaux P, Heiligenstein X, Wang H, Okolo CA, Fish TM, Spink MC, Stuart DI, Davis I, Boulant S, Grimes JM, Dobbie IM, Harkiolaki M. 3D correlative cryo-structured illumination fluorescence and soft X-ray microscopy elucidates reovirus intracellular release pathway. Cell. 2020;182:515–530. doi: 10.1016/j.cell.2020.05.051.
  24. Li QGL, Reinig M, Kamiyama D, Huang B, Tao XD, Bardales A, Kubby J. Woofer–tweeter adaptive optical structured illumination microscopy. Photonics Research. 2017;5:329–334. doi: 10.1364/PRJ.5.000329.
  25. Lin R, Kipreos ET, Zhu J, Khang CH, Kner P. Subcellular three-dimensional imaging deep through multicellular thick samples by structured illumination microscopy and adaptive optics. Nature Communications. 2021;12:3148. doi: 10.1038/s41467-021-23449-6.
  26. Matsuda A, Koujin T, Schermelleh L, Haraguchi T, Hiraoka Y. High-accuracy correction of 3D chromatic shifts in the age of super-resolution biological imaging using chromagnon. Journal of Visualized Experiments. 2020;160:60800. doi: 10.3791/60800.
  27. Ogushi S, Rattani A, Godwin J, Metson J, Schermelleh L, Nasmyth K. Loss of sister kinetochore co-orientation and peri-centromeric cohesin protection after meiosis I depends on cleavage of centromeric REC8. Developmental Cell. 2021;56:3100–3114. doi: 10.1016/j.devcel.2021.10.017.
  28. Parton RM, Vallés AM, Dobbie IM, Davis I. Live cell imaging in Drosophila melanogaster. Cold Spring Harbor Protocols. 2010;2010:db. doi: 10.1101/pdb.top75.
  29. Phillips MA, Harkiolaki M, Susano Pinto DM, Parton RM, Palanca A, Garcia-Moreno M, Kounatidis I, Sedat JW, Stuart DI, Castello A, Booth MJ, Davis I, Dobbie IM. CryoSIM: super-resolution 3D structured illumination cryogenic fluorescence microscopy for correlated ultrastructural imaging. Optica. 2020;7:802–812. doi: 10.1364/OPTICA.393203.
  30. Phillips MA, Susano Pinto DM, Hall N, Mateos-Langerak J, Parton RM, Titlow J, Stoychev DV, Parks T, Susano Pinto T, Sedat JW, Booth MJ, Davis I, Dobbie IM. Microscope-Cockpit: Python-based bespoke microscopy for bio-medical science. Wellcome Open Research. 2021;6:76. doi: 10.12688/wellcomeopenres.16610.2.
  31. Poland SP, Wright AJ, Girkin JM. Active focus locking in an optically sectioning microscope utilizing a deformable membrane mirror. Optics Letters. 2008;33:419–421. doi: 10.1364/ol.33.000419.
  32. Rajaeipour P, Dorn A, Banerjee K, Zappe H, Ataman Ç. Extended field-of-view adaptive optics in microscopy via numerical field segmentation. Applied Optics. 2020;59:3784–3791. doi: 10.1364/AO.388000.
  33. Richter V, Piper M, Wagner M, Schneckenburger H. Increasing resolution in live cell microscopy by Structured Illumination (SIM). Applied Sciences. 2019;9:1188. doi: 10.3390/app9061188.
  34. Sahl SJ, Hell SW, Jakobs S. Fluorescence nanoscopy in cell biology. Nature Reviews Molecular Cell Biology. 2017;18:685–701. doi: 10.1038/nrm.2017.71.
  35. Schermelleh L, Carlton PM, Haase S, Shao L, Winoto L, Kner P, Burke B, Cardoso MC, Agard DA, Gustafsson MGL, Leonhardt H, Sedat JW. Subdiffraction multicolor imaging of the nuclear periphery with 3D structured illumination microscopy. Science. 2008;320:1332–1336. doi: 10.1126/science.1156947.
  36. Schermelleh L, Heintzmann R, Leonhardt H. A guide to super-resolution fluorescence microscopy. The Journal of Cell Biology. 2010;190:165–175. doi: 10.1083/jcb.201002018.
  37. Schermelleh L, Ferrand A, Huser T, Eggeling C, Sauer M, Biehlmaier O, Drummen GPC. Super-resolution microscopy demystified. Nature Cell Biology. 2019;21:72–84. doi: 10.1038/s41556-018-0251-8.
  38. Schindelin J, Arganda-Carreras I, Frise E, Kaynig V, Longair M, Pietzsch T, Preibisch S, Rueden C, Saalfeld S, Schmid B, Tinevez JY, White DJ, Hartenstein V, Eliceiri K, Tomancak P, Cardona A. Fiji: an open-source platform for biological-image analysis. Nature Methods. 2012;9:676–682. doi: 10.1038/nmeth.2019.
  39. Smith CS, Slotman JA, Schermelleh L, Chakrova N, Hari S, Vos Y, Hagen CW, Müller M, van Cappellen W, Houtsmuller AB, Hoogenboom JP, Stallinga S. Structured illumination microscopy with noise-controlled image reconstructions. Nature Methods. 2021;18:821–828. doi: 10.1038/s41592-021-01167-7.
  40. Stoychev D. Python-microscope. Software Heritage. 2022a. swh:1:rev:5a297fffe2835db2903ed9cc8fcc4d4661551dd3. https://archive.softwareheritage.org/swh:1:dir:e4e2a58143677eb54f0018cd72da9e0eb4fa405a;origin=https://github.com/dstoychev/microscope;visit=swh:1:snp:3be65a4712af16a37522171ab8bbf00e0e442af9;anchor=swh:1:rev:5a297fffe2835db2903ed9cc8fcc4d4661551dd3
  41. Stoychev D. Microscope-cockpit. Software Heritage. 2022b. swh:1:rev:c08862f752ff6e738cefd8efff0c0dc295e7c522. https://archive.softwareheritage.org/swh:1:dir:4f551a81474aa83f6ce44b662cff4d0c612aba2e;origin=https://github.com/dstoychev/cockpit;visit=swh:1:snp:f39e11be355c32ca842a8d44c24f9f0859a8c114;anchor=swh:1:rev:c08862f752ff6e738cefd8efff0c0dc295e7c522
  42. Stoychev D. AOTools: a microscope add-on for adaptive optics. Software Heritage. 2023. swh:1:rev:523c1d26f9a53805759b4e91acd741ec2bdc667f. https://archive.softwareheritage.org/swh:1:dir:aba8017079a91a552c514a7166e0400d9cc67adc;origin=https://github.com/dstoychev/microscope-aotools;visit=swh:1:snp:70ab2436814fdb72e9798b48d08c6d3a4d0101d7;anchor=swh:1:rev:523c1d26f9a53805759b4e91acd741ec2bdc667f
  43. Susano Pinto DM, Phillips MA, Hall N, Mateos-Langerak J, Stoychev D, Susano Pinto T, Booth MJ, Davis I, Dobbie IM. Python-Microscope - a new open-source Python library for the control of microscopes. Journal of Cell Science. 2021;134:jcs258955. doi: 10.1242/jcs.258955.
  44. Thomas B, Wolstenholme A, Chaudhari SN, Kipreos ET, Kner P. Enhanced resolution through thick tissue with structured illumination and adaptive optics. Journal of Biomedical Optics. 2015;20:26006. doi: 10.1117/1.JBO.20.2.026006.
  45. Turcotte R, Liang Y, Tanimoto M, Zhang Q, Li Z, Koyama M, Betzig E, Ji N. Dynamic super-resolution structured illumination imaging in the living brain. PNAS. 2019;116:9586–9591. doi: 10.1073/pnas.1819965116.
  46. Wang Y, Yu Z, Cahoon CK, Parmely T, Thomas N, Unruh JR, Slaughter BD, Hawley RS. Combined expansion microscopy with structured illumination microscopy for analyzing protein complexes. Nature Protocols. 2018;13:1869–1895. doi: 10.1038/s41596-018-0023-8.
  47. Wang J, Zhang Y. Adaptive optics in super-resolution microscopy. Biophysics Reports. 2021;7:267–279. doi: 10.52601/bpr.2021.210015.
  48. York AG, Parekh SH, Dalle Nogare D, Fischer RS, Temprine K, Mione M, Chitnis AB, Combs CA, Shroff H. Resolution doubling in live, multicellular organisms via multifocal structured illumination microscopy. Nature Methods. 2012;9:749–754. doi: 10.1038/nmeth.2025.
  49. York AG, Chandris P, Nogare DD, Head J, Wawrzusin P, Fischer RS, Chitnis A, Shroff H. Instant super-resolution imaging in live cells and embryos via analog image processing. Nature Methods. 2013;10:1122–1126. doi: 10.1038/nmeth.2687.
  50. Zheng W, Wu Y, Winter P, Fischer R, Nogare DD, Hong A, McCormick C, Christensen R, Dempsey WP, Arnold DB, Zimmerberg J, Chitnis A, Sellers J, Waterman C, Shroff H. Adaptive optics improves multiphoton super-resolution imaging. Nature Methods. 2017;14:869–872. doi: 10.1038/nmeth.4337.
  51. Žurauskas M, Barnstedt O, Frade-Rodriguez M, Waddell S, Booth MJ. Rapid adaptive remote focusing microscope for sensing of volumetric neural activity. Biomedical Optics Express. 2017;8:4369–4379. doi: 10.1364/BOE.8.004369.
  52. Žurauskas M, Dobbie IM, Parton RM, Phillips MA, Göhler A, Davis I, Booth MJ. IsoSense: frequency enhanced sensorless adaptive optics through structured illumination. Optica. 2019;6:370–379. doi: 10.1364/OPTICA.6.000370.

Editor's evaluation

K VijayRaghavan 1

Three-dimensional structured illumination microscopy (3D-SIM) is a technique that doubles imaging resolution. Still, its use has been limited by its sensitivity to aberrations in thick tissues and its lack of availability in an upright configuration. This convincing 'Deep3DSIM' method addresses these issues by using adaptive optics to correct aberrations and remote focusing for artefact-free volume imaging. This enables high-quality super-resolution imaging up to 130 µm deep into specimens, such as a Drosophila brain, while allowing for live-specimen manipulation, providing valuable advances on current efforts.

Decision letter

Editor: K VijayRaghavan1

In the interests of transparency, eLife publishes the most substantive revision requests and the accompanying author responses.

[Editors' note: this paper was reviewed by Review Commons.]

eLife. 2025 Oct 28;14:e102144. doi: 10.7554/eLife.102144.sa2

Author response


General statements

We thank the reviewers for the time and care they spent commenting on our manuscript. Their detailed, thoughtful, and constructive comments have improved the manuscript considerably. We have addressed each comment individually, often with new data and analysis, leading to a revision of all the figures and supplementary figures of the paper. Below, we explain our responses and revisions in detail, closely following the revision plan previously accepted by eLife. The revised figures now also follow the guidelines of the journal more closely. For example, single-channel images are now presented in greyscale, while two-channel images use green and magenta. We also adhered to the principles of Colour Universal Design (by Okabe and Ito) wherever possible. In a similar vein, the "Supplementary Materials" have been reorganised into appendices, now placed in the main manuscript document, and Figures 3–5 now include one supplementary figure each. The author contributions have been removed from the main manuscript, as have the figure images, which are submitted as individual files. The figure legends remain in the main manuscript document.

Description of revisions

Major reviewer comments

Reviewer #1:

Major comment #1

Given the emphasis on super-resolution imaging deep inside a sample, we were surprised to see no mention of other forms of structured illumination that allow super-resolution imaging in samples thicker than a single cell. These include the 'spot-scanning' implementations of SIM that offer better imaging at depth by virtue of pinholes, and include MSIM, iSIM, and rescan confocal technologies. The two-photon / AO implementation of iSIM seems particularly germane, e.g. https://pubmed.ncbi.nlm.nih.gov/28628128/ Please consider citing these works, as they help place the existing work into context.

We want to thank Reviewer #1 for making this good point. To address this comment, we added a few sentences in the introduction to describe and cite image scanning methods, referring to the work mentioned by the reviewer as well as additional studies:

“Image scanning methods have been used to enhance the spatial resolution in imaging of thick samples, in widefield (York et al., 2013; York et al., 2012), multiphoton (Ingaramo et al., 2014) and with AO (Zheng et al., 2017). These methods use physical or digital pinholes together with photon reassignment and deconvolution to improve the resolution by a factor of √2 or greater.”

Major comment #2

As we're sure the authors appreciate, besides aberrations, a major additional obstacle to 3D SIM in thick tissues is the presence of out-of-focus background. Indeed, this point was mentioned by Gustafsson in his classic 2008 paper on 3D SIM (https://pubmed.ncbi.nlm.nih.gov/18326650/):

'The application area of three-dimensional structured illumination microscopy overlaps with that of confocal microscopy, but the two techniques have different and complementary strengths. Structured illumination microscopy offers higher effective lateral resolution, because it concentrates much of the excitation light at the very highest illumination angles, which are most effective for encoding high-resolution information into the observed data, whereas confocal microscopy spreads out its illumination light more or-less uniformly over all available angles to form a focused beam. For very thick and compactly fluorescent samples, however, confocal microscopy has an advantage in that its pinhole removes out-of focus light physically. Structured illumination microscopy is quite effective at removing out-of-focus light computationally, because it is not subject to the missing-cone problem, but computational removal leaves behind the associated shot noise. Therefore confocal microscopy may be preferable on very thick and dense samples, for which the in-focus information in a conventional microscope image would be overwhelmed by out-of-focus light, whereas structured illumination microscopy may be superior in a regime of thinner or sparser samples.'

This point is not mentioned at all in the manuscript, yet we are certain it is at least partially responsible for the residual image artifacts the authors mention. Please discuss the problem of out of focus light on 3D samples, particularly with an eye to the 'spot-scanning' papers mentioned above.

We thank Reviewer #1 for emphasising the importance of out-of-focus light in thick specimens and the difficulties it poses to good 3D-SIM imaging. To address the comment, we now describe this issue in the introduction, in the context of both 3D-SIM and image scanning methods:

“3D-SIM methods achieve higher resolution, but they manage background light less efficiently than image scanning methods. Two major reasons for this are the retention, after reconstruction, of the shot noise of the background light and the reduced contrast of the structured illumination (SI), potentially leading to reconstruction artefacts. These problems are compounded by depth and fluorescence density.”

Major comment #3

The authors use a water dipping lens, yet they image into samples that are mounted on coverslips, i.e. they use a dipping lens to image through a coverslip:

This almost certainly introduces spherical aberration, which the authors seem to observe: see attached pdf for reference.

We find this troubling, as it seems that in the process of building their setup, the authors have made a choice of objective lens that introduces aberrations – that they later correct. At the very least, this point needs to be acknowledged in the manuscript (or please correct us if we're wrong) – as it renders the data in Figures 3-4 somewhat less compelling than if the authors used an objective lens that allowed correction through a coverglass, e.g. a water dipping lens with a correction collar. In other words, in the process of building their AO setup, the authors have introduced system aberrations that render the comparison with 3D SIM somewhat unfair. Ideally the authors would show a comparison with an objective lens that can image through a glass coverslip.

We thank Reviewer #1 for pointing out the confusion caused by our lack of clarity regarding the use of a water-dipping/immersion objective lens in our system. To address this comment, we have added descriptions in the introduction and the methods and materials sections that explain how the objective lens can be used in both water-immersion and water-dipping configurations:

“The system is based around a 60×/1.1 NA water-immersion objective lens, with a correction collar that allows its use in a water-dipping configuration without a cover slip.

We used a water-immersion objective lens (Olympus LUMFLN60XW), capable of working in a water-dipping configuration by adjusting a correction collar […]”

Major comment #4

The authors tend to include numbers for resolution without statistics. This renders the comparisons meaningless in my opinion; ideally every number would have a mean and error bar associated with it. We have included specific examples in the minor comments below.

This is a good point, which we address below, in minor comments #8, #9, and #11. In summary, we revised Figure 2 to show the distributions of the FWHM samples, including common statistical measures such as median and mean, and other features like density, dispersion, and skewness.

Major comment #5

In Figure 5, after the 'multipoint AO SIM', the SNR in some regions seems to decrease after AO: see attached pdf for reference

Please comment on this issue.

We want to thank Reviewer #1 for the insightful comment. We redesigned Figure 5, making sure that the two sets of images have the same intensity ranges. With this change, the signal is consistently higher in the AO images, so the issue of decreasing SNR is no longer present.

Major comment #6

Please provide timing costs for the indirect AO methods used in the paper, so the reader understands how this time compares to the time required for taking a 3D SIM stack. In a similar vein, the authors in Lines 213-215, mention a 'disproportionate measurement time' when referring to the time required for AO correction at each plane – providing numbers here would be very useful to a reader, so they can judge for themselves what this means. What is the measurement time, why is it so long, and how does it compare to the time for 3D SIM? It would also be useful to provide a comparison between the time needed for AO correction at each (or two) planes without remote focusing (RF) vs. with RF, so the reader understands the relative temporal contributions of each part of the method. We would suggest, for the data shown in Figure 5, to report a) the time to acquire the whole stack without AO (3D SIM only); b) the time to acquire the data as shown; c) the time to acquire the AO stack without RF. This would help bolster the case for remote focusing in general; as is we are not sure we buy that this is a capability worth having, at least for the data shown in this paper.

We agree that the timing (and other) costs can be an important consideration, and we want to thank Reviewer #1 for bringing up this good point. To address this point, we have expanded the description of the AO in the methods and materials section to include more information about the time it takes to perform the different parts of the AO:

“The duration of the calibration routine was around 30 to 60 minutes, mostly spent on phase unwrapping and other intensive computations.

The duration of the correction routine for a single plane was proportional to the number of modes involved, the number of scanning points for each mode, and the camera exposure configuration, but was usually on the order of seconds. The earlier example of 40 images for our base set of corrections would normally take around 10 seconds (<3.5 seconds of light exposure).

We also added a discussion of how the remote focusing could speed up the image acquisition, further elaborated in Appendix 3:

“In terms of speed, refocusing with the DM instead of using the piezo stage reduced the duration of the image acquisition significantly, because the settling time of the surface of the DM was around an order of magnitude faster than the settling time of the piezo stage, which highlights the potential of AO-based refocusing for fast imaging. A more detailed discussion of this point, together with a comparison between the DM and the piezo stage, is presented in Appendix 3.”

Major comment #7

Some further discussion on possibly extending the remote focusing range would be helpful. We gather that limitations arose from an older model of the DM being used, due to creep effects. We also gather from the SI that edge effects at the periphery of the DM was also problematic. Are these limitations likely non-issues with modern DMs, and how much range could one reasonably expect to achieve as a result? We are wondering if the 10 μm range is a fundamental practical limitation or if in principle it could be extended with commercial DMs.

We are grateful to Reviewer #1 for the suggestions. During the revision of the manuscript, we noticed that there was an error in the reported range of the remote focusing in Figure 5. The actual range is ~16 µm, instead of 10 µm. In addition to this correction, we added discussion of the remote focusing range, the limitations of our approach, and whether other commercial DMs could improve the range (with more technical detail given in Appendix 2):

“The range of remote focusing shown in this work was limited to about ±8 µm, which is smaller than some previous demonstrations. There are two reasons why we did not use wider ranges in our results. First, the DM can only reliably produce PSFs with minimal aberration repeatedly up to a range of about ±10 µm (see Appendix 2). Second, for 3D-SIM acquisition, ±5 µm is already a wide and commonly used Z range. Especially in multi-colour imaging, a rather large number of images, about 1200 frames (3 angles × 5 phases × 80 Z steps) per channel, would be required. While larger volume acquisition is possible, it would lead to considerable photobleaching and phototoxicity, as well as longer acquisition and reconstruction times. Nevertheless, a similar DM device has been used to achieve an impressive range of remote focusing (Cui et al., 2021). Although their approach is different from the high-NA super-resolution imaging presented here, a stable DM should increase the practical range of our remote focusing approach twofold or even greater.”

Reviewer #2:

Major comment #1

The authors have provided an incomplete description of the structured illumination microscopy (SIM) reconstruction process. It is unclear whether the approach is based on 2D interference SIM configurations or 3D interference patterns. Furthermore, the specific algorithm utilized for image reconstruction has not been elucidated. Elaborating on these aspects is crucial as they significantly influence the interpretation of the resulting data.

We thank Reviewer #2 for highlighting the need to describe the reconstruction process in more detail. To address this point, we added further information about the reconstruction process in the Discussion section:

“We chose to use a well-established reconstruction method based on Wiener filtering, as implemented in the softWoRx software suite, without advanced pre-processing of the image data for reduction of artefacts.”

And in the methods and materials section:

“For SIM data reconstruction, we used the commercial software softWoRx (GE Healthcare), which uses Wiener deconvolution, based on the approach by (Gustafsson et al., 2008).”
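To illustrate the core operation behind Wiener-filter-based reconstruction, the sketch below applies a generic single-image frequency-domain Wiener filter. This is an illustration only, not the softWoRx implementation (which separates and recombines the 3D-SIM information bands); the function name `wiener_deconvolve`, the Gaussian toy OTF, and the regularisation value `w` are our own illustrative assumptions.

```python
import numpy as np

def wiener_deconvolve(image, otf, w=0.01):
    """Generic Wiener filter: F_est = conj(OTF) * D / (|OTF|^2 + w).

    `w` is a scalar regularisation parameter that suppresses noise
    amplification at frequencies where the OTF is weak.
    """
    D = np.fft.fft2(image)
    F = np.conj(otf) * D / (np.abs(otf) ** 2 + w)
    return np.real(np.fft.ifft2(F))

# Illustrative use: blur a sparse test image with a toy Gaussian OTF,
# then partially recover it. All values here are arbitrary.
n = 64
img = np.zeros((n, n))
img[16, 16] = img[32, 40] = img[50, 20] = 1.0
fx, fy = np.meshgrid(np.fft.fftfreq(n), np.fft.fftfreq(n))
otf = np.exp(-(fx**2 + fy**2) / (2 * 0.15**2))  # toy OTF, peak at DC
blurred = np.real(np.fft.ifft2(otf * np.fft.fft2(img)))
restored = wiener_deconvolve(blurred, otf)
```

The restored image is closer to the original than the blurred input wherever the OTF is well above the noise floor set by `w`; in real 3D-SIM reconstruction the same trade-off governs how strongly weak high-frequency bands are amplified.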

Major comment #2

The authors have stated that sample-induced aberrations caused by RI inhomogeneities within the specimen is another major reason for causing artifacts generation. Literature has demonstrated that RI inhomogeneities can lead to non-local distortions in the grid pattern, which suggests that applying uniform reconstruction parameters across the entire image may not be viable. Traditional artifact remediation using the classical Wiener method is likely insufficient under these conditions (PMID: 33896197). The existing adaptive optics (AO) approach, which employs a deformable mirror (DM) alongside an sCMOS camera, is inadequate for tackling the issue at hand. Actually the assertion made in the paper that "aberrations change approximately linearly with depth" is seemingly contradicted by simulations referenced in the cited literature (PMID: 33896197). Consequently, it appears that the current methodology might only achieve a partial mitigation of the problems associated with spherical aberration resulting from RI mismatches. It is advisable, therefore, that the authors explicitly acknowledge this limitation in their manuscript to prevent any potential misinterpretation by readers.

We are thankful for this thoughtful comment by Reviewer #2. The focus of our work was not the use of advanced 3D-SIM reconstruction and aberration correction methods; instead, we used standard ones that do not deal perfectly with field-dependent aberrations. As such, our approach provides average reconstruction and average correction across the entire field of view. To address the comment, we have added a new paragraph to the Discussion section:

“Our AO methods provide only one average correction […]”

We also highlighted specific examples of residual aberrations in Figure 4B.

Major comment #3

In Figure 2, the use of COS-7 cells, which are known for their relatively thin axial dimension, for the experiments raises an eyebrow. Notably, there are ample instances in existing research where both 2D-SIM and 3D-SIM, without the integration of adaptive optics, have yielded high-quality super-resolution images of structures such as tubulin and the endoplasmic reticulum. In addition, the authors did not present a direct comparison between BP-SIM and AO-SIM here. Without this comparative analysis, it remains ambiguous whether the enhancements in resolution and contrast and the reduction in artifacts can genuinely be attributed to the mitigation of spherical aberration. To clarify this, it would be beneficial for the authors to include side-by-side comparisons of these modalities to demonstrate the specific improvements attributed to AO-SIM.

We are grateful to Reviewer #2 for this helpful comment. We revised Figure 2 in its entirety, including its bottom part (Figure 2B), to better emphasise the comparison we wanted to make: widefield against 3D-SIM, demonstrating the improvement in resolution. We also revised the second paragraph of the Results section accordingly, to align it with the updated figure and to make our arguments clearer.

Major comment #4

In Figures 3 and 4, the authors have illustrated the enhancements achieved through the application of AO. However, there is a discernible presence of hammer-stroke and honeycomb artifacts in pre-AO imaged data, which seem to originate from the amplification of the incorrectly moved out-of-focal background in the frequency domain. Various strategies have been previously suggested to address these specific artifacts, encompassing methods like subtracting background noise in the raw images or employing selective frequency spectrum attenuation techniques, such as Notch filtering and High-Fidelity SIM. To facilitate a more comprehensive understanding, I would recommend that the authors incorporate into their study a comparison that includes BP-SIM data that has undergone either background subtraction or frequency spectrum attenuation. This added data would enable a more complete evaluation and comparison regarding the merits and impact of their AO approach.

We thank the reviewer for this excellent suggestion. We agree that a pre-processing step, such as background subtraction or frequency spectrum attenuation, can help with the reduction of artefacts. However, when we tried frequency spectrum attenuation, as noted in the revision plan, it did not improve the image quality. We note that the methods suggested by the reviewer, e.g. HiFi-SIM, were designed for 2D-SIM and are not suitable for our 3D-SIM methodology. We feel that the effective removal of 3D-SIM reconstruction artefacts is a complex and difficult task that deserves its own treatment and is therefore well beyond the scope of this manuscript. Nevertheless, to address this comment, we have added a new paragraph to the Discussion section, which clarifies our reconstruction strategy and choice of algorithms:

“An important consideration in 3D-SIM reconstruction is the precise settings […]”

The paragraph also includes references to novel 3D-SIM reconstruction methods that could produce higher-fidelity reconstructed data with fewer artefacts, and that can be applied to Deep3DSIM.

Reviewer #3:

Major comment #1

There is an overall reference in the manuscript to the novelty and possible range of applications of using an upright microscope configuration. Examples mentioned are tissue-based imaging, and access to whole-mount specimens for manipulation and electrophysiology. However, the authors fail to present any such applications. There is not a single example presented which could not have been obtained with an inverted microscope. Could the authors provide an example where a water-dipping objective is used? Expanded samples could be one case, since the thickness of the gel makes it difficult to image with an inverted microscope. Another possible example would be to label the extracellular space and do shadow imaging of the tissue (SUSHI, PMID: 29474910). ExM might be simpler to do as part of revising the manuscript than SUSHI.

We are thankful to Reviewer #3 for the constructive comment. The data in Figure 5A is an example where water-dipping is used, which we now explicitly state in the revised figure legend. The reviewer’s notes on expansion microscopy and SUSHI are interesting, but the primary purpose of our microscope system is to facilitate live super-resolution cell imaging experiments. Nevertheless, to address this comment, we have added a discussion of the application of Deep3DSIM to expansion microscopy:

“Deep3DSIM is also uniquely positioned for the imaging of the whole sample in expansion microscopy, without the need for mechanical sectioning, which is a challenge because of the softness and fragility of commonly used hydrogel techniques (Cahoon et al., 2017).”

Major comment #2

On the main text it is described a 5-fold volumetric resolution, which is confusing since authors only mention lateral and axial resolutions. Their measurements correspond to a ~1.6-fold lateral improvement and ~1.7-fold axial improvement. These are however not the 95% of the achievable resolution theoretical maximum, as stated in p7 SI (2 fold increase of 282nm), but only the 80-85%. This point should be rephrased in the manuscript.

We thank Reviewer #3 for bringing up this important point. In response, we have revised our description of the resolution improvement in the first two paragraphs of the Results section. We also removed the claim of 5-fold improvement, which we agree is confusing.

Major comment #3

[OPTIONAL] p4 and related to figure 2, it would be important to report also measurements of beads with SIM but without AO, just as done for WF. Is there an improvement of using AO on SIM? This is reported for the fixed cells but not for the beads.

We agree with the reviewer’s point and, in response, we have revised Figure 2 for greater clarity. We wanted to demonstrate the resolution improvement of 3D-SIM over widefield imaging. With the revision of the figure, we shifted the emphasis away from AO to avoid confusion.

Major comment #4

Figure 2, it is odd the comparison between WF+/- AO and SIM +/- AO are done using different cellular structures. Since wavelengths used are not the same it is difficult to interpret if there is any improvement of using AO on SIM compared to SIM without AO. Same questions arise as above, Is there an improvement of using AO on SIM?

We agree that the data in Figure 2C-D was presented in an unusual way. We addressed the comment by revising Figure 2 to clarify this point. We also substantially revised the corresponding descriptions in the first two paragraphs of the Results section.

Major comment #5

"A significant benefit and uniqueness of the Deep3DSIM design is its upright configuration, whereas commercial SIM systems are built around inverted microscopes and are usually restricted to imaging thin samples, such as cultured cells." (p5) is not correct. The commercial DeepSIM module from CREST Optics can be mounted on an inverted microscope as well as image deep into tissue (see https://crestoptics.com/deepsim/ and application notes therein) and be used with essentially any objective. This point should be rephrased in the text.

We thank Reviewer #3 for pointing out this mistake. Of course, we meant commercial 3D-SIM systems, such as GE Healthcare DeltaVision OMX and Nikon N-SIM. We have rephrased this sentence in the beginning of the third paragraph of the Results section:

“Deep3DSIM is an upright system with a long working distance objective lens, in contrast to commercial 3D-SIM systems which are built around inverted microscopes, and which are usually restricted to imaging thin samples, such as cultured cells.”

Regarding the commercial DeepSIM module from CREST Optics, as far as we can tell, it uses an image scanning microscopy method. This is very different from our method, which uses 3D-SIM.

Major comment #6

Figure 3 reports the improvements of AO on SIM for imaging over 10um in tissue. What are the Zernike modes measured? Or how does the pupil look like before and after correction? It would be also good to report the Fourier amplitudes as done in Figure 2C as a quantitative measure of improvement. It would be good to point out the artifacts observed on the BP SIM image reconstruction (labelled with 3x, fringes are noticeable).

This is a good suggestion – thanks. In response, we have revised Figure 3 to improve the presentation of the data, including pointing out the reconstruction artefacts. We also added Figure 3—figure supplement 1 to show the power spectra and the Zernike mode corrections.

Major comment #7

Many key details relating to image acquisition and AO correction are missing for all figures. How is the AO optimization implemented? Is it implemented via a genetic algorithm (progressive optimization of parameters) or using more clever strategies? Not clear if the optimization is implemented using images obtained with flat illumination or after SIM imaging/processing of a given dataset. How long does the AO optimization take? How sensitive to noise is the process? What metric do they use to estimate the sensorless AO correction? On pag12, they say "Fourier domain image metric" for measurements with fine details; otherwise, ISOsense when not high frequencies are present. Could the authors report the formula used to calculate the first metric? What do they consider to be low and high frequencies in this case? Is there a reason why ISOsense is not always used, or is there an automatic way to choose between the two? How many images were acquired for AO correction? Which samples were corrected with ISOsense and which ones with Fourier domain image metric? (see for example the detailed experimental reporting in the Supp Mat from Lin et al. Nat Commun 2021).

We are grateful to Reviewer #3 for the extensive list of questions. We have greatly expanded our description of the AO methods in the methods and materials section, which now answers all these questions. We have also added a new appendix (Appendix 5) providing the formulae and a description of the Fourier-based image metric.
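As a rough illustration of what a Fourier-domain image-quality metric for sensorless AO can look like, the sketch below scores an image by the fraction of its spectral power outside a low-frequency disc. This is a generic high-frequency-power metric of our own construction, not the exact formula given in Appendix 5; the cutoff radius `low_cut` is an arbitrary assumption.

```python
import numpy as np

def fourier_metric(image, low_cut=0.1):
    """Generic Fourier-domain sharpness metric: fraction of total
    spectral power lying outside a disc of radius `low_cut`
    (cycles/pixel). Aberrations blur the image and deplete high
    spatial frequencies, lowering this metric; a sensorless AO
    routine adjusts the DM to maximise it.
    """
    spectrum = np.abs(np.fft.fft2(image)) ** 2
    fx, fy = np.meshgrid(np.fft.fftfreq(image.shape[1]),
                         np.fft.fftfreq(image.shape[0]))
    mask = np.hypot(fy, fx) > low_cut
    return spectrum[mask].sum() / spectrum.sum()

# Sanity check: a blurred copy should score lower than the original.
rng = np.random.default_rng(0)
sharp = rng.random((64, 64))
blurred = (sharp + np.roll(sharp, 1, 0) + np.roll(sharp, -1, 0)
           + np.roll(sharp, 1, 1) + np.roll(sharp, -1, 1)) / 5
```

In a sensorless routine, such a metric would be evaluated for a series of trial Zernike amplitudes applied to the DM, and the correction chosen at the maximum (e.g. of a fitted parabola).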

Major comment #8

Figure 4. Data presented for larval brain tissue is a very clear example of adding AO to image deep into tissue as the effect at ~130 cannot be understated. Here too, it would be also good to report the Fourier amplitudes as done in Figure 2C as a quantitative measure of improvement and possibly the SNR of reconstructed images. Having a way to quantitatively describe how much better are those images would be great. Also, what are the aberrations corrected? Can the wavefront or Zernike amplitude of the modes be reported? Same as for Figure 3, details about AO correction are missing.

This is a very helpful comment – thanks. We added Figure 4—figure supplement 1, which shows the power spectra and the Zernike correction modes.

Major comment #9

[OPTIONAL] "It is worth noting that aberrations can differ across larger fields, and therefore, after applying an average correction, residual aberrations can still be observed in some regions of the AO-corrected bead images. However, the overall PSF shapes were still dramatically improved with AO compared to the equivalent without AO." This point is very interesting although not result either in the main text or in the SI is presented.

This is a good point. In response, we revised Figure 4, to highlight two examples of residual aberrations. We also added a new paragraph to the Discussion section, where we elaborate on the subject:

“Our AO methods provide only one average correction […]”

Major comment #10

"As we found that the aberrations change approximately linearly in depth, we could measure the aberration in two planes and calculate the corrections in intermediate planes by interpolation, an approach which we termed "multi-position AO"." This is, personally, one of the major contributions of this work to the community. Unfortunately, it is not reported in detail. Not only for SIM but for imaging with WF or confocal, such linear change for aberrations with depth is not well known. Again, here the details of AO correction and image metrics are missing. To establish that for most thick biological structures 'aberrations change approximately linearly in depth' would be foundational to the widespread use of AO within standard imaging. Would it be possible for the authors to elaborate on this point and present detailed results? What is the error from measuring and correcting more than 2 planes? What is the error from just measuring and AO correcting at the deeper plane, i.e. from a single measurement? Authors could also show a case in which a linear assumption works nicely (or how well it works). For example, comparing an intermediate plane (or a plane beyond) imaged after AO optimization or after coefficient interpolation of the Zernike modes and compare it against correcting directly that plane.

We thank the reviewer for highlighting this poorly phrased sentence; we meant proportional rather than linear. We have rephrased the argument, which now appears in the last paragraph of the Results section. More generally, our description of the AO methods has been greatly expanded and now includes details of the AO correction and the image metrics. Estimating the depth dependence from a single measurement is not possible, but we show the simplest possible example, a line fitted to two measurements (planes), in Figure 5B, which also shows planes beyond the deeper measurement point (bottom arrow).
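The two-plane "multi-position AO" scheme can be sketched as a per-mode linear fit through the corrections measured at two depths. This is a minimal illustration under our reading of the method: the function name, the three-mode coefficient vectors, and all numerical values are hypothetical, and a real implementation would also account for the system flat and the DM actuation.

```python
import numpy as np

def interpolate_corrections(z1, coeffs1, z2, coeffs2, z_targets):
    """Linearly interpolate (or extrapolate) Zernike correction
    coefficients measured at two depths z1 and z2 onto a set of
    target depths ("multi-position AO": a line fitted per mode).
    """
    coeffs1 = np.asarray(coeffs1, dtype=float)
    coeffs2 = np.asarray(coeffs2, dtype=float)
    slope = (coeffs2 - coeffs1) / (z2 - z1)  # per-mode change with depth
    return [coeffs1 + slope * (z - z1) for z in z_targets]

# Hypothetical example: corrections dominated by spherical aberration,
# measured at 10 um and 130 um depth, interpolated every 20 um.
shallow = [0.05, 0.00, 0.10]  # e.g. [astig, coma, spherical], arbitrary units
deep = [0.10, 0.02, 0.90]
planes = interpolate_corrections(10.0, shallow, 130.0, deep,
                                 z_targets=np.arange(10.0, 131.0, 20.0))
```

Each interpolated coefficient vector would then be applied to the DM before acquiring the corresponding Z plane, avoiding a full sensorless measurement at every depth.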

Major comment #11

The image of the cos-7 cell in metaphase, for Figure 5 is, however, very disappointing. See Figure 1 of Novak et al. Nat Commun 2018 for an example of a single z-plane of a cell in metaphase. Having the possibility to correct for the entire 3D volume, I would expect amazing 3D volumes (movies and/or projections) associated with this imaging which are not presented.

This is an interesting point. In response, we revised the old Figure 5 to display the data more effectively, including clearer line profile measurements. The revised figure was merged with the old Figure 6 and now corresponds to Figure 5B. This revision allows us to motivate and describe the experiments better in the last two paragraphs of the Results section.

Major comment #12

In Figure 6, they use AO in remote configuration mode to allow imaging of live specimens. It needs to be clarified if this is an a priori characterization that is then kept fixed while recording in time. The last acquired volume of Figure 6A and B have a higher amount of artifacts with respect to time 00:00. Are those artifacts due to lower SNR (maybe due to sample bleaching) or due to some change in the aberrations of the specimen?

Thanks for this valuable comment. Our revised description of the AO methodology now explicitly states that the AO correction characterisation was performed before the actual data acquisition:

“When imaging, we always carried out the aberration correction routine first and then held the DM in the corrective shape during the actual image acquisition. For relatively thin volumes, e.g. 1-2 µm thickness, we mostly corrected just a single plane (usually the middle of the stack) […]”

Regarding the last volume in Figure 6 (now Figure 5A), the reorganisation of the mitotic spindle is highly dynamic, but the reconstruction artefacts remain constant throughout the series. Nevertheless, for clarity and transparency, we point the artefacts out in the description of the results and refer readers to the methods and materials section, where we have added a new subsection on this topic, called “SIM reconstruction artefacts”.

Major comment #13

"These results demonstrate that the remote focusing functionality of the system can be successfully applied for live 3D-SIM experiments, allowing four-dimensional acquisition while keeping the specimen stationary, thus avoiding the usual agitation and perturbations associated with mechanical actuation." Generally, this statement is true, but for the specific example shown of Drosophila embryogenesis is it relevant? If they use piezo-driven Z-stack imaging with AO, does that lead to incorrect reconstructions or motion-induced artifacts? Related to the results shown in Figure 6, the fair comparison would be AO SIM vs SIM (without AO), not AO SIM vs AO WF.

We have addressed this excellent comment by revising Figure 6 (now Figure 5A) and its description in the penultimate paragraph of the Results section. We hope this revision clarifies that remote focusing can meet the stringent image quality requirements of 3D-SIM reconstruction and that it is suitable for live imaging, i.e. fast and gentle in its use of excitation light.

Major comment #14

When performing remote focusing, is the effective NA of the imaged plane changing with respect to the NA of the objective used at its focal plane?

To address this good point, we now mention the negligible change in effective NA in the methods and materials section:

“The change in effective NA due to focus shift was negligible with our choice of focusing range (up to ~16 µm) and objective lens.”

Major comment #15

[OPTIONAL] Did the authors run calculations to explore whether a commercial upright microscope could be used instead of their design? Are there any fundamental flaws that would make impossible using a commercial base? If not, could an AO SIM module be designed such that it adds on a commercial base? It would be important to discuss this point.

We thank Reviewer #3 for bringing up this interesting point. Considerable analysis, calculation, and modelling went into the prototyping of Deep3DSIM, and the use of a commercial upright microscope stand was of course considered. One obvious limitation was the difficult access to the pupil-conjugate plane. In addition, a commercial microscope stand was not compatible with many of the core parts of the system, which were chosen for specific biological applications, such as the dual-camera system for fast simultaneous live imaging and the heavy-duty Z stage intended to support two heavy micromanipulators. To address this point, we have added a brief discussion of the commercialisation of Deep3DSIM to the last paragraph of the Discussion section:

“Finally, Deep3DSIM was custom prototyped on a large optical table, instead of basing the design around a commercial upright microscope, because the latter would have imposed unnecessary restrictions at this prototyping stage. In the future, the optical path of Deep3DSIM could be simplified and reduced in size, thus making it easier to adopt or commercialise, and to add on to existing microscopes, as well as to build specific Deep3DSIM systems designed for bespoke applications.”

Minor reviewer comments

Reviewer #1:

Minor comment #1

The paper mentions Ephys multiple times, even putting micromanipulators into Figure 1 – although it is not actually used in this paper. If including in Figure 1, please make it clear that these additional components are aspirational and not actually used in the paper.

To address this point, we revised the legend of Figure 1 to make it clear that the use of the micromanipulators is conceptual and not actually shown in the manuscript.

Minor comment #2

The abstract mentions '3D SIM microscopes', 'microscopes' redundant as the 'm' in 'SIM' stands for 'microscope'.

To correct this error, we have revised the wording in the abstract and other parts of the manuscript that mentioned “SIM microscope(s)”, which we replaced with “SIM system(s)”.

Minor comment #3

'fast optical sectioning', line 42, how can optical sectioning be 'fast'? Do they mean rapid imaging with optical sectioning?

Yes. We have now revised this part of the introduction to “fast imaging with optical sectioning”.

Minor comment #4

line 59, 'effective imaging depth may be increased to some extent using silicone immersion objectives', what about water immersion objectives? We would guess these could also be used.

We rephrased this part of the introduction to include water immersion objective lenses:

“Silicone oil and water immersion, or water-dipping, objective lenses, with lower NA, have longer working distances […]”

Minor comment #5

line 65 – evidence for 'water-dipping objectives are more sensitive to aberrations' ? Please provide citation or remove. They are certainly more prone to aberrations if used with a coverslip as done here.

We reviewed this phrase and found it poorly worded and confusing, so we removed it.

Minor comment #6

'fast z stacks' is mentioned in line 103. How fast is fast?

We changed the wording of this line to make it clearer that we are referring to acquisition speed. We discuss the acquisition speed in the last paragraph of the Discussion section, and we provide more technical details about the axial scanning speed in Appendix 3.

Minor comment #7

line 116 'we imaged 100 nm diameter green fluorescent beads'. Deposited on glass? Given that this paper is about imaging deep this detail seems worth specifying in the main text.

We added this detail, namely that the beads were deposited on a glass cover slip, to the first paragraph of the Results section.

Minor comment #8

lines 127-130, when describing changes in the bead shape with numbers for the FWHM, please provide statistics – quoting single numbers for comparison is almost useless and we cannot conclude that there is a meaningful improvement without statistics.

We agree with this comment, so we have provided statistics for the numbers from the first paragraph of the Results section. In addition, we present thorough statistical treatment of the FWHM measurements in Figure 2A.

Minor comment #9

In the same vein, how can we understand that remote focus actually improves the axial FWHM of the widefield bead? Is this result repeatable, or is it just noise?

Yes, the remote focusing does improve the axial FWHM resolution, compared to the AO bypass case, since our implementation of the remote focusing corrects for some aberrations, such as spherical aberration. However, we removed the remote focusing FWHM resolution estimations from Figure 2A, because they detracted from the main comparison that we wanted to make in this figure – widefield vs SIM.

Minor comment #10

line 155, 'Because of the high spatial information…' -> 'Because of the high resolution spatial information…'

We removed this sentence from the revised manuscript.

Minor comment #11

When quoting estimated resolution #s from microtubules (lines 158-163) similarly please provide statistics as for beads.

We further considered this suggestion and, on reflection, we realised that our estimate of resolution uses the frequency response of the imaging system, which does not lend itself to the same statistical treatment as the FWHM resolution estimations. To make this point clearer, we revised the second part of Figure 2 extensively, now showing explicitly how the numbers were derived. We also rephrased the respective description in the text (second paragraph of Results section) and the figure’s legend.

Minor comment #12

It seems worth mentioning the mechanism of AO correction (i.e. indirect sensing) in the main body of the text, not just the methods.

We agree with this comment. We added a brief description of the AO approach to the third paragraph of the Results section:

“Our implementation of the AO did not use a dedicated wavefront sensor, instead relying on an indirect sensorless approach.”
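For readers unfamiliar with sensorless AO, the core of such an indirect approach can be sketched in a few lines: trial amplitudes of one Zernike mode are applied to the deformable mirror, an image-quality metric is evaluated for each trial image, and the optimal amplitude is estimated from a parabolic fit. The sketch below is illustrative only, assuming a generic high-frequency-energy metric and hypothetical `acquire`/`apply_mode` device callbacks; the actual metric and optimisation used in microscope-aotools may differ.

```python
import numpy as np

def sharpness_metric(image):
    """Image-quality metric: high-spatial-frequency energy of the image.
    (A common choice for sensorless AO; the metric actually used by the
    system may differ.)"""
    spectrum = np.abs(np.fft.fftshift(np.fft.fft2(image))) ** 2
    h, w = image.shape
    yy, xx = np.ogrid[:h, :w]
    r = np.hypot(yy - h / 2, xx - w / 2)
    # Exclude the low-frequency core so only fine detail contributes.
    return spectrum[r > min(h, w) / 8].sum()

def correct_mode(acquire, apply_mode, amplitudes=(-1.0, -0.5, 0.0, 0.5, 1.0)):
    """Estimate the best amplitude of one Zernike mode.

    `acquire()` grabs an image; `apply_mode(a)` puts amplitude `a` (rad)
    of the current mode on the deformable mirror.  A parabola is fitted
    to metric-vs-amplitude and the location of its maximum is returned.
    """
    metrics = []
    for a in amplitudes:
        apply_mode(a)
        metrics.append(sharpness_metric(acquire()))
    # Quadratic fit: metric ~ c2*a^2 + c1*a + c0, optimum at -c1/(2*c2).
    c2, c1, c0 = np.polyfit(amplitudes, metrics, 2)
    if c2 >= 0:  # no clear peak; keep the current correction
        return 0.0
    return -c1 / (2 * c2)
```

In practice this estimate-and-apply step is repeated over a set of Zernike modes (and possibly iterated), which is why the number of corrected modes directly affects the total correction time.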

Minor comment #13

How long do the AO corrections take for the datasets in the paper?

We added details regarding the duration of the calibration and the AO-based corrections to the methods and materials section.

Minor comment #14

Were the datasets in Figure 2-4 acquired with remote focusing, or in conventional z stack mode? Please clarify this point in the main text and the figure captions.

No, they were not. To address this comment, we rearranged the results, and we merged the old Figure 5 and Figure 6. The new Figure 5 is now the only example with remote focusing.

Minor comment #15

It would be helpful when showing z projections in Figures 3-5 to indicate the direction of increasing depth (we assume this is 'down' due to the upright setup, but this would be good to clarify)

We revised all figures to have axis arrows for the microscope images, indicating the direction of increasing depth for images with a Z axis.

Minor comment #16

line 174, 'showed significant improvements in both intensity and contrast after reconstruction' – we see the improvements in contrast and resolution, it is harder to appreciate improvements in intensity. Perhaps if the authors showed some line profiles or otherwise quantified intensity this would be easier to appreciate.

We revised all figures to either have the same intensity ranges, making it easy to visually appreciate changes in intensity, or to have bars (usually above the image) that quantify the intensity range.

Minor comment #17

line 195 'reduced artefacts' due to AO. We would agree with this statement – the benefit from AO is obvious, and yet there are still artefacts. If the authors could clarify what these (residual) artefacts are, and their cause (out of focus light, uncorrected residual aberrations, etc) this would be helpful for a reader that is not used to looking at 3D SIM images.

We agree with this comment. To address it, we explicitly pointed out the residual aberrations in the results of Figure 4B. We also consolidated the discussion of this topic into a new subsection (“SIM reconstruction artefacts”), which we added to the Methods and materials section.

Minor comment #18

Line 197, 'expected overall structure', please clarify what is expected about the structure and why.

We moved our brief description of the expression pattern of the Cno protein from the legend of Figure 4 to the Results section and greatly expanded it:

“The larvae used to prepare the samples came from a fly line with YFP protein trap for canoe (cno), which encodes a protein that is expressed in the adherens junctions of the monolayer of neuroepithelial stem cells in the outer proliferation centres in the optic lobes. The protein expression marked the outlines of the cells and showed the characteristic geometrical mosaic pattern that emerges from the packing of epithelial cells, covering large parts of each brain lobe and thus providing a well-defined structure that could be imaged at various depths.”

We also changed the “expected overall structure” phrase to “expected overall mosaic structure”.

Minor comment #19

Line 199, what is a 'pseudo structure'?

We replaced this poor description with “fuzzy patches”.

Minor comment #20

Figure 4B, 'a resolution of ~200 nm is retained at depth', please clarify how this estimate was obtained, ideally with statistics.

We agree with this comment, and have removed the confusing statement. What we meant was that the lateral FWHM resolution at ~130 µm is mostly the same as on the surface of the sample, which we now explicitly show in Figure 4C. We now also state in the results that “the lateral FWHM resolution is preserved even at this depth”.

Minor comment #21

Figure 4D, please comment on the unphysical negative valued intensities in Figure 4D, ideally explaining their presence in the caption. It would also be helpful to highlight where in the figure these plots arise, so the reader can visually follow along.

We agree with this comment. The negative intensities were removed in the revision of Figure 4 and therefore, an explanation is no longer necessary. We also indicated in the images where the line profiles came from.

Minor comment #22

Line 245, 'rapid mitosis'. What does rapid mean, i.e. please provide the expected timescale for mitosis.

The mitotic cycles at this developmental stage are short, around 5 minutes per mitosis, compared to those of somatic cells, which take several hours. We expanded our description of this developmental stage, specifying the duration of each division cycle:

“[…] we imaged Drosophila embryos undergoing rapid mitosis, where each division cycle lasts just a few minutes […]”

Minor comment #23

For the data in Figure 6, was remote refocusing necessary?

We improved our motivation for the data in Figure 6 (now Figure 5A) in the penultimate paragraph of the Results section:

“A key feature of Deep3DSIM is the ability to perform remote focusing. On the other hand, the reconstruction process of 3D-SIM is very stringent about the image quality and the image acquisition parameters, such as the accuracy and precision of Z steps in the acquisition of Z-stacks. Therefore, we wanted to show that remote focusing can meet the strict requirements of 3D-SIM reconstruction, and that the remote focusing can be used as a direct replacement of a mechanical Z stage over a moderate range. We demonstrate this with two examples, starting first with live imaging. Living specimens already present several challenges for 3D-SIM imaging on their own (Gao et al., 2012). Particularly important are the acquisition speed, which can lead to motion artefacts if not sufficiently high, and photobleaching and phototoxicity, which occur when the rate of exposure and the intensity of the excitation light are too high. The Deep3DSIM approach has the potential to resolve both these challenges by using remote focusing for fast image acquisition while preserving the high image quality of the system.”

Minor comment #24

What is the evidence for 'reduced residual aberrations', was a comparative stack taken without AO? In general we feel that the results shown in Figure 6 would be stronger if there were comparative results shown without AO (or remote focusing).

We agree with this comment, so we removed the confusing statement. However, we did add a new paragraph to the Discussion section, where we discuss field-dependent aberrations and the residual aberrations in Figure 4B:

“Our AO methods provide only one average correction […]”

Minor comment #25

Line 350, 'incorporation of denoising algorithms' – citations would be helpful here.

We added two examples of such denoising algorithms to the last paragraph of the Discussion section:

“[…] future improvements could also include the incorporation of denoising algorithms into the reconstruction process to allow lower light dosage, which would enable faster imaging, reduced photobleaching, and reduced specimen photodamage, as demonstrated, for example, in (Huang et al., 2018; Smith et al., 2021).”

Minor comment #26

Line 411, 'All three were further developed and improved' – vague, how so?

We agree with this comment. To address it, we added a brief description of the notable changes and we added references to software repositories where more detailed descriptions can be found:

“All three were further developed and improved, to match the requirements of Deep3DSIM. Most notably, remote focusing was added to the microscope-aotools package. All changes were tracked and they are described in more detail in the respective branch repositories:

python-microscope: https://github.com/dstoychev/microscope/tree/deepsim

microscope-cockpit: https://github.com/dstoychev/cockpit/tree/deepsim

microscope-aotools: https://github.com/dstoychev/microscope-aotools/tree/deepsim-matthew-danny.”

Minor comment #27

Sensorless AO description; how many Zernike modes were corrected?

We specified which Zernike modes were usually corrected in the methods and materials section. Instead of tables, we have now added proper plots of example correction modes to the figure supplements of Figure 3 to Figure 5.

Minor comment #28

Multi-position aberration correction. Was the assumption of linearity in the Zernike correction verified or met? Why is this a reasonable assumption?

The claim of linearity arose from poor phrasing. What we meant was proportionality with depth and linearity of the estimation. We have now rephrased that section in the last paragraph of the Results section:

“We demonstrate MPAC in its simplest form, a linear estimation of two planes, which we found to work well, partially because some of the dominant aberrations, like spherical, are proportional to the imaging depth.”
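As an illustration, the linear two-plane estimation described above could be sketched as follows. This is a minimal sketch with hypothetical names, not the actual MPAC implementation: per-mode Zernike amplitudes measured by sensorless AO at two calibration depths are interpolated (or extrapolated) linearly in depth, which is reasonable when dominant aberrations such as spherical grow proportionally with imaging depth.

```python
import numpy as np

def interpolate_correction(z, z_top, c_top, z_bottom, c_bottom):
    """Linearly estimate Zernike correction amplitudes at depth `z`.

    `c_top` and `c_bottom` are arrays of mode amplitudes measured at
    calibration depths `z_top` and `z_bottom`; each mode is estimated
    independently by a straight line through the two measurements.
    """
    c_top = np.asarray(c_top, dtype=float)
    c_bottom = np.asarray(c_bottom, dtype=float)
    t = (z - z_top) / (z_bottom - z_top)
    return (1.0 - t) * c_top + t * c_bottom
```

For example, with corrections measured at the surface and at 100 µm, the estimate at 50 µm is simply the per-mode average of the two calibration corrections.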

Minor comment #29

Figure S1B is not useful; if the idea is to give a visual impression of the setup, we would recommend providing more photos with approximate distances indicated so that the reader has a sense of the scale of the setup. As is – it looks like a photograph of some generic optical setup.

Figure S1 is now Appendix 4 Figure 1. We had intended to provide more photographs of the instrument, but its footprint is too large and complex to capture adequately in a single photograph, especially within the dimensions the figure would allow. We therefore decided against adding multiple photographs and removed the second part of the figure entirely, to avoid confusing readers with photographs that are difficult to interpret.

Minor comment #30

SI pattern generation – 'the maximum achievable reconstruction resolution was only slightly reduced to about 95% of the theoretical maximum'. We don't understand this sentence, as the resolution obtained on the 100 nm beads is considerably worse than 95% of the theoretical maximum. Or do the authors mean 95% of the theoretical maximum given their pitch size of 317 nm for green and 367 nm for red?

We agree with this comment, so we have removed the confusing phrase.

Minor comment #31

SI Deformable mirror calibration 'spanning the range [0.1, 0.9]' – what are the units here?

This comment is related to part of what used to be the supplementary material, describing one of the calibration procedures of the AO. To avoid confusion, we decided to remove the details about the calibration procedure, as they are already well documented in the cited literature, e.g. (Antonello et al., 2020; Hall, 2020).

Minor comment #32

What are the units in Figure S5C, S5D?

Similar to minor comment #31, this figure was removed to avoid possible confusion.

Minor comment #33

It would be useful to define 'warmup' also in the caption of SI Figure S6A.

We agree with this comment. We added a description of the warm-up routine to the figure legend: “warm-up (4 hours of 1 rad defocus)”.

Minor comment #34

SI Remote Focusing, 'four offsets, {-5 mm, -2.5 mm, 2.5 mm, 5 mm}…' are the units mm or um?

The units were supposed to be micrometres. We corrected this error: “four offsets, {-5, -2.5, 2.5, 5} µm”.

Minor comment #35

'…whereas that of the 10 beads was…' here, do the authors mean the position of the beads derived from the movement of the piezo stage, as opposed to the remote focusing?

We agree with this comment. To address this confusion, we have re-analysed the data using exact pooled variance for each of the four offset steps, providing a standard well-defined measure of dispersion:

“In terms of precision, we calculated the exact pooled variance of each of the offsets to be {6.45, 4.15, 20.58, 11.83} nm, respectively.”
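For clarity, the exact pooled variance is the (nᵢ − 1)-weighted average of the per-group sample variances. A minimal sketch of that computation, with hypothetical measurement groups rather than our actual bead data, is:

```python
import statistics

def pooled_variance(groups):
    """Exact pooled variance of several groups of measurements:
    the (n_i - 1)-weighted mean of the per-group sample variances."""
    numerator = sum((len(g) - 1) * statistics.variance(g) for g in groups)
    denominator = sum(len(g) - 1 for g in groups)
    return numerator / denominator

# Hypothetical example: two groups of repeated position measurements.
pooled_variance([[1, 2, 3], [2, 4, 6]])  # -> 2.5
```

In our case each group corresponds to the repeated position measurements of one bead at a given remote-focusing offset, giving one pooled value per offset.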

Minor comment #36

The authors refer to the 'results from Chapter 3.2'. What are they talking about? Do they mean a supplementary figure, or earlier supplementary results? In general, we found the discussion in this paragraph difficult to follow.

This was a remnant from an earlier version of the supplementary material, which used numbered sectioning. We rephrased this part from “results from Chapter 3.2” to “findings from Appendix 6”.

Minor comment #37

Supplementary Figure 9 seems to be not referred to anywhere in the text.

To correct this, we changed Supplementary Figure S9 to be Figure 5—figure supplement 1, and we refer to it in the methods and materials section:

“Example correction modes and their amplitudes for each sample are given in the figure supplements of Figure 3 to Figure 5.”

Minor comment #38

Since the paper emphasizes 3D SIM, OTFs along the axial direction would also be useful to show, in addition to the lateral OTFs shown in Figure 2D.

To address this point, we have added axial power spectra, showing the axial OTF supports, to Figure 2. We also provide similar data in the figure supplements for Figure 3 to Figure 5.

Minor comment #39

When the sample is moved by the piezo, the axial phase of the 3D-SIM illumination pattern is stable as the sample is scanned through the illumination pattern. When remote focusing is performed, the sample is always stable so the axial phase of the 3D-SIM illumination pattern is presumably changing with remote focusing. Can the authors clarify if the 3D SIM illumination pattern is scanned when remote focusing is applied, or is the intensity pattern stable in z?

To address this point, we have now clarified in the methods and materials section that the structured illumination pattern was scanned:

“When remote focusing was combined with 3D-SIM imaging, the focal plane and the SI pattern were synchronously moved together during axial scanning, keeping constant the phase relationship between the two.”

Minor comment #40

In Supplementary Figure 9, primary spherical is referred to twice, both at index 11 and 22. The latter is presumably secondary spherical?

Yes. We revised Supplementary Figure S9, which is now Figure 5—figure supplement 1, and we removed the arrows to avoid confusion.

Minor comment #41

We do not understand the x axis label, in Figure S4D, is it really [0, 50, 50, 50] as written?

We thank the reviewer for pointing this out; the x axis was indeed mislabelled. It is supposed to show three 0 to 50 μm ranges. Supplementary Figure S4 is now Appendix 1—figure 1, and we modified the x-axis labels in panel B to correctly show the full range of 150 µm. We also expanded our description of the plot in the figure legend:

“The plot shows three 50 μm scans through the bead, one for each of the SI angles.”

Reviewer #2:

None

Reviewer #3:

Minor comment #1

Figure 2 lacks a color bar for D panels, which is in log scale. Authors should also show the Fourier transform along the z direction.

To avoid confusion, the colour in the power spectra of Figure 2 has been removed, so a colour bar is no longer necessary. However, we did specify in the figure legend that the images are in logarithmic scale. We also added the axial power spectra.

Minor comment #2

p4, "Such minor aberrations tend to be insignificant in conventional microscopy modalities such as widefield and confocal (Wang and Zhang, 2021). Therefore…" If optical aberrations are insignificant for single cells in widefield and confocal why do experiments here? These sentences should be rephrased to motivate better the experiments performed.

We agree with this comment, and we rephrased the sentence:

“Such minor aberrations are often overlooked in conventional microscopy modalities such as widefield and confocal (Wang & Zhang, 2021). Therefore […]”

Minor comment #3

Imaged microtubules look abnormal, 'dotty' (figure 2) in both WF and SIM. See https://sim.hms.harvard.edu/portfolio/microtubules/ or Figure 1 of Wegel, et al. Dobbie Sci Rep 2016, for better examples of continuous microtubule structures as imaged with SIM.

To address this point, we have added a new paragraph to the “Cell culture” part of the methods and materials section, where we address the fragmentation of microtubules:

“The microtubules stained in this way occasionally appeared fragmented, which is more noticeable in the SIM images in Figure 2B. This fragmentation is normal for a sample preparation protocol such as ours, based on 2% paraformaldehyde fixation. More advanced protocols, specifically designed for the preservation of structures such as microtubules, can result in more continuous filaments.”

Minor comment #4

Is also the remote focusing performed via optimization of metrics similar to the one used for compensating aberrations?

Yes, the remote focusing is based on our regular aberration correction method. To improve clarity, we rephrased our description of the remote focusing in the methods and materials section, explicitly stating that its calibration uses the same type of optimisation algorithm as the aberration correction:

“To calculate the required remote focusing DM patterns, we first performed a calibration procedure, which was based on our aberration correction method and hence used the same type of optimisation algorithm.”

We further elaborate on this procedure in Appendix 2.

Minor comment #5

Figure 2, the order of names on the top right of the panel should match the order of curves presented.

The plot legends were removed during the revision of Figure 2, thus making this comment no longer applicable to the new figure.

Minor comment #6

I value the efforts to improve open-source tools for system and AO control and GUI. And those tools seemed to have been modified for this work, although those modifications are not described. Would it be possible for the authors to describe those modifications?

We added a brief description of the major changes, as well as references to the software repositories where all those changes were tracked and documented:

“All three were further developed and improved, to match the requirements of Deep3DSIM. Most notably, remote focusing was added to the microscope-aotools package. All changes were tracked and they are described in more detail in the respective branch repositories:

python-microscope: https://github.com/dstoychev/microscope/tree/deepsim

microscope-cockpit: https://github.com/dstoychev/cockpit/tree/deepsim

microscope-aotools: https://github.com/dstoychev/microscope-aotools/tree/deepsim-matthew-danny”

Minor comment #7

Reported average values of the FWHM of imaged beads in 3D (p4) require also to report errors associated with those measurements.

We agree with this comment. To address this point, we have now removed the resolution numbers from the first paragraph of the Results section, but we revised Figure 2A extensively and now it contains comprehensive statistical information for the FWHM estimations.

Minor comment #8

Page 13, second paragraph states that "The results from chapter 3.2…" I believe that was a copy/paste from a thesis but should be corrected for a peer-reviewed publication, as there is no chapter 3.2.

This was left over in error, from an older version of the supplementary material, which used numbered sectioning. To address this comment, we rephrased the sentence, which now correctly refers to Appendix 6:

“The findings from Appendix 6 demonstrated […]”

