Optics Express. 2021 Mar 3;29(6):8417–8429. doi: 10.1364/OE.419311

Shack-Hartmann wavefront sensor optical dynamic range

Vyas Akondi 1, Alfredo Dubra 1,*
PMCID: PMC8237929  PMID: 33820289

Abstract

The widely used lenslet-bound definition of the Shack-Hartmann wavefront sensor (SHWS) dynamic range is based on the permanent association between groups of pixels and individual lenslets. Here, we formalize an alternative definition that we term optical dynamic range, based on avoiding the overlap of lenslet images. The comparison of both definitions for Zernike polynomials up to the third order plus spherical aberration shows that the optical dynamic range is larger by a factor proportional to the number of lenslets across the SHWS pupil. Finally, a pre-centroiding algorithm to facilitate lenslet image location in the presence of defocus and astigmatism is proposed. This approach, based on the SHWS image periodicity, is demonstrated using optometric lenses that translate lenslet images outside the projected lenslet boundaries.

1. Introduction

The Shack-Hartmann wavefront sensor (SHWS) [1,2] is widely used in science [3–10], industry [11–13] and medicine [14–17]. This device samples wavefronts by estimating the centroids of images formed by an array of lenslets onto a pixelated detector. When the pixelated sensor is at the geometrical back focal plane [18] of a uniformly-illuminated lenslet [19], the displacement ρ of a lenslet image centroid from a reference position is proportional to the gradient of the wavefront W(r) averaged over the lenslet [20],

ρ = f_l ∬ ∇W(r) d²r / ∬ d²r. (1)

Here, fl is the lenslet focal length and the double integrals are evaluated over the lenslet area.
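As a numerical illustration of Eq. (1) (not part of the original paper; the 7.6 mm focal length and 300 µm pitch match the sensor described in Section 6, while the wavefront curvature and lenslet center are made-up values), the lenslet-averaged gradient can be approximated by sampling the wavefront gradient over the lenslet aperture:

```python
import numpy as np

def centroid_shift(grad_W, center, pitch, f_l, n=201):
    """Predict the lenslet image centroid displacement from Eq. (1):
    rho = f_l * (average of the wavefront gradient over the lenslet area)."""
    x0, y0 = center
    s = np.linspace(-pitch / 2, pitch / 2, n)
    X, Y = np.meshgrid(x0 + s, y0 + s)           # square lenslet aperture
    gx, gy = grad_W(X, Y)                        # wavefront gradient samples
    return f_l * np.array([gx.mean(), gy.mean()])

# Defocus-like wavefront W = c*(x^2 + y^2): its gradient is (2cx, 2cy), so the
# average over a lenslet centered at (x0, y0) is exactly (2c*x0, 2c*y0).
c, f_l = 1e-3, 7.6e-3                            # curvature (1/m), focal length (m)
grad = lambda x, y: (2 * c * x, 2 * c * y)
rho = centroid_shift(grad, center=(1.5e-3, 0.0), pitch=300e-6, f_l=f_l)
```

For this quadratic wavefront the integrand is linear, so the numerical average reproduces the analytical centroid shift f_l·2c·x0.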

A critical design specification for SHWSs is the dynamic range for aberrations of interest. Despite its importance, there are different, and often misunderstood SHWS dynamic range definitions. Here we first examine the commonly used lenslet-bound definition, which is based on the permanent association between non-overlapping groups of pixels and the corresponding lenslets. The resulting modest dynamic range has motivated the development of numerous SHWS variations [2129], often based on the sequential sampling of pupil regions [24,3037], and with some being substantially different and/or more complex instruments [3845]. By removing the permanent association between pixels and lenslets, which can be thought of as a software [4648] or hardware limitation, we formalize a previously proposed definition based on avoiding the overlap of adjacent lenslet images [4957]. Because this definition does not include the pixelated sensor, we refer to it as the SHWS optical dynamic range. This definition is followed by a comparison of both the lenslet-bound and optical dynamic ranges, first for an arbitrary wavefront, and then, for each Zernike polynomial up to the third order plus spherical aberration. Then, we derive the relation between the lenslet image lattice vectors for periodic SHWS lenslet arrays and the Zernike polynomial coefficients for defocus and astigmatism. Finally, we use this relation to demonstrate a pre-centroiding algorithm that facilitates the assignment of lenslet images to groups of pixels on which to centroid in order to obtain a precise wavefront estimation.

2. SHWS dynamic range based on wavefront first derivative

The pixelated sensors of the first SHWSs consisted of arrays of pixel arrays with their output wired to readout and processing electronics dedicated to each lenslet [58]. This permanent association between pixels and lenslets led to a lenslet-bound definition of dynamic range in terms of the maximum wavefront slope that would shift a lenslet image towards the edge of its pixel group, as depicted in Fig. 1 [13,23,25,38,45,59–63]. In this way, for a group of pixels with diameter D_l, the maximum wavefront slope averaged over a lenslet, |θ_max| = |∬ ∇W(r) d²r / ∬ d²r|, before the lenslet image of width D_i reaches the edge of the pixel group is,

|θ_max| = (D_l − D_i) / (2 f_l), (2)

where it is assumed that the image position for a flat wavefront is at the center of the pixel group. These groups are often chosen as consisting of all pixels within the projection of the lenslet boundary onto the pixelated sensor, in which case D_l is the lenslet pitch, which coincides with its diameter for lenslets with 100% fill factor.
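Equation (2) can be evaluated directly; as a sketch (the focal length and pitch below match the sensor of Section 6, while the 30 µm image width is an assumed value):

```python
def lenslet_bound_max_slope(f_l, D_l, D_i):
    """Eq. (2): largest lenslet-averaged wavefront slope (radians) before the
    lenslet image reaches the edge of its pixel group, for a centered reference."""
    return (D_l - D_i) / (2.0 * f_l)

# Illustrative values: 7.6 mm focal length, 300 um pitch, 30 um image width.
theta = lenslet_bound_max_slope(f_l=7.6e-3, D_l=300e-6, D_i=30e-6)
# about 17.8 mrad of average wavefront slope per lenslet
```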

Fig. 1.
Shack-Hartmann wavefront sensor geometry used to define the lenslet-bound dynamic range, such that the lenslet image (red spot) does not reach the boundary of the lenslet projection onto the sensor, in which the pixels are depicted as gray squares.

The condition in Eq. (2) can be used to calculate the maximum measurable amplitude aj of a wavefront Zj(x,y) over the SHWS pupil Ω, by solving

(D_l − D_i) / (2 f_l) = (2 a_j / D_p) max{ max_Ω |∂Z_j(x,y)/∂x|, max_Ω |∂Z_j(x,y)/∂y| }, (3)

where it is assumed that the lenslets are squares with their sides along the x- and y-axes and D_p is the SHWS pupil diameter. This condition ignores the averaging of the wavefront slope over each lenslet in the interest of simplicity, leading to a slight underestimation of the amplitude of the maximum measurable wavefronts. For reasons that will become apparent later, let us now rewrite this condition using the directional derivative of the wavefront along a line with slope tan α relative to the x-axis,

∂_α W(x,y) = ∇W(x,y) · (cos α, sin α), (4)

where · denotes the dot product between vectors on the xy plane. Therefore, we can use this definition to rewrite Eq. (3) as

(D_l − D_i) / (2 f_l) = (2 a_j / D_p) max_{Ω; α=0, π/2} |∂_α Z_j(x,y)|. (5)

For hexagonal lenslets, this condition must be modified to consider the lenslet image reaching any of the six lenslet sides along the lines defined by α = −π/3, 0 and π/3. Now, considering that the wavefront slope variation can often be considered negligible across individual lenslets, we can generalize this definition by maximizing the directional wavefront derivative over a circle inscribed in the lenslet, that is,

(D_l − D_i) / (2 f_l) = (2 a_j / D_p) max_{Ω, α} |∂_α Z_j(x,y)|, (6)

with α ∈ [0, π]. The analytical calculation of the directional derivative maxima might not be trivial, as this is a maximization problem with constraints, the SHWS pupil being the constraint. Thus, numerical evaluation of this condition might be preferable.
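Such a numerical evaluation can be done by brute force over a pupil grid and a set of directions. The sketch below (not from the paper; grid sizes are arbitrary choices) evaluates the maximum in Eq. (6) for oblique astigmatism, whose analytical maximum slope over the unit pupil is 2√6:

```python
import numpy as np

def max_directional_slope(dZdx, dZdy, n=401, n_alpha=181):
    """Numerically evaluate max over the unit pupil and alpha in [0, pi] of
    |cos(a) dZ/dx + sin(a) dZ/dy|, the maximum in Eq. (6)."""
    x = np.linspace(-1, 1, n)
    X, Y = np.meshgrid(x, x)
    inside = X**2 + Y**2 <= 1.0                  # circular pupil mask
    gx, gy = dZdx(X, Y)[inside], dZdy(X, Y)[inside]
    alpha = np.linspace(0.0, np.pi, n_alpha)
    # directional derivative for every pupil sample and every angle
    d = np.abs(np.outer(np.cos(alpha), gx) + np.outer(np.sin(alpha), gy))
    return d.max()

# Oblique astigmatism Z3 = 2*sqrt(6)*x*y: dZ/dx = 2*sqrt(6)*y, dZ/dy = 2*sqrt(6)*x,
# whose maximum directional slope over the unit disk is 2*sqrt(6).
m = max_directional_slope(lambda x, y: 2*np.sqrt(6)*y,
                          lambda x, y: 2*np.sqrt(6)*x)
```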

3. SHWS dynamic range based on wavefront second derivative

Most SHWSs capture all the lenslet images with a single two-dimensional array of pixels, and thus, are not limited by the permanent association between pixels and lenslets. This allows for a dynamic range definition based on avoiding the partial overlap of lenslet images, irrespective of their position on the pixelated sensor. Here, we formalize this idea, independently suggested by multiple authors [4957], to which we refer as the optical dynamic range.

Let us start by assuming two identical adjacent lenslets, indexed k and k+1, which without loss of generality have their centers (x_k, y_k) and (x_{k+1}, y_{k+1}) along a line parallel to the x-axis (i.e., y_{k+1} = y_k) and with x_{k+1} > x_k. As depicted in Fig. 2, for their images not to overlap, the following condition between the angular displacements θ_{x,k} and θ_{x,k+1} must be met

d_l − D_i + f_l (θ_{x,k+1} − θ_{x,k}) ≥ 0, (7)

with dl being the distance between the lenslets, which coincides with the lenslet diameter Dl if the lenslet fill factor is 100%. If the lenslets are small relative to the period of the maximum spatial frequency of a wavefront W(x,y) with x and y normalized by the SHWS pupil radius (Dp/2), we can approximate the angular difference in terms of the wavefront second derivative as follows (see Appendix A)

θ_{x,k+1} − θ_{x,k} ≈ (4 d_l / D_p²) ∂²W(x,y)/∂x². (8)

Fig. 2.
Shack-Hartmann wavefront sensor condition used to define the optical dynamic range so as to avoid overlap between images of adjacent lenslets (red spots on the top view), with the pixels depicted as gray squares (compare with Fig. 1).

Therefore, the maximum positive measurable amplitude bj+ of a wavefront described by function Zj(x,y) that avoids lenslet image overlap along the x-direction can be calculated by solving

d_l − D_i + f_l (4 d_l / D_p²) b_j⁺ min_Ω[∂²Z_j(x,y)/∂x²] = 0. (9)

When the lenslet image is smaller than the lenslet separation (i.e., d_l > D_i) and because b_j⁺ > 0, this condition can only be met if the second derivative of the function is negative somewhere within the pupil. Otherwise, the positive limit of the optical dynamic range b_j⁺ is +∞, as is the case for defocus. In practice, of course, no SHWS can measure an infinitely large amount of positive defocus because the lenslet images will eventually reach the edge of the pixelated sensor. The same reasoning that led to Eq. (9), applied to a negative amplitude b_j⁻, yields

d_l − D_i + f_l (4 d_l / D_p²) b_j⁻ max_Ω[∂²Z_j(x,y)/∂x²] = 0. (10)

Again, when d_l > D_i, and because b_j⁻ < 0, this condition has a solution only if the wavefront second derivative is positive somewhere within the pupil. From these two conditions, it is important to note that it is possible to have different minimum and maximum measurable amplitudes for the same wavefront aberration, that is, the optical dynamic range can be asymmetric.

The conditions in Eqs. (9) and (10), however, only consider the x-axis direction. As Fig. 3 depicts for a square lenslet array, a lenslet image (green circle) could be shifted by an aberrated wavefront towards the images of adjacent or even distant lenslets (red circles) along the directions shown by the (orange) line segments. Using the second directional derivative ∂_α²W, defined as ∂_α(∂_α W), we can generalize the condition for avoiding overlap with the lenslet images along the direction defined by the angle α as

d_l(α) − D_i + f_l (4 d_l(α) / D_p²) b_j⁺ min_Ω[∂_α²Z_j(x,y)] = 0, (11)

for positive amplitudes and

d_l(α) − D_i + f_l (4 d_l(α) / D_p²) b_j⁻ max_Ω[∂_α²Z_j(x,y)] = 0, (12)

for negative amplitudes, where the dependence of d_l on α captures the fact that lenslets along different directions can be separated by different distances. This means that these equations have to be solved for each angle separately. In the interest of simplicity, we propose to use D_l, the minimum value of d_l(α), instead of d_l(α), being aware that this will lead to a slight underestimation of the optical dynamic range. Therefore, our proposed formulae for determining the optical dynamic range of a SHWS, irrespective of the lenslet geometry, are

b_j⁺ = −(D_l − D_i) D_p² / (4 f_l D_l min_{Ω,α}[∂_α²Z_j(x,y)]), (13)

for positive amplitudes and

b_j⁻ = −(D_l − D_i) D_p² / (4 f_l D_l max_{Ω,α}[∂_α²Z_j(x,y)]) (14)

for negative amplitudes, with the minimization and maximization performed over the entire pupil and for α ∈ [0, π].
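Equations (13) and (14) can be evaluated numerically in the same brute-force fashion as Eq. (6). The sketch below (illustrative only; the Hessian entries are those of vertical trefoil, and the SHWS parameters are assumed values, with the focal length and pitch matching Section 6) returns an infinite limit when the relevant extremum has the wrong sign:

```python
import numpy as np

def optical_dr_extrema(hxx, hxy, hyy, n=401, n_alpha=181):
    """Extrema over the unit pupil and direction alpha of the second
    directional derivative of Eq. (15), as needed by Eqs. (13)-(14)."""
    x = np.linspace(-1, 1, n)
    X, Y = np.meshgrid(x, x)
    m = X**2 + Y**2 <= 1.0
    wxx, wxy, wyy = hxx(X, Y)[m], hxy(X, Y)[m], hyy(X, Y)[m]
    a = np.linspace(0, np.pi, n_alpha)[:, None]
    d2 = wxx*np.cos(a)**2 + wxy*np.sin(2*a) + wyy*np.sin(a)**2
    return d2.min(), d2.max()

def optical_dynamic_range(extrema, D_l, D_i, D_p, f_l):
    """Eqs. (13)-(14): b+ from the minimum, b- from the maximum; a
    non-negative minimum (or non-positive maximum) gives an infinite limit."""
    lo, hi = extrema
    scale = (D_l - D_i) * D_p**2 / (4 * f_l * D_l)
    b_pos = -scale / lo if lo < 0 else np.inf
    b_neg = -scale / hi if hi > 0 else -np.inf
    return b_neg, b_pos

# Vertical trefoil Z6 = sqrt(8)*(3x^2 y - y^3): Hessian entries below; its
# second directional derivative spans [-6*sqrt(8), 6*sqrt(8)] on the pupil.
r8 = np.sqrt(8)
ext = optical_dr_extrema(lambda x, y: 6*r8*y, lambda x, y: 6*r8*x,
                         lambda x, y: -6*r8*y)
b_neg, b_pos = optical_dynamic_range(ext, D_l=300e-6, D_i=100e-6,
                                     D_p=6e-3, f_l=7.6e-3)
```

Because the trefoil extrema are symmetric, the resulting optical dynamic range is symmetric as well.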

Fig. 3.
Depiction of lenslet image locations in a Shack-Hartmann wavefront sensor with square lenslets (black outlines) showing “adjacent” (red) images to the green image along various directions indicated by the orange line segments.

4. Dynamic range definition comparison for low order Zernike polynomials

Let us now compare the two SHWS dynamic range definitions discussed above for the Zernike polynomials up to the third order and spherical aberration [64] through their ratios, noting that

∂_α²W(x,y) = (∂²W(x,y)/∂x²) cos²α + (∂²W(x,y)/∂x∂y) sin 2α + (∂²W(x,y)/∂y²) sin²α. (15)

For positive wavefront amplitudes, we have

b_j⁺/a_j = −(D_p/D_l) max{ max_Ω |∂Z_j/∂x|, max_Ω |∂Z_j/∂y| } / min_{Ω,α}[ (∂²Z_j/∂x²) cos²α + (∂²Z_j/∂x∂y) sin 2α + (∂²Z_j/∂y²) sin²α ], (16)

where, if the denominator is positive, this ratio should be replaced with +∞. Similarly, for negative amplitudes we have

b_j⁻/a_j = −(D_p/D_l) max{ max_Ω |∂Z_j/∂x|, max_Ω |∂Z_j/∂y| } / max_{Ω,α}[ (∂²Z_j/∂x²) cos²α + (∂²Z_j/∂x∂y) sin 2α + (∂²Z_j/∂y²) sin²α ], (17)

where, if the denominator is negative, the ratio should be replaced with −∞. Interestingly, these ratios do not depend on the image size D_i. More importantly, the ratios are proportional to the number of lenslets across the pupil, D_p/D_l, which for most SHWSs is greater than 10.

Analytical evaluations of these ratios over a circular pupil of radius D_p/2 are shown in Table 1 below, with the fourth column values calculated using Eq. (5), the fifth column using Eqs. (13) and (14), and the sixth column using Eqs. (16) and (17). As expected for tip and tilt, the second definition yields an infinite optical dynamic range because these aberrations shift all the lenslet images equally without changing their separation. Also, as mentioned earlier, defocus has an asymmetric optical dynamic range, infinite towards the positive amplitudes because these wavefronts separate the lenslet images, and finite towards negative amplitudes because the lenslet images are brought closer together. In practice, the infinite ends of the SHWS optical dynamic range are truncated by either the pixelated sensor finite size or the increased size of the lenslet images. Third order aberrations have symmetric finite optical dynamic ranges, while spherical aberration has an asymmetric finite optical dynamic range.

Table 1. SHWS dynamic range for Zernike polynomials based on preventing lenslet images from leaving the corresponding lenslet outline (lenslet-bound dynamic range, 4th column) and on avoiding lenslet image overlap (optical dynamic range, 5th column).

| Name | Index | Polynomial | 4f_l a_j / [D_p(D_l − D_i)] | 4f_l D_l b_j / [D_p²(D_l − D_i)] | (D_l/D_p) b_j/a_j |
|---|---|---|---|---|---|
| Tip | 1 | 2y | 1/2 | −∞, +∞ | −∞, +∞ |
| Tilt | 2 | 2x | 1/2 | −∞, +∞ | −∞, +∞ |
| Oblique astigmatism | 3 | 2√6 xy | 1/(2√6) | −1/(2√6), 1/(2√6) | −1, 1 |
| Defocus | 4 | √3(2x² + 2y² − 1) | 1/(4√3) | −1/(4√3), +∞ | −1, +∞ |
| Vertical astigmatism | 5 | √6(x² − y²) | 1/(2√6) | −1/(2√6), 1/(2√6) | −1, 1 |
| Vertical trefoil | 6 | √8(3x²y − y³) | 1/(3√8) | −1/(6√8), 1/(6√8) | −1/2, 1/2 |
| Vertical coma | 7 | √8(3x²y + 3y³ − 2y) | 1/(7√8) | −1/(18√8), 1/(18√8) | −7/18, 7/18 |
| Horizontal coma | 8 | √8(3x³ + 3xy² − 2x) | 1/(7√8) | −1/(18√8), 1/(18√8) | −7/18, 7/18 |
| Oblique trefoil | 9 | √8(x³ − 3xy²) | 1/(3√8) | −1/(6√8), 1/(6√8) | −1/2, 1/2 |
| Spherical aberration | 12 | √5(6x⁴ + 12x²y² + 6y⁴ − 6x² − 6y² + 1) | 1/(12√5) | −1/(60√5), 1/(12√5) | −1/5, 1 |
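The table entries can be spot-checked numerically. The sketch below (an illustration, not from the paper) verifies the vertical coma row: the maximum slope 7√8 (4th column denominator), the maximum directional curvature 18√8 (5th column denominator), and their ratio 7/18 (6th column):

```python
import numpy as np

# Numerical check of one Table 1 row (vertical coma, Z7, OSA normalization):
# Z7 = sqrt(8)*(3x^2 y + 3y^3 - 2y), sampled over the unit pupil.
n = 601
x = np.linspace(-1, 1, n)
X, Y = np.meshgrid(x, x)
m = X**2 + Y**2 <= 1.0
r8 = np.sqrt(8)

# First derivatives (4th column, via Eq. (5)).
Zx = 6 * r8 * X * Y
Zy = r8 * (3 * X**2 + 9 * Y**2 - 2)
slope_max = max(np.abs(Zx[m]).max(), np.abs(Zy[m]).max())   # expect 7*sqrt(8)

# Second directional derivative extrema (5th column, via Eq. (15)).
a = np.linspace(0, np.pi, 361)[:, None]
Wxx, Wxy, Wyy = 6 * r8 * Y[m], 6 * r8 * X[m], 18 * r8 * Y[m]
d2 = Wxx * np.cos(a)**2 + Wxy * np.sin(2 * a) + Wyy * np.sin(a)**2
curv_max = d2.max()                                          # expect 18*sqrt(8)

ratio = slope_max / curv_max                                 # expect 7/18
```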

5. SHWS lattice vectors in the presence of tip, tilt, defocus, and astigmatism

In order to take full advantage of the optical dynamic range, the SHWS image processing should include a pre-centroiding step, in which the coarse location of individual lenslet images is determined and assigned to the corresponding lenslets. This can be achieved by exploiting the fact that when the wavefronts are within the SHWS optical dynamic range, the lenslet images are monotonically sorted along the x- and y-axes and, in most cases, form a 2-dimensional Bravais square or hexagonal lattice. When this is the case, if a single lenslet image is found, then the other lenslet images can be coarsely located by moving across the pixelated sensor in integer combinations of the Bravais lattice vectors. Tip and tilt will shift all lenslet images equally, thus preserving the lattice vectors. Defocus and vertical and oblique astigmatism, on the other hand, will change the lattice vectors, as is calculated next.

Let us start by defining v₁ = (v_{1,x}, v_{1,y}) and v₂ = (v_{2,x}, v_{2,y}) as the lattice vectors describing the SHWS lenslet image lattice resulting from a flat wavefront. The lattice vectors v₁′ and v₂′ for an aberrated wavefront in normalized pupil coordinates, W(2x/D_p, 2y/D_p), can be approximated using the wavefront derivatives evaluated at the center of the ith lenslet (x_i, y_i), instead of averaged over each lenslet. Denoting by W_x̄ and W_ȳ the partial derivatives of W with respect to its first and second (normalized) arguments,

v₁′ ≈ v₁ + (2 f_l/D_p) [ W_x̄((x_i + v_{1,x})/(D_p/2), (y_i + v_{1,y})/(D_p/2)) − W_x̄(x_i/(D_p/2), y_i/(D_p/2)), W_ȳ((x_i + v_{1,x})/(D_p/2), (y_i + v_{1,y})/(D_p/2)) − W_ȳ(x_i/(D_p/2), y_i/(D_p/2)) ], (18)
v₂′ ≈ v₂ + (2 f_l/D_p) [ W_x̄((x_i + v_{2,x})/(D_p/2), (y_i + v_{2,y})/(D_p/2)) − W_x̄(x_i/(D_p/2), y_i/(D_p/2)), W_ȳ((x_i + v_{2,x})/(D_p/2), (y_i + v_{2,y})/(D_p/2)) − W_ȳ(x_i/(D_p/2), y_i/(D_p/2)) ]. (19)

Let us now consider a wavefront described by a polynomial of the form Ax² + Bxy + Cy² + Dx + Ey + F, that is, a linear combination of defocus, astigmatism, tip, tilt, and piston. The SHWS image lattice vectors that would result from such a wavefront can be calculated using Eqs. (18) and (19) as

v₁′ ≈ v₁ + (4 f_l/D_p²) (2A v_{1,x} + B v_{1,y}, B v_{1,x} + 2C v_{1,y}), (20)
v₂′ ≈ v₂ + (4 f_l/D_p²) (2A v_{2,x} + B v_{2,y}, B v_{2,x} + 2C v_{2,y}). (21)

These new vectors do not depend on the lenslet coordinates (x_i, y_i), and thus, the 2D Bravais lenslet image lattice is preserved. Now, substituting the OSA definition of Zernike polynomials [64] into Eqs. (20) and (21) for oblique astigmatism with amplitude b₃, defocus with amplitude b₄, and vertical astigmatism with amplitude b₅, for an initial square lattice aligned with the x- and y-axes, the lattice vectors of the SHWS image lattice are

v₁′ ≈ D_l (1, 0) + D_l (4 f_l/D_p²) (4√3 b₄ + 2√6 b₅, 2√6 b₃), (22)
v₂′ ≈ D_l (0, 1) + D_l (4 f_l/D_p²) (2√6 b₃, 4√3 b₄ − 2√6 b₅). (23)

If the lattice vectors are experimentally estimated (see Fig. 4), then the x- and y-components of these vector equations form a system of four linear equations with three unknown aberration amplitudes. These amplitudes can be calculated either analytically or numerically using linear algebra.
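Such a numerical solution can be set up as an ordinary least-squares problem. The sketch below (illustrative only; the SHWS parameters and aberration amplitudes are made-up values, with focal length and pitch matching Section 6) stacks the four scalar equations of Eqs. (22)-(23) and recovers the three amplitudes:

```python
import numpy as np

def fit_defocus_astigmatism(v1, v2, D_l, D_p, f_l):
    """Least-squares solve of the four scalar equations in Eqs. (22)-(23)
    for the amplitudes (b3, b4, b5) given measured lattice vectors v1, v2."""
    s = D_l * 4 * f_l / D_p**2
    r3, r6 = np.sqrt(3), np.sqrt(6)
    M = np.array([[0.0,      4*r3*s,  2*r6*s],
                  [2*r6*s,   0.0,     0.0   ],
                  [2*r6*s,   0.0,     0.0   ],
                  [0.0,      4*r3*s, -2*r6*s]])
    rhs = np.array([v1[0] - D_l, v1[1], v2[0], v2[1] - D_l])
    b, *_ = np.linalg.lstsq(M, rhs, rcond=None)
    return b                                     # (b3, b4, b5)

# Round trip with made-up amplitudes (meters of wavefront amplitude).
D_l, D_p, f_l = 300e-6, 6e-3, 7.6e-3
b3, b4, b5 = 0.2e-6, 1.0e-6, -0.3e-6
s = D_l * 4 * f_l / D_p**2
v1 = (D_l + s*(4*np.sqrt(3)*b4 + 2*np.sqrt(6)*b5), s*2*np.sqrt(6)*b3)
v2 = (s*2*np.sqrt(6)*b3, D_l + s*(4*np.sqrt(3)*b4 - 2*np.sqrt(6)*b5))
est = fit_defocus_astigmatism(v1, v2, D_l, D_p, f_l)
```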

Fig. 4.
Images from a SHWS with a square lenslet array illuminated by an aberration-free wavefront (a), and a wavefront generated by a 32.7 D convex cylinder oriented at 45° (c), with lattice vectors v1 and v2 shown in green. Panels (b) and (d) show the corresponding spectra with the reciprocal lattice vectors u1 and u2, in dark blue. The axes of the SHWS images are in units of lenslet pitch Dl, and for the spectra in units of 1/Dl.

The linearity of the lattice vector change with defocus and astigmatism, together with the array theorem and the discrete Fourier transform (DFT) [65], can be used to estimate the amplitudes of defocus and astigmatism from a raw SHWS image as follows. First, we calculate the absolute value of the DFT of the SHWS image, and then we locate the two maxima nearest to the zero-frequency (DC) term that are not point-symmetric images of each other. The vector positions u₁ and u₂ of these maxima relative to the zero spatial frequency are the reciprocal lattice vectors, which can be used to calculate the actual lattice vectors [66], as

v₁ ≈ R₉₀u₂ / (u₁ · R₉₀u₂), (24)
v₂ ≈ R₉₀u₁ / (u₂ · R₉₀u₁), (25)

where · denotes the inner product and R₉₀ is the 90° rotation matrix

R₉₀ = [0, −1; 1, 0]. (26)

The direct and reciprocal lattice vectors of SHWS images captured with square lenslet arrays with and without a convex cylinder optometric lens are shown in Fig. 4.
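The duality in Eqs. (24)-(25) holds because each direct vector is, by construction, orthogonal to the other reciprocal vector and normalized against its own. A minimal sketch (not from the paper; the 300 µm pitch matches the lenslet array of Section 6):

```python
import numpy as np

R90 = np.array([[0.0, -1.0], [1.0, 0.0]])        # 90-degree rotation, Eq. (26)

def direct_from_reciprocal(u1, u2):
    """Eqs. (24)-(25): recover the direct lattice vectors from the reciprocal
    ones measured in the spectrum of the SHWS image."""
    u1, u2 = np.asarray(u1, float), np.asarray(u2, float)
    v1 = (R90 @ u2) / (u1 @ (R90 @ u2))
    v2 = (R90 @ u1) / (u2 @ (R90 @ u1))
    return v1, v2

# For a square lattice of pitch D_l the reciprocal vectors are (1/D_l, 0) and
# (0, 1/D_l), so the recovered direct vectors are (D_l, 0) and (0, D_l).
v1, v2 = direct_from_reciprocal((1 / 300e-6, 0.0), (0.0, 1 / 300e-6))
```

Note that the overall sign convention of R₉₀ cancels between numerator and denominator, so the recovered vectors satisfy v_i · u_j = δ_ij either way.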

The value of this algorithm is not in the estimation of the defocus and astigmatism coefficients, which is coarse due to the finite SHWS image sampling, but rather in facilitating the location of the SHWS lenslet images, so that a group of pixels can be assigned to each lenslet image. This can be achieved simply by finding a single lenslet image (e.g., the brightest), and then moving across the image by integer combinations of the lattice vectors. The resulting locations are then used as the centers of the groups of pixels over which each lenslet image centroid is estimated, allowing precise estimation of the wavefront.
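The tiling step just described can be sketched as follows (an illustration, not the authors' implementation; the sensor size, pitch and seed position are made-up values):

```python
import numpy as np

def roi_centers(seed, v1, v2, image_shape):
    """Starting from one located lenslet image (e.g. the brightest), tile the
    sensor with integer combinations of the lattice vectors to get a coarse
    center for each lenslet image, keeping only those inside the image."""
    h, w = image_shape
    seed, v1, v2 = (np.asarray(a, float) for a in (seed, v1, v2))
    nmax = int(max(h, w) / min(np.linalg.norm(v1), np.linalg.norm(v2))) + 1
    centers = []
    for i in range(-nmax, nmax + 1):
        for j in range(-nmax, nmax + 1):
            c = seed + i * v1 + j * v2          # integer lattice combination
            if 0 <= c[0] < w and 0 <= c[1] < h:
                centers.append(c)
    return np.array(centers)

# 100x100-pixel sensor, 20-pixel square lattice, seed at the middle: the
# tiling yields a 5x5 grid of candidate centroiding regions.
pts = roi_centers(seed=(50.0, 50.0), v1=(20.0, 0.0), v2=(0.0, 20.0),
                  image_shape=(100, 100))
```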

Here it is important to note that if the wavefront has third or higher order polynomial components, the periodicity of the lenslet image lattice will degrade. When such distortion is small, that is, if the lenslet images remain within the cells associated with the previously estimated lattice vectors, then regions of interest on the pixelated sensor centered on the lattice cell centers will suffice to initiate the lenslet image centroid calculations.

6. Experiments

Wavefronts with defocus and astigmatism were measured with the custom SHWS depicted at the top of Fig. 5, consisting of an EXi Aqua camera (Teledyne Qimaging, Surrey, BC, Canada) and a lenslet array with 7.6 mm geometrical focal length and 300 µm pitch (Adaptive Optics Associates, now part of AOA Xinetics, Devens, MA, USA), focused to account for the lenslets' low Fresnel number [18]. Light from a 637 nm S1FC637 laser diode (Thorlabs, Newton, NJ, USA) delivered by a single-mode optical fiber was collimated with an achromatic doublet to illuminate the first of three pupil planes with an approximately plane wavefront. An iris diaphragm, optometric lenses, and the SHWS lenslet array were placed in the three pupil planes relayed by afocal telescopes formed by achromatic doublets, as shown in Fig. 5.

Fig. 5.
Schematic diagram of the optical setup used to capture Shack-Hartmann wavefront sensor (SHWS) images (top) while placing convex and concave sphere lenses in the pupil plane P2, such as the examples shown above and below the plot. In these images, the red boxes show the boundaries of the lenslets with peak intensity of at least 40% of the maximum pixel value in the 0 D image. The plot shows defocus and astigmatism estimated using the proposed pre-centroiding algorithm based on the estimation of the SHWS image lattice vectors. The separation between adjacent gray vertical lines corresponds to the lenslet-bound SHWS dynamic range definition, while the red and cyan vertical lines indicate the optical dynamic range and its limit when considering the camera's region of interest, respectively.

SHWS images captured the wavefronts generated through sphere lenses with optical power P_D in the ±12 D range in 1 D steps, which, due to the 0.43 magnification M between the lens plane and the lenslet array plane, scaled to ±65.3 D (P_D′ = P_D/M²). The lattice vectors were calculated for each SHWS image and their components were compared with the predictions of Eqs. (22) and (23), with the results plotted in Fig. 5. The spacing between vertical light gray lines in this plot corresponds to the lenslet-bound dynamic range, calculated using Eq. (5), modeling the lenslet image width as 2× the full-width at half-maximum of a Gaussian beam (see Eq. (19) in Ref. [67]). The positive end of the optical dynamic range, denoted by the vertical orange line on the right, is approximately 13 times larger than the separation of the vertical gray lines (lenslet-bound dynamic range). The data itself shows that the dominant aberration, defocus (b₄), is estimated with a maximum error of 11% over a range 8 times larger than the lenslet-bound definition.

We did not test larger positive defocus amplitudes because the metal housing in which the lenslet array was mounted vignetted the beam (outer lenslets). A small artifactual astigmatism (mean 2.3%), likely due to trial lens centration inaccuracies and coarse DFT interpolation, was estimated. Although the lowest end of the SHWS optical defocus dynamic range is −∞, in practice this lower end is determined by when a lenslet image reaches the edge of the pixelated sensor's region of interest D_ROI, that is, when the spacing s_l between adjacent lenslet images in the presence of defocus meets the condition,

s_l D_p/D_l − D_i = D_ROI. (27)

This limit is shown in the plot in Fig. 5 as a cyan vertical line.

The results of a similar experiment using convex cylinder optometric lenses oriented at 45° to generate oblique astigmatism and defocus are shown in Fig. 6. Accounting for pupil magnification, the optical powers at the SHWS lenslet array plane were between 5.4 and 32.7 D, spanning almost 6 times the lenslet-bound dynamic range.

Fig. 6.
SHWS images captured with the optical setup depicted at the top of Fig. 5 using various optometric cylinder lenses. In these images, the red boxes show the boundaries of the lenslets with peak intensity of at least 40% of the maximum pixel value in the 0 D image. The plot shows the defocus estimated using the proposed pre-centroiding algorithm based on the estimation of the SHWS image lattice vectors for defocus and astigmatism.

7. Summary

A definition of the SHWS optical dynamic range in terms of avoiding lenslet image overlap was formalized and compared with the widely used definition based on restricting each lenslet image to a fixed group of pixels. The optical dynamic range is larger by a factor proportional to the number of lenslets across the pupil, and it provides a method for calculating the minimum number of lenslets across the pupil required to measure a desired wavefront aberration amplitude. The proposed formulation of the optical dynamic range in terms of the extreme values of the directional wavefront curvature within circles inscribed in the lenslets is applicable to lenslets of any shape and array geometry, even if non-periodic.

A pre-centroiding algorithm based on the estimation of the SHWS image lattice vectors was proposed to facilitate lenslet image location in the presence of large defocus and astigmatism amplitudes. This algorithm was demonstrated using optometric trial lenses that displaced the SHWS lenslet images well beyond the projection of the lenslet boundary onto the SHWS pixelated sensor.

Acknowledgments

The content is solely the responsibility of the authors and does not necessarily represent the official views of the National Institutes of Health.

Appendix A

Let W(x̄, ȳ) be a wavefront, with x and y the unnormalized SHWS pupil coordinates and x̄ and ȳ the coordinates normalized by the SHWS pupil radius, that is, x̄ = 2x/D_p and ȳ = 2y/D_p. Then, the wavefront slope along the x-axis is given by,

θ_x = ∂W(x̄,ȳ)/∂x = (∂W(x̄,ȳ)/∂x̄)(∂x̄/∂x) = (2/D_p) ∂W(x̄,ȳ)/∂x̄. (28)

Let us assume two identical adjacent lenslets with their centers along the x-axis and the second lenslet having the larger abscissa. Then, the difference between the lenslet image angular coordinate can be written as follows,

θ_{x,k+1} − θ_{x,k} = (2/D_p) [ ∂W(x̄ + 2d_l/D_p, ȳ)/∂x̄ − ∂W(x̄, ȳ)/∂x̄ ], (29)

d_l being the distance between lenslets. Multiplying and dividing the right-hand side of the above equation by 2d_l/D_p, and recognizing the resulting finite difference quotient as an approximation of the second derivative, we get,

θ_{x,k+1} − θ_{x,k} ≈ (4 d_l/D_p²) ∂²W(x̄, ȳ)/∂x̄². (30)
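The steps above can be checked numerically; for a quadratic wavefront such as defocus, the finite difference in Eq. (29) matches Eq. (30) up to rounding, since the second derivative is constant. A sketch (illustrative parameter values, with pitch matching Section 6 and an assumed 6 mm pupil):

```python
import numpy as np

# Numerical sketch of Appendix A for OSA defocus W = sqrt(3)*(2x^2 + 2y^2 - 1)
# in normalized coordinates, whose second derivative along x is 4*sqrt(3).
D_p, d_l = 6e-3, 300e-6
dWdxn = lambda xn, yn: 4 * np.sqrt(3) * xn      # dW/dxn, xn the normalized x

xn = 0.1                                        # normalized lenslet center
# Slopes theta_x = (2/D_p) dW/dxn at two adjacent lenslets, per Eq. (28)
t_k = (2 / D_p) * dWdxn(xn, 0.0)
t_k1 = (2 / D_p) * dWdxn(xn + 2 * d_l / D_p, 0.0)
lhs = t_k1 - t_k                                # Eq. (29)
rhs = (4 * d_l / D_p**2) * 4 * np.sqrt(3)       # Eq. (30)
```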

Funding

National Eye Institute (P30EY026877, R01EY025231, R01EY031360, R01EY027301); Research to Prevent Blindness (Challenge Grant).

Disclosures

The authors declare no conflicts of interest.

References

  • 1. Hartmann J., “Bemerkungen über den Bau und die Justierung von Spektrographen,” Z. Instrumentenkd. 20, 47–58 (1900).
  • 2. Shack R. V., Platt B. C., “Production and use of a lenticular Hartmann screen,” J. Opt. Soc. Am. 61(5), 656 (1971).
  • 3. Liang J., Williams D. R., Miller D., “Supernormal vision and high-resolution retinal imaging through adaptive optics,” J. Opt. Soc. Am. A 14(11), 2884–2892 (1997). doi: 10.1364/JOSAA.14.002884.
  • 4. Yoon G. Y., Williams D. R., “Visual performance after correcting the monochromatic and chromatic aberrations of the eye,” J. Opt. Soc. Am. A 19(2), 266–275 (2002). doi: 10.1364/JOSAA.19.000266.
  • 5. Roorda A., Romero-Borja F., Donnelly W., III, Queener H., Hebert T., Campbell M., “Adaptive optics scanning laser ophthalmoscopy,” Opt. Express 10(9), 405–412 (2002). doi: 10.1364/OE.10.000405.
  • 6. Artal P., Chen L., Fernández E. J., Singer B., Manzanera S., Williams D. R., “Neural compensation for the eye's optical aberrations,” J. Vis. 4(4), 4–287 (2004). doi: 10.1167/4.4.4.
  • 7. Wizinowich P. L., Le Mignant D., Bouchez A. H., Campbell R. D., Chin J. C. Y., Contos A. R., van Dam M. A., Hartman S. K., Johansson E. M., Lafon R. E., Lewis H., Stomski P. J., Summers D. M., “The W. M. Keck Observatory laser guide star adaptive optics system: Overview,” Publ. Astron. Soc. Pac. 118(840), 297–309 (2006). doi: 10.1086/499290.
  • 8. Azucena O., Crest J., Kotadia S., Sullivan W., Tao X., Reinig M., Gavel D., Olivier S., Kubby J., “Adaptive optics wide-field microscopy using direct wavefront sensing,” Opt. Lett. 36(6), 825–827 (2011). doi: 10.1364/OL.36.000825.
  • 9. Dubra A., Sulai Y., Norris J. L., Cooper R. F., Dubis A. M., Williams D. R., Carroll J., “Noninvasive imaging of the human rod photoreceptor mosaic using a confocal adaptive optics scanning ophthalmoscope,” Biomed. Opt. Express 2(7), 1864–1876 (2011). doi: 10.1364/BOE.2.001864.
  • 10. Booth M. J., “Adaptive optical microscopy: the ongoing quest for a perfect image,” Light: Sci. Appl. 3(4), e165 (2014). doi: 10.1038/lsa.2014.46.
  • 11. Levine B. M., Martinsen E. A., Wirth A., Jankevics A., Toledo-Quinones M., Landers F., Bruno T. L., “Horizontal line-of-sight turbulence over near-ground paths and implications for adaptive optics corrections in laser communications,” Appl. Opt. 37(21), 4553–4560 (1998). doi: 10.1364/AO.37.004553.
  • 12. Forest C., Canizares C., Neal D., McGuirk M., Schattenburg M., “Metrology of thin transparent optics using Shack-Hartmann wavefront sensing,” Opt. Eng. 43(3), 742–753 (2004). doi: 10.1117/1.1645256.
  • 13. Dörband B., Müller H., Gross H., Handbook of Optical Systems, Volume 5: Metrology of Optical Components and Systems, 1st ed. (John Wiley & Sons, 2012).
  • 14. Mrochen M., Kaemmerer M., Seiler T., “Wavefront-guided laser in situ keratomileusis: early results in three eyes,” J. Refract. Surg. 16(2), 116–121 (2000).
  • 15. Mrochen M., Kaemmerer M., Seiler T., “Clinical results of wavefront-guided laser in situ keratomileusis 3 months after surgery,” J. Cataract Refract. Surg. 27(2), 201–207 (2001). doi: 10.1016/S0886-3350(00)00827-0.
  • 16. Schallhorn S., Brown M., Venter J., Teenan D., Hettinger K., Yamamoto H., “Early clinical outcomes of wavefront-guided myopic LASIK treatments using a new-generation Hartmann-Shack aberrometer,” J. Refract. Surg. 30(1), 14–21 (2014). doi: 10.3928/1081597X-20131029-02.
  • 17. Vinas M., Benedi-Garcia C., Aissati S., Pascual D., Akondi V., Dorronsoro C., Marcos S., “Visual simulators replicate vision with multifocal lenses,” Sci. Rep. 9(1), 1539 (2019). doi: 10.1038/s41598-019-38673-w.
  • 18. Akondi V., Dubra A., “Accounting for focal shift in the Shack–Hartmann wavefront sensor,” Opt. Lett. 44(17), 4151–4154 (2019). doi: 10.1364/OL.44.004151.
  • 19. Akondi V., Steven S., Dubra A., “Centroid error due to non-uniform lenslet illumination in the Shack–Hartmann wavefront sensor,” Opt. Lett. 44(17), 4167–4170 (2019). doi: 10.1364/OL.44.004167.
  • 20. Akondi V., Dubra A., “Average gradient of Zernike polynomials over polygons,” Opt. Express 28(13), 18876–18886 (2020). doi: 10.1364/OE.393223.
  • 21. Roggemann M. C., Schulz T. J., “Algorithm to increase the largest aberration that can be reconstructed from Hartmann sensor measurements,” Appl. Opt. 37(20), 4321–4329 (1998). doi: 10.1364/AO.37.004321.
  • 22. Groening S., Sick B., Donner K., Pfund J., Lindlein N., Schwider J., “Wave-front reconstruction with a Shack–Hartmann sensor with an iterative spline fitting method,” Appl. Opt. 39(4), 561–567 (2000). doi: 10.1364/AO.39.000561.
  • 23. Molebny V., “Scanning Shack-Hartmann wavefront sensor,” Proc. SPIE 5412, 66–71 (2004). doi: 10.1117/12.541755.
  • 24. Seifert L., Tiziani H. J., Osten W., “Wavefront reconstruction with the adaptive Shack–Hartmann sensor,” Opt. Commun. 245(1-6), 255–269 (2005). doi: 10.1016/j.optcom.2004.09.074.
  • 25. Choo H., Muller R. S., “Addressable microlens array to improve dynamic range of Shack–Hartmann sensors,” J. Microelectromech. Syst. 15(6), 1555–1567 (2006). doi: 10.1109/JMEMS.2006.886011.
  • 26. Hongbin Y., Guangya Z., Siong C. F., Feiwen L., Shouhua W., “A tunable Shack–Hartmann wavefront sensor based on a liquid-filled microlens array,” J. Micromech. Microeng. 18(10), 105017 (2008). doi: 10.1088/0960-1317/18/10/105017.
  • 27. Xia M., Li C., Hu L., Cao Z., Mu Q., Li X., “Shack-Hartmann wavefront sensor with large dynamic range,” J. Biomed. Opt. 15(2), 1–10 (2010). doi: 10.1117/1.3369810.
  • 28. Martínez-Cuenca R., Durán V., Climent V., Tajahuerce E., Bará S., Ares J., Arines J., Martínez-Corral M., Lancis J., “Reconfigurable Shack–Hartmann sensor without moving elements,” Opt. Lett. 35(9), 1338–1340 (2010). doi: 10.1364/OL.35.001338.
  • 29. Kumar N., Khare A., Boruah B., “Enhanced dynamic range of the grating array based zonal wavefront sensor using a zone wise scanning method,” Proc. SPIE 11287, E1–E6 (2020). doi: 10.1117/12.2542084.
  • 30. Carmichael Martins A., Vohnsen B., “Measuring ocular aberrations sequentially using a digital micromirror device,” Micromachines 10(2), 117 (2019). doi: 10.3390/mi10020117.
  • 31. Aftab M., Choi H., Liang R., Kim D. W., “Adaptive Shack-Hartmann wavefront sensor accommodating large wavefront variations,” Opt. Express 26(26), 34428–34441 (2018). doi: 10.1364/OE.26.034428.
  • 32. Yoon G.-Y., Pantanelli S., Nagy L. J., “Large-dynamic-range Shack-Hartmann wavefront sensor for highly aberrated eyes,” J. Biomed. Opt. 11(3), 1–3 (2006). doi: 10.1117/1.2197860.
  • 33. Pantanelli S., MacRae S., Jeong T. M., Yoon G., “Characterizing the wave aberration in eyes with keratoconus or penetrating keratoplasty using a high-dynamic range wavefront sensor,” Ophthalmology 114(11), 2013–2021 (2007). doi: 10.1016/j.ophtha.2007.01.008.
  • 34. Yoon G., “Large dynamic range Shack-Hartmann wavefront sensor,” U.S. patent 7,414,712 (19 Aug. 2008).
  • 35. Olivier S., Laude V., Huignard J.-P., “Liquid-crystal Hartmann wave-front scanner,” Appl. Opt. 39(22), 3838–3846 (2000). doi: 10.1364/AO.39.003838.
  • 36. Laude V., Olivier S., Dirson C., Huignard J.-P., “Hartmann wave-front scanner,” Opt. Lett. 24(24), 1796–1798 (1999). doi: 10.1364/OL.24.001796.
  • 37. Navarro R., Moreno-Barriuso E., “Laser ray-tracing method for optical testing,” Opt. Lett. 24(14), 951–953 (1999). doi: 10.1364/OL.24.000951.
  • 38. McKay G. N., Mahmood F., Durr N. J., “Large dynamic range autorefraction with a low-cost diffuser wavefront sensor,” Biomed. Opt. Express 10(4), 1718–1735 (2019). doi: 10.1364/BOE.10.001718.
  • 39. Shinto H., Saita Y., Nomura T., “Shack-Hartmann wavefront sensor with large dynamic range by adaptive spot search method,” Appl. Opt. 55(20), 5413–5418 (2016). doi: 10.1364/AO.55.005413.
  • 40. Levecq X. J.-F., Bucourt S. H., “Method and device for analysing a highly dynamic wavefront,” U.S. patent 6,750,957 (15 June 2004).
  • 41. Altmann G., “Method and apparatus for improving the dynamic range and accuracy of a Shack-Hartmann wavefront sensor,” U.S. patent application 10/013,565 (2003).
  • 42.Lindlein N., Pfund J., “Experimental results for expanding the dynamic range of a Shack-Hartmann sensor by using astigmatic microlenses,” Opt. Eng. 41(2), 529–533 (2002). 10.1117/1.1430724 [DOI] [Google Scholar]
  • 43.Wei X., Van Heugten T., Thibos L., “Validation of a Hartmann-Moiré Wavefront Sensor with Large Dynamic Range,” Opt. Express 17(16), 14180–14185 (2009). 10.1364/OE.17.014180 [DOI] [PubMed] [Google Scholar]
  • 44.Podanchuk D., Dan’ko V., Kotov M., Son J.-Y., Choi Y.-J., “Extended-range Shack-Hartmann wavefront sensor with nonlinear holographic lenslet array,” Opt. Eng. 45(5), 053605 (2006). 10.1117/1.2202358 [DOI] [Google Scholar]
  • 45.Ko J., Davis C. C., “Comparison of the plenoptic sensor and the Shack-Hartmann sensor,” Appl. Opt. 56(13), 3689–3698 (2017). 10.1364/AO.56.003689 [DOI] [PubMed] [Google Scholar]
  • 46.Gao Z., Li X., Ye H., “Large dynamic range Shack–Hartmann wavefront measurement based on image segmentation and a neighbouring-region search algorithm,” Opt. Commun. 450, 190–201 (2019). 10.1016/j.optcom.2019.05.045 [DOI] [Google Scholar]
  • 47.Leroux C., Dainty C., “A simple and robust method to extend the dynamic range of an aberrometer,” Opt. Express 17(21), 19055–19061 (2009). 10.1364/OE.17.019055 [DOI] [PubMed] [Google Scholar]
  • 48.Lundström L., Unsbo P., “Unwrapping Hartmann-Shack Images from Highly Aberrated Eyes Using an Iterative B-spline Based Extrapolation Method,” Optom. Vis. Sci. 81(5), 383–388 (2004). 10.1097/01.opx.0000135086.61760.b7 [DOI] [PubMed] [Google Scholar]
  • 49.Smith D. G., Greivenkamp J. E., “Generalized method for sorting Shack-Hartmann spot patterns using local similarity,” Appl. Opt. 47(25), 4548–4554 (2008). 10.1364/AO.47.004548 [DOI] [PubMed] [Google Scholar]
  • 50.Smith D. G., “High dynamic range calibration for an infrared Shack-Hartmann wavefront sensor,” (University of Arizona, 2008). [Google Scholar]
  • 51.Mauch S., Reger J., “Real-Time Spot Detection and Ordering for a Shack–Hartmann Wavefront Sensor With a Low-Cost FPGA,” IEEE Trans. Instrum. Meas. 63(10), 2379–2386 (2014). 10.1109/TIM.2014.2310616 [DOI] [Google Scholar]
  • 52.Ares M., Royo S., Caum J., “Shack-Hartmann sensor based on a cylindrical microlens array,” Opt. Lett. 32(7), 769–771 (2007). 10.1364/OL.32.000769 [DOI] [PubMed] [Google Scholar]
  • 53.Lee J., Shack R. V., Descour M. R., “Sorting method to extend the dynamic range of the Shack–Hartmann wave-front sensor,” Appl. Opt. 44(23), 4838–4845 (2005). 10.1364/AO.44.004838 [DOI] [PubMed] [Google Scholar]
  • 54.Lee W.-W., Lee J. H., Hwangbo C. K., “Increase of dynamic range of a Shack-Hartmann sensor by shifting detector plane,” Proc. SPIE 5639, 70–77 (2004). 10.1117/12.571615 [DOI] [Google Scholar]
  • 55.Pfund J., Lindlein N., Schwider J., “Dynamic range expansion of a Shack–Hartmann sensor by use of a modified unwrapping algorithm,” Opt. Lett. 23(13), 995–997 (1998). 10.1364/OL.23.000995 [DOI] [PubMed] [Google Scholar]
  • 56.Rocktäschel M., Tiziani H. J., “Limitations of the Shack–Hartmann sensor for testing optical aspherics,” Opt. Laser Tech. 34(8), 631–637 (2002). 10.1016/S0030-3992(02)00069-5 [DOI] [Google Scholar]
  • 57.Campbell C. E., “The range of local wavefront curvatures measurable with Shack-Hartmann wavefront sensors,” Clin. Exp. Optom. 92(3), 187–193 (2009). 10.1111/j.1444-0938.2009.00371.x [DOI] [PubMed] [Google Scholar]
  • 58.Hardy J. W., Adaptive Optics for Astronomical Telescopes (Oxford University Press, 1998). [Google Scholar]
  • 59.Yoon G., “Wavefront sensing and diagnostic uses,” in Adaptive Optics for Vision Science (Wiley, 2006), pp. 63–81. [Google Scholar]
  • 60.Nikitin A., Sheldakova J., Kudryashov A., Borsoni G., Denisov D., Karasik V., Sakharov A., “A device based on the Shack-Hartmann wave front sensor for testing wide aperture optics,” Proc. SPIE 9754, 97540K (2016). 10.1117/12.2219282 [DOI] [Google Scholar]
  • 61.Saita Y., Shinto H., Nomura T., “Holographic Shack-Hartmann wavefront sensor based on the correlation peak displacement detection method for wavefront sensing with large dynamic range,” Optica 2(5), 411–415 (2015). 10.1364/OPTICA.2.000411 [DOI] [Google Scholar]
  • 62.Curatu C., Curatu G., Rolland J., “Fundamental and specific steps in Shack-Hartmann wavefront sensor design,” Proc. SPIE 6288, 1–9 (2006). 10.1117/12.680892 [DOI] [Google Scholar]
  • 63.Rammage R., Neal D., Copland R., “Application of Shack-Hartmann wavefront sensing technology to transmissive optic metrology,” Proc. SPIE 4779, 161–172 (2002). 10.1117/12.451734 [DOI] [Google Scholar]
  • 64.Thibos L. N., Applegate R. A., Schwiegerling J. T., Webb R., “Standards for reporting the optical aberrations of eyes,” J. Refract. Surg. 18(5), S652–S660 (2002). [DOI] [PubMed] [Google Scholar]
  • 65.Goodman J. W., Introduction to Fourier Optics, 4th ed. (W. H. Freeman and Company, 2017). [Google Scholar]
  • 66.Kittel C., Introduction to Solid State Physics, 5th ed. (Wiley, 1976). [Google Scholar]
  • 67.Akondi V., Dubra A., “Multi-layer Shack-Hartmann wavefront sensing in the point source regime,” Biomed. Opt. Express 12(1), 409–432 (2021). 10.1364/BOE.411189 [DOI] [PMC free article] [PubMed] [Google Scholar]
