Author manuscript; available in PMC 2010 Nov 2. Published in final edited form as: Laser Photon Rev. 2008 Oct 1; 2(5): 350–376. doi: 10.1002/lpor.200810015

Laser applications and system considerations in ocular imaging

Ann E Elsner 1, Matthew S Muller 1
PMCID: PMC2967783  NIHMSID: NIHMS156795  PMID: 21052482

Abstract

We review laser applications for ocular imaging techniques, primarily in vivo, describing their constraints based on biological tissue properties, safety, and the performance of the imaging system. We discuss the need for cost-effective sources with practical wavelength tuning capabilities for spectral studies. Techniques to probe the pathological changes of layers beneath the highly scattering retina and to diagnose the onset of various eye diseases are described. The recent development of several optical coherence tomography based systems for functional ocular imaging is reviewed, as are linear and nonlinear ocular imaging techniques performed with ultrafast lasers, emphasizing recent source developments and methods to enhance imaging contrast.

Keywords: Retinal imaging, confocal imaging, Vertical Cavity Surface Emitting Laser (VCSEL), Adaptive Optics, OCT (Optical Coherence Tomography), swept source lasers, ultrafast lasers, nonlinear interactions

1. INTRODUCTION

Imaging of the eye presents several optical challenges, of which only some are found in microscopy. Millions of neural cells, together with their metabolic support structures, line the inside of the eyeball, forming the retina [1]. Figure 1 shows the large scale view of the eye in cross section, and Figure 2 shows detail of the layers of the retina.

Figure 1.

Schematic of the cross section of the human eye. Light enters the eye through the pupil, shown as blue rectangles, which severely limits the angle of both the illumination and detection of retinal targets. Light returns from the retina over a broad wavelength range when it is specularly reflected, directly backscattered, polarization retaining, or guided by photoreceptors, but contains mainly long wavelength information when it is multiply scattered.

Figure 2.

Schematic of the layers of the retina, with the most superficial at the top. The retinal pigment epithelium is shown in gold, to correspond to its most prominent fluorophore, lipofuscin. The choroid is shown in blue, to correspond to its absorption in the short wavelength range due to melanin, which is strong in darkly pigmented eyes. The photoreceptor outer segments are shown in green, to correspond to the broad band absorption of the three most common photopigments, peaking roughly at 500, 535, and 565 nm.

The healthy eye must be relatively transparent over a distance of roughly 22 mm, from cornea to retina, to permit the transmission of light to the photoreceptors, which are the neurons that lie beneath the rest of the retina. The photoreceptors guide the light, capture it, and transduce light energy to initiate a neural signal. The cornea is the main focusing element of the eye and is far more reflective than the retina, which returns at most only a few percent of the illumination.

With recent developments in lasers and electro-optics, there has been an explosion of methods aimed at imaging the structure and function of the retina. Not only are the photoreceptors and their constituent parts targets of imaging, but so is a better understanding of the photoreceptors' ability to guide light and produce a response. Similarly, there is interest in imaging the structure and function of at least seven cell types organized into different layers of the retina, including both neural and support cells and their components. Novel imaging modalities are aimed at applications optimized for the detection of pathological changes to the eye, such as new blood vessel growth in the retina or the cornea, edema, accumulation of debris, or death of neurons.

In this review, we describe some of the constraints on imaging placed by the eye, particularly the retina. We discuss the need for cost effective and practical manipulation of wavelength for spectral studies. We further show recent advances that allow new noninvasive techniques, which give information that is applicable to the management of eye disease. These techniques make use of not only differences in reflectivity and absorption, but also the waveguide properties of photoreceptors, the increase of scattered light in damaged tissues, the distribution and amount of endogenous fluorophores, and changes in the optical path length, birefringence, nonlinear susceptibility, and polarization in ocular tissue.

In clinical imaging of the eye, fundamental challenges are present regarding resolution, image contrast, safety, price, data acquisition speed, and system complexity. A primary drive has been to improve resolution, to image cellular structures, and to probe the processes underlying vision. Merely increasing the optical magnification in these situations does not achieve the desired resolution, due to a lack of contrast. In addition, higher magnification systems increase the cost of laser sources and imaging components, and safety becomes a concern at higher powers per sample volume. There are trade-offs in system design between the best axial and lateral resolutions, and eye motion places constraints on detection and data transfer rates. The illumination source must be stable over time to allow uniform performance at all locations within an image and across image sets. Imaging systems must have stability, ease of operation, and compact size for clinical use or high volume applications.

2. TARGETS OF OCULAR IMAGING AND IMAGING CONSTRAINTS

2.1 Retinal imaging

The normal retina, which is the innermost of the layers that line the inside of the eyeball, is itself organized in layers of tightly packed structures. Some of the layers, or the interfaces between them, are highly reflective, and lead to a strong signal in imaging, while others scatter light or are transparent. In vivo imaging of the retina encounters the problem that the superficial layer, where the gelatinous structure called the vitreous humor meets the retina, is one of the most reflective surfaces. Light that is not reflected by this interface is scattered as it travels deeper into the retina, interacting with each successive layer that might absorb, backscatter, or further scatter the light, thereby obscuring deeper structures.

The constraints of tissue optics, concerning both clarity and safety issues, are already severe, but imaging is made more difficult in situ by the geometry of the eye. The optics of the human eye are not diffraction limited; aberrations limit both vision and retinal imaging. Imaging of the human retina is typically performed in reflection mode, in which light returning back through the pupil is collected to form an image. Imaging an intact eye in transmission mode does not lead to high contrast, high resolution images of the retina due to the strong light scatter and absorption in the layers beneath the retina.

There are further constraints on retinal imaging placed by the entrance and exit pupil aperture, which is formed by the iris of the eye. The pupil ranges from 2 to 8 mm in diameter, depending on illumination, and the nodal distance from the pupil plane to the retina is about 17 mm. The resulting numerical aperture, defined as NA = n * sin(θ), where n is the refractive index of the medium and θ is one-half the maximum acceptance angle of light with respect to the optic axis, can be less than 0.1 when the eye is exposed to bright light. The low NA places a fundamental limit on the achievable resolution, and can influence the field of view, when imaging the retina. The pupil diameter is generally at the lower end of the range for older adults, who are more likely to have retinal disease, and for infants. The pupil is routinely widened, or dilated, by the instillation of medication to increase the NA, but this is not safe in all individuals; nor can a sufficiently large diameter always be achieved in older individuals or infants.
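As a rough illustration of this constraint, the short sketch below (in Python, assuming a simple reduced-eye geometry with a 17 mm nodal distance and a vitreous refractive index of about 1.336; the exact values depend on the eye model used) estimates the NA for constricted and dilated pupils.

```python
import numpy as np

def eye_numerical_aperture(pupil_diameter_mm, nodal_distance_mm=17.0, n=1.336):
    """Crude NA estimate for the eye: NA = n * sin(half acceptance angle)."""
    half_angle = np.arctan(0.5 * pupil_diameter_mm / nodal_distance_mm)
    return n * np.sin(half_angle)

print(round(eye_numerical_aperture(2.0), 3))   # constricted pupil: ~0.08, below 0.1
print(round(eye_numerical_aperture(8.0), 3))   # fully dilated pupil: ~0.31
```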

Beneath the retina are continuous layers that contain both cellular and acellular structures. These layers provide metabolic support and fluid balance, and contain melanin, collagen, and blood vessels. The layer directly beneath the retina is a monolayer of support cells, called the retinal pigment epithelium. These deeper layers are not only important as targets of disease pathology but also serve as reflectors for reflection-based imaging.

The retina is not only a curved surface, but also has several features that are elevated or depressed with respect to the neighboring structures [2]. There are two structures that grossly alter the uniformity of the retinal geometry and finely alter the retinal layers: the fovea and the optic nerve head. The fovea is the location of the most densely packed photoreceptors, and is the region that provides highest resolution vision. Overlying retinal layers are displaced radially, providing a unique near-surface view of the photoreceptors. The lack of overlying cells allows photoreceptors to detect light that is propagated with the minimum of scatter. The fovea forms a pit, which leads to a characteristic reflection, as shown in Figure 4. The optic nerve head is the structure where the nerve fibers exit to carry information to the brain. The retinal vessels enter and exit through the optic nerve head. The layers beneath the retina are nourished by a separate vascular supply, which lies wholly beneath the retina in the healthy eye and is difficult to image clearly. Nevertheless, there are important pathological changes that can occur in this region, such as new blood vessels that grow towards the retina and destroy it in diseases such as age-related macular degeneration. The optic nerve head, and the retinal nerve fiber layer, are crucial targets for imaging in diseases such as glaucoma. Gross changes in the thickness of this layer are measured with a variety of imaging techniques, with the goal of detecting abnormally thin locations. Finer changes indicate a loss of blood supply locally or damage to specific bundles of retinal nerve fibers, and are also of interest in retinal imaging to detect and manage glaucoma.

Figure 4.

Two samples from a cross sectional image of the living human retina, comparing a retinal region outside the fovea (left) with a region centered on the fovea (right), at locations indicated in Fig. 1. Each bright layer shows a strong index of refraction change between the stratified cellular and subcellular elements of the healthy retina. The more peripheral retina has additional layers of neural elements that are displaced away from the foveal pit, indicated by the white arrow, thereby decreasing the unwanted scattering of light before it reaches the photoreceptors. This image was produced by mapping the amplitude of the interference of coherent light against a reference of known position, known as Optical Coherence Tomography. The contrast in both the lateral and axial directions, as well as the uniformity over the retina, can be improved by using Adaptive Optics to correct the large wavefront aberrations in the human eye. These images were acquired from the instrument of Dr. Stephen A. Burns, using the Optical Coherence Tomography design of Dr. Daniel X. Hammer, as described in Section 5 and refs 58, 93. Note also that not all of the illumination light penetrates the sample at each location, consistent with absorption by the blood in the retinal blood vessels.

The retina possesses few structures to provide contrast when parameters such as absorption are used for imaging. The main absorbers in the normal retina include the macular pigment, which peaks at about 460 nm, hemoglobin and oxygenated hemoglobin, and the photopigments in the photoreceptors, as shown in Figure 5. Light that is specularly reflected at the retina is more likely to contain short wavelength information, since the blood in the retinal vessels is one of the main absorbers. The photopigments provide additional absorption, and lie beneath the retinal vessels. The cone photoreceptors act as optical waveguides that collect and return light over a limited range of angles, which can now be quantified with retinal imaging techniques [3]. Macular pigment has significant optical density only near the fovea, and the en face distribution varies among individuals and with age [4]. Light that has penetrated to the layers beneath the retina is absorbed by the melanin in the retinal pigment epithelium, the melanin in the choroid, and the choroidal blood supply, as well as extensively scattered by the randomly oriented collagenous support structures beneath the retina. Thus, light returning from the deeper layers contains relatively more scattered, randomly polarized, or diffusely guided light.

Figure 5.

Diagram of the principle of absorption and fluorescence, and the spectra of the major absorbers in the human retina: Le - lens, Me - melanin, Hb and HbO2 - hemoglobin and oxygenated hemoglobin, H2O - water, and MP - macular pigment. Fluorescence occurs as a function of the excitation spectrum of the molecule, as shown by the blue curve on the top left, and also of the amount of light that penetrates the absorbers. Short wavelength light penetrates poorly to the deepest layers of the human eye, and so the wavelength at which the most excitation occurs may not return as much fluorescence signal as longer wavelength excitation that avoids the local absorption maxima associated with blood. The emission spectrum, shown in the top right curve, is shifted towards longer wavelengths, which often encounter less absorption in the human eye.

There are several endogenous fluorophores of interest, such as the fluorescent debris known as lipofuscin, which builds up with disease and aging in the retinal pigment epithelium. In vivo studies of the fluorescence spectra indicate broad maxima and potentially more than one fluorophore component [5]. Given the broad absorption and emission maxima, small changes in wavelength of the source illumination are often not helpful in modeling these absorbers or fluorophores. The most desirable wavelengths are not always available in single line laser sources. The cost, and sometimes the complexities of operation, of tunable wavelength lasers leads researchers to select sources such as Xenon lamps and other broadband sources with grating monochromators or filters to control wavelength [5,6].

A common method to improve contrast in retinal imaging is the use of contrast agents, such as sodium fluorescein, as shown in Figures 6B,C, or indocyanine green. These have strict FDA approval procedures, and institutional guidelines for human subjects, since they are injected into the circulatory system. The risks of fatal adverse events with these procedures are low, but exist. Oral ingestion of sodium fluorescein does produce useful retinal imaging, and can be used to detect pathology needing treatment in age-related macular degeneration [7]. However, the oral method does not provide visualization of a bolus of dye moving through the vessels, and therefore provides less information about leakage of dye over time. Intravitreal injections carry even higher risks than dye injection into the blood stream, and are therefore typically used only for treatment.

Figure 6.

Comparison of imaging methods in the macula of a patient with age-related macular degeneration. A- Color photograph showing new vessel growth and leakage, seen as a gray-green membrane, inferior to the fovea. B- A fluorescein angiogram with 488 nm Argon laser illumination, showing the early phase of dye leakage out of the leaking vessels. C- Fluorescein angiogram, showing a later phase, with more dye leakage at the location of the new vessel growth. D- A 780 nm infrared diode laser image at the magnification shown by the white box in panel B, computed from light depolarized by retinal structures, showing a white halo and dark core that localize the new vessel membrane. The methods are as in ref 53. E- A matching infrared image, but containing both polarized and depolarized light, showing the membrane but less well than in D. F- An infrared image computed from the same dataset as D, but using only light that is polarized perpendicularly to the illumination, showing the white halo at the rim of the new vessel membrane. The top of the macular cross is seen above the halo, where there is healthier retina.

The clinical applications of ocular imaging have also become major research and commercial applications, driving the development of novel light sources for ophthalmic imaging. Leakage of blood or fluid from the retinal blood vessels occurs in two of the main causes of vision loss in industrialized countries: age-related macular degeneration and diabetic retinopathy. Therefore, a significant amount of imaging research has been devoted to this problem. This leakage can be thought of as a failed attempt at wound healing, an inflammatory response, or simply damage to the retinal blood vessels from disease processes. New blood vessels subsequently grow, and these often lack the structural components needed to contain all the blood within their walls. The leakage from blood vessels and damaged cells leads to edema that is visualized in retinal imaging and quantified as an increase in retinal thickness.

2.2 Corneal and anterior segment imaging

The cornea is another important target in ocular imaging. The cornea is the main focusing element of the eye, with the lens providing variable focus in younger people while it is still flexible. With the high optical power of the cornea, relatively small changes in its surface lead to aberrations that reduce vision. As with the retina, the corneal tissues must be well-ordered to maintain clarity. The cornea must survive or be rebuilt to last a lifetime without an immediately adjacent blood supply. In recent years, the health of the cornea has become a key issue when considering the safety and efficacy of various forms of refractive surgery.

The clarity of the cornea provides a challenge for imaging, in that there are few structures to provide contrast when parameters such as absorption are used. A major limitation for corneal imaging is that the cornea is so highly curved that it is difficult to have more than a small area in sharp focus in a single image. It is also difficult to image the cornea because, unlike the retina, there is not a continuous structure directly behind it to serve as a reflector for reflection-based imaging. The iris of the eye is available as a reflector for only a portion of the cornea, and is not behind the central cornea, i.e. at the location of the main optic axis. Unlike the retina, however, the topical use of contrast agents, such as sodium fluorescein, is commonplace and relatively safe in a sterile environment.

One measurement of interest for corneal imaging is overall corneal thickness, which is about 550 microns, but varies more than ± 100 microns across individuals. The thickness also varies in each individual with the radial distance from the optic axis. An accurate measurement of the corneal thickness is an important factor in determining whether sufficient material can be removed or remodeled, such as by laser refractive surgery, to obtain the desired optical correction. A second use of corneal thickness is to assess whether the intraocular pressure in a given eye is an accurate measure of the risk of glaucoma. If the cornea is thin or thick, then it is assumed that the intraocular pressure may be exerting more or less force, respectively, on the neural structures in the retina than would be expected.

Other aspects of corneal imaging include monitoring the thickness and structure of the individual tissues, such as the stroma, which is comprised of the collagen layers that vary in their orientation and provide the support for the cornea. The corneal endothelial cells are another important target, as sufficient and healthy endothelial cells are needed to maintain corneal clarity, and hence vision. The number of endothelial cells decreases during an individual's lifespan and also with trauma or surgery. For this reason, major research efforts have been made to reliably and reproducibly image the cornea and its tissues with optical means.

The eye has a variety of anterior structures that are not as readily imaged as the cornea, and contain more complex layered structures and cell types, more similar to the retina. One of the most important of these is the ciliary body, where several functions occur, such as the movement of muscles and fibers to control the shape of the lens. In addition, there is a fibrous network in the angle between the cornea and the iris that serves as a canal through which fluid flows out of the anterior eye to regulate intraocular pressure, which is important for preventing glaucoma. As compared to retinal imaging, the safety standards for illuminating the cornea and anterior structures of the eye are lenient. Perhaps more importantly, the severe constraints on numerical aperture, and on the resulting apparatus configuration, are reduced or removed.

The ciliary body structures have recently been imaged with some of the most novel laser and imaging applications, some of which would not be safe in their present form for retinal imaging due to the high levels of laser power or the lack of dissipation of energy from brief exposures, coupled with the light starved conditions present in retinal imaging.

3. LASERS AND SAFETY IN OCULAR IMAGING SYSTEMS

The familiarity with laser instrumentation likely aided vision researchers in the implementation of novel lasers as sources for ocular imaging devices [8,9]. Lasers have long been used as treatment devices for photocoagulation in ocular disease, described by several groups in 1963 [10,11]. Lasers have also proved advantageous as high spectral purity light sources to probe the human visual system [12,13]. In the majority of applications for imaging, the research or clinical goals included a source with excellent reliability, given that the source was just one of many components in a complex system. To ensure reliability, care must be taken in the optical design to prevent external reflections from components or targets from reentering the laser cavity. Even small amounts of feedback can disrupt laser operation, reducing power, adding spectral noise, and potentially causing permanent damage. Common methods to eliminate these problems include tilting components, using polished fiber connectors that are cleaved at an angle, or adding an optical isolator. Further, the desired cost of each system is often far less than that of other imaging modalities, such as magnetic resonance imaging. The model of a few centers with expensive devices has not been adopted in eye care, possibly because ocular imaging has historically been performed at a large number of sites, and mass screening for a particular eye disease is rare.

As there has been widespread use of lasers for treatment and in laboratories, the possibility of damage to retinal and corneal tissues has made safety a primary consideration. The American National Standards Institute (ANSI) has specific standards for eye safety that take into account many of the important experimental variables, such as laser power delivered at the cornea, viewing time, beam diameter or full-field viewing, the exposure area on the retina, and the wavelength. The revised standards describe the light safety issues for ophthalmic instruments, including fundamental requirements and test methods [14]. Near infrared radiation has long been permitted relatively more exposure in power or duration than visible wavelength radiation, which is one of the key reasons for its widespread use in imaging devices. The standards have been altered over the past several years to allow an instrument to emit more radiation or to be used for longer exposures while still staying within the Maximum Permissible Exposure guidelines. A recent review describes some of the guiding principles and remaining questions, such as the incorporation of pupil constriction or aversion of gaze with bright lights [15]. For human retinal imaging applications with continuous viewing, the power is measured at the cornea, with retinal exposures ranging from 14 μW/cm2 to 450 μW/cm2 for visible wavelengths, and up to 30 mW/cm2 in the near infrared. The light used for imaging the cornea and other anterior segment structures is typically higher. The use of pulsed lasers is covered by the guidelines, which were historically devised for continuous wave radiation. The safety guidelines cannot cover the needed detail for all types of pulsed lasers, particularly the emerging class of ultrafast lasers.

For continuous wave laser radiation, modern retinal imaging techniques provide light economy and sensitivity, so that the recommended limits of radiation are not exceeded. Several photographic or ophthalmoscopic techniques remain in widespread use, despite having extremely bright and uncomfortable levels of visible wavelength light including short wavelength light, which are potentially too intense to conform to safety guidelines. One example is angiography with sodium fluorescein, which requires short wavelength light to excite the dye. Illumination is particularly intense when there is low detector sensitivity, coupled with poor light return from the retinas in eyes that have increased absorption in the aging lens at short wavelengths, small pupils, or dark choroidal pigmentation. Recent noninvasive techniques have been demonstrated to reduce the need for angiography, as described below.

4. LIGHT DELIVERY: LASER SCANNING AND CONFOCAL IMAGING SYSTEMS

With the advent of lasers as illumination sources, spatial and temporal coherence have become increasingly familiar parameters. The interference of the random phases of a scattered illumination beam gives rise to a granular noise present in imaging, commonly called speckle. A common way to reduce the influence of unwanted speckle in an image is to spatially average it over each position using motion. In flood illumination, this can be accomplished by a rotating or scanning element. An alternative speckle-reducing technique is to implement a laser scanning system, in which a focused spot [16] or slit [17–21] of light is scanned across the target, thereby reducing speckle. This also allows the detection of the light returning from the target to temporally correspond to the region illuminated at a given time, so that an image is built up from temporal samples. In point scanning, a single high sensitivity detector was originally used [16], and current implementations coordinate a fast scanner and an orthogonal slow scanner to produce a raster pattern over the sample. With slit scanning, solid state detector arrays are now used in the form of a linear array [18] or a two dimensional array that has the advantages of a modern digital camera [19–21]. The use of slit scanning eliminates the need to coordinate between two scanning mechanisms, reducing the complexity with only a minor increase in speckle noise, leading to the development of the Laser Scanning Digital Camera [19–21], a low cost retinal imaging device.
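To illustrate why averaging uncorrelated speckle realizations reduces the granular noise, the following sketch (Python/NumPy, with an arbitrary simulated circular pupil; not a model of any particular instrument) generates fully developed speckle patterns and shows that the speckle contrast of an N-frame average falls roughly as 1/√N.

```python
import numpy as np

rng = np.random.default_rng(0)
N_GRID = 256
yy, xx = np.mgrid[-1:1:N_GRID * 1j, -1:1:N_GRID * 1j]
pupil = (xx**2 + yy**2) <= 0.25            # arbitrary circular pupil mask

def speckle_frame():
    """One fully developed speckle pattern: a random-phase field through the pupil."""
    field = np.fft.fft2(pupil * np.exp(1j * rng.uniform(0, 2 * np.pi, pupil.shape)))
    intensity = np.abs(field) ** 2
    return intensity / intensity.mean()

def contrast(img):
    return img.std() / img.mean()           # speckle contrast, ~1 for a single pattern

single = speckle_frame()
averaged = np.mean([speckle_frame() for _ in range(16)], axis=0)
print(round(contrast(single), 2))           # ~1.0
print(round(contrast(averaged), 2))         # ~0.25, i.e. 1/sqrt(16)
```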

Confocal imaging, in which an aperture is placed in a plane conjugate to the sample, is used to reduce the focal volume of light that can reach the detector. Confocal imaging, implemented with or without laser scanning, allows better control of light returning from the sample, as discussed in detail below. Several forms of this technique have found widespread clinical use, providing retinal images that document pathology, or tomographic measurements of retinas thickened by accumulation of fluid or debris or thinned by the death of neurons.

4.1 Ocular imaging with visible wavelength sources

The original sources for retinal imaging with scanning laser techniques were primarily gas lasers [16, 22]. To obtain sufficient spectral purity when sources such as argon lasers or argon krypton lasers were used, spectral filters were inserted into the illumination path. Because multi-line argon lasers have laser lines that are quite similar in wavelength, 6 cavity interference filters were used to obtain spectrally pure sources [23]. Argon lasers provide considerable power and a choice of laser lines, but the beam intensity profile is often not the Gaussian shape that produces the small point spread function on the target that is desirable for most imaging applications. Historically, imaging systems that use argon lasers and also require a precise beam position suffer from the need for frequent realignment. Considerable cooling is needed, which is also problematic in the context of an imaging device to be used in a clinical setting. Newer commercial devices have replaced the 488 nm line used for imaging with a solid state, diode pumped laser, but only one line is typically available [24].

The 633 nm helium neon (HeNe) laser, which readily provides a Gaussian beam shape with low divergence, is widely used, and available in a range of powers from 2 – 10 mW at moderate cost. However, for specific applications such as measuring the photopigment in photoreceptors, utilizing a range of wavelengths closer to the absorption maxima of the pigments provides better signal-to-noise ratio for mapping cone photopigment in the human fovea [4,23]. One example is the 594 nm HeNe laser, which is near the well-known sodium line and is a stable source with a good beam shape for imaging (Figure 8). This source is not widely available in the power range necessary for all applications, i.e. greater than 2.5 mW.

Figure 8.

Retinal image of a normal subject's eye, acquired with a confocal scanning laser system at 594 nm, using the technique of ref 23. Retinal blood vessels are in good contrast, given the strong absorption at 594 nm, and the control of unwanted scattered light.

Recent advances have led to a broader range of choices of solid state, visible wavelength lasers, which are now used in clinical devices for retinal imaging that originally were validated with HeNe lasers [25]. The use of 543 nm HeNe lasers for retinal imaging [26] is limited by their power, and continuous wave frequency-doubled yttrium aluminum garnet (YAG) lasers at 532 nm can be used instead for applications in which the slightly shorter wavelength is acceptable [27].

4.2 Ocular imaging with near infrared sources

Near infrared imaging is made possible by advances in solid state detectors that are sensitive at the illumination wavelengths, such as silicon-based avalanche photodiodes, and in their stabilization circuitry [23]. For retinal imaging, there is less absorption in the near infrared compared to visible wavelengths, which results in far more light return but an increased amount of undesirable scatter. Prior to the use of laser scanning systems and confocal imaging, near infrared imaging was thought to be characterized by poor contrast, and retinal tissues were thought to exhibit diattenuation rather than birefringence [28]. However, with the use of laser scanning in the delivery and collection of light, along with confocal apertures, high contrast images are obtained in the red and near infrared wavelength ranges at comfortable and safe light levels [23, 29]. Contrary to previous conclusions, the retina exhibits strong birefringence in the near infrared range. The thickness of the retinal nerve fiber layer can be determined from phase retardance measurements at both 633 nm [30] and 780 nm.

Using an edge-emitting diode laser at 830 nm, deep pathological changes, such as hyperpigmentation in the retinal pigment epithelium due to the clumping of dead cells, could be seen, and focal defects in the sensitivity to light, compared with more normal retina, were measured at these locations [31]. To further explore the imaging of deeper structures, longer wavelength sources with better beam quality were sought.

The success of near infrared imaging with a diode laser led to the application of novel sources with Gaussian beam shapes to the field of imaging, including a continuous wave titanium-doped sapphire (Ti:sapphire) laser and the low coherence and efficient Vertical Cavity Surface Emitting Laser (VCSEL) [8,9]. Using a tunable Ti:sapphire laser, high contrast images were obtained in both reflected light imaging and indocyanine green angiography that proved to be far superior to angiography performed with flood illumination [8,26,32]. The expected improvement in penetration to the deeper layers was demonstrated, visualizing debris beneath the retina [8,26]. Other novel near infrared sources include a diode pumped tunable Cr:LiSAF laser, tunable over about 810–870 nm with excellent beam quality. This source provided excellent images that were used to readily visualize features in normal subjects [33] and pathological features in age-related macular degeneration [34,35]. These near infrared illumination sources have also been used in combination with a variety of visible wavelength sources that can provide visual stimuli to probe retinal function while simultaneously imaging structures within or beneath the retina [23].

4.3 Retinal imaging to probe deeper layers by confocal imaging and multiply scattered light imaging with near infrared sources

The usual goal of confocal imaging is to provide detail limited to a given range of tissue depth. Light returning from strong index of refraction changes dominates the signal collected through a small aperture that is conjugate to the plane of the target, such as the retina or cornea. The removal of light from other planes of focus increases the contrast in the plane of focus, by also removing light that is multiply scattered in a lateral direction, i.e. within the same plane as the target. The optics of the human eye are diffraction limited over the central 2 mm of the cornea, and permit some degree of resolution of structures in depth and improved resolution of some structures in en face images. Tomography of the retina, performed by acquiring a series of images varying only in focal plane, is used to explore the topography of the retina to detect glaucoma [36] and retinal diseases [37].

Confocal imaging of the retina is limited by both the optics of the human eye and the entrance/exit pupil size. The maximum potential lateral resolution is inversely proportional to the numerical aperture, i.e. 1/NA. The maximum potential axial resolution is proportional to 1/NA². Given the low NA of the human eye, and the aberrations introduced by its optics, the axial resolution provided by confocal sectioning does not provide complete discrimination against light outside the plane of focus. The maximum potential lateral and axial resolutions have been computed as 10 microns and 250 microns FWHM, respectively, using 633 nm and the Gullstrand model eye [38]. Commonly used confocal apertures of 2 – 5 times the diameter of the illumination point spread function on the healthy retina lead to proportionally more multiply scattered light than is collected in most ex vivo microscopy applications. Nevertheless, confocal imaging improves the contrast of selected superficial or deeper structures, including the new vessel membranes that grow beneath the retina in exudative age-related macular degeneration [39]. Although the axial transfer function does not meet the Rayleigh criterion commonly used for imaging resolution, it differs significantly from the normal transfer function in the presence of exudation in age-related macular degeneration [40]. When the new blood vessels grow in the retina [41], they are accompanied by unusually large increases in retinal thickening [42]. Although the superficial layers are bright compared with the deeper ones, very simple forms of image processing provide excellent detection and localization of new vessel membranes that agree well with angiographic methods [26, 43].
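As an illustration of this scaling, the following sketch (Python) evaluates common textbook approximations, lateral resolution ≈ 0.61·λ/NA and axial FWHM ≈ 2·n·λ/NA², for several pupil-limited NA values at 633 nm. These simple formulas are assumptions for illustration only and do not reproduce the Gullstrand model-eye calculation of ref 38; they show why the axial figure degrades much faster than the lateral figure as NA falls.

```python
import numpy as np

def lateral_resolution(wavelength, na):
    """Rayleigh-type lateral resolution estimate, proportional to 1/NA."""
    return 0.61 * wavelength / na

def axial_fwhm(wavelength, na, n=1.336):
    """Approximate axial FWHM of the confocal section, proportional to 1/NA^2."""
    return 2.0 * n * wavelength / na**2

wavelength = 633e-9
for na in (0.05, 0.10, 0.20):
    print(f"NA={na:.2f}: lateral ~{lateral_resolution(wavelength, na)*1e6:.1f} um, "
          f"axial ~{axial_fwhm(wavelength, na)*1e6:.0f} um")
```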

The most familiar method of imaging deeper layers with confocal imaging is to configure the illumination and detection apertures to maximize the light return in a small focal volume in a deep plane of focus. In this case, bright, on-axis light returning from outside the focal volume can still reach the detector, leading to decreased contrast of the deeper layers. An alternative approach is to use the equivalent of dark field microscopy, such that the illumination and confocal aperture are spatially mismatched to minimize the strong signal returning from the superficial retina and maximize the light returning from the deeper layers of interest. In diseases such as age-related macular degeneration, the disease mechanisms begin early in the photoreceptors and retinal pigment epithelium. Alterations to these structures, along with the growth of new blood vessels that leak, provide a wide range of structures that scatter light in a far different manner from healthier retina. Detecting these structures noninvasively is possible by forming images from light that is multiply scattered off these structures. In the eye, since there are several scattering layers and little absorption superficial to the photoreceptors and retinal pigment epithelium, this technique works well.

There are several methods to obtain multiply scattered light images with scanning laser, confocal imaging systems. The most readily configured version is to use a single axis for illumination and detection, and to employ annular apertures appropriately. This design provides rapid, noninvasive detection and localization of exudation and deep retinal structures that scatter light in inflammatory disease and age-related macular degeneration [26,35,43,44], as illustrated in Figures 10 and 11.

Figure 10.

Retinal images of a patient with early age-related macular degeneration, showing two different types of pathology, with confocal imaging employing directly backscattered light on the left and multiply scattered light imaging on the right. The left image was acquired as in ref 44, using a 790 nm diode laser and a confocal aperture, and shows the highly reflective debris that lies beneath the retina. The right image was acquired with a 633 nm HeNe laser and an annular aperture, which blocks the directly backscattered light and emphasizes the multiply scattered light. Clumps of dead retinal pigment epithelial cells are seen as dark spots, indicating severe and permanent damage that is not seen in the confocal image.

Figure 11.

Retinal images, acquired with a Ti:Sapphire laser, of a patient with a new blood vessel membrane, indicated by black arrows. The left image was acquired as in ref 26, at 860 nm and using a confocal aperture, showing only the superficial part of the new vessel membrane. The right image was acquired with an annular aperture, and the extent of the new blood vessel membrane is clearly larger than that indicated by the superficial information.

Another multiply scattered light imaging system design provides illumination at a range of positions with respect to the retina, and detects the return light through a single confocal aperture. This design is very stable, requires fewer moving parts, and can employ an array of VCSEL elements that are individually addressable. When the spatially separated source elements are combined with tomographic imaging, both the benefits of confocal and multiply scattered light imaging can be realized, as illustrated in Figure 12. This design provides a better understanding of exudative conditions, such as age-related macular degeneration and diabetic macular edema [44–46], since exudation does not always provide signal based on a strong index of refraction change.

Figure 12.

Series of a normal subject's retina at increasing focal depths, using an 850 nm VCSEL source. The images in the left column were acquired with only the center laser element illuminated, on axis with the confocal aperture. The light return is strongest at the superficial retina. The images in the right column use illumination from eight surrounding laser elements for multiply scattered light imaging, providing visualization of blood vessels in the choroid, deep beneath the retina.

A final design example, which employs a novel detection pathway, is the Laser Scanning Digital Camera with a flexible electronic aperture [21,47]. This system uses slit scanning illumination, coupled with synchronous line read-out on a CMOS detector, which acts as the confocal aperture. By varying the CMOS line read-out with respect to the illumination slit position, it is possible to image both directly backscattered and multiply scattered light from the sample [47].

4.4 Retinal imaging with polarized light to probe deeper layers by multiply scattered light imaging and detect birefringence alterations

The use of multiply scattered light to image structures beneath the highly reflective superficial layer can also be performed by means of scanning laser polarimetry. In the near infrared region, linearly polarized light is backscattered with some retardance from the superficial layers and also penetrates the deeper retinal layers. With increasing penetration depth, there is a greater chance of interaction with the retinal layers, which increasingly randomizes the polarization. In diseased eyes, pathological changes disrupt the organization of the retina, and there is relatively more scatter than in healthy eyes. By illuminating with linearly polarized light whose orientation is varied over a sequential series of acquired images, and analyzing the polarization properties at each pixel in the image, it is possible to form a series of images that vary in polarization content, as shown in Figures 6D–F [48–54]. The depolarized light image, shown in Figure 6D, is computed by assigning to each pixel the minimum grayscale value at that location over the image series. The average image, shown in Figure 6E, is computed by averaging the grayscale values at that location over the entire image series. The birefringence image, shown in Figure 6F, is computed by assigning to each pixel the amplitude of the variation in grayscale over all the polarization conditions.
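These per-pixel computations are simple to express directly; the sketch below (Python/NumPy, with the amplitude of variation taken as the maximum minus the minimum, one reasonable reading of the description above) operates on a stack of K frames acquired at different input polarization states.

```python
import numpy as np

def polarimetry_images(stack):
    """stack: array of shape (K, H, W), one frame per input polarization state."""
    depolarized = stack.min(axis=0)                   # Fig. 6D: per-pixel minimum over the series
    average = stack.mean(axis=0)                      # Fig. 6E: per-pixel mean over the series
    birefringence = stack.max(axis=0) - depolarized   # Fig. 6F: amplitude of the modulation
    return depolarized, average, birefringence
```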

The axons of the foveal photoreceptors create a small region of birefringence that surrounds the fovea, and produces the well-known macular cross when viewed in polarized light. The presence and location of the macular cross have been used to demonstrate the existence of cone photoreceptors, even in the presence of severe exudative age-related macular degeneration [52,53], as seen in Figure 6F.

4.5 Retinal imaging with adaptive optics to improve axial and lateral resolution, increase contrast, and probe deeper layers

The wavefront aberrations associated with the cornea and lens may be detected with a Hartmann-Shack wavefront sensor at a conjugate pupil plane and corrected by the use of a phase plate [55], deformable mirror [56–58], or liquid crystal spatial light modulator [59], as shown schematically in Figure 13. The wavefront corrections result in improved resolution in both the axial and lateral dimensions, by decreasing the point spread function on the target layer of interest. Due to the spatial resolution and finite number of elements of the Hartmann-Shack sensor, the sampling of the wavefront is limited, and measurement accuracy can be further degraded by speckle noise. To reduce speckle, sources with high spatial coherence but low temporal coherence, such as one or more superluminescent diodes, are now commonly used for both the wavefront aberration and imaging pathways.

Figure 13.

Schematic representation of an adaptive optics imaging system.
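For readers unfamiliar with the wavefront sensing step sketched in Figure 13, the minimal example below (Python/NumPy; a hypothetical low-order modal fit, not the reconstructor of any cited system) estimates wavefront coefficients from Hartmann-Shack slope measurements by linear least squares. A real adaptive optics loop would typically fit Zernike modes or use a zonal reconstructor, and would then map the estimated wavefront onto deformable mirror or spatial light modulator commands.

```python
import numpy as np

def reconstruct_wavefront(x, y, slope_x, slope_y):
    """Fit W(x, y) = a1*x + a2*y + a3*x^2 + a4*x*y + a5*y^2 to measured slopes.

    x, y: lenslet centers in normalized pupil coordinates.
    slope_x, slope_y: measured local wavefront slopes dW/dx and dW/dy.
    """
    zeros = np.zeros_like(x)
    ones = np.ones_like(x)
    # Analytic derivatives of each basis term with respect to x and y.
    d_dx = np.column_stack([ones, zeros, 2 * x, y, zeros])
    d_dy = np.column_stack([zeros, ones, zeros, x, 2 * y])
    A = np.vstack([d_dx, d_dy])
    b = np.concatenate([slope_x, slope_y])
    coeffs, *_ = np.linalg.lstsq(A, b, rcond=None)
    return coeffs   # piston is unobservable from slopes and is omitted
```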

Excellent images of the human retina are obtained when combining adaptive optics with a high magnification scanning laser ophthalmoscope (AOSLO), as shown in Figure 14. Taking images across the retina sequentially allows montages to be built up. One of the main applications for this hybrid system is the imaging of the photoreceptors in an en face manner. The increased optical sectioning clearly shows the light returning from each individual cone photoreceptor in the healthy retina, as shown in Figure 14, or alternatively the retinal nerve fiber layer in the superficial retina, as shown in Figure 15.

Figure 14.

AOSLO images of the human retina, comparing images with the Adaptive Optics correction turned off as opposed to on. Each white dot shows the light return from an individual cone photoreceptor.

Figure 15.

AOSLO image of the human retina, focused more superficially to show the nerve fiber layer and blood vessels in the retina.

Fluorescence images at high magnification in a primate model have been obtained, revealing cellular structures [57]. The sources were chosen to balance the excitation and extinction spectra of the fluorophores with the light absorption of the retina. Images were taken with simultaneous exposure of light at three wavelengths: an argon/krypton tunable source at 530 nm or 568 nm for fluorescence excitation, an 830 nm superluminescent diode for reflectance imaging, and a 904 nm superluminescent diode for wavefront measurement. The combined photochemical and thermal exposure was kept below the ANSI maximum permissible exposure limits for extended sources over a two-hour exposure time. To image the endogenous fluorescence of the lipofuscin in the retinal pigment epithelial cells, the retina was simultaneously illuminated at 568 nm, but no dye was injected, as shown in Figure 16. Simultaneous acquisition of reflectance and fluorescence images allows registration and signal averaging of the low contrast fluorescence images based on the features in the high contrast reflectance images. To label the ganglion cells and their axons, rhodamine dextran dye was injected into the lateral geniculate nucleus, as shown in Figure 17.
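A sketch of this registration-and-averaging step is given below (Python with NumPy/SciPy; hypothetical helper names, and phase correlation is used here only as one plausible shift estimator, not necessarily the registration method of ref 57). Shifts are estimated on the high contrast reflectance frames and then applied to the simultaneously acquired fluorescence frames before averaging.

```python
import numpy as np
from scipy.ndimage import shift as translate

def phase_correlation_shift(reference, frame):
    """Return the (dy, dx) integer translation that aligns `frame` to `reference`."""
    cross_power = np.fft.fft2(reference) * np.conj(np.fft.fft2(frame))
    corr = np.fft.ifft2(cross_power / (np.abs(cross_power) + 1e-12)).real
    peak = np.unravel_index(np.argmax(corr), corr.shape)
    return tuple(p if p < s // 2 else p - s for p, s in zip(peak, corr.shape))

def register_and_average(reflectance_frames, fluorescence_frames):
    """Average low-SNR fluorescence frames using shifts measured on reflectance frames."""
    reference = reflectance_frames[0]
    aligned = []
    for refl, fluo in zip(reflectance_frames, fluorescence_frames):
        dy, dx = phase_correlation_shift(reference, refl)
        aligned.append(translate(fluo, (dy, dx), order=1))
    return np.mean(aligned, axis=0)
```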

Figure 16.

AOSLO image of a primate retina, focused deeper to show the endogenous fluorescence from the retinal pigment epithelium, as in ref 57. Each hexagonal structure is a single retinal pigment epithelial cell. Images were provided by Dr. David Williams and Jessica Wolfing.

Figure 17.

AOSLO image of the primate retina as in ref 57, focused more superficially to show the nerve fiber layer, stained ganglion cell bodies and their axons, and blood vessels in the retina. Images were provided by Dr. David Williams and Jessica Wolfing.

5. OPTICAL COHERENCE TOMOGRAPHY

5.1 Overview

The development of optical coherence tomography (OCT), as a three-dimensional biological imaging system, has undergone remarkable progress for a wide range of applications in diagnostic medicine, biology, data storage and material science [59]. Its real-time operation at high resolution (1 to 15 μm) permits non-invasive imaging at depths of up to several millimeters in tissue.

OCT-based systems perform imaging by analyzing the interference between broadband light backscattered from a sample and light reflected in a reference arm with a known path delay. The most common implementation uses a Michelson interferometer, and determines the backscattered intensity with respect to sample depth at one transverse point on the sample at a time. A three-dimensional image is built up by raster scanning the beam across the sample. Recent comprehensive reviews of the progress and development of OCT-based systems provide more extensive detail concerning specific techniques [6063].

The earliest OCT systems, commonly referred to as time-domain OCT (TD-OCT), used a reference arm with a path delay that is mechanically stepped across the full sample depth range, as depicted in Fig. 18. At each reference arm position, the intensity of the interference is recorded by a photodetector, yielding the scattering depth profile for the reference arm range of motion. The fact that the reference path length must be mechanically swept to obtain an axial scan limits the acquisition speed: using a rapid scanning delay line, an 8 kHz Doppler OCT implementation has been demonstrated [64], and a 3.125 kHz system has been commercialized and adapted for high resolution imaging in ophthalmology [65, 66]. Despite its initial commercialization for clinical use, the popularity of TD-OCT has diminished in recent years due to the development of spectral-domain OCT systems with higher acquisition speeds and sensitivities. However, TD-OCT is still useful for ophthalmic applications that image with combined high axial and transverse resolutions, either by generating en face cross-sectional scans [67], or by employing adaptive focusing optics to overcome higher order ocular aberrations [68].

Figure 18.

Schematic of a free-space time-domain OCT (TD-OCT) system

More recent Fourier-domain OCT (FD-OCT) systems differ from TD-OCT in that the reference arm remains fixed, and the spectral interferogram is measured with a spectrometer comprising a fixed grating and solid state detector array (Fig. 19). An inverse Fourier transform is applied in post-processing to reconstruct the scattering depth profile, achieving the same axial resolution as obtained in TD-OCT systems. To prevent the aliasing of scattering interfaces over the processed depth profile, the source spectrum must be measured with a sufficiently narrow spectral resolution, requiring the use of line-scan cameras with typically 512, 1024 or 2048 pixels, depending on the source bandwidth and maximum penetration depth of the sample. The electronic line-scan camera provides a faster scan rate than the mechanical mirror scan rates achieved in TD-OCT. In addition, by spreading the imaging spectrum across many pixels, the noise is reduced, permitting a higher sensitivity with respect to TD-OCT [69, 70]. FD-OCT imaging of the retina has been accomplished with a 6 μm axial resolution and axial scan rate of 29.3 kHz [71]. By incorporating adaptive optics, the morphology of cone photoreceptors has been monitored in vivo with FD-OCT at even higher speeds (75 kHz) to reduce motion artifacts [72]. Full-range FD-OCT, in which positive and negative path delays are resolved, has been applied to anterior segment imaging at 1310 nm, where long depth ranges are required for unaliased structural visualization [73]. In this case, FD-OCT imaging at an axial resolution of 9 μm and a scan rate of 6.7 kHz provided a corneal thickness map and the determination of the angle between the cornea and the iris, both of which are important for the diagnosis of glaucoma.

Figure 19.

Schematic of a free-space Fourier-domain OCT (FD-OCT) system
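The core Fourier-domain reconstruction step can be illustrated with a small simulation (Python/NumPy; arbitrary example reflector depths and a Gaussian source spectrum sampled uniformly in wavenumber, which sidesteps the spectrometer resampling a real system requires): the inverse FFT of the spectral interferogram places a peak at the depth of each reflector.

```python
import numpy as np

# Simulated spectral interferogram for three reflectors, sampled uniformly in wavenumber.
n_pixels = 2048
k = np.linspace(2 * np.pi / 900e-9, 2 * np.pi / 780e-9, n_pixels)   # wavenumber samples
depths = np.array([100e-6, 250e-6, 400e-6])        # path mismatch of each reflector (m)
reflectivities = np.array([1.0, 0.5, 0.2])

source = np.exp(-0.5 * ((k - k.mean()) / (0.15 * (k.max() - k.min()))) ** 2)  # Gaussian spectrum
fringes = sum(r * np.cos(2 * k * z) for r, z in zip(reflectivities, depths))
interferogram = source * fringes

# Window to suppress sidelobes, then inverse FFT to recover the depth profile (A-scan).
a_scan = np.abs(np.fft.ifft(interferogram * np.hanning(n_pixels)))
z_axis = np.arange(n_pixels) * np.pi / (k.max() - k.min())           # depth per FFT bin

strongest = z_axis[np.argmax(a_scan[: n_pixels // 2])]
print(f"strongest peak near {strongest * 1e6:.0f} um (expected 100 um)")
# The weaker reflectors appear as smaller peaks near 250 um and 400 um.
```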

As opposed to measuring the interferograms with a spectrometer and line-scan camera, researchers have instead developed laser sources that rapidly sweep through their spectral bandwidth with narrow instantaneous line widths. These OCT systems, called swept-source OCT (SS-OCT), use single photodetectors that are temporally synchronized to the spectral content of the interferogram [74]. By using currently available high speed 8 bit analog to digital converters, the electronic scan speed limitation can be overcome and replaced by the speed at which the source can perform its spectral sweep. As in the case of FD-OCT, SS-OCT systems use a fixed reference arm and perform an inverse Fourier transform to recover the depth scattering potential. To prevent aliasing over longer depth ranges, the interferogram must be sampled with high spectral resolution, requiring a narrow instantaneous line-width for SS-OCT systems. Currently, the fastest reported OCT axial scan rates with axial resolutions near 10 μm have been achieved with SS-OCT implementations; a buffered Fourier domain mode locked laser has achieved an axial scan rate of 370 kHz [75].

SS-OCT systems require dynamic calibration to maintain their time-frequency synchronization, a calibration that is unnecessary in FD-OCT. This calibration has been accomplished by monitoring the fringes produced in a separate Mach-Zehnder interferometer at a fixed path delay during each spectral sweep / axial scan [76]. In vivo images taken with SS-OCT include that of a human retina at a 13 – 13.5 μm axial resolution and a 43.2 kHz axial scan rate [77].
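One way such a calibration can be used, sketched below in Python (NumPy/SciPy; a generic recipe assumed for illustration, not necessarily the exact procedure of ref 76), is to take the unwrapped fringe phase of the fixed-delay calibration channel as a measure of instantaneous wavenumber and to resample the sample interferogram onto a uniform grid in that phase before the inverse FFT.

```python
import numpy as np
from scipy.signal import hilbert

def linearize_sweep(sample_fringe, calibration_fringe):
    """Resample a swept-source interferogram to uniform wavenumber spacing.

    calibration_fringe: detector signal from a fixed-delay Mach-Zehnder channel,
    whose unwrapped fringe phase is proportional to the instantaneous wavenumber.
    Assumes the phase changes monotonically during the sweep.
    """
    analytic = hilbert(calibration_fringe - np.mean(calibration_fringe))
    phase = np.unwrap(np.angle(analytic))
    if phase[-1] < phase[0]:                    # handle sweeps that run high-to-low in k
        phase, sample_fringe = phase[::-1], sample_fringe[::-1]
    uniform_phase = np.linspace(phase[0], phase[-1], phase.size)
    return np.interp(uniform_phase, phase, sample_fringe)

# Example use: a_scan = np.abs(np.fft.ifft(linearize_sweep(raw_fringe, mzi_fringe)))
```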

The trade-offs between FD- and SS-OCT arise from the additional complexities at either the detector or the source, respectively. Although both FD- and SS-OCT systems require the same frequency sampling to avoid aliasing, the use of finite-width pixel elements in FD-OCT array cameras results in a convolution with the desired interferogram [78]. As the frequency of the interferometric fringes increases with longer path delays, this convolution reduces the fringe contrast, degrading the achievable FD-OCT sensitivity. In the FD-OCT system studied in ref. [79], convolution with the pixels was found to degrade the maximum 109 dB sensitivity by approximately 4 dB over a 1.8 mm path delay.

The use of single detectors in SS-OCT systems permits the low-cost implementation of balanced detection to remove common-mode noise sources, such as random laser intensity noise, from the interferogram. Balanced detection has been shown to be superior in terms of the signal-to-noise ratio to unbalanced detection, though the improvements can be marginal in the case of slower scan rates when the receiver noise is low [80].
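The principle is easy to demonstrate numerically; in the toy example below (Python/NumPy, with ideal 50/50 splitting and equal detector responsivities assumed), the two detector outputs carry the interference term with opposite sign while sharing the source intensity noise, so their difference doubles the fringe and cancels the common-mode noise.

```python
import numpy as np

rng = np.random.default_rng(1)
samples = 4096
fringe = 0.05 * np.cos(np.linspace(0, 80 * np.pi, samples))   # interference signal
intensity_noise = 0.2 * rng.standard_normal(samples)          # common-mode source noise

detector_a = 1.0 + intensity_noise + fringe    # one output port of the coupler
detector_b = 1.0 + intensity_noise - fringe    # other port: fringe sign is inverted
balanced = detector_a - detector_b             # = 2 * fringe, intensity noise cancelled

print(round(np.std(detector_a - 1.0 - fringe), 3))   # residual noise on one detector (~0.2)
print(round(np.std(balanced - 2 * fringe), 3))       # ~0 for ideal balancing
```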

Despite the significantly higher scanning rates reported in SS-OCT systems, both SS- and FD-OCT systems are limited by the same electronic analog to digital conversion speeds; the fastest SS-OCT systems are currently only supported by 8 bit analog to digital converters, which severely limit the imaging dynamic range. As electronic processing speeds improve, the scan rates of both SS- and FD-OCT systems will benefit, provided the technology can be incorporated into line scan cameras. However, should analog to digital conversion speeds advance beyond the current sweep rates of buffered Fourier domain mode-locked lasers, it may be challenging for new swept source development to keep pace. Currently, both SS- and FD-OCT systems have been commercialized and are expected to replace TD-OCT in most clinical ophthalmic applications. The cost of each of these systems does not appreciably differ and remains at the high end of the price range of ophthalmic diagnostic instruments.

Swept source laser development has yet to achieve the bandwidth necessary for ultra-high resolution OCT ocular imaging. At the present time, the highest reported axial resolution using a swept source has been <7 μm in the retina, with a center wavelength of 850 nm and a 16 kHz axial scanning rate [81]. By contrast, at the same center wavelength and imaging speed, an FD-OCT system using a Ti:sapphire laser capable of 2.1 μm axial resolution in the retina has been reported [82].

5.2 Axial and Transverse Resolution

In the far field approximation, the temporal point-spread function achieved in FD-OCT imaging is related by the Fourier transform to the power spectrum of the source [83]. Thus, the depth resolution scales inversely with the frequency bandwidth, and jagged spectral peaks will be translated to unwanted temporal sidelobes that reduce the imaging dynamic range. The axial resolution is usually characterized with a Gaussian shaped power spectrum, and is expressed in terms of its center wavelength λ0 and full-width half-maximum (FWHM) spectral bandwidth Δλ. Using the Gaussian time-bandwidth product of 2 ln 2/π and assuming a delta function sample reflection, the axial resolution Δz, defined as the Gaussian FWHM of the point-spread function, is given by

\Delta z = \frac{2 \ln 2}{\pi} \, \frac{\lambda_0^2}{\Delta\lambda} \qquad (1)

The Gaussian power spectrum is typically chosen due to its smooth exponential decay even though, by using a spectrum that more closely resembles a top-hat, it is possible to achieve a finer FWHM axial resolution at the expense of larger side-lobes. However, in virtually all OCT applications, the backscatter intensity varies by several orders of magnitude along the depth of the sample and the presence of side-lobes near a strong interface can partially or completely mask weaker nearby structure. Researchers have reshaped their source spectrum for OCT imaging, both optically [84, 85] and digitally in post-processing [86, 87], to optimize the point-spread function for a given application.
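For orientation, Eq. (1) is easy to evaluate; the short sketch below (Python, with representative rather than system-specific numbers) gives the in-air axial resolution for an SLD-like and a broader Ti:sapphire-class spectrum, and the value in tissue is smaller by the tissue refractive index.

```python
import numpy as np

def axial_resolution(center_wavelength, fwhm_bandwidth):
    """FWHM axial resolution in air from Eq. (1), assuming a Gaussian power spectrum."""
    return (2 * np.log(2) / np.pi) * center_wavelength**2 / fwhm_bandwidth

# Representative example sources (illustrative numbers only).
print(round(axial_resolution(840e-9, 50e-9) * 1e6, 1))    # SLD-like: ~6.2 um in air
print(round(axial_resolution(800e-9, 150e-9) * 1e6, 1))   # broadband Ti:sapphire-like: ~1.9 um in air
# In tissue, divide by the group refractive index (roughly 1.35 - 1.4 for retina).
```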

As opposed to the axial resolution, the transverse resolution of an OCT system is determined by how tightly the incident beam can be focused onto the sample. The diffraction-limited spot size, given in terms of the 1/e² beam radius w0 that can be achieved for a Gaussian beam profile, depends on the focal length of the lens being used, f, the input beam spot size, d, and the beam wavelength λ. This relationship, developed in [88], is

w0 = λf/(πd)  (2)

As the transverse resolution is increased, the Rayleigh range, zR, defined as the distance from the focal point at which the beam radius increases by a factor of √2 (i.e., the spot area doubles), will be reduced according to

zR = πw0²/λ  (3)

In typical OCT applications, scanning is performed over a depth field of view of roughly four times the Rayleigh range [89], which is left intentionally long so that the scattering potential across several millimeters can be measured simultaneously. This also simplifies alignment for many in vivo applications, particularly for endoscopy, in which it may be difficult to finely position the focal point at the region of interest. As Eq. 3 shows, a 2 mm depth field of view in a 1300 nm OCT system results in a diffraction limited spot size of 14.4 μm in air, yielding a 16.9 μm FWHM transverse resolution.
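The worked example above can be reproduced directly from Eqs. 2 and 3. The sketch below starts from the chosen depth field of view (four Rayleigh ranges) and works back to the focused 1/e² spot radius and its intensity FWHM, using the standard Gaussian-beam conversion FWHM = √(2 ln 2)·w0, which is assumed here rather than stated in the text.

```python
import math

wavelength = 1300e-9   # center wavelength (m)
depth_fov  = 2e-3      # desired depth field of view (m), taken as four Rayleigh ranges

z_rayleigh = depth_fov / 4                          # Eq. 3 rearranged: zR = depth FOV / 4
w0 = math.sqrt(z_rayleigh * wavelength / math.pi)   # 1/e^2 beam radius at focus
fwhm = w0 * math.sqrt(2 * math.log(2))              # intensity FWHM of a Gaussian spot

print(f"1/e^2 spot radius: {w0*1e6:.1f} um")               # ~14.4 um, as quoted in the text
print(f"FWHM transverse resolution: {fwhm*1e6:.1f} um")    # ~16.9 um
```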

As the imaging source bandwidth is increased to achieve higher axial resolutions, chromatic dispersion and aberrations degrade the resolution performance predicted from Eqs. 1–3 above. The chromatic dispersion introduced by biological samples must be measured and compensated to prevent a smearing of the axial point-spread function during imaging. Compensation is accomplished by introducing the same amount of dispersion into the reference arm of the OCT interferometer, or through post-processing techniques [90]. On the other hand, the transverse resolution, determined by the focused spot size on the sample, will be limited by aberrations introduced by the focusing lens. As previously discussed, ocular aberrations affecting transverse resolution can be partially compensated using adaptive optics. Generally speaking, as greater source bandwidths become available, a smooth spectral shape and the effective use of all of the bandwidth become more challenging and expensive to achieve, which has thus far largely limited ultra-high resolution OCT systems to laboratory benchtops.

The most common light source used at present in OCT is the superluminescent diode (SLD). These compact, reliable and inexpensive semiconductor sources emit radiation by amplified spontaneous emission (ASE) over a broad bandwidth, recently permitting axial resolutions as fine as 2.2 μm in air with a multiple-quantum well structure [91]. At present, SLD sources are commercially available at center wavelengths ranging from 650 nm to 1610 nm, providing OCT axial resolutions typically in the range of 5 – 15 μm and output powers in the range of several milliwatts (up to 30 mW) for single-mode operation. The greater flexibility in the center wavelength and increased spectral bandwidth of these compact sources achieved over the past few years has made them useful for clinical OCT applications requiring low speckle and moderate to high resolution.

Thermal radiation sources, such as halogen lamps [92], have also been adapted for OCT imaging; although they offer very short coherence lengths, they suffer from poor spatial coherence, necessitating spatial filtering with a pinhole or single-mode optical fiber. After the spatial coherence of the beam is improved in this way, very little power remains to illuminate the sample, resulting in poorer sensitivity for similar scanning rates. However, despite their low useable power, thermal sources have very broad bandwidths, which makes them a cost-effective alternative to ultrafast lasers for ultra-high resolution imaging.

More specialized OCT sources that are later discussed include ASE fiber lasers (both fixed spectrum and wavelength-swept), femtosecond pulsed lasers and sources that use supercontinuum generation to nonlinearly broaden their spectra.

5.3 OCT-Based Ocular Imaging Systems

A wide variety of OCT systems have been developed to exploit different imaging capabilities, or to enhance contrast for a particular application. In ocular imaging, the theoretical OCT axial and transverse resolutions are typically not attained due to the wavefront and longitudinal chromatic aberrations introduced by the cornea and lens. As in other imaging systems, adaptive optics can be implemented with OCT to correct wavefront aberrations, enhancing both the lateral and axial resolutions as well as the imaging contrast [93, 94]. In vivo ultra-high resolution AO-OCT imaging requires fast acquisition to prevent blurring from motion artifacts, as well as wide aberration correction capabilities. As such, researchers have developed standard and parallel FD-OCT systems [95, 96, 97], and have implemented liquid crystal phase modulators instead of deformable mirrors to enhance the wavefront correction performance [98, 99]. The use of AO-OCT has allowed retinal structures, such as small blood vessels and neural components at different depths, to be visualized separately, and has allowed better quantification of photoreceptors over a given retinal area.

Although standard OCT imaging has provided evidence of thinning or thickening of retinal layers, this can be due to either quiescent scar tissue or active new blood vessel growth, requiring radically different treatment. By exploiting the birefringence of the collagenous portion of the new vessel membrane, polarization-sensitive OCT (PS-OCT) can visualize the thickness of a lesion and different components as a portion of the entire layer, clarifying which require treatment [101].

The PS-OCT system described in [101] uses a superluminescent diode (SLD) with a central wavelength of 840 nm and a bandwidth of 50 nm, allowing an axial resolution of 8.3 μm in air and 6.0 μm in tissue. After the polarization is vertically aligned by a linear polarizer, an electro-optic modulator varies the incident polarization. Light returning from the eye is divided into horizontally and vertically polarized components by a polarizing beamsplitter and detected by separate CCD line cameras. The images are formed based on Jones matrix calculations, and the birefringence is normalized at the superficial retina. Although the intensity image indicates a uniform lesion, the phase retardance indicates striking polarization changes within the lesion, as shown in Fig. 21. The PS-OCT technique has also been applied to the retinal nerve fiber layer [94], and recently the two main imaging techniques for glaucoma, en face analysis of retinal birefringence and OCT, have been compared [102].

Figure 21.


2-Dimensional FD-OCT scan of a new vessel membrane in age-related macular degeneration, without (left) and with (right) polarization sensitive analysis using the system described in ref. [101], and provided by Dr. Masahiro Miura. The right-hand scan indicates the location of birefringent collagen within the lesion, seen as pink, separating it from regions of potential new blood vessel growth.

Apart from birefringence, the measurement of blood flow velocities can assist in the early detection of retinal diseases, such as diabetic retinopathy and glaucoma. Noninvasive in vivo measurements resulted from the development of color Doppler OCT (CD-OCT) [103], permitting quantitative in vivo volumetric blood flow in the retina, in addition to standard structural imaging. For the case of Doppler FD-OCT, flow velocities are determined by measuring phase shifts in the interferogram for adjacent, slightly overlapping, axial scans. Using an SLD source with 49 nm bandwidth centered about 841 nm, an axial resolution of 7.5 μm has been obtained in a color Doppler FD-OCT system that statistically estimated the minimum and maximum velocities of retinal flow in the artery (38.35 – 51.13 mm/s) and vein (16.62 – 29.7 mm/s), respectively [104].
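The conversion from the measured phase shift between adjacent axial scans to a flow velocity is not written out above; the sketch below uses the standard Doppler OCT relation v_axial = Δφ·λ0/(4πnT), with the A-scan period, phase shift, tissue index, and Doppler angle all chosen purely for illustration (they are not the parameters of ref. [104]).

```python
import math

def doppler_velocity(delta_phi_rad, center_wavelength_m, ascan_period_s,
                     n_tissue=1.38, doppler_angle_deg=80.0):
    """Flow velocity from the phase shift between adjacent A-scans (standard Doppler OCT relation)."""
    # Axial velocity component; |delta_phi| <= pi sets the maximum unambiguous velocity.
    v_axial = delta_phi_rad * center_wavelength_m / (4 * math.pi * n_tissue * ascan_period_s)
    # Project onto the flow direction using the Doppler angle between beam and vessel.
    return v_axial / math.cos(math.radians(doppler_angle_deg))

# Illustrative (assumed) values: 841 nm source, 20 kHz A-scan rate, pi/2 phase shift,
# beam nearly perpendicular to the vessel (Doppler angle 80 degrees).
v = doppler_velocity(math.pi / 2, 841e-9, 1 / 20e3)
print(f"estimated flow velocity: {v*1e3:.1f} mm/s")
```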

To image with both high transverse and axial resolutions, researchers have combined high numerical aperture optics, typically used in confocal microscopy, with ultrahigh resolution OCT. These hybrid confocal/OCT systems, called optical coherence microscopy (OCM) [105], image over narrow depth fields of view, making them suitable for high resolution en face imaging. Although initial OCM systems employed TD-OCT system designs, the difficulty in aligning the reference arm path delay with the confocal spatial filter has led to the more recent development of FD-OCT and SS-OCT based OCM systems. Transverse and axial resolutions of 0.9 × 2.2 μm have been reported for a 29 kHz FD-OCT implementation applied to ex vivo rat tissue [106], as well as a 1.6 × 8 μm, 42 kHz SS-OCT implementation with a 20 μm confocal gate width, used for ex vivo human colon and rat kidney studies and in vivo Xenopus laevis tadpole imaging [107].

The use of a confocal aperture in OCM systems localizes the light returning from the sample and permits a spectral absorption analysis of the backscatter. This method, referred to as spectroscopic OCT [108], has been implemented using both TD- and FD-OCT systems [109, 110] but has thus far not been applied to in vivo ocular imaging.

The localization of the light within the sample also permits en face imaging at a single depth position without scanning, called full-field OCT [111]. With the use of microscope objectives and thermal lamp sources, ultra-high axial and transverse resolutions have been obtained for ex vivo [112] and in vivo [113] imaging of a rat eye. The use of thermal sources requires spatial filtering to reduce speckle noise prior to sample illumination, which reduces the incident power to several milliwatts. The low incident power, combined with scattering crosstalk from nearby transverse positions, limits the sensitivity of full-field OCT systems; a sensitivity of 71 dB was reported in ref. [113]. However, the resolutions achieved, 0.9 × 0.7 μm and 1.1 × 1.5 μm (transverse by axial) for refs. [112] and [113], respectively, are obtained with sources at a fraction of the cost of the ultrafast lasers used for similar applications. Thus far, the primary drawback to full-field in vivo imaging has been the relatively long acquisition times, which make the system susceptible to motion artifacts.

With a view toward increasing acquisition speed, OCT systems have been developed in which the depth information across a transverse line is acquired at each scanning step [114, 115]. These systems, referred to as line-field OCT, or parallel FD-OCT, illuminate the sample with a line instead of a point. One dimension of a pixel-array detector is reserved for the spectral, depth-encoded interferogram, while the other is mapped with the transverse line backscattered from the sample. In vivo imaging of the retina has been demonstrated using a line-field FD-OCT system [116] with an effective axial scan rate of 51,456 Hz for 3-dimensional imaging. However, due to the incoherent contribution of scattered light along the length of the illuminating line, the sensitivity and dynamic range degrade by approximately 30 dB at the edges of the transverse scan. This loss of coherent backscattered light severely limits the imaging contrast and line-field OCT's present potential as a clinical replacement for flying-spot scanning FD-OCT and SS-OCT systems.

6. ULTRAFAST LASERS IN IMAGING

With the recent availability of pulsed femtosecond (1 fs = 10−15 sec) laser sources, researchers have exploited nonlinear optical effects as a tool to enhance contrast for ultrafast spectroscopy and imaging applications. Unlike continuous wave lasers, the energy of pulsed sources is concentrated within a narrow temporal window, which results in high pulse intensities and permits access to a range of nonlinear effects, including sum and difference frequency generation in nonlinear media [117], self-phase modulation in tapered and photonic crystal fibers [118], and higher-order harmonic generation [119]. A further notable difference from continuous wave sources is that, due to the Fourier transform relationship between the spectral and temporal domains, sources that emit narrow pulse envelopes have, by necessity, broad power spectral bandwidths. When applied to biological imaging, a degree of uncertainty remains in the incident power thresholds prior to tissue damage, since the dominant damage mechanism can vary depending on the temporal power distribution of the pulses. The current ANSI maximum permissible exposures for ocular safety require three calculations to account for damage due to a single bright pulse, damage built up over many consecutive pulses, and damage from the average illumination power [15].
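To make this time-bandwidth reciprocity concrete, the sketch below estimates the minimum spectral width of a transform-limited Gaussian pulse from the ≈0.441 time-bandwidth product; the 800 nm center wavelength and the pulse durations are illustrative values assumed here, not taken from a specific system.

```python
import math

def transform_limited_bandwidth_nm(pulse_fwhm_s, center_wavelength_m):
    """Minimum FWHM spectral width (nm) carried by a transform-limited Gaussian pulse."""
    c = 299_792_458.0                  # speed of light (m/s)
    delta_nu = 0.441 / pulse_fwhm_s    # Gaussian time-bandwidth product ~0.441
    delta_lambda = center_wavelength_m**2 * delta_nu / c
    return delta_lambda * 1e9

print(transform_limited_bandwidth_nm(100e-15, 800e-9))  # ~9.4 nm for a 100 fs pulse
print(transform_limited_bandwidth_nm(10e-15, 800e-9))   # ~94 nm for a 10 fs pulse
```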

Ultrafast sources that cover a wide range of pulse durations, from under ten to hundreds of femtoseconds, are commercially available using the titanium-doped sapphire (Ti:sapphire) oscillator. For this oscillator, short pulses are generated using passive Kerr lens mode-locking; refer to Keller [120] for a review of the various mode-locking mechanisms used in recent ultrafast sources. The Ti:sapphire gain medium permits tunable operation over several hundred nanometers in the near infrared, with a maximum gain achieved at a center wavelength near 800 nm [121]. Due to their high average output powers of several watts, good beam quality and robustness, pulsed Ti:sapphire lasers have become a popular choice for ultrafast experiments. Their relatively smooth and wide spectral bandwidth alone has made them suitable for ultra-high resolution optical coherence tomography [122, 123], which has been applied in vivo to analyze the intraretinal structure of patients with macular disease [124]. To obtain similarly high transverse resolutions (5 – 10 μm), adaptive optics have been added to these OCT systems, which, after compensating for ocular aberrations, have shown up to a 9 dB signal to noise ratio improvement of intraretinal layers [125].

The use of an ultrafast source in an imaging system requires careful attention to and management of dispersion. Chromatic dispersion, arising from a frequency dependent index of refraction, will cause some frequencies within a pulse to travel more slowly than others, resulting in an overall pulse broadening. This temporal-frequency dependence across the pulse is commonly referred to as chirp, and can be altered when the pulses pass through glass or biological media.

When a medium's second derivative of the refractive index with respect to frequency is greater than zero, the dispersion introduced is said to be normal. Most media exhibit normal dispersion in the visible region of the spectrum. Conversely, when the second derivative of the refractive index is negative, the dispersion is said to be anomalous, with the cross-over point called the zero dispersion wavelength. It is important to identify the type and amount of dispersion present in an imaging system since it can impact the accuracy of physical path length measurements, including depth resolution in OCT.
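The magnitude of this broadening can be estimated for a transform-limited Gaussian pulse from the standard relation τ_out = τ_in·[1 + (4 ln 2·β2L/τ_in²)²]^1/2, sketched below. The group-velocity dispersion of roughly 36 fs²/mm for fused silica near 800 nm, as well as the pulse duration and path lengths, are approximate figures assumed only for illustration.

```python
import math

def broadened_fwhm_fs(input_fwhm_fs, gvd_fs2):
    """FWHM of a transform-limited Gaussian pulse after accumulating gvd_fs2 = beta2 * L (fs^2)."""
    stretch = 4 * math.log(2) * gvd_fs2 / input_fwhm_fs**2
    return input_fwhm_fs * math.sqrt(1 + stretch**2)

beta2_silica = 36.0   # fs^2/mm near 800 nm (approximate literature value, assumed here)
for length_mm in (1, 10, 50):
    out = broadened_fwhm_fs(20, beta2_silica * length_mm)
    print(f"{length_mm:3d} mm of fused silica: 20 fs -> {out:.0f} fs")
```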

To compensate for normal dispersion, and compress a pulse back to its minimum transform-limited duration, several methods may be used: custom-designed chirped mirrors; prism pairs; grating pairs; or dispersion-shifted fibers [126]. Chirped mirrors are fabricated with overlaid dielectric coatings, such that the design wavelengths are reflected at different layers and experience different path lengths, providing a very specific but inflexible amount of dispersion compensation. Both prism and grating pairs spatially separate the spectral content of the pulse, and positive or negative dispersion can be introduced by varying the separation distance between the prisms [127] or gratings [126, 128].

Despite the wide-spread turn-key use and stability of Ti:sapphire lasers, their wavelength range has been shown not to be optimal for imaging below the retinal pigment epithelium layer. A comparative ultra-high resolution OCT study of retinal layers in excised pig retinas, performed at center wavelengths of 790, 1050, and 1350 nm, demonstrated that imaging penetration depth increases with wavelength, and the ANSI standards also permit higher illumination powers at longer wavelengths [129]. Based on the trade-off between water absorption, eye length and permissible illumination power, 1050 nm was found to be the preferred center wavelength for in vivo human retinal imaging. Sources at 1350 nm, which are compatible with common fiber optic components, are preferred for either human anterior segment imaging or animal retinal studies where the eye length is shorter. Since the quantum efficiency of silicon-based detectors drops off for wavelengths higher than 1000 nm, the detector technology is usually switched to indium gallium arsenide for imaging systems operating in the 1000 – 1600 nm range.

When imaging is performed within the infrared telecommunication wavelength bands, previously well-developed fiber-optic components can be directly implemented into the system design, creating a more compact and robust product. The O-band, covering a wavelength range of 1260 – 1360 nm, has been attractive for the telecommunication industry due to silica fiber's natural zero dispersion point around 1300 nm, which results in lower chromatic dispersion and pulse broadening. The C-band, covering 1530 – 1565 nm, has also been widely used due to low signal attenuation in silica, as well as the availability of erbium-doped fiber amplifiers, which have been critical in the development of long-haul fiber-optic communication systems.

Longer wavelength ultrafast imaging sources have been realized by shifting the spectral operating range of the Ti:sapphire laser using nonlinear optics within an optical parametric oscillator (OPO) [130] or in a second harmonic generating crystal. Within an OPO, parametric difference frequency generation downconverts the pump pulses to a tunable spectrum centered in the infrared. When this spectrum subsequently undergoes second harmonic generation, the bandwidth is shifted from the infrared back to the visible. Depending on the nonlinear crystal in use, wavelength tuning can be performed by tilting the crystal, altering its temperature, or adjusting the temporal overlap of the pump and signal pulses with the cavity length of the OPO. Commercially available OPOs are now capable of cavity-length tuning over 1100 – 1600 nm with several hundred milliwatts of output power. The advantage of using an OPO is that the emitted pulses are synchronized to the pump, which provides temporal stability between pulses for nonlinear experiments over a wide range of wavelength separations. To date, the high cost and lack of clinical nonlinear imaging applications restrict the use of OPOs to laboratory research.

Tunable ultrafast sources for direct emission in the O- and C-bands include the chromium-doped forsterite (Cr:Forsterite) and chromium-doped yttrium aluminium garnet (Cr:YAG) femtosecond lasers, respectively. Due to the extended penetration depths in tissue with increasing wavelength [131, 132], these lasers have served as broad bandwidth sources for optical coherence tomography [133], though for this application they have largely been replaced by relatively inexpensive continuous wave superluminescent diodes [134, 135]. Continuous wave Cr4+:Forsterite lasers have also been employed as early wavelength swept sources for OCT, using prisms to disperse the spectrum and allow only a narrow bandwidth to experience intra-cavity gain at any given point in time [136].

Concurrent research in alternative pulsed broad bandwidth sources for imaging has included the development of erbium-doped fiber ring cavity lasers based on amplified spontaneous emission [137]. Despite achieving broad bandwidths, fiber lasers have thus far been unable to provide the same femtosecond-scale pulse durations as free-space systems, due to fiber nonlinearities as well as the difficulty in compensating higher order cavity dispersion. However, by rapidly changing which spectral components exhibit intra-cavity gain with a tunable Fabry-Pérot interferometer, fiber ring lasers using semiconductor optical amplifiers have been developed as wavelength swept sources for SS-OCT systems [75]. Although fiber lasers have the advantages of compactness and low operating cost, their stability remains affected by thermal and mechanical fiber stresses.

With a view toward further increasing the depth resolution, researchers have extended their source bandwidth using supercontinuum generation in tapered optical fibers and photonic crystal fibers (PCFs) [118]. Tapered fibers are produced by heating and gently stretching a fiber until its radius is reduced to several microns, effectively eliminating the core-cladding structure. PCFs also have very narrow cores, but these are surrounded by a custom array of air gaps acting as a low-index cladding, which can enhance control of the broadened spectral shape and reduce fragility, at an increased manufacturing cost.

When pulses of light are focused into and propagate along the very narrow tapered or PCF regions, their high intensities can give rise to several nonlinear effects, depending on the incident wavelength, power and pulse duration, as well as the fiber's geometry and group velocity dispersion. When the pulse wavelengths are close to the fiber's zero-dispersion point, self-phase modulation occurs, and new frequencies are generated by a nonlinear phase shift [118]. Spectral broadening beyond two octaves has been reported when operating near the zero-dispersion point [138], though fine spectral structure and poor spectral flatness have been problematic due to the generation of Raman solitons in the anomalous dispersion region [139]. Hybrid highly nonlinear fibers have since been developed to provide flat spectral broadening by separating and controlling the nonlinear effects in turn [139], or by shifting the zero-dispersion point to operate exclusively in a normally dispersive region [140]. As OCT sources, the flatter supercontinuum spectra obtained with highly nonlinear fibers have permitted high depth resolutions (< 3 μm in air), combined with a 40 dB dynamic range and 103 dB sensitivity [140]. With a PCF broadened spectrum centered at 830 nm, and spectral shaping applied in post-processing, an FD-OCT system capable of 4 μm axial resolution in air has been used to demonstrate in vivo imaging of the human cornea and retina [141].

Recent PCFs have been designed to include two zero-dispersion points, allowing simultaneous broadening at two different center wavelengths with a single pump laser. From the point of view of tissue scattering, dual-band OCT sources can extend the versatility of the imaging system, providing simultaneous dual-band imaging for contrast enhancement and spectroscopic measurement capabilities [142]. In addition, when speckle noise between two separate illumination bands is uncorrelated, it can be reduced by frequency compounding methods in post-processing [143].

Just as nonlinear optical effects can help broaden the imaging spectrum to achieve higher resolutions, they can also distort it, giving rise to unwanted sidelobes in the point-spread function. To avoid inadvertently shaping the final spectrum in the fiber optic components used in the imaging system, the pulses are often temporally smeared through a length of dispersion-shifted fiber to reduce their peak intensity. Dispersion-shifted fiber is typically chosen in order to limit the nonlinear effects to self-phase modulation; when the imaging spectrum is near the zero dispersion wavelength, four-wave mixing and stimulated Raman scattering can also occur, producing peaks at the edges of the imaging spectrum that translate to sidelobes in the point-spread function.

Apart from the access to their broad spectral properties, the advantages of using ultrafast pulses for imaging can be realized in a variety of ways. For example, the temporal interaction window formed by the overlap of ultrafast pulses in a nonlinear medium outside the sample can produce a time gate for time-resolved imaging [144]. In this case, the ballistic light that has transmitted through, or been reflected from, a sample is temporally filtered so that light only from a specific window in time is detected, while multiply scattered snake and diffuse light is rejected. Time-resolved imaging has been implemented with incoherent time gates formed using second-harmonic generation [145], parametric amplification [146], stimulated Raman scattering [147] and the optical Kerr effect [148]. These techniques have thus far aimed to achieve the shortest time gate possible to maximize the depth resolution, to date achieving 10 μm with 70 fs pulses in a time gate provided by sum-frequency generation [149]. Since the imaging depth resolution is directly related to the shape of the pulses, dispersion from the sample can affect performance unless there is a priori compensation. These imaging approaches have been primarily applied to detect the variation in tissue attenuation at tumor sites in breast tissue [150] and have not been applied to ocular imaging, where the superior resolution, contrast and speed favor OCT techniques for reflection-based imaging.
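As a rough consistency check on the figure quoted above, if the depth resolution of such a gate is taken to be set simply by the gate duration through Δz ≈ cτ/2 (a simplifying assumption made here, ignoring pulse shape and dispersion), a 70 fs gate corresponds to roughly 10 μm:

```python
c = 299_792_458.0    # speed of light in vacuum (m/s)
tau = 70e-15         # gate duration (s)
dz = c * tau / 2     # round-trip time gate mapped to single-pass depth
print(f"time-gate depth resolution: {dz*1e6:.1f} um")   # ~10.5 um, consistent with ref. [149]
```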

A system that combines both time gating and FD-OCT has been recently reported [151]. In this case, sum-frequency generation is used as a wide time gate to restrict the backscattered depth field of view to several hundred micrometers, while OCT provides high sensitivity imaging of the structure within the gate to sub-10 μm resolution. By selectively removing strong scattering sites from a depth scan prior to detection, the contrast of the imaging sites of interest can be enhanced over a standard FD-OCT system, while the volume of collected imaging data is minimized to the depth region of interest. As compared to en face imaging systems that use a depth field of view equal to the system's axial resolution, the wider time gate enables better structural localization. The reported proof-of-concept system uses a sample illumination power of 40 mW, which makes it currently unsuitable for ocular imaging. However, since the nonlinear interaction occurs outside the sample, further optimization of the time gating process is expected to permit lower powers in future implementations.

Ultrafast imaging has also exploited nonlinear two- and three-photon interactions that occur within a sample, such as fluorescence and second and third harmonic generation [152, 153]. In these cases, the high intensities from pulsed laser sources serve to generate a significant amount of localized frequency shifted light that can be subsequently filtered and detected to enhance imaging contrast. The wavelength-tuning capability of the Ti:Sapphire laser provides high intensity pulses that can be tuned to the molecular resonances of interest and it has been used in commercially available fluorescent confocal microscopes for rod photoreceptor studies [154]. Autofluorescence imaging with a Ti:Sapphire laser has also been investigated in donor human retinal pigment epithelial cells for the purpose of the early diagnosis of age-related macular degeneration [155].

The primary drawback to multi-photon techniques is the low nonlinear conversion efficiency, which decreases as higher order harmonics are generated and create light-starved situations. To maximize the nonlinear power generated, the illumination pulses should be optimized for the molecular interaction taking place, with as high a peak intensity as possible. As previously mentioned, chromatic dispersion, introduced by lenses or biological media, will spread the temporal duration of the pulse and lower the nonlinearly generated light.

Current photodetectors cannot be used to monitor and characterize the temporal shape of femtosecond pulses because their acquisition speeds are insufficient. An early ultrafast pulse characterization device, which uses the pulse to measure itself, is the second-harmonic autocorrelator. The autocorrelator sends an input beam into an interferometer and controls the temporal overlap of the recombined pulses. These pulses are recombined in a nonlinear crystal set for second harmonic generation so that the frequency-doubled light is detected over a range of relative pulse delays. Due to the direct dependence between the incident and second harmonic generated intensities, an autocorrelation trace of the unknown pulse is produced, from which the duration can be inferred for a given pulse shape. Frequency-resolved optical gating (FROG) techniques [156] perform the additional step of measuring the spectral content of the nonlinearly generated light at varied relative incident pulse delays. In this case, a separate well-characterized reference pulse can be used to produce a cross-correlation trace, and several FROG implementations using nonlinear interactions other than second-harmonic generation have been reported. Iterative post-processing techniques are applied to the detected time-frequency trace, resulting in a near-complete phase and amplitude characterization of the unknown pulse [157].
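For the simplest case of an ideal Gaussian pulse, the relationship between the measured autocorrelation width and the underlying pulse duration can be illustrated numerically. The sketch below builds the intensity autocorrelation on a discrete time grid and recovers the familiar √2 deconvolution factor; the 100 fs pulse duration and the grid are arbitrary choices for this illustration.

```python
import numpy as np

# Ideal 100 fs (FWHM) Gaussian pulse intensity sampled on a fine time grid.
t = np.linspace(-1e-12, 1e-12, 4001)                  # time axis (s), 0.5 fs steps
fwhm = 100e-15
intensity = np.exp(-4 * np.log(2) * (t / fwhm) ** 2)

# Intensity autocorrelation A(tau) = integral I(t) I(t - tau) dt, evaluated on the same lag grid.
autocorr = np.correlate(intensity, intensity, mode="same")
autocorr = autocorr / autocorr.max()

def fwhm_of(x, y):
    """Full width at half maximum of a unimodal, symmetric trace."""
    above = x[y >= 0.5]
    return above[-1] - above[0]

ratio = fwhm_of(t, autocorr) / fwhm_of(t, intensity)
print(f"autocorrelation FWHM / pulse FWHM = {ratio:.3f}")   # ~1.414 = sqrt(2) for a Gaussian
```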

Although the potential of second-harmonic signals for enhancing imaging contrast in highly scattering media was anticipated and studied with Q-switched YAG lasers long before ultrafast lasers were widely available [158], recent advances in temporal control and calibration have led to striking images of the collagenous layers in human donor tissue samples of the cornea [159] and optic nerve head [160]. The optic nerve head samples are illuminated by a Ti:sapphire oscillator, pumped by a frequency-doubled neodymium-doped yttrium vanadate (Nd:YVO4) solid-state laser, in an application-specific microscope. The mode-locked pulses (reported as 150 – 170 fs at a 76 – 90 MHz repetition rate) are raster scanned across the target and set to deliver a relatively high average power for ophthalmology (48 mW) at the focal plane of the microscope objective. A comparison of two-photon excited fluorescence and second harmonic generation imaging of the same corneal samples has been shown to characterize the cells in the collagen rich layers of the cornea [161]. Due to the incident power limitations and the low numerical aperture optics used to focus light onto the retina, it is challenging to obtain sufficient second harmonic and fluorescent photons for in vivo retinal imaging, which has prompted the use of contrast agents when possible. However, noninvasive two-photon imaging of the periphery of the retina has been reported by focusing light through the sclera in the study of retinyl ester storage particle dynamics in wild-type mice [162]. On the other hand, multi-photon imaging of the sclera and cornea can be performed with higher numerical aperture optics, permitting greater illumination intensities and more second harmonic or fluorescent light.

The short but intense illumination achieved with pulsed sources has permitted a look into ultrafast dynamics of numerous biological and chemical processes [163]. Popular pump-probe experiments involve the use of a pump pulse and a suitably delayed probing pulse to excite a system and monitor its behavior before it returns to its steady state. Pulsed sources have been used in functional fluorescence excitation microscopy. Studies of calcium dynamics in retinal ganglion and amacrine cells have been performed using a Ti:Sapphire laser with 200 fs pulses centered at 930 nm, which permitted neuron measurements without the saturation of photoreceptors [164].

Coherent anti-Stokes Raman spectroscopy (CARS) is a well-known pump-probe technique that is sensitive to the molecular vibrations of a sample [165]. This process uses pulses at two separate frequencies to illuminate the sample. Light is nonlinearly generated within the sample at a frequency of 2ω1 – ω2 due to a third order interaction called four-wave mixing. The illumination frequencies are chosen so that the generated light at the anti-Stokes frequency is resonant with the molecular vibrations of the sample. This resonance enhances the power and directionality of the generated light, which is subsequently detected. A drawback to this technique is that non-resonant fluorescence can partially or completely mask the desired signal. To suppress this background, polarization-sensitive detection, time gating, and phase control of the excitation pulses can be employed. To the authors' knowledge, CARS has not been applied to ocular imaging, primarily due to the high incident powers required to achieve a detectable return signal.
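The anti-Stokes frequency follows directly from the 2ω1 − ω2 relation above. The sketch below evaluates the corresponding wavelengths when the pump-Stokes difference is tuned to a lipid CH2 stretch vibration near 2845 cm−1; the 817 nm pump wavelength is an illustrative choice, not a value taken from the ocular imaging literature.

```python
def cars_wavelengths_nm(pump_nm, raman_shift_cm1):
    """Stokes and anti-Stokes wavelengths for CARS (2*w_pump - w_stokes) given a Raman shift."""
    pump_cm1 = 1e7 / pump_nm                      # pump frequency in wavenumbers
    stokes_cm1 = pump_cm1 - raman_shift_cm1       # Stokes beam tuned to the vibration
    anti_stokes_cm1 = 2 * pump_cm1 - stokes_cm1   # four-wave mixing output frequency
    return 1e7 / stokes_cm1, 1e7 / anti_stokes_cm1

stokes, anti_stokes = cars_wavelengths_nm(817.0, 2845.0)
print(f"Stokes: {stokes:.0f} nm, anti-Stokes: {anti_stokes:.0f} nm")   # ~1064 nm and ~663 nm
```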

Recently, the detection of second harmonic signals generated in nonlinear tissue samples has been performed interferometrically using optical coherence tomography [166, 167]. In this case, the light that has been upconverted in the sample is interfered with frequency-doubled light in the reference arm of the interferometer. With the appropriate use of dichroic mirrors and separate Michelson interferometers, OCT imaging with both the second harmonic generated spectrum and with the unconverted bandwidth has been demonstrated. However, due to the weak second harmonic light levels returning from the sample, even with high powers (50 mW) incident on the sample, the image acquisition time for the design in ref. [167] is considerably longer (5 Hz axial scan rates) than for typical OCT systems, limiting its potential for in vivo ocular imaging applications.

7. OVERVIEW

The use of lasers in clinical ocular imaging applications requires careful consideration of the safe exposure limits at a given wavelength as set forth by the ANSI standards. Additional limitations arise from the optics and geometry of the eye, which limit the resolution and field of view that can be obtained in retinal imaging; these constraints have been partially overcome with the use of adaptive optics to correct wavefront aberrations, and the dilation of the patient's pupil, respectively.

The use of fluorescence has been a traditional method of enhancing the imaging contrast of a particular region of interest in the retina. A great amount of versatility can be achieved for this type of imaging when the illumination source wavelength has the capability to be tuned to the resonance of the fluorophore. The use of fluorescein angiograms permits the monitoring of a chemical dye as it passes through the blood stream, which can be particularly important when searching for vessel leakage due to age-related macular degeneration. Contrast enhancement can also be achieved by exploiting the natural birefringence of foveal photoreceptors, ganglion cell axons, and collagen with polarization-sensitive imaging.

Confocal imaging systems permit the exclusion of unwanted light, be it light from outside the focal volume or direct backscatter, from reaching the detector by the appropriate geometry and placement of apertures. The confocal effect has been effectively combined with other imaging systems and sources, such as point-scanning laser ophthalmoscopes, to provide en face imaging in vivo and VCSEL sources to reduce scanning complexity and overall cost.

The recent emergence of OCT has had a particularly strong impact on the imaging of the eye, permitting in vivo views of the deeper layers of the retina. Numerous broad bandwidth sources have been developed for OCT imaging, with system trade-offs such as cost, center wavelength, bandwidth, spectral uniformity, power and permissible acquisition speed. The primary OCT systems developed for functional ocular imaging include polarization-sensitive, second-harmonic, confocal hybrid, and Doppler variants, each with application-specific advantages. It is expected that the imaging capabilities and commercialization of these imaging systems will continue to advance in the years to come.

The use of ultrafast lasers in imaging provides access to a variety of imaging techniques that are not available with conventional continuous wave sources. Their primary advantages are a result of their high pulse intensities, which permit access to nonlinear effects that occur either prior to imaging (spectral broadening with supercontinuum generation), inside the sample (fluorescence, second-harmonic generation), or for optical processing prior to detection (time gating). However, nonlinear interactions within a biological sample require a power density that often exceeds safety limits for in vivo studies. Despite the access to broad bandwidths provided with ultrafast lasers, they remain expensive and complex with respect to continuous wave light sources, which has limited their use to ultra-high resolution imaging and prevented widespread implementation in clinical environments.

Future development for clinical ocular applications largely depends on the imaging technique in use. The availability of semiconductor continuous wave laser sources at more wavelengths, particularly at shorter wavelengths, will add to the range of samples that can be easily imaged using fluorescence. For most in vivo work, the current emitted power levels are more than sufficient, and are not as important as achieving stable Gaussian beam shapes, reliable turn-key operation and a low cost.

The commercialization of OCT systems will continue to drive the development of lower cost broadband sources. Superluminescent diodes are currently favored in many standard imaging systems, though more specialized applications will become accessible to clinicians as the cost of required sources and their operation decreases.

Apart from cost, nonlinear imaging systems will also require improvements in detector technology prior to commercialization. Since safety limits restrict the incident power, higher detector efficiencies are required to improve the signal to noise ratio in light-starved situations and to decrease the exposure duration. Particularly for three-dimensional OCT imaging, the development of faster analog to digital electronics is critical to transfer large quantities of data for real-time imaging.

Figure 3.


En face, flood illuminated photograph of a patient's eye with a common retinal disease, age-related macular degeneration.

Figure 7.


Diagram of a confocal imaging system with laser scanning. The light from out of focus planes, as well as light that is scattered laterally over long distances, is restricted from reaching the detector by the use of the aperture. Light within the desired small focal volume, shown in green, passes through the aperture and reaches the detector.

Figure 9.


Diagram of an imaging system with laser scanning that captures multiply scattered light in preference to directly backscattered light. The light from out of focus planes, or that is scattered laterally over long distances, passes through the aperture and reaches the detector. Directly backscattered light from within the small focal volume that is on axis with the illumination is blocked by the aperture and not detected.

Figure 20.


A pair of FD-OCT 2-dimensional images of the human retina taken without (left) and with (right) adaptive optics using the system described in ref. [100]. The left image used a 1.2 mm pupil and shows a relatively large lateral speckle size, while the right image used a 6.0 mm pupil size. For both, a 10 mW superluminescent diode was used, centered at 840 nm with a FWHM bandwidth of 50 nm. The adaptive optics system corrected ocular aberrations up to a wavefront RMS of approximately 50 nm and a Strehl ratio of 0.8 (diffraction limited). The smaller speckle size, as well as a higher resolution, allows visualization of finer features. Image provided by Drs. Barry Cense and Donald Miller, with the method described in [100].

Figure 22.


Nonlinear optical interaction between pump and idler pulses producing a signal pulse at ω3 = ω1 + ω2 for sum frequency generation or at ω3 = ω1 − ω2 for difference frequency generation.

Figure 23.


Dispersion compensation configurations using a chirped mirror (A), a prism pair (B), and a grating pair (C).

Figure 24.


Schematic of a tapered optical fiber. Nonlinear spectral broadening occurs in the waist region.

Figure 25.


Cross-section of a photonic crystal fiber. Air gaps surrounding the core permit a similar spectral broadening effect as in tapered fibers.

Figure 26.


An imaging pulse is transmitted through a sample and is temporally composed of light from different scattering paths. A narrow time-gating pulse then interacts with the imaging pulse in an external nonlinear medium and restricts the light to a narrow temporal window prior to detection.

Figure 27.


Schematic of a second-harmonic FD-OCT system. Second-harmonic generated (SHG) light from the sample (blue) is interfered with SHG light from a nonlinear crystal. Standard FD-OCT is performed with the unconverted light (red).

Acknowledgments

Supported by NIH NEI EY007624 and NIBIB EB002346. We thank Drs. Stephen Burns, Barry Cense, Donald Miller, and David Williams for images from their research.

Footnotes

PACS: 42.55.-f, 42.55.Px, 42.30.-d, 87.63.L, 42.66.Ct, 87.64.Cc

References

  • [1].Dowling JE. The Retina, an Approachable Part of the Brain. Harvard University Press; Cambridge: 1987. [Google Scholar]
  • [2].Elsner AE. In: Handbook of Adaptive Optics. Porter J, Queener H, Lin J, Thorn K, Awwal A, editors. Wiley; New York: 2006. pp. 205–234. [Google Scholar]
  • [3].Burns SA, Wu S, Delori FC, Elsner AE. J. Opt. Soc. Am. A. 1995;12:2329. doi: 10.1364/josaa.12.002329. [DOI] [PubMed] [Google Scholar]
  • [4].Elsner AE, Burns SA, Beausencourt E, Weiter JJ. Invest. Ophthalmol. Vis. Sci. 1998;39:2394. [PubMed] [Google Scholar]
  • [5].Delori FC, Staurenghi G, Arend O, Dorey CK, Goger DG, Weiter JJ. Invest. Ophthalmol. Vis. Sci. 1995;36:2327. [PubMed] [Google Scholar]
  • [6].Denninghoff KR, Chipman RA, Hillman LW. J. Biomed. Opt. 2007;12:034020. doi: 10.1117/1.2745312. [DOI] [PubMed] [Google Scholar]
  • [7].Bartsch D-U, Elmusharaf A, El-Bradey M, Freeman WR. Ophthalmic Surg. Lasers Imaging. 2003;34:17. [PubMed] [Google Scholar]
  • [8].Elsner AE, Jalkh AH, Weiter JJ. In: Practical Atlas of Retinal Disease and Therapy. Freeman W, editor. Raven; New York: 1993. pp. 19–35. [Google Scholar]
  • [9].Elsner AE, Burns SA, Weiter JJ, Hartnett ME. LEOS '94, (IEEE Catalog number 94CH3371–2, Library of Congress number 93–61268, 1, 1994) pp. 14–15. [Google Scholar]
  • [10].Campbell CJ, Noyori KS, Rittler MC, Koester CJ. Acta Ophthalmol. Suppl. 1963;76:22. doi: 10.1111/j.1755-3768.1963.tb05161.x. [DOI] [PubMed] [Google Scholar]
  • [11].Kapany NS, Peppers NA, Zweng HC, Flocks M. Nature. 1963;199:146. doi: 10.1038/199146a0. [DOI] [PubMed] [Google Scholar]
  • [12].Krauskopf J, Williams DR, Heeley DW. Vision Res. 1981;21:951. doi: 10.1016/0042-6989(81)90198-x. [DOI] [PubMed] [Google Scholar]
  • [13].Burns SA, Kreitz M, Elsner AE. Applied Optics. 1991;30:2063. doi: 10.1364/AO.30.002063. [DOI] [PubMed] [Google Scholar]
  • [14].American National Standards Institute ISO 15004-2. 2007 [Google Scholar]
  • [15].Delori FC, Webb RH, Sliney DH. J. Opt. Soc. Am. A Opt. Image Sci. Vis. 2007;24:1250. doi: 10.1364/josaa.24.001250. [DOI] [PubMed] [Google Scholar]
  • [16].Webb RH, Hughes GW. IEEE Trans Biomed Eng. 1981;28:488. doi: 10.1109/TBME.1981.324734. [DOI] [PubMed] [Google Scholar]
  • [17].Auran JD, Koester CJ, Kleiman NJ, Rapaport R, Bomann JS, Wirotsko BM, Florakis GJ, Koniarek JP. Ophthalmology. 1995;102:33. doi: 10.1016/s0161-6420(95)31057-3. [DOI] [PubMed] [Google Scholar]
  • [18].Hammer DX, Ferguson D, Magill JC, White MA, Elsner AE, Webb RH. Applied Optics. 2003;42:4621. doi: 10.1364/ao.42.004621. [DOI] [PubMed] [Google Scholar]
  • [19].Elsner AE, Stewart JB, Schwarz R. OSA Annual Meeting; Orlando, FL. September, 2002. [Google Scholar]
  • [20].Smithwick QY, Elsner AE, Stewart JB, Schwarz RA, Cheney MC. OSA Annual Meeting; Frontiers in Optics, Tucson, AZ. October, 2003.p. 48. [Google Scholar]
  • [21].Zhao Y, Elsner AE, Haggerty BP, VanNasdale DA, Petrig BL. OSA Annual Meeting; Frontiers in Optics, Rochester, NY. 2006. [Google Scholar]
  • [22].van Norren D, van de Kraats J. Vision Res. 1989;29:1825. doi: 10.1016/0042-6989(89)90163-6. [DOI] [PubMed] [Google Scholar]
  • [23].Elsner AE, Burns SA, Hughes GW, Webb RH. Applied Optics. 1992;31:3697. doi: 10.1364/AO.31.003697. [DOI] [PubMed] [Google Scholar]
  • [24].Jorzik JJ, Bindewald A, Dithmar S, Holz FG. Retina. 2005;25:405. doi: 10.1097/00006982-200506000-00003. [DOI] [PubMed] [Google Scholar]
  • [25].Gabriele ML, Wollstein G, Bilonick RA, Burgansky-Eliash Z, Ishikawa H, Kagemann LE, Schuman JS. Ophthalmology. doi: 10.1016/j.ophtha.2007.05.045. in press. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • [26].Elsner AE, Burns SA, Weiter JJ, Delori FC. Vision Res. 1996;36:191. doi: 10.1016/0042-6989(95)00100-e. [DOI] [PubMed] [Google Scholar]
  • [27].Marcos S, Burns SA. J. Opt. Soc. Am. A. 1999;16:995. doi: 10.1364/josaa.16.000995. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • [28].Hochheimer BF, Kues HA. Applied Optics. 1982;21:3811. doi: 10.1364/AO.21.003811. [DOI] [PubMed] [Google Scholar]
  • [29].Elsner AE, Burns SA, Delori FC, Webb RH. Quantitative Reflectometry with the SLO. In: Nasemann JE, Burk ROW, editors. Laser Scanning Ophthalmoscopy and Tomography. Quintessenz-Verlag; Berlin: 1990. pp. 109–121. [Google Scholar]
  • [30].Dreher AW, Reiter K, Weinreb R. Applied Optics. 1992;31:3730. doi: 10.1364/AO.31.003730. [DOI] [PubMed] [Google Scholar]
  • [31].Chen JF, Elsner AE, Burns SA, Hansen RM, Lou PL, Kwong KK, Fulton AB. Clinical Vision Sciences. 1992;7:521. [Google Scholar]
  • [32].Wolf S, Wald K, Elsner AE, Staurenghi G. Retina. 1993;13:266. doi: 10.1097/00006982-199313030-00025. [DOI] [PubMed] [Google Scholar]
  • [33].Remky A, Beausencourt E, Elsner AE. Invest. Ophthalmol. Vis. Sci. 1996;37:2350. [PubMed] [Google Scholar]
  • [34].Remky A, Elsner AE. British J. Ophthalmol. 2005;89:464. doi: 10.1136/bjo.2004.050260. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • [35].Moraes L, Elsner AE, Kunze C, Suzuki H, Nehemy MB, Soriano DS, Kara-José N. Arq. Bras. Oftalmol. 2007;70:844. doi: 10.1590/s0004-27492007000500021. [DOI] [PubMed] [Google Scholar]
  • [36].Dreher AW, Reiter K, Bille J. Invest. Ophthalmol. Visual Sci. 1988;29:355. [Google Scholar]
  • [37].Bartsch DU, Intaglietta M, Bille JF, Dreher AW, Gharib M, Freeman WR. Am. J. Ophthalmol. 1989;108:277. doi: 10.1016/0002-9394(89)90118-9. [DOI] [PubMed] [Google Scholar]
  • [38].Gaida G. Perspectives and Limits of Three-Dimensional Fundus Microscopy. In: Nasemann JE, Burk ROW, editors. Laser Scanning Ophthalmoscopy and Tomography. Quintessenz-Verlag; Berlin: 1990. pp. 253–257. [Google Scholar]
  • [39].Miura M, Elsner AE, Beausencourt E, Kunze C, Hartnett ME, Lashkari K, Trempe C. Retina. 2002;22:300. doi: 10.1097/00006982-200206000-00008. [DOI] [PubMed] [Google Scholar]
  • [40].Miura M, Elsner AE. Optics Express. 2001;9:436. doi: 10.1364/oe.9.000436. [DOI] [PubMed] [Google Scholar]
  • [41].Hartnett ME, Weiter JJ, Staurenghi G, Elsner AE. Ophthalmology. 1996;103:2042. doi: 10.1016/s0161-6420(96)30389-8. [DOI] [PubMed] [Google Scholar]
  • [42].Kunze C, Elsner AE, Beausencourt E, Moraes L, Hartnett ME, Trempe CL. Ophthalmology. 1999;9:1830. doi: 10.1016/S0161-6420(99)90364-0. [DOI] [PubMed] [Google Scholar]
  • [43].Hartnett ME, Elsner AE. Ophthalmology. 1996;103:58. doi: 10.1016/s0161-6420(96)30731-8. [DOI] [PubMed] [Google Scholar]
  • [44].Elsner AE, Miura M, Burns SA, Beausencourt E, Kunze C, Kelley L, Walker JP, Wing GL, Raskauskas PA, Fletcher DC, Zhou Q, Dreher AW. Optics Express. 2000;7:95. doi: 10.1364/oe.7.000095. [DOI] [PubMed] [Google Scholar]
  • [45].Elsner AE, Dreher AW, Zhou Q, Beausencourt E, Burns SA, Webb RH. Lasers and Light in Ophthalmol. 1998;8:193. [Google Scholar]
  • [46].Elsner AE, Zhou Q, Beck F, Tornambe PE, Burns SA, Weiter JJ, Dreher AW. Int. Ophthalmol. 2001;23:245. doi: 10.1023/a:1014405320509. [DOI] [PubMed] [Google Scholar]
  • [47].Elsner A, Zhao Y, VanNasdale D, Haggerty B, Petrig B. American Academy of Optometry. 2008 October 24–27;:68. 070027. [Google Scholar]
  • [48].Burns SA, Elsner AE, Mellem-Kairala MB, Simmons RB. Invest. Ophthalmol. Vis. Sci. 2003;44:4061. doi: 10.1167/iovs.03-0124. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • [49].Weber A, Cheney MC, Smithwick QYJ, Elsner AE. Optics Express. 2004;12:5178. doi: 10.1364/opex.12.005178. [DOI] [PubMed] [Google Scholar]
  • [50].Mellem-Kairala MB, Elsner AE, Weber A, Simmons RB, Burns SA. Invest. Ophthalmol. Vis. Sci. 2005;46:1099. doi: 10.1167/iovs.04-0574. [DOI] [PubMed] [Google Scholar]
  • [51].Miura M, Elsner AE, Weber A, Cheney MC, Osako M, Usui M, Iwasaki T. Am. J. Ophthalmol. 2005;140:1014. doi: 10.1016/j.ajo.2005.06.033. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • [52].Weber A, Elsner AE, Miura M, Kompa S, Cheney MC. Eye. 2006 Jan 6; doi: 10.1038/sj.eye.6702203. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • [53].Elsner AE, Weber A, Cheney MC, VanNasdale DA, Miura M. J. Optical Soc. Am. A Opt. Image Sci. Vis. 2007;24:1431. doi: 10.1364/josaa.24.001468. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • [54].Miura M, Elsner AE, Cheney MC, Usui M, Iwasaki T. J. Optical Soc. Am. A Opt. Image Sci. Vis. 2007;24:1431. doi: 10.1364/josaa.24.001431. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • [55].Burns SA, Marcos S, Elsner AE, Barra S. Optics Letters. 2002;27:400. doi: 10.1364/ol.27.000400. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • [56].Roorda A, Romero-Borja F, Donnelly WJ, Queener H, Hebert TJ, Campbell MCW. Opt. Express. 2002;10:405. doi: 10.1364/oe.10.000405. [DOI] [PubMed] [Google Scholar]
  • [57].Burns SA, Tumbar R, Elsner AE, Ferguson RD, Hammer DX. J. Opt. Soc. Amer. Opt. Image Sci. Vis. 2007;24:1313. doi: 10.1364/josaa.24.001313. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • [58].Gray DC, Merigan W, Wolfing JI, Gee BP, Porter J, Dubra A, Twietmeyer TH, Ahamd K, Tumbar R, Reinholz F, Williams DR. Opt. Express. 2006;14:7144. doi: 10.1364/oe.14.007144. [DOI] [PubMed] [Google Scholar]
  • [59].Fujimoto JG, Bouma BE, Tearney GJ, editors. Handbook of Optical Coherence Tomography. chap. 1. Marcel Dekker; New York: 2002. [Google Scholar]
  • [60].Fercher AF, Drexler W, Hitzenberger CK, Lasser T. Rep. Prog. Phys. 2003;66:239. [Google Scholar]
  • [61].Tomlins PH, Wang RK. J. Phys. D.: Appl. Phys. 2005;38:2519. [Google Scholar]
  • [62].Zysk AM, Nguyen FT, Oldenburg AL, Marks DL, Boppart SA. J. Biomed. Opt. 2007;12:051403. doi: 10.1117/1.2793736. [DOI] [PubMed] [Google Scholar]
  • [63].Drexler W, Fujimoto JG. Prog. Ret. Eye Res. 2008;27:45. doi: 10.1016/j.preteyeres.2007.07.005. [DOI] [PubMed] [Google Scholar]
  • [64].Yang VXD, Gordon ML, Tang S-J, Marcon NE, Gardiner G, Qi B, Bisland S, Seng-Yue E, Lo S, Pekar J, Wilson BC, Vitkin IA. Opt. Express. 2003;11:2416. doi: 10.1364/oe.11.002416. [DOI] [PubMed] [Google Scholar]
  • [65].Herz PR, Chen Y, Aguirre AD, Fujimoto JG, Mashimo H, Schmitt J, Koski A, Goodnow J, Petersen C. Opt. Express. 2004;12:3532. doi: 10.1364/opex.12.003532. [DOI] [PubMed] [Google Scholar]
  • [66].Chen Y, Andrews PM, Aguirre AD, Schmitt JM, Fujimoto JG. J. Biomed. Opt. 2007;12:034008. doi: 10.1117/1.2736421. [DOI] [PubMed] [Google Scholar]
  • [67].Rosa CC, Rogers J, Pedro J, Rosen R, Podoleanu A. Appl. Opt. 2007;46:1795. doi: 10.1364/ao.46.001795. [DOI] [PubMed] [Google Scholar]
  • [68].Hermann B, Fernández EJ, Unterhuber A, Sattmann H, Fercher AF, Drexler W, Prieto PM, Artal P. Opt. Lett. 2004;29:2142. doi: 10.1364/ol.29.002142. [DOI] [PubMed] [Google Scholar]
  • [69].Leitgeb R, Hitzenberger CK, Fercher AF. Opt. Express. 2003;11:889. doi: 10.1364/oe.11.000889. [DOI] [PubMed] [Google Scholar]
  • [70].de Boer JF, Cense B, Park BH, Pierce MC, Tearney GJ, Bouma BE. Opt. Lett. 2003;28:2067. doi: 10.1364/ol.28.002067. [DOI] [PubMed] [Google Scholar]
  • [71].Nassif NA, Cense B, Park BH, Yun SH, Chen TC, Bouma BE, Tearney GJ, de Boer JF. Opt. Lett. 2004;29:480. doi: 10.1364/ol.29.000480. [DOI] [PubMed] [Google Scholar]
  • [72].Zhang Y, Cense B, Rha J, Jonnal RS, Gao W, Zawadzki RJ, Werner JS, Jones S, Olivier S, Miller DT. Opt. Express. 2006;14:4380. doi: 10.1364/OE.14.004380. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • [73].Sarunic MV, Asrani S, Izatt JA. Arch. Ophthalmol. 2008;126:537. doi: 10.1001/archopht.126.4.537. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • [74].Chinn SR, Swanson EA, Fujimoto JG. Opt. Lett. 1997;22:340. doi: 10.1364/ol.22.000340. [DOI] [PubMed] [Google Scholar]
  • [75].Huber R, Adler DC, Fujimoto JG. Opt. Lett. 2006;31:2975. doi: 10.1364/ol.31.002975. [DOI] [PubMed] [Google Scholar]
  • [76].Lim H, Mujat M, Kerbage C, Lee ECW, Chen Y, Chen Teresa C., de Boer JF. Opt. Express. 2006;14:12902. doi: 10.1364/oe.14.012902. [DOI] [PubMed] [Google Scholar]
  • [77].Huber R, Wojtkowski M, Fujimoto JG, Jiang JY, Cable AE. Opt. Express. 2005;13:10523. doi: 10.1364/opex.13.010523. [DOI] [PubMed] [Google Scholar]
  • [78].Dorrer C, Belabas N, Likforman J-P, Joffre M. J. Opt. Soc. Am. B. 2000:1795. [Google Scholar]
  • [79].Yun SH, Tearney GJ, Bouma BE, Park BH, de Boer JF. Opt. Express. 2003;11:3598. doi: 10.1364/oe.11.003598. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • [80].Podoleanu AG. Appl. Opt. 2000;39:173. doi: 10.1364/ao.39.000173. [DOI] [PubMed] [Google Scholar]
