Abstract
Smooth pursuit eye movements (pursuit) are used to minimize the retinal motion of moving objects. During pursuit, the pattern of motion on the retina carries not only information about the object movement but also reafferent information about the eye movement itself. The latter arises from the retinal flow of the stationary world in the direction opposite to the eye movement. To extract the global direction of motion of the tracked object and stationary world, the visual system needs to integrate ambiguous local motion measurements (i.e., the aperture problem). Unlike the tracked object, the stationary world’s global motion is entirely determined by the eye movement and thus can be approximately derived from motor commands sent to the eye (i.e., from an efference copy). Because retinal motion opposite to the eye movement is dominant during pursuit, different motion integration mechanisms might be used for retinal motion in the same direction and opposite to pursuit. To investigate motion integration during pursuit, we tested direction discrimination of a brief change in global object motion. The global motion stimulus was a circular array of small static apertures within which one-dimensional gratings moved. We found increased coherence thresholds and a qualitatively different reflexive ocular tracking for global motion opposite to pursuit. Both effects suggest reduced sampling of motion opposite to pursuit, which results in an impaired ability to extract coherence in motion signals in the reafferent direction. We suggest that anisotropic motion integration is an adaptation to asymmetric retinal motion patterns experienced during pursuit eye movements.
NEW & NOTEWORTHY This study provides a new understanding of how the visual system achieves coherent perception of an object’s motion while the eyes themselves are moving. The visual system integrates local motion measurements to create a coherent percept of object motion. An analysis of perceptual judgments and reflexive eye movements to a brief change in an object’s global motion confirms that the visual and oculomotor systems pick fewer samples to extract global motion opposite to the eye movement.
Keywords: motion perception, ocular following, perceptual integration, smooth pursuit eye movement
INTRODUCTION
During tracking of a moving object, smooth pursuit eye movements (pursuit for short) are used to reduce motion blur by minimizing the object's motion on the retina. Much research on motion perception during pursuit asks how the visual system extracts an accurate representation of an object's direction of motion.
The direction of an object’s motion is represented at an early stage of visual processing by motion sensors (e.g., neurons along the motion processing pathway) that respond only to a small part of the visual field. At the local level, contours are one-dimensional (1-D), meaning that their direction of motion is ambiguous (Fig. 1A); this is referred to as the aperture problem (Masson 2004; Wallach 1935). The aperture problem implies the need to integrate motion signals across space to determine an object’s speed and direction. During fixation, the aperture problem can be perceptually solved even when the 1-D motion signals emanating from a rigidly translating object have different orientations and locations in space (Fig. 1, B and C), resulting in a coherent motion percept (Amano et al. 2009; Lorenceau 1998; Mingolla et al. 1992). Previous research comparing motion coherence during pursuit and fixation indicates a perceptual bias during pursuit toward attributing the eye movement-induced (reafferent) motion to a single coherent object, even when the stimulation is equally compatible with a two-object interpretation (Hafed and Krauzlis 2006). However, this bias could reflect perceptual priors about the stability of the world during eye movements (Wexler et al. 2001), rather than the ability to integrate motion and solve the aperture problem during pursuit.
Fig. 1.
A: the aperture problem. When a rigid object (i.e., the wavy black shape) is seen moving through small windows (i.e., holes in a semitransparent screen), its local motion is ambiguous, due to the lack of 2-dimensional features. The object’s global motion (red arrow) can be recovered by integrating local motion vectors orthogonal to the contours across space (blue arrows). B: stimulus used to simulate rigid object motion behind multiple circular windows. Gabor elements were randomly oriented and could drift at speeds that were only compatible with one global motion direction. Dashed lines were not shown. C: in velocity space, if the object motion is rigid, every motion vector length is determined by its orientation relative to the global motion direction, forming a circle. D: eye movement conditions. The observers either fixated a central dot or pursued it as it moved horizontally across the screen. The gratings drifted in the middle of the trajectory for 200 ms (cf. Fig. 2A), but the envelopes of the Gabor patches always moved at the same velocity as the target. If tracking were perfect, retinal motion would be the same in fixation and pursuit conditions.
Several lines of evidence suggest that motion integration during pursuit is in general unlike motion integration during fixation. Gibson et al. (1957) noted that the optical pattern of movement that stimulates the eye carries information not only about the world but also about the observer’s own movements, making vision a proprioceptive sense. During pursuit, the stationary world moves on the retina in the direction opposite to the eye and therefore carries proprioceptive information about eye speed and direction. The reafferent motion information can be used to supplement extraretinal information about eye movements, such as that derived from a corollary discharge of the motor command (Haarmeier et al. 2001). In sum, there is a fundamental directional asymmetry whereby retinal motion opposite to pursuit may provide proprioceptive information, whereas retinal motion in the direction of pursuit does not. Therefore, motion information could be sampled and integrated differently depending on its direction relative to the eye movement.
Asymmetries in the processing of motion during pursuit have previously been tested by injecting a brief motion pulse into a structured background. Those studies have tended to find symmetrical perceptual and eye movement responses (e.g., Miura et al. 2009; Spering and Gegenfurtner 2007) when the background stimulus moves with the target before motion is injected. Asymmetries are only found when the motion is injected on a stationary background. In that case, eye movements toward background motion opposite to pursuit are suppressed compared with those in the direction of pursuit (e.g., Lindner and Ilg 2006). This suppressive effect could be explained by rapid adaptation to reafferent background motion (Miura et al. 2009). Therefore, the processing of simple motion signals is symmetrical when the stimulation history is symmetrical. Yet the possibility remains that the integration of motion signals is asymmetrical during pursuit, reflecting the different computations needed to extract proprioceptive information about the eye movement from reafferent motion opposite to pursuit and those required to extract object movement. We explore this possibility in the present contribution.
Monkey physiology gives further reasons to suspect anisotropic integration of motion signals during pursuit. Neurons in middle temporal (MT) and middle superior temporal (MST) visual areas show suppression for motion opposite to pursuit when tested with random-dot kinematograms, in addition to changes in motion tuning that indicate encoding of motion along a continuum from world to retinal coordinates. Units in MT and MST could form successive stages of integration of V1 motion information. Most MT units encode local motion (Majaj et al. 2007; Rust et al. 2006), whereas units in MST integrate MT outputs to extract object velocity (Khawaja et al. 2013; Mineault et al. 2012). MT and MST provide the primary visual input driving pursuit eye movements and motion perception (Newsome et al. 1985). Therefore, there are theoretical (object and background motion signals being most often asymmetrically distributed during pursuit) and empirical grounds to expect direction-dependent motion integration during pursuit eye movements.
In the present study, we tested motion integration during smooth pursuit eye movements by using a global motion stimulus composed of low-contrast gratings moving behind small “apertures” (Fig. 1B), formed by 2-D Gaussian contrast envelopes. The apertures’ shape and position on the retina did not change, whether observers tracked the target or fixated. The multiple-aperture Gabor array allowed us to investigate motion integration independently of stimulus shape and position. By this means, we uncovered a new asymmetry in motion computations during smooth pursuit eye movements that can be explained by an impaired ability to extract coherent motion in the direction opposite to pursuit (i.e., in the reafferent direction).
EXPERIMENTAL PROCEDURES
Six undergraduate students from the University of Geneva and one of the authors (D. Souto) (18–33 yr old) took part in experiment 1. Experiments 2 and 3 were carried out at the University of Leicester; 7 undergraduate students took part (18–25 yr old) in experiment 2 and 11 (18–26 yr old) in experiment 3. Participants were paid £6 for each session (CHF 20 in Geneva) or received course credit. Participants gave informed written consent to participate before the experiments. They reported normal or corrected-to-normal vision at the viewing distance and were naive regarding the hypothesis of the experiment. The experimental procedure was approved by the Ethics Committee of the Faculté de Psychologie et des Sciences de l’Education of the University of Geneva and by the School of Psychology at the University of Leicester.
In experiment 1, stimuli were displayed on a NEC MultiSync CRT screen (1,280 × 1,024 pixels at 75 Hz) at 66 cm from the observer, whose head was held by a chin and front rest. Spatial resolution was 26 pixels per degree of visual angle. In experiments 2 and 3, stimuli were displayed on an HP P1130 CRT screen (1,280 × 1,024 pixels at 85 Hz), 61 cm from the observer. The right eye position was tracked at 1,000 Hz by a video-based eye tracker (EyeLink 1000; SR Research, Osgoode, ON, Canada). The visual stimulation was created with the Psychophysics Toolbox (PTB-3) in MATLAB (Brainard 1997; Kleiner et al. 2007). We used a look-up table to linearize the screen gamma.
Visual Stimulation
The multiple-aperture Gabor array (Amano et al. 2009) shown in Fig. 1B was composed of a grid of 744 Gabor patches displayed between two notional concentric circles around a 0.3° fixation point. The inner circle had a 3° radius and the outer circle a 10° radius. Individual Gabor patches occupied 52 × 52 pixels (1° × 1°). In experiments 2 and 3, the array was scaled to 60% of its original size to fit the screen. Initially, each patch was assigned a random phase and orientation, a spatial frequency of 2 cycles/deg, a space constant of 0.2°, and 20% Michelson contrast. Background luminance was 27.8 cd/m2.
At the beginning of a trial, the fixation point was brightened for 50 ms (going from 0.3 to 4.2 cd/m2), providing a warning signal that, 1.7 s later, a global motion change would be displayed for 0.2 s. This warning was necessary to avoid differences in stimulus expectation between fixation and pursuit trials. The circular display continued moving across the screen for another 0.5 s. The stimulus is shown at different coherence levels in Supplemental Movies S1–S4 (https://doi.org/10.25392/leicester.data.7718453.v1).
Figure 1C illustrates how the drift speed was assigned to signal and noise patches to generate coherent motion by integration across space and orientations (Amano et al. 2009). A geometric regularity specifies the relation between the global motion of an object behind apertures (i.e., the unique direction of motion of a rigid object, as illustrated in Fig. 1A) and the norm of a motion vector orthogonal to the 1-D contour (1-D motion). In velocity space, normal motion vectors consistent with a given global motion interpretation lie on a circle whose orientation and diameter are determined by the global motion vector, meaning that local drift speed (vloc) is a function of the difference between orthogonal (θorth) and global motion (θgl) angular directions, scaled by global motion speed (vgl):

$$v_{\text{loc}} = v_{\text{gl}} \cos(\theta_{\text{orth}} - \theta_{\text{gl}})$$
To manipulate coherence in experiments 1–3, we varied the signal-to-noise ratio. Signal patches had a drift speed corresponding to a single global motion direction. Noise patches had a random drift speed, drawn from a uniform distribution ranging from −2 to 2 deg/s (Fig. 2C).
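To make the velocity assignment concrete, the sketch below generates per-patch drift speeds under the velocity-space rule above. This is a minimal illustration rather than the authors' stimulus code: the array size, speeds, and +10° direction come from the text, whereas the variable names and the uniform sampling of orientations are our assumptions.

```matlab
% Minimal sketch: per-patch drift speeds for the multiple-aperture array.
% Signal patches obey the velocity-space circle, v_loc = v_gl*cos(theta_orth - theta_gl);
% noise patches draw drift speeds from a uniform distribution on [-2, 2] deg/s.
nPatches  = 744;            % number of Gabor patches (see Visual Stimulation)
coherence = 0.5;            % proportion of signal patches, S/(S + N)
vGl       = 2;              % global motion speed, deg/s
thetaGl   = deg2rad(10);    % global direction, +10 deg above horizontal

thetaOrth = 2*pi*rand(nPatches, 1);         % contour normals of randomly oriented gratings
isSignal  = rand(nPatches, 1) < coherence;  % label signal vs. noise patches

vLoc = zeros(nPatches, 1);
vLoc(isSignal)  = vGl * cos(thetaOrth(isSignal) - thetaGl);  % signal drift speeds
vLoc(~isSignal) = -2 + 4*rand(nnz(~isSignal), 1);            % noise drift speeds
```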
Fig. 2.

Stimulation time course in experiment 1. A: horizontal target position (top) and velocity (middle and bottom) are shown superimposed on the global motion (colored lines) of the grating pattern that was displayed behind multiple windows or apertures. The gratings moved with the pursuit target (or remained static during fixation) except for a 200-ms interval that is indicated by the dashed vertical lines. During this interval, the global motion speed of the gratings was ±2 deg/s relative to the target speed (5.72 deg/s). The blurred window through which each grating was viewed always moved at the same speed as the target (cf. Supplemental Movies S1–S4, https://doi.org/10.25392/leicester.data.7718453.v1). The colored lines refer to the velocity of the grating inside the window. B: unspeeded discrimination task. Gabor motion was either in the direction of pursuit (green arrows) or opposite to it (red arrows) and slightly upward or downward. At the end of the trial, observers reported whether they saw upward or downward global motion. C: composition of grating speeds to generate coherent global motion. Signal and noise velocity distributions are shown in velocity space. Signal gratings’ drift speed was compatible with either an upward (+10°; saturated color) or downward (−10°; unsaturated color) global motion component. The orientation of the global motion velocity vector relative to the horizontal is shown to scale. Observers discriminated the vertical component direction at different levels of coherence (i.e., different proportions of signal relative to noise gratings).
Eye Movement Condition
Eye movement conditions in experiment 1 are illustrated in Fig. 1D and Fig. 2A. In the fixation condition, the Gabor array and fixation point remained at the center of the screen. In the pursuit condition, the Gabor array and the fixation dot moved horizontally across the screen, their starting position being randomly chosen to be 6° left or right of the screen center. For 1 s, the dot remained at the same peripheral location to allow fixation before the pursuit target motion started. The Gabor array and the fixation dot then moved at 5.72 deg/s for 1.4 s through the screen center, covering a total distance of 12°. Carrier motion was displayed in the middle of this trajectory.
Procedure
Perception and eye movements (experiment 1).
In five sessions, observers performed a two-alternative forced choice task, where they needed to report whether the global carrier motion (±2 deg/s) direction was above (+10°) or below (−10°; cf. Fig. 2, B and C) horizontal. Eye movement conditions (pursuit and fixation) were alternated in six blocks within a session. An additional five trials for each eye movement condition at the start of each session served as training. Pursuit blocks had 96 trials (8 coherence levels × 2 motion directions, same or opposite to pursuit, × 6 repetitions), whereas fixation blocks had 48 trials (8 coherence levels × 6 repetitions), giving 432 trials per session. This meant that there were at best 91 trials per stimulus level for fitting psychometric functions. Coherence level and motion direction (same or opposite) relative to pursuit (in pursuit trials) were randomized within a block. Target direction was randomly assigned to leftward or rightward. Observers responded and controlled the pace of the experiment by pressing designated keyboard keys. They were given auditory feedback (a brief tone) on incorrect trials. They were also given textual feedback at the screen center when there was a blink during the brief global motion burst or when pursuit gain (eye velocity divided by target velocity) was lower than 0.8. We used the method of constant stimuli to derive psychometric functions, with eight nominal coherence levels representing the ratio of signal to signal plus noise: 0 (baseline), 0.14, 0.29, 0.43, 0.57, 0.71, 0.86, and 1.00.
Eye movements to uniformly and randomly oriented patterns (experiment 2).
We tested the effect of global motion type, coherence, and direction relative to pursuit on reflexive eye movements by using the same stimulus velocities as in experiment 1, but without any vertical component being added to the horizontal global motion, because no perceptual judgments were collected. We tested two types of Gabor arrays. Arrays were composed of randomly oriented gratings (as in experiment 1) or vertically oriented gratings (uniform condition). We presented five levels of coherence: 0 (baseline), 0.25, 0.50, 0.75, and 1.0. We had 48 repetitions for each condition, with a total of 960 interleaved trials split over two 30-min sessions, corresponding to 2 grating orientations (uniform or random) × 5 coherence levels × 2 motion directions (same or opposite to pursuit) × 48 repetitions.
Eye movement with different target velocities (experiment 3).
We used the same stimulus as in experiment 1, without any vertical component being added to the horizontal global motion. We tested the effect of target velocity and global motion direction on reflexive eye movements to 100% coherent global motion. Target velocity (2.54, 4.44, or 6.34 deg/s) and direction were interleaved, giving 360 trials (2 directions × 3 velocities × 60 repetitions) tested in one session.
Data Analysis
To detect saccadic episodes during pursuit, we used the pursuit settings of the EyeLink 1000 software: a velocity threshold of 22 deg/s, to which the average eye velocity over the preceding 40 ms was added (up to 60 deg/s), combined with an acceleration threshold of 5,000 deg/s2. We avoided saccade contamination by discarding samples from 25 ms before saccade start until 40 ms after saccade end. Velocity was derived by differentiating the position signal using a two-point central difference method with a 20-ms step size (Bahill and McDonald 1983). This velocity signal was further filtered with a second-order low-pass Butterworth filter with a 35-Hz cutoff frequency. We fitted a logistic function to the proportion of correct responses as a function of coherence, which is equivalent to the signal-to-noise ratio s:

$$\psi(s) = \frac{1}{2} + \frac{1}{2}\cdot\frac{1}{1 + e^{-z\,(s - m)/w}}$$
In the equation above, the parameter m represents the 75%-correct threshold, the parameter w represents the width of the interval over which the function rises from 10% to 90% of its range, and z is a constant equal to 2 × log(9). We used the Psignifit 3 toolbox to implement the maximum likelihood fitting procedure and derive bootstrapped confidence intervals for the parameters (Fründ et al. 2011).
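For illustration, the fitted function can be written directly in MATLAB. This is a minimal sketch only, since the actual fitting was delegated to Psignifit 3, and the plotted threshold value here is just an example:

```matlab
% 2AFC logistic psychometric function with 75%-correct threshold m and
% rise width w (the interval over which it climbs from 10% to 90% of range).
z   = 2*log(9);                                   % fixed constant (see text)
psi = @(s, m, w) 0.5 + 0.5 ./ (1 + exp(-z*(s - m)/w));

s = linspace(0, 1, 101);                          % coherence, S/(S + N)
plot(s, psi(s, 0.42, 0.5));                       % e.g., a 42% threshold
xlabel('Coherence'); ylabel('Proportion correct');
```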
Reflexive Ocular Tracking
To analyze eye movement responses to global motion, we inverted the sign of horizontal eye movements in leftward trials, so that data could be averaged as if all trials were rightward. To compare eye movements across pursuit and fixation conditions, we subtracted pursuit target velocity from eye velocity to obtain the velocity error (e.g., Fig. 4). Therefore, we obtained positive values when the eye moved faster than the pursuit target and negative values when it was slower.
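The sketch below strings together the velocity processing from Data Analysis with the velocity-error step above. The sampling rate, differentiation step, and filter settings follow the text; the zero-phase filtering (filtfilt), the placeholder position trace, and all variable names are our assumptions.

```matlab
% Minimal sketch: position samples -> filtered velocity -> velocity error.
fs      = 1000;                         % EyeLink sampling rate, Hz
vTarget = 5.72;                         % pursuit target velocity, deg/s
k       = round(0.020*fs/2);            % half of the 20-ms step (10 samples)

t = (1:3000)'/fs;
x = vTarget*t + cumsum(randn(3000,1))*1e-3;   % placeholder eye-position trace, deg

% Two-point central difference with a 20-ms step (Bahill and McDonald 1983).
v = (x(2*k+1:end) - x(1:end-2*k)) / (2*k/fs);

% Second-order low-pass Butterworth filter, 35-Hz cutoff (zero-phase here).
[b, a] = butter(2, 35/(fs/2));
vFilt  = filtfilt(b, a, v);

vError = vFilt - vTarget;               % positive: eye faster than the target
```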
Fig. 4.
Experiment 1 oculomotor results in the horizontal direction. A: horizontal velocity error (eye velocity minus target velocity) in an example individual (subject AM), relative to the pursuit direction. Top shows same (green) and opposite (red) global motion conditions. Positive values indicate that the eye overshoots the pursuit target velocity and negative values, that it undershoots it. Bottom shows the fixation condition (yellow). In that condition a positive value indicates an eye movement in the global motion direction. Responses are locked to global motion onset for signal coherence conditions of 1 (red, green, or yellow) and 0 (gray), i.e., 100% and nominally 0% coherent signals. The gray shaded area indicates the duration of global motion stimulation. B: group averages (N = 7 subjects) using the same conventions as for A. The peak velocity averaging interval is shown by green and red horizontal lines. C: average horizontal velocity error at peak for different coherence levels [signal-to-noise ratio: S/(S + N)]. D: horizontal peak response, i.e., maximal velocity error in the direction of global motion from which the response to 0 signal coherence is subtracted, as a function of coherence. The sign of the opposite condition responses was inverted for comparison. Lines represent the best-fitting Naka-Rushton functions. Note a qualitatively similar response for same and fixation, with a reduced maximal response during fixation, but lack of saturation for opposite motion. Shading around the means (A and B) and error bars (C and D) indicate SE. Coh, coherence.
The Naka-Rushton function was used to fit (absolute) peak responses over 50-ms averaging intervals as a function of coherence (signal-to-noise ratio) s, using the Nelder-Mead simplex algorithm (MATLAB fminsearch) to minimize the sum of squared residuals (least-squares method):

$$R(s) = R_{\max}\,\frac{s^{n}}{s^{n} + S_{50}^{n}}$$
where Rmax is the asymptote, S50 is the coherence at which the function reaches half of Rmax (half-saturation), and n is proportional to the slope at S50.
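A minimal version of this fit is sketched below; the response values are placeholders and the starting parameters are our guesses, not the values used in the paper.

```matlab
% Minimal sketch: least-squares Naka-Rushton fit with fminsearch.
s = [0 0.25 0.50 0.75 1.00];              % coherence levels (as in experiment 2)
R = [0 0.30 0.70 0.95 1.20];              % placeholder peak responses, deg/s

% p = [Rmax, S50, n]: asymptote, half-saturation coherence, exponent.
% abs() keeps S50 positive while the simplex explores parameter space.
nakaRushton = @(p, s) p(1) .* s.^p(3) ./ (s.^p(3) + abs(p(2)).^p(3));
sse         = @(p) sum((R - nakaRushton(p, s)).^2);

pHat = fminsearch(sse, [1, 0.5, 2]);      % estimated [Rmax, S50, n]
```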
RESULTS
We tested the ability to discriminate the direction of global motion depending on coherence and eye movement condition (experiment 1). The global motion stimulus in pursuit and fixation conditions is shown in Fig. 1, B and D. A multiple-aperture grating array (Fig. 1B) surrounded the fixation dot, which moved across the screen (pursuit condition) or remained stationary (fixation condition). Trial time course and task are illustrated in Fig. 2, A and B. In pursuit conditions (Fig. 2A), the grating array moved with the pursuit target either leftward or rightward, and gratings drifted within the apertures for 200 ms in the middle of the trajectory. Observers had to judge the vertical component of global motion within the multiple apertures (cf. Supplemental Movies S1–S4, https://doi.org/10.25392/leicester.data.7718453.v1). The participants’ two-alternative forced-choice task (Fig. 2B) was to discriminate between global motion directions that were above (+10°) or below (−10°) horizontal. The proportion of patches with a consistent direction of motion (signal patches) was varied across trials to derive psychometric functions. Global motion drift speed was always ±2 deg/s (e.g., Fig. 2C) relative to the target speed of 5.72 deg/s. Our main interest was to compare the ability to integrate motion opposite to and in the direction of pursuit eye movements. Whereas the most straightforward task would have been to ask for judgments of horizontal motion direction, a preliminary study showed that a nominally 0% coherent stimulus appeared to move opposite to pursuit in some participants (see also Terao et al. 2015). Discriminating between vertical components of motion avoided this issue.
Psychometric Data
Figure 3A shows psychometric functions for three main conditions in a typical subject: fixation, global motion opposite to pursuit (opposite motion condition), and global motion in the same direction as pursuit (same motion condition). On average, opposite motion yielded higher discrimination thresholds, as defined by the coherence level giving 75% correct performance (Fig. 3B). Thresholds were at 51% coherence for opposite motion, 42% for same motion, and 32% for fixation. The slopes of the psychometric functions (Fig. 3B) were also shallower for opposite motion, confirming poorer ability to discriminate. Paired t-tests indicate a significant increase of thresholds for opposite compared with fixation [t(6) = 4.79, P = 0.003] or same conditions [t(6) = 4.26, P = 0.005], as well as shallower slopes for opposite compared with fixation [1.68 vs. 2.25; t(6) = 5.38, P = 0.0016] or same conditions [1.68 vs. 2.30; t(6) = 2.78, P = 0.032]. Performance with same motion direction was more similar to fixation, with slopes not statistically significantly different (P > 0.84), but with significantly worse thresholds [t(6) = 3.34, P = 0.015]. We generated an individual suppression index by subtracting the fixation threshold from the pursuit threshold and dividing the result by the fixation threshold such that positive values indicate the deterioration of perceptual performance with pursuit. In Fig. 3, C and D, we plotted the suppression index in opposite vs. same conditions, demonstrating that most subjects showed a less effective discrimination of global motion direction when it was opposite to the eye movement.
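In equation form, with Tpursuit and Tfixation denoting the 75%-correct coherence thresholds (notation ours), the suppression index is

$$SI = \frac{T_{\text{pursuit}} - T_{\text{fixation}}}{T_{\text{fixation}}}$$

so that SI > 0 indicates worse performance during pursuit than during fixation.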
Fig. 3.

Experiment 1 perceptual results. A: example psychometric function from one individual (subject AM) showing proportion correct responses in discriminating the vertical direction of global motion at different levels of coherence [signal-to-noise ratio: S/(S + N)] for opposite (red), same (green), and fixation (yellow) conditions. B: average thresholds (left) and slopes (at threshold; right) of psychometric functions show impaired performance for global motion opposite to pursuit (red). Error bars indicate SE. C and D: individual suppression indexes (pursuit condition performance relative to fixation) for coherence thresholds (C) and slopes (D) of the psychometric function. Error bars indicate bootstrapped 95% confidence intervals.
Oculometric Data
We looked for reflexive responses to global motion as a complementary way to understand global motion processing (e.g., Masson 2004). Eye movements in the direction of global motion are shown for a typical subject in Fig. 4A and for the group average in Fig. 4B. When comparing responses to 100% coherent global motion, we observed different responses depending on the eye movement condition. The response was weaker during fixation compared with pursuit conditions, in line with the literature indicating increased visuomotor gain during pursuit. More surprisingly, responses to opposite and same direction of global motion were qualitatively different. Opposite motion yielded a larger velocity error, which was more protracted and peaked later than same-direction motion. The maximal opposite motion response was ~50% (average of −1.1 deg/s for a 100% signal) of global motion velocity (2 deg/s) and close to 20% of the pursuit target velocity (5.72 deg/s). The response was very systematic within and across subjects and was typical of reflexive eye movements, such as ocular following (Kodaka et al. 2004).
Figure 4C shows the effect of coherence on the peak response. In this plot, the eye velocity was averaged over a 50-ms window centered on the peak observed with a 100% signal (red and green horizontal lines in Fig. 4B). We see a clear increase in response with coherence in all conditions, but the comparison between conditions is made difficult by differences in velocity error at 0% coherence in pursuit and fixation conditions, given that pursuit gain (eye velocity divided by target velocity) was ~0.95, a typical value. Therefore, Fig. 4D shows the peak response relative to the 0% coherence velocity error. The sign of opposite motion responses was flipped for comparison.
The effect of signal coherence on the peak response in Fig. 4D shows a qualitatively different response pattern in opposite- and same-direction conditions. In the same and fixation conditions, responses saturate at low signal coherence (around 20–40% coherence). In contrast, responses opposite to pursuit increased linearly with stimulus coherence up to 100% coherence. This latter pattern has not been previously observed, whereas the difference in magnitude between pursuit and fixation responses can be explained by a well-known increase in the visuomotor gain during pursuit compared with fixation (Schwartz and Lisberger 1994).
To quantify the relationship between coherence and peak response, we fit a Naka-Rushton function (see methods). This function is often found to fit neural (Albrecht and Hamilton 1982) and ocular responses (Masson et al. 2000) as a function of stimulus contrast. We had no theoretical reason to employ it other than that it provided a good fit to the data (R2; opposite: 0.97, same: 0.98, fixation: 0.89). We also fit the function to individual data, with a good correspondence to the group average fits for the pursuit conditions. Goodness of fit was high in pursuit conditions [mean R2 (95% confidence interval); opposite: 0.86 (0.74, 0.97), same: 0.77 (0.68, 0.86)], but less so in the fixation condition, given weaker responses relative to eye movement variability [fixation: 0.45 (0.08, 0.83)]. In agreement with the group average, the asymptotic response parameter Rmax was significantly higher in the opposite compared with the same-direction condition [1.26 deg/s (0.87, 1.66) vs. 0.47 deg/s (0.29, 0.64); t(6) = 5.59, P = 0.0014] and the fixation condition [0.16 deg/s (0.07, 0.25); t(6) = 6.611, P = 0.0005]. The half-saturation parameter S50 was significantly higher in the opposite compared with the same condition [0.82 (0.62, 1.03) vs. 0.24 (0.16, 0.31); t(6) = 7.259, P = 0.00034], but not compared with the fixation condition, given high variability of this parameter in the fixation condition [0.46 (0.15, 0.76)]. Same and fixation parameters were not significantly different, possibly for the same reason. The best-fitting n parameter in the group average was used to constrain the individual Naka-Rushton fits (opposite: 2.17, same: 2.68, fixation: 10). In summary, when the target was fixed or when it was moving in the same direction as global motion, ocular tracking of global motion increased in velocity with motion coherence but saturated at ~0.1 and 0.4 deg/s, respectively, whereas ocular tracking responses continued to increase linearly with coherence when global motion was opposite to the target motion.
Finally, we confirmed that the peak latency was longer in opposite compared with same conditions by bootstrapping [opposite: 210 ms (200, 229), same: 172 ms (162, 184)], i.e., by resampling of individual traces with replacement (Efron and Tibshirani 1993). Latencies in the fixation condition could not be reliably estimated, given the weakness of the response.
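The latency bootstrap can be sketched as follows. This is a toy version with placeholder traces, assuming the peak is taken as the maximum of the sign-aligned group-average trace; details such as the number of resamples are our choices.

```matlab
% Minimal sketch: percentile-bootstrap 95% CI for peak latency, resampling
% individual traces with replacement (Efron and Tibshirani 1993).
fs = 1000;                                        % samples/s
g  = exp(-((1:700) - 350).^2 / (2*50^2));         % placeholder response shape
traces = 0.2*randn(7, 700) + repmat(g, 7, 1);     % placeholder: 7 subjects
nBoot  = 2000;
tPeak  = nan(nBoot, 1);
for i = 1:nBoot
    idx      = randi(7, 7, 1);                    % resample subjects with replacement
    [~, iPk] = max(mean(traces(idx, :), 1));      % peak of the resampled mean trace
    tPeak(i) = (iPk - 1)/fs*1000;                 % latency in ms from motion onset
end
tPeak = sort(tPeak);
ci = tPeak(round([0.025 0.975]*nBoot));           % percentile 95% CI, ms
```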
Although our primary intention was to investigate asymmetries in eye movement responses to global motion in the horizontal direction, we also analyzed vertical eye movements in the direction of the much smaller vertical component of motion. For comparison, the horizontal component of the global motion was 1.97 deg/s, whereas the vertical component was 0.35 deg/s. We averaged upward and downward responses by flipping the velocity error sign in downward conditions. Figure 5A shows the 0% and 100% coherence levels and suggests that there was indeed a small vertical eye movement component. Figure 5B shows the average response for all levels of coherence relative to the 0% coherence baseline. We used the averaging intervals centered around horizontal peak velocity, because the vertical response was too weak to yield a reliable peak. We did not fit a Naka-Rushton function for the same reason. Vertical eye movements in the same and fixation conditions followed the vertical stimulus motion [same horizontal motion: 0.07 deg/s (0.02, 0.12), fixation: 0.09 deg/s (0.04, 0.14)]. In contrast, vertical eye movements tended to be opposite to vertical stimulus motion when horizontal stimulus motion was opposite to pursuit [opposite horizontal motion: −0.04 deg/s (−0.1, 0.0)]. A repeated-measures ANOVA tested the effect of coherence (without the 0% baseline) and eye movement condition on the vertical velocity error. Eye movement condition was the only statistically significant effect [F(2, 12) = 14.56, P < 0.0001], reflecting higher velocity errors in the fixation and same motion conditions compared with the opposite motion condition (all other effects P > 0.75).
Fig. 5.

Experiment 1 oculomotor results in the vertical direction. A: vertical velocity error in pursuit (top) and fixation (bottom) conditions. Positive values represent eye movements in the direction of global motion. Responses are locked to global motion onset for signal coherence conditions of 1 (red, green, or yellow) and 0 (gray). The gray shaded area indicates the duration of global motion stimulation. The peak velocity averaging interval is shown by green and red horizontal lines. B: vertical response relative to the 0% coherence baseline. The averaging interval is based on the peak horizontal responses. The sign of vertical velocity errors was inverted in the downward condition for comparison. Thereby, positive values represent eye movements in the direction of the vertical component of global motion. Values are group averages (N = 7 subjects); shading around the means (A) and error bars (B) indicate SE. Coh, coherence [signal-to-noise ratio: S/(S + N)].
We wondered whether poorer perceptual performance with opposite compared with same global motion arises because of a greater velocity error. However, we found no evidence for a positive or negative correlation between perceptual thresholds and peak eye movement response [r(6) = −0.13, not significant; Fig. 6A]. It could also be that movement variability (i.e., jitter) was higher in one of the conditions, and this could explain deteriorated perceptual performance. However, eye movement variability and perceptual performance were also uncorrelated [r(6) = −0.004; Fig. 6B]. Therefore, differences in perceptual performance across eye movement conditions could not be accounted for by velocity error (retinal slip) or jitter during the presentation of the motion stimulus.
Fig. 6.
Relation between perceptual performance and horizontal velocity error (VE) in experiment 1. A: differences in perceptual thresholds between opposite and same conditions, normalized by the fixation thresholds, plotted against the horizontal VE difference between opposite and same conditions. Only 100% coherent trials were included. Eye movements were measured during the global motion change. Poorer performance in the opposite condition is not correlated with poor tracking. B: no correlation between differences in eye movement variability (σ) and differences in perceptual performance. Each circle represents an individual. Values are group averages (N = 7 subjects); error bars represent confidence intervals. Vertical confidence intervals were bootstrapped. p(c), Proportion correct.
Effect of Array Type on Eye Movements
Reflexive responses to background motion during pursuit can be determined by feature attention, as shown by reflexive tracking of motion in the background when a specific color and motion direction is attended and motion is balanced (Souto and Kerzel 2014). Therefore, the reflexive eye movement effects we observed may be due to the active nature of the task, where observers need to process information in the background to report global motion. In two additional experiments, we asked observers to track the black dot but disregard the surrounding motion altogether.
Additionally, we asked whether asymmetric motion integration is specific to this type of stimulus. With randomly oriented gratings, a robust solution to the aperture problem is obtained by a specialized motion integration mechanism, such as the intersection of constraints (IOC) or the harmonic vector average (HVA; Johnston and Scarfe 2013). To examine the role played by global motion computation, we compared eye movements in response to motion carried by randomly oriented gratings (as in experiment 1) with responses to motion carried by uniformly oriented gratings. If the effect is specific to integration across orientations and space, we expect the asymmetry to vanish with uniform gratings, because that type of integration is not necessary. Figure 7B shows that with randomly oriented gratings, we largely replicated the asymmetry between same and opposite motion. Critically, the asymmetry was also present for unidirectional motion (Fig. 7D). However, the response to same-direction motion was attenuated, whereas the response to opposite motion was shifted rightward (toward higher coherences). It is possible that unidirectional stimuli with low coherence elicited tracking responses against the global motion direction, as if they caused induced movement of the fixation dot, but we lacked statistical power to test this.
Fig. 7.
Horizontal eye movements in experiment 2. A and B show the random orientation condition, and C and D show the uniform condition (vertically oriented gratings). A and C: horizontal velocity error locked to the onset of the global motion change. Positive values indicate that the eye overshoots the pursuit target velocity and negative values, that it undershoots it. Responses are locked to global motion onset for signal coherence conditions of 1 (red, green, or yellow) and 0 (gray), i.e., 100% and nominally 0% coherent signals. The gray shaded area indicates the duration of global motion stimulation. The peak velocity averaging interval is shown by green and red horizontal lines. Shading around the means indicates SE. Data were low-pass filtered for display (Butterworth, 35-Hz cutoff). B and D: average peak response as a function of coherence [signal-to-noise ratio: S/(S + N)]. The sign of the opposite condition responses was inverted for comparison. Lines represent the best-fitting Naka-Rushton functions. Values are group averages (N = 7 subjects); error bars indicate SE. Coh, coherence.
Because the data were noisier, we obtained bootstrapped 95% confidence intervals by resampling individual fit residuals for the Naka-Rushton fits, instead of deriving them from individual fits. Table 1 shows that Rmax was significantly higher for the opposite direction, but not significantly different between random and unidirectional stimuli. A repeated-measures three-way ANOVA on peak response velocity (excluding 0% coherence) confirmed significant effects of coherence [F(3,18) = 15.773, P < 0.0001], with an interaction between coherence and direction [F(3,18) = 6.812, P < 0.01] because peak responses at high coherence were larger for opposite than same direction. There was also a triple interaction between stimulus type (random vs. unidirectional), direction, and coherence [F(3,18) = 6.099, P = 0.0047], which could be explained by larger asymmetries in the unidirectional condition compared with the random condition. It seems clear that the eye movement anisotropy does not critically depend on attention to the global motion stimulus and that it is not specific to a specialized mechanism required by randomly oriented patterns.
Table 1.
Experiment 2 peak responses to global motion as a function of signal coherence
| Orientation Condition | Global Motion Direction | Rmax, deg/s | S50 | n | R2 |
|---|---|---|---|---|---|
| Random | Same | 0.53 (0.15, 0.84) | 0.61 (0.28, 1.00) | 2.99 | 0.90 |
| Random | Opposite | 1.81 (1.02, 2.37) | 0.90 (0.53, 1.00) | 9 | 0.87 |
| Uniform | Same | 0.10 (0.00, 0.56) | 0.72 (0.05, 1.00) | 10 | −0.09 |
| Uniform | Opposite | 1.32 (0.88, 2.04) | 0.87 (0.72, 1.00) | 10 | 0.90 |
Values are means and bootstrapped 95% confidence intervals. Experiment 2 peak responses to global motion as a function of signal coherence were fit with a Naka-Rushton function. Rmax is the asymptote, S50 indicates the function at half-saturation, and n is proportional to the slope at S50.
Latencies of the peak (with 100% coherent signals) depended again on the eye movement condition, as indicated by bootstrapped 95% confidence intervals (percentile method, resampling individuals’ average traces). The opposite condition peak occurred later [random: 246 ms (226, 278), unidirectional: 234 ms (215, 291)] than the same direction peak [random: 184 ms (178, 192), unidirectional: 176 ms (156, 187)]. Stimulus type did not affect peak latencies.
Effect of Target Velocity
Finally, we explored how eye movements to global motion depend on target velocity. One possibility is that the larger eye movement response to opposite- than to same-direction motion arises because responses to same-direction motion saturate earlier. A simple test is to examine how responses increase or decrease with target velocity. We compared the opposite- and same-direction conditions at three target velocities (2.5, 4.4, and 6.3 deg/s), using the global motion stimulus with random orientation and 100% coherence. Because the 0% coherence condition was not included, we used target velocity as a rough baseline. As shown in Fig. 8, A and B, eye velocity increased similarly with increasing target velocity for same and opposite global motion, which is inconsistent with saturation as an explanation for the anisotropy. Furthermore, we did not always find the peak velocity error to be larger with opposite- compared with same-direction motion (unlike in the previous experiments), which may reflect an inaccurate baseline or an effect of expectancy. Pretrial expectancy differed in the present experiment because coherence was fixed, whereas it varied randomly from trial to trial in the previous experiments. However, we replicated the temporal differences: responses to opposite global motion were more protracted and occurred later than responses to same-direction global motion.
Fig. 8.

Horizontal eye movements for different target velocities observed in experiment 3, in response to the same global motion change (±2 deg/s) as in experiments 1 and 2. A: horizontal velocity in same (green) and opposite (red) global motion conditions. The 3 target velocities are shown by horizontal gray lines. The gray shaded area indicates the duration of global motion stimulation. The peak velocity averaging interval is shown by green and red horizontal lines. Values are group averages (N = 9 subjects); shading around the means indicates SE. B: horizontal velocity error showing the average for different target velocities by line width (narrower being slower). Data were low-pass filtered for display (Butterworth, 35-Hz cutoff).
A repeated-measures ANOVA on the peak velocity error (Fig. 8B) was carried out to confirm these observations. There was no effect of direction (P = 0.95) and no interaction between direction and velocity (P = 0.26), but there was a simple effect of velocity [F(2,8) = 11.153, P < 0.001]. Post hoc t-tests showed that peak velocity error increased significantly from 2.5 to 4.4 deg/s target velocities [0.55 vs. 0.73 deg/s, respectively; t(8) = 2.931, Bonferroni-corrected P < 0.04], but not between 4.4 and 6.3 deg/s (P = 0.09). Because there was no interaction with target velocity, there is little evidence for earlier response saturation with same than with opposite motion stimuli at higher target velocities. Furthermore, we compared the latency of the peak velocity error by bootstrapped 95% confidence intervals. Peak velocity error occurred earlier for same-direction motion [179 ms (172, 186) for slow, 167 ms (157, 182) for medium, 162 ms (153, 171) for fast] than for opposite-direction motion [234 ms (223, 251) for slow, 230 ms (212, 253) for medium, 232 ms (216, 274) for fast].
DISCUSSION
In these experiments, we asked whether the extraction of an object’s global direction of motion (i.e., motion integration) depends on its motion direction relative to ongoing pursuit eye movements. We hypothesized that motion integration may reflect the dominance of retinal motion opposite to pursuit. Opposite retinal motion occurs naturally when the eyes move across a stationary background. We show that the integration of motion during pursuit is generally less efficient than during fixation, which is to be expected due to poorer stimulus stabilization. More importantly, perceptual judgments showed impaired motion integration for motion opposite to the direction of the eye movement compared with motion in the same direction. At the same time, pursuit eye movements were more strongly affected by opposite- than same-direction motion. Furthermore, effects of opposite motion on eye movements occurred later and were more dependent on the coherence of global motion than effects of motion in the same direction.
Perception
In humans, an impaired ability to extract coherent motion from signals opposite to pursuit has not previously been reported. However, this finding is consistent with previous literature, including monkey physiology showing that the activity of a proportion of neurons in MT and MST is suppressed when their preferred direction of motion is opposite to the direction of pursuit (Chukoskie and Movshon 2009). Those neurons could be responsible for integrating motion signals across space. Furthermore, when measuring temporal contrast sensitivity with single Gabor patches (i.e., one unit in our multiple-aperture array), Schütz et al. (2007) found reduced sensitivity for opposite motion signals. This result was ascribed to feature attention directed to the target motion (which is typically in the direction of pursuit) spreading to same-direction motion signals. However, if reduced attention to opposite motion is equated with a drop in effective contrast, the attentional account predicts better, not worse, performance for opposite motion: Takeuchi (1998) showed that coherence perception actually improves with a small reduction in contrast, because higher contrasts favor local motion processing over global motion integration. We observed the opposite, namely worse coherence perception for motion opposite to pursuit.
Our results relate to previous work showing enhanced processing of motion opposite to the direction of pursuit (Terao et al. 2015) and enhanced integration of motion signals during pursuit in contrast to fixation (Hafed and Krauzlis 2006; Terao et al. 2015). An important distinction is that these studies were aimed at studying perception in perfectly ambiguous situations. Terao et al. (2015) showed that observers tend to see motion opposite to pursuit in a counterphase grating, where forward and backward interpretations are equally valid. Our paradigm measured the ability to extract coherent signals embedded in noise independently of this bias, because we asked for vertical direction judgments. Hafed and Krauzlis (2006) used an ambiguous multiple-aperture stimulus, where observers viewed two static chevrons through a moving aperture (pursuit condition) or two moving chevrons through a static aperture (fixation condition). Again, enhanced coherence was attributed not to increased discrimination performance but to perceptual priors. Under ambiguous conditions, the perceptual system may assume that the world is stable and attribute retinal motion to one’s own movements (Wexler et al. 2001).
Reflexive Ocular Tracking
In our experiments, oculomotor responses showed a striking, qualitatively different response pattern for global motion in the direction of pursuit compared with global motion opposite to pursuit. At the highest stimulus coherence levels, ocular tracking responses to opposite-direction global motion peaked higher than responses to same-direction global motion or responses during fixation. These results are at odds with studies that have shown smaller reflexive responses to motion opposite to the pursuit eye movement, which led to the idea of a suppression of optokinesis during pursuit eye movements (Lindner et al. 2001; Lindner and Ilg 2006, 2010; Schwarz and Ilg 1999). Our results are also at odds with studies that have shown a symmetric response (Kodaka et al. 2004; Miura et al. 2009; Spering and Gegenfurtner 2007; Suehiro et al. 1999). However, there is a critical difference between our and previous paradigms (aside from the use of higher background and target speeds in previous studies). We used a global motion stimulus in which the carriers moved within each element but the apertures did not change position relative to the target, whereas in previous studies the background moved across space or was composed of a large grating. This difference suggests that changes in motion direction (our paradigm) and position (previous paradigms) can have independent effects.
Because the global motion had a small vertical component in experiment 1, we were able to analyze vertical eye movements in the direction of global motion. This showed a correspondingly small but robust eye movement response during fixation and during pursuit for global motion in the same direction. This contrasted with eye movement responses to motion opposite to pursuit direction, where there was only a tendency to move the eyes opposite to the vertical component. The lack of a significant vertical component suggests that enhancement of eye movement responses to opposite global motion is specific to horizontal responses.
Evidence from our control experiments indicates that ocular responses are not artifacts of the task requirements (top-down control), because they were also observed when the global motion stimulus was to be ignored. Asymmetric responses were shown across a range of target velocities, ruling out response saturation as being responsible for the asymmetry. Furthermore, within this target velocity range, other studies showed no saturation in responses to target (Churchland and Lisberger 2002) or background velocity perturbations (Lindner et al. 2001). Although similar asymmetries were observed across target speeds, the peak response was not always stronger for global motion opposite to pursuit. A further investigation would be needed to understand the effect of target and background velocity. It could be that background motion velocities close to the expected reafferent signal are processed differently.
Suggested Mechanism
We suggest that the asymmetry in motion integration we observed reflects a fundamental asymmetry in the processing of the retinal flow emanating from the target and from the stationary world during smooth pursuit eye movements. When an object is pursued with the eyes, the retinal motion emanating from the object and the retinal motion emanating from the static background have mostly opposite signs: the eye undershoots the target velocity, whereas the stationary background will always move on the retina opposite to and with the same speed as the eye. In laboratory tasks, the eye undershoots the target velocity by ~5%, resulting in a small residual retinal motion in the direction of object motion. The undershoot is even more pronounced during natural viewing, with frequent short bouts of pursuit never quite reaching a steady state (Hayhoe and Ballard 2005).
The retinal flow asymmetry makes it possible for the visual system to integrate reafferent retinal motion (i.e., the retinal motion emanating from the stationary world) and object motion differentially. In both cases, global motion needs to be extracted from ambiguous local motion; both are thus subject to the aperture problem. However, motion is less ambiguous in the reafferent direction, because its global direction can also be determined from extraretinal signals about the eye movement direction. For this reason, it would be more efficient, and indeed sufficient, to sample motion over a smaller proportion of the field when motion is opposite to pursuit. However, a side effect of reduced sampling is that the ability to perceive coherent motion decreases.
Performance in motion coherence tasks can be affected by internal noise, sampling, and the segregation of signal and noise (Dakin et al. 2005). However, across development, improvements in coherence thresholds can be attributed to increases in effective sampling rather than changes in internal noise (Manning et al. 2014). To account for the perceptual asymmetries, we assume that the motion system uses fewer samples to compute global motion (using either IOC or HVA integration rules; Johnston and Scarfe 2013) opposite to pursuit than to compute global motion in the same direction as pursuit. The cost of a relatively small sample becomes less problematic for the global motion computation as coherence increases. Reduced sampling for motion opposite to pursuit can therefore also explain why reflexive eye movement responses increased linearly with coherence for opposite motion but saturated for motion in the direction of pursuit.
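This sampling account can be illustrated with a toy simulation. The sketch below is not a model of IOC or HVA integration; it assumes a much simpler vector-average read-out over a random subsample of local directions, with all numbers (sample sizes, trial counts) chosen purely for illustration:

```matlab
% Toy simulation: direction discrimination from a vector average over
% nSample local motions. Fewer samples mainly hurt at low coherence.
rng(1);
coherences = 0:0.1:1;
nTrials    = 2000;
for nSample = [20 100]                          % few vs. many samples
    pc = zeros(size(coherences));
    for c = 1:numel(coherences)
        nCorrect = 0;
        for t = 1:nTrials
            dirs  = 2*pi*rand(nSample, 1);      % noise: random directions
            isSig = rand(nSample, 1) < coherences(c);
            dirs(isSig) = deg2rad(10);          % signal: +10 deg (upward)
            nCorrect = nCorrect + (mean(sin(dirs)) > 0); % "upward" judgment
        end
        pc(c) = nCorrect / nTrials;
    end
    plot(coherences, pc); hold on
end
xlabel('Coherence'); ylabel('Proportion correct');
legend('20 samples', '100 samples', 'Location', 'southeast');
```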
Remarkably, despite reduced sampling, the visuomotor gain of reflexive eye movements was higher for global motion opposite to pursuit under most conditions. This novel finding may indicate the importance of amplifying reafferent motion signals in eye movement control, in contrast to the idea that motion opposite to pursuit should be suppressed to avoid reflexive optokinetic responses. Image-based estimates of eye velocity may be important in signaling a mismatch between the intended motor plan and its execution (Haarmeier et al. 2001), justifying differential integration of reafferent motion rather than its simple suppression.
In conclusion, we uncovered a new asymmetry in motion computations during smooth pursuit eye movements characterized by an impaired ability to extract coherence in motion signals in the reafferent direction.
GRANTS
This study was partially supported by Swiss National Science Foundation Grant 100014135374 (to D. Kerzel, D. Souto). A. Johnston was supported by Engineering and Physical Sciences Research Council (UK) Grant EP/M026965/1.
DISCLOSURES
No conflicts of interest, financial or otherwise, are declared by the authors.
AUTHOR CONTRIBUTIONS
D.S. and A.J. conceived and designed research; D.S. and J.C. performed experiments; D.S. analyzed data; D.S., J.C., D.K., and A.J. interpreted results of experiments; D.S. prepared figures; D.S. drafted manuscript; D.S., D.K., and A.J. edited and revised manuscript; D.S., J.C., D.K., and A.J. approved final version of manuscript.
ACKNOWLEDGMENTS
We thank Kelly Amâncio for help in running the experiments and members of the Visual Cognition Laboratory at the University of Geneva for their participation.
REFERENCES
- Albrecht DG, Hamilton DB. Striate cortex of monkey and cat: contrast response function. J Neurophysiol 48: 217–237, 1982. doi: 10.1152/jn.1982.48.1.217.
- Amano K, Edwards M, Badcock DR, Nishida S. Adaptive pooling of visual motion signals by the human visual system revealed with a novel multi-element stimulus. J Vis 9: 4, 2009. doi: 10.1167/9.3.4.
- Bahill AT, McDonald JD. Frequency limitations and optimal step size for the two-point central difference derivative algorithm with applications to human eye movement data. IEEE Trans Biomed Eng 30: 191–194, 1983. doi: 10.1109/TBME.1983.325108.
- Brainard DH. The Psychophysics Toolbox. Spat Vis 10: 433–436, 1997. doi: 10.1163/156856897X00357.
- Chukoskie L, Movshon JA. Modulation of visual signals in macaque MT and MST neurons during pursuit eye movement. J Neurophysiol 102: 3225–3233, 2009. doi: 10.1152/jn.90692.2008.
- Churchland AK, Lisberger SG. Gain control in human smooth-pursuit eye movements. J Neurophysiol 87: 2936–2945, 2002. doi: 10.1152/jn.2002.87.6.2936.
- Dakin SC, Mareschal I, Bex PJ. Local and global limitations on direction integration assessed using equivalent noise analysis. Vision Res 45: 3027–3049, 2005. doi: 10.1016/j.visres.2005.07.037.
- Efron B, Tibshirani RJ. An Introduction to the Bootstrap. Boca Raton, FL: CRC, 1993.
- Fründ I, Haenel NV, Wichmann FA. Inference for psychometric functions in the presence of nonstationary behavior. J Vis 11: 16, 2011. doi: 10.1167/11.6.16.
- Gibson JJ, Smith OW, Steinschneider A, Johnson CW. The relative accuracy of visual perception of motion during fixation and pursuit. Am J Psychol 70: 64–68, 1957. doi: 10.2307/1419230.
- Haarmeier T, Bunjes F, Lindner A, Berret E, Thier P. Optimizing visual motion perception during eye movements. Neuron 32: 527–535, 2001. doi: 10.1016/S0896-6273(01)00486-X.
- Hafed ZM, Krauzlis RJ. Ongoing eye movements constrain visual perception. Nat Neurosci 9: 1449–1457, 2006. doi: 10.1038/nn1782.
- Hayhoe M, Ballard D. Eye movements in natural behavior. Trends Cogn Sci 9: 188–194, 2005. doi: 10.1016/j.tics.2005.02.009.
- Johnston A, Scarfe P. The role of the harmonic vector average in motion integration. Front Comput Neurosci 7: 146, 2013. doi: 10.3389/fncom.2013.00146.
- Khawaja FA, Liu LD, Pack CC. Responses of MST neurons to plaid stimuli. J Neurophysiol 110: 63–74, 2013. doi: 10.1152/jn.00338.2012.
- Kleiner M, Brainard DH, Pelli DG. What’s new in Psychtoolbox-3? (Abstract). Perception 36, Suppl 14, 2007.
- Kodaka Y, Miura K, Suehiro K, Takemura A, Kawano K. Ocular tracking of moving targets: effects of perturbing the background. J Neurophysiol 91: 2474–2483, 2004. doi: 10.1152/jn.01079.2003.
- Lindner A, Ilg UJ. Suppression of optokinesis during smooth pursuit eye movements revisited: the role of extra-retinal information. Vision Res 46: 761–767, 2006. doi: 10.1016/j.visres.2005.09.033.
- Lindner A, Ilg UJ. Cancellation of gaze stabilizing mechanisms during human smooth pursuit: indications for the involvement of an extra-retinal reference (Abstract). J Vis 2: 575, 2010. doi: 10.1167/2.7.575.
- Lindner A, Schwarz U, Ilg UJ. Cancellation of self-induced retinal image motion during smooth pursuit eye movements. Vision Res 41: 1685–1694, 2001. doi: 10.1016/S0042-6989(01)00050-5.
- Lorenceau J. Veridical perception of global motion from disparate component motions. Vision Res 38: 1605–1610, 1998. doi: 10.1016/S0042-6989(97)00295-2.
- Majaj NJ, Carandini M, Movshon JA. Motion integration by neurons in macaque MT is local, not global. J Neurosci 27: 366–370, 2007. doi: 10.1523/JNEUROSCI.3183-06.2007.
- Manning C, Dakin SC, Tibber MS, Pellicano E. Averaging, not internal noise, limits the development of coherent motion processing. Dev Cogn Neurosci 10: 44–56, 2014. doi: 10.1016/j.dcn.2014.07.004.
- Masson GS. From 1D to 2D via 3D: dynamics of surface motion segmentation for ocular tracking in primates. J Physiol Paris 98: 35–52, 2004. doi: 10.1016/j.jphysparis.2004.03.017.
- Masson GS, Rybarczyk Y, Castet E, Mestre DR. Temporal dynamics of motion integration for the initiation of tracking eye movements at ultra-short latencies. Vis Neurosci 17: 753–767, 2000. doi: 10.1017/S0952523800175091.
- Mineault PJ, Khawaja FA, Butts DA, Pack CC. Hierarchical processing of complex motion along the primate dorsal visual pathway. Proc Natl Acad Sci U S A 109: E972–E980, 2012. doi: 10.1073/pnas.1115685109.
- Mingolla E, Todd JT, Norman JF. The perception of globally coherent motion. Vision Res 32: 1015–1031, 1992. doi: 10.1016/0042-6989(92)90003-2.
- Miura K, Kobayashi Y, Kawano K. Ocular responses to brief motion of textured backgrounds during smooth pursuit in humans. J Neurophysiol 102: 1736–1747, 2009. doi: 10.1152/jn.00430.2009.
- Newsome WT, Wurtz RH, Dürsteler MR, Mikami A. Deficits in visual motion processing following ibotenic acid lesions of the middle temporal visual area of the macaque monkey. J Neurosci 5: 825–840, 1985. doi: 10.1523/JNEUROSCI.05-03-00825.1985.
- Rust NC, Mante V, Simoncelli EP, Movshon JA. How MT cells analyze the motion of visual patterns. Nat Neurosci 9: 1421–1431, 2006. doi: 10.1038/nn1786.
- Schütz AC, Delipetkos E, Braun DI, Kerzel D, Gegenfurtner KR. Temporal contrast sensitivity during smooth pursuit eye movements. J Vis 7: 3, 2007. doi: 10.1167/7.13.3.
- Schwartz JD, Lisberger SG. Initial tracking conditions modulate the gain of visuo-motor transmission for smooth pursuit eye movements in monkeys. Vis Neurosci 11: 411–424, 1994. doi: 10.1017/S0952523800002352.
- Schwarz U, Ilg UJ. Asymmetry in visual motion processing. Neuroreport 10: 2477–2480, 1999. doi: 10.1097/00001756-199908200-00008.
- Souto D, Kerzel D. Ocular tracking responses to background motion gated by feature-based attention. J Neurophysiol 112: 1074–1081, 2014. doi: 10.1152/jn.00810.2013.
- Spering M, Gegenfurtner KR. Contextual effects on smooth-pursuit eye movements. J Neurophysiol 97: 1353–1367, 2007. doi: 10.1152/jn.01087.2006.
- Suehiro K, Miura K, Kodaka Y, Inoue Y, Takemura A, Kawano K. Effects of smooth pursuit eye movement on ocular responses to sudden background motion in humans. Neurosci Res 35: 329–338, 1999. doi: 10.1016/S0168-0102(99)00098-X.
- Takeuchi T. Effect of contrast on the perception of moving multiple Gabor patterns. Vision Res 38: 3069–3082, 1998. doi: 10.1016/S0042-6989(98)00019-4.
- Terao M, Murakami I, Nishida S. Enhancement of motion perception in the direction opposite to smooth pursuit eye movement. J Vis 15: 2, 2015. doi: 10.1167/15.13.2.
- Wallach H. Über visuell wahrgenommene Bewegungsrichtung. Psychol Forsch 20: 325–380, 1935. doi: 10.1007/BF02409790.
- Wexler M, Panerai F, Lamouret I, Droulez J. Self-motion and the perception of stationary objects. Nature 409: 85–88, 2001. doi: 10.1038/35051081.