Abstract
There is little direct psychophysical evidence that the visual system contains mechanisms tuned to head-centred velocity when observers make a smooth pursuit eye movement. Much of the evidence is implicit, relying on measurements of bias (e.g. matching and nulling). We therefore measured discrimination contours in a space dimensioned by pursuit target motion and relative motion between target and background. Within this space, lines of constant head-centred motion are parallel to the main negative diagonal, so judgements dominated by mechanisms that combine individual components should produce contours with a similar orientation. Conversely, contours oriented parallel to the cardinal axes of the space indicate judgements based on individual components. The results provided evidence for mechanisms tuned to head-centred velocity – discrimination ellipses were significantly oriented away from the cardinal axes, towards the main negative diagonal. However, ellipse orientation was considerably less steep than predicted by a pure combination of components. This suggests observers used a mixture of two strategies across trials, one based on individual components and another based on their sum. We provide a model that simulates this type of behaviour and is able to reproduce the ellipse orientations we found.
Introduction
There is a large literature concerning the ability of the visual system to compensate for the retinal effects of smooth pursuit eye movement. This literature emphasises perceptual errors that arise when the visual system estimates location, speed and direction during tracking eye movements, as well as more complex judgements such as depth and heading. Examples include the misperception of flashed locations during pursuit (Brenner, Smeets, & Van den Berg, 2001; Mitrani, Dimitrov, Yakimoff, & Mateef, 1979), the changes in perceived speed that occur when moving objects are tracked (Dichgans, Wist, Diener, & Brandt, 1975; Sumnall, Freeman, & Snowden, 2003), the illusory motion of stationary backgrounds over which the eye movement is made (Freeman & Sumnall, 2002; Mack & Herman, 1978) and the misperception of object direction when a separate target is pursued (Hansen, 1979; Souman, Hooge, & Wertheim, 2005). One general account of these effects starts with the idea that estimates of retinal position or motion are added to estimates of eye position or velocity to transform images into a head-centred frame (Freeman, 2001; Rotman, Brenner, & Smeets, 2005; Souman & Freeman, 2008; Souman, Hooge, & Wertheim, 2006; Turano & Massof, 2001; Wertheim, 1994). The mistakes exhibited by observers are then either a function of different errors associated with retinal and eye-based inputs or lie within the combination stage itself. An important implication of this general account is that relatively early on in the processing pathway there exist mechanisms tuned to position and velocity in a head-centred frame. Unfortunately, the psychophysical evidence for this is somewhat indirect, largely based on the measurement of bias (e.g. matching and nulling).
Here we make a more direct test of this claim by measuring discrimination contours in a 2D motion space. The discrimination-contour paradigm has been widely used in colour vision to probe the tuning of chromatic processing mechanisms (Gegenfurtner & Hawken, 1995; Noorlander, Heuts, & Koenderink, 1980; Poirson, Wandell, Varner, & Brainard, 1990; Wandell, 1985). This paradigm has been used to explore space-time separability in speed perception (Lappin, Bell, Harm, & Kottas, 1975; Reisbeck & Gegenfurtner, 1999) and bears a close relationship to the investigation of redundant cue combination in the 3D shape literature (e.g. Hillis, Ernst, Banks, & Landy, 2002; Landy, Maloney, Johnston, & Young, 1995). Discrimination contours are measured in a 2D space parameterised by the dimensions of interest (Figure 1). The subsequent orientation of the contours allows inferences to be drawn about the way the dimensions are combined by the observer. As discussed in more detail below, discrimination thresholds are determined in a number of directions away from the ‘standard’ stimulus. If the observer combines dimensions – for instance, combines space and time with a mechanism that yields speed – then the discrimination contour will be oriented with respect to the combination of interest (e.g. red oblique ellipse labelled ‘combination’ in Figure 1). If the observer is unable to combine the dimensions but instead uses the individual components, then the subsequent orientation of the discrimination contour will run parallel to one of the axes of the space (blue ellipse labelled ‘components’ in Figure 1).
Figure 1.
Predicted orientation of discrimination contours for a space dimensioned by pursuit target motion and relative motion. The oblique red ellipse labelled ‘combination’ shows predicted thresholds if observers combine the two dimensions to yield head-centred velocity. The blue ellipse labelled ‘components’ shows predicted thresholds for observers that use individual components only. See text for details.
We used this paradigm to seek direct psychophysical evidence for mechanisms tuned to head-centred velocity. The stimulus consisted of a moving background over which the observer tracked a moving pursuit target. The background’s head-centred velocity is the sum of retinal motion and eye velocity (H = R + E). The most obvious dimensions to use are therefore R and E. However, two recent studies argue that observers rely not on these motion cues but on the relative motion between pursued target and background, and on the motion of the pursued target itself. Using a speed discrimination task, Freeman, Champion, Sumnall & Snowden (2009) showed that observers do not have direct access to retinal motion when making discrimination judgements during pursuit – instead, they rely on the relative motion between pursuit target and background object, even when feedback concerning absolute retinal motion is explicitly provided. In the case of the pursued target, Welchman, Harris & Brenner (2009) showed that observers sum eye-velocity information with retinal-slip information when discriminating the motion-in-depth of a target tracked by a vergence eye movement (we have found similar evidence in unpublished investigations of speed and direction discrimination for pursued stimuli moving in the fronto-parallel plane).
These results suggest that perceived head-centred velocity of background objects is separable into relative motion (Rel) and pursuit target motion (T). We do not mean this to imply that retinal motion and eye velocity are ignored by the observer – rather, retinal motion and eye velocity are used to estimate relative motion and target velocity. Importantly, relative motion by itself does not tell the observer how a background object is moving with respect to the head – it simply informs the observer how two objects are moving with respect to one another. To determine the head-centred velocity of the background stimulus, the observer must add relative motion to velocity of the pursuit target (H = Rel + T). The current paper therefore asks whether the visual system contains relatively low-level mechanisms explicitly tuned to H, or whether it is inferred by some more circuitous route.
Figure 1 shows in more detail how the discrimination-contour paradigm relates to the judgement of head-centred velocity. The figure describes a space spanned by pursuit target motion and relative motion. At any point in this space, we can define a standard stimulus (Ts, Rels) and a test stimulus (Tt, Relt), where Tt = Ts + ΔT and Relt = Rels + ΔRel. The variation in head-centred velocity of the background object is therefore ΔH = ΔT+ ΔRel, such that a line of constant ΔH has slope of −1 (the negative diagonal in Figure 1 defines ΔH=0). Suppose we are able to obtain thresholds for discriminating test from standard for the set of directions θ. The blue ellipse labelled ‘components’ in Figure 1 describes the expected threshold contour if observers base judgements on individual components rather than their combination (the figure assumes that sensitivity to relative motion is greater than pursuit target motion, which is why the major axis of the ‘components’ ellipse is horizontal). Along the cardinal axes, only one motion component conveys any useful information, so in these directions thresholds are limited by one component alone (dotted lines). In all other directions, however, useful information is conveyed by both components, so observers may gain a statistical advantage due to probability summation (e.g. Alais & Burr, 2004).
The red ellipse labelled ‘combination’ in Figure 1 describes the threshold contour expected if observers combine T and Rel to yield H. Ideally, if observers based their judgements on head-centred velocity alone, the threshold contour would consist of two lines parallel to the negative diagonal. In this case, observers would find it particularly difficult to differentiate any pair of stimuli that lie along lines of constant ΔH because these form head-centred ‘metamers’. In practice, however, for relatively extreme values of ΔT and ΔRel, observers are likely to be able to differentiate standard and test on the basis of individual components (see Hillis et al, 2002). Hence, the resulting thresholds will produce a closed contour oriented with respect to the negative diagonal.
Some authors have found that the mapping of retinal motion information onto a head-centred frame is more likely to occur when pursuit and retinal motion are in opposite directions (Brenner & van den Berg, 1994; Morvan & Wexler, 2009; Tong, Aydin, & Bedell, 2007; Tong, Patel, & Bedell, 2006; Turano & Heidenreich, 1996). A possible reason for the asymmetry is that the world is predominantly stationary, so pursuit eye movement is more likely to produce retinal motion in the opposite direction (Tong et al, 2006). We therefore measured discrimination contours for pursuit target and relative motion in ‘same’ and ‘opposite’ directions. In Figure 1, the ‘same’ conditions lie in the upper-right and lower-left quadrants of the depicted space. The ‘opposite’ conditions lie in the lower-right and upper-left quadrants.
Methods
Stimuli
Stimuli were generated using OpenGL and controlled by a Radeon 9800 Pro graphics card. Stimuli were presented on a ViewSonic P225f monitor at a frame rate of 100 Hz and viewed binocularly from 70 cm in a completely darkened room. A red gel placed over the monitor screen helped to eliminate phosphor glow and dot trails. An Eyelink 1000 eye-tracker recorded eye movements at a sampling rate of 1000 Hz.
Stimuli consisted of a circular pursuit target at the centre of a random dot pattern against a black background. The pursuit target had a diameter of 0.2°; the random dot pattern was composed of dots with a diameter of 0.1° at a density of 1 dot/deg². The random dot pattern appeared in an annulus window with inner and outer radii of 1° and 8° respectively. The movement of the window was yoked to the pursuit target. The target, dot pattern and window moved horizontally in all conditions investigated.
The pursuit target was stationary for the first 500 ms of each interval. Its speed was then ramped to the desired value over a mean duration of 250 ms and then continued moving at this value for the rest of the interval (mean 500 ms). Random perturbations of ±50 ms were added to these two time periods. Each interval therefore lasted for a mean duration of 1250 ms. The random dot pattern was presented at the end of the ramp and remained visible until the pursuit target disappeared. The start position of the pursuit target was displaced from the centre of the screen by half the distance of the full sweep plus a random perturbation of ±1°. The perturbations in time and space were designed to encourage judgements of motion not position.
Procedure
To determine thresholds, we used a three-interval forced-choice oddity task, consisting of two standard intervals and one test interval. These were presented in random order on each trial and observers were required to judge which interval was the odd-one-out. In the ‘same’ condition, standard intervals consisted of a pursuit target and random dot stimulus moving in the same direction. Hence (Ts, Rels) = (+4, +4)°/s, with the dot stimulus moving at 8°/s on the screen. In the ‘opposite’ condition, (Ts, Rels) = (+4, −4)°/s. The dot stimulus in this case was always stationary on the screen.
Test stimuli had velocities (Tt, Relt) = (Ts + ΔT, Rels + ΔRel), where the increments ΔT = g·Ts·cosθ and ΔRel = g·Rels·sinθ. The parameter g defines the step size, and θ the direction of the discrimination task in T-Rel space (see Figure 1). Italics denote speed, emphasising that test stimuli could not flip phase through 180° for any given θ. The increments ΔT and ΔRel were controlled by a 3-down 1-up staircase. Along any direction θ, the ratio of ΔT to ΔRel was constant, with the staircase changing the distance between test and standard in steps D = g·√(Ts²cos²θ + Rels²sin²θ). Staircases were terminated after 9 reversals, with the step size before the first reversal set to g = 0.2 and all subsequent step sizes set to g = 0.1. Sixteen directions in T-Rel space were investigated (θ = 0° to 337.5° in increments of 22.5°). Within one experimental session, 4 of these directions were investigated, each assigned one staircase. Staircases were randomly interleaved. In total, three replications of each staircase were completed. Staircases for ‘same’ and ‘opposite’ were blocked, with observers S1-S3 completing ‘same’ blocks first, and S4 and S5 completing ‘opposite’ blocks first.
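A minimal sketch of this 3-down 1-up staircase logic, with the observer’s response supplied by a caller-provided function; the starting distance is our assumption (the paper does not state it):

```python
import math

def staircase(theta_deg, respond, Ts=4.0, Rels=4.0, n_reversals=9):
    """3-down 1-up staircase along direction theta in T-Rel space.

    `respond(dT, dRel)` stands in for the observer and returns True on a
    correct oddity judgement. Starting distance is an assumed value."""
    th = math.radians(theta_deg)
    # Increments keep the fixed ratio dT : dRel = Ts*cos(theta) : Rels*sin(theta)
    ux, uy = Ts * math.cos(th), Rels * math.sin(th)
    base = math.hypot(ux, uy)                 # step length D for g = 1
    d = 5 * 0.2 * base                        # assumed starting distance
    run, reversals, last = 0, 0, 0
    while reversals < n_reversals:
        dT, dRel = d * ux / base, d * uy / base
        if respond(dT, dRel):
            run += 1
            move = -1 if run == 3 else 0      # 3 correct in a row -> harder
            if run == 3:
                run = 0
        else:
            run, move = 0, 1                  # 1 wrong -> easier
        if move:
            if last and move != last:
                reversals += 1                # direction change = reversal
            g = 0.2 if reversals == 0 else 0.1
            d = max(d + move * g * base, 1e-6)
            last = move
    return d                                  # rough threshold proxy
```

In the experiment proper, `respond` would be the observer’s trial-by-trial oddity judgement; any simulated decision rule can be plugged in instead.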
The direction of the standard was alternated on each trial, i.e. (−Ts, −Rels). By definition, this also flips the test, such that both test and standard rotate 180° about the origin in Figure 1. For the ‘same’ condition, trials therefore alternated between the upper-right and lower-left quadrants; for the ‘opposite’ condition, trials alternated between the upper-left and lower-right quadrants. Data were collapsed within these quadrant pairings.
Observers were instructed to maintain fixation on the pursuit target at all times. Following the completion of each trial, observers indicated which interval contained the odd-one-out. No feedback was given.
Eye-movement analysis
Eye-movements were recorded using an Eyelink 1000 eye-tracker, with samples recorded at 1000Hz. Eye-position data were low-pass filtered and a time-derivative taken. A region of interest was defined as the period of time during which the dot pattern was presented. Any saccades occurring within this region of interest were detected using a velocity threshold of 40deg/s and these trials were discarded (mean = 7.4%). The mean eye-velocity was computed over this region of interest and the mean retinal slip (target velocity - average eye-velocity) was calculated for each interval. Retinal slip estimates were then averaged across intervals to summarise each observer’s pursuit accuracy for a given condition.
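The analysis pipeline can be sketched as follows. The moving-average filter and its 20 ms width are our assumptions – the paper specifies only that the position trace was low-pass filtered before differentiation:

```python
import numpy as np

def analyse_trial(eye_pos_deg, target_vel, fs=1000, sacc_thresh=40.0,
                  smooth_ms=20):
    """Low-pass filter the eye-position trace, differentiate it, discard the
    trial if any sample exceeds the saccade velocity threshold, otherwise
    return the mean retinal slip of the pursuit target.

    Returns (discard, slip): slip is None when the trial is discarded."""
    k = max(int(smooth_ms * fs / 1000), 1)
    smoothed = np.convolve(eye_pos_deg, np.ones(k) / k, mode='valid')
    vel = np.diff(smoothed) * fs                    # deg/s
    if np.any(np.abs(vel) > sacc_thresh):
        return True, None                           # saccade: discard trial
    slip = target_vel - vel.mean()                  # retinal slip of target
    return False, slip
```

In practice this would be applied only to the region of interest (the period during which the dot pattern was visible), and slip estimates averaged across intervals and trials.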
Psychophysical analysis
Error rates were plotted as a function of D (as defined above), which corresponds to distance along a direction θ. Error rates from each direction condition θ were concatenated with those from the condition θ + 180° (for the latter, D was negated). A Gaussian was then fit to the data by maximum-likelihood estimation, with standard deviation and a lapse-rate parameter free to vary and mean fixed at D = 0. Lapse rate was constrained to be less than 6% (Wichmann & Hill, 2001). Threshold values were defined as the standard deviation of the Gaussian. We also estimated 95% confidence limits by bootstrapping 999 thresholds. In some cases the lower and upper confidence intervals are unequal because the distribution of bootstrapped standard deviations was sometimes asymmetric. Trials in which a saccade was detected were excluded from the analysis.
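A sketch of the fitting procedure, using a grid search in place of a general-purpose optimiser. The exact functional form is our assumption: we let the error rate fall from chance (2/3 for a three-interval oddity task) at D = 0 to the lapse rate at large |D|:

```python
import numpy as np

def fit_threshold(D, n_err, n_tot):
    """Grid-search maximum-likelihood fit of a Gaussian-shaped error-rate
    function: p_err(D) = lam + (2/3 - lam) * exp(-D^2 / (2*sd^2)).
    The mean is fixed at D = 0, the lapse rate lam is capped at 6%, and
    the threshold is the fitted standard deviation sd."""
    best_sd, best_nll = None, np.inf
    for sd in np.linspace(0.1, 10.0, 200):
        for lam in np.linspace(0.0, 0.06, 7):        # lapse constrained < 6%
            p = lam + (2 / 3 - lam) * np.exp(-D**2 / (2 * sd**2))
            p = np.clip(p, 1e-9, 1 - 1e-9)
            # binomial negative log-likelihood of the observed error counts
            nll = -np.sum(n_err * np.log(p) + (n_tot - n_err) * np.log(1 - p))
            if nll < best_nll:
                best_nll, best_sd = nll, sd
    return best_sd
```

Bootstrapped confidence limits would follow by resampling trials and refitting 999 times.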
Following Noorlander et al (1980), Poirson et al (1990), Wandell (1985), Reisbeck & Gegenfurtner (1999) and others, ellipses were fit to the data to quantify the orientation of the discrimination contour. To do this we used an iterative method that minimised the geometric distance between data points and curve (Brown, 2007). Orientation was defined as the angle of the major axis swept out anti-clockwise from the right-hand horizontal (for example, the ellipse labelled ‘combination’ in Figure 1 has an orientation of 135°).
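As a simple stand-in for the iterative least-squares ellipse fit of Brown (2007), the contour orientation can be approximated by the principal axis of the threshold points:

```python
import numpy as np

def contour_orientation(points):
    """Orientation of a discrimination contour, estimated as the principal
    axis (PCA) of the threshold points. This is a proxy for a full ellipse
    fit. Returns the angle of the major axis, anti-clockwise from the
    right-hand horizontal, in [0, 180) degrees."""
    pts = np.asarray(points, dtype=float)
    pts = pts - pts.mean(axis=0)                 # centre on the standard
    evals, evecs = np.linalg.eigh(pts.T @ pts)   # eigenvalues in ascending order
    major = evecs[:, -1]                         # major-axis direction
    return np.degrees(np.arctan2(major[1], major[0])) % 180.0
```

For example, the ‘combination’ ellipse in Figure 1 would return an orientation near 135°.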
Observers
Five observers took part in the experiment, including the two authors (S1 = RAC, S2 = TCAF) and three naïve observers (S3-S5). All were experienced observers and wore their appropriate optical correction.
Results
Figure 2 shows the pattern of thresholds obtained for each of the five observers in the ‘same’ condition (top row) and ‘opposite’ condition (bottom row). The open points show thresholds with confidence intervals > 20°/s (in these cases the confidence intervals have not been plotted for clarity). The red curves show the best-fitting ellipses to the closed symbols. Note that the scales for naïve observers S4 and S5 are larger than S1-S3, indicating they were generally less sensitive to the speed differences displayed.
Figure 2.
Discrimination contours for ‘same’ (top row) and ‘opposite’ conditions (bottom row). Each column shows a different observer’s data. Open symbols correspond to thresholds. Error bars correspond to 95% confidence intervals (error bars > 20°/s are not plotted for clarity). The red-curves show best-fitting ellipses to the remaining data (closed symbols). Note that the axes for S4 and S5 have different scales to S1-S3.
The results show that ellipses are oriented close to the cardinal axes in each case, suggesting that observers based their judgements on individual motion cues. However, in all cases, except for S3 in the ‘same’ condition, there is a small but consistent deviation of the ellipse orientation away from cardinal, towards the negative oblique. This was confirmed statistically. In both ‘same’ and ‘opposite’ conditions, orientations were significantly different from cardinals (mean difference from cardinals: ‘same’ = 15.7° (SE=6.0°), t(4)=2.61, p<.05, one-tailed; ‘opposite’ = 17.0° (SE=4.4°), t(4)=3.81, p<.01, one-tailed). Hence our results lie somewhere in-between the predictions based on the use of individual cues (ellipses oriented along cardinal) and the prediction based on the use of head-centred cues (oriented along negative diagonal). Such a pattern of results suggests that discrimination was based on a mixture of two strategies: on some trials observers combined components and based their judgements on head-centred motion, whereas on other trials they used the individual components. Below we present a model that simulates such a strategy and is able to produce ellipse orientations like those found here.
Figure 2 also suggests there is little difference in ellipse orientation for the ‘same’ and ‘opposite’ conditions. Again this was confirmed statistically (mean orientation ‘same’ =158.0° (SE=10.9°), ‘opposite’ =157.6° (SE=9.4°), t(4)=0.004, p=.95, two-tailed). Hence, the lack of asymmetry between ‘same’ and ‘opposite’ conditions contrasts with work in other areas that report an anisotropy (e.g. motion smear: Tong et al, 2006; though see Morvan & Wexler, 2009). We note that we have previously failed to find this asymmetry in analogous experiments on retinal speed discrimination during pursuit and have discussed this finding in more detail elsewhere (Freeman et al, 2009).
The results of the eye-movement analysis are shown in Figure 3. The top panel shows the mean retinal slip of the target (target velocity − eye velocity) across observers as a function of the direction θ. For the ‘opposite’ condition, eye movements were quite accurate – average retinal slip was close to 0. For the ‘same’ condition, pursuit tended to be faster than required. The influence of the direction of background motion on pursuit is well documented and explains the differences found here (Lindner & Ilg, 2006; Spering, Gegenfurtner, & Kerzel, 2006; Yee, Daniels, Jones, Baloh, & Honrubia, 1983). Closer inspection also revealed an influence of interval order. The lower panel of Figure 3 shows that eye speed decreased from intervals one to three in the ‘opposite’ condition, whereas eye speed remained more or less the same in the ‘same’ condition. These differences may reflect an influence of background motion on pursuit interacting with certain judgement strategies based on appearance. For instance, if the first two intervals appeared the same, observers could have decided the final interval was the odd one out before it was displayed. In this case, the final interval could be ignored, perhaps leading to lower eye speeds.
Figure 3.
Pursuit accuracy (retinal slip of pursuit target) averaged across the five observers for ‘same’ (closed symbols) and ‘opposite’ conditions (open symbols). The top panel shows accuracy as a function of direction θ (see Figure 1 for definition). The bottom panel shows accuracy as a function of interval order. Error bars represent ±1 SE.
To reiterate, the main results in Figure 2 suggest that observers used a mixture of two strategies across trials to make their judgements. Hence, we constructed the following model to investigate whether a mixture of ‘individual-components’ and ‘combination’ strategies could produce ellipses oriented between the cardinal axes and the negative diagonal.
Model
In the model, each interval was defined by three motion signals: Reli, Ti and Hi, where Hi = Reli + Ti and i = interval 1, 2 or 3. Reli and Ti were corrupted by independent Gaussian noise with a mean of 0 and standard deviations σRel and σT respectively. The precision of Hi was therefore fully determined by noise at the input stage. For the ‘combination strategy’, the odd interval was taken as the Hi that was most different from the mean of the other two intervals. For the ‘individual-components strategy’, an odd interval was identified separately for Reli and Ti, using the method described for Hi. This potentially yields two different candidate odd intervals on each trial, one determined by Reli and one determined by Ti. In these cases, the signal corresponding to the ‘most different’ odd-one-out was chosen.
We used a parameter ‘k’ to determine the probability on each trial of using the ‘combination strategy’ or ‘individual-components strategy’. The parameter k therefore set the weighting or mixture between strategies. For instance, with k=0, the model’s choice was determined entirely by the individual-components strategy. Conversely, with k=1, the model’s choice was determined entirely by the combination strategy. With k=0.5, the probability of using either strategy was the same on each trial. Hence in this case, the resulting threshold comprised a mixture of judgements based on combination and individual-component strategies. To determine discrimination-ellipse orientation as a function of k, a series of simulations was run for a range of directions θ through Rel-T space. Each simulation sampled the underlying psychometric function by running 10,000 trials at 7 equally-spaced steps along the given direction θ. Step size ‘g’ was set to 0.5 and values of ΔRel and ΔT were calculated as described in the Methods section. Thresholds and ellipse orientation were then derived using the fitting procedures also described in the Methods.
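The decision stages of the model can be sketched as follows; the noise levels shown are illustrative, not values fitted to the observers:

```python
import numpy as np

rng = np.random.default_rng(1)

def oddity(vals):
    """Return (index, size) of the value most different from the mean of
    the other two values."""
    diffs = [abs(vals[i] - (vals.sum() - vals[i]) / 2) for i in range(3)]
    return int(np.argmax(diffs)), max(diffs)

def simulate_trial(dT, dRel, k, sd_T=1.0, sd_Rel=0.5, Ts=4.0, Rels=4.0):
    """One trial of the mixture model. The third interval (index 2) is the
    test; interval order does not affect the statistics, so it is not
    shuffled here. Returns True if the model picks the test interval."""
    T = np.array([Ts, Ts, Ts + dT]) + rng.normal(0, sd_T, 3)
    Rel = np.array([Rels, Rels, Rels + dRel]) + rng.normal(0, sd_Rel, 3)
    if rng.random() < k:                    # combination strategy: H = Rel + T
        choice, _ = oddity(T + Rel)
    else:                                   # individual-components strategy:
        iT, mT = oddity(T)                  # odd interval per signal, then the
        iR, mR = oddity(Rel)                # 'most different' signal wins
        choice = iT if mT >= mR else iR
    return choice == 2
```

Sweeping θ, collecting simulated error rates at each step along that direction, and then fitting thresholds and ellipses as in the Methods would yield simulated discrimination contours analogous to those in Figure 4.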
Figure 4A shows the simulated ellipse orientations as a function of the weighting k between strategies. We investigated five different levels of relative noise between Rel and T, corresponding to the five lines in the figure (dashed lines represent the same ratio σRel : σT as the solid lines, but with a factor of 10 decrease in noise). Figure 4B provides examples of the ellipses returned by the model at three of these levels of relative noise. We did not investigate a full range of noise values – the simulations shown in Figure 4 are simply meant to demonstrate ‘proof of principle’. In Figure 4A, the red lines show the results for σRel < σT and the green lines σRel > σT (corresponding to the upper and lower rows, respectively, in Figure 4B). When k = 0, the model always uses the individual-components strategy and so the ellipse’s major axis is oriented parallel to the least precise motion cue (see first column of Figure 4B for examples). When k = 1, the model makes choices based on the combination strategy and so the thresholds lie parallel to the oblique. The ‘ellipse’ in this case is not closed (see Figure 4B, end column). Between these values of k, the orientation of the ellipse rotates away from the cardinal axis towards the oblique. The mean deviation from cardinal across observers was 16.4°, suggesting that they used the combination strategy 10–20% of the time (assuming the noises present in our observers are within the range used in the simulations).
Figure 4.
Model results. (A) Simulated ellipse orientation as a function of k for five levels of σRel and σT. Note: when σRel = σT (blue line) ellipse orientation is undefined for k=0. (B) Example ellipses returned by the model at four values of k and three different levels of σRel and σT. Each row illustrates one of three cases of: σRel < σT (top row) , σRel = σT (middle row) and σRel > σT (bottom row).
The blue line in Figure 4A shows the results for σRel = σT. Using individual components in this case (i.e. k = 0) produces a circle because the underlying input signals are equally precise. Hence orientation is undefined at this value of k. As k increases, the circle becomes stretched along the negative oblique (as shown in Figure 4B). Thus, for cases where input noises are equal, the defining feature of mixing strategies is a change in shape but not orientation.
The simulations demonstrate that mixing strategies across trials produces ellipses that rotate away from the cardinal axes towards the negative oblique, so long as the input noises are unequal. The model suggests that the judgements of all observers we tested were dominated by the use of individual components, but that they also switched to a combination strategy on a smaller proportion of trials. Because ellipses were stretched along the T axis for most observers (S1-S4), the motion signals associated with the pursuit target are less precise than those associated with the relative motion of the background.
Discussion
The data presented here provide a more direct test of the idea that the visual system contains mechanisms tuned to head-centred velocity, one that avoids the more implicit inferences drawn from the measurement of bias. We measured discrimination contours in a space dimensioned by relative motion and target motion to investigate whether observers had independent access to these two motion components or their head-centred sum. Within this space, lines of constant head-centred motion are parallel to the main negative diagonal, so judgements dominated by mechanisms that combine individual components should produce contours with a similar orientation. Conversely, contours oriented along the cardinal axes of the space indicate judgements based on individual components. The results provided evidence for mechanisms tuned to head-centred velocity – discrimination ellipses were oriented away from the cardinal axes and directed toward the main negative diagonal. However, ellipse orientation was closer to the cardinal axes than predicted by a strict head-centred combination of relative motion and pursuit target motion. We proposed that this pattern of data can be explained by observers switching from trial to trial between judgements based on head-centred velocity and individual components, with judgements dominated by the latter. Numerical simulation supported this idea, demonstrating that a discrimination ellipse can be oriented between the negative diagonal and the cardinal axes when observers switch between strategies. The model also showed that this type of switching behaviour is most easily seen when the internal noises associated with each dimension are unequal. Lastly, the model showed that ellipses are oriented toward the axis that represents the less precise signal. For most of our observers, ellipses were oriented towards the horizontal axis, indicating that estimates of target motion are more variable than estimates of relative motion.
The difference in signal precision suggested by our data agrees with recent findings from Freeman, Champion & Warren (2010). Using a more standard 2AFC speed discrimination task, they found that pursued stimuli were more difficult to discriminate than fixated stimuli. It is tempting to suggest that this result is caused by differences in the reliability of retinal and extra-retinal motion signals. However, as we have argued here, some care needs to be taken with this conclusion, primarily because moving targets are rarely pursued accurately (especially in older observers: Kolarik, Margrain & Freeman, 2010). The residual retinal slip is therefore a viable cue to pursuit target motion, so higher thresholds associated with pursuit target motion may reflect less reliable retinal motion signals associated with retinal slip. Alternatively, observers may combine retinal-slip information with extra-retinal estimates of eye velocity. This latter strategy seems more likely because, as Welchman et al (2009) have shown, psychometric functions are better described by head-centred motion than by either retinal slip or eye velocity on their own. Further support comes from Krukowski, Pirog, Beutter, Brooks & Stone (2003), who examined direction discrimination for a single moving point viewed with and without pursuit. They found that direction discrimination was unaffected by the ratio of eye velocity to retinal slip. This led them to suggest that the limiting noise for direction discrimination occurred at the combination stage, a claim bolstered by the fact that they found similar thresholds with and without pursuit, including a similar size of oblique effect in the two conditions. Of course, it is difficult to differentiate between noise at the inputs to the combination stage and noise at the combination stage itself, especially as Krukowski et al only investigated a single speed.
Nevertheless, their evidence, when combined with ours, suggests that the combination stage limits direction discrimination more so than speed discrimination. If so, then measuring discrimination contours in a space spanned by the direction of relative motion and pursuit target motion may provide clearer evidence of mechanisms tuned to head-centred motion.
Acknowledgements
The work was supported by the Wellcome Trust. The authors would like to thank two anonymous reviewers for their comments and suggested improvements.
Contributor Information
Rebecca A. Champion, School of Psychology, Cardiff University, UK
Tom C.A. Freeman, School of Psychology, Cardiff University, UK
References
- Alais D, Burr D. No direction-specific bimodal facilitation for audiovisual motion detection. Cognitive Brain Research. 2004;19(19):185–194. doi: 10.1016/j.cogbrainres.2003.11.011. [DOI] [PubMed] [Google Scholar]
- Brenner E, Smeets JBJ, Van den Berg AV. Smooth eye movements and spatial localisation. Vision Research. 2001;41:2253. doi: 10.1016/s0042-6989(01)00018-9. [DOI] [PubMed] [Google Scholar]
- Brenner E, van den Berg AV. Judging object velocity during smooth-pursuit eye-movements. Experimental Brain Research. 1994;99(2):316–324. doi: 10.1007/BF00239598. [DOI] [PubMed] [Google Scholar]
- Brown R. FITELLIPSE : Least squares ellipse fitting demonstration. 2007 http://www.mathworks.com/matlabcentral/fx_files/15125/1/content/demo/html/ellipsedemo.html.
- Dichgans J, Wist E, Diener HC, Brandt T. The Aubert-Fleischl phenomenon: a temporal frequency effect on perceived velocity in afferent motion perception. Experimental Brain Research. 1975;23:529–533. doi: 10.1007/BF00234920. [DOI] [PubMed] [Google Scholar]
- Freeman TCA. Transducer models of head-centred motion perception. Vision Research. 2001;41:2741–2755. doi: 10.1016/s0042-6989(01)00159-6. [DOI] [PubMed] [Google Scholar]
- Freeman TCA, Champion RA, Sumnall JH, Snowden RJ. Do we have direct access to retinal image motion during smooth pursuit eye movements? Journal of Vision. 2009;9(1):33, 1–11. doi: 10.1167/9.1.33. [DOI] [PubMed] [Google Scholar]
- Freeman TCA, Champion RA, Warren PA. Bayesian analysis of perceived speed during smooth eye pursuit. Current Biology. 2010;20:757–762. doi: 10.1016/j.cub.2010.02.059. [DOI] [PMC free article] [PubMed] [Google Scholar]
- Freeman TCA, Sumnall JH. Motion versus position in the perception of head-centred movement. Perception. 2002;31:603–615. doi: 10.1068/p3256. [DOI] [PubMed] [Google Scholar]
- Gegenfurtner KR, Hawken MJ. Temporal and chromatic properties of motion mechanisms. Vision Research. 1995;35(11):1547–1563. doi: 10.1016/0042-6989(94)00264-m. [DOI] [PubMed] [Google Scholar]
- Hansen RM. Spatial localization during pursuit eye movements. Vision Research. 1979 doi: 10.1016/0042-6989(79)90186-x. [DOI] [PubMed] [Google Scholar]
- Hillis JM, Ernst MO, Banks MS, Landy MS. Combining sensory information: Mandatory fusion within but not between senses. Science. 2002;298:1627–1630. doi: 10.1126/science.1075396. [DOI] [PubMed] [Google Scholar]
- Kolarik AJ, Margrain TH, Freeman TCA. Precision and accuracy of ocular following: influence of age and type of eye movement. Experimental Brain Research. 2010;201:271–282. doi: 10.1007/s00221-009-2036-6. [DOI] [PubMed] [Google Scholar]
- Landy MS, Maloney LT, Johnston EB, Young M. Measurement and modelling of depth cue combination: in defense of weak fusion. Vision Research. 1995;35(3):389–412. doi: 10.1016/0042-6989(94)00176-m. [DOI] [PubMed] [Google Scholar]
- Lappin JS, Bell HH, Harm OJ, Kottas B. On the relation between time and space in the visual discrimination of velocity. Journal of Experimental Psychology: Human Perception and Performance. 1975;1(4):383–394. doi: 10.1037//0096-1523.1.4.383. [DOI] [PubMed] [Google Scholar]
- Lindner A, Ilg UJ. Suppression of optokinesis during smooth pursuit eye movements revisited: The role of extra-retinal information. Vision Research. 2006;46:761–767. doi: 10.1016/j.visres.2005.09.033. [DOI] [PubMed] [Google Scholar]
- Mack A, Herman E. The loss of position constancy during pursuit eye movements. Vision Research. 1978;18:55–62. doi: 10.1016/0042-6989(78)90077-9. [DOI] [PubMed] [Google Scholar]
- Mitrani L, Dimitrov G, Yakimoff G, Mateef S. Oculomotor and perceptual localization during smooth pursuit eye movements. Vision Research. 1979;19:609–612. doi: 10.1016/0042-6989(79)90148-2. [DOI] [PubMed] [Google Scholar]
- Morvan C, Wexler M. The nonlinear structure of motion perception during smooth eye movements. Journal of Vision. 2009;9(7):1, 1–13. doi: 10.1167/9.7.1. [DOI] [PubMed] [Google Scholar]
- Noorlander C, Heuts MJG, Koenderink JJ. Influence of the target size on the detection threshold for luminance and chromaticity contrast. Journal of the Optical Society of America. 1980;70(9):1116–1121. doi: 10.1364/josa.70.001116. [DOI] [PubMed] [Google Scholar]
- Poirson AB, Wandell BA, Varner DC, Brainard DH. Surface characterizations of color thresholds. Journal of the Optical Society of America A. 1990;7(4):783–789. doi: 10.1364/josaa.7.000783. [DOI] [PubMed] [Google Scholar]
- Reisbeck TE, Gegenfurtner KR. Velocity tuned mechanisms in human motion processing. Vision Research. 1999;39:3267–3285. doi: 10.1016/s0042-6989(99)00017-6. [DOI] [PubMed] [Google Scholar]
- Rotman G, Brenner E, Smeets JBJ. Flashes are localised as if they were moving with the eyes. Vision Research. 2005;45:355–364. doi: 10.1016/j.visres.2004.08.014. [DOI] [PubMed] [Google Scholar]
- Souman JL, Freeman TCA. Motion perception during sinusoidal smooth pursuit eye movements: Signal latencies and non-linearities. Journal of Vision. 2008;8(14):10, 1–14. doi: 10.1167/8.14.10. [DOI] [PubMed] [Google Scholar]
- Souman JL, Hooge ITC, Wertheim AH. Vertical object motion during horizontal ocular pursuit: compensation for eye movements increases with presentation duration. Vision Research. 2005;45:845–853. doi: 10.1016/j.visres.2004.10.010. [DOI] [PubMed] [Google Scholar]
- Souman JL, Hooge ITC, Wertheim AH. Frame of reference transformations in motion perception during smooth pursuit eye movement. Journal of Computational Neuroscience. 2006;20(1):61–76. doi: 10.1007/s10827-006-5216-4. [DOI] [PubMed] [Google Scholar]
- Spering M, Gegenfurtner KR, Kerzel D. Distractor interference during smooth pursuit eye movements. Journal of Experimental Psychology: Human Perception and Performance. 2006;32(5):1136–1154. doi: 10.1037/0096-1523.32.5.1136. [DOI] [PubMed] [Google Scholar]
- Sumnall JH, Freeman TCA, Snowden RJ. Optokinetic potential and the perception of head-centred speed. Vision Research. 2003;43:1709–1718. doi: 10.1016/s0042-6989(03)00254-2. [DOI] [PubMed] [Google Scholar]
- Tong J, Aydin M, Bedell HE. Direction and extent of perceived motion smear during pursuit eye movement. Vision Research. 2007;47:1011–1019. doi: 10.1016/j.visres.2006.12.002. [DOI] [PubMed] [Google Scholar]
- Tong J, Patel SS, Bedell HE. The attenuation of perceived motion smear during combined eye and head movements. Vision Research. 2006;46:4387–4397. doi: 10.1016/j.visres.2006.08.034. [DOI] [PMC free article] [PubMed] [Google Scholar]
- Turano KA, Heidenreich SM. Speed discrimination of distal motion during smooth pursuit eye motion. Vision Research. 1996;36(21):3507–3517. doi: 10.1016/0042-6989(96)00071-5. [DOI] [PubMed] [Google Scholar]
- Turano KA, Massof RW. Nonlinear contribution of eye velocity to motion perception. Vision Research. 2001;41:385–395. doi: 10.1016/s0042-6989(00)00255-8. [DOI] [PubMed] [Google Scholar]
- Wandell BA. Colour measurement and discrimination. Journal of the Optical Society of America A. 1985;2(1):62–71. doi: 10.1364/josaa.2.000062. [DOI] [PubMed] [Google Scholar]
- Welchman AE, Harris JM, Brenner E. Extra-retinal signals support the estimation of 3D motion. Vision Research. 2009;49:782–789. doi: 10.1016/j.visres.2009.02.014. [DOI] [PubMed] [Google Scholar]
- Wertheim AH. Motion perception during self-motion - the direct versus inferential controversy revisited. Behavioral and Brain Sciences. 1994;17(2):293–311. [Google Scholar]
- Wichmann FA, Hill NJ. The psychometric function: I. Fitting, sampling, and goodness of fit. Perception & Psychophysics. 2001;63(8):1293. doi: 10.3758/bf03194544. [DOI] [PubMed] [Google Scholar]
- Yee RD, Daniels SA, Jones OW, Baloh RW, Honrubia V. Effects of an optokinetic background on pursuit eye-movements. Investigative Ophthalmology & Visual Science. 1983;24(8):1115–1122. [PubMed] [Google Scholar]