Our visual environment is highly dynamic and marked by continuous change. The ability to see moving objects and to interact with them is a fundamental visual skill critical for survival. Many animals rely on visual motion to capture prey or to avoid predators. In humans, visual motion perception is at the heart of daily activities such as driving a car. Since the mid-19th century, a wealth of research has helped characterize humans’ behavioral responses to visual motion and elucidated the workings of the brain when viewing moving objects. Psychologists and neuroscientists have developed experimental paradigms to probe responses to visual motion, assessed with a large portfolio of techniques ranging from behavioral (psychophysical) tests and neuronal recordings to functional magnetic resonance imaging and optogenetics. However, many of these studies used simple visual stimuli such as single dots, multiple moving dots (random dot patterns), or textures with different motion components (e.g., sine-wave gratings, Gabor patterns, and plaids). Without doubt, such synthetic stimuli are powerful in terms of affording experimenter control and limiting observed behavior to the dimension under study. However, they do not capture the dynamics and richness of our visual environment. The PNAS paper by Knöll et al. (1) introduces a highly original paradigm that allows us to assess spatiotemporal integration of visual motion information using eye movements, a continuous natural response.
The authors tested three primate species (humans, macaques, and marmosets), who viewed a large display of continuously moving dots, forming an optic flow field that occupied most of the observers’ visual field. These dots moved toward or away from one point in the field [termed the focus of expansion (FOE)]. Dot velocity increased with distance from the FOE. Even though the dot field is a reduced and simplified version of what we experience in the real environment, the combination of velocity profiles across the dot field creates a realistic percept of expanding or contracting motion, similar to what we perceive as we move toward or away from an object in the real world (Fig. 1). The authors discovered that this stimulus generated naturalistic oculomotor tracking of the shifting FOE with a combination of saccades (rapid displacements of the eyes that recenter gaze on objects of interest) and periods of fixation on the FOE. This gaze behavior is akin to reflexively directing gaze at approaching objects or naturally looking at where one is heading during navigation (2, 3). Knöll et al. (1) further show that random perturbations in the flow field reliably produced the same eye movement patterns across repetitions and across species.

These results are interesting and important. However, the main significance of Knöll et al.’s contribution is the introduction of a creative paradigm that has the potential for broad applications. The authors utilized a naturally occurring oculomotor response as a continuous behavioral readout of whether an observer perceived visual motion and detected the FOE. Their paradigm thus capitalizes on the close relation between visual motion perception and eye movements in terms of anatomical substrates and sensitivity to direction, speed, and other motion-related stimulus features (4, 5). It also builds on the rich optic flow literature, which investigates higher-level visual motion processing for navigation, locomotion, target tracking, and crowd behavior (6, 7).
Fig. 1.
Illustration of the optic flow field stimulus used in the paper by Knöll et al. (1). This stimulus is a synthetic and highly simplified representation of visual motion signals. However, we constantly encounter optic flow and shifting FOEs in our natural environment. (A) Imagine walking down a pedestrian boulevard. Forward motion toward a straight-ahead target (e.g., the flag pole in the distance) creates an optic flow field in which peripheral visual information dynamically expands (indicated by arrows). (B) As pedestrians or cyclists quickly approach, our FOE (black and white ring) might shift because we instinctively direct gaze at approaching objects or other areas of interest. Knöll et al.’s paradigm mimics such a natural environment and measures a continuous behavioral output by taking advantage of our reflexive tendency to redirect gaze at salient visual information.
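To make the geometry of such a stimulus concrete, the sketch below generates a minimal expanding dot field whose speed grows with distance from the FOE, as described above. This is an illustration only, not the stimulus code used by Knöll et al. (1); the frame-wise speed gain, dot count, display coordinates, and respawn rule are arbitrary assumptions chosen for clarity.

```python
import numpy as np

def update_dots(dots, foe, gain=0.02, field_size=1.0, rng=None):
    """Advance an expanding optic-flow field by one frame.

    dots : (N, 2) array of dot positions in display coordinates [0, field_size).
    foe  : (2,) position of the focus of expansion.
    gain : fraction of the dot-to-FOE distance traveled per frame, so that
           dot speed grows linearly with distance from the FOE
           (arbitrary value for illustration; negative gain gives contraction).
    """
    rng = np.random.default_rng() if rng is None else rng
    offset = dots - foe                      # vector from the FOE to each dot
    dots = dots + gain * offset              # radial, distance-scaled step
    # Dots that leave the display are respawned near the FOE,
    # keeping dot density roughly constant over time.
    out = np.any(np.abs(dots - 0.5 * field_size) > 0.5 * field_size, axis=1)
    dots[out] = foe + 0.05 * field_size * (rng.random((out.sum(), 2)) - 0.5)
    return dots

# Example: 200 dots drifting away from an FOE at the display center.
rng = np.random.default_rng(0)
dots = rng.random((200, 2))                  # positions in [0, 1) x [0, 1)
foe = np.array([0.5, 0.5])
for _ in range(100):                         # simulate 100 frames
    dots = update_dots(dots, foe, rng=rng)
```

Moving `foe` over time in such a sketch would correspond to the shifting FOE that observers in the study tracked with their gaze.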
Importantly, little training was required for all three species to track the FOE with their eyes in this paradigm. Observers tracked the FOE from their first exposure to the stimulus, and tracking performance stabilized within only 100 min. Even though Knöll et al. (1) tested only a small number of primate observers, results are strikingly similar across species, notwithstanding differences in cognitive ability and in the capacity to follow instructions among humans, macaques, and marmoset monkeys. Marmosets have only recently been introduced as a nonhuman primate model in the visual neurosciences. The marmoset model affords many important advantages complementary to other animal models. Unlike rodents, marmosets show neural structures and complex behaviors that are relatively comparable to those of humans, and unlike macaque monkeys, they reproduce quickly, thus offering opportunities for studying visual development and the genetic basis of visual behavior (8). However, compared with macaques, marmosets commonly perform only up to half of the trials in a standard experimental setting. Thus, there is a need for stimulus material suited to studies that permit only basic instruction and training and must be completed within a limited amount of time.
The dynamic optic flow tracking paradigm addresses this need. Because both the stimulus motion and the oculomotor response are continuous, a single trial with a duration of 30 s can yield a large amount of data, and results from short sessions were similar to those obtained in longer ones. By contrast, standard psychophysical testing typically shows a stimulus briefly and requires a binary response in each trial (e.g., a button-press judgment of leftward or rightward motion); a vast number of trials is often required for each stimulus parameter to yield meaningful and reliable performance measures. Knöll et al.’s (1) paradigm offers the opportunity to reliably measure continuous behavioral outputs in response to multiple stimulus conditions within one experimental session. It can also be used to combine behavioral responses with measurements of continuous neural activity to further our understanding of the neural substrates underlying 3D motion processing (9) in an experimental context that mimics the statistical structure and dynamics of the natural environment (10).
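As one concrete example of how such continuous traces can be exploited, the sketch below estimates a tracking lag by cross-correlating a simulated horizontal eye-position trace with the horizontal FOE trajectory. This is not the analysis used by Knöll et al. (1); it is merely a simple illustration of the kind of sample-by-sample, trial-free comparison that a continuous readout permits. The sampling rate, noise level, and 150-ms lag in the example are hypothetical.

```python
import numpy as np

def tracking_lag(eye_x, foe_x, fs=1000.0, max_lag_s=1.0):
    """Estimate the delay (in s) at which the eye trace best correlates
    with the FOE trajectory, together with the peak correlation.

    eye_x, foe_x : 1-D arrays sampled at a common rate fs (Hz).
    max_lag_s    : largest lag considered, in seconds.
    """
    eye = eye_x - np.mean(eye_x)
    foe = foe_x - np.mean(foe_x)
    max_lag = int(max_lag_s * fs)
    lags = np.arange(0, max_lag + 1)
    # Correlate the eye trace against increasingly delayed copies of the
    # stimulus trajectory; the peak marks the tracking lag.
    corrs = [np.corrcoef(eye[lag:], foe[:len(foe) - lag])[0, 1] for lag in lags]
    best = int(np.argmax(corrs))
    return lags[best] / fs, corrs[best]

# Hypothetical example: a noisy eye trace that follows the FOE with a 150-ms lag.
fs = 1000.0
t = np.arange(0, 30, 1 / fs)                 # one 30-s "trial"
foe_x = np.cumsum(np.random.default_rng(1).standard_normal(t.size)) * 0.01
eye_x = np.roll(foe_x, 150) + 0.05 * np.random.default_rng(2).standard_normal(t.size)
lag, r = tracking_lag(eye_x, foe_x, fs=fs)   # lag comes out near 0.15 s
```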
The paradigm’s efficiency in assessing motion integration within a short time frame also opens many avenues for broader applications. Future studies could explore the usability of this paradigm in species that rely heavily on visual signals for prey capture or navigation, such as amphibians (11) or birds (12), for broader interspecies comparisons. Knöll et al. (1) further mention that their paradigm could be used in combination with nonvisual sensory cues to measure motion integration abilities in animals that rely on senses other than vision for navigation. Combining the paradigm with virtual reality technologies, in which observers could navigate toward the FOE, would allow researchers to study the bidirectional relationship between an observer’s navigational behavior and the motion signals in a virtual environment.
In addition to advancing our understanding of naturalistic 3D motion processing across species and senses, the paradigm provides an easy and quick assessment of motion perception in applied contexts. For example, there is an urgent need for tools that can test motion perception in clinical settings. Standard eye examinations focus heavily on visual acuity (the ability to see fine spatial detail), commonly assessed using charts with letters or other stationary objects. However, many daily tasks require the ability to perceive and interact with moving objects in a dynamic environment, a visual function that is controlled by a network of brain areas, including the middle temporal visual area (13). Motion sensitivity is relatively independent of functions arising earlier in the visual processing hierarchy, and motion-sensitivity deficits have been shown to be uncorrelated with contrast sensitivity or visual acuity in some clinical populations (14–16). Yet motion-sensitivity deficits are not commonly captured by standard optometric tests. Existing psychophysical tests are too lengthy and complicated to be used in contexts requiring rapid skill assessment, such as neurological examinations, developmental settings, or driver testing. Knöll et al.’s paradigm (1) could be integrated into new technology, enabling an easy and quick motion-sensitivity assessment based on instinctive eye movement responses. Because eye-tracking technology has advanced to the point of allowing direct, unobtrusive testing at relatively little cost and effort, such tests could easily be translated from the laboratory to the bedside and would be accessible regardless of language ability or cognitive or motor deficits. Applications that measure motion-processing deficits in patients with more advanced impairments, or in developmental studies in children, could be combined with animal models probed by the same behavioral assessment, providing opportunities for translational research on interventions and neural development. The experimental approach offered by Knöll et al. (1), therefore, constitutes an important step toward real-world applications in visual motion psychophysics.
Footnotes
The authors declare no conflict of interest.
See companion article on page E10486.
References
1. Knöll J, Pillow JW, Huk AC. Lawful tracking of visual motion in humans, macaques, and marmosets in a naturalistic, continuous, and untrained behavioral context. Proc Natl Acad Sci USA. 2018;115:E10486–E10494. doi: 10.1073/pnas.1807192115.
2. Land MF. Motion and vision: Why animals move their eyes. J Comp Physiol A Neuroethol Sens Neural Behav Physiol. 1999;185:341–352. doi: 10.1007/s003590050393.
3. Matthis JS, Yates JL, Hayhoe MM. Gaze and the control of foot placement when walking in natural terrain. Curr Biol. 2018;28:1224–1233.e5. doi: 10.1016/j.cub.2018.03.008.
4. Spering M, Montagnini A. Do we track what we see? Common versus independent processing for motion perception and smooth pursuit eye movements: A review. Vision Res. 2011;51:836–852. doi: 10.1016/j.visres.2010.10.017.
5. Schütz AC, Braun DI, Gegenfurtner KR. Eye movements and perception: A selective review. J Vis. 2011;11:1–30. doi: 10.1167/11.5.9.
6. Gibson JJ. Visually controlled locomotion and visual orientation in animals. Br J Psychol. 1958;49:182–194. doi: 10.1111/j.2044-8295.1958.tb00656.x.
7. Warren WH. Collective motion in human crowds. Curr Dir Psychol Sci. 2018;27:232–240. doi: 10.1177/0963721417746743.
8. Mitchell JF, Leopold DA. The marmoset monkey as a model for visual neuroscience. Neurosci Res. 2015;93:20–46. doi: 10.1016/j.neures.2015.01.008.
9. Cormack LK, Czuba TB, Knöll J, Huk AC. Binocular mechanisms of 3D motion processing. Annu Rev Vis Sci. 2017;3:297–318. doi: 10.1146/annurev-vision-102016-061259.
10. Huk A, Bonnen K, He BJ. Beyond trial-based paradigms: Continuous behavior, ongoing neural activity, and natural stimuli. J Neurosci. 2018;38:7551–7558. doi: 10.1523/JNEUROSCI.1920-17.2018.
11. Borghuis BG, Leonardo A. The role of motion extrapolation in amphibian prey capture. J Neurosci. 2015;35:15430–15441. doi: 10.1523/JNEUROSCI.3189-15.2015.
12. Dakin R, Fellows TK, Altshuler DL. Visual guidance of forward flight in hummingbirds reveals control based on image features instead of pattern velocity. Proc Natl Acad Sci USA. 2016;113:8849–8854. doi: 10.1073/pnas.1603221113.
13. Born RT, Bradley DC. Structure and function of visual area MT. Annu Rev Neurosci. 2005;28:157–189. doi: 10.1146/annurev.neuro.26.041002.131052.
14. Spencer J, et al. Motion processing in autism: Evidence for a dorsal stream deficiency. Neuroreport. 2000;11:2765–2767. doi: 10.1097/00001756-200008210-00031.
15. Ho CS, et al. Deficient motion perception in the fellow eye of amblyopic children. Vision Res. 2005;45:1615–1627. doi: 10.1016/j.visres.2004.12.009.
16. Chakraborty A, et al. Global motion perception is independent from contrast sensitivity for coherent motion direction discrimination and visual acuity in 4.5-year-old children. Vision Res. 2015;115:83–91. doi: 10.1016/j.visres.2015.08.007.

