Abstract
Visual information is continuously sampled from our environment, even as the eyes move, which helps the visual system create a stable view of the world.
Why do we perceive the world as stable when our eyes are constantly moving? This question has puzzled researchers for decades (1, 2). Solving the puzzle could advance myriad artificial visual systems, such as those used in robotics and visual display development.
When photons (packets of light) strike the retina, they initiate a cascade of neural processing, first locally in the retina itself, then within the brain—in the occipital cortex and further downstream in other cortical visual areas. In this processing stream, the photons are integrated into lines, patterns, textures, motion, and, finally, into abstract concepts that constitute our rich visual experience.
When the eyes move, photons coming from a single object project onto different parts of the retina. Similarly, when an object moves, photons project onto different parts of the retina. This creates an ambiguous situation: At any point when motion is detected by the visual system, did the eyes move, or did something in the outside world move?
To unravel how the brain produces a stable visual world, previous studies focused on what we can see shortly before and after an eye movement, thereby ignoring any visual input generated by the moving eye itself. However, this approach fails to clarify to what extent the motion stimulus created by the saccade, or “motion streak,” shapes visual processing and influences concurrent behavior, such as quickly localizing and identifying things you see in the corner of your eye. It is often assumed that during eye movements, as the world rapidly shifts across the retina, a pause occurs in perceptual input. However, a new study by Schweitzer and Rolfs (3) demonstrates that motion streaks help the oculomotor system (the system responsible for producing eye movements) correct for variability in eye movements and thus aid in building a stable view of the world (Fig. 1).
Fig. 1. Movement of the eyes creates motion streaks by sweeping the world across the retina.
These streaks can help establish a continuous and stable view of the world. In this example, an observer first fixates on a point next to the kite, and then moves their eyes to fixate directly on the kite. The faint rainbow represents the motion streak that results from the kite sweeping across the retina. The motion streak shows that the kite moved upward while the eyes themselves were moving. Credit: Kellie Holoski/Science Advances.
Moving target
Schweitzer and Rolfs reveal that visual information is continuously sampled from our environment, even while the eyes are in full motion. Although the quality of visual information received during an eye movement might be less rich than when the eyes are stable, the visual system uses all available information to build our perceptual experience. Given that the eyes are in motion a large portion of the time (estimates are that we make, on average, three eye movements each second), it could be an efficient strategy for the visual system to continuously process visual input during an eye movement. In this way, the visual system can optimally benefit from the available information.
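To put “a large portion of the time” in rough numbers, a back-of-the-envelope calculation is sketched below; the three-saccades-per-second estimate comes from the text, and the roughly 50-ms in-flight duration per saccade is taken from the projector discussion later in this piece.

```python
# Rough, illustrative estimate of the fraction of viewing time spent mid-saccade.
# Assumptions: ~3 saccades per second (from the text) and ~50 ms spent
# in flight per saccade (the duration cited later in this article).
saccades_per_second = 3
saccade_duration_s = 0.050

fraction_in_flight = saccades_per_second * saccade_duration_s
print(f"Eyes in flight for roughly {fraction_in_flight:.0%} of viewing time")  # ~15%
```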
In their study, Schweitzer and Rolfs used a novel high-speed projector to manipulate a visual scene during eye movements. Participants saw six small patches, each with a unique pattern. One of the patches was designated as the eye movement target, and the unique patterns ensured that each patch would generate a distinct motion streak. To test whether participants were (unconsciously) processing these motion streaks, the researchers rapidly shifted the position of the target while the participants’ eyes were in motion. As a result, the eyes landed on a location between the actual target and one of the other patches. Once the eye movement was complete, all patches were masked to make them indistinguishable from one another.
To complete the task of moving their eyes to the target patch, participants had to make a corrective eye movement based on the motion streak that the target generated during the eye movement. Using this clever manipulation, Schweitzer and Rolfs observed that corrective eye movements were successfully directed to the target, indicating that participants used motion streaks to resolve the scene. Going one step further, the researchers manipulated the patch patterns during the eye movement (i.e., changing them or removing them entirely). This manipulation slowed participants down and increased the probability of making an eye movement to a nontarget patch.
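To make the paradigm concrete, the gaze-contingent logic of such a trial can be sketched roughly as follows. This is a simplified illustration under stated assumptions, not the authors' implementation: the helper functions, the GazeSample structure, and the velocity threshold for detecting saccade onset and offset are all hypothetical placeholders.

```python
from dataclasses import dataclass
from typing import Callable, List

# Illustrative sketch of a gaze-contingent trial, not the authors' code.
# The helpers passed into run_trial and the velocity threshold are
# assumptions made for illustration only.

SACCADE_VELOCITY_THRESHOLD = 300.0  # deg/s; assumed onset/offset criterion


@dataclass
class GazeSample:
    x: float
    y: float
    velocity: float  # instantaneous gaze speed in deg/s


def run_trial(
    patches: List[object],
    target_index: int,
    sample_gaze: Callable[[], GazeSample],
    draw_patches: Callable[[List[object]], None],
    shift_patch: Callable[[object], None],
    mask_patches: Callable[[List[object]], None],
) -> None:
    """One simplified trial of the (hypothetically reconstructed) paradigm."""
    draw_patches(patches)  # six uniquely patterned patches on screen

    # Wait for saccade onset: gaze velocity exceeds the threshold.
    while sample_gaze().velocity < SACCADE_VELOCITY_THRESHOLD:
        pass

    # Eyes are in flight: rapidly displace the target so the saccade will
    # land between the original target location and a neighboring patch.
    shift_patch(patches[target_index])
    draw_patches(patches)  # feasible only with a very high refresh rate

    # Wait for saccade offset: velocity drops back below the threshold.
    while sample_gaze().velocity >= SACCADE_VELOCITY_THRESHOLD:
        pass

    # Mask all patches so they look identical; any corrective saccade must
    # then rely on the motion streak the target produced during the saccade.
    mask_patches(patches)
```

The key design constraint is that the displacement and the subsequent masking must both be completed within the few tens of milliseconds that the eyes are in flight, which is what motivates the high-speed projector described next.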
One of the unique features of the Schweitzer and Rolfs study is their use of a high-speed projector. The eyes are in motion for about 50 ms. During this time, high-end monitors can show about seven images. The high-speed projector, however, generates about 70 images in that time, which allowed the researchers to rearrange the patches in addition to masking them.
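As a rough illustration of the difference (the refresh rates below are assumptions chosen only to match the article's "about seven" and "about 70" figures):

```python
# How many display updates fit inside a ~50 ms saccade?
# Refresh rates below are illustrative assumptions, chosen to match the
# "about seven" vs. "about 70" frame counts mentioned in the text.
saccade_duration_ms = 50

for label, refresh_hz in [("high-end monitor", 144), ("high-speed projector", 1440)]:
    frames = saccade_duration_ms / 1000 * refresh_hz
    print(f"{label} at {refresh_hz} Hz: ~{frames:.0f} frames per saccade")
# high-end monitor at 144 Hz: ~7 frames per saccade
# high-speed projector at 1440 Hz: ~72 frames per saccade
```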
These results convincingly demonstrate that visual input during eye movements can solve part of the visual stability problem and provide a potential mechanism for stabilizing visual features very quickly across eye movements (4). It now appears that the continuous presence of the target might accelerate the stabilization of visual features, whereas temporarily removing the target likely impairs that stabilization by disrupting the retinal motion streak.
Improving artificial visual systems
Beyond vision science, these findings have broad implications. Progress in areas such as neurological and ophthalmological care, robotics, computer vision, and visual display development crucially depends on a fundamental understanding of the human visual system. If we truly wish to understand and emulate the human visual system, we should take its entire architecture into careful consideration (5). Currently, a major difference between most artificial networks and the human visual system lies at its very foundation: eye movements. The entire visual system functions within the context of eye movements. As such, we might expect that it exploits the quirks induced by eye movements (such as motion streaks) and that it is equipped with tricks to distinguish the motion stimuli created by eye movements from the movements of objects in the outside world: Did the world move, or did the eyes?
Recent insights demonstrate that the alternating cycle of fixations and eye movements modifies the spectral content of visual input in a way that appears highly adaptive in our natural environment (6, 7). The new results of Schweitzer and Rolfs now show that the brain exploits even specific parts of the spectral content of the retinal motion streak generated by an eye movement. Together, these data demonstrate that eye movements are a fundamental component of the human visual system: not a nuisance, but useful.
To understand, replicate, and improve our visual system, we should not forget that processing starts in the eyes and that their mobility adds unique and fundamental features to the entire visual processing stream. It is in those features that part of the efficiency of the human visual system originates.
REFERENCES
1. Bridgeman B., van der Heijden A. H. C., Velichkovsky B. M., A theory of visual stability across saccadic eye movements. Behav. Brain Sci. 17, 247–292 (1994).
2. Melcher D., Visual stability. Phil. Trans. R. Soc. B 366, 468–475 (2011).
3. Schweitzer R., Rolfs M., Intra-saccadic motion streaks jump-start gaze correction. Sci. Adv. 7, eabf2218 (2021).
4. Fabius J. H., Fracasso A., Nijboer T. C. W., Van der Stigchel S., Time course of spatiotopic updating across saccades. Proc. Natl. Acad. Sci. U.S.A. 116, 2027–2032 (2019).
5. George D., Lázaro-Gredilla M., Guntupalli J. S., From CAPTCHA to commonsense: How brain can teach us about artificial intelligence. Front. Comput. Neurosci. 14, 554097 (2020).
6. Boi M., Poletti M., Victor J. D., Rucci M., Consequences of the oculomotor cycle for the dynamics of perception. Curr. Biol. 27, 1268–1277 (2017).
7. Mostofi N., Zhao Z., Intoy J., Boi M., Victor J. D., Rucci M., Spatiotemporal content of saccade transients. Curr. Biol. 30, 3999–4008.e2 (2020).