Abstract
The discovery of place cells and head direction cells in the hippocampal formation of freely foraging rodents has led to an emphasis on its role in encoding allocentric spatial relationships. In contrast, studies in head-fixed primates have additionally found representations of spatial views. We review recent experiments in freely moving monkeys that expand upon these findings and show that postural variables such as eye/head movements strongly influence neural activity in the hippocampal formation, suggesting that the function of the hippocampus depends on where the animal looks. We interpret these results in light of recent studies in humans performing challenging navigation tasks which suggest that, depending on the context, eye/head movements serve one of two roles – gathering information about the structure of the environment (active sensing) or externalizing the contents of internal beliefs/deliberation (embodied cognition). These findings prompt future experimental investigations into the information carried by signals flowing between the hippocampal formation and the brain regions controlling postural variables, and constitute a basis for updating computational theories of the hippocampal system to accommodate the influence of eye/head movements.
Introduction
Navigation can be broadly described as the transition of an entity from one state to another (Hinman et al. 2018). It might involve taking a trajectory through physical space (Eichenbaum 2017) or achieving an understanding in abstract space (Garvert et al. 2017). Perhaps all movement and thought is navigation. But one thing is clear: navigation is a fundamental cross-species behavior (Sosa et al. 2021; Poulter et al. 2018). From mice exploring labyrinths (Rosenberg et al. 2021) to bats echolocating in flight (Ulanovsky et al. 2007) to monkeys catching fireflies in virtual reality (Lakshminarasimhan et al. 2020) to taxi drivers taking routes from memory (Maguire et al. 2000), species exhibit assorted modes of navigation. Research across species has uncovered many shared motifs, but each species also contributes unique insights towards understanding the neural mechanisms and cognitive processes underlying navigation.
In this review, we first briefly summarize the similarities and differences between neural responses in the rodent and primate hippocampal formation. We highlight recent work suggesting that instead of (or in addition to) representing one’s location, which appears to be the primary function of the rodent hippocampus, the primate hippocampus is highly sensitive to the animal’s interaction with visual space. Then, we interpret this latter finding in the context of recent behavioral work on humans performing computationally demanding, naturalistic navigation tasks. Specifically, we hypothesize that eye movements shape the activity of the primate hippocampal formation during active sensing because they embody the planning computations facilitated by hippocampal representations.
The rodent hippocampal formation in spatial navigation
The substantial body of work in rodent spatial navigation suggests that the hippocampus wears many hats. During free foraging, some cells in the rodent hippocampus preferentially fire at specific locations in space, constituting a place code (O’Keefe 1976; O’Keefe and Nadel, 1978; Fig. 1). Place fields become more robust with prolonged exposure to an environment (Wilson et al. 1993), while changes to the environmental cues result in partial or complete place field remapping (Muller et al. 1987; Fyhn et al. 2007; Jeffery 2011). On the other hand, cells in the medial entorhinal cortex were found to exhibit multiple firing fields that tessellate space in a regular hexagonal grid (Hafting et al. 2005; Fig. 1). In addition, experiments have found head direction cells in the postsubiculum (Taube 2007), allocentric boundary cells in the subiculum (Lever et al. 2009), and egocentric boundary cells in the retrosplenial (Alexander et al. 2020) and postrhinal (LaChance et al. 2019) cortices. These striking discoveries have supported the notion that the hippocampal formation constructs a cognitive map to facilitate navigation (Hartley et al. 2014).
Fig. 1. Differences in the orientation and relative sizes of Hippocampus (red) and Entorhinal Cortex (EC, cyan) in rodents, macaques and humans.

In rodents, the emphasis is on spatial location during real-world navigation (O’Keefe and Dostrovsky 1971; Fyhn et al. 2004; Hafting et al. 2005). Due to restrictions in clinical settings, human studies have almost exclusively used virtual navigation in patients (Ekstrom et al. 2003; Jacobs et al. 2013). Borrowed from Strange et al. 2014.
Furthermore, the hippocampus generates sequences to order memories and concepts in space and time (Buzsaki et al. 2018). Sequential firing of place cells in rodents has been shown to encode spatial locations ahead of the animal (Johnson et al. 2007; Diba et al. 2007; Pfeiffer et al. 2013), locations behind the animal (Foster et al. 2006; Diba et al. 2007; Gupta et al. 2010), and remote locations (Lee et al. 2002; Karlsson et al. 2009). This suggests that the hippocampal code is a flexible representation that can be called upon to construct plans, make meaning out of experience, and support the formation of cognitive maps. Interestingly, spatial properties have now been encountered in multiple brain areas beyond the hippocampal formation (Maisson et al. 2022) and, conversely, hippocampal activation is also seen during non-spatial sequential decision tasks (Aronov et al. 2017; Sun et al. 2020). These diverse findings from rodent work have led to multiple computational theories of hippocampal function.
Computational theories about the hippocampal formation
An emerging computational framework for interpreting the function of the hippocampal formation is reinforcement learning (RL). Navigation can be cast as a Markov decision process, where a state can be a location (or abstract concept), and an agent (animal or artificial being) can move between states by taking actions (Gershman et al. 2017; Behrens et al. 2018; Sosa et al. 2021). To the extent that the goal of navigation (or any task) is to maximize future rewards, RL provides one possible way to predict and interpret hippocampal representations governing the animal’s policy, i.e., the actions chosen in each state (Tessereau et al. 2021), and can explain why the hippocampus is engaged in non-spatial tasks. A comprehensive primer on the RL framework as it pertains to the hippocampal formation is provided by Whittington et al. (2022).
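To make this casting concrete, here is a minimal sketch of navigation as an MDP: states are positions along a hypothetical 1D corridor, actions move the agent left or right, and a reward at the far end shapes a policy learned by temporal-difference updates. The corridor, parameters, and learning rule are illustrative choices, not a model of any specific experiment.

```python
import numpy as np

# Minimal sketch: navigation as a Markov decision process. States are
# positions along a hypothetical 1D corridor; reward sits at the right end.
n_states, n_actions, gamma, alpha = 5, 2, 0.9, 0.1
Q = np.ones((n_states, n_actions))  # optimistic initialization drives exploration
rng = np.random.default_rng(0)

def step(state, action):
    """Deterministic transitions: action 0 moves left, action 1 moves right."""
    nxt = max(state - 1, 0) if action == 0 else min(state + 1, n_states - 1)
    reward = 1.0 if nxt == n_states - 1 else 0.0
    return nxt, reward

for _ in range(500):  # episodes of epsilon-greedy Q-learning
    s = 0
    while s != n_states - 1:
        a = int(rng.integers(n_actions)) if rng.random() < 0.1 else int(Q[s].argmax())
        s2, r = step(s, a)
        target = r if s2 == n_states - 1 else r + gamma * Q[s2].max()
        Q[s, a] += alpha * (target - Q[s, a])  # temporal-difference update
        s = s2

policy = Q.argmax(axis=1)  # learned policy: move right from every non-goal state
```

The learned state-to-action mapping plays the role that, under this framework, hippocampal representations are hypothesized to support: a policy that governs which action is chosen in each state.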
Many new and old theories of hippocampal function can be interpreted using the RL framework. The classic cognitive map theory (Tolman 1948; O’Keefe and Nadel, 1978) posits that the hippocampal formation represents a precise internal model of the spatial relationships between the states of the environment. According to this theory, navigation is model-based: actions are chosen by planning with the internal model, and place cells and grid cells constitute the neural substrates of this model.
However, planning is computationally demanding, especially in large environments with numerous states. This prompted a recent wave of studies providing empirical support for a more computationally efficient representational theory of sequential decision-making in humans – the successor representation (Momennejad et al. 2017; Russek et al. 2017). This theory also explains many empirical findings regarding hippocampal activity, such as the skewness of place fields, remapping, and the geometric structure of grid fields (Stachenfeld et al. 2017). Under this framework, the core function of the hippocampus is prediction: place cell firing rates are posited to be proportional to the expected discounted future occupancy of the corresponding place field, given the agent’s current position. These predictions are learned through experience and are thus policy-dependent. Successor representation place fields are sensitive to the environment’s topology (Muller et al. 1987) and skew backwards, opposite the direction of travel (Mehta et al. 2000), in agreement with experimental evidence. A low-dimensional eigendecomposition of the successor representation exhibits many qualities of entorhinal grid cells, including their alignment to environmental boundaries (Krupic et al. 2015). This theory suggests that grid cells highlight bottlenecks which partition the world, as well as smooth updates to predictive representations (Stachenfeld et al. 2017).
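The core quantity of this theory can be computed in a few lines. A sketch, assuming a hypothetical 4-state linear track and a uniform random-walk policy (the transition matrix and discount factor are illustrative choices, not fits to data):

```python
import numpy as np

# Transition matrix T of a random walk on a hypothetical 4-state linear track:
# T[i, j] is the probability of stepping from state i to state j.
T = np.array([[0.5, 0.5, 0.0, 0.0],
              [0.5, 0.0, 0.5, 0.0],
              [0.0, 0.5, 0.0, 0.5],
              [0.0, 0.0, 0.5, 0.5]])
gamma = 0.9  # discount factor

# Successor representation: expected discounted future occupancy,
# M = sum_t gamma^t T^t = (I - gamma T)^(-1).
M = np.linalg.inv(np.eye(4) - gamma * T)

# Column j of M is the SR "place field" of state j: how strongly each current
# state predicts future visits to j under the current policy.
field = M[:, 3]

# Values factorize as V = M r for any reward vector r, so moving the reward
# does not require relearning the predictive map.
r = np.array([0.0, 0.0, 0.0, 1.0])
V = M @ r

# Eigenvectors of the (symmetrized) SR form periodic, grid-like maps over the
# track, the basis of the grid-cell account (Stachenfeld et al. 2017).
eigvals, eigvecs = np.linalg.eigh((M + M.T) / 2)
```

Because the occupancy predictions in M are accumulated under a particular behavioral policy, any change in how the agent moves reshapes the fields, which is the sense in which SR place fields are policy-dependent.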
Successor representation models account for the spatial structure of hippocampal activity, but do not explain its striking temporal structure, such as sequential firing patterns and replay. An RL-based computational theory of prioritized memory access has recently been proposed to account for these robust temporal dynamics (Mattar et al. 2018; Liu et al. 2021). According to this theory, sequences and replays help improve the efficiency of learning and planning algorithms and may be interpreted as mental simulations (Hasselmo and Eichenbaum, 2005; Erdem and Hasselmo, 2012; Erdem and Hasselmo, 2014; Kubie and Fenton, 2012) that are restricted to specific regions of the state space based on their relevance to the navigation problem.
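The prioritization principle at the heart of this theory can be stated compactly: the utility of replaying a memory is the product of its gain (how much the resulting update would improve behavior at the replayed state) and its need (how often that state is expected to be visited, given by a row of the successor representation). A toy sketch with hypothetical numbers:

```python
import numpy as np

# Toy sketch of prioritized memory access (after Mattar et al. 2018). All
# numbers are hypothetical, chosen only to illustrate the gain x need rule.

# need: expected discounted future occupancy of each of 4 states, taken from
# the successor representation row for the agent's current state.
need = np.array([1.0, 0.6, 0.3, 0.1])

# gain: the policy improvement expected if the value of each state were
# updated using the corresponding stored experience.
gain = np.array([0.0, 0.2, 0.9, 0.9])

# Expected value of a backup = gain x need. Replaying in decreasing order of
# this product confines replay to task-relevant regions of state space:
# improvable states the agent is actually likely to visit come first.
evb = gain * need
replay_order = np.argsort(-evb)
```

Note how a state with high gain but negligible need (the last state here) is deprioritized, which is one way this theory explains why replayed sequences appear restricted to behaviorally relevant locations.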
These RL-based theories were mainly geared towards explaining data from rodents performing spatial navigation tasks. However, research in primates and humans has revealed that the role of the hippocampal formation extends to non-spatial domains. A new framework of relational inference, called the Tolman-Eichenbaum machine, proposes that the hippocampus and entorhinal cortex are involved in structural abstraction and generalization in spatial and relational memory tasks (Whittington et al. 2020). This theory accounts for the conjunctive nature of hippocampal responses, whereby neurons respond to both structural and sensory variables inherited from the medial and lateral entorhinal cortex, respectively. Such mechanistically grounded theories are helpful in generating precise predictions and can accelerate progress by informing experimental design. For example, attractor dynamics in the hippocampal formation was previously theorized as a possible mechanism of path integration (McNaughton et al. 2006; Rolls 2007; Burak et al. 2009), and has subsequently been supported by experiments (Yoon et al. 2013; Agmon and Burak, 2020). Another recent theory based on the Bayesian inference framework posits that the distortions induced in the response fields of grid cells by changes in environmental geometry reflect the animal’s uncertainty in estimating its spatial location from noisy visual input (Kang et al. 2023).
In general, the above theories and other computational accounts of the hippocampal formation begin to address its role in integration, inference, learning, planning, and remapping – computations that manifest diverse behavioral and neural signatures depending on the task and context. These theories, although fragmented at the outset, are likely to be directly linked. Since natural behaviors are interactive, expanding the above theories to incorporate the role of information gathering can help unify them. Information gathering is typically performed through whisking or visual exploration. Although rodent studies have only recently begun to consider the role of visual exploration (Payne et al. 2017; Mallory et al. 2021; Michaiel et al. 2020), this perspective enjoys support from experiments in primates, which we describe below.
Eye movement signals strongly infuse the primate hippocampal formation
The neurophysiology of the primate hippocampal formation has been less thoroughly explored than that of rodents, and previous studies had focused on a limited range of variables (Doeller et al. 2010; Ekstrom et al. 2003; Jacobs et al. 2013; Fig. 1), almost exclusively with head-fixed monkeys either performing cart navigation (Matsumura et al. 1999; O’Mara et al. 1994) or moving through a virtual environment (Furuya et al. 2014; Gulli et al. 2020; Wirth et al. 2017). In general, macaque hippocampal neurons show some location-specific activity, reminiscent of rodent place cells, albeit more dispersed and less prevalent (Courellis et al. 2019; Hazama and Tamura 2019; Ludvig et al. 2004). Similar to the results of rodent studies, lesions of the monkey hippocampus impair allocentric, but not egocentric, navigation (Lavenex et al. 2008; Rueckemann et al. 2017).
Perhaps the most provocative finding distinguishing the primate hippocampal formation from that of the rodent is that the former is strongly modulated by vision (Nau et al. 2018; Rolls, 2021). Some of the earliest recordings in monkeys aimed at disentangling representations of place from spatial view (Rolls et al. 1995; 1997; Feigenbaum et al. 1991; Robertson et al. 1998; Georges-Francois et al. 1999; Rolls 1999) found that the primate hippocampus seems to encode views of particular regions of space, rather than the position of the monkey in the environment. Since then, there has been growing evidence for the existence of gaze-centered spatial representations in the hippocampal formation of head-fixed macaques (Killian et al. 2012; Meister and Buffalo, 2018; Wirth et al. 2017).
Flexible head movements are a critical component of natural behavior since they afford the opportunity to control one’s spatial view. Until recently, this phenomenon had not been interrogated by monitoring both 3D eye and head movements in truly freely-moving primates. Electrophysiological studies that reported the canonical place code in monkeys during free movement (Courellis et al. 2019) or virtual navigation (Furuya et al. 2014) and in humans during virtual games (Ekstrom et al. 2003; Miller et al. 2013) did not dissociate 3D head-facing location from 3D spatial view, head movements, eye movements, or body orientation. 3D head-facing location (the location in the arena towards which the head faces) and 3D spatial view (the location in the arena that is gazed upon) are easily confounded when eye movements are not considered, but they can be disambiguated by combining eye-in-head movements with 3D head orientation. We have recently disambiguated these variables in the macaque hippocampal formation by analyzing simultaneously tracked head and eye positions during head-unrestrained foraging (Mao et al. 2021; Fig. 2). By combining head tracking, wireless eye tracking, and telemetric recordings across multiple regions – the hippocampus (HPC), entorhinal cortex (EC), and subicular complex (SUB) – in freely-foraging macaques with a multimodal modeling approach to characterize the neural encoding of spatial variables, we have provided a quantitative characterization of how the macaque HF encodes space under ethological conditions.
Fig. 2. Large-scale recordings across the entire Hippocampal Formation of freely-moving macaques.

From left to right: Recording drive and skull models. Segmented models of the brain, hippocampus (blue), entorhinal cortex (green) and blood vessels (red). Electrodes are visible as tracks in the co-registered images. Animals were instrumented to wear an eye tracker. Neural logger recordings and wireless transmission. Adapted from Mao et al. 2021.
In contrast to rodents, and in line with previous primate studies (Ekstrom et al. 2005; Talakoub et al. 2019; Watrous et al. 2013; Courellis et al. 2019), we found that the theta rhythm is discontinuous in primates (Fig. 3). Interestingly, despite the low power of theta oscillations in macaques, theta phase coding is conserved: a significant fraction of neurons were phase-locked to low theta (56%) and theta (25%). Nevertheless, theta phase precession was found in only a very small number of macaque hippocampal neurons during random foraging, in contrast to its abundance among rodent place cells (Mizuseki et al. 2009). In this respect, the primate hippocampus has more in common with the bat place coding system (Eliav et al. 2018). It remains to be determined whether theta phase precession becomes more prominent when macaques engage in cognitively demanding memory consolidation and decision-making tasks (Zhang et al. 2015), as well as whether and how distinct theta bands and theta phase precession are tied to specific aspects of a task.
Fig. 3. Lack of sustained theta oscillations in freely-moving macaques.

Left: a short segment of the LFP spectrogram with the speed trace overlaid in white; note the absence of sustained theta during locomotion. Right: power spectrum of the oscillatory component, color-coded by locomotion speed; there is a local maximum around 5 Hz and a broad peak around 20 Hz, and their relative power is modulated by running speed (from Mao et al. 2021).
By using a multimodal modeling approach, Mao et al. (2021) found that many neurons responsive to the animal’s position in the world also encoded other variables (Fig. 4), in alignment with the high-dimensional coding in the hippocampus reported by previous studies (Wirth et al. 2017; Gulli et al. 2020; Bernardi et al. 2020). Perhaps the most unexpected finding was that the representations of head-facing location and head tilt were more dominant than that of position. In fact, head-facing location was consistently the best predictor of single neuron activity across regions and animals (Mao et al. 2021). These results expand upon the neural responses described previously in macaques in the form of ‘spatial view’ cells (Rolls et al. 1997), and indicate the importance of using naturalistic foraging tasks in which head-centered vs. eye-centered reference frames are naturally decoupled. While the strong head tilt correlates in the hippocampus may appear surprising, tilt tuning has been shown to be prevalent in the bat presubiculum (Finkelstein et al. 2014), monkey anterior thalamus (Laurens et al. 2016), and mouse anterior thalamus and retrosplenial cortex (Angelaki et al. 2020). These results complement earlier findings of hippocampal representation of spatial view along the vertical dimension, independent of head tilt (reviewed in Rolls et al. 2022). Thus, vertical orientation may be an important component of spatial coding and behavior across species.
Fig. 4. Model-based approach to quantify simultaneous encoding of multiple variables.

(A) Model schematic (for details see Mao et al. 2021). (B) Left: Fraction of encoding neurons for each variable in each region (defined in the coronal section on the right. HPC: blue; EC: orange; SUB: yellow). The main model included 8 variables: 3D position, which was decomposed into horizontal position (Pos) and head height (HH), angular velocity (AV), linear speed (Spd), egocentric boundary (EB), facing location (FL), as well as 3D orientation, which was decomposed into head tilt (HT) and azimuth head direction (HD). (C) prevalence of mixed selectivity across areas (from Mao et al. 2021).
Using such a multimodal, model-based analysis, we found that only 7% of the cells recorded in the hippocampal formation were true place cells, which is much lower than the proportion reported by other primate studies using traditional tuning curve analysis (Matsumura et al. 1999; Hazama and Tamura, 2019; Ludvig et al. 2004; Courellis et al. 2019), but consistent with the conclusions drawn from earlier multimodal approaches based on information theory (Robertson et al. 1998; Georges-Francois et al. 1999). It is important to note that the discrepancy between these findings was not data-driven but analysis-driven: using traditional tuning curve analysis, the proportion of place coding cells (26%) reported by Mao et al. (2021) is comparable to that of previous monkey studies. However, when multiple variables are considered simultaneously, activation previously attributed to place coding – an artifact of averaging to obtain tuning curves – is assigned to other variables that were not measured in previous experiments. Thus, we emphasize the advantages of a multimodal modeling approach, as well as the need to consider all postural variables, when analyzing neurons in the hippocampal formation.
Many neurons in freely-moving macaques, particularly in the EC (22%), were tuned to egocentric eye-in-head position and/or velocity (Fig. 5A). Similar findings have been reported in head-fixed macaques exploring images on a screen (Killian et al. 2012; Meister and Buffalo, 2018; Nowicka and Ringo, 2000). In addition, we found that many cells, particularly in the hippocampus and subiculum, were modulated by saccadic eye movements and coupled to the local field potential, as previously reported in head-fixed animals (Doucet et al. 2020; Jutras et al. 2013). Despite strong eye movement-related responses in the hippocampal formation, the macaque entorhinal cortex showed limited grid tuning for allocentric location and spatial view. Although a slightly greater percentage of EC cells were reported to show grid tuning based on eye movements in head-fixed macaques (11.9%; Killian et al. 2012) than in freely-moving macaques (9%; Mao et al. 2021), visual grid tuning in both cases is far less robust than that of rodent grid cells. Nevertheless, even if visual grid-like tuning is weak, the coding of eye movements within the macaque navigation circuit is robust during both head-fixed and natural behavior.
Fig. 5. Eye movements strongly influence hippocampal formation activity.

(A) Fraction of neurons in each region (color-coded bars) encoding eye position, eye velocity, and saccade events. (B) Fraction of neurons in each region encoding facing location (FL), spatial view (SV), azimuth head direction (HD), and gaze direction (GD) when fitting all head- and gaze-related variables simultaneously (from Mao et al. 2021).
Interestingly, the hippocampus, entorhinal cortex, and subiculum were found to encode both spatial view and head-facing location (Fig. 5B). We propose that, because primates saccade continuously to explore their environment, head-anchoring may provide a more stable representation (Mao et al. 2021). Specifically, head facing direction might provide a useful scaffolding for organizing more detailed information contained in the spatial view. Facing direction might itself be, in turn, scaffolded into more abstract variables like the animal’s direction of heading in the world. Thus, information in the hippocampal formation might be organized hierarchically, with different levels of representation being appropriate for different tasks that involve navigation and memory retrieval. For instance, navigating in impoverished environments might rely on more abstract representations while feature-rich settings are likely to benefit from information about the spatial view. This perspective might help reconcile discrepancies in the relative proportions of different types of cells found under different experimental paradigms and across different species.
In summary, by disentangling traditionally intertwined variables, Mao et al. (2021) found that eye movements strongly modulate neural responses in the primate hippocampal formation during unconstrained, freely-moving behaviors. The prevalence of eye movement representations in the primate hippocampal formation is not surprising, given that the eyes actively interrogate the environment and efficiently gather information. The general consensus is that the reported eye movement encoding in navigation circuits may reflect task-relevant variables – e.g., eye movements may reflect internal deliberation and the prioritization of goals in real time (Gottlieb and Oudeyer, 2018; Yang et al. 2016; Yang et al. 2018) – but the functional principles remain poorly explored. Next, we summarize two recent studies showing that eye movements during navigation primarily embody subjects’ beliefs generated by their internal model (Lakshminarasimhan et al. 2020) or primarily reflect active sampling for improving their internal model (Zhu et al. 2022).
Eye movements embody internal beliefs during path integration
That vision facilitates navigation is easy to fathom. Navigators might compute their bearing with respect to a visible landmark, triangulate their allocentric position using combinations of landmarks, or even memorize sequences of landmarks to find their way (Rolls 2021). But there is much more to this, particularly in primates. To explore the neural basis of visual path integration, our lab designed a virtual reality task in which participants used a joystick to navigate to the location of a target after it disappeared (Lakshminarasimhan et al. 2018). Only transient visual stimuli in the form of optic flow (Warren et al. 1988), and/or vestibular cues conferred by a motion platform (Stavropoulos et al. 2022), were available to guide navigation. This task, reminiscent of ‘catching fireflies’ at night, elicits naturalistic prey capture behaviors (Ngo et al. 2022) and the associated neural computations (Lakshminarasimhan et al. 2022; Noel et al. 2022). Humans and monkeys successfully integrated optic flow to navigate to the remembered target by path integration (Lakshminarasimhan et al. 2018; Alefantis et al. 2022; Stavropoulos et al. 2022).
The most remarkable behavioral finding was that oculomotor circuits contribute to landmark-free navigation. Both humans and monkeys tracked the invisible target with their eyes while performing path integration (Fig. 6; Lakshminarasimhan et al. 2020). Navigation errors were correlated with target-tracking errors, suggesting that participants’ eye movements reflected their evolving internal beliefs about the target location. Accurate goal-tracking by gaze was associated with improved task performance, and inhibiting eye movements impaired navigation precision.
Fig. 6. Eye movement dynamics during virtual navigation using a joystick.

(A) Firefly task. (B) Time-course of the horizontal and vertical eye position during a random subset of trials. Red dots denote the end of steering. (C) Accurate target-tracking is associated with increased task performance. Time-course of the target-tracking index shown separately for trials in which monkeys stopped within the reward zone (blue) or outside it (red). Shaded regions denote ± 1 standard error estimated by bootstrapping (from Lakshminarasimhan et al. 2020).
One potential explanation for these findings is that looking at the goal might optimize the structure of optic flow fields to facilitate the inference of self-motion (Land et al. 1994; Wann et al. 2000). Alternatively, participants’ gaze may have embodied the contents of working memory (Postle et al. 2006; Theeuwes et al. 2009; Hannula et al. 2010; Luke et al. 2018). The latter hypothesis is supported by the observation that when human participants were required to navigate using only vestibular cues (in the dark), they still tracked the invisible target (unpublished observation). Memory offloading is a prime example of embodied cognition, in which the boundaries between perceptual, cognitive, and motor neural systems vanish because representations are distributed across those systems (Wilson 2002; Shapiro 2010; Bridgeman et al. 2011; Foglia et al. 2013); it could allow the contents of hippocampal memory activation to be observed indirectly through behavior.
While the causal role of hippocampus in the above task remains to be tested, it is worth noting that spatial view cells outside the CA3 region of the macaque hippocampal formation remain responsive in the dark (Robertson et al. 1998). Furthermore, Wirth et al. (2021) describe a subset of hippocampal neurons that predicts the contents of the upcoming spatial view. This is reminiscent of neurons in the rodent entorhinal cortex predictive of spatial position, which have been implicated in path integration behaviors (Campbell et al. 2018, Campbell et al. 2021). Therefore, one possibility is that the activity of spatial view cells is relayed to oculomotor areas to support embodiment of the spatial location of the goal during path integration.
The virtual environment in the above experiments comprised a featureless ground plane with no structure. Consequently, those studies did not test the extent to which humans can navigate optimally in large state spaces with complex transition structure – a combination that makes sequential decision-making notoriously hard. How do humans plan in such settings? This question was addressed by another experiment, which we turn to next.
Eye movements support navigational planning in structured environments
Animals often navigate without planning substantially, such as by moving towards a visible beacon (Jain et al. 2017), although in more complex scenarios, like driving through a city (Maguire et al. 2000), this strategy would clearly be inefficient. A navigator could also take a route from memory, and thus would not need to wrangle with a model of the environment (Shamash et al. 2021). However, humans have demonstrated flexible, sometimes stratified planning behaviors with impressive planning horizons (Balaguer et al. 2016; Mattar and Lengyel, 2022). Furthermore, humans (Liu et al. 2019) and rodents (Pfeiffer et al. 2013) exhibit neural evidence of rehearsing future action sequences never previously taken, which suggests that they skillfully use representations in the hippocampal formation to create their desired future (Kubie and Fenton, 2012; Erdem and Hasselmo, 2012; Erdem and Hasselmo, 2014). In fact, damage to the right hippocampus impairs allocentric model-based planning in humans (Vikbladh et al. 2019) and detour taking in rodents (Winocur et al. 2010).
Planning – the evaluation of prospective future actions using a model of the environment – plays a critical role in sequential decision making (Hunt et al. 2021; Mattar and Lengyel, 2022), and thus also in navigation. It can be contrasted with acting out of habit (Hunt et al. 2021). Choosing an optimal sequence of actions requires knowledge of the transition model – a mapping from the current state to the successor state associated with each possible action – which makes sequential decision-making substantially harder, particularly in large state spaces. Under unfamiliar or uncertain task conditions, planning may depend upon, and occur in conjunction with, active sensing: the cognitively motivated process of gathering information from the environment (Kaplan et al. 2018).
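The role of the transition model can be made concrete with a sketch of model-based planning: given a gridworld whose transition function is known, value iteration backs up values through the model, and the resulting greedy policy yields a planned trajectory before a single action is executed. The arena, obstacle placement, and parameters below are illustrative assumptions.

```python
# Hypothetical 4x4 gridworld with obstacles. The transition model maps
# (state, action) to a successor state; planning is value iteration over it.
H = W = 4
goal = (3, 3)
obstacles = {(1, 1), (2, 1)}
actions = [(-1, 0), (1, 0), (0, -1), (0, 1)]  # up, down, left, right
gamma = 0.9

def transition(s, a):
    """Deterministic transition model: blocked or out-of-bounds moves stay put."""
    nxt = (s[0] + a[0], s[1] + a[1])
    if nxt in obstacles or not (0 <= nxt[0] < H and 0 <= nxt[1] < W):
        return s
    return nxt

def backup(s, a, V):
    """One-step lookahead through the model: reward 1 on entering the goal."""
    nxt = transition(s, a)
    return (1.0 if nxt == goal else gamma * V[nxt]), nxt

V = {(i, j): 0.0 for i in range(H) for j in range(W)}
for _ in range(50):  # value iteration sweeps back up values through the model
    for s in V:
        if s != goal and s not in obstacles:
            V[s] = max(backup(s, a, V)[0] for a in actions)

# A greedy rollout through the model produces the planned path, with no
# physical movement required: the plan detours around the obstacles.
s, path = (0, 0), [(0, 0)]
while s != goal:
    vals = [backup(s, a, V) for a in actions]
    _, s = max((v, nxt) for v, nxt in vals if nxt != s)
    path.append(s)
```

Without access to `transition` (the model), the same agent could only discover the obstacle layout by trial and error, which is why learning the transition structure, e.g. by visually sampling the environment, is so valuable for planning.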
Common to the diverse studies of planning (which constitute a subset of a much larger body of work) is the underlying assumption that a model must be well-learned before quality planning can take place. Since spatial representations are grounded in sensory stimuli, it might appear at first glance that vision best serves planning by enabling the learning of a cognitive map. But this may not be the whole story: could vision more directly serve planning (in the absence of a learned cognitive map), such that even forgetful animals in unfamiliar environments could make good plans? To answer this question, we designed a virtual reality navigation task in which participants were asked to navigate to transiently visible targets using a joystick in unfamiliar arenas containing obstacles of varying degrees of complexity (Zhu et al. 2022). Although the obstacles served as barriers, they were relatively short, such that human participants could look beyond them and learn about the structure of the environment at distal locations. Crucially, this paradigm allowed participants to use eye movements to visually sample the environment and learn the transition model in conjunction with planning. What are the principles governing eye movements under these conditions?
First, Zhu et al. (2022) found that humans took trajectories nearly optimal in length through complex, unfamiliar arenas, and that they spent more time planning prior to navigation in more complex environments. Using the RL framework, they identified the spatial locations in each arena that were most relevant for planning. Simulations confirmed that agents that prioritize gathering information about the transition structure at these relevant locations substantially outperform other agents, achieving near-optimal navigation performance with very few samples.
Second, Zhu et al. (2022) showed that humans can plan optimal trajectories for navigation while simultaneously using eye movements to visually sample the environment and learn the transition dynamics. Human participants balanced foveating the hidden reward (i.e., remembered goal) location with viewing highly task-relevant regions of space, both prior to and during active navigation, and environmental complexity mediated a trade-off between these two modes of information sampling (Fig. 7). The spatial distribution of gaze was largely concentrated at the hidden goal location in the simplest environments (as in the firefly task of Fig. 6), but participants increasingly interrogated the task-relevant structure of the environment as arena complexity increased. In the temporal domain, participants often rapidly traced their future trajectory to and from the goal with their eyes (sweeping), and generally concentrated their gaze upon one subgoal (turn) at a time until they reached their destination. These sequential eye movements, sweeping forward and/or backward along the future trajectory, are reminiscent of sequential neural activity in the rodent hippocampus.
Fig. 7. Gaze is split between the remembered goal and chosen trajectory.

(A) Human participants wore a VR headset and navigated mazes of varying complexity towards a memorized goal. (B) Overhead view of one of the mazes. (C) The search epoch was defined as the period between goal stimulus appearance and goal stimulus foveation. A threshold applied to the filtered joystick input (movement velocity) was used to delineate the pre-movement and movement epochs. (D) Fraction of time gaze follows the goal (target, green) vs. the chosen trajectory (purple). With increasing environmental complexity, there is a trade-off between gathering information about the relevant aspects of the environmental structure and looking at the remembered goal location (from Zhu et al. 2022).
Role of hippocampus in planning
Whether these forward and reverse sweeping gaze sequences relate to hippocampal temporal sequences remains to be explored. We hypothesize that direct or indirect hippocampal projections to higher-order oculomotor controllers (e.g., to the supplementary eye fields via the orbitofrontal cortex) may allow eye movements to embody the underlying activations of state representations (Larson et al. 2009; Wilming et al. 2017; Hannula et al. 2009). This would allow memory replay to influence eye movements (Fig. 8, 'mental simulation' or planning). Alternatively, though not mutually exclusively, active sampling could result from rapid peripheral visual processing that drives saccade generation, such that eye movements reflect the outcome of sensory processing rather than prior experience (Fig. 8, 'active sensing'). Consistent with this idea, past studies have demonstrated that humans can smoothly trace paths through entirely novel 2D mazes (Crowe et al. 2000; Crowe et al. 2004).
Fig. 8. Hypothesized model of the cognitive factors driving eye movements during navigation.

In novel environments, eye movements might be used primarily to sample information from the world to learn a model via active sensing (left). In familiar environments with a good internal model, eye movements might primarily embody mental computations such as internal deliberation or planning, and thus correspond to samples from memory (right).
Planning and active sensing share many similarities. In fact, Hunt et al. (2021) noted that both processes lead to the creation of new policies, although the source of information during planning is one's cognitive map (Callaway et al. 2022), whereas during active sensing it is the environment (Hunt et al. 2021). However, the actions chosen by planning with an internal model stored in memory (Mattar and Daw, 2018) and those chosen by active sensing to minimize uncertainty in that internal model (Yang et al. 2018) may coincide if the respective state spaces governing the two cognitive processes are aligned, as is likely the case when we go about our lives mindfully, watching where we are going. This is indeed the assumption underlying active inference theory, which provides a computational, process-level account of how model-based planning and model learning occur simultaneously in the service of the general principle of minimizing surprise (Friston et al. 2016; Kaplan et al. 2018; Parr et al. 2019).
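The uncertainty-minimizing side of this correspondence can be illustrated with a minimal Bayesian sketch. The candidate fixation locations ("A" through "D") and their belief probabilities below are invented for illustration; the point is that, with a noiseless sensor, the expected information gain of fixating a location reduces to the current entropy of the belief about that location, so an active sensor simply looks where its internal model is most uncertain.

```python
import math

def entropy(p):
    """Entropy (bits) of a Bernoulli belief that a location is blocked."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

# Belief that each candidate location contains an obstacle (0.5 = most uncertain).
belief = {"A": 0.02, "B": 0.5, "C": 0.9, "D": 0.35}

def next_fixation(belief):
    """Active-sensing policy: fixate the location with maximal expected info gain.

    For a noiseless observation, expected information gain equals the current
    entropy of the belief, so we just pick the most uncertain location.
    """
    return max(belief, key=lambda loc: entropy(belief[loc]))

def update(belief, loc, observed_blocked):
    """A noiseless observation collapses the belief at the fixated location."""
    new = dict(belief)
    new[loc] = 1.0 if observed_blocked else 0.0
    return new

loc = next_fixation(belief)  # the most uncertain location is fixated first
belief = update(belief, loc, observed_blocked=False)
```

If the planner's state space is aligned with this belief map, the locations that most reduce model uncertainty are also the ones that most constrain the plan, which is one way the actions chosen by the two processes can coincide.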
We propose that these principles governing the functional roles of eye movements during visually guided navigation may explain why the contents of gaze have been found to influence activity in the primate hippocampus and entorhinal cortex (Turk-Browne, 2019; Liu et al. 2017; Monaco et al. 2014; Jun et al. 2016; Fotowat et al. 2019; Ringo et al. 1994). It is therefore conceivable that sequential neural activity emerges partly from consolidating temporally extended eye movements, such as sweeps, into a map of spatial relationships. Simultaneous recordings from brain areas involved in visual processing, eye movement control, and the hippocampal formation could uncover the mechanisms underlying trajectory-sweeping eye movements and their relationship to planning and memory.
Taken together, we propose that the spatiotemporal dynamics of gaze are significantly shaped by the cognitive computations underlying sequential decision-making tasks like navigation. The neural circuitry governing the oculomotor system schedules and allocates resources to meet the diverse cognitive demands of navigation, producing efficient eye movements through space and time. Planning is often regarded as a covert computation involving mental simulation, akin to directing covert attention towards one's memory without moving a sensory apparatus. The premotor theory of attention states that premotor circuits covertly orient attention even when no movement is made during active sensing (Rizzolatti et al. 1987), and the theta rhythm may play a major role in this process (Buzsaki 2002). However, as demonstrated by Zhu et al. (2022), planning may also take place in conjunction with active information search, which can be studied overtly by tracking the movements of animals' sensory apparatuses (Hunt et al. 2021).
Eye movements can reveal the dynamics of active sensing during many common tasks, such as viewing natural images (Yarbus 1967; Noton et al. 1971), searching for a target stimulus (Najemnik et al. 2005), and making decisions via visual inference (Orquin et al. 2013). Here we recapitulate the definition of active sensing as sensing guided by attention and intent (Wundt 1902; Schroeder et al. 2010; Zweifel et al. 2020). Where you direct your fovea in visual space during active sensing is where you direct your spotlight of attention, and this spotlight is moved by saccades to turn visual information into a temporal sequence of samples. RL provides a formal interpretation of active sensing as the optimization of information sampling to improve knowledge about the environment, allowing for better planning and ultimately greater long-term reward (Yang et al. 2018). Although we have focused here on vision and overt active sensing, sensing can also be directed covertly, without overt movement of a sensory apparatus.
Concluding Remarks
The differences between the signals found in the primate versus the rodent hippocampal formation are consistent with the species' divergent evolutionary adaptations. While rodents might learn a model of the environment mainly through physical experience, primates have evolved highly advanced visual and semantic systems to rapidly gather the most relevant information (Prusky et al. 2004; Huberman et al. 2011).
However, these findings do not preclude a common thread between the mammalian hippocampal formations of different species: the interplay between a structural code, sensory inputs, and conjunctive binding (Mathis et al. 2015; Whittington et al. 2020), with these codes serving prediction and being updated through replay. The multiplexed nature of the hippocampal code (Gulli et al. 2020; Mao et al. 2021) supports its previously hypothesized role in binding information about relational structure to sensory information (Whittington et al. 2020). Replay has been observed in humans (Schapiro et al. 2018; Liu et al. 2019; Schuck et al. 2019), and behavioral evidence of replay has been recorded in monkeys (Zuo et al. 2020; Shushruth et al. 2022). Uncovering neural evidence of replay in monkeys may require a combination of innovative experimental paradigms, advanced computational methods to detect sequential activity from sparse data, and more powerful recording techniques to capture the responses of many spatially tuned hippocampal neurons simultaneously.
On the other hand, spatial place and grid cell tuning is at best weak in monkeys and humans, although grid-like firing has been exhibited by humans evaluating stimuli along arbitrary dimensions such as neck and leg length (Constantinescu et al. 2016), olfactory space (Bao et al. 2019), semantic space (Vigano et al. 2021), and object spaces (Mack et al. 2016). It is conceivable that the structural code in primates is high-dimensional (Mathis et al. 2015) and flexibly modulated by task demands, such that it does not reveal itself as a place/grid code during physical navigation, although it becomes effectively low-dimensional when the attended stimulus is on a flat screen (Killian et al. 2012). Nevertheless, hippocampal activation during abstract-space navigation has also been described in rodents (Aronov et al. 2017; Sun et al. 2020).
One possibility is that the capacity for abstraction, conferred by a more advanced neocortex, is accompanied by the capacity for mental travel to locations and concepts outside the body (Tulving 1985; Corballis et al. 2019), and that the neural apparatus originally evolved to represent an animal's immediate position in its subjective model of the world now primarily encodes, in species like primates, the locus of attention in an appropriately allocated neural subspace. For example, in a virtual reality study where humans learned about the location and identity of treasures by exploring an arena from an egocentric perspective, but performed spatial recall from an allocentric vantage point, electrophysiological signals in the hippocampus mainly reflected allocentric goal locations (Tsitsiklis et al. 2020).
Perhaps in primates, most natural behaviors entail mentally travelling through a series of abstract goals and subgoals, and then bridging our bodies to where our minds are. In studies of eye movements during everyday tasks, humans tend to look at the bounce point of balls, the end point of reaches, the principal ingredient in the next step of a long procedure like making a sandwich, and so on (Land et al. 2001; Hayhoe et al. 2005; Foulsham 2015). Since the head-facing direction or gaze of an animal often reflects its object of attention, it makes sense that the contents of vision and eye movements are often reflected in the activity of the hippocampal formation as new spatial relationships are being learned via active sensing (Zhu et al. 2022). On the other hand, eye movements observed during path integration by humans and monkeys suggest a cognitive advantage of embodying through gaze the contents of core cognition (Lakshminarasimhan et al. 2020). Such embodiment is most likely to occur in familiar environments, reflects the contents of memory recall or mental simulation, and involves information flow in the other direction: from the hippocampal formation to cortical areas controlling eye movements. Developing a comprehensive computational theory of the mechanisms governing learning and planning in primates requires further empirical studies of bidirectional communication between the hippocampal formation and oculomotor centers during naturalistic behaviors.
References
- Agmon H, & Burak Y (2020). A theory of joint attractor dynamics in the hippocampus and the entorhinal cortex accounts for artificial remapping and grid cell field-to-field variability. ELife, 9. [DOI] [PMC free article] [PubMed] [Google Scholar]
- Alefantis P, Lakshminarasimhan K, Avila E, Noel JP, Pitkow X, and Angelaki DE (2022). Sensory Evidence Accumulation Using Optic Flow in a Naturalistic Navigation Task. Journal of Neuroscience, 42(27):5451–5462. [DOI] [PMC free article] [PubMed] [Google Scholar]
- Alexander AS, Carstensen LC, Hinman JR, Raudies F, William Chapman G, & Hasselmo ME (2020). Egocentric boundary vector tuning of the retrosplenial cortex. Science Advances, 6(8). 10.1126/sciadv.aaz2322 [DOI] [PMC free article] [PubMed] [Google Scholar]
- Angelaki DE, Ng J, Abrego AM, Cham HX, Asprodini EK, Dickman JD, & Laurens J (2020). A gravity-based three-dimensional compass in the mouse brain. Nature Communications, 11(1). [DOI] [PMC free article] [PubMed] [Google Scholar]
- Aronov D, Nevers R, & Tank DW (2017). Mapping of a non-spatial dimension by the hippocampal-entorhinal circuit. Nature, 543(7647). [DOI] [PMC free article] [PubMed] [Google Scholar]
- Balaguer J, Spiers H, Hassabis D, and Summerfield C (2016). Neural Mechanisms of Hierarchical Planning in a Virtual Subway Network. Neuron, 90(4):893–903. [DOI] [PMC free article] [PubMed] [Google Scholar]
- Bao X, Gjorgieva E, Shanahan LK, Howard JD, Kahnt T, & Gottfried JA (2019). Grid-like Neural Representations Support Olfactory Navigation of a Two-Dimensional Odor Space. Neuron, 102(5). [DOI] [PMC free article] [PubMed] [Google Scholar]
- Behrens TEJ, Muller TH, Whittington JCR, Mark S, Baram AB, Stachenfeld KL, & Kurth-Nelson Z (2018). What Is a Cognitive Map? Organizing Knowledge for Flexible Behavior. Neuron, 100(2), 490–509. [DOI] [PubMed] [Google Scholar]
- Bernardi S, Benna MK, Rigotti M, Munuera J, Fusi S, and Salzman CD (2020). The Geometry of Abstraction in the Hippocampus and Prefrontal Cortex. Cell, 183(4):954–967.e21. [DOI] [PMC free article] [PubMed] [Google Scholar]
- Bridgeman B and Tseng P (2011). Embodied cognition and the perception-action link. Physics of Life Reviews, 8(1):73–85. [DOI] [PubMed] [Google Scholar]
- Burak Y, & Fiete IR (2009). Accurate path integration in continuous attractor network models of grid cells. PLoS Computational Biology, 5(2). [DOI] [PMC free article] [PubMed] [Google Scholar]
- Buzsaki G (2002). Theta Oscillations in the Hippocampus. Neuron, 33:1–16. [DOI] [PubMed] [Google Scholar]
- Buzsáki G and Tingley D (2018). Space and Time: The Hippocampus as a Sequence Generator. Trends in Cognitive Sciences, 22(10):853–869. [DOI] [PMC free article] [PubMed] [Google Scholar]
- Callaway F, van Opheusden B, Gul S, Das P, Krueger PM, Griffiths TL, and Lieder F (2022). Rational use of cognitive resources in human planning. Nature Human Behaviour, 6(8):1112–1125. [DOI] [PubMed] [Google Scholar]
- Constantinescu AO, O’Reilly JX, and Behrens TE (2016). Organizing conceptual knowledge in humans with a gridlike code. Science, 352(6292):1464–1468. [DOI] [PMC free article] [PubMed] [Google Scholar]
- Corballis MC (2019). Language, memory, and mental time travel: An evolutionary perspective. Frontiers in Human Neuroscience, 13(July):1–9. [DOI] [PMC free article] [PubMed] [Google Scholar]
- Courellis HS, Nummela SU, Metke M, Diehl GW, Bussell R, Cauwenberghs G, and Miller CT (2019). Spatial encoding in primate hippocampus during free navigation. PLoS Biology, 17(12):1–42. [DOI] [PMC free article] [PubMed] [Google Scholar]
- Diba K, & Buzsaki G (2007). Forward and reverse hippocampal place-cell sequences during ripples. Nature Neuroscience, 10(10), 1241–1242. [DOI] [PMC free article] [PubMed] [Google Scholar]
- Doeller CF, Barry C, & Burgess N (2010). Evidence for grid cells in a human memory network. Nature, 463(7281). [DOI] [PMC free article] [PubMed] [Google Scholar]
- Eichenbaum H (2017). The role of the hippocampus in navigation is memory. Journal of Neurophysiology, 117(4):1785–1796. [DOI] [PMC free article] [PubMed] [Google Scholar]
- Ekstrom AD, Kahana MJ, Caplan JB, Fields TA, Isham EA, Newman EL, and Fried I (2003). Cellular networks underlying human spatial navigation. Nature, 425(6954):184–187. [DOI] [PubMed] [Google Scholar]
- Ekstrom AD, Caplan JB, Ho E, Shattuck K, Fried I, & Kahana MJ (2005). Human hippocampal theta activity during virtual navigation. Hippocampus, 15(7). [DOI] [PubMed] [Google Scholar]
- Eliav T, Geva-Sagiv M, Yartsev MM, Finkelstein A, Rubin A, Las L, & Ulanovsky N (2018). Nonoscillatory Phase Coding and Synchronization in the Bat Hippocampal Formation. Cell, 175(4). [DOI] [PubMed] [Google Scholar]
- Erdem UM, & Hasselmo ME (2012). A Goal-Directed Spatial Navigation Model Using Forward Trajectory Planning Based on Grid Cells. 35(6), 916–931. [DOI] [PMC free article] [PubMed] [Google Scholar]
- Erdem UM, & Hasselmo ME (2014). A biologically inspired hierarchical goal directed navigation model. Journal of Physiology Paris, 108(1). [DOI] [PMC free article] [PubMed] [Google Scholar]
- Feigenbaum JD and Rolls ET (1991). Allocentric and egocentric spatial information processing in the hippocampal formation of the behaving primate. Psychobiology, 19(1):21–40. [Google Scholar]
- Finkelstein A, Derdikman D, Rubin A, Foerster JN, Las L, & Ulanovsky N (2014). Three-dimensional head-direction coding in the bat brain. Nature, 517(7533). [DOI] [PubMed] [Google Scholar]
- Foglia L and Wilson RA (2013). Embodied cognition. Wiley Interdisciplinary Reviews: Cognitive Science, 4(3):319–325. [DOI] [PubMed] [Google Scholar]
- Foster DJ, & Wilson MA (2006). Reverse replay of behavioural sequences in hippocampal place cells during the awake state. Nature, 440(7084). [DOI] [PubMed] [Google Scholar]
- Fotowat H, Lee C, Jun JJ, and Maler L (2019). Neural activity in a hippocampus-like region of the teleost pallium are associated with navigation and active sensing. eLife, 8:e44119. [DOI] [PMC free article] [PubMed] [Google Scholar]
- Foulsham T (2015). Eye movements and their functions in everyday tasks. Eye (Basingstoke), 29(2):196–199. [DOI] [PMC free article] [PubMed] [Google Scholar]
- Friston K, FitzGerald T, Rigoli F, Schwartenbeck P, O’Doherty J, and Pezzulo G (2016). Active inference and learning. Neuroscience and Biobehavioral Reviews, 68:862–879. [DOI] [PMC free article] [PubMed] [Google Scholar]
- Furuya Y, Matsumoto J, Hori E, Boas CV, Tran AH, Shimada Y, Ono T, and Nishijo H (2014). Place-related neuronal activity in the monkey parahippocampal gyrus and hippocampal formation during virtual navigation. Hippocampus, 24(1):113–130. [DOI] [PubMed] [Google Scholar]
- Fyhn M, Hafting T, Treves A, Moser MB, and Moser EI (2007). Hippocampal remapping and grid realignment in entorhinal cortex. Nature, 446(7132):190–194. [DOI] [PubMed] [Google Scholar]
- Garvert MM, Dolan RJ, and Behrens TE (2017). A map of abstract relational knowledge in the human hippocampalentorhinal cortex. eLife, 6:1–20. [DOI] [PMC free article] [PubMed] [Google Scholar]
- Georges-François P, Rolls ET, and Robertson RG (1999). Spatial view cells in the primate hippocampus: Allocentric view not head direction or eye position or place. Cerebral Cortex, 9(3):197–212. [DOI] [PubMed] [Google Scholar]
- Gershman SJ, & Daw ND (2017). Reinforcement Learning and Episodic Memory in Humans and Animals: An Integrative Framework. Annual Review of Psychology, 68. [DOI] [PMC free article] [PubMed] [Google Scholar]
- Gottlieb J and Oudeyer PY (2018). Towards a neuroscience of active sampling and curiosity. Nature Reviews Neuroscience, 19(12):758–770. [DOI] [PubMed] [Google Scholar]
- Gulli RA, Duong LR, Corrigan BW, Doucet G, Williams S, Fusi S, and Martinez-Trujillo JC (2020). Context-dependent representations of objects and space in the primate hippocampus during virtual navigation. Nature Neuroscience, 23(1):103–112. [DOI] [PubMed] [Google Scholar]
- Gupta AS, van der Meer MAA, Touretzky DS, & Redish AD (2012). Segmentation of spatial experience by hippocampal theta sequences. Nature Neuroscience, 15(7), 1032–1039. [DOI] [PMC free article] [PubMed] [Google Scholar]
- Hafting T, Fyhn M, Molden S, Moser MB, and Moser EI (2005). Microstructure of a spatial map in the entorhinal cortex. Nature, 436(7052):801–806. [DOI] [PubMed] [Google Scholar]
- Hannula D and Ranganath C (2009). The Eyes Have It: Hippocampal Activity Predicts Expression of Memory in Eye Movements. Neuron, 63(5):592–599. [DOI] [PMC free article] [PubMed] [Google Scholar]
- Hannula DE, Althoff RR, Warren DE, Riggs L, Cohen NJ, and Ryan JD (2010). Worth a glance: Using eye movements to investigate the cognitive neuroscience of memory. Frontiers in Human Neuroscience, 4(October):1–16. [DOI] [PMC free article] [PubMed] [Google Scholar]
- Hartley T, Lever C, Burgess N, & O’Keefe J (2014). Space in the brain: How the hippocampal formation supports spatial cognition. In Philosophical Transactions of the Royal Society B: Biological Sciences (Vol. 369, Issue 1635). [DOI] [PMC free article] [PubMed] [Google Scholar]
- Hasselmo ME, & Eichenbaum Howard. (2005). Hippocampal mechanisms for the context-dependent retrieval of episodes. Neural Networks, 18(9). [DOI] [PMC free article] [PubMed] [Google Scholar]
- Hayhoe M and Ballard D (2005). Eye movements in natural behavior. Trends in Cognitive Sciences, 9(4):188–194. [DOI] [PubMed] [Google Scholar]
- Hazama Y, & Tamura R (2019). Effects of self-locomotion on the activity of place cells in the hippocampus of a freely behaving monkey. Neuroscience Letters, 701. [DOI] [PubMed] [Google Scholar]
- Hinman JR, Dannenberg H, Alexander AS, & Hasselmo ME (2018). Neural mechanisms of navigation involving interactions of cortical and subcortical structures. In Journal of Neurophysiology (Vol. 119, Issue 6). [DOI] [PMC free article] [PubMed] [Google Scholar]
- Huberman AD and Niell CM (2011). What can mice tell us about how vision works? Trends Neursoci, 34(9):464–473. [DOI] [PMC free article] [PubMed] [Google Scholar]
- Hunt LT, Daw ND, Kaanders P, MacIver MA, Mugan U, Procyk E, Redish AD, Russo E, Scholl J, Stachenfeld K, Wilson CR, and Kolling N (2021). Formalizing planning and information search in naturalistic decision-making. Nature Neuroscience, 24(8):1051–1064. [DOI] [PubMed] [Google Scholar]
- Jacobs J, Weidemann CT, Miller JF, Solway A, Burke JF, Wei XX, Suthana N, Sperling MR, Sharan AD, Fried I, & Kahana MJ (2013). Direct recordings of grid-like neuronal activity in human spatial navigation. Nature Neuroscience, 16(9). [DOI] [PMC free article] [PubMed] [Google Scholar]
- Jain D, Jakhalekar IR, and Deshmukh SS (2017). Navigational Strategies and Their Neural Correlates. Journal of the Indian Institute of Science, 97(4):511–525. [DOI] [PMC free article] [PubMed] [Google Scholar]
- Jeffery KJ (2011). Place cells, grid cells, attractors, and remapping. Neural Plasticity, pp. [DOI] [PMC free article] [PubMed] [Google Scholar]
- Johnson A, & Redish AD (2007). Neural ensembles in CA3 transiently encode paths forward of the animal at a decision point. Journal of Neuroscience, 27(45), 12176–12189. [DOI] [PMC free article] [PubMed] [Google Scholar]
- Jun JJ, Longtin A, and Maler L (2016). Active sensing associated with spatial learning reveals memory-based attention in an electric fish. Journal of Neurophysiology, 115(5):2577–2592. [DOI] [PMC free article] [PubMed] [Google Scholar]
- Jutras MJ, Fries P, and Buffalo EA (2013). Oscillatory activity in the monkey hippocampus during visual exploration and memory formation. Proceedings of the National Academy of Sciences of the United States of America, 110(32):13144–13149. [DOI] [PMC free article] [PubMed] [Google Scholar]
- Kaplan R and Friston KJ (2018). Planning and navigation as active inference. Biological Cybernetics, 112(4):323–343. [DOI] [PMC free article] [PubMed] [Google Scholar]
- Karlsson MP, & Frank LM (2009). Awake replay of remote experiences in the hippocampus. Nature Neuroscience, 12(7). [DOI] [PMC free article] [PubMed] [Google Scholar]
- Killian NJ, Jutras MJ, and Buffalo EA (2012). A map of visual space in the primate entorhinal cortex. Nature, 491(7426):761–764. [DOI] [PMC free article] [PubMed] [Google Scholar]
- Krupic J, Bauza M, Burton S, Barry C, and O’Keefe J (2015). Grid cell symmetry is shaped by environmental geometry. Nature, 518(7538):232–235. [DOI] [PMC free article] [PubMed] [Google Scholar]
- Kubie JL, & Fenton AA (2012). Linear look-ahead in conjunctive cells: An entorhinal mechanism for vector-based navigation. Frontiers in Neural Circuits, APRIL2012. [DOI] [PMC free article] [PubMed] [Google Scholar]
- LaChance PA, Todd TP, & Taube JS (2019). A sense of space in postrhinal cortex. Science, 365(6449). [DOI] [PMC free article] [PubMed] [Google Scholar]
- Lakshminarasimhan KJ, Avila E, Neyhart E, DeAngelis GC, Pitkow X, and Angelaki DE (2020). Tracking the Mind’s Eye: Primate Gaze Behavior during Virtual Visuomotor Navigation Reflects Belief Dynamics. Neuron, 106(4):662–674.e5. [DOI] [PMC free article] [PubMed] [Google Scholar]
- Lakshminarasimhan KJ, Avila E, Pitkow X, and Angelaki DE (2022). Dynamical Latent State Computation in the Posterior Parietal Cortex. bioRxiv, page 2022.01.12.476065. [DOI] [PMC free article] [PubMed] [Google Scholar]
- Lakshminarasimhan KJ, Petsalis M, Park H, DeAngelis GC, Pitkow X, and Angelaki DE (2018). A Dynamic Bayesian Observer Model Reveals Origins of Bias in Visual Path Integration. Neuron, 99(1):194–206.e5. [DOI] [PMC free article] [PubMed] [Google Scholar]
- Land MF and Hayhoe M (2001). In what ways do eye movements contribute to everyday activities? Vision Research, 41(25–26):3559–3565. [DOI] [PubMed] [Google Scholar]
- Land MF and Lee DN (1994). Where we look when we steer. Nature, 369(6483):742–744. [DOI] [PubMed] [Google Scholar]
- Larson AM and Loschky LC (2009). The contributions of central versus peripheral vision to scene gist recognition. Journal of Vision, 9(10):1–16. [DOI] [PubMed] [Google Scholar]
- Laurens J, Kim B, Dickman JD, & Angelaki DE (2016). Gravity orientation tuning in macaque anterior thalamus. In Nature Neuroscience (Vol. 19, Issue 12). [DOI] [PMC free article] [PubMed] [Google Scholar]
- Lavenex PB, Amaral DG, & Lavenex P (2006). Hippocampal lesion prevents spatial relational learning in adult macaque monkeys. Journal of Neuroscience, 26(17). [DOI] [PMC free article] [PubMed] [Google Scholar]
- Lee AK, & Wilson MA (2002). Memory of sequential experience in the hippocampus during slow wave sleep. Neuron, 36(6), 1183–1194. [DOI] [PubMed] [Google Scholar]
- Lever C, Burton S, Jeewajee A, O’Keefe J, & Burgess N (2009). Boundary vector cells in the subiculum of the hippocampal formation. Journal of Neuroscience, 29(31). [DOI] [PMC free article] [PubMed] [Google Scholar]
- Liu Y, Dolan RJ, Kurth-Nelson Z, and Behrens TE (2019). Human Replay Spontaneously Reorganizes Experience. Cell, 178(3):640–652.e14. [DOI] [PMC free article] [PubMed] [Google Scholar]
- Liu Y, Mattar MG, Behrens TE, Daw ND, and Dolan RJ (2021). Experience replay is associated with efficient nonlocal learning. Science, 372(6544). [DOI] [PMC free article] [PubMed] [Google Scholar]
- Liu ZX, Shen K, Olsen RK, and Ryan JD (2017). Visual sampling predicts hippocampal activity. Journal of Neuroscience, 37(3):599–609. [DOI] [PMC free article] [PubMed] [Google Scholar]
- Ludvig N, Tang HM, Gohil BC, & Botero JM (2004). Detecting location-specific neuronal firing rate increases in the hippocampus of freely-moving monkeys. Brain Research, 1014(1–2). [DOI] [PubMed] [Google Scholar]
- Luke SG, Darowski ES, and Gale SD (2018). Predicting eye-movement characteristics across multiple tasks from working memory and executive control. Memory and Cognition, 46(5):826–839. [DOI] [PubMed] [Google Scholar]
- Mack ML, Love BC, & Preston AR (2016). Dynamic updating of hippocampal object representations reflects new conceptual knowledge. Proceedings of the National Academy of Sciences of the United States of America, 113(46). [DOI] [PMC free article] [PubMed] [Google Scholar]
- Maguire EA, Gadian DG, Johnsrude IS, Good CD, Ashburner J, Frackowiak RS, and Frith CD (2000). Navigation-related structural change in the hippocampi of taxi drivers. Proceedings of the National Academy of Sciences of the United States of America, 97(8):4398–4403. [DOI] [PMC free article] [PubMed] [Google Scholar]
- Maisson DJ, Wikenheiser A, Noel JG, Keinath AT. (2022) Making Sense of the Multiplicity and Dynamics of Navigational Codes in the Brain. J Neurosci. 42(45):8450–8459. [DOI] [PMC free article] [PubMed] [Google Scholar]
- Mallory CS, Hardcastle K, Campbell MG, Attinger A, Low IIC, Raymond JL, & Giocomo LM (2021). Mouse entorhinal cortex encodes a diverse repertoire of self-motion signals. Nature Communications, 12(1). [DOI] [PMC free article] [PubMed] [Google Scholar]
- Mao D, Avila E, Caziot B, Laurens J, Dickman JD, and Angelaki DE (2021). Spatial modulation of hippocampal activity in freely moving macaques. Neuron, 109(21):3521–3534.e6. [DOI] [PMC free article] [PubMed] [Google Scholar]
- Mathis A, Stemmier MB, and Herz AV (2015). Probable nature of higher-dimensional symmetries underlying mammalian grid-cell activity patterns. eLife, 2015(4):1–29. [DOI] [PMC free article] [PubMed] [Google Scholar]
- Matsumura N, Nishijo H, Tamura R, Eifuku S, Endo S, & Ono T (1999). Spatial- and task-dependent neuronal responses during real and virtual translocation in the monkey hippocampal formation. Journal of Neuroscience, 19(6).1999. [DOI] [PMC free article] [PubMed] [Google Scholar]
- Mattar MG and Daw ND (2018). Prioritized memory access explains planning and hippocampal replay. Nature Neuroscience, 21(11):1609–1617. [DOI] [PMC free article] [PubMed] [Google Scholar]
- Mattar MG and Lengyel M (2022). Planning in the brain. Neuron, 110(6):914–934. [DOI] [PubMed] [Google Scholar]
- McNaughton BL, Battaglia FP, Jensen O, Moser EI, & Moser MB (2006). Path integration and the neural basis of the “cognitive map.” Nature Reviews Neuroscience, 7(8), 663–678. [DOI] [PubMed] [Google Scholar]
- Mehta MR, Quirk MC, and Wilson MA (2000). Experience-dependent asymmetric shape of hippocampal receptive fields. Neuron, 25(3):707–715. [DOI] [PubMed] [Google Scholar]
- Meister ML and Buffalo EA (2018). Neurons in primate entorhinal cortex represent gaze position in multiple spatial reference frames. Journal of Neuroscience, 38(10):2430–2441. [DOI] [PMC free article] [PubMed] [Google Scholar]
- Michaiel AM, Abe ETT, & Niell CM (2020). Dynamics of gaze control during prey capture in freely moving mice. ELife, 9. [DOI] [PMC free article] [PubMed] [Google Scholar]
- Miller JF, Neufang M, Solway A, Brandt A, Trippel M, Mader I, Hefft S, Merkow M, Polyn SM, Jacobs J, and Michael J (2013). Spatial Context of Retrieved Memories. Science, 342(6162):1111–1114. [DOI] [PMC free article] [PubMed] [Google Scholar]
- Mizuseki K, Sirota A, Pastalkova E, & Buzsáki G (2009). Theta Oscillations Provide Temporal Windows for Local Circuit Computation in the Entorhinal-Hippocampal Loop. Neuron, 64(2). [DOI] [PMC free article] [PubMed] [Google Scholar]
- Momennejad I, Russek EM, Cheong JH, Botvinick MM, Daw ND, and Gershman SJ (2017). The successor representation in human reinforcement learning. Nature Human Behaviour, 1(9):680–692. [DOI] [PMC free article] [PubMed] [Google Scholar]
- Monaco J, Rao G, Roth E, and Knierim J (2014). Attentive Scanning Behavior Drives One-Trial Potentiation of Hippocampal Place Fields. Nature Neuroscience, 17(5):725–731. [DOI] [PMC free article] [PubMed] [Google Scholar]
- Muller RU and Kubie JL (1987). The effects of changes in the environment on the spatial firing of hippocampal complex-spike cells. Journal of Neuroscience, 7(7):1951–1968. [DOI] [PMC free article] [PubMed] [Google Scholar]
- Najemnik J and Geisler WS (2005). Optimal eye movement strategies in visual search. Nature, 434(7031):387–391. [DOI] [PubMed] [Google Scholar]
- Nau M, Navarro Schröder T, Bellmund JLS, and Doeller CF (2018). Hexadirectional coding of visual space in human entorhinal cortex. Nature Neuroscience, 21(2). [DOI] [PubMed] [Google Scholar]
- Ngo V, Gorman JC, De la Fuente MF, Souto A, Schiel N, and Miller CT (2022). Active vision during prey capture in wild marmoset monkeys. Current Biology, 32(15):3423–3428.e3. [DOI] [PMC free article] [PubMed] [Google Scholar]
- Noel J-P, Balzani E, Avila E, Lakshminarasimhan KJ, Bruni S, Alefantis P, Savin C, and Angelaki DE (2022). Coding of latent variables in sensory, parietal, and frontal cortices during closed-loop virtual navigation. eLife, 11:1–25. [DOI] [PMC free article] [PubMed] [Google Scholar]
- Noton D and Stark L (1971). Scanpaths in eye movements during pattern perception. Science, 171(3968):308–311. [DOI] [PubMed] [Google Scholar]
- Nowicka A and Ringo JL (2000). Eye position-sensitive units in hippocampal formation and in inferotemporal cortex of the Macaque monkey. European Journal of Neuroscience, 12(2). [DOI] [PubMed] [Google Scholar]
- O’Keefe J (1976). Place units in the hippocampus of the freely moving rat. Experimental Neurology, 51(1):78–109. [DOI] [PubMed] [Google Scholar]
- O’Keefe J and Nadel L (1978). The Hippocampus as a Cognitive Map. Oxford University Press. [Google Scholar]
- O’Mara SM, Rolls ET, Berthoz A, and Kesner RP (1994). Neurons responding to whole-body motion in the primate hippocampus. Journal of Neuroscience, 14(11). [DOI] [PMC free article] [PubMed] [Google Scholar]
- Orquin JL and Mueller Loose S (2013). Attention and choice: A review on eye movements in decision making. Acta Psychologica, 144(1):190–206. [DOI] [PubMed] [Google Scholar]
- Parr T and Friston KJ (2019). Generalised free energy and active inference. Biological Cybernetics, 113(5–6):495–513. [DOI] [PMC free article] [PubMed] [Google Scholar]
- Payne HL and Raymond JL (2017). Magnetic eye tracking in mice. eLife, 6. [DOI] [PMC free article] [PubMed] [Google Scholar]
- Pfeiffer BE and Foster DJ (2013). Hippocampal place-cell sequences depict future paths to remembered goals. Nature, 497(7447):74–79. [DOI] [PMC free article] [PubMed] [Google Scholar]
- Postle BR, Sala SD, and Baddeley AD (2006). The selective disruption of spatial working memory by eye movements. Quarterly Journal of Experimental Psychology, 59(1):100–120. [DOI] [PMC free article] [PubMed] [Google Scholar]
- Poulter S, Hartley T, and Lever C (2018). The Neurobiology of Mammalian Navigation. Current Biology, 28(17):R1023–R1042. [DOI] [PubMed] [Google Scholar]
- Prusky GT and Douglas RM (2004). Characterization of mouse cortical spatial vision. Vision Research, 44(28):3411–3418. [DOI] [PubMed] [Google Scholar]
- Ringo JL, Sobotka S, Diltz MD, and Bunce CM (1994). Eye movements modulate activity in hippocampal, parahippocampal, and inferotemporal neurons. Journal of Neurophysiology, 71(3):1285–1288. [DOI] [PubMed] [Google Scholar]
- Rizzolatti G, Riggio L, Dascola I, and Umiltá C (1987). Reorienting attention across the horizontal and vertical meridians: Evidence in favor of a premotor theory of attention. Neuropsychologia, 25(1):31–40. [DOI] [PubMed] [Google Scholar]
- Robertson RG, Rolls ET, and Georges-François P (1998). Spatial view cells in the primate hippocampus: Effects of removal of view details. Journal of Neurophysiology, 79(3):1145–1156. [DOI] [PubMed] [Google Scholar]
- Rolls ET, Robertson RG, and Georges-François P (1997). Spatial view cells in the primate hippocampus. European Journal of Neuroscience, 9(8):1789–1794. [DOI] [PubMed] [Google Scholar]
- Rolls ET (1999). Spatial view cells and the representation of place in the primate hippocampus. Hippocampus, 9(4):467–480. [DOI] [PubMed] [Google Scholar]
- Rolls ET (2007). An attractor network in the hippocampus: Theory and neurophysiology. Learning & Memory, 14(11). [DOI] [PubMed] [Google Scholar]
- Rolls ET (2021). Neurons including hippocampal spatial view cells, and navigation in primates including humans. Hippocampus, 31(6):593–611. [DOI] [PubMed] [Google Scholar]
- Rolls ET and O’Mara SM (1995). View-responsive neurons in the primate hippocampal complex. Hippocampus, 5(5):409–424. [DOI] [PubMed] [Google Scholar]
- Rolls ET (2022). Hippocampal spatial view cells for memory and navigation, and their underlying connectivity in humans. Hippocampus, 1–40. [DOI] [PMC free article] [PubMed] [Google Scholar]
- Rosenberg M, Zhang T, Perona P, and Meister M (2021). Mice in a labyrinth: Rapid learning, sudden insight, and efficient exploration. eLife, 10:1–30. [DOI] [PMC free article] [PubMed] [Google Scholar]
- Rueckemann JW and Buffalo EA (2017). Spatial responses, immediate experience, and memory in the monkey hippocampus. Current Opinion in Behavioral Sciences, 17:155–160. [DOI] [PMC free article] [PubMed] [Google Scholar]
- Russek EM, Momennejad I, Botvinick MM, Gershman SJ, and Daw ND (2017). Predictive representations can link model-based reinforcement learning to model-free mechanisms. PLOS Computational Biology, 13. [DOI] [PMC free article] [PubMed] [Google Scholar]
- Schapiro AC, McDevitt EA, Rogers TT, Mednick SC, and Norman KA (2018). Human hippocampal replay during rest prioritizes weakly learned information and predicts memory performance. Nature Communications, 9(1):1–11. [DOI] [PMC free article] [PubMed] [Google Scholar]
- Schroeder CE, Wilson DA, Radman T, Scharfman H, and Lakatos P (2010). Dynamics of Active Sensing and Perceptual Selection. Current Opinion in Neurobiology, 20(2):172–176. [DOI] [PMC free article] [PubMed] [Google Scholar]
- Schuck NW and Niv Y (2019). Sequential Replay of Non-spatial Task States in the Human Hippocampus. Science, 364(6447):1–24. [DOI] [PMC free article] [PubMed] [Google Scholar]
- Shamash P, Olesen SF, Iordanidou P, Campagner D, Banerjee N, and Branco T (2021). Mice learn multi-step routes by memorizing subgoal locations. Nature Neuroscience, 24(9):1270–1279. [DOI] [PMC free article] [PubMed] [Google Scholar]
- Shapiro L (2010). Embodied Cognition. Taylor & Francis Group. [Google Scholar]
- Shushruth S, Zylberberg A, and Shadlen MN (2022). Sequential sampling from memory underlies action selection during abstract decision-making. Current Biology, 32(9):1949–1960.e5. [DOI] [PMC free article] [PubMed] [Google Scholar]
- Sosa M and Giocomo LM (2021). Navigating for reward. Nature Reviews Neuroscience, 22(8):472–487. [DOI] [PMC free article] [PubMed] [Google Scholar]
- Stachenfeld KL, Botvinick MM, and Gershman SJ (2017). The hippocampus as a predictive map. Nature Neuroscience, 20(11):1643–1653. [DOI] [PubMed] [Google Scholar]
- Stavropoulos A, Lakshminarasimhan KJ, Laurens J, Pitkow X, and Angelaki DE (2022). Influence of sensory modality and control dynamics on human path integration. eLife, 11:e63405. [DOI] [PMC free article] [PubMed] [Google Scholar]
- Strange BA, Witter MP, Lein ES, and Moser EI (2014). Functional organization of the hippocampal longitudinal axis. Nature Reviews Neuroscience, 15(10). [DOI] [PubMed] [Google Scholar]
- Sun C, Yang W, Martin J, and Tonegawa S (2020). Hippocampal neurons represent events as transferable units of experience. Nature Neuroscience, 23(5). [DOI] [PMC free article] [PubMed] [Google Scholar]
- Talakoub O, Sayegh P, Womelsdorf T, Zinke W, Fries P, Lewis CM, and Hoffman KL (2019). Hippocampal and neocortical oscillations are tuned to behavioral state in freely-behaving macaques. bioRxiv. [Google Scholar]
- Taube JS (2007). The head direction signal: origins and sensory-motor integration. Annual Review of Neuroscience, 30:181–207. [DOI] [PubMed] [Google Scholar]
- Tessereau C, O’Dea R, Coombes S, and Bast T (2021). Reinforcement learning approaches to hippocampus-dependent flexible spatial navigation. Brain and Neuroscience Advances, 5:239821282097563. [DOI] [PMC free article] [PubMed] [Google Scholar]
- Theeuwes J, Belopolsky A, and Olivers CN (2009). Interactions between working memory, attention and eye movements. Acta Psychologica, 132(2):106–114. [DOI] [PubMed] [Google Scholar]
- Tolman EC (1948). Cognitive maps in rats and men. The Psychological Review, 55(4):189–208. [DOI] [PubMed] [Google Scholar]
- Tsitsiklis M, Miller J, Qasim SE, Inman CS, Gross RE, Willie JT, Smith EH, Sheth SA, Schevon CA, Sperling MR, Sharan A, Stein JM, and Jacobs J (2020). Single-Neuron Representations of Spatial Targets in Humans. Current Biology, 30(2):245–253.e4. [DOI] [PMC free article] [PubMed] [Google Scholar]
- Tulving E (1985). Memory and consciousness. Canadian Psychology, 26(1):1–12. [Google Scholar]
- Turk-Browne NB (2019). The hippocampus as a visual area organized by space and time: A spatiotemporal similarity hypothesis. Vision Research, 165:123–130. [DOI] [PMC free article] [PubMed] [Google Scholar]
- Ulanovsky N and Moss CF (2007). Hippocampal cellular and network activity in freely moving echolocating bats. Nature Neuroscience, 10(2):224–233. [DOI] [PubMed] [Google Scholar]
- Viganò S, Rubino V, di Soccio A, Buiatti M, and Piazza M (2021). Grid-like and distance codes for representing word meaning in the human brain. NeuroImage, 232. [DOI] [PubMed] [Google Scholar]
- Vikbladh OM, Meager MR, King J, Blackmon K, Devinsky O, Shohamy D, Burgess N, and Daw ND (2019). Hippocampal Contributions to Model-Based Planning and Spatial Memory. Neuron, 102(3):683–693.e4. [DOI] [PMC free article] [PubMed] [Google Scholar]
- Wann JP and Swapp DK (2000). Why you should look where you are going. Nature Neuroscience, 3(7):647–648. [DOI] [PubMed] [Google Scholar]
- Warren WH and Hannon DJ (1988). Direction of self-motion is perceived from optical flow. Nature, 336:162–163. [Google Scholar]
- Watrous AJ, Lee DJ, Izadi A, Gurkoff GG, Shahlaie K, and Ekstrom AD (2013). A Comparative Study of Human and Rat Hippocampal Low-Frequency Oscillations During Spatial Navigation. Hippocampus, 23(8). [DOI] [PMC free article] [PubMed] [Google Scholar]
- Whittington JC, McCaffary D, Bakermans JJ, and Behrens TE (2022). How to build a cognitive map. Nature Neuroscience, 25(10):1257–1272. [DOI] [PubMed] [Google Scholar]
- Whittington JC, Muller TH, Mark S, Chen G, Barry C, Burgess N, and Behrens TE (2020). The Tolman-Eichenbaum Machine: Unifying Space and Relational Memory through Generalization in the Hippocampal Formation. Cell, 183(5):1249–1263.e23. [DOI] [PMC free article] [PubMed] [Google Scholar]
- Wilming N, König P, König S, and Buffalo EA (2017). Entorhinal cortex receptive fields are modulated by spatial attention, even without movement. bioRxiv, 1–16. [DOI] [PMC free article] [PubMed] [Google Scholar]
- Wilson M (2002). Six views of embodied cognition. Psychonomic Bulletin & Review, 9(4):625–636. [DOI] [PubMed] [Google Scholar]
- Wilson MA and McNaughton BL (1993). Dynamics of the hippocampal ensemble code for space. Science, 261(5124):1055–1058. [DOI] [PubMed] [Google Scholar]
- Winocur G, Moscovitch M, Rosenbaum RS, and Sekeres M (2010). An investigation of the effects of hippocampal lesions in rats on pre- and postoperatively acquired spatial memory in a complex environment. Hippocampus, 20(12):1350–1365. [DOI] [PubMed] [Google Scholar]
- Wirth S, Baraduc P, Planté A, Pinède S, and Duhamel JR (2017). Gaze-informed, task-situated representation of space in primate hippocampus during virtual navigation. PLoS Biology, 15(2):1–28. [DOI] [PMC free article] [PubMed] [Google Scholar]
- Yang SCH, Lengyel M, and Wolpert DM (2016). Active sensing in the categorization of visual patterns. eLife, 5:1–22. [DOI] [PMC free article] [PubMed] [Google Scholar]
- Yang SCH, Wolpert DM, and Lengyel M (2018). Theoretical perspectives on active sensing. Current Opinion in Behavioral Sciences, 11:100–108. [DOI] [PMC free article] [PubMed] [Google Scholar]
- Yarbus AL (1967). Eye Movements and Vision. Springer; New York. [Google Scholar]
- Yoon K, Buice MA, Barry C, Hayman R, Burgess N, and Fiete IR (2013). Specific evidence of low-dimensional continuous attractor dynamics in grid cells. Nature Neuroscience, 16(8). [DOI] [PMC free article] [PubMed] [Google Scholar]
- Zhang H and Jacobs J (2015). Traveling theta waves in the human hippocampus. Journal of Neuroscience, 35(36):12477–12487. [DOI] [PMC free article] [PubMed] [Google Scholar]
- Zhu S, Lakshminarasimhan KJ, Arfaei N, and Angelaki DE (2022). Eye movements reveal spatiotemporal dynamics of visually-informed planning in navigation. eLife, 11:1–34. [DOI] [PMC free article] [PubMed] [Google Scholar]
- Zuo S, Wang L, Shin JH, Cai Y, Lee SW, Appiah K, Zhou YD, and Kwok SC (2020). Behavioral evidence for memory replay of video episodes in the macaque. eLife, 9:1–23. [DOI] [PMC free article] [PubMed] [Google Scholar]
- Zweifel NO and Hartmann MJ (2020). Defining active sensing through an analysis of sensing energetics: homeoactive and alloactive sensing. Journal of Neurophysiology, 124(1):40–48. [DOI] [PubMed] [Google Scholar]
