Published in final edited form as: Nat Neurosci. 2017 Oct 26;20(11):1465–1473. doi: 10.1038/nn.4658

Our sense of direction: progress, controversies and challenges

Kathleen E Cullen 1, Jeffrey S Taube 2

Abstract

In this Perspective, we evaluate current progress in understanding how the brain encodes our sense of direction, within the context of parallel work focused on how early vestibular pathways encode self-motion. In particular, we discuss how these systems work together and provide evidence that they involve common mechanisms. We first consider the classic view of the head direction cell and results of recent experiments in rodents and primates indicating that inputs to these neurons encode multimodal information during self-motion, such as proprioceptive and motor efference copy signals, including gaze-related information. We also consider the paradox that, while the head-direction network is generally assumed to generate a fixed representation of perceived directional heading, this computation would need to be dynamically updated when the relationship between a voluntary motor command and its sensory consequences changes. Such situations include navigation in virtual reality and under head-restrained conditions, since the natural relationship between visual and extravisual cues is altered.

Sense of direction versus a self-motion-detection system

Our ability to keep track of where we are going relative to where we have been requires knowledge of our ongoing location and orientation. In everyday life, this ability, or sense of direction, depends on the integration of both visual and nonvisual cues, including vestibular and proprioceptive information. The visual system provides retinal image-motion (optic flow) cues, which have long been known to provide a powerful sense of motion through an otherwise stationary environment1. In addition, extravisual cues make important contributions to navigation and are thought to contribute to path integration or dead reckoning (Box 1) in the absence of visual cues (for review see refs. 2,3). Notably, information sent to the brain by the vestibular nerve allows the estimation of head displacement (both linear and angular) in the absence of visual cues4–8. Further, during locomotion, proprioceptive and/or motor-related signals also contribute significantly to the estimation of self-motion. For example, estimates of distance traveled and self-velocity are more accurate when self-motion is generated by active locomotion than when subjects are passively moved9–12, and subjects can keep track of changes in heading during active motion in the absence of vision5.

Box 1. Terms.

Allothetic cues:

external sensory cues that can be used as landmarks; these cues are used in an episodic (as opposed to continuous) manner.

Forward model:

a process that predicts the expected sensory feedback of a motor command, allowing rapid error detection when actual and predicted feedback do not match.

Idiothetic cues:

internal body cues used to detect how the body and/or head is moving through space; to maintain spatial accuracy these cues must be used in a continuous (as opposed to episodic) manner; for example, vestibular, proprioceptive, motor efference copy.

Optic flow:

the flow of visual stimuli across the retina; this information can be used to determine how objects or the head is moving through the environment.

Path integration (dead reckoning):

the process whereby an organism updates its spatial orientation through the use of self-motion cues as it moves through the environment.

Spatial updating:

the process whereby an organism updates its spatial orientation after movement or a period of time has elapsed.

In everyday life, the vestibular system plays a vital role in a wide range of functions from reflexes to the highest levels of voluntary behavior. Notably, as we move through our environment, the vestibular system explicitly detects and encodes our self-motion in six dimensions: the semicircular canals and otolith end organs of the vestibular system detect three dimensions of rotational head velocity and three dimensions of translational head acceleration, respectively. This self-motion information is in turn essential for generating the fundamental reflexes that guarantee stable gaze and posture during our everyday activities, namely the vestibulo-ocular, vestibulocervical, and vestibulospinal reflexes13. Importantly, vestibular information is also sent to higher centers via ascending pathways. In particular, two main thalamocortical pathways—one anterior and the other posterior—transmit vestibular information to the cortex (Fig. 1; reviewed in refs. 14,15). Of the two, the anterior pathway has been more extensively studied. It comprises direct projections from the anterior dorsal thalamus to the retrosplenial cortex and multisynaptic projections to the entorhinal cortex (via presubiculum and parasubiculum), and it is thought to encode our static sense of direction during navigation and spatial memory tasks16. In contrast, the posterior pathway originates in the vestibular nuclei and projects to the ventral posterior thalamus, and then on to the somatosensory cortex and the parieto-insular vestibular cortex17. This pathway has been less extensively studied despite its likely contribution to the accurate perception and detection of self-motion (reviewed in ref. 18).
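
To make the computation these signals could support more concrete, the sketch below (hypothetical Python; the function and variable names are illustrative and not drawn from any cited study) integrates a canal-like angular-velocity signal into heading and, together with a forward-speed estimate, into position, which is the essence of path integration (dead reckoning) as defined in Box 1.

```python
import numpy as np

def integrate_heading_and_position(omega, speed, dt, heading0=0.0):
    """Toy path integrator (dead reckoning) from idiothetic cues.

    omega    -- yaw angular velocity samples in rad/s (a canal-like signal)
    speed    -- forward speed samples in m/s (e.g., from proprioceptive/otolith cues)
    dt       -- sample interval in s
    heading0 -- initial heading in rad
    """
    heading = heading0 + np.cumsum(omega) * dt      # integrate angular velocity -> heading
    x = np.cumsum(speed * np.cos(heading)) * dt     # integrate velocity along heading -> position
    y = np.cumsum(speed * np.sin(heading)) * dt
    return heading, np.stack([x, y], axis=1)

# Example: a 90-degree leftward turn while walking at 0.2 m/s for 2 s
dt = 0.01
t = np.arange(0.0, 2.0, dt)
omega = np.full_like(t, np.deg2rad(45.0))           # 45 deg/s for 2 s -> 90 deg in total
speed = np.full_like(t, 0.2)
heading, position = integrate_heading_and_position(omega, speed, dt)
print(np.rad2deg(heading[-1]))                      # ~90 degrees
```

Because such integration accumulates any bias in the velocity estimates, a system of this kind must be periodically reset by landmark (allothetic) information, a point that recurs throughout the discussion of the HD network below.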

Figure 1.

Overview highlighting functions and projections of the two ascending vestibular pathways: (i) the anterior vestibulothalamic pathway through the nucleus prepositus and supragenual nucleus (NPH/SGN) to the HD network (blue; see Fig. 2 for details) and (ii) the posterior vestibulothalamic pathway through the ventral posterior lateral nucleus (red). PIVC, parieto-insular vestibular cortex; VN, vestibular nuclei; VPL, ventral posterior lateral thalamic nucleus.

This anatomical division then raises the question: are these two ascending pathways independent or do they involve common mechanisms? Below, we first consider the prevailing view that, during navigation, vestibular information transmitted along the anterior thalamocortical pathway is integrated to generate our sense of direction by providing a critical input to the head-direction cell network. In this context, neurons within the anterior thalamic nuclei are tuned to specific static head directions. We then consider the implications of recent work establishing that early vestibular pathways integrate multimodal information to preferentially encode vestibular exafference (i.e., motion that is not self-generated). The implications of these recent studies are then considered in relation to both the ascending sense-of-direction and self-motion-detection pathways.

A quick review of the classic head direction system

Head direction (HD) cells, originally discovered in the rat dorsal presubiculum (postsubiculum) by James Ranck in 1984 (see foreword19), discharge in relation to the animal’s directional heading in the horizontal plane, independent of the animal’s location and behavior20,21. In this regard their firing is analogous to a compass, which always points in a particular direction (north) no matter where the compass is located, although for HD cells their activity does not appear to be sensitive to the earth’s geomagnetic field22 but rather to landmark cues and the earth’s gravity axis21,23. Each HD cell is tuned to a different direction, and a population of such cells uniformly represents the 360° range. Their firing is thought to encode a continuous representation of the animal’s perceived directional heading. Peak firing rates vary across different cells and can range from a few spikes per s to more than 100 spikes per s. Figure 2a depicts three typical HD cells from three different brain areas. Note the variations in peak firing rates, as well as tuning widths, across the three cells, findings that are not well understood in terms of their functional roles. The variations depicted are not necessarily characteristic of a particular brain area but in general can be observed across all areas where HD cells have been identified. HD cells are found in many brain areas throughout the limbic system, and studies have identified several brainstem sites in rats that are responsible for generating the signal. Self-movement cues are integrated in these brainstem areas, and the processed HD signal is propagated rostrally through a network that includes the anterodorsal thalamus (ADN), dorsal presubiculum, and entorhinal cortex (reviewed in ref. 19). Visual landmark information is integrated into this circuit via direct projections from visual cortex to the dorsal presubiculum (and to a lesser extent, the retrosplenial cortex), which then exerts top-down control with projections to the lateral mammillary nuclei (reviewed in ref. 24). HD cells have been mostly studied in rats, and to a lesser extent in mice, but they have also been identified in monkeys25 and indirectly in humans26.
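
As a concrete illustration of the tuning properties just described, HD tuning curves like those in Figure 2a are often summarized by a bell-shaped function of heading. The short Python sketch below uses a von Mises-shaped curve with illustrative (not measured) parameters to show how preferred direction, peak rate, and tuning width enter such a description.

```python
import numpy as np

def hd_tuning_curve(heading_deg, preferred_deg, peak_rate, kappa):
    """Idealized HD-cell tuning curve with a von Mises (circular Gaussian) shape.

    heading_deg   -- current head direction in degrees
    preferred_deg -- the cell's preferred firing direction in degrees
    peak_rate     -- firing rate at the preferred direction (spikes per s)
    kappa         -- concentration parameter; larger kappa -> narrower tuning width
    """
    delta = np.deg2rad(np.asarray(heading_deg) - preferred_deg)
    return peak_rate * np.exp(kappa * (np.cos(delta) - 1.0))

# Three hypothetical cells differing in peak rate and tuning width (cf. Fig. 2a)
headings = np.arange(360)
cell_low  = hd_tuning_curve(headings,  90, peak_rate=10.0,  kappa=4.0)
cell_mid  = hd_tuning_curve(headings, 200, peak_rate=40.0,  kappa=6.0)
cell_high = hd_tuning_curve(headings, 310, peak_rate=100.0, kappa=8.0)
```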

Figure 2.

Representative plots for HD cells and HD circuit. (a) Three typical HD cells are shown across three different brain areas. Peak firing rates in each cell’s preferred firing direction can range from low (top) to medium (middle) to high (bottom) across different cells. Cells with low and high peak firing rates can be observed in all brain areas. (b) Circuit diagram showing principal connections of areas containing HD, angular head velocity, place, and grid cells. The bracket shows the site of the postulated ring attractor network that generates the HD signal. Red dashed line shows point in network where lesions to brain areas below this level induce burst firing patterns in recorded ADN neurons; lesions above this level lead to the loss of the HD signal in ADN without the presence of burstiness among the recorded neurons.

Early studies demonstrated the importance of the vestibular system for the HD signal, as bilateral neurotoxic lesions of the vestibular labyrinth abolished the HD signal in the anterior thalamus27. Yet other studies have shown how self-motion cues that are not vestibular in origin can influence HD cell activity. For example, while HD cells retain directional tuning during passive motion, in which only vestibular cues are present28, there can be differences in how HD cells fire between active and passive locomotion29–31. Moreover, when HD cells are monitored as rats move from a familiar environment to a novel one (in which they are unfamiliar with the visual landmark cues), the cells’ preferred firing directions remain stable between the two environments when animals are allowed to actively locomote to the novel environment32,33 but not when they are passively moved (via a wheeled cart) to the novel environment33,34. Thus, despite the presence of a normal vestibular signal when the animals are passively transported, HD cells require information from other types of movement signals (proprioception, motor efference copy) to maintain a stable orientation between a familiar and a novel environment. Further, other studies that have manipulated optic flow cues (Box 1) have shown that visual motion cues can influence HD cell activity35,36 and can make hippocampal place cells, which are nondirectional in a natural environment, more directional37. Normally, these movement systems, along with information about visual landmarks, work interactively and in sync with one another, but situations can arise in which there is a conflict among these different types of sensory and motor cues, an issue that we will return to below.

The generation of a HD signal: vestibular pathways and ring attractor networks

The HD signal can be considered to be composed of two components: (i) the directional signal itself, which is theoretically based on the integration of self-movement cues, and (ii) the anchoring of this signal to a source, which subsequently defines the reference frame for the signal. To understand how and where the HD signal is constructed, researchers have recorded from one brain area following the lesioning or inactivation of a second brain area. Using rats as the experimental model, these studies have highlighted a circuit starting in the vestibular nuclei and projecting through a series of brainstem nuclei, including the nucleus prepositus, supragenual nucleus, and dorsal tegmental nucleus, and then projecting to the lateral mammillary nucleus and ADN thalamus19,38. A large percentage of the neurons within the ADN can be classified as ‘classic’ HD cells, where the signal is quite robust with a high signal-to-noise ratio. From the ADN, the HD signal is projected rostrally to a number of areas, each of which contains HD cells, including the presubiculum and parasubiculum, retrosplenial, entorhinal and medial precentral cortices, and striatum (Fig. 2b). There is good evidence that inputs from early vestibular pathways are critical for the HD signal, as various types of vestibular manipulations that interfere with a normal vestibular signal lead to the loss of the HD signal. For example, lesions, inactivation, or occlusions of the semicircular canals, as well as experiments in horizontal canal-deficient mice, all lead to the loss of head-direction-specific activity when recording from the ADN or postsubiculum (reviewed in refs. 16,39). Given that the vestibular system conveys information about how head orientation changes with head movements, the vestibular system is an ideal candidate to provide information for updating the animal’s resultant head orientation following a head turn. Specifically, the prevailing view is that information about the animal’s angular head velocity, which is encoded by neurons within the vestibular nuclei, is conveyed from there to various cell types within the brainstem nuclei, all of which contain neurons whose firing also correlates with angular head velocity40–42.

The most parsimonious computational models that have been proposed to account for HD cell firing use continuous ring attractor networks, which receive input from cells sensitive to angular head velocity (reviewed in ref. 43). These networks contain interconnected neurons, such that HD cells with similar preferred firing directions have recurrent excitatory connections with one another, and HD cells with opposing preferred firing directions inhibit one another. Once activity is initiated, the network can sustain activity without outside excitation. The local area of activity (referred to as the activity hill) is moved around the ring to different directional headings following inputs from self-motion-based (idiothetic; for example, angular head velocity cells) or allothetic-based (landmark) sources (Box 1).
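
The following is a minimal, self-contained Python sketch of this class of model: a one-dimensional ring of rate units with local excitation, global inhibition, and a velocity-scaled asymmetric weight component that moves the activity hill. The parameters, update rule, and decoding step are illustrative rather than taken from any specific published implementation.

```python
import numpy as np

# Ring of N rate units, each labeled by a preferred direction on the circle
N = 120
theta = np.linspace(0.0, 2.0 * np.pi, N, endpoint=False)
diff = theta[:, None] - theta[None, :]

J0, J1 = 2.0, 3.0                          # uniform inhibition and tuned excitation (illustrative)
W_sym = (-J0 + J1 * np.cos(diff)) / N      # symmetric weights: sustain a local hill of activity
W_asym = (J1 * np.sin(diff)) / N           # asymmetric component: shifts the hill when engaged

dt, tau = 0.01, 0.1                        # integration step and unit time constant (s)
rate = np.maximum(np.cos(theta), 0.0)      # initialize a hill of activity at 0 rad

def step(rate, omega, gain=0.2):
    """One Euler step; omega is the angular head velocity input (rad/s)."""
    W = W_sym + gain * omega * W_asym      # velocity input biases the recurrent connectivity
    drive = W @ rate + 1.0                 # recurrent input plus constant background excitation
    return rate + dt / tau * (-rate + np.maximum(drive, 0.0))

for _ in range(200):                       # head stationary: the hill persists at its location
    rate = step(rate, omega=0.0)
for _ in range(200):                       # head turning: the hill rotates with the velocity input
    rate = step(rate, omega=1.0)

decoded = np.angle(np.sum(rate * np.exp(1j * theta)))   # population-vector estimate of heading (rad)
```

In such a sketch, removing the velocity (idiothetic) input leaves the recurrent connections intact, so the hill persists but is no longer driven or corrected, which is the 'untethered' drifting behavior discussed in the lesion studies below.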

Based on studies that have investigated the anatomical substrate of the HD signal, researchers have hypothesized that a ring attractor network may operate across the reciprocal connections between the dorsal tegmental nuclei and the lateral mammillary nuclei (Fig. 2b)39,44,45; cf. ref. 46. Consistent with this view are findings from lesion and recording studies, which show that where in the HD circuit a lesion was placed determined the type of response observed in downstream recording areas (Fig. 2b). Theoretically, a lesion placed within the attractor network, or any area efferent to it, should lead to disrupted firing in neurons downstream of the lesion. In contrast, a lesion placed afferent to the site of the attractor network will leave the network’s internal connections intact, but the external inputs will no longer be able to drive or reset the activity hill; under these conditions, the network will be ‘untethered’ and the activity hill will drift around the attractor ring. Precisely such responses were found when lesions were made to brain areas associated with the ascending vestibulothalamic pathway (i.e., nucleus prepositus, supragenual nucleus; Fig. 2b)16,39.

The vestibular system integrates multimodal information and motor signals

Prior studies have demonstrated that multimodal information, including proprioceptive, optic flow, and motor efference copy signals, can influence HD cell activity, yet to date the question of how HD cells actually integrate vestibular signals with other self-motion cues remains open. Notably, as detailed above, the type of information encoded by vestibular afferents is ideally suited to act as a spatial updater for the HD network, and the prevailing view is that neurons within the vestibular nuclei relay this information to the HD cell network. However, the vestibular nuclei do much more than just receive and encode a vestibular signal (i.e., inputs from the vestibular sensory organs). Indeed, these nuclei integrate multimodal information originating from a number of different sources, including head, body, and eye (gaze) motor commands, neck proprioceptive information, and visual inputs, in both rodents47,48 and primates (reviewed in refs. 49,50). Thus, a remaining challenge is understanding how these different sources of information contribute to the generation of the HD signal, as well as the implications of this integration for understanding the nature of the HD signal itself.

Vestibular nuclei neurons can be grouped into two main categories on the basis of differences in their sensitivity to eye motion and passive head motion, as well as connectivity. The first category, vestibular-only (VO, alternatively termed non-eye-movement) neurons, contribute to posture and self-motion perception. Specifically, these neurons receive direct inputs from the vestibular nerve and in turn drive vestibulospinal reflexes via their direct and indirect projections to the spinal cord (reviewed in ref. 50). In addition, VO neurons are a source of vestibular input to thalamocortical pathways51–53 and are reciprocally interconnected with the nodulus and uvula of the vestibular cerebellum54. The second category of vestibular nuclei neurons are vestibulo-ocular reflex (VOR) neurons. These neurons also receive direct inputs from the vestibular nerve and project to extraocular motor neurons, thereby driving compensatory VOR eye movements to ensure stable gaze during self-motion. The majority of VOR neurons are the position-vestibular-pause (PVP) neurons, a distinct group of neurons deriving their name from the signals they carry during passive head rotations and eye movements. A second subclass of VOR pathway neurons, floccular target neurons (FTN), receives input from the flocculus of the cerebellum as well as from the vestibular nerve. FTNs similarly encode eye- as well as head-related signals, and their responses complement those of PVP neurons by ensuring the VOR remains calibrated during daily activities.

During everyday activities, such as gaze behavior and locomotion, the vestibular system is activated as a result of our own actively generated motor commands (Fig. 3a). Single-unit recording experiments in monkeys have established that the afferent activity in the eighth cranial nerve is the same for both actively self-generated movement and externally applied movement, during both rotations and translations55–57. Further, these studies have shown that activity in the eighth nerve is also independent of the animal’s current gaze behavior. However, this is not the case at the next stage of vestibular processing, in the vestibular nuclei. Instead, the responses of both categories of vestibular nuclei neurons—VO neurons and VOR neurons—are modulated in a behaviorally dependent manner.

Figure 3.

Learning new relationships between motor commands and sensory feedback during self-motion. (a,b) A cancellation signal is sent to the vestibular nuclei when sensory feedback matches the expected sensory consequence of the motor command (a), resulting in the suppression of responses to self-generated vestibular stimulation at the first central stage of processing in the vestibular nuclei (b; compare blue and green traces). (c) When the normal relationship between the motor command and the resultant movement is altered for active head movements, there is an initial mismatch (red arrow) between expected and actual sensory feedback. As a result, these vestibular neurons robustly respond to self-generated vestibular stimulation (dashed red trace shows the predicted response based on sensitivity to passive vestibular stimulation). Then, after a learning phase (gray box), the brain updates its model of the expected sensory consequence of the motor command (green arrow). (d) In head-fixed or head-restrained VR-based protocols there is effectively a mismatch between multisensory feedback (both vestibular and proprioceptive) and optic flow. Dynamic updating of the brain’s model of the expected sensory consequence of the motor command requires tracking the comparison of the predicted and actual sensory feedback signals.

Preferential encoding of passive versus active self-motion in the vestibular nuclei

Experiments in head-unrestrained monkeys have established that the sensory responses of VO neurons are suppressed during actively generated movements (Fig. 3b)58–61. Specifically, while VO neurons robustly respond to passive head movements, their responses are markedly (~70%) attenuated during active head movement. This is the case for both rotational and translational motion, as well as across different species of monkeys (reviewed in ref. 62) and in mice48. More recent experiments controlling the association between intended and actual head movement have established that a vestibular cancellation signal is generated exclusively when the activation of neck proprioceptors matches the motor-generated expectation63,64.

Because VO neurons mediate vestibulospinal reflex pathways, the finding that their responses are suppressed during active motion has important implications for voluntary motor control versus balance. Specifically, while the vestibulospinal reflex is vital for stabilizing posture in response to unexpected movements, the stabilizing commands produced by an intact vestibulospinal reflex would be counterproductive during voluntary movements. Accordingly, turning off vestibulospinal reflexes is functionally advantageous. Further, because these central vestibular neurons send ascending projections to the ventral posterior thalamus, it follows that the vestibular self-motion signals relayed to the higher-level cortical areas responsible for perception are attenuated during active motion. Accordingly, these findings are problematic for the classic view of how the HD signal is generated, resulting in the following conundrum: if the main ascending vestibular pathway from the vestibular nuclei does not robustly encode self-motion during active movements because the signal is dramatically suppressed, how is the HD signal computed?65,66

Finally, the finding that the vestibular pathway does not encode self-motion during active movements raises the question of whether HD cell responses differ between active versus passive head movements. Some studies have shown that the HD signal is attenuated during passive head turns, particularly when the animal is tightly restrained in a towel29–31, but other studies using head-fixed animals have shown that HD cells fire similarly for both active and passive motion—at least for cells in the ADN thalamus28. Thus, we return to the initial quandary: if vestibular activity at the level of the vestibular nuclei is largely suppressed during active head movements, how does the vestibular system drive HD cell discharge?

Conundrum: building HD signal during active head movements

One possible answer to the conundrum raised above is that other self-motion cues carried by VO neurons during active movements are integrated by the HD network to compute an estimate of direction. While this is not the case in rhesus monkeys60,61, recordings from the vestibular nuclei in mice have shown that, in contrast to monkey neurons, VO neurons in rodents encode a static neck-position signal in both active and passive conditions48. Notably, this proprioceptive-derived and/or neck motor signal appears to remain robust (i.e., unchanged) for active as well as passive movements. Nevertheless, it remains the case that, in both mice and monkeys, the signals coded by VO neurons significantly differ between active and passive conditions, with neurons displaying less modulation during active movements than during passive ones. This finding then raises the related possibility that the VO neuron signal is decoded by the HD network in a context-dependent manner. Specifically, while the head-motion-related modulation of VO neurons is markedly suppressed during active head motion in both species, it is not completely cancelled. Thus, theoretically, this attenuated input could be up-weighted in the active condition and/or combined with other available inputs at the level of the HD network (for example, proprioceptive-derived signals and/or motor efference). This possibility raises the questions of whether such reweighting actually occurs and, if so, what circuits underlie the required up-weighting and/or integration.
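
One way to make the up-weighting idea concrete, purely as an illustration and not as a claim about the actual circuit computation, is a reliability-weighted cue-combination scheme in which the attenuated vestibular estimate is treated as less reliable during active movement, so that proprioceptive and efference-copy cues are weighted up accordingly. The numbers below are hypothetical.

```python
import numpy as np

def combine_heading_change_estimates(estimates, variances):
    """Reliability-weighted (inverse-variance) fusion of heading-change cues.

    estimates -- per-cue estimates of the head turn in degrees, e.g.,
                 [vestibular, neck proprioception, motor efference copy]
    variances -- per-cue variances; an attenuated cue is modeled as less reliable
    """
    weights = 1.0 / np.asarray(variances, dtype=float)
    weights = weights / weights.sum()
    return float(np.dot(weights, estimates)), weights

# Passive turn: the vestibular cue is reliable and dominates; efference copy is absent
passive_estimate, passive_weights = combine_heading_change_estimates(
    estimates=[30.0, 27.0, 0.0], variances=[4.0, 25.0, 1e6])

# Active turn: vestibular modulation is attenuated (treated here as less reliable),
# so proprioceptive and efference-copy cues carry more of the weight
active_estimate, active_weights = combine_heading_change_estimates(
    estimates=[30.0, 29.0, 31.0], variances=[36.0, 16.0, 16.0])
```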

Eye-movement information and the HD network?

As reviewed above, the vestibular nuclei contain a second category of neurons in addition to VO neurons, namely VOR (i.e., PVP and FTN) neurons. These neurons encode eye- as well as head-motion information and project to oculomotor centers to control VOR eye movements. Unlike VO neurons, VOR neurons do not preferentially encode passive head motion, thus raising the question of whether inputs from these neurons might be combined by the HD network to compute an estimate of HD. However, these neurons integrate eye-movement commands with vestibular information such that they encode head motion in a manner that depends fundamentally on the animal’s current gaze strategy. Specifically, the responses of PVP neurons are suppressed whenever animals redirect their gaze (i.e., during saccades and smooth pursuit) by an inhibitory gaze-command signal. Similarly, FTN neurons carry gaze-related as well as vestibular information. Thus, while these neurons could theoretically provide head-movement-related inputs to the HD network, they would also transmit gaze-related inputs.

Indeed, VOR neurons project to the nucleus prepositus, a structure implicated in the circuit connecting the vestibular nuclei to the HD network (Fig. 2). The nucleus prepositus is also known as the oculomotor integrator, owing to its central role in shaping eye movements (for review, see ref. 67). Accordingly, understanding the role played by the nucleus prepositus, an area in which lesions have been shown to disrupt the HD signal68, is complicated by the fact that this structure also contributes to controlling gaze. Moreover, this observation raises the possibility that the ascending HD signal from the vestibular nuclei to the ADN thalamus does not exclusively transmit angular head-direction information derived from the vestibular labyrinth but also encodes eye-position information. In sum, none of the three classes of output neurons in the monkey vestibular nuclei (VO, PVP, and FTNs), neurons that would be the potential relays to the HD network, actually encodes purely vestibular information during voluntary behavior (active head movements); rather, they are either attenuated during voluntary self-motion or encode some type of eye-position signal along with any head-movement signal.

In monkeys, the vast majority of nucleus prepositus neurons respond only to eye movements, with a subset (~20%) responding to both eye movements and vestibular stimulation69–71. Thus, the primate nucleus prepositus predominantly outputs eye-in-head position and velocity information, even during voluntary head motion (for example, see ref. 69). Given that the nucleus prepositus targets regions projecting multisynaptically to the presubiculum19, a key question arises: to what extent is the HD cell an indicator of the animal’s head direction, as opposed to the direction of its gaze? Unfortunately, to date, no studies in rodents have recorded HD cells while simultaneously tracking eye position, so it is currently unknown whether rodent HD cells respond to changes in gaze direction. We must therefore turn our attention to primate studies that have recorded from hippocampal and parahippocampal areas while monitoring eye position and gaze direction.

In a series of studies in monkeys, Rolls and colleagues mostly identified spatial-view cells that responded when a certain part of the environment was in the animal’s field of view, as opposed to the animal’s location or HD72,73. The majority of these cells were localized to the hippocampus rather than areas now known to contain HD cells. More recently, recordings from the entorhinal cortex in head-fixed monkeys viewing images on a video monitor showed that many cells fired based on the direction of the monkey’s saccadic eye movements74. It is notable, however, that this signal appears to be based on an egocentric reference frame, rather than the allocentric one seen for rodent HD cells. Nonetheless, together the above studies demonstrate that eye-position- and eye-movement-related information are important determinants of cell firing—at least for monkeys. In contrast, other studies have recorded cells in monkeys that resemble HD cells and use an allocentric framework. Robertson et al.25 originally identified five such cells in parahippocampal areas (four in presubiculum and one in entorhinal cortex), and more recently, others have identified HD cells in the anterior thalamus75. It is noteworthy, however, that the proportion of HD cells in these studies was small (~5%) in comparison to rodent studies, which found 50% and 25% of cells in the ADN and postsubiculum, respectively20,30. Furthermore, while the HD signals recorded in these primate studies were independent of eye position, whether rodent HD cells encode true HD and not gaze-direction responses remains an open question.

Rapid updating of a forward model underlies the suppression of vestibular signals during active movement

The suppression of vestibular input resulting from active motion is common across species, including mice48, squirrel monkeys76, and macaque monkeys (rhesus60 and fascicularis77). As mentioned above, recent studies in rhesus monkeys that focused on understanding the mechanism underlying this cancellation suggest that the brain generates a sensory expectation based on its motor output, which is then compared with the actual sensory input (reviewed in refs. 49,50). Specifically, vestibular responses are suppressed at the level of the vestibular nucleus when there is a match between the expected consequences of self-generated movement and the resultant sensory feedback63 (Fig. 3a). Moreover, this mechanism appears to be cerebellar-dependent, and it allows selective encoding of passive self-motion at the earliest central stage of vestibular processing78,79.

A recent experiment emphasized the importance of prior experience in determining what constitutes a match between the expected sensory consequences of self-generated movement and the actual sensory feedback during motion78. As monkeys made voluntary head movements, the relationship between the motor command produced to generate the movement and the actual resultant movement was altered to produce a mismatch. Specifically, constant resistive torque was applied that initially reduced the monkey’s head motion by half. In this condition, vestibular nuclei neurons first responded as if head movement was externally generated, but the cancellation of active vestibular input re-emerged over the same time course as the ongoing behavioral learning (~40–50 head movements; Fig. 3c). Indeed, this result provides clear evidence for the rapid updating of a forward model (Box 1) that enables the vestibular system to learn to expect unexpected head motion inputs. Accordingly, what constitutes a match between (i) the expected sensory consequences of self-generated movement and (ii) the actual sensory feedback is continually updated as a function of recent experience.
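
The logic of this result can be captured by a toy forward-model simulation. The sketch below (illustrative gains and learning rate, not the parameters of ref. 78) subtracts the predicted sensory consequence of each motor command from the actual feedback and uses the residual prediction error both as a proxy for the neural response to self-generated motion and as the signal that updates the internal model.

```python
import numpy as np

def simulate_reafference_cancellation(n_movements=60, true_gain=0.5,
                                      learning_rate=0.1, model_gain=1.0):
    """Toy forward-model update loosely following the logic of Fig. 3c.

    A motor command of unit amplitude normally produces a unit head movement
    (internal model gain = 1). A resistive load halves the actual movement
    (true_gain = 0.5). On each movement the predicted sensory consequence is
    subtracted from the actual vestibular feedback; the residual prediction
    error stands in for the neural response to self-generated motion and also
    drives a delta-rule update of the internal model gain.
    """
    residuals = []
    for _ in range(n_movements):
        motor_command = 1.0
        actual_feedback = true_gain * motor_command          # movement reduced by the load
        predicted_feedback = model_gain * motor_command      # expected consequence of the command
        error = actual_feedback - predicted_feedback         # sensory prediction error
        residuals.append(abs(error))
        model_gain += learning_rate * error * motor_command  # update the forward model
    return np.array(residuals)

residuals = simulate_reafference_cancellation()
print(residuals[0], residuals[-1])   # large at first (responds as if passive), then decays toward zero
```

In this toy scheme the residual decays over tens of movements, qualitatively mirroring the re-emergence of cancellation over the ~40–50 head movements of the learning phase described above.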

In this context, it is noteworthy that the HD network is generally assumed to generate a stable representation of the animal’s current or anticipated head direction. Yet the expected correspondence between the sensory and motor cues that are inputs to this network requires updating whenever there is a change in the relationship between the expected and actual sensory feedback. This constraint is particularly relevant to studies of navigation based on experimental designs in which this relationship is mismatched relative to what occurs in everyday life. For example, the lack of normal multisensory feedback (both vestibular and proprioceptive) in virtual reality (VR) effectively results in a mismatch between the system’s internal state and the resultant optic flow (Fig. 3d). Specifically, information about the subject’s movement based on visual input would conflict with vestibular and proprioceptive cues. Furthermore, there would also be a mismatch between the expected and actual vestibular and proprioceptive feedback experienced in studies using head-fixed or head-restrained protocols. Thus, in this context, knowing one’s perceived directional heading in the real world versus in an immersive world (i.e., in a video game) requires that the brain learn to switch back and forth between states in which the relationships between these cues are different. Further experiments will be required to understand the circuits and mechanisms that underlie this ability.

Implications for navigation in immersive environments

The considerations discussed above raise the fundamental question of how HD cells respond in mismatched VR and head-restrained conditions versus during actual navigation in the real world. Specifically, are the pathways that encode our sense of direction updated in such conditions to reflect a modified forward model that no longer expects the same sensory feedback from extravisual sources (i.e., vestibular and proprioceptive) as would be experienced under natural conditions? And if so, what is the time course of this updating and what are its implications for understanding how the brain encodes our sense of direction? Can this updating be observed in HD cells? If so, in contrast with common wisdom, this outcome could imply that the HD cell network does not generate a constant signal representing the subject’s perceived directional heading in the real world, but rather a perception of their directional heading in whatever reality they are immersed in, whether that is a VR environment or a dream during sleep.

How spatial cells behave under VR conditions is an important issue because most spatial studies in humans employ functional imaging techniques on subjects who are performing a spatial task while lying supine with their head fixed and viewing a video monitor—meaning that there is no active vestibular signal, motor-efference feedback, or proprioceptive feedback. These issues have been reviewed in more detail elsewhere80,81, but the bottom line is that navigation in the virtual world is quite different from navigation in the real world, particularly in terms of the sensory and motor systems that are activated, which has important implications for interpreting underlying mechanisms.

What we do know, based on numerous behavioral studies, is that accurately performing a spatial task on a video monitor and navigating successfully in the real world are two different things. For instance, rodents can accurately perform a spatial task even when their HD cell system is not functional82, suggesting that different spatial tasks employ different computations that most likely require multiple brain areas and neural systems. Establishing how and where these systems interact and are integrated is critical for understanding the neural mechanisms underlying accurate navigation in both the real world and in VR conditions. The extent to which accurate performance can be maintained in either desktop or VR spatial tasks in the face of disruption to the HD system, and, by extension, the grid cell system, which is dependent on the HD system83, is not known.

In this context, it is interesting to consider how HD cells might fire when a subject is immersed in a video game in which they adopt the spatial perspective of the avatar but still maintain an awareness of their orientation relative to the surrounding environment. Under these circumstances, would HD signals adopt the reference frame of the video game or that of the surrounding real world? Would HD cells across all brain areas adopt the same reference frame simultaneously and dynamically switch between representations, or would the HD cell population split into different representations? Recent recordings from the rat retrosplenial cortex have found different types of HD cells, some of which are tied to the global environment and some of which are tied to the local environment84. Remarkably, a third cell type, located only in the dysgranular retrosplenial cortex, encoded both local and global reference frames concurrently, thus having two preferred firing directions in the same environment and potentially providing a basis for how we simultaneously maintain multiple spatial representations for different environments.

As with immersion in a video game, one might consider how HD cells respond when a subject imagines a situation that involves navigation: for example, when sitting at home and thinking about how to navigate between two places that are distant from one’s home. Again, would HD signals adopt the reference frame of the imagined locations or that of the surrounding real world? Answering this question will be important for fully understanding the brain mechanisms underlying spatial orientation and navigation. A few studies have begun to address this issue. For instance, Peyrache et al.46 monitored multiple HD cells simultaneously during sleep and found that the temporal correlation structure of their firing was retained. Further, network activity that occurred during the awake state was replayed during sleep at a 10× higher rate, suggesting that HD cell activity is not always tied to real-world events. Whether HD cell activity across other brain areas shows these same properties is an interesting question, as one could imagine that certain brain areas may be associated only with real-world events, while other brain areas are more involved in imaginative states.

It is important to consider these findings in light of what is known about hippocampal place cell activity in VR, where results have been less consistent. Some labs have reported robust place cell firing in VR comparable to that in the real world85, while others have reported either an absence of such firing or a lower percentage of cells showing location-specific firing86,87. This discrepancy is not due to species differences, because all of the studies used rats. Currently, the most plausible explanation is that substantial vestibular, proprioceptive, and motor information is needed in order to observe robust place-like activity. In the studies reporting less location-specific activity, the rats were either head- or body-fixed, which limited the amount of vestibular and proprioceptive stimulation. In contrast, in the study that observed robust place cell firing85, which also reported the presence of HD cells and grid cells, the rats were in a harness, and though they were confined in place (because their limb movements were directed to move a track ball), they were relatively free to rotate their head and body. Thus, although the linear component of the vestibular system (the otoliths) was not activated, the angular component (the semicircular canals) and proprioceptors in the neck and body would have been activated normally. In sum, if both vestibular and proprioceptive stimuli are needed to activate normal spatial firing, then monitoring subjects in VR conditions where these stimuli are limited may not provide an accurate representation of the underlying neural processes that occur during real-world navigation.

HD cells in three dimensions and the vestibular system

Over the past decade, several studies in rodents have examined how HD cells respond outside the horizontal plane (reviewed in refs. 88,89). HD cells in bats, a species that routinely moves in a three-dimensional world, display three-dimensional tuning and maintain directional tuning when inverted90. In contrast, while rat HD cells display directional tuning in the vertical plane, their tuning is lost when the rat is inverted82,91. One hypothesis proposed to explain these results is that HD cells define their reference frame based on the animal’s plane of locomotion, such that the vertical plane is represented as an extension of the floor89. An alternative hypothesis, referred to as the mosaic hypothesis, is that the animal uses two rules for updating its directional heading: (i) a rule for yaw rotations around the dorsal–ventral axis and (ii) a rule for rotations of the animal’s dorsal–ventral axis around the gravity-defined vertical axis92. Notably, neither of these proposals can account for the absence of directional firing when the animal is inverted.

Recently, Shinder and Taube23 recorded HD cells from head-fixed rats as they were passively positioned in different orientations in all three rotational planes. Responses could in large part be explained by the animal’s orientation relative to the earth’s gravitational axis. Using a reference frame that is at least partially defined by gravity would account for the finding that disruption of the otolith organs, which are sensitive to gravity, impairs HD cell activity93. Indeed, linking the reference frame to gravity is consistent with recent findings in monkeys reporting neurons in the anterior thalamus tuned to pitch-and-roll orientation relative to gravity94. Finally, vestibular nuclei neurons show strong attenuation of their responses to otolith input during voluntary changes in head position95. Thus, taken together, while the vestibular system, at the level of both the semicircular canals and the otolith organs, appears to be deeply involved in generating the HD signal, future experiments will be required to understand how vestibular and extravestibular sensory signals, as well as motor inputs, are integrated to build an estimate of three-dimensional orientation during active navigation.
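
As a simple illustration of what a gravity-anchored reference frame provides, the sketch below (hypothetical Python; the axis conventions are assumptions) recovers static pitch and roll from an otolith-like measurement of the gravity direction in head-fixed coordinates. It applies only to static orientations and ignores the well-known ambiguity between tilt and translation that arises during dynamic motion.

```python
import numpy as np

def tilt_from_gravity(gravity_in_head):
    """Estimate static head tilt from an otolith-like measurement of gravity.

    gravity_in_head -- 3-vector giving the gravity direction in head-fixed
                       coordinates (x = forward, y = left, z = dorsal); when
                       the head is upright this is approximately (0, 0, -1).
    Returns (pitch, roll) in degrees: pitch is the elevation of the forward
    axis above the earth-horizontal plane, roll the elevation of the left axis.
    """
    up = -np.asarray(gravity_in_head, dtype=float)
    up = up / np.linalg.norm(up)                              # unit 'earth-up' vector in head coordinates
    pitch = np.degrees(np.arcsin(np.clip(up[0], -1.0, 1.0)))  # forward axis relative to horizontal
    roll = np.degrees(np.arcsin(np.clip(up[1], -1.0, 1.0)))   # left axis relative to horizontal
    return pitch, roll

print(tilt_from_gravity([0.0, 0.0, -1.0]))       # upright: (0, 0)
print(tilt_from_gravity([-0.5, 0.0, -0.866]))    # nose pitched up by ~30 degrees
```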

Outlook and conclusions

In sum, to date there has been much progress in understanding the neural mechanisms underlying our sense of direction, as well as how early vestibular pathways encode self-motion. Disruption of the peripheral vestibular system impairs the directional firing patterns of HD cells27, as well as theta rhythm in the hippocampus (reviewed in ref. 96) and medial entorhinal cortex97. Further, the grid-cell signal in the medial entorhinal cortex is dependent on both an intact HD signal83 and a theta signal98,99 and is believed to be intimately involved in navigational processes. Taken together, these findings suggest that the vestibular system is also critically involved in the generation of the grid-cell signal, although this point has not yet been tested directly. Interestingly, the distinction between the effects of active versus passive motion on vestibular neurons is recapitulated for grid cells, in the sense that active movement is required for a normal grid cell signal100.

Neurons at the first central stage of vestibular processing encode proprioceptive, gaze, and head motor-efference copy, as well as vestibular signals—all cues that affect both HD and grid cell firing. Moreover, this multimodal information is transmitted in a behaviorally dependent manner that is rapidly updated when the relationship between voluntary motion and the actual sensory input to the brain changes. Thus, while the ascending vestibular pathway through the anterior thalamus is likely important for navigation, as it conveys the HD signal rostrally, there are open questions regarding how this pathway utilizes vestibular signals versus other sensory cues and motor signals during navigation. In contrast, the ascending vestibular pathway through the ventral posterior thalamus is likely more involved with one’s perceived sense of self-motion (whether such motion is relative to a visual scene on a video monitor or relative to the external world). Where and how these two ascending vestibular pathways are merged and integrated in the cortex remains an open question, and further appreciation of this integration will ultimately be key to understanding the neural mechanisms underlying our sense of direction and accurate navigation in both the three-dimensional real world and in VR conditions.

Footnotes

COMPETING FINANCIAL INTERESTS

The authors declare no competing financial interests.

References

1. Gibson JJ. The perception of visual surfaces. Am. J. Psychol. 63, 367–384 (1950).
2. Loomis JM, Blascovich JJ & Beall AC. Immersive virtual environment technology as a basic research tool in psychology. Behav. Res. Methods Instrum. Comput. 31, 557–564 (1999).
3. McNaughton BL, Battaglia FP, Jensen O, Moser EI & Moser MB. Path integration and the neural basis of the ‘cognitive map’. Nat. Rev. Neurosci. 7, 663–678 (2006).
4. Berthoz A, Israël I, Georges-François P, Grasso R & Tsuzuku T. Spatial memory of body linear displacement: what is being stored? Science 269, 95–98 (1995).
5. Glasauer S, Amorim MA, Vitte E & Berthoz A. Goal-directed linear locomotion in normal and labyrinthine-defective subjects. Exp. Brain Res. 98, 323–335 (1994).
6. Grasso R, Ivanenko Y & Lacquaniti F. Time course of gaze influences on postural responses to neck proprioceptive and galvanic vestibular stimulation in humans. Neurosci. Lett. 273, 121–124 (1999).
7. Israël I, Grasso R, Georges-Francois P, Tsuzuku T & Berthoz A. Spatial memory and path integration studied by self-driven passive linear displacement. I. Basic properties. J. Neurophysiol. 77, 3180–3192 (1997).
8. Ivanenko YP & Grasso R. Integration of somatosensory and vestibular inputs in perceiving the direction of passive whole-body motion. Brain Res. 5, 323–327 (1997).
9. Becker W, Nasios G, Raab S & Jürgens R. Fusion of vestibular and podokinesthetic information during self-turning towards instructed targets. Exp. Brain Res. 144, 458–474 (2002).
10. Frissen I, Campos JL, Souman JL & Ernst MO. Integration of vestibular and proprioceptive signals for spatial updating. Exp. Brain Res. 212, 163–176 (2011).
11. Jürgens R & Becker W. Perception of angular displacement without landmarks: evidence for Bayesian fusion of vestibular, optokinetic, podokinesthetic, and cognitive information. Exp. Brain Res. 174, 528–543 (2006).
12. Telford L, Howard IP & Ohmi M. Heading judgments during active and passive self-motion. Exp. Brain Res. 104, 502–510 (1995).
13. Goldberg JM et al. The Vestibular System: a Sixth Sense (Oxford University Press, 2012).
14. Hitier M, Besnard S & Smith PF. Vestibular pathways involved in cognition. Front. Integr. Neurosci. 8, 59 (2014).
15. Shinder ME & Taube JS. Differentiating ascending vestibular pathways to the cortex involved in spatial cognition. J. Vestib. Res. 20, 3–23 (2010).
16. Yoder RM & Taube JS. The vestibular contribution to the head direction signal and navigation. Front. Integr. Neurosci. 10.3389/fnint.2014.00032 (2014).
17. Guldin WO & Grüsser OJ. Is there a vestibular cortex? Trends Neurosci. 21, 254–259 (1998).
18. Clark BJ & Harvey RE. Do the anterior and lateral thalamic nuclei make distinct contributions to spatial representation and memory? Neurobiol. Learn. Mem. 133, 69–78 (2016).
19. Taube JS. The head direction signal: origins and sensory-motor integration. Annu. Rev. Neurosci. 30, 181–207 (2007).
20. Taube JS, Muller RU & Ranck JB Jr. Head-direction cells recorded from the postsubiculum in freely moving rats. I. Description and quantitative analysis. J. Neurosci. 10, 420–435 (1990).
21. Taube JS, Muller RU & Ranck JB Jr. Head-direction cells recorded from the postsubiculum in freely moving rats. II. Effects of environmental manipulations. J. Neurosci. 10, 436–447 (1990).
22. Tryon VL et al. Magnetic field polarity fails to influence the directional signal carried by the head direction cell network and the behavior of rats in a task requiring magnetic field orientation. Behav. Neurosci. 126, 835–844 (2012).
23. Shinder ME & Taube JS. Responses of head direction cells in the anterodorsal thalamus during inversion. Soc. Neurosci. Poster 895.1 (2010).
24. Yoder RM, Clark BJ & Taube JS. Origins of landmark encoding in the brain. Trends Neurosci. 34, 561–571 (2011).
25. Robertson RG, Rolls ET, Georges-François P & Panzeri S. Head direction cells in the primate pre-subiculum. Hippocampus 9, 206–219 (1999).
26. Shine JP, Valdés-Herrera JP, Hegarty M & Wolbers T. The human retrosplenial cortex and thalamus code head direction in a global reference frame. J. Neurosci. 36, 6371–6381 (2016).
27. Stackman RW & Taube JS. Firing properties of head direction cells in the rat anterior thalamic nucleus: dependence on vestibular input. J. Neurosci. 17, 4349–4358 (1997).
28. Shinder ME & Taube JS. Active and passive movement are encoded equally by head direction cells in the anterodorsal thalamus. J. Neurophysiol. 106, 788–800 (2011).
29. Knierim JJ, Kudrimoti HS & McNaughton BL. Place cells, head direction cells, and the learning of landmark stability. J. Neurosci. 15, 1648–1659 (1995).
30. Taube JS. Head direction cells recorded in the anterior thalamic nuclei of freely moving rats. J. Neurosci. 15, 70–86 (1995).
31. Zugaro MB, Tabuchi E, Fouquier C, Berthoz A & Wiener SI. Active locomotion increases peak firing rates of anterodorsal thalamic head direction cells. J. Neurophysiol. 86, 692–702 (2001).
32. Taube JS & Burton HL. Head direction cell activity monitored in a novel environment and during a cue conflict situation. J. Neurophysiol. 74, 1953–1971 (1995).
33. Yoder RM et al. Both visual and idiothetic cues contribute to head direction cell stability during navigation along complex routes. J. Neurophysiol. 105, 2989–3001 (2011).
34. Stackman RW, Golob EJ, Bassett JP & Taube JS. Passive transport disrupts directional path integration by rat head direction cells. J. Neurophysiol. 90, 2862–2874 (2003).
35. Arleo A et al. Optic flow stimuli update anterodorsal thalamus head direction neuronal activity in rats. J. Neurosci. 33, 16790–16795 (2013).
36. Blair HT & Sharp PE. Visual and vestibular influences on head-direction cells in the anterior thalamus of the rat. Behav. Neurosci. 110, 643–660 (1996).
37. Acharya L, Aghajan ZM, Vuong C, Moore JJ & Mehta MR. Causal influence of visual cues on hippocampal directional selectivity. Cell 164, 197–207 (2016).
38. Biazoli CE Jr., Goto M, Campos AM & Canteras NS. The supragenual nucleus: a putative relay station for ascending vestibular signs to head direction cells. Brain Res. 1094, 138–148 (2006).
39. Clark BJ & Taube JS. Vestibular and attractor network basis of the head direction cell signal in subcortical circuits. Front. Neural Circuits 10.3389/fncir.2012.00007 (2012).
40. Bassett JP & Taube JS. Neural correlates for angular head velocity in the rat dorsal tegmental nucleus. J. Neurosci. 21, 5740–5751 (2001).
41. Dumont JR, Winter SS, Farnes KB & Taube JS. Neural correlates within nucleus prepositus and paragigantocellularis during active and passive movement. Soc. Neurosci. Poster 631.26 (2015).
42. Sharp PE, Tinkelman A & Cho J. Angular velocity and head direction signals recorded from the dorsal tegmental nucleus of Gudden in the rat: implications for path integration in the head direction cell circuit. Behav. Neurosci. 115, 571–588 (2001).
43. Knierim JJ & Zhang K. Attractor dynamics of spatially correlated neural activity in the limbic system. Annu. Rev. Neurosci. 35, 267–285 (2012).
44. Bassett JP, Tullman ML & Taube JS. Lesions of the tegmentomammillary circuit in the head direction system disrupt the head direction signal in the anterior thalamus. J. Neurosci. 27, 7564–7577 (2007).
45. Sharp PE, Blair HT & Cho J. The anatomical and computational basis of the rat head-direction cell signal. Trends Neurosci. 24, 289–294 (2001).
46. Peyrache A, Lacroix MM, Petersen PC & Buzsáki G. Internally organized mechanisms of the head direction sense. Nat. Neurosci. 18, 569–575 (2015).
47. Beraneck M & Cullen KE. Activity of vestibular nuclei neurons during vestibular and optokinetic stimulation in the alert mouse. J. Neurophysiol. 98, 1549–1565 (2007).
48. Medrea I & Cullen KE. Multisensory integration in early vestibular processing in mice: the encoding of passive vs. active motion. J. Neurophysiol. 110, 2704–2717 (2013).
49. Cullen KE. The neural encoding of self-motion. Curr. Opin. Neurobiol. 21, 587–595 (2011).
50. Cullen KE. The vestibular system: multimodal integration and encoding of self-motion for motor control. Trends Neurosci. 35, 185–196 (2012).
51. Grüsser OJ, Pause M & Schreiter U. Vestibular neurones in the parieto-insular cortex of monkeys (Macaca fascicularis): visual and neck receptor responses. J. Physiol. (Lond.) 430, 559–583 (1990).
52. Lang W, Büttner-Ennever JA & Büttner U. Vestibular projections to the monkey thalamus: an autoradiographic study. Brain Res. 177, 3–17 (1979).
53. Marlinski V & McCrea RA. Self-motion signals in vestibular nuclei neurons projecting to the thalamus in the alert squirrel monkey. J. Neurophysiol. 101, 1730–1741 (2009).
54. Cullen KE & Minor LB. Semicircular canal afferents similarly encode active and passive head-on-body rotations: implications for the role of vestibular efference. J. Neurosci. 22, RC226 (2002).
55. Reisine H & Raphan T. Unit activity in the vestibular nuclei of monkeys during off-vertical axis rotation. Ann. NY Acad. Sci. 656, 954–956 (1992).
56. Jamali M, Sadeghi SG & Cullen KE. Response of vestibular nerve afferents innervating utricle and saccule during passive and active translations. J. Neurophysiol. 101, 141–149 (2009).
57. Sadeghi SG, Minor LB & Cullen KE. Response of vestibular-nerve afferents to active and passive rotations under normal conditions and after unilateral labyrinthectomy. J. Neurophysiol. 97, 1503–1514 (2007).
58. Carriot J, Brooks JX & Cullen KE. Multimodal integration of self-motion cues in the vestibular system: active versus passive translations. J. Neurosci. 33, 19555–19566 (2013).
59. Carriot J, Jamali M, Brooks JX & Cullen KE. Integration of canal and otolith inputs by central vestibular neurons is subadditive for both active and passive self-motion: implication for perception. J. Neurosci. 35, 3555–3565 (2015).
60. Roy JE & Cullen KE. Selective processing of vestibular reafference during self-generated head motion. J. Neurosci. 21, 2131–2142 (2001).
61. Roy JE & Cullen KE. Dissociating self-generated from passively applied head motion: neural mechanisms in the vestibular nuclei. J. Neurosci. 24, 2102–2111 (2004).
62. Straka H, Zwergal A & Cullen KE. Vestibular animal models: contributions to understanding physiology and disease. J. Neurol. 263 (Suppl. 1), 10–23 (2016).
63. Brooks JX & Cullen KE. Early vestibular processing does not discriminate active from passive self-motion if there is a discrepancy between predicted and actual proprioceptive feedback. J. Neurophysiol. 111, 2465–2478 (2014).
64. Cullen KE & Brooks JX. Neural correlates of sensory prediction errors in monkeys: evidence for internal models of voluntary self-motion in the cerebellum. Cerebellum 14, 31–34 (2015).
65. Cullen KE. The neural encoding of self-generated and externally applied movement: implications for the perception of self-motion and spatial memory. Front. Integr. Neurosci. 7, 108 (2014).
66. Shinder ME & Taube JS. Resolving the active versus passive conundrum for head direction cells. Neuroscience 270, 123–138 (2014).
67. McCrea RA & Horn AKE. Nucleus prepositus. Prog. Brain Res. 151, 205–230 (2006).
68. Butler WN, Smith KS, van der Meer MAA & Taube JS. The head-direction signal plays a functional role as a neural compass during navigation. Curr. Biol. 27, 1259–1267 (2017).
69. Dale A & Cullen KE. The nucleus prepositus predominantly outputs eye movement-related information during passive and active self-motion. J. Neurophysiol. 109, 1900–1911 (2013).
70. Dale A & Cullen KE. Local population synchrony and the encoding of eye position in the primate neural integrator. J. Neurosci. 35, 4287–4295 (2015).
71. McFarland JL & Fuchs AF. Discharge patterns in nucleus prepositus hypoglossi and adjacent medial vestibular nucleus during horizontal eye movement in behaving macaques. J. Neurophysiol. 68, 319–332 (1992).
72. Georges-François P, Rolls ET & Robertson RG. Spatial view cells in the primate hippocampus: allocentric view not head direction or eye position or place. Cereb. Cortex 9, 197–212 (1999).
73. Rolls ET, Robertson RG & Georges-François P. Spatial view cells in the primate hippocampus. Eur. J. Neurosci. 9, 1789–1794 (1997).
74. Killian NJ, Potter SM & Buffalo EA. Saccade direction encoding in the primate entorhinal cortex during visual exploration. Proc. Natl. Acad. Sci. USA 112, 15743–15748 (2015).
  • 75.Kim B, Dickman JD, Taube JS & Angelaki DE Head direction tuned cells in the macaque anterior thalamus. Soc. Neurosci. Poster 444.11 (2015). [Google Scholar]
  • 76.Gdowski GT & McCrea RA Neck proprioceptive inputs to primate vestibular nucleus neurons. Exp. Brain Res. 135, 511–526 (2000). [DOI] [PubMed] [Google Scholar]
  • 77.Sadeghi SG, Mitchell DE & Cullen KE Different neural strategies for multimodal integration: comparison of two macaque monkey species. Exp. Brain Res. 195, 45–57 (2009). [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 78.Brooks JX, Carriot J & Cullen KE Learning to expect the unexpected: rapid updating in primate cerebellum during voluntary self-motion. Nat. Neurosci. 18, 1310–1317 (2015). [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 79.Brooks JX & Cullen KE The primate cerebellum selectively encodes unexpected self-motion. Curr. Biol. 23, 947–955 (2013). [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 80.Donato F & Moser EI A world away from reality. Nature 533, 325 (2016). [DOI] [PubMed] [Google Scholar]
  • 81.Taube JS, Valerio S & Yoder RM Is navigation in virtual reality with fMRI really navigation? J. Cogn. Neurosci. 25, 1008–1019 (2013). [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 82.Gibson B, Butler WN & Taube JS The head-direction signal is critical for navigation requiring a cognitive map but not for learning a spatial habit. Curr. Biol. 23, 1536–1540 (2013). [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 83.Winter SS, Clark BJ & Taube JS Spatial navigation. Disruption of the head direction cell network impairs the parahippocampal grid cell signal. Science 347, 870–874 (2015). [DOI] [PMC free article] [PubMed] [Google Scholar]
84. Jacob P-Y et al. An independent, landmark-dominated head-direction signal in dysgranular retrosplenial cortex. Nat. Neurosci. 20, 173–175 (2017).
85. Aronov D & Tank DW Engagement of neural circuits underlying 2D spatial navigation in a rodent virtual reality system. Neuron 84, 442–456 (2014).
86. Aghajan ZM et al. Impaired spatial selectivity and intact phase precession in two-dimensional virtual reality. Nat. Neurosci. 18, 121–128 (2015).
87. Chen G, King JA, Burgess N & O'Keefe J How vision and movement combine in the hippocampal place code. Proc. Natl. Acad. Sci. USA 110, 378–383 (2013).
88. Jeffery KJ, Wilson JJ, Casali G & Hayman RM Neural encoding of large-scale three-dimensional space—properties and constraints. Front. Psychol. 6, 297 (2015).
89. Taube JS Head direction cell firing properties and behavioural performance in 3-D space. J. Physiol. (Lond.) 589, 835–841 (2011).
90. Finkelstein A et al. Three-dimensional head-direction coding in the bat brain. Nature 517, 159–164 (2015).
91. Calton JL & Taube JS Degradation of head direction cell activity during inverted locomotion. J. Neurosci. 25, 2420–2428 (2005).
92. Wilson JJ, Page H & Jeffery KJ A proposed rule for updating of the head direction cell reference frame following rotations in three dimensions. Preprint at bioRxiv 10.1101/043711 (2016).
93. Yoder RM & Taube JS Head direction cell activity in mice: robust directional signal depends on intact otolith organs. J. Neurosci. 29, 1061–1076 (2009).
94. Laurens J, Kim B, Dickman JD & Angelaki DE Gravity orientation tuning in macaque anterior thalamus. Nat. Neurosci. 19, 1566–1568 (2016).
95. Mackrous I, Carriot J, Jamali M, Brooks J & Cullen K Selective encoding of unexpected head tilt by the central neurons takes into account the cerebellar computation output. Soc. Neurosci. Poster 718.04 http://www.abstractsonline.com/pp8/#!/4071/presentation/15239 (2016).
96. Aitken P, Zheng Y & Smith PF The regulation of hippocampal theta rhythm by the vestibular system. J. Neurophysiol. (in the press).
97. Jacob P-Y, Poucet B, Liberge M, Save E & Sargolini F Vestibular control of entorhinal cortex activity in spatial navigation. Front. Integr. Neurosci. 8, 38 (2014).
98. Brandon MP et al. Reduction of theta rhythm dissociates grid cell spatial periodicity from directional tuning. Science 332, 595–599 (2011).
99. Koenig J, Linder AN, Leutgeb JK & Leutgeb S The spatial periodicity of grid cells is not sustained during reduced theta oscillations. Science 332, 592–595 (2011).
100. Winter SS, Mehlman ML, Clark BJ & Taube JS Passive transport disrupts grid signals in the parahippocampal cortex. Curr. Biol. 25, 2493–2502 (2015).
