Abstract
Virtual reality (VR) has become particularly popular in neuroscience over the past few decades. VR experiments feature a closed loop between sensory stimulation and behavior: participants interact with the stimuli instead of just passively perceiving them. Several senses can be stimulated at once, and large-scale environments as well as social interactions can be simulated. All of this makes VR experiences more natural than those in traditional laboratory paradigms. Compared to field research, a VR simulation is highly controllable and reproducible, as required of a laboratory technique used in the search for neural correlates of perception and behavior. VR is therefore considered a middle ground between ecological validity and experimental control. In this review, I explore the potential of VR for eliciting naturalistic perception and behavior in humans and non-human animals. In this context, I give an overview of recent virtual reality approaches used in neuroscientific research.
Keywords: virtual reality, naturalistic behavior, naturalistic neuroscience, ecological validity, animal behavior, behavioral neuroscience
1. Introduction
Arguably the most important feature of natural behavior is active exploration and interrogation of the environment (Gottlieb and Oudeyer, 2018). External stimuli are not passively perceived; what is attended to is selected and specifically probed, reflecting the animal's motivations and needs. Moreover, natural environmental features and sensory cues are dynamic, multimodal, and complex (Sonkusare et al., 2019). This is in stark contrast to laboratory settings, which are characterized by numerous repetitions of the same imposed stimuli. These stimuli are often directed at only a single sense under simplified, artificial conditions and are disconnected from the animal's responses. Repetitions are important for behavioral modeling and for the search for neural correlates and mechanisms, both of which rely on trial-based averaging. It is nevertheless not surprising that results from laboratory experiments are of limited ecological validity and may not reveal the neural mechanisms underlying natural behavior (Krakauer et al., 2017; Dennis et al., 2021). Virtual reality (VR) may be part of a solution to this problem.
With VR, an artificial environment is simulated in which the user's actions determine the sensory stimulation, closing the loop between stimulation, perception, and action. A major motivation for the application of VR in neurophysiology is the desire to test behavior while recording with apparatuses that cannot be easily carried by the test subject or that require stability that cannot be achieved during free movement. The potential of VR, however, lies beyond this simple wish to fixate a behaving subject in place.
In connection with scientific VR use, terms like ecological and ethological validity as well as naturalistic conditions are frequently invoked. In this regard, VR is considered superior to traditional laboratory methods while maintaining a similar level of experimental control (e.g., Bohil et al., 2011; Parsons, 2015; Minderer et al., 2016; Krakauer et al., 2017; Lenormand and Piolino, 2022).
In the present article, I explore the potential of VR for evoking naturalistic perception and behavior and for promoting the understanding of the underlying brain function. I give an overview of current VR technologies applied in neuroscience and of use cases across different species, motivate why they are used, and evaluate them in view of naturalistic neuroscience. To begin, let us briefly address the question: what is VR?
2. What is VR?
In his book, LaValle (2020) defines VR as: “Inducing targeted behavior in an organism by using artificial sensory stimulation, while the organism has little or no awareness of the interference.” This definition seems quite broad, but is flexible enough to embrace a variety of approaches, including those relevant to the present article. In a neuroscientific VR experiment, the participant experiences the stimulation of one or more senses to create the illusion of a “reality” that is intended by the researcher. Limited awareness seems less crucial for neuroscientific VR applications. However, one can argue that limited awareness is important for the feeling of presence in the artificial world, which as a result is treated as being natural—a basis for ecological validity.
What is missing from the above definition is that the virtual world is updated based on the user's behavior, providing an interactive experience (Bohil et al., 2011; Dombeck and Reiser, 2011; Naik et al., 2020). For VR applications in neuroscience, this narrowing of the definition is important because it distinguishes VR from simple sensory stimulation. The update is done in real time so that a closed loop is achieved between stimulation and behavior (see the sketch below). For a real-time experience the update cycle needs to be sufficiently fast; how fast depends on the perceptual capabilities of the animal species and the sensory-motor system under investigation. Update delays can also be increased parametrically, depending on the research question. The most extreme case is the open loop, where the stimulation and the participant's actions are independent; open loop corresponds to the conventional stimulus conditions typically used in neuroscience studies.
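To make the closed loop concrete, the following minimal sketch shows an update cycle that converts tracked behavior into the next stimulus frame at a fixed rate. The helper functions `read_behavior` and `display` are hypothetical stand-ins for real hardware I/O; an open-loop experiment would compute the stimulus without consulting the tracked behavior in step 2.

```python
import math
import time

def read_behavior():
    """Stand-in for hardware readout (assumption), e.g., treadmill
    displacement since the last cycle: (forward_m, rotation_rad)."""
    return 0.005, 0.01

def display(position, heading):
    """Stand-in for redrawing the virtual scene from the new viewpoint."""
    print(f"x={position[0]:.3f} y={position[1]:.3f} heading={heading:.2f}")

position = [0.0, 0.0]  # participant's location in the virtual world
heading = 0.0          # viewing direction (rad)
CYCLE_S = 1.0 / 60.0   # 60 Hz is a placeholder; adequate rates are species-specific

for _ in range(120):                     # two simulated seconds
    t0 = time.perf_counter()
    forward, rotation = read_behavior()  # 1. measure behavior
    heading += rotation                  # 2. update the virtual state
    position[0] += forward * math.cos(heading)
    position[1] += forward * math.sin(heading)
    display(position, heading)           # 3. deliver the new stimulus
    # keep the cycle time fixed; residual jitter adds to the loop latency
    time.sleep(max(0.0, CYCLE_S - (time.perf_counter() - t0)))
```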
3. Why use VR? And why for naturalistic neuroscience?
Typical motivations for using VR revolve around three aspects: (1) multimodal stimulation with flexible and precise control, (2) interactivity instead of purely passive perception, and (3) the application of neural recording techniques that require particular mechanical stability. For naturalistic approaches, the first two points are the most important, but in a neuroscience context the last is also relevant. I therefore discuss these three motivations next in view of their utility for naturalistic paradigms. Stimulus control and closed-loop methods have been steadily refined throughout the history of VR. For an account of animal VRs, and specifically of the rodent VRs used in research, the interested reader is referred to Thurley and Ayaz (2017) and Naik et al. (2020); a general history of VR can be found in LaValle (2020). Finally, in this section I address the issue of immersion, i.e., the ability of a VR to draw users in so that they feel present in it, which is closely related to achieving naturalistic conditions with VR.
3.1. VR provides flexible stimulus control
As a laboratory technique, VR benefits from the ability to perform experiments under precise control. This is what lets VR induce targeted behavior. Confounding and unintended influences, although not completely excluded, can be substantially reduced. VR is inherently flexible. It provides control over the complexity of the environment, such as its size or the positioning of landmarks. Space restrictions, which can be a problem in the laboratory, do not exist in VR. Features can be easily and quickly altered without the participant noticing. One may, for example, add or remove certain cues and test their contribution to neural activity or behavior. Such manipulations can be done systematically and without influencing other components of the environment (Powell and Rosenthal, 2017). Stimuli may even be provided that are unavailable in nature, although this speaks against the naturalistic use focused on in the present article. All of the above is hard to achieve in the field, where it is often less obvious which cues are attended to and which information is leveraged; think, for instance, of investigating the spatial navigation of primates in their natural habitat (De Lillo et al., 2014; Dolins et al., 2014, 2017; Allritz et al., 2022).
In the following, I will discuss paradigms of VR stimulus control utilized in specific areas of neuroscience research.
3.1.1. Spatial cognition and navigation
The most obvious neuroscience use of VR is in the study of spatial perception and navigation (Bohil et al., 2011; Thurley and Ayaz, 2017). However, the advantage of VR for this purpose has been questioned, since locomotion in traditional real-world laboratory paradigms, like foraging for food on linear tracks and in open-field boxes, is more natural than on a treadmill or the like (Minderer et al., 2016). In real arenas, information from the external world, e.g., from visual cues, and internally generated information, e.g., from moving body parts, are coordinated. In VR, these sources may not be aligned due to problems integrating simulation and tracking.
Such conflicts are likely the reason for the altered responses of space-encoding neurons in the rodent brain found in VR compared to real-world experiments. Head-fixed or body-fixed rodents do not receive normal vestibular input, resulting in mismatches between vestibular and visual information. Place cells in the hippocampus show altered position coding under such conditions, as confirmed by direct comparisons between virtual paradigms and their real-world counterparts (Chen et al., 2013; Ravassard et al., 2013; Aghajan et al., 2014). In VR setups that do not restrict body rotations, and in which vestibular information about rotational movements is therefore available to the animal, normal place-selective firing has been reported (Aronov and Tank, 2014; Chen et al., 2018; Haas et al., 2019). VRs for freely moving animals may solve this problem even better (Del Grosso et al., 2017; Kaupert et al., 2017; Stowers et al., 2017; Madhav et al., 2022).
Thus, the design of a VR setup and the quality of a VR simulation may elicit atypical neural responses. These issues do not devalue certain VR systems—each setup may provide informative insights—but they are important indications that the suitability for understanding natural behavior and associated neural activity may be limited for some VR applications.
However, VR has advantages over traditional laboratory paradigms, which are themselves far from the situation animals face in the wild. These advantages depend on the research question. The possibility of simulating environments much larger than the available laboratory space can increase ecological validity (Dolins et al., 2017), e.g., for the study of spatial learning in macaques (Taillade et al., 2019) and chimpanzees (Allritz et al., 2022), or of search behavior in flies (Kaushik et al., 2020). Specialized VR systems and paradigms can provide insights into specific topics (e.g., path integration: Petzschner and Glasauer, 2011; Lakshminarasimhan et al., 2018, 2020; Thurley and Schild, 2018; Jayakumar et al., 2019; Robinson and Wiener, 2021; Madhav et al., 2022). VR also enables task standardization for cross-species comparison; for instance, spatial behavior can be tested with humans in typical rodent laboratory mazes like the Morris water maze (Laczó et al., 2010). Moreover, VR helps to overcome the difficulties of testing spatial behavior and cognition in the wild, as already pointed out above.
3.1.2. (Multi-)sensory processing
In VR, several senses can be stimulated at once and in concert. Such multimodal stimulation increases immersion and engagement. The experience becomes more ecologically valid and, if natural stimuli are used, more naturalistic. Even the first applications of VR, e.g., for studying the sensory-motor control of flying in insects, combined visual, mechanosensory (wind source), and olfactory cues (Gray et al., 2002). In general, any VR method that connects locomotion with some type of sensory stimulation provides a multimodal experience, because it inevitably encompasses sensory feedback about self-motion. In this sense, even the most typical VR, which pairs visual stimulation with walking on a treadmill or tethered flying, will always be multimodal.
In principle, unnatural stimuli may be presented, or the stimulation of different senses may be mismatched, allowing for experiments that are not possible in the real world. Of course, this speaks against the naturalistic principle, but let me nevertheless give a few examples for illustration. VR makes it possible to decouple stimuli that are inextricably linked in the real world. In rodent experiments, visual inputs have been dissociated from non-visual self-motion inputs to probe their differential influences on spatial responses in the hippocampal formation (Chen et al., 2013; Tennant et al., 2018; Haas et al., 2019; Jayakumar et al., 2019) or on running-speed responses in visual cortex (Saleem et al., 2013). In flies, the feedback from eyes and halteres has been decoupled in simulated flight setups (Sherman and Dickinson, 2003). The lag between an action and the subsequent update of the virtual stimulation can also be changed; for instance, positional changes can be delayed or made jump-/teleportation-like (see Domnisoru et al., 2013; Kaupert et al., 2017; Stowers et al., 2017; Tennant et al., 2018, for examples from experiments in fish and rodents). Thus, in general, sensory and motor variables can be separated in VR.
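As an illustration of such decoupling, the following sketch (a hypothetical construction, not the implementation of any of the cited studies) inserts a gain factor and a delay buffer between the measured self-motion and the virtual position update; with `gain=1.0` and `delay_cycles=0` it reduces to a faithful closed loop.

```python
from collections import deque

class MotionCoupler:
    """Couples measured self-motion to the virtual position update.
    gain != 1 dissociates visual from locomotor self-motion cues;
    delay_cycles > 0 postpones the visual consequence of an action."""

    def __init__(self, gain=1.0, delay_cycles=0):
        self.gain = gain
        self.buffer = deque([0.0] * delay_cycles)

    def step(self, measured_displacement):
        self.buffer.append(self.gain * measured_displacement)
        return self.buffer.popleft()  # displacement applied to the VR

coupler = MotionCoupler(gain=0.5, delay_cycles=6)  # e.g., 100 ms at 60 Hz
virtual_position = 0.0
for displacement in [0.01] * 12:  # constant running speed
    virtual_position += coupler.step(displacement)
    print(round(virtual_position, 3))
```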
3.1.3. Social interactions
The issues of laboratory vs. field work also apply to the study of social interactions. VR can alleviate some of them. Virtual stimuli can be designed to appear more similar to real-life counterparts than stimuli used in classical ethological experiments (Naik et al., 2020). A particular advantage is that stimuli can be animated. Even under open-loop conditions, moving prey can be simulated to study prey-capture (Ioannou et al., 2012) or conspecifics to probe mate-choice (Gierszewski et al., 2017). Further examples can be found in Naik et al. (2020).
An important point for experiments on social interactions is consistency (Powell and Rosenthal, 2017). Whether in the wild or in the laboratory, the behavior of real subjects depends on their motivation and changes through their interaction with others. Simulated subjects do not change their behavior in this manner (Chouinard-Thuly et al., 2017). Like other VR stimuli, socially relevant ones can be precisely controlled, held constant or adapted, and presented several times and to different subjects. Importantly, not just the static appearance is under close control but also the simulated movement patterns. In general, interaction with moving objects may sometimes be easier to simulate in VR than to provide in the real world; cf. the prey-capture study mentioned above (Ioannou et al., 2012).
3.2. VR goes beyond mere stimulus delivery
With VR the loop between perception and action can be closed. The participant in a VR experiment not only passively perceives the stimulation but behaves and interacts with it, which in turn changes the stimulus environment. This active engagement makes the VR experience much more reminiscent of real life and natural conditions than traditional, passive approaches.
The importance of motor actions for perception has been demonstrated, for instance, by experiments with mice moving on a treadmill while perceiving visual stimuli. Even when the treadmill is not coupled to the stimulation in such experiments, i.e., in open loop, neuronal responses in the visual cortex are substantially modulated (Niell and Stryker, 2010; Ayaz et al., 2013). More recently, impacts of movement have been described not only for vision (Dadarlat and Stryker, 2017; Clancy et al., 2019) but also for audition and somatosensation (Fu et al., 2014; Schneider and Mooney, 2018). A similar dependence of sensory processing on behavioral state has also been reported in insects (Maimon et al., 2010). In zebrafish, closed-loop experiments have been used to investigate the interaction between motor responses and visual feedback (Portugues and Engert, 2011), the related neural processing (Ahrens et al., 2012), and the visually driven swim patterns underlying natural prey capture (Trivedi and Bollmann, 2013).
A closed loop is also helpful for decision-making studies in species such as mice (Harvey et al., 2012), gerbils (Kautzky and Thurley, 2016), and zebrafish (Bahl and Engert, 2020; Dragomir et al., 2020). These studies use rather abstract visual stimuli, like random dots and stripe patterns, which are admittedly not very naturalistic. However, enabling natural motor responses, like walking and swimming, is a key improvement over conventional designs that rely on nose-poking or lever-pressing.
3.3. VR enables recording of brain activity with bulky devices
One of the early motivations for using VR in neuroscience was that it permits neural recording techniques that require a high degree of mechanical stability or are too bulky and heavy to be carried by the animal (Dombeck et al., 2010; Harvey et al., 2012; Ahrens et al., 2013; Domnisoru et al., 2013; Schmidt-Hieber and Häusser, 2013; Leinweber et al., 2014). Similar reasons apply to the use of VR with fMRI or other methods for recording human brain activity (Lenormand and Piolino, 2022). The application of VR with a focus on recording brain activity in the behaving animal has been reviewed, e.g., by Dombeck and Reiser (2011). Technological progress will increasingly weaken this motivation for using VR in the future: miniature head-mounted systems for imaging (Yu et al., 2015) and single-cell recordings (Valero and English, 2019) are under constant development and can be applied in freely moving animals.
3.4. Achieving immersion and presence in VR
Important concepts that are frequently expressed, especially in the context of human VR, are those of immersion and presence. How immersive a VR is, i.e., how strongly it draws the user in, is determined by the degree of sensory stimulation and the sensitivity to motor actions of the VR system in use. Deeper immersion leads to increased presence, i.e., the feeling of being in the virtual world (Bohil et al., 2011). For recreational or therapeutic applications with humans, high levels of immersion are surely desired and necessary. But how about scientific use?
Immersion does not seem to be the most important factor for every research question; a VR setup could in principle serve merely as a tool to provide some sensory stimulation and connect it to behavior. However, the ultimate goal of neuroscience is to investigate behavioral and brain responses as they occur under natural conditions (Krakauer et al., 2017). A VR approach can only contribute to this goal if it elicits such responses. Yet a VR that evokes responses as in real life implies deep immersion. Thus, ecological validity and immersion are linked.
But how can presence and immersion be determined? Humans can be questioned (Hofer et al., 2020), but what about animals? To determine and quantify the degree of immersion, two types of responses are at hand: neural and behavioral. However, neural activity observed in VR need not have a counterpart in natural behavior, so only behavioral responses are suitable for determining proximity to natural conditions (Krakauer et al., 2017). This requires a sufficient understanding, under real-world conditions, of the behavior to be elicited in VR.
A number of studies have compared physiological and psychological reactions between real-life situations and their virtual counterparts in humans (examples are reviewed by Lenormand and Piolino, 2022). Behavioral conformity between the virtual and the real world is less regularly assessed with animals (Powell and Rosenthal, 2017). While such comparisons are more common in insects (see Dahmen et al., 2017 for an elegant example with ant spatial navigation) and spiders (Peckmezian and Taylor, 2015), work with rodents on this topic is scarce (Hölscher et al., 2005). Instead, comparisons of neural activity between real-world and virtual conditions have been made (e.g., for space-related responses in rodent hippocampus: Chen et al., 2013; Ravassard et al., 2013; Safaryan and Mehta, 2021).
Immersion is often considered in the context of the quality of the visual stimulation, since vision is the dominant sensory modality in primates and flying insects. For other species, however, vision is not as dominant. In these animals, immersion and ecological validity will depend more strongly on other types of perception, e.g., sound, touch, or smell. See Faumont et al. (2011) for an example that may not seem like VR at first glance but is nonetheless consistent with the broad definition of VR favored in this article, and which is ecologically valid: in this study, osmo-sensitive neurons of the nematode Caenorhabditis elegans were optogenetically activated to simulate an aversive location in the animal's environment.
4. Technical components for naturalistic VR
Key technical components of VR are devices that provide sensory stimulation to create the virtual experience and those that keep track of the behavioral responses. How these components may promote naturalistic stimulation and behavior is discussed next.
4.1. Tracking movements and actions
In VR setups, participants are often restrained so that they can sense the stimuli appropriately while retaining enough freedom to move in the virtual environment. For instance, a specific position may need to be maintained relative to a screen for visual stimulation or to speakers for auditory stimulation. Other reasons are the requirements on mechanical stability of neural recording devices, as discussed above.
The type of fixation depends on the tested species. Flying insects may be tethered by the body, leaving the wings free to beat (Gray et al., 2002; Sherman and Dickinson, 2003; Dombeck and Reiser, 2011). Wing motion is monitored with an optical sensor, and the difference between the amplitudes of left and right wing beats serves as an indicator of attempted body rotations (Reiser and Dickinson, 2008). In legged animals, fixation on a treadmill is the standard technique (Carrel, 1972; Dahmen, 1980; Seelig et al., 2010; Takalo et al., 2012; Peckmezian and Taylor, 2015; Thurley and Ayaz, 2017; Haberkern et al., 2019; Naik et al., 2020). Such treadmills are typically styrofoam balls on an air cushion, cylindrical treadmills, or linear belts. The animals move the treadmill with their legs; this movement is captured and used to update the position in the virtual world (see the sketch below). In animals like rodents, which have a natural need for walking (Meijer and Robbers, 2014), a treadmill gives the animals a more natural way of responding—even in non-spatial tasks (Garbers et al., 2015; Kautzky and Thurley, 2016; Henke et al., 2021, 2022). To provide a realistic, natural feeling of motion, the physical properties of the treadmill, such as its moment of inertia, must be taken into account and adapted to the animal species. For instance, treadmills for ants have particularly low friction and weight (Dahmen et al., 2017).
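The conversion from treadmill motion to virtual movement can be sketched for an idealized spherical treadmill: rotations of the ball around its three axes, as reported by the optical sensors, are mapped onto forward, sideways, and turning components of the animal's virtual pose. Function and constant names in the following sketch are placeholders, and the sensor geometry is simplified for illustration.

```python
import math

BALL_RADIUS_M = 0.1  # styrofoam ball radius; species-dependent (placeholder)

def ball_to_virtual(pitch, roll, yaw, position, heading):
    """Map ball rotations (rad, since the last cycle) onto an update of
    the animal's virtual pose: pitch moves it forward, roll sideways,
    and yaw turns it."""
    heading += yaw
    forward = pitch * BALL_RADIUS_M   # arc length rolled under the feet
    sideways = roll * BALL_RADIUS_M
    x, y = position
    x += forward * math.cos(heading) - sideways * math.sin(heading)
    y += forward * math.sin(heading) + sideways * math.cos(heading)
    return (x, y), heading

pose = ((0.0, 0.0), 0.0)
pose = ball_to_virtual(0.5, 0.0, 0.0, *pose)          # forward run
pose = ball_to_virtual(0.0, 0.0, math.pi / 2, *pose)  # turn on the spot
print(pose)
```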
Any type of fixation imposes unnatural movements and disrupts sensory feedback about motor behavior. Tethered insects do not receive normal input from their balance organs (Fry et al., 2008). Head-fixed rodents do not receive natural input about rotations and linear accelerations from their vestibular organs. They also have to make unnatural shearing movements with their legs on the treadmill to turn in the virtual environment (Thurley and Ayaz, 2017). A similar lack of vestibular input is found in zebrafish VRs, in which the animals' heads are immobilized (e.g., Portugues and Engert, 2011). A solution to this problem is offered by VR setups for freely flying, walking, and swimming animals (Fry et al., 2008; Del Grosso et al., 2017; Stowers et al., 2017; Ferreiro et al., 2020; Madhav et al., 2022). These setups use cameras to track the position of the animal (or only its head) and update a perspective-correct visual scenery. Alternatively, tracking information can be used to drive a motorized treadmill that compensates for the animal's movements, holding it in place with respect to the VR hardware (Kaupert et al., 2017).
Several technical considerations apply to ensure proper tracking, especially to meet the needs of the experimental animal (see Naik et al., 2020). A number of different tracking methods exist, ranging from classical computer vision to deep learning and other machine learning techniques (e.g., Hedrick, 2008; Robie et al., 2017; Graving et al., 2019; Mathis and Mathis, 2020; Vagvolgyi et al., 2022); the sketch below illustrates the classical principle.
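For illustration, the principle of camera-based position tracking can be reduced to something far simpler than the learning-based methods cited above: threshold the image and take the centroid of the animal's silhouette. The minimal sketch below uses OpenCV and assumes a dark animal on a light, evenly lit background and a camera at index 0 (both assumptions).

```python
import cv2  # pip install opencv-python

cap = cv2.VideoCapture(0)  # camera index is setup-specific (assumption)
while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    # dark animal on light background -> inverted binary threshold
    _, binary = cv2.threshold(gray, 60, 255, cv2.THRESH_BINARY_INV)
    m = cv2.moments(binary)
    if m["m00"] > 0:  # centroid of all "animal" pixels
        x, y = m["m10"] / m["m00"], m["m01"] / m["m00"]
        print(f"animal at ({x:.0f}, {y:.0f})")  # would feed the VR update
    cv2.imshow("tracking", binary)
    if cv2.waitKey(1) == 27:  # Esc quits
        break
cap.release()
cv2.destroyAllWindows()
```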
Body fixation is also not required when the stimulus display is attached directly in front of the sense organ and can be carried, as with head-mounted displays. In VR headsets, head-mounted displays are combined with head-tracking hardware (Bohil et al., 2011; LaValle, 2020). Headsets prevail in human VR nowadays, but other tracking methods, like treadmills for humans, also exist (examples are found in LaValle, 2020). With humans and other primates, joysticks, game pads, or keyboards are often used to track motion and other responses (Washburn and Astur, 2003; Sato et al., 2004), for instance, when particular fixation is necessary, as in fMRI (Lenormand and Piolino, 2022).
4.2. Displaying visual stimuli
Visual virtual worlds are the predominant type of VR. They are almost exclusively provided in first-person view, i.e., from the point of view of the participant. Compared to a third-person perspective behind a visible avatar—which might be possible with humans but is hard to imagine with animals—the first-person view enhances the experience (Dolins et al., 2017). For presentation, different types of displays are used, such as simple monitors, panoramic projection screens, and head-mounted displays. Projections onto the floor below the animals are also leveraged, e.g., with zebrafish (Ahrens et al., 2012; Bahl and Engert, 2020; Dragomir et al., 2020). For insects, with their lower visual acuity but fast reaction times, LED displays are used (Dombeck and Reiser, 2011). For animals with frontally placed eyes, like primates and carnivorans, flat monitors may be sufficient. For animals with laterally positioned eyes, like rodents, only wide displays cover a sufficient part of the visual field (Dolins et al., 2017; Thurley and Ayaz, 2017). For ecological validity, the projection needs to have correct perspective and be undistorted (Dolins et al., 2017; Naik et al., 2020).
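The geometry behind perspective correctness is compact: each virtual point must be drawn where the line from the (tracked) eye to that point intersects the screen. The following sketch assumes a flat screen in the z = 0 plane and is not any particular system's implementation; it shows how the drawn position must shift when the head moves, which is what perspective-correct, freely moving VRs update continuously.

```python
import numpy as np

def project_to_screen(point, eye):
    """Perspective-correct projection of a 3D point (z < 0, behind the
    screen plane z = 0) for an observer at `eye` (z > 0, in front of it).
    Returns the screen coordinates at which the point must be drawn so
    that it appears in the correct direction from the observer's view."""
    point, eye = np.asarray(point, float), np.asarray(eye, float)
    t = eye[2] / (eye[2] - point[2])  # ray-plane intersection parameter
    return (eye + t * (point - eye))[:2]

# a virtual object 1 m behind the screen, seen from two head positions
print(project_to_screen([0.0, 0.2, -1.0], eye=[0.0, 0.0, 0.5]))
print(project_to_screen([0.0, 0.2, -1.0], eye=[0.3, 0.0, 0.5]))  # head moved
```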
In general, it has to be kept in mind that images shown on displays are perceived differently by different animals (Chouinard-Thuly et al., 2017; Naik et al., 2020). Photoreceptor sensitivities differ across species (e.g., Osorio and Vorobyev, 2005), and the color display has to be adapted to the species' specifics to enable naturalistic stimulation; behavioral methods can be used to read out an animal's sensitivities (Knorr et al., 2018). Other visual capabilities, like integration times and acuity, also vary between species and need to be accommodated. For a detailed discussion with a focus on technical challenges see Naik et al. (2020). Similar considerations obviously apply to other sensory systems as well and have to be taken into account, especially when a naturalistic perceptual experience is intended.
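To first approximation, adapting a color display to a species can be framed as a small linear-algebra problem: each photoreceptor's quantum catch is linear in the drive of each display primary, so the drives that reproduce a set of target catches follow from a linear solve. The sketch below uses invented Gaussian spectra as placeholders; real receptor sensitivities and primary emission spectra would have to be measured.

```python
import numpy as np

wavelengths = np.arange(380, 701, 5, dtype=float)  # nm

def gaussian(peak_nm, width_nm=40.0):
    return np.exp(-0.5 * ((wavelengths - peak_nm) / width_nm) ** 2)

# Placeholder spectra: photoreceptor sensitivities of a hypothetical,
# UV-shifted species and the emission spectra of three display primaries.
receptors = np.stack([gaussian(p) for p in (360.0, 450.0, 530.0)])
primaries = np.stack([gaussian(p) for p in (460.0, 530.0, 620.0)])  # B, G, R

# catch[i, j]: quantum catch of receptor i per unit drive of primary j
catch = receptors @ primaries.T

target = np.array([0.2, 0.6, 0.4])       # desired receptor catches
drives = np.linalg.solve(catch, target)  # primary drives to produce them
print(drives)  # out-of-range values flag catches the display cannot produce
```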
Images on a screen remain 2D, and natural vision is only partially achieved (Dolins et al., 2017). For instance, stereopsis is not possible with single images. Head-mounted displays for humans solve this by presenting offset images to each eye (LaValle, 2020); for an approach with insects, see Nityananda et al. (2016). Currently, there are no VR headsets for animals, although they may be in development, since, apart from species-specific needs, they pose mainly a miniaturization problem. Technology in this direction includes head-mounted camera systems to track eye movements (Meyer et al., 2018) and inertial sensors for head tracking in rodents (Venkatraman et al., 2010; Fayat et al., 2021).
4.3. Sound stimulation
To simulate 3D spatial sound scenes that mimic real-life situations, virtual acoustics approaches have been developed. Human VR headsets often include headphones to provide sound stimuli in conjunction with the visual display. Alternatively, in free-field auralization, arrays of loudspeakers are placed around the user, such that sound sources can be precisely positioned in virtual space (Seeber et al., 2010). In contrast to headphones, the user listens with their own ears, whose acoustic characteristics are thus naturally captured; experiments can therefore also be done with hearing-aid wearers. A disadvantage is that the setup has to be placed in an anechoic chamber, which is demanding and expensive to construct. For correct delivery of sound cues, the user has to be placed at a specific location with respect to the array. With such auditory VR setups, e.g., auditory motion parallax has been demonstrated in humans (Genzel et al., 2018). In rodents, virtual acoustics is implemented with loudspeakers placed around the treadmill (Cushman et al., 2013; Funamizu et al., 2016). Other approaches use an augmentation of a real arena rather than virtual acoustics to probe the spatial localization of objects with the help of acoustic stimulation (Ferreiro et al., 2020; Amaro et al., 2021).
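The basic principle of placing a virtual source between loudspeakers can be illustrated with tangent-law amplitude panning between two adjacent speakers. This is a toy sketch only; calibrated free-field auralization as in Seeber et al. (2010) is considerably more involved.

```python
import math

def pan_gains(source_az, left_az, right_az):
    """Tangent-law amplitude panning: distribute a source between two
    adjacent loudspeakers so that it is perceived at azimuth source_az
    (all angles in degrees, speakers symmetric around their midpoint)."""
    half = math.radians((right_az - left_az) / 2)
    offset = math.radians(source_az - (left_az + right_az) / 2)
    ratio = math.tan(offset) / math.tan(half)  # -1..1 between the speakers
    g_left, g_right = (1 - ratio) / 2, (1 + ratio) / 2
    norm = math.hypot(g_left, g_right)         # keep loudness constant
    return g_left / norm, g_right / norm

# virtual source 10 degrees right of center, speakers at -30/+30 degrees
print(pan_gains(10.0, -30.0, 30.0))
```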
4.4. Tactile and haptic stimulation
In VR, tactile and haptic stimulation can also be provided, simulating surfaces with different textures or the feel of forces (Bohil et al., 2011). Haptic systems for humans consist, e.g., of robotic arms with which force or pressure can be applied, or of pin arrays used to simulate surfaces (Culbertson et al., 2018; Wang et al., 2019). In tactile VR systems for rodents, the animals move through corridors simulated by movable plates (Sofroniew et al., 2014) or rotating cylinders with different textures (Ayaz et al., 2019). These “walls” are touched by the animals with their whiskers and are adapted in closed loop to the movements of the animal. Similar setups exist in which the animals move freely; these are not actually VR but still allow for simulating different tactile textures (Kerekes et al., 2017). Belt treadmills can also be equipped with tactile cues (Geiller et al., 2017).
4.5. Odors
Recently, devices have been developed to deliver odorants quickly and precisely, with diffusion and clearance times sufficient for simulating spatially confined olfactory cues. Examples for use with humans are Salminen et al. (2018) and Micaroni et al. (2019). In animal studies, olfactory VR has been used with tethered rodents (Radvansky and Dombeck, 2018; Fischler-Ruiz et al., 2021; Radvansky et al., 2021) and insects (Gray et al., 2002). Precise odor delivery poses a problem for freely moving VRs: either a distribution system has to be carried on the body or, alternatively, odors can be delivered at room scale (Fry et al., 2008). The latter, however, is hard to control in terms of odor concentration and distribution, preventing proper localization. Systems for humans that simulate taste are under development (Narumi et al., 2011; Vi et al., 2017; Kerruish, 2019) but, to my knowledge, have not yet been used in neuroscience.
4.6. Rotation and gravity
VR setups that require fixation of the animals typically provide only inadequate information about rotational and linear acceleration. To overcome such problems, motion platforms with multiple degrees of freedom, or rotating chairs providing horizontal rotations, have been used for vestibular stimulation (Gu et al., 2010; Dokka et al., 2011; Genzel et al., 2016; Garzorz and MacNeilage, 2017). Similarly, rotational gimbals have been used with flies (Sherman and Dickinson, 2003). VR setups that allow for free movement do not suffer from these problems.
5. Limitations and potentials for naturalistic VR
Technical considerations for VRs with respect to species specifics and naturalistic experiments have already been discussed above. Here I address some more general issues.
5.1. Not everything can be tested in VR in terms of naturalistic experiments
In principle, there is no limitation on what can be simulated with VR. For the purpose of the present article, the simulation just needs to be naturalistic. We can intuitively judge how a VR simulation affects a human participant—or often simply take it for granted that we can—but this is impossible with animals. Thus, as argued above, naturalistic approaches must ensure that a VR simulation elicits the same behaviors that would occur in its real-world counterpart (Krakauer et al., 2017; Powell and Rosenthal, 2017). This strongly constrains what can and cannot be done with VR in terms of naturalistic experiments. When a strict comparison between the real world and VR is not possible, as with the teleportation-like position changes mentioned above, the experiment is not suitable for a naturalistic VR study. Other questions may be better investigated directly in the real world instead of investing in building a VR with all its limitations.
5.2. How natural can VR become and how natural or real does it have to be?
As pointed out by LaValle (2020), it is tempting to try to match the physical world in VR as closely as possible (the universal simulation principle). Such a goal is inappropriate, since a simulation will never be perfect and will always comprise unanticipated confounding variables. One should rather be guided by the research objective when designing the VR. A sensible design can at times mean reduction and simplification, without losing ecological validity (Bucci-Mansilla et al., 2021). Related to this is the uncanny valley phenomenon, in which high realism of an artificial stimulus makes observers feel uneasy (Chouinard-Thuly et al., 2017; LaValle, 2020). Among non-human animals, this problem has been described in macaques (Steckenfinger and Ghazanfar, 2009).
5.3. VR sickness and fatigue
A regularly encountered problem with human VR applications is that of cyber, simulator, or VR sickness (Bohil et al., 2011; LaValle, 2020). Some participants experience discomfort and nausea due to latencies in the synchronization of the VR components, which result in incongruent sensory inputs. Of particular importance here is vestibular feedback from self-motion that does not match the visual input. This problem may arise from improper tracking, but also from a misunderstanding or disregard of the user's perspective by the designer of the VR experience (LaValle, 2020). Related to this, fatigue can arise. Whereas fatigue is certainly an issue that can be accounted for in animal studies—consider, for example, a treadmill that is too heavy or creates too much friction (Dahmen et al., 2017)—analogs of VR sickness in animals may be difficult to determine. Animal VRs can suffer from unnatural feedback from different senses (Dombeck and Reiser, 2011; Thurley and Ayaz, 2017), as exemplified by the issues of head fixation with regard to hippocampal space-related activity in rodents discussed above.
6. Conclusions
In this article, I have tried to show that VR has a multitude of applications in neuroscience that can help the field advance from traditional laboratory-based to naturalistic research themes. VR can mediate between the opposing poles of ecological validity and experimental control, facilitating the generalization of laboratory results to the situation in the wild. As with any scientific approach, the means have to be adapted to the research question; a specific and maybe novel technology or method does not help with this by itself (Minderer et al., 2016; Thurley and Ayaz, 2017). When designing a VR, it is important to consider the specifics of the model species. Only then can immersion be reached, which results in a naturalistic experience and ecological validity. To determine how immersive a VR experience is, only behavioral readouts are appropriate, and they need to be compared to real-world behavior. Otherwise, VR experiments will likely elicit unnatural behaviors and neural responses, which are not related to the intended research questions (Krakauer et al., 2017; Powell and Rosenthal, 2017).
Developers of VR for humans, especially for consumer applications or therapy, realized that without knowledge about our senses, our perception, and ultimately our brains, it is not possible to build good VR (LaValle, 2020). This closes the cycle for the present article—and presents a somewhat circular argument for the use of VR in naturalistic neuroscience: VR is used in neuroscience to gain insights into perception, behavior, and brain function. However, VR experiments that are naturalistic and ecologically valid can only be conducted if the subjects' perception and behavior, and the knowledge of their physiological basis, are sensibly taken into account.
Author contributions
The author confirms being the sole contributor of this work and has approved it for publication.
Conflict of interest
The author declares that the research was conducted in the absence of any commercial or financial relationships that could be construed as a potential conflict of interest.
Publisher's note
All claims expressed in this article are solely those of the authors and do not necessarily represent those of their affiliated organizations, or those of the publisher, the editors and the reviewers. Any product that may be evaluated in this article, or claim that may be made by its manufacturer, is not guaranteed or endorsed by the publisher.
Acknowledgments
I wish to thank the two reviewers for their detailed comments and valuable suggestions, which contributed substantially to the improvement of the present article. I am also grateful for ongoing support from the Bernstein Center Munich.
References
- Aghajan Z. M., Acharya L., Moore J. J., Cushman J. D., Vuong C., Mehta M. R. (2014). Impaired spatial selectivity and intact phase precession in two-dimensional virtual reality. Nat Neurosci. 18, 121–128. 10.1038/nn.3884 [DOI] [PubMed] [Google Scholar]
- Ahrens M. B., Huang K. H., Narayan S., Mensh B. D., Engert F. (2013). Two-photon calcium imaging during fictive navigation in virtual environments. Front. Neural Circuits 7:104. 10.3389/fncir.2013.00104 [DOI] [PMC free article] [PubMed] [Google Scholar]
- Ahrens M. B., Li J. M., Orger M. B., Robson D. N., Schier A. F., Engert F., et al. (2012). Brain-wide neuronal dynamics during motor adaptation in zebrafish. Nature 485, 471–477. 10.1038/nature11057 [DOI] [PMC free article] [PubMed] [Google Scholar]
- Allritz M., Call J., Schweller K., McEwen E. S., de Guinea M., Janmaat K. R. L., et al. (2022). Chimpanzees (Pan troglodytes) navigate to find hidden fruit in a virtual environment. Sci. Adv. 8:eabm4754. 10.1126/sciadv.abm4754 [DOI] [PMC free article] [PubMed] [Google Scholar]
- Amaro D., Ferreiro D. N., Grothe B., Pecka M. (2021). Source identity shapes spatial preference in primary auditory cortex during active navigation. Curr. Biol. 31, 3875–3883.e5. 10.1016/j.cub.2021.06.025 [DOI] [PubMed] [Google Scholar]
- Aronov D., Tank D. W. (2014). Engagement of neural circuits underlying 2D spatial navigation in a rodent virtual reality system. Neuron 84, 442–456. 10.1016/j.neuron.2014.08.042 [DOI] [PMC free article] [PubMed] [Google Scholar]
- Ayaz A., Saleem A. B., Schölvinck M. L., Carandini M. (2013). Locomotion controls spatial integration in mouse visual cortex. Curr. Biol. 23, 890–894. 10.1016/j.cub.2013.04.012 [DOI] [PMC free article] [PubMed] [Google Scholar]
- Ayaz A., Stäuble A., Hamada M., Wulf M.-A., Saleem A. B., Helmchen F. (2019). Layer-specific integration of locomotion and sensory information in mouse barrel cortex. Nat. Commun. 10:2585. 10.1038/s41467-019-10564-8 [DOI] [PMC free article] [PubMed] [Google Scholar]
- Bahl A., Engert F. (2020). Neural circuits for evidence accumulation and decision making in larval zebrafish. Nat. Neurosci. 23, 94–102. 10.1038/s41593-019-0534-9 [DOI] [PMC free article] [PubMed] [Google Scholar]
- Bohil C. J., Alicea B., Biocca F. A. (2011). Virtual reality in neuroscience research and therapy. Nat. Rev. Neurosci. 12, 752–762. 10.1038/nrn3122 [DOI] [PubMed] [Google Scholar]
- Bucci-Mansilla G., Vicencio-Jimenez S., Concha-Miranda M., Loyola-Navarro R. (2021). Challenging paradigms through ecological neuroscience: Lessons from visual models. Front. Neurosci. 15:758388. 10.3389/fnins.2021.758388 [DOI] [PMC free article] [PubMed] [Google Scholar]
- Carrel J. S. (1972). An improved treading device for tethered insects. Science 175:1279. 10.1126/science.175.4027.1279.b4400809 [DOI] [Google Scholar]
- Chen G., King J. A., Burgess N., O'Keefe J. (2013). How vision and movement combine in the hippocampal place code. Proc. Natl. Acad. Sci. U.S.A. 110, 378–383. 10.1073/pnas.1215834110 [DOI] [PMC free article] [PubMed] [Google Scholar]
- Chen G., King J. A., Lu Y., Cacucci F., Burgess N. (2018). Spatial cell firing during virtual navigation of open arenas by head-restrained mice. eLife 7:e26. 10.7554/eLife.34789.026 [DOI] [PMC free article] [PubMed] [Google Scholar]
- Chouinard-Thuly L., Gierszewski S., Rosenthal G. G., Reader S. M., Rieucau G., Woo K. L., et al. (2017). Technical and conceptual considerations for using animated stimuli in studies of animal behavior. Curr. Zool. 63, 5–19. 10.1093/cz/zow104 [DOI] [PMC free article] [PubMed] [Google Scholar]
- Clancy K. B., Orsolic I., Mrsic-Flogel T. D. (2019). Locomotion-dependent remapping of distributed cortical networks. Nat. Neurosci. 22, 778–786. 10.1038/s41593-019-0357-8 [DOI] [PMC free article] [PubMed] [Google Scholar]
- Culbertson H., Schorr S. B., Okamura A. M. (2018). Haptics: the present and future of artificial touch sensation. Annu. Rev. Control Robot. Auton. Syst. 1, 385–409. 10.1146/annurev-control-060117-105043 [DOI] [Google Scholar]
- Cushman J. D., Aharoni D. B., Willers B., Ravassard P., Kees A., Vuong C., et al. (2013). Multisensory control of multimodal behavior: do the legs know what the tongue is doing? PLoS ONE 8:e80465. 10.1371/journal.pone.0080465 [DOI] [PMC free article] [PubMed] [Google Scholar]
- Dadarlat M. C., Stryker M. P. (2017). Locomotion enhances neural encoding of visual stimuli in mouse v1. J. Neurosci. 37, 3764–3775. 10.1523/JNEUROSCI.2728-16.2017 [DOI] [PMC free article] [PubMed] [Google Scholar]
- Dahmen H. (1980). A simple apparatus to investigate the orientation of walking insects. Cell. Mol. Life Sci. 36, 685–687. 10.1007/BF01970140 [DOI] [Google Scholar]
- Dahmen H., Wahl V. L., Pfeffer S. E., Mallot H. A., Wittlinger M. (2017). Naturalistic path integration of Cataglyphis desert ants on an air-cushioned lightweight spherical treadmill. J. Exp. Biol. 220, 634–644. 10.1242/jeb.148213 [DOI] [PubMed] [Google Scholar]
- De Lillo C., Kirby M., James F. C. (2014). Spatial working memory in immersive virtual reality foraging: path organization, traveling distance and search efficiency in humans (Homo sapiens). Am. J. Primatol. 76, 436–446. 10.1002/ajp.22195 [DOI] [PubMed] [Google Scholar]
- Del Grosso N. A., Graboski J. J., Chen W., Blanco-Hernández E., Sirota A. (2017). Virtual reality system for freely-moving rodents. bioRxiv [preprint]. 10.1101/161232 [DOI] [Google Scholar]
- Dennis E. J., El Hady A., Michaiel A., Clemens A., Tervo D. R. G., Voigts J., Datta S. R. (2021). Systems neuroscience of natural behaviors in rodents. J. Neurosci. 41, 911–919. 10.1523/JNEUROSCI.1877-20.2020 [DOI] [PMC free article] [PubMed] [Google Scholar]
- Dokka K., MacNeilage P. R., DeAngelis G. C., Angelaki D. E. (2011). Estimating distance during self-motion: a role for visual-vestibular interactions. J. Vis. 11, 1–16. 10.1167/11.13.2 [DOI] [PMC free article] [PubMed] [Google Scholar]
- Dolins F. L., Klimowicz C., Kelley J., Menzel C. R. (2014). Using virtual reality to investigate comparative spatial cognitive abilities in chimpanzees and humans. Am. J. Primatol. 76, 496–513. 10.1002/ajp.22252 [DOI] [PMC free article] [PubMed] [Google Scholar]
- Dolins F. L., Schweller K., Milne S. (2017). Technology advancing the study of animal cognition: using virtual reality to present virtually simulated environments to investigate nonhuman primate spatial cognition. Curr. Zool. 63, 97–108. 10.1093/cz/zow121 [DOI] [PMC free article] [PubMed] [Google Scholar]
- Dombeck D. A., Harvey C. D., Tian L., Looger L. L., Tank D. W. (2010). Functional imaging of hippocampal place cells at cellular resolution during virtual navigation. Nat. Neurosci. 13, 1433–1440. 10.1038/nn.2648 [DOI] [PMC free article] [PubMed] [Google Scholar]
- Dombeck D. A., Reiser M. B. (2011). Real neuroscience in virtual worlds. Curr. Opin. Neurobiol. 22, 3–10. 10.1016/j.conb.2011.10.015 [DOI] [PubMed] [Google Scholar]
- Domnisoru C., Kinkhabwala A. A., Tank D. W. (2013). Membrane potential dynamics of grid cells. Nature 495, 199–204. 10.1038/nature11973 [DOI] [PMC free article] [PubMed] [Google Scholar]
- Dragomir E. I., Štih V., Portugues R. (2020). Evidence accumulation during a sensorimotor decision task revealed by whole-brain imaging. Nat. Neurosci. 23, 85–93. 10.1038/s41593-019-0535-8 [DOI] [PubMed] [Google Scholar]
- Faumont S., Rondeau G., Thiele T. R., Lawton K. J., McCormick K. E., Sottile M., et al. (2011). An image-free opto-mechanical system for creating virtual environments and imaging neuronal activity in freely moving Caenorhabditis elegans. PLOS ONE 6:e24666. 10.1371/journal.pone.0024666 [DOI] [PMC free article] [PubMed] [Google Scholar]
- Fayat R., Delgado Betancourt V., Goyallon T., Petremann M., Liaudet P., Descossy V., et al. (2021). Inertial measurement of head tilt in rodents: principles and applications to vestibular research. Sensors 21:6318. 10.3390/s21186318 [DOI] [PMC free article] [PubMed] [Google Scholar]
- Ferreiro D. N., Amaro D., Schmidtke D., Sobolev A., Gundi P., Belliveau L., et al. (2020). Sensory island task (sit): a new behavioral paradigm to study sensory perception and neural processing in freely moving animals. Front. Behav. Neurosci. 14:576154. 10.3389/fnbeh.2020.576154 [DOI] [PMC free article] [PubMed] [Google Scholar]
- Fischler-Ruiz W., Clark D. G., Joshi N. R., Devi-Chou V., Kitch L., Schnitzer M., et al. (2021). Olfactory landmarks and path integration converge to form a cognitive spatial map. Neuron, 109, 4036–4049.e5. 10.1016/j.neuron.2021.09.055 [DOI] [PubMed] [Google Scholar]
- Fry S. N., Rohrseitz N., Straw A. D., Dickinson M. H. (2008). Trackfly: virtual reality for a behavioral system analysis in free-flying fruit flies. J. Neurosci. Methods 171, 110–117. 10.1016/j.jneumeth.2008.02.016 [DOI] [PubMed] [Google Scholar]
- Fu Y., Tucciarone J. M., Espinosa J. S., Sheng N., Darcy D. P., Nicoll R. A., et al. (2014). A cortical circuit for gain control by behavioral state. Cell 156, 1139–1152. 10.1016/j.cell.2014.01.050 [DOI] [PMC free article] [PubMed] [Google Scholar]
- Funamizu A., Kuhn B., Doya K. (2016). Neural substrate of dynamic bayesian inference in the cerebral cortex. Nat. Neurosci. 19, 1682–1689. 10.1038/nn.4390 [DOI] [PubMed] [Google Scholar]
- Garbers C., Henke J., Leibold C., Wachtler T., Thurley K. (2015). Contextual processing of brightness and color in Mongolian gerbils. J. Vis. 15:13. 10.1167/15.1.13 [DOI] [PubMed] [Google Scholar]
- Garzorz I. T., MacNeilage P. R. (2017). Visual-vestibular conflict detection depends on fixation. Curr. Biol. 27, 2856–2861.e4. 10.1016/j.cub.2017.08.011 [DOI] [PubMed] [Google Scholar]
- Geiller T., Fattahi M., Choi J.-S., Royer S. (2017). Place cells are more strongly tied to landmarks in deep than in superficial ca1. Nat. Commun. 8:14531. 10.1038/ncomms14531 [DOI] [PMC free article] [PubMed] [Google Scholar]
- Genzel D., Firzlaff U., Wiegrebe L., MacNeilage P. R. (2016). Dependence of auditory spatial updating on vestibular, proprioceptive, and efference copy signals. J. Neurophysiol. 116, 765–775. 10.1152/jn.00052.2016 [DOI] [PMC free article] [PubMed] [Google Scholar]
- Genzel D., Schutte M., Brimijoin W. O., MacNeilage P. R., Wiegrebe L. (2018). Psychophysical evidence for auditory motion parallax. Proc. Natl. Acad. Sci. U.S.A. 115, 4264–4269. 10.1073/pnas.1712058115 [DOI] [PMC free article] [PubMed] [Google Scholar]
- Gierszewski S., Müller K., Smielik I., Hütwohl J.-M., Kuhnert K.-D., Witte K. (2017). The virtual lover: variable and easily guided 3d fish animations as an innovative tool in mate-choice experiments with Sailfin mollies-II. Validation. Curr. Zool. 63, 65–74. 10.1093/cz/zow108 [DOI] [PMC free article] [PubMed] [Google Scholar]
- Gottlieb J., Oudeyer P.-Y. (2018). Towards a neuroscience of active sampling and curiosity. Nat. Rev. Neurosci. 19, 758–770. 10.1038/s41583-018-0078-0 [DOI] [PubMed] [Google Scholar]
- Graving J. M., Chae D., Naik H., Li L., Koger B., Costelloe B. R., et al. (2019). Deepposekit, a software toolkit for fast and robust animal pose estimation using deep learning. eLife 8:e47994. 10.7554/eLife.47994.sa2 [DOI] [PMC free article] [PubMed] [Google Scholar]
- Gray J. R., Pawlowski V., Willis M. A. (2002). A method for recording behavior and multineuronal CNS activity from tethered insects flying in virtual space. J. Neurosci. Methods 120, 211–223. 10.1016/S0165-0270(02)00223-6 [DOI] [PubMed] [Google Scholar]
- Gu Y., Fetsch C. R., Adeyemo B., Deangelis G. C., Angelaki D. E. (2010). Decoding of MSTD population activity accounts for variations in the precision of heading perception. Neuron 66, 596–609. 10.1016/j.neuron.2010.04.026 [DOI] [PMC free article] [PubMed] [Google Scholar]
- Haas O. V., Henke J., Leibold C., Thurley K. (2019). Modality-specific subpopulations of place fields coexist in the hippocampus. Cereb. Cortex 29, 1109–1120. 10.1093/cercor/bhy017 [DOI] [PubMed] [Google Scholar]
- Haberkern H., Basnak M. A., Ahanonu B., Schauder D., Cohen J. D., Bolstad M., et al. (2019). Visually guided behavior and optogenetically induced learning in head-fixed flies exploring a virtual landscape. Curr. Biol. 29, 1647–1659.e8. 10.1016/j.cub.2019.04.033 [DOI] [PubMed] [Google Scholar]
- Harvey C. D., Coen P., Tank D. W. (2012). Choice-specific sequences in parietal cortex during a virtual-navigation decision task. Nature 484, 62–68. 10.1038/nature10918 [DOI] [PMC free article] [PubMed] [Google Scholar]
- Hedrick T. L. (2008). Software techniques for two- and three-dimensional kinematic measurements of biological and biomimetic systems. Bioinspir. Biomimet. 3:034001. 10.1088/1748-3182/3/3/034001 [DOI] [PubMed] [Google Scholar]
- Henke J., Bunk D., von Werder D., Häusler S., Flanagin V. L., Thurley K. (2021). Distributed coding of duration in rodent prefrontal cortex during time reproduction. eLife 10:e71612. 10.7554/eLife.71612 [DOI] [PMC free article] [PubMed] [Google Scholar]
- Henke J., Flanagin V. L., Thurley K. (2022). A virtual reality time reproduction task for rodents. Front. Behav. Neurosci. 16:957804. 10.3389/fnbeh.2022.957804 [DOI] [PMC free article] [PubMed] [Google Scholar]
- Hofer M., Hartmann T., Eden A., Ratan R., Hahn L. (2020). The role of plausibility in the experience of spatial presence in virtual environments. Front. Virt. Real. 1:2. 10.3389/frvir.2020.00002 [DOI] [Google Scholar]
- Hölscher C., Schnee A., Dahmen H., Setia L., Mallot H. A. (2005). Rats are able to navigate in virtual environments. J. Exp. Biol. 208(Pt 3), 561–569. 10.1242/jeb.01371 [DOI] [PubMed] [Google Scholar]
- Ioannou C. C., Guttal V., Couzin I. D. (2012). Predatory fish select for coordinated collective motion in virtual prey. Science 337, 1212–1215. 10.1126/science.1218919 [DOI] [PubMed] [Google Scholar]
- Jayakumar R. P., Madhav M. S., Savelli F., Blair H. T., Cowan N. J., Knierim J. J. (2019). Recalibration of path integration in hippocampal place cells. Nature 566, 533–537. 10.1038/s41586-019-0939-3 [DOI] [PMC free article] [PubMed] [Google Scholar]
- Kaupert U., Thurley K., Frei K., Bagorda F., Schatz A., Tocker G., et al. (2017). Spatial cognition in a virtual reality home-cage extension for freely moving rodents. J. Neurophysiol. 117, 1736–1748. 10.1152/jn.00630.2016 [DOI] [PMC free article] [PubMed] [Google Scholar]
- Kaushik P. K., Renz M., Olsson S. B. (2020). Characterizing long-range search behavior in diptera using complex 3d virtual environments. Proc. Natl. Acad. Sci. U.S.A. 117, 12201–12207. 10.1073/pnas.1912124117 [DOI] [PMC free article] [PubMed] [Google Scholar]
- Kautzky M., Thurley K. (2016). Estimation of self-motion duration and distance in rodents. R. Soc. Open Sci. 3:160118. 10.1098/rsos.160118 [DOI] [PMC free article] [PubMed] [Google Scholar]
- Kerekes P., Daret A., Shulz D. E., Ego-Stengel V. (2017). Bilateral discrimination of tactile patterns without whisking in freely running rats. J. Neurosci. 37, 7567–7579. 10.1523/JNEUROSCI.0528-17.2017 [DOI] [PMC free article] [PubMed] [Google Scholar]
- Kerruish E. (2019). Arranging sensations: smell and taste in augmented and virtual reality. Senses Soc. 14, 31–45. 10.1080/17458927.2018.1556952 [DOI] [Google Scholar]
- Knorr A. G., Gravot C. M., Gordy C., Glasauer S., Straka H. (2018). I spy with my little eye: a simple behavioral assay to test color sensitivity on digital displays. Biol. Open 7:bio035725. 10.1242/bio.035725 [DOI] [PMC free article] [PubMed] [Google Scholar]
- Krakauer J. W., Ghazanfar A. A., Gomez-Marin A., MacIver M. A., Poeppel D. (2017). Neuroscience needs behavior: correcting a reductionist bias. Neuron 93, 480–490. 10.1016/j.neuron.2016.12.041 [DOI] [PubMed] [Google Scholar]
- Laczó J., Andel R., Vyhnalek M., Vlcek K., Magerova H., Varjassyova A., et al. (2010). Human analogue of the Morris water maze for testing subjects at risk of Alzheimer's disease. Neuro-degener. Dis. 7, 148–152. 10.1159/000289226 [DOI] [PubMed] [Google Scholar]
- Lakshminarasimhan K. J., Avila E., Neyhart E., DeAngelis G. C., Pitkow X., Angelaki D. E. (2020). Tracking the mind's eye: primate gaze behavior during virtual visuomotor navigation reflects belief dynamics. Neuron 106, 662–674.e5. 10.1016/j.neuron.2020.02.023 [DOI] [PMC free article] [PubMed] [Google Scholar]
- Lakshminarasimhan K. J., Petsalis M., Park H., DeAngelis G. C., Pitkow X., Angelaki D. E. (2018). A dynamic Bayesian observer model reveals origins of bias in visual path integration. Neuron 99, 194–206.e5. 10.1016/j.neuron.2018.05.040 [DOI] [PMC free article] [PubMed] [Google Scholar]
- LaValle S. (2020). Virtual Reality. Available online at: http://lavalle.pl/vr/
- Leinweber M., Zmarz P., Buchmann P., Argast P., Hübener M., Bonhoeffer T., et al. (2014). Two-photon calcium imaging in mice navigating a virtual reality environment. J. Vis. Exp. 2014:e50885. 10.3791/50885 [DOI] [PMC free article] [PubMed] [Google Scholar]
- Lenormand D., Piolino P. (2022). In search of a naturalistic neuroimaging approach: exploration of general feasibility through the case of VR-fMRI and application in the domain of episodic memory. Neurosci. Biobehav. Rev. 133:104499. 10.1016/j.neubiorev.2021.12.022 [DOI] [PubMed] [Google Scholar]
- Madhav M. S., Jayakumar R. P., Lashkari S. G., Savelli F., Blair H. T., Knierim J. J., et al. (2022). The Dome: a virtual reality apparatus for freely locomoting rodents. J. Neurosci. Methods 368:109336. 10.1016/j.jneumeth.2021.109336 [DOI] [PMC free article] [PubMed] [Google Scholar]
- Maimon G., Straw A. D., Dickinson M. H. (2010). Active flight increases the gain of visual motion processing in drosophila. Nat. Neurosci. 13, 393–399. 10.1038/nn.2492 [DOI] [PubMed] [Google Scholar]
- Mathis M. W., Mathis A. (2020). Deep learning tools for the measurement of animal behavior in neuroscience. Curr. Opin. Neurobiol. 60, 1–11. 10.1016/j.conb.2019.10.008 [DOI] [PubMed] [Google Scholar]
- Meijer J. H., Robbers Y. (2014). Wheel running in the wild. Proc. R. Soc. B 281:20140210. 10.1098/rspb.2014.0210 [DOI] [PMC free article] [PubMed] [Google Scholar]
- Meyer A. F., Poort J., O'Keefe J., Sahani M., Linden J. F. (2018). A head-mounted camera system integrates detailed behavioral monitoring with multichannel electrophysiology in freely moving mice. Neuron 100, 46–60.e7. 10.1016/j.neuron.2018.09.020 [DOI] [PMC free article] [PubMed] [Google Scholar]
- Micaroni L., Carulli M., Ferrise F., Gallace A., Bordegoni M. (2019). An olfactory display to study the integration of vision and olfaction in a virtual reality environment. J. Comput. Inform. Sci. Eng. 19:031015. 10.1115/1.4043068 [DOI] [Google Scholar]
- Minderer M., Harvey C. D., Donato F., Moser E. I. (2016). Neuroscience: virtual reality explored. Nature 533, 324–325. 10.1038/nature17899 [DOI] [PubMed] [Google Scholar]
- Naik H., Bastien R., Navab N., Couzin I. D. (2020). Animals in virtual environments. IEEE Trans. Visual. Comput. Graph. 26, 2073–2083. 10.1109/TVCG.2020.2973063 [DOI] [PubMed] [Google Scholar]
- Narumi T., Nishizaka S., Kajinami T., Tanikawa T., Hirose M. (2011). “Augmented reality flavors: gustatory display based on edible marker and cross-modal interaction,” in Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, CHI '11 (New York, NY: Association for Computing Machinery; ), 93–102. 10.1145/1978942.1978957 [DOI] [Google Scholar]
- Niell C. M., Stryker M. P. (2010). Modulation of visual responses by behavioral state in mouse visual cortex. Neuron 65, 472–479. 10.1016/j.neuron.2010.01.033
- Nityananda V., Tarawneh G., Rosner R., Nicolas J., Crichton S., Read J. (2016). Insect stereopsis demonstrated using a 3D insect cinema. Sci. Rep. 6:18718. 10.1038/srep18718
- Osorio D., Vorobyev M. (2005). Photoreceptor spectral sensitivities in terrestrial animals: adaptations for luminance and colour vision. Proc. R. Soc. B Biol. Sci. 272, 1745–1752. 10.1098/rspb.2005.3156
- Parsons T. D. (2015). Virtual reality for enhanced ecological validity and experimental control in the clinical, affective and social neurosciences. Front. Hum. Neurosci. 9:660. 10.3389/fnhum.2015.00660
- Peckmezian T., Taylor P. W. (2015). A virtual reality paradigm for the study of visually mediated behaviour and cognition in spiders. Anim. Behav. 107, 87–95. 10.1016/j.anbehav.2015.06.018
- Petzschner F. H., Glasauer S. (2011). Iterative Bayesian estimation as an explanation for range and regression effects: a study on human path integration. J. Neurosci. 31, 17220–17229. 10.1523/JNEUROSCI.2028-11.2011
- Portugues R., Engert F. (2011). Adaptive locomotor behavior in larval zebrafish. Front. Syst. Neurosci. 5:72. 10.3389/fnsys.2011.00072
- Powell D. L., Rosenthal G. G. (2017). What artifice can and cannot tell us about animal behavior. Curr. Zool. 63, 21–26. 10.1093/cz/zow091
- Radvansky B. A., Dombeck D. A. (2018). An olfactory virtual reality system for mice. Nat. Commun. 9:839. 10.1038/s41467-018-03262-4
- Radvansky B. A., Oh J. Y., Climer J. R., Dombeck D. A. (2021). Behavior determines the hippocampal spatial mapping of a multisensory environment. Cell Rep. 36:109444. 10.1016/j.celrep.2021.109444
- Ravassard P., Kees A., Willers B., Ho D., Aharoni D., Cushman J., et al. (2013). Multisensory control of hippocampal spatiotemporal selectivity. Science 340, 1342–1346. 10.1126/science.1232655
- Reiser M. B., Dickinson M. H. (2008). A modular display system for insect behavioral neuroscience. J. Neurosci. Methods 167, 127–139. 10.1016/j.jneumeth.2007.07.019
- Robie A. A., Seagraves K. M., Egnor S. E. R., Branson K., Levine J. D., Kronauer D. J. C., et al. (2017). Machine vision methods for analyzing social interactions. J. Exp. Biol. 220, 25–34. 10.1242/jeb.142281
- Robinson E. M., Wiener M. (2021). Dissociable neural indices for time and space estimates during virtual distance reproduction. NeuroImage 226:117607. 10.1016/j.neuroimage.2020.117607
- Safaryan K., Mehta M. R. (2021). Enhanced hippocampal theta rhythmicity and emergence of eta oscillation in virtual reality. Nat. Neurosci. 24, 1065–1070. 10.1038/s41593-021-00871-z
- Saleem A. B., Ayaz A., Jeffery K. J., Harris K. D., Carandini M. (2013). Integration of visual motion and locomotion in mouse visual cortex. Nat. Neurosci. 16, 1864–1869. 10.1038/nn.3567
- Salminen K., Rantala J., Isokoski P., Lehtonen M., Müller P., Karjalainen M., et al. (2018). “Olfactory display prototype for presenting and sensing authentic and synthetic odors,” in Proceedings of the 20th ACM International Conference on Multimodal Interaction, ICMI '18 (New York, NY: Association for Computing Machinery), 73–77. 10.1145/3242969.3242999
- Sato N., Sakata H., Tanaka Y., Taira M. (2004). Navigation in virtual environment by the macaque monkey. Behav. Brain Res. 153, 287–291. 10.1016/j.bbr.2003.10.026
- Schmidt-Hieber C., Häusser M. (2013). Cellular mechanisms of spatial navigation in the medial entorhinal cortex. Nat. Neurosci. 16, 325–331. 10.1038/nn.3340
- Schneider D. M., Mooney R. (2018). How movement modulates hearing. Annu. Rev. Neurosci. 41, 553–572. 10.1146/annurev-neuro-072116-031215
- Seeber B. U., Kerber S., Hafter E. R. (2010). A system to simulate and reproduce audio-visual environments for spatial hearing research. Hear. Res. 260, 1–10. 10.1016/j.heares.2009.11.004
- Seelig J. D., Chiappe M. E., Lott G. K., Dutta A., Osborne J. E., Reiser M. B., et al. (2010). Two-photon calcium imaging from head-fixed Drosophila during optomotor walking behavior. Nat. Methods 7, 535–540. 10.1038/nmeth.1468
- Sherman A., Dickinson M. H. (2003). A comparison of visual and haltere-mediated equilibrium reflexes in the fruit fly Drosophila melanogaster. J. Exp. Biol. 206, 295–302. 10.1242/jeb.00075
- Sofroniew N. J., Cohen J. D., Lee A. K., Svoboda K. (2014). Natural whisker-guided behavior by head-fixed mice in tactile virtual reality. J. Neurosci. 34, 9537–9550. 10.1523/JNEUROSCI.0712-14.2014
- Sonkusare S., Breakspear M., Guo C. (2019). Naturalistic stimuli in neuroscience: critically acclaimed. Trends Cogn. Sci. 23, 699–714. 10.1016/j.tics.2019.05.004
- Steckenfinger S. A., Ghazanfar A. A. (2009). Monkey visual behavior falls into the uncanny valley. Proc. Natl. Acad. Sci. U.S.A. 106, 18362–18366. 10.1073/pnas.0910063106
- Stowers J. R., Hofbauer M., Bastien R., Griessner J., Higgins P., Farooqui S., et al. (2017). Virtual reality for freely moving animals. Nat. Methods 14, 995–1002. 10.1038/nmeth.4399
- Taillade M., N'Kaoua B., Gross C. (2019). Navigation strategy in macaque monkeys: an exploratory experiment in virtual reality. J. Neurosci. Methods 326:108336. 10.1016/j.jneumeth.2019.108336
- Takalo J., Piironen A., Honkanen A., Lempeä M., Aikio M., Tuukkanen T., et al. (2012). A fast and flexible panoramic virtual reality system for behavioural and electrophysiological experiments. Sci. Rep. 2:324. 10.1038/srep00324
- Tennant S. A., Fischer L., Garden D. L. F., Gerlei K. Z., Martinez-Gonzalez C., McClure C., et al. (2018). Stellate cells in the medial entorhinal cortex are required for spatial learning. Cell Rep. 22, 1313–1324. 10.1016/j.celrep.2018.01.005
- Thurley K., Ayaz A. (2017). Virtual reality systems for rodents. Curr. Zool. 63, 109–119. 10.1093/cz/zow070
- Thurley K., Schild U. (2018). Time and distance estimation in children using an egocentric navigation task. Sci. Rep. 8:18001. 10.1038/s41598-018-36234-1
- Trivedi C. A., Bollmann J. H. (2013). Visually driven chaining of elementary swim patterns into a goal-directed motor sequence: a virtual reality study of zebrafish prey capture. Front. Neural Circuits 7:86. 10.3389/fncir.2013.00086
- Vagvolgyi B. P., Jayakumar R. P., Madhav M. S., Knierim J. J., Cowan N. J. (2022). Wide-angle, monocular head tracking using passive markers. J. Neurosci. Methods 368:109453. 10.1016/j.jneumeth.2021.109453
- Valero M., English D. F. (2019). Head-mounted approaches for targeting single-cells in freely moving animals. J. Neurosci. Methods 326:108397. 10.1016/j.jneumeth.2019.108397
- Venkatraman S., Jin X., Costa R. M., Carmena J. M. (2010). Investigating neural correlates of behavior in freely behaving rodents using inertial sensors. J. Neurophysiol. 104, 569–575. 10.1152/jn.00121.2010
- Vi C. T., Ablart D., Arthur D., Obrist M. (2017). “Gustatory interface: the challenges of ‘how’ to stimulate the sense of taste,” in Proceedings of the 2nd ACM SIGCHI International Workshop on Multisensory Approaches to Human-Food Interaction, MHFI 2017 (New York, NY: Association for Computing Machinery), 29–33. 10.1145/3141788.3141794
- Wang D., Guo Y., Liu S., Zhang Y., Xu W., Xiao J. (2019). Haptic display for virtual reality: progress and challenges. Virt. Real. Intell. Hardw. 1, 136–162. 10.3724/SP.J.2096-5796.2019.0008
- Washburn D. A., Astur R. S. (2003). Exploration of virtual mazes by rhesus monkeys (Macaca mulatta). Anim. Cogn. 6, 161–168. 10.1007/s10071-003-0173-z
- Yu H., Senarathna J., Tyler B. M., Thakor N. V., Pathak A. P. (2015). Miniaturized optical neuroimaging in unrestrained animals. NeuroImage 113, 397–406. 10.1016/j.neuroimage.2015.02.070