Abstract
Standard animal behavior paradigms incompletely mimic nature, limiting our understanding of behavior and brain function. Virtual Reality (VR) can help, but poses challenges. Typical VR systems require movement restrictions that disrupt sensorimotor experience, causing neuronal and behavioral alterations. We report the development of FreemoVR, a VR system for freely moving animals. We validate immersive VR for mice, flies and zebrafish. FreemoVR enables new types of experiments by allowing instant, disruption-free environmental reconfigurations and interactions between real organisms and computer-controlled agents. This allowed us to establish a height-aversion assay in mice and to discover visuomotor effects in Drosophila and zebrafish. Furthermore, using a photo-realistic mimic of a zebrafish, we discovered that effective social influence depends on a prospective leader balancing its internally preferred directional choice with social interaction. FreemoVR technology allows detailed investigations into neural function and behavior through the precise manipulation of sensorimotor feedback loops in unrestrained animals.
Keywords: Drosophila, mouse, fish, behavior, motor control, anxiety, vision, mitf-a, leadership, social behavior
Introduction
Sensory experience and motor output are, under natural conditions, inextricably linked in a perception-action cycle. Consequently, brains evolved to control behavior and process information dependent on movement and movement-related state1–7. Many physiological techniques to understand brain function require partially, or fully, immobilizing animals and thereby disrupt sensory feedback, resulting in altered neuronal responses, even in central neurons. For example, the fraction of rodent hippocampal neurons showing sparse spatial response fields is strongly reduced when the restrained animal walks on a spherical treadmill, likely due to the loss of vestibular feedback8, whereas directional tuning of cells in the same region is largely unaffected9. Furthermore, our ability to mimic natural conditions in the laboratory is limited, and realistic conditions can be challenging to implement. Likewise, to unravel rules underlying social behavior, it is essential to manipulate inter-individual interactions in reproducible ways with control over feedback and causality.
Virtual Reality (VR), defined as experimenter-controlled sensory-motor coupling, allows studying behaviors and physiology for which feedback is important. VR systems employed for animal research usually operate on restrained animals and provide a single modality of feedback such as visual10–12 or tactile cues13. By restraining animals, such as on a treadmill, VR allows precisely controlled stimulation14 and brain activity imaging2,15 or electrophysiological recording10,16. Thus, VR enables studying brain responses connected to self-motion while retaining important aspects of natural conditions, such as photo-realistic and perspective-correct visual sceneries. However, because systems for restrained animals usually restore feedback for a single modality, animals still lack normal coupling of the other senses. Even when VR has been used to test navigation in restrained animals by providing visual and auditory cues17, coupling between visual and vestibular feedback remained strongly unnatural.
Here we present a system that overcomes most of these limitations by immersing an unrestrained animal in a reactive, 3D world under computer control. We demonstrate the use of free-moving VR to unravel behavioral differences that went previously unnoticed, to perform experiments involving simulated teleportation and swarms to establish new decision-making paradigms, and to decipher rules that govern social interactions.
Results
Implementation of FreemoVR
We built a VR system that allows an animal to move freely while providing artificial visual feedback by creating a system capable of simulating any desired visual scenery (Fig. 1a-d). This system, FreemoVR, maintains natural sensory-motor feedback for the mechanical senses while providing experimental control over the animal’s visual experience. It is based on the use of animal tracking, precise spatial calibration of computer displays, and computer games technology to draw photo-realistic and perspective-correct images from the animal’s perspective as it walks, flies or swims. We made use of computer vision to track animal position with low latency18. We built behavioral arenas whose walls or floors are computer displays, including projection surfaces of any shape. Compared to other work19–23, the innovations here are, first, the ability to create sophisticated visual VR on arbitrarily shaped displays and, second, the ability to do so for multiple species moving in three dimensions, including under water. Flexibility in arena design and animal tracking allows the experimenter to choose tracking and display technologies best matched to the needs of a particular experiment (Fig. 1a-e, g, Supplementary Fig. 1, 2, Supplementary Videos 1-4, Supplementary Data 1 and strawlab.org/freemovr).
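The central geometric operation is to draw each virtual 3D point where its line of sight from the animal's current eye position intersects the physical display surface. Below is a minimal sketch of this idea for a cylindrical arena such as the FlyCave; the function name and the final mapping from the surface point to projector pixels (via the display calibration) are illustrative assumptions, not the FreemoVR API.

```python
import numpy as np

def ray_cylinder_intersection(eye, target, radius=0.5):
    """Return the point where the ray from `eye` through `target` meets a
    vertical cylinder of the given radius centered on the z-axis (e.g. the
    1 m diameter FlyCave walls). Only the horizontal (x, y) components set
    the intersection radius; z is carried along the ray."""
    d = np.asarray(target, float) - np.asarray(eye, float)
    # Solve |eye_xy + t * d_xy|^2 = radius^2 for the positive root t.
    a = d[0] ** 2 + d[1] ** 2
    b = 2.0 * (eye[0] * d[0] + eye[1] * d[1])
    c = eye[0] ** 2 + eye[1] ** 2 - radius ** 2
    disc = b ** 2 - 4 * a * c
    if a == 0 or disc < 0:
        return None  # ray parallel to the cylinder axis or missing it
    t = max((-b + np.sqrt(disc)) / (2 * a), (-b - np.sqrt(disc)) / (2 * a))
    if t <= 0:
        return None  # virtual point lies behind the observer
    return np.asarray(eye, float) + t * d

# Example: a fly at the arena center, 0.3 m high, viewing a virtual object
# placed outside the arena; a projector calibration (not shown) would then
# map `surface_point` to the pixel to light for this viewpoint.
eye = np.array([0.0, 0.0, 0.3])
virtual_object = np.array([2.0, 1.0, 0.5])
surface_point = ray_cylinder_intersection(eye, virtual_object)
```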
Figure 1. FreemoVR virtual reality system for visual simulation.
(a) Schematic illustrating perspective-correct projection of virtual objects onto a curved display surface. The VR representation of the magenta and cyan objects is their projection onto the display surface from the perspective of the observer. (b) The ‘Flycave’, a 1m high, 1m diameter cylindrical flight arena. Images are displayed on the walls of the cylinder using 3 projectors and show a virtual world consisting of a magenta pyramid and a cyan sphere in front of a photographic background. (c) ‘MouseVR’, a VR apparatus for freely walking mice using a 1.9 m television for display. (d) ‘FishVR’, an apparatus in which zebrafish swim in 9cm deep water in a 32cm diameter hemispherical bowl while a virtual environment is projected from beneath. The apparatus shows the same virtual scene as depicted in b. (e) A real-world (RW) and a virtual reality (VR) post, each 7.5cm in diameter, were presented at the center of a 1m textured cylinder. (f) Top views of Drosophila trajectories over a 12 hour trial in RW (N=25 flies), VR (N=40) and no post (NP) (N=40) conditions. (g) A 4 cm diameter virtual post was simulated at the center of a 30cm textured hemisphere. (h) Top views of zebrafish trajectories from 30 minute trials in RW (N=11), VR (N=13), and no post (‘NP’, N=13) conditions.
Validating Behavioral Responses to Virtual Objects and Virtual Environments
An ideal VR system would be able to mimic the sensory experience of real-world (RW) stimulation. If the VR system works reliably, animal behavior in response to a RW stimulus and to the comparable VR stimulus should be highly similar. We therefore characterized flight trajectories of Drosophila in the presence of an upright grey post placed at the center of an arena onto which a high-contrast checkerboard was projected (Fig. 1e). In response to both the RW object and the VR object, flies typically circled the post (Fig. 1f, Supplementary Figs. 3,4, Supplementary Video 5,6). In the no post (NP) condition, identical except for the absence of the post, flies flew through the entire arena, including the center. Thus, flies behave as if they perceive the virtual and real objects similarly, attesting to the basic functionality of our system.
We performed similar experiments with juvenile zebrafish (46-56dpf) in the presence of an upright black post at the center of a checkerboard texture arena (Fig. 1g). To achieve this, we extended existing multi-camera tracking software18 with the new capability of correcting refraction from the air-water interface. Again, trajectories were consistent with animals perceiving the virtual object similarly to a real object (Fig. 1h, Supplementary Video 7). These data strongly support that our system generates naturalistic visual percepts of objects for freely moving animals.
Free-moving VR in a rodent innate aversion assay
Visual motion parallax from self-motion is important for rodents to estimate horizontal distance24, and we wondered whether height might be estimated similarly. We addressed this question while simultaneously validating FreemoVR's effectiveness in mice, using an assay measuring height aversion. The RW configuration (Fig. 2a) used a circular track placed above a shallow (20 cm) and a deep (40 cm) checkerboard surface. The VR configuration simulated these surfaces, and we also used a static (ST) configuration (Fig. 2b,c, Supplementary Video 8,9). In the RW and VR conditions, check size was 4 cm. Due to perspective, the shallow checks subtend larger visual angles than the deep checks; the ST condition showed checks of similar angular size but without correct motion parallax.
Figure 2. Innate anxiety behavior to real and virtual elevated heights in mice.
(a) Real world (RW) configuration of an elevated anxiety maze paradigm in which an ‘O’ shaped platform is placed above checkerboards at different heights. (b) Stationary (ST) and (c) virtual reality (VR) configurations in which a television is placed under the ‘O’ shaped platform. In VR, a perspective-correct simulation of the RW elevation maze is created using video-based head tracking, whereas in ST the displayed image does not create motion parallax. Experiments were performed in which mouse position was measured to determine locations avoided or preferred by mice. The locations of the shallow and deep sides relative to the experimental room were switched between trials, allowing us to check for effects of the checkerboards and of the distal environment. (d-f) Top view of occupancy in the RW, ST and VR configurations. (g) Time spent on the shallow side in each configuration. (h) Data from g replotted aligned to distal environmental cues to investigate their effect. (i) Distance walked in each configuration. Values are mean of all animals. Error bars represent ± s.d. N=15 mice for RW trials, N=16 for VR, N=16 for ST.
We found that mice spent more time above the RW shallow surface but showed no preference in the ST condition (Fig. 2d-e). In VR, mice spent more time above the simulated shallow surface (Fig. 2f), similar to RW conditions. Statistically, the distribution of time spent per mouse on the shallow side was significantly different from the null hypothesis of 50% in both the RW and VR conditions, but not in the ST condition (Fig. 2g) (one-sample t-test vs. 50%: VR, p=0.0072, N=16 mice; RW, p=0.0071, N=15; ST, p=0.7152, N=16). This shallow preference was overruled by static environmental cues, namely the preference for the side of the arena near a wall in the ST condition (Fig. 2h, Supplementary Fig. 5a) (one-way ANOVA F(2,44) = 16, p < 0.0001; Tukey’s post-hoc test: ST vs VR, p = 0.0017, N=16 vs 16 mice; ST vs RW, p < 0.0001, N=16 vs 15; VR vs RW, p = 0.1456, N=16 vs 15; equal group variance, Bartlett test p=0.94). Neither total locomotion nor head dipping was affected by these conditions (Fig. 2i, Supplementary Fig. 5,6) (one-way ANOVA p = 0.18; VR, N=16 mice; RW, N=15; ST, N=16; equal group variance, Bartlett test p=0.41). Thus, FreemoVR can elicit height aversion in mice and, considering that no such preference exists in the static condition, mouse aversion to heights depends on motion parallax rather than texture.
Action-perception is fundamentally altered in rigidly tethered Drosophila
We sought to determine the effect of a common experimental manipulation – head immobilization – on fly visuo-motor control. In tethered flies, this manipulation maximizes precision of visual stimulation, enables electrophysiology, and has been important for studies of fly vision25,26. The few studies that investigated head immobilization in tethered flight found confusing or minimal effects27,28, whereas head movements are suggested to be important in free flight29,30. To date, no study has directly compared the effect of head immobilization in free versus tethered flight. Enabled by FreemoVR, we could show similar stimuli to tethered and freely flying flies and record behavioral performance.
For the free-flight experiments, we used the animals’ own optomotor response to gather long behavioral trajectories in a limited spatial volume by coupling wide-field optic flow to the animal’s position such that we could “steer” the fly along an infinity-symbol (∞) shaped trajectory via visual rotation (Fig. 3a, Supplementary Fig. 7, Supplementary Video 10). In the head-fixed group, the head was glued to the thorax, preventing relative movement. A control group received glue droplets that were prevented from fusing while hardening. These “head-free glue” flies could still move the head while experiencing similar mechanical and sensory perturbations as head-fixed flies. Head-fixed flies flew relatively little (Supplemental Table 1) and, when they did, were unable to perform optomotor following, in contrast to head-free glued and no-glue flies (Fig. 3a-e). Quantitatively, the correlation between visual stimulus and behavioral response was significantly decreased in head-fixed flies, whereas we measured no such difference between head-free glued flies and wildtype flies with no glue (Fig. 3d, Supplemental Figure 8) (Pearson correlation at a lag of 180 msec, Mann-Whitney test; no-glue vs head-fixed, p=0.003, n trials = 55 vs 34; no-glue vs head-free, p=0.20, n=55 vs 132; head-free vs head-fixed, p=1.8e-5, n=132 vs 34).
Figure 3. Effect of head movement on Drosophila flight.
(a-c) Top views of trajectories in which a freely flying fly was visually stimulated with VR-based panoramic image motion to keep it on an infinity symbol ‘∞’ path. In a, no glue was applied (N=36 flies), whereas in b the head was immobilized (N=78). In c, glue was applied similarly to b, but the droplets on head and thorax were not fused and the head could thus move independently of the thorax (N=39). (d) Correlation between angular velocity of the stimulus and fly trajectory at its maximum value. (e) Photographs of the glue application. (f-g) Top views of virtual trajectories of tethered flies, generated by a simple physical simulation with constant forward thrust and torque resulting from wingstroke amplitude differences. In f, the heads were fixed as in b (N=6). In g, the glue droplets on the head and thorax were not fused (N=6). (h) Correlation between angular velocity of the stimulus and simulated fly trajectory at its maximum value. (i) Histogram of angular velocity for freely flying flies. (j) Histogram of horizontal speed for freely flying flies. (k) Histogram of simulated angular velocity for tethered flies. Histograms include data for all trials. Lagged correlations are the mean of trials from all individuals at a latency of 180 msec. Error bars for all plots represent ± s.d. For free flight experiments: head-fixed n trials = 34, head-free n = 132, wildtype n = 55. For tethered experiments: head-fixed n trials = 11, head-free n = 6. Box plots (d,h) indicate median, upper- and lower-quartile. Whiskers extend to 1.5 IQRs of the lower and upper quartile.
We performed corresponding experiments in rigidly tethered flies (Fig. 3f-h). Both head-fixed and head-free glued flies followed the optomotor stimulus, and we found no significant difference between groups (Fig. 3h, Supplemental Figure 8) (Mann-Whitney test, head-fixed vs head-free, p=0.20, n=11 vs 6). Overall, the effect of head immobilization in free flight is unlikely to result from impaired mechanical maneuverability, because gluing caused little change in free-flight angular velocity statistics (Fig. 3i) and flight speed increased with glue application (Fig. 3j). In tethered flight, head-fixed and head-free glued flies showed little difference in angular velocity statistics (Fig. 3k). Our results indicate that tethered experiments, even without head fixation, cannot be used to study head movements for naturalistic flight control because flight control is fundamentally altered by tethering.
Experiments with FreemoVR reveal a subtle, specific visuomotor deficit in mitf-a mutant zebrafish
The zebrafish strain nacre, which carries a loss-of-function mutation in the mitf-a gene31, has reduced pigmentation and is frequently used for whole-brain activity imaging in combination with sensory-motor behavior32. Two previous investigations reported no behavioral differences between nacre and wildtype fish33,34. Given that behavioral differences exist even between wildtype strains35,36, we wondered if such differences might have gone unnoticed due to the level of analysis or the particular assays used. An automated VR system gives us the ability to test multiple visuo-motor assay conditions by automatically and instantly reconfiguring the visual environment and experimental conditions with no physical disturbance.
We compared visuo-motor behaviors of nacW2 mutant and wildtype AB larvae (Fig. 4a,b) using a modified infinity-symbol assay. We used a panoramic cloud of many circles, like a random-dot display (Supplementary Fig. 9, Supplementary Video 11), to elicit compensatory movements by automatically adjusting the 3D linear velocity of the dots. When tested with 3.1° dots, both nacW2 and wildtype fish robustly followed the target trajectory equally well (Fig. 4c), and the time-varying correlation between the lateral component of the input visual stimulus velocity and the output swim velocity showed no difference (Fig. 4d,e) (Pearson correlation at lags of 20-30 msec, Mann-Whitney test, p=0.52, n=112 vs 124). With 1.3° dots, however, wildtype fish reversed the direction of their swim response, perhaps due to dot size dependency37 or spatial aliasing38. Mutant mitf-a fish showed near-zero correlation (Fig. 4f,g,h), significantly different from wildtype (p=0.014). Control experiments with a zero-contrast gray showed, as expected, no correlation between stimulus and response and no significant difference (p=0.82) between strains (Fig. 4i,j,k), and we found no obvious motor difference between strains (Supplementary Fig. 10). The behavioral alteration associated with the mitf-a strain is thus specific to the small-dot condition. This finding demonstrates the sensitivity and suitability of our VR system for discovering even relatively small deviations from wildtype behavior in freely moving animals.
Figure 4. Specific visuomotor deficit in mitf-a-/- zebrafish.
(a) Photograph of wildtype (AB strain) and (b) mitf-a-/- mutant larvae (7dpf). (c) Top views of trajectories of fish in infinity symbol ‘∞’ assay in which a cloud of dots (angular size 3.1°) in 3D space were presented to wildtype and mutant fish. (d) Lagged Pearson correlation between lateral velocity of the visual motion in body coordinates and angular velocity of the swim trajectory for the two genotypes (Mean ±68% c.i.). (e) Maximal correlation in each condition (Mean ±68% c.i.). (f-h) As in c-e with smaller dots (angular size 1.3°). (i-k) As in c-e with a uniform zero contrast background. Top views show a randomly selected 45 minute subset of all data (c,f,i). N=56 wildtype fish, N=62 mitf-a-/- fish (6-9 dpf) tested. Box plots (e,h,k) indicate median, upper- and lower-quartile. Whiskers extend to 1.5 IQRs of the lower and upper quartile.
Virtual teleportation to assess decision making in fish
To further demonstrate the novel experimental designs enabled by FreemoVR, we implemented a decision-making assay using virtual teleportation: when a fish entered a ‘portal’ – a special region with a distinct visual appearance – an instantaneous simulated teleportation to a new environment occurred. We designed experiments in which a particular portal appearance was coupled with teleportation to a specific simulated environment (Fig. 5a-c, Supplementary Fig. 11). We measured fish “decisions”, operationally defined as entering one of the two available portals. Two types of environmental settings were tested: one in which the fish could choose between a checkerboard and a plant world (Supplementary Fig. 11, Supplementary Video 12), and one in which the fish could choose between the absence or presence of a virtual swarm (Fig. 5a-h, Supplementary Video 13). The swarm consisted of video-game space invaders, each controlled independently by the boids algorithm39 and therefore exhibiting flocking behavior. Each simulated creature was programmed to respond to the real fish exactly as it responded to the other creatures. We found that fish were influenced by the swarm, as seen in the spatial occupancy distribution (Fig. 5d).
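For readers unfamiliar with the boids scheme, the sketch below illustrates the kind of per-agent update involved (separation, alignment, cohesion), with the tracked real fish simply appended as an extra neighbor; the parameter values and function name are illustrative assumptions rather than the implementation used here.

```python
import numpy as np

def boids_step(positions, velocities, dt=0.01,
               r_sep=0.03, w_sep=1.0, w_align=0.5, w_coh=0.3, v_max=0.1):
    """One update of a simple boids flock. The real fish can be appended as
    an extra row that is read by the simulated agents but never moved here."""
    n = len(positions)
    new_vel = velocities.copy()
    for i in range(n):
        offsets = positions - positions[i]
        dists = np.linalg.norm(offsets, axis=1)
        others = dists > 0
        # Separation: steer away from neighbors that are too close.
        close = others & (dists < r_sep)
        sep = -offsets[close].sum(axis=0) if close.any() else 0.0
        # Alignment: match the average heading of the group.
        align = velocities[others].mean(axis=0) - velocities[i]
        # Cohesion: steer toward the group centroid.
        coh = offsets[others].mean(axis=0)
        new_vel[i] += dt * (w_sep * sep + w_align * align + w_coh * coh)
        speed = np.linalg.norm(new_vel[i])
        if speed > v_max:
            new_vel[i] *= v_max / speed
    return positions + dt * new_vel, new_vel

# Example: five simulated "space invaders" plus the tracked real fish (last
# row); only the first five rows are advanced, the last is overwritten from
# tracking on every frame.
pos = np.random.uniform(-0.1, 0.1, size=(6, 3))
vel = np.random.uniform(-0.05, 0.05, size=(6, 3))
pos[:5], vel[:5] = [a[:5] for a in boids_step(pos, vel)]
```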
Figure 5. Teleportation, swarms and social feedback in virtual reality.
(a) Zebrafish could trigger appearance of a moving swarm of space invaders by entering one portal and trigger disappearance by entering the other portal. (b) For each fish, coupling between portal appearance and swarm effect was constant, but different fish had different couplings. Upon entering a portal, the portals were rearranged to equalize the distance required for subsequent portal entry. (c) Decisions (vertical marks) and current condition (horizontal line) of each fish. (d) Top view of occupancy. (e) Fraction of all decisions per fish coupled to swarm appearance. (f) Fraction of all portal entries per fish that were magenta. (g) Fraction of 60 minutes per fish in which the fish was in the presence of the swarm. (h) Mean horizontal speed per fish in each coupling condition. (c-h) N=12 fish, AB strain. Box plots indicate median, upper- and lower-quartile. Whiskers extend to 1.5 IQRs of the lower and upper quartile. (i) Image of the virtual zebrafish, animated based on analysis of real zebrafish as shown in Supplementary Fig. 12. (j) Example trajectories in which the virtual fish (red) is influenced only by the position of the real fish (black) (ω=0), when it weights social influence and its internal preferred direction equally (ω=1), and when it strongly favors its internal preferred direction (ω=100). (k) Histogram of the real fish’s distance from the periphery of the arena, r, as a function of the strength of the goal-oriented tendency, ω, of the virtual fish. Control condition shows real fish with no virtual fish present. The virtual fish’s internal preferred trajectory was fixed at r = 0.07m (white dotted line). (N=16 in all experiments with virtual fish, N=15 in control experiments) (l) From the frame of reference of the real fish, the probability distribution of the location of the virtual fish. (N=16)
We tested whether fish could associate a “decision” (Fig. 5b-h) with the teleportation outcome. We kept the coupling constant for each individual fish, but tested the two possible couplings in different fish (Fig. 5b). We analyzed the results as a series of two-alternative forced-choice decisions and, in one-hour trials, found that fish decisions were not significantly biased toward swarms or their absence (Fig. 5e, one-sample t-test for difference from 0.5, p=0.34). Fish preferred magenta portals (Fig. 5f, p=0.0025) and spent more time in the swarm condition (Fig. 5g, p=0.014), likely because they swam more slowly there (Fig. 5h, two-related-sample t-test, p=0.00029). These teleportation experiments revealed scene-specific swimming speeds, occupancy differences, and a preference for portal appearance, but no learning.
By balancing internal preferred direction with social responsiveness, prospective leaders minimize loss of followers in zebrafish
We addressed the question of how individuals reconcile personal and social information when making movement decisions, and explored how the strength of feedback between a real fish and a virtual fish under computer control impacts leadership. Theoretical work suggests that an individual can exert influence while minimizing group splitting by balancing its internally preferred travel direction with travel towards others40.
We created a photo-realistic virtual fish (Fig. 5i, Supplementary Fig. 12,13, Supplemental Video 14) and adjusted its interactivity towards real fish. Zebrafish of this age (23dpf), without social influence, predominantly swim around the arena periphery (Fig. 5k, control). We programmed our virtual fish to differ by preferring a trajectory around the arena center (0.07m from the periphery, Fig. 5k, white dotted line). Thus, we created a conflict allowing us to explore how social feedback – under computer control in the virtual fish - impacts movement.
We changed, via a weighting factor ω40–42, how strongly the virtual fish is influenced by its preferred central trajectory relative to swimming toward the real fish (Supplementary Information). If ω = 0, the virtual fish is exclusively influenced by the real fish. As ω increases, its movement becomes more influenced by its preferred direction, such that for ω = 1 the social and goal-oriented influences are weighted equally, and for ω > 1 the virtual fish increasingly biases its motion toward its preferred direction.
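A minimal sketch of this weighting, assuming unit direction vectors and omitting the burst-and-glide timing and repulsion terms described in the Online Methods:

```python
import numpy as np

def virtual_fish_direction(to_real_fish, preferred, omega):
    """Blend the social direction (toward the real fish) with the virtual
    fish's internally preferred direction, weighted by omega."""
    social = to_real_fish / np.linalg.norm(to_real_fish)
    goal = preferred / np.linalg.norm(preferred)
    combined = social + omega * goal   # undefined if the two exactly cancel
    return combined / np.linalg.norm(combined)

# omega = 0: purely social; omega = 1: equal weighting; omega >> 1: goal-dominated.
d = virtual_fish_direction(np.array([1.0, 0.0]), np.array([0.0, 1.0]), omega=1.0)
```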
When the virtual fish is influenced relatively weakly by its preferred trajectory (ω=0-0.2), the real fish dominates the direction of travel and the virtual fish mostly follows it (Fig. 5k). In these conditions, both fish spend most time circling the perimeter of the arena (the real fish’s preferred trajectory) and the real fish’s distance from the arena edge is near zero. This perimeter-circling behavior is similar to control conditions with no virtual fish (Fig. 5k, Supplementary Fig. 14). In this low-ω regime, probability histograms in body-centric coordinates for the real fish confirm that the virtual fish is most frequently immediately behind the real fish (Fig. 5l). As the virtual fish weights its own preferred direction more equally against the direction of the real fish (ω=0.4-4), the real fish is influenced by this social feedback and thus its own position in the arena shifts away from edge-following (Fig. 5k, p<10⁻⁵, Mann-Whitney U test comparing distance from edge for ω=0 to ω=1). This is associated with a change in the relative position; real fish now more frequently follow virtual fish (Fig. 5l, Supplementary Fig. 15). For still higher values of ω, when the virtual fish movement is dominated by its own preferred route (ω=100), the real and virtual fish often separate, with the real fish reverting back to edge-following (Fig. 5k, p<10⁻⁵, Mann-Whitney U test comparing distance from edge for ω=1 to ω=100), and the relative positions become more variable, resulting in more uniform, and therefore less distinctly peaked, probability histograms of fish position (Fig. 5l). Thus, while reducing social feedback allows the virtual fish to exert a greater influence on the real fish, it also increases the risk of losing, and thus completely failing to lead, the real fish.
Discussion
We demonstrated the ability of our FreemoVR system to elicit naturalistic object responses in freely flying Drosophila and freely swimming zebrafish, and anxiety-related behaviors in freely walking mice. We showed that head movements in Drosophila are important for free flight but not in more traditional tethered preparations. We took advantage of the system to detect highly specific phenotypic differences between strains of zebrafish. Finally, we demonstrated virtual teleportation between different environments and a design for testing the balance between social responsiveness and social unresponsiveness required for leadership in fish. Immersive, reactive VR has the potential to become an invaluable tool in the study of multi-modal sensory integration, spatial cognition, social interactions and collective behavior.
Despite its advantages, our system also has limitations. Because it renders visual stimuli in a perspective-correct manner for a single viewpoint, it is not directly suitable for the investigation of behaviors for which stereopsis is important or for simulating virtual worlds for multiple animals simultaneously. Here we have not tracked eye position or angular orientation, but these would be useful future additions. By using computer graphics cards and displays designed for humans, we enable the use of inexpensive and flexible hardware, but certain experiments, especially those on animals with visual requirements far different than our own, such as high temporal resolution or broad spectral and polarization sensitivity, will present further challenges.
When used together with technologies to measure neural activity in freely moving animals43–46, VR for freely moving animals will advance investigation of brain function of high-level behaviors such as navigation. Additionally, the ability to programmatically control virtual agents will allow careful study of causality in collective behavior. Ultimately, this is important because it will allow study of the mechanistic basis of behavior under conditions in which the brain evolved to operate.
Online Methods
Implementation of the FreemoVR Engine
The FreemoVR Engine is software written in C++ and uses the OpenSceneGraph library for rendering. Different virtual worlds or other experimental paradigms can be programmed as plugins and a general purpose plugin for rendering complete 3D scenes (OSG format files) is included. Other plugins for custom rendering can be developed while using the FreemoVR projector calibrations and rendering infrastructure. Provided plugins include those for drawing 3D random dot patterns. OSG format 3D scenes can be designed using modeling software such as Blender. The FreemoVR Engine runs as part of a larger FreemoVR System built on the Robot Operating System (ROS). Experiments and experimental logic are written in Python and communicate with other parts of the system using the ROS inter-process communication and multi-process launching and parameter setting framework. The FreemoVR Engine itself does not depend on a specific tracking software – it works with any tracking implementation providing 3D position. The tracking implementations used within the FreemoVR System for the experiments presented are discussed below.
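As an illustration of this architecture, a toy experimental node might look like the following; the topic names and message types here are hypothetical placeholders rather than the actual FreemoVR interfaces.

```python
#!/usr/bin/env python
import rospy
from geometry_msgs.msg import Point
from std_msgs.msg import String

class ExperimentNode(object):
    """Toy experimental node: watches the tracked 3D position and switches
    the displayed stimulus condition every 5 minutes."""
    def __init__(self):
        rospy.init_node('experiment_node')
        self.conditions = ['real_post_hidden', 'virtual_post', 'no_post']
        self.index = 0
        # Topic names below are placeholders, not the FreemoVR API.
        self.stim_pub = rospy.Publisher('/stimulus/condition', String, queue_size=1)
        rospy.Subscriber('/tracking/position', Point, self.on_position)
        rospy.Timer(rospy.Duration(300), self.next_condition)

    def on_position(self, msg):
        # Trial start/stop criteria (e.g. altitude limits) would be checked here.
        if not 0.1 < msg.z < 0.9:
            rospy.logdebug('animal outside trial volume')

    def next_condition(self, event):
        self.index = (self.index + 1) % len(self.conditions)
        self.stim_pub.publish(String(data=self.conditions[self.index]))

if __name__ == '__main__':
    ExperimentNode()
    rospy.spin()
```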
High-level control of VR experiments is implemented in a separate process (the ‘experimental node’) from the FreemoVR Engine and the tracking program. This experimental node enforces the starting and stopping criteria for every trial over the duration of the experiment, changes between different stimulus conditions, and alters the virtual world in real time if required (such as in the case of path-following trials).
All experiments presented in the manuscript make use of stimuli whose precise nature depends on the translational (x,y,z) position of the animal’s eyes but do not attempt to alter the animal’s own rotational (yaw, pitch, roll) closed loop. (In Fig. 3, a rotational bias was added by rotating the stimulus in world coordinates without attempting to compensate for the animal’s orientation.) In other words, we did not attempt to alter the visuo-motor feedback loop with respect to rotation, but only with respect to translation.
The FreemoVR software is available from www.strawlab.org/FreemoVR and is included in Supplementary Data 1, as is the tracking software.
Measuring latency and its behavioral effects
Given the physical coupling between animal motion and sensory feedback under natural conditions, we wondered how the latency of our system – the total duration between movement of the subject and a compensatory reaction by the display – might affect these results. For the different VR systems we implemented, we made latency measurements and found values between 60 and 75 msec (Supplementary Fig. 3c), with more than half the total latency due to rendering on the graphics card and display on the monitor or projector. To address the question of how latency affects our results, we measured behavioral performance – the turning rate of a fly turning away from a rapidly nearing object – as we artificially varied latency. As we approached the smallest possible latencies of our system (0 msec added latency, 60 msec total latency), turn rate asymptotically approached ~50°/s, and the distribution of turns in response to the VR object was not measurably different from turns in response to the RW object (Supplementary Figs. 3-4) (Mann-Whitney test, p=0.82, difference between mean of all turns toward post, n=109 vs 99 trials; group variance not different, Bartlett’s test p=0.66), while longer artificial latencies reduced post-avoidance turning.
Animal Handling
All animal work was conducted according to Austrian, German and European laws and guidelines for animal research.
Zebrafish
All fish used in Figs. 1,4 and 5a-h were bred and raised by the fish facility of the Max F. Perutz Laboratories Vienna (BMWFW-66.006/0012-WF/II/3b/2014 and BMWFW-66.006/0003-WF/V/3b/2016). Zebrafish (Danio rerio) strains AB and mitf-a were kept in a constant recirculating system at 28°C on a 16 h light/8 h dark cycle. Collected embryos were kept at 28°C until hatched. Water was stored in the experimental room so that it was at the same temperature as the experimental apparatus. All fish were moved to the experimental room at least 12 hours before testing. Powdered food was available constantly in the holding containers during this period. The water in the experimental apparatus was changed each day before the first experiment unless the protocol demanded otherwise. All fish were tested individually, were naive, and were picked at random from the tanks in which they were raised. We did not perform size selection for the experiments in Figs. 1 and 4. For experiments in Fig. 5a-h, relatively large fish were selected.
All fish used in Fig. 5i-l were bred and raised by the animal care staff of the Department of Collective Behaviour (University of Konstanz/Max Planck Institute). 16 h prior to conducting experiments, water from the fish facility was taken to the experimental room, ensuring that water conditions in the VR arenas were the same as in the holding facility. The experimental room was kept at 28°C. All fish were moved to the experimental room at least 12 hours before testing and kept on a 16:8 day:night light regime, as in the fish facility. The water in the experimental apparatus was changed each day before the first experiment. All fish were tested individually, were naive, and were picked at random from the tanks in which they were raised (according to the animal ethics permit approved by Regierungspräsidium Freiburg, G-16/158).
Drosophila
All experiments were performed on wild-type CS strain Drosophila melanogaster. Flies were raised at 22°C on a 12-h light, 12-h dark cycle.
Mice
Mice were C57BL/6J males purchased from Charles River Laboratories (Sulzfeld, Germany) and 5 - 6 months old at time of testing. Animals were group-housed at 21 °C, with food and water provided ad libitum in a 14h light and 10h dark cycle (day starting at 6:00 a.m.), not adjusted for daylight savings time during the summer. All tests were performed during the light period. All mouse experiments were performed in accordance with institutional guidelines and were approved by the respective Austrian (BGBl nr. 501/1988, idF BGBl I no. 162/2005) and European (Directive 86/609/EEC of 24 November 1986, European Community) authorities and covered by the license MA58/002220/2011/9.
Behavioral Set-ups
A parts list for all setups can be found in Supplementary Data 2 (“Parts List.xls”).
FishVR (zebrafish)
The bowl was constructed from acrylic by Immersive Display Group, London, UK, and is a subsection of a sphere with a diameter of 40.6 cm. The bowl is filled with 4.5 l of water, resulting in a water depth of 9.1 cm and a surface diameter of 33.8 cm. Infrared light (850 nm) for tracking is provided from underneath the bowl. The projection illuminates the entire surface of the bowl below the water level. We used an LED DLP projector (Optoma ML500) with vsync enabled on the graphics card. The water surface is protected against dust and airflow by a hood that also covers the cameras.
A computer (Intel, Core i7, nVidia GeForce GTX 960) does all tracking and VR computation. 4 cameras (Basler acA640) mounted outside the water acquire images at 100fps. Fish position is tracked at 100fps. A correction for the refraction of light through the water surface is made in real-time (see Animal Tracking below).
FlyCave (Drosophila)
The flight arena is a 1m diameter, 1m high translucent cylinder constructed from projection material (Gerriets International Opera Creamy White Front/Back) bonded to lampshade backing PVC (generic brand, translucent, 0.3mm thick) for structural support. The cylinder is freestanding and is capped on the top and bottom by sheets of transparent Perspex. There are two mesh-covered openings in the top cover of the arena to allow cool air to circulate. The room in which the arena stands is temperature controlled at 20.5°C. 10 cameras (Basler acA640) are distributed around the top of the cylinder and their field of view and focus depth are adjusted to cover the whole arena.
The VR is projected onto the outside of the cylinder using 3 projectors (Mitsubishi WD-385U-EST) operating at 1024x768 pixel resolution and 120Hz refresh rate.
One computer (Intel, Core i5, nVidia GeForce GTX 670 rev a1) performs the 3D computations and VR projection, and two further computers (Intel, Core i5) track 2D fly positions at 100fps. Flies were tracked with 850nm IR light.
MouseVR (Mouse)
A 1.9m (75”) television (Sony KDL-75W855CBAEP) was used to display the VR. A single camera (Basler acA640) was mounted centrally above the television. Mouse head position was tracked in 2D under 850nm IR light using custom software (Supplementary Data 1). 3D position was estimated by assuming the mouse remained at a fixed altitude. A single computer (Intel, Core i7, nVidia GTX 670) did all tracking and VR computation in real-time.
The elevated circular platform measures 0.5m (external diameter) and 0.396m (internal diameter), giving the ring platform a width of 52mm. The ring was constructed from light grey PVC. The ring is 10mm thick and has a 2mm high ridge on the inner and outer sides to make it harder for mice to slip off. In VR trials the ring was placed 13.4 cm above the television surface. The ring sits atop three 2cm diameter legs made from dark PVC, placed at 120° from one another.
Light measurements were recorded (PCE-174 Lux Meter) for VR and RW trials.
| Assay | Looking Down | Looking Up |
|---|---|---|
| VR | 100 lux | 2 lux |
| RW | 35-60 lux | 250-370 lux |
Experimental Methods
Statistical reporting
Number of animals is reported with N, number of trials with n.
Zebrafish (Post experiments, Fig. 1)
Zebrafish of the AB strain with an age between 46 and 56 days post fertilization were used. They were picked at random from two breeding tanks. Zebrafish varied in size from 7-15mm. Testing blind to the condition was not possible, but all evaluation was done by automated computer programs and thus blinding was not necessary for analysis. In total we tested 24 fish; 13 for the VR post and control conditions and 11 for the real post condition. One fish was in the assay at a time. A new fish was used for each experiment. The NP condition was followed by the VR condition; RW experiments were performed separately. Conditions were presented for 30 minutes. Consequently, NP/VR experiments took one hour and RW experiments took 30 minutes. When changing experiments from NP/VR to RW, or from RW to NP/VR, the water was changed. Experiments were performed at room temperature (22-25°C).
The RW post was 15cm high and 4cm in diameter and milled from black Delrin. The VR post was modelled to the same size and placed inside a checkerboard-textured sphere of the same size as the physical bowl. The VR model was generated with Blender (2.74), exported to OpenSceneGraph format (.osg) and loaded directly by FreemoVR. Experimental data files are included in the Supplementary Data. In VR trials the VR post and bowl were present. In RW and NP trials, the VR post was made invisible but the textured bowl remained.
Sample size was not pre-determined. Experiments were run until top views of the trajectories covered the arena densely.
Starting and Stopping Criteria
Trials were started when a fish was detected in the bowl and under water. Trials were stopped when the fish was not detected for 0.5s, when any position estimate for the fish was outside of the bowl or 2cm above the water level, or when the stimulus condition changed.
Zebrafish (Path-following experiments, Fig. 4)
Zebrafish larvae of the wild type AB strain and nacre mutant were tested 6 - 9 days post fertilization. We chose AB because the nacW2 mutation was generated in this wildtype background31. Larvae were tested for 45min each between 10:00 and 19:00. In total 62 fish from the mitf-a-/- strain were tested as were 56 age-matched larvae from wildtype AB strain reared under identical conditions. This sample size resulted after excluding experiments with technical (projector malfunction) or environmental disturbances (construction works and noise). Blind testing with regard to strain or experimental condition was not possible, but all evaluation was done by automated computer programs and thus blinding was not necessary. If possible, tests were shuffled between the two genotypes. Experiments were performed at room temperature 22-25°C.
A cloud of 3D dots was designed to elicit the optomotor response and thus path following. The stimulus was generated by the agglomeration of identical white dots randomly positioned in a 1x1x1m 3D volume. All dots moved with identical velocity. The origin of the dot cloud was fixed to the position of the fish. The velocity of the dots, V = (Vx,Vy,Vz)T, was calculated by a closed-loop, position-proportional control law which tries to bring the active fish position to a target point on the path. More exactly,
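in the position-proportional form reconstructed here from the definitions that follow:

$$
\begin{pmatrix} V_x \\ V_y \end{pmatrix} = K_{\mathrm{horizontal}} \begin{pmatrix} x_{\mathrm{target}} - x_{\mathrm{fish}} \\ y_{\mathrm{target}} - y_{\mathrm{fish}} \end{pmatrix}, \qquad V_z = K_{\mathrm{vertical}}\,\left( z_{\mathrm{target}} - z_{\mathrm{fish}} \right),
$$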
where K_horizontal is the feedback gain for path following and K_vertical is the feedback gain for altitude control. When the active fish position was within 0.1m of the target position, the target position was advanced to the next point along the path.
Three experimental conditions were tested: ‘large dots’, ‘small dots’, and ‘grey’. The ‘large dot’ angular size was 3.1°. The ‘small dot’ angular size was 1.3°. In the ‘grey’ condition a static, textureless grey was shown. Conditions were randomly ordered and then alternated sequentially every 5 minutes.
Sample size was determined by the number of fish we could test in a single week of the correct genotypes for a single replicate. Three replicates were performed using three separate egg batches.
Starting and Stopping Criteria
Trials were started when a fish was in a cubic volume -0.15m < x < 0.15m, -0.15m < y < 0.15m, -0.10m < z < 0.01m. Trials ended when the active fish average speed over 1 second was less than 1mm/s, when the active fish performing the trial had not been detected for 0.5s, or when the stimulus condition changed.
Zebrafish (Virtual teleportation and swarm experiments, Fig. 5)
Zebrafish of the AB strain with an age between 24 and 30 days post fertilization were used. Testing blind to the experimental condition was not possible, but all evaluation was done by automated computer programs and thus blinding was not necessary. In total we tested 24 fish; 12 for virtual teleportation and 12 for swarm experiments. One fish was in the assay at a time. A new fish was used for each experiment. Virtual teleportation experiments lasted 30 minutes and the swarm experiment trials were one hour. Experiments were performed at 28°C.
Sample size was determined by the time available to perform experiments combined with fish availability.
Zebrafish (Social feedback experiments, Fig. 5i-l)
We used 31 juvenile wild-type zebrafish (23 days post fertilization, body length 9.5mm ± 0.5mm) picked at random from one breeding tank. Each individual was tested once for 90min between 10:00 and 19:00. 16 fish were tested with VR stimuli and 15 with no VR stimuli (control condition). Sample size was determined by the time available to perform experiments combined with fish availability.
A virtual fish (VF), comprising a 3D mesh and texture mapping, was generated by ENGELS Visualisierungen from multiple images of a zebrafish larva (9.5mm). The tailbeat movement of the VF was animated with Blender based on median-line tracking data extracted from video using the software KymoRod47. A linear chain of 32 bones was applied to the mesh and this was animated, based on the extracted tracking data, with an oscillating wave propagating from the head to the tail (Supplementary Figure 12).
The movement of zebrafish larvae is described as burst-and-glide motion. A first phase of propulsion is followed by a glide movement that is damped by friction with the water. A new burst-and-glide movement was generated every t_tb = 0.5s. During this movement, the linear velocity was described by an exponential function
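(of the standard exponential-decay form implied by the parameters below)

$$ V(t) = V_0 \, e^{-t/t_c}, $$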
with t_c = 0.25s and V_0 = 0.14 m s−1. The direction of the VF, d_VF, was updated at the beginning of each burst-and-glide event and determined in accordance with the model described previously40,41. Following this model, avoidance of collisions is the highest priority of the VF. We define a repulsion zone of size l_r = 2cm. When the distance between the real fish (RF) and the VF is smaller than the repulsion distance, l_r, the VF turns away from the RF. The direction of the VF is then given by
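(reconstructed here in the standard repulsion form of the cited model40,41)

$$ \mathbf{d}_{\mathrm{VF}} = -\,\frac{\mathbf{x}_{\mathrm{RF}} - \mathbf{x}_{\mathrm{VF}}}{\left\lVert \mathbf{x}_{\mathrm{RF}} - \mathbf{x}_{\mathrm{VF}} \right\rVert}. $$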
If the RF is outside of the repulsion zone, the VF balances the influence of its preferred direction, g, and its attraction to the RF.
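In the weighted form of the cited model40,41 (reconstructed here), with ω the weighting factor described in the Results, the direction is

$$ \mathbf{d}_{\mathrm{VF}} = \frac{\hat{\mathbf{d}}_{\mathrm{RF}} + \omega\,\hat{\mathbf{g}}}{\left\lVert \hat{\mathbf{d}}_{\mathrm{RF}} + \omega\,\hat{\mathbf{g}} \right\rVert}, \qquad \hat{\mathbf{d}}_{\mathrm{RF}} = \frac{\mathbf{x}_{\mathrm{RF}} - \mathbf{x}_{\mathrm{VF}}}{\left\lVert \mathbf{x}_{\mathrm{RF}} - \mathbf{x}_{\mathrm{VF}} \right\rVert}. $$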
Due to the finite size and symmetry of the system, the preferred direction g cannot be constant. A preferred circular direction is defined by a point c on a circle of radius 7cm at a depth of 3mm from the surface. The position c is updated continuously around the circle with a speed v_c that matches the average speed of the VF.
For each RF, 9 different values of ω for the VF were tested.
Every 10 minutes a new value of ω was picked, each value once and in random order, so that the same fish experienced the full range of ω (social feedback). The direction of rotation, clockwise (CW) or counterclockwise (CCW), was picked at random for each stimulus.
Drosophila (Post experiments)
Mixed groups of male and female flies, 4 days of age, were used for each experiment. 20-25 flies were introduced into the arena in the evening. Experiments ran from 18:00 – 10:30. Each condition was tested for 5 minutes and conditions were changed sequentially. Experiments were performed at room temperature (22-25°C).
The physical post was 1m high and 10cm diameter and made from grey PVC pipe. The virtual post was generated using blender and styled to have the same color as the real post. In the virtual-post condition the grey virtual post was placed inside a virtual cylinder with a checkerboard texture and with the same diameter as the FlyCave arena. This textured cylinder prevented the flies from flying into the walls. In real-post experiments the real post was placed into the arena and the grey virtual post was not shown. In the no-post condition the checkerboard texture was replaced with a grey texture the same color as the post. In validation experiments with no added latency (Figure 2b) the texture of the cylinder was controlled vertically (see following section) to elicit longer flight trials. In experiments to compare turning response of the flies in VR and RW, the texture of the cylinder was static.
Sample size was not pre-determined. As each experiment contained multiple flies (20-30), we performed experiments until top views of the trajectories covered the arena densely.
Starting and Stopping Criteria
When one or many Drosophila were in flight, the experimental node chose the first one whose altitude met the geometric criteria 0.1m < z (altitude) < 0.9m and whose distance from the center of the arena was < 0.35m.
Trials were stopped when the active fly average speed over 5 seconds was less than 4cm/s, when the active fly performing the trial had not been detected for 0.5s, or when the stimulus condition changed.
Drosophila (Path-following experiments)
Female flies of 4 days of age were used for each experiment. Experiments ran overnight, and multiple conditions were alternated as per Drosophila post-trials.
To elicit the optomotor response, and thus path following, a stimulus was designed that placed the fly at the center of a virtual textured cylinder. The cylinder was textured with a black and white checkerboard texture to give a strong behavioral response. The origin of the cylinder was fixed at the position of the active fly so as to eliminate translational optical flow. The angular velocity, ϕ, of the textured cylinder surrounding the active fly was calculated by a closed-loop angular proportional control law48 which tries to align the velocity vector of the fly with the vector connecting the current fly position to a target point on the path. A second closed-loop altitude control law also biases the vertical motion of the cylinder to keep the flies at the same altitude in the arena. More exactly,
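in the proportional form reconstructed here from the definitions that follow:

$$ \dot{\phi} = K \, V_k \, \theta_{\mathrm{err}}, \qquad \dot{z}_{\mathrm{cyl}} = K_{\mathrm{vertical}}\,\left( z_{\mathrm{target}} - z_{\mathrm{fly}} \right), $$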
where V_k = max(1.0, V × 20) increases the gain for slowly flying flies, θ_err = θ_fly – arctan(|x_target – x_fly|, |y_target – y_fly|) is the angular error between the fly velocity vector and the vector from the current fly position to the target position, K is the feedback gain for path following, and K_vertical is the feedback gain for altitude control. See Supplementary Figure SA for the influence of different gain values on path-following behavior in Drosophila. When the active fly position was within 0.1m of the target position, the target position was advanced to the next point along the path.
Sample size was not pre-determined. Because the flight activity of the non-glued flies was higher than the head-fixed flies, we ran double the number of head-fixed fly experiments to increase number of trajectories analyzed.
Starting and Stopping Criteria
As per Drosophila post-trials.
Drosophila (Tethered experiments)
Female flies of 4 days of age were used for each experiment. Flies were anesthetized by cooling to 3.2°C on a custom-built thermo-electric cold plate (IMP Mechanical Workshop, Vienna) and fixed to a tungsten tether rod with a small drop of blue-light solidifying glue (Bondic). Depending on the preparation, the head was additionally bonded to the thorax. Flies were allowed at least 20 minutes’ recovery time before the experiments.
Rigidly tethered flies were placed in the center of a ping-pong ball used as a spherical projection screen for visual stimuli, as described previously49. Flies were filmed from above using a single camera. Right and left wing beat amplitudes were estimated at 100 Hz. The difference between the right and the left wing beat amplitudes was converted to a turning torque and a forward force. Applying these forces to a virtual agent gave the tethered fly the ability to control its virtual position in a planar 2D space.
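A simplified, purely kinematic sketch of this conversion is shown below; the gain values and function name are illustrative and not those used in the experiments.

```python
import numpy as np

def integrate_tethered_flight(wba_left, wba_right, dt=0.01,
                              k_turn=0.05, thrust=0.2):
    """Convert per-frame left/right wingbeat amplitudes (degrees) into a 2D
    virtual trajectory: the left-right difference sets turning and a constant
    forward thrust sets speed. Gains here are illustrative only."""
    heading = 0.0
    pos = np.zeros(2)
    trajectory = [pos.copy()]
    for l, r in zip(wba_left, wba_right):
        heading += k_turn * (l - r) * dt      # amplitude difference -> turning
        pos += thrust * dt * np.array([np.cos(heading), np.sin(heading)])
        trajectory.append(pos.copy())
    return np.array(trajectory)

# Example: 1 s of data at 100 Hz with a slight turning bias.
traj = integrate_tethered_flight(np.full(100, 60.0), np.full(100, 58.0))
```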
The same virtual world as in the free-flight experiments was used (a textured cylinder of the same dimensions as the FlyCave).
Sample size was not pre-determined. After pilot experiments showed no difference (data not shown), multiple experiments were performed until an N comparable with prior freeflight experiments was obtained.
Starting and Stopping Criteria
Once the tethered fly had started flying (beating its wings), trials were started by warping the fly’s virtual position to a location inside the virtual cylinder volume. Trials ended when the tethered subject stopped beating its wings, or exited the virtual cylinder volume.
Mice
Experiments were performed at room temperature (22-25°C) from 11:30 – 18:00. Sample size was chosen based on the effect strength of similar elevated plus-maze experiments. We elected to test 15 mice per condition or until our stock of suitable mice was exhausted.
Starting and Stopping Criteria
Mice were placed at the border between the shallow and deep sides and left for 15 minutes, or until they jumped from the platform. Mice which jumped before 10 minutes were excluded from further analysis (N=1 mouse jumped before completion of 10 minutes). Only naive mice were used. Mice were used for only one experiment.
Animal Tracking
Fish and Fly Tracking
Multi-camera tracking software called Flydra was used18. This software operates by updating a six dimensional estimate (three dimensional position and velocity) of animal state based on observations from each camera on every frame. The state update operation is done using an extended Kalman filter in which the motion model of the animal is a linear constant velocity model and the non-linear observation model is derived from the camera calibration and is linearized about the a priori predicted animal position on each frame. Camera calibrations are done using MultiCamSelfCal50.
To track fish underwater in 3D with cameras out of water, we developed a new capability of correcting for refraction at the air-water interface. The non-linear observation model described above was augmented with a model of the water surface and the refractive indices of water and air. Fermat’s principle of least time and the camera calibration were used to calculate forward and reverse models of a 3D point underwater leading to a pixel on the image sensor.
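The forward model can be illustrated with the following minimal sketch for a flat, horizontal water surface and a camera above the water; the numeric minimization of the optical path length stands in for the actual implementation, and the function name is illustrative.

```python
import numpy as np
from scipy.optimize import minimize_scalar

N_AIR, N_WATER = 1.0, 1.333

def apparent_surface_crossing(camera, point):
    """Find where the ray from an above-water camera to an underwater point
    crosses a flat water surface at z = 0, by minimizing optical path length
    (Fermat's principle). Returns the 3D crossing point."""
    camera, point = np.asarray(camera, float), np.asarray(point, float)
    horiz = point[:2] - camera[:2]

    def optical_path(s):                     # s: fraction of horizontal offset
        crossing = np.array([*(camera[:2] + s * horiz), 0.0])
        return (N_AIR * np.linalg.norm(crossing - camera) +
                N_WATER * np.linalg.norm(point - crossing))

    s_best = minimize_scalar(optical_path, bounds=(0.0, 1.0),
                             method='bounded').x
    return np.array([*(camera[:2] + s_best * horiz), 0.0])

# Example: camera 0.5 m above the surface, fish 5 cm deep and 10 cm away
# laterally. Projecting `crossing` with the standard in-air camera model then
# gives the pixel at which the underwater point appears.
crossing = apparent_surface_crossing([0.0, 0.0, 0.5], [0.1, 0.0, -0.05])
```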
Mouse Tracking
Mouse tracking software is implemented using Python and OpenCV. Detection and tracking of head position is implemented using the following sequence of image processing operations operating at 120fps (Supplementary Video 11); a condensed sketch follows the list.
1. The incoming image is Gaussian blurred (sigma=0.7).
2. The foreground/background model (OpenCV MOG2, varThreshold=0.8) is updated.
3. The largest contour from the foreground is extracted and simplified (cvApproxPolyDP, sigma=1).
4. The contour is filled and rotated into a landscape orientation.
5. The ‘heavy’ end of the contour is extracted (representing the head of the mouse).
6. The extremity of the heavy end of the contour in the x-axis, and the mean (weighted center-of-mass) of the contour in the y-axis, is taken to be the head of the mouse.
7. If the x- or y-position of the head has moved more than 30 pixels since the last frame, the measurement is ignored.
8. Otherwise, the mouse head position is used to update a Kalman filter (constant motion model, process_noise=3, measurement_noise=300). The measurement update covariance (R) was chosen dynamically based on the elongation of the mouse contour, with head position estimates from elongated mice trusted more. If no head could be detected, the mouse head position is predicted from the last position.
9. The head x-, y-position in pixels is converted to a position in world coordinates by a scale factor. The head z-position is fixed at 0.14m (the height of the platform).
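A condensed sketch of this pipeline (OpenCV ≥ 4 Python bindings), with the Kalman filtering omitted and the heavy-end heuristic simplified:

```python
import cv2
import numpy as np

# Parameters follow the pipeline above; the heavy-end heuristic is simplified here.
bg = cv2.createBackgroundSubtractorMOG2(varThreshold=0.8)

def track_head(frame_gray):
    """Return an (x, y) estimate of the mouse head in pixel coordinates,
    or None if no suitable contour is found."""
    blurred = cv2.GaussianBlur(frame_gray, (0, 0), sigmaX=0.7)
    fg = bg.apply(blurred)
    contours, _ = cv2.findContours(fg, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return None
    contour = max(contours, key=cv2.contourArea)
    contour = cv2.approxPolyDP(contour, 1, True)
    # Orient the contour: rotate points so the long body axis is horizontal.
    (cx, cy), (w, h), angle = cv2.minAreaRect(contour)
    if w < h:
        angle += 90.0
    rot = cv2.getRotationMatrix2D((cx, cy), angle, 1.0)
    pts = cv2.transform(contour.reshape(-1, 1, 2).astype(np.float32), rot).reshape(-1, 2)
    # 'Heavy' end: the half of the rotated contour containing more points
    # (a stand-in for the filled-mass criterion described above).
    left, right = pts[pts[:, 0] < cx], pts[pts[:, 0] >= cx]
    heavy = left if len(left) > len(right) else right
    head_rot = (heavy[:, 0].min() if heavy is left else heavy[:, 0].max(),
                heavy[:, 1].mean())
    # Rotate the head estimate back into the original image frame.
    inv = cv2.invertAffineTransform(rot)
    x, y = cv2.transform(np.array([[head_rot]], np.float32), inv).ravel()
    return float(x), float(y)
```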
Data Preparation and Data Handling
Lagged Correlation Measure
The Pearson correlation was used to measure the relationship between visual system input and behavioral output. The correlation was computed at different time lags (by shifting the system output with respect to the system input) in order to account for the closed-loop latency of the system, and to characterize how the organism’s response to the stimulus changes over time. The lagged Pearson correlation was used rather than cross-correlation to deal with timeseries of different lengths and with missing data.
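A minimal numpy sketch of this measure, assuming regularly sampled input and output series; the variable names are illustrative.

```python
import numpy as np

def lagged_pearson(stimulus, response, lags, dt=0.01):
    """Pearson correlation between a stimulus series and a response series
    shifted by each lag (in seconds); NaN samples are dropped pairwise."""
    out = []
    for lag in lags:
        shift = int(round(lag / dt))
        x = stimulus[: len(stimulus) - shift] if shift else stimulus
        y = response[shift:] if shift else response
        valid = ~(np.isnan(x) | np.isnan(y))
        out.append(np.corrcoef(x[valid], y[valid])[0, 1])
    return np.array(out)

# Example: a response that follows the stimulus with an 80 ms delay;
# the returned correlations peak near the 0.08 s lag.
t = np.arange(0, 10, 0.01)
stim = np.sin(2 * np.pi * 0.5 * t)
resp = np.roll(stim, 8) + 0.1 * np.random.randn(len(t))
corrs = lagged_pearson(stim, resp, lags=[0.0, 0.08, 0.18])
```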
Zebrafish
Trials shorter than 1s were discarded. In path-following experiments, all remaining data were then split into 1 second segments for further analysis. Lagged correlations were computed for each 1 second segment; these were then averaged to give one lagged-correlation series per fish and, in turn, one per condition.
In Post experiment trials associated with Fig. 1, observations where the 3D position estimate had been obtained from only one camera were removed.
For comparisons of lagged correlations between groups, equality of variances was checked. For large dots, where no difference between genotypes was found: Mann-Whitney test, p=0.52; Bartlett’s test of equality of variance, p=0.42. For small dots, where a difference between genotypes was observed: p=0.014; equality of variance, p=0.15. For the gray condition, where no difference between genotypes was observed: p=0.82; equality of variance, p=3e-8, however this was due to the absence of an overall response.
Drosophila (post trials)
For all experimental groups, trials shorter than 2s were discarded. Trials were truncated when the z-position was outside the range 0.1m < z < 0.9m (flies sitting on the floor or ceiling of the arena). Trials were truncated when the lateral (x,y) distance from the center of the cylinder to the fly exceeded 0.42m (flies sitting on the walls of the cylinder). Trials were truncated when the fly forward velocity was <= 0.05m/s (experimentally determined to represent when flies landed on the post or walls). In trials associated with Fig. 1, observations where the 3D position estimate had been obtained from only one camera were removed.
Drosophila (path following trials)
For all experimental groups, trials shorter than 1s were discarded. Trials were truncated when the z-position was outside the range 0.1m < z < 0.9m (flies sitting on the floor or ceiling of the arena). Trials were truncated when the lateral (x,y) distance from the center of the cylinder to the fly exceeded 0.42m (flies sitting on the walls of the cylinder). Trials were discarded when >50% of the trial had a mean forward velocity of <= 0.05m/s (experimentally determined to represent walking trajectories). Observations where the 3D position estimate had been obtained from only one camera were removed.
Mice
Mice that did not complete the 10-minute trial were excluded.
Code availability
The most recent version of the software is made available from https://strawlab.org/freemovr. A snapshot is included as Supplementary Data 1.
Data availability
Metadata for all figures is listed in Supplementary Data 3 “Data Record.xls” and original data will be made available on request.
Supplementary Material
Acknowledgements
We thank M. Colombini, A. Fuhrmann, L. Fenk, E. Campione, S. Villalba and the IMP/IMBA Workshop for help constructing FreemoVR hardware and software. We thank M. Dickinson and T. Klausberger for helpful discussions, V. Böhm for help with experiments, and the MFPL fish facility for fish care. The manual mouse behavior annotation was performed by the Preclinical Phenotyping Facility at Vienna Biocenter Core Facilities. This work was supported by European Research Council (ERC) starting grants 281884 to A.D.S., 311701 to W.H. and 337011 to K.T.-R.; Wiener Wissenschafts-, Forschungs- und Technologiefonds (WWTF) grant CS2011-029 to A.D.S.; FWF (http://www.fwf.ac.at/) research project grants #P28970 to K.T.-R. and #P29077 to K.N.; NSF grants PHY-0848755, IOS-1355061 and EAGER-IOS-1251585 to I.D.C.; ONR grants N00014-09-1-1074 and N00014-14-1-0635 to I.D.C.; and ARO grants W911NG-11-1-0385 and W911NF-14-1-0431 to I.D.C. A.D.S. and W.H. were further supported by the IMP, Boehringer Ingelheim and the Austrian Research Promotion Agency (FFG). K.T.-R. is supported by grants from the University of Vienna (research platform “Rhythms of Life”). I.D.C. acknowledges further support from the “Struktur- und Innovationsfonds für die Forschung (SI-BW)” of the State of Baden-Württemberg and from the Max Planck Society. I.D.C. and R.B. gratefully acknowledge fish care and technical support from C. Bauer, J. Weglarski, A. Bruttel and G. Mazué.
Footnotes
Author Contributions
A.D.S., K.T.-R., W.H. and I.D.C. conceived the projects. J.R.S., M.H., R.M.F., R.B. and A.D.S. developed the hardware and software, and built the apparatus. J.R.S., M.H., R.B., J.G., P.H., S.F. and A.D.S. performed experiments. J.R.S., M.H., R.B., J.G., S.F., W.H., I.D.C., K.T.-R. and A.D.S. performed data analyses. A.D.S., K.T.-R., I.D.C., J.R.S., M.H. and J.G. wrote the manuscript. A.D.S., K.T.-R., W.H., I.D.C. and K.N. funded the work.
Competing Financial Interests Statement
J.R.S. and M.H. are executives with loopbio gmbh, a company offering virtual reality services. The other authors declare no competing financial interests.
References
1. Aghajan ZM, et al. Impaired spatial selectivity and intact phase precession in two-dimensional virtual reality. Nat Neurosci. 2015;18:121–128. doi: 10.1038/nn.3884.
2. Chiappe ME, Seelig JD, Reiser MB, Jayaraman V. Walking Modulates Speed Sensitivity in Drosophila Motion Vision. Curr Biol. 2010;20:1470–1475. doi: 10.1016/j.cub.2010.06.072.
3. von Holst E, Mittelstaedt H. Das Reafferenzprinzip - Wechselwirkungen zwischen Zentralnervensystem und Peripherie. Naturwissenschaften. 1950;37:464–476.
4. Jung SN, Borst A, Haag J. Flight Activity Alters Velocity Tuning of Fly Motion-Sensitive Neurons. J Neurosci. 2011;31:9231–9237. doi: 10.1523/JNEUROSCI.1138-11.2011.
5. Kim AJ, Fitzgerald JK, Maimon G. Cellular evidence for efference copy in Drosophila visuomotor processing. Nat Neurosci. 2015;18:1247–55. doi: 10.1038/nn.4083.
6. Leinweber M, et al. Two-photon calcium imaging in mice navigating a virtual reality environment. J Vis Exp. 2014:e50885. doi: 10.3791/50885.
7. Sperry RW. Neural basis of the spontaneous optokinetic response produced by visual inversion. J Comp Physiol Psychol. 1950;43:482–9. doi: 10.1037/h0055479.
8. Ravassard P, et al. Multisensory control of hippocampal spatiotemporal selectivity. Science. 2013;340:1342–6. doi: 10.1126/science.1232655.
9. Acharya L, Aghajan ZM, Vuong C, Moore JJ, Mehta MR. Causal Influence of Visual Cues on Hippocampal Directional Selectivity. Cell. 2016;164:197–20. doi: 10.1016/j.cell.2015.12.015.
10. Harvey CD, Collman F, Dombeck DA, Tank DW. Intracellular dynamics of hippocampal place cells during virtual navigation. Nature. 2009;461:941–946. doi: 10.1038/nature08499.
11. Schmidt-Hieber C, Häusser M. Cellular mechanisms of spatial navigation in the medial entorhinal cortex. Nat Neurosci. 2013;16:325–31. doi: 10.1038/nn.3340.
12. Aronov D, Tank DW. Engagement of Neural Circuits Underlying 2D Spatial Navigation in a Rodent Virtual Reality System. Neuron. 2014;84:442–456. doi: 10.1016/j.neuron.2014.08.042.
13. Sofroniew NJ, Cohen JD, Lee AK, Svoboda K. Natural whisker-guided behavior by head-fixed mice in tactile virtual reality. J Neurosci. 2014;34:9537–9550. doi: 10.1523/JNEUROSCI.0712-14.2014.
14. Hölscher C, Schnee A, Dahmen H, Setia L, Mallot HA. Rats are able to navigate in virtual environments. J Exp Biol. 2005;208:561–9. doi: 10.1242/jeb.01371.
15. Dombeck DA, Harvey CD, Tian L, Looger LL, Tank DW. Functional imaging of hippocampal place cells at cellular resolution during virtual navigation. Nat Neurosci. 2010;13:1433–40. doi: 10.1038/nn.2648.
16. Maimon G, Straw AD, Dickinson MH. Active flight increases the gain of visual motion processing in Drosophila. Nat Neurosci. 2010;13:393–399. doi: 10.1038/nn.2492.
17. Cushman JD, et al. Multisensory control of multimodal behavior: do the legs know what the tongue is doing? PLoS ONE. 2013;8:e80465. doi: 10.1371/journal.pone.0080465.
18. Straw AD, Branson K, Neumann TR, Dickinson MH. Multi-camera Realtime 3D Tracking of Multiple Flying Animals. J R Soc Interface. 2011;8:395–409. doi: 10.1098/rsif.2010.0230.
19. Fry SN, Rohrseitz N, Straw AD, Dickinson MH. Visual flight speed control in Drosophila melanogaster. J Exp Biol. 2009;212:1120–1130. doi: 10.1242/jeb.020768.
20. Schuster S, Strauss R, Götz KG. Virtual-Reality Techniques Resolve the Visual Cues Used by Fruit Flies to Evaluate Object Distances. Curr Biol. 2002;12:1591–1594. doi: 10.1016/s0960-9822(02)01141-7.
21. Straw AD, Lee S, Dickinson MH. Visual Control of Altitude in Flying Drosophila. Curr Biol. 2010;20:1550–1556. doi: 10.1016/j.cub.2010.07.025.
22. Stowers JR, et al. Reverse Engineering Animal Vision with Virtual Reality and Genetics. Computer. 2014;14:38–45.
23. Del Grosso N, Graboski J, Chen W, Blanco-Hernández E, Sirota A. Virtual Reality system for freely-moving rodents. bioRxiv. 2017. doi: 10.1101/161232.
24. Ellard CG, Goodale MA, Timney B. Distance estimation in the mongolian gerbil: The role of dynamic depth cues. Behav Brain Res. 1984:29–39. doi: 10.1016/0166-4328(84)90017-2.
25. Poggio T, Reichardt W. A theory of the pattern induced flight orientation of the fly Musca domestica II. Kybernetik. 1973;12:185–203. doi: 10.1007/BF00270572.
26. Kim AJ, Fitzgerald JK, Maimon G. Cellular evidence for efference copy in Drosophila visuomotor processing. Nat Neurosci. 2015. doi: 10.1038/nn.4083.
27. Duistermars BJ, Care RA, Frye MA. Binocular interactions underlying the classic optomotor responses of flying flies. Front Behav Neurosci. 2012;6:6. doi: 10.3389/fnbeh.2012.00006.
28. Reiser MB, Dickinson MH. Visual motion speed determines a behavioral switch from forward flight to expansion avoidance in Drosophila. J Exp Biol. 2012;216:719–732. doi: 10.1242/jeb.074732.
29. Kress D, Egelhaaf M. Head and body stabilization in blowflies walking on differently structured substrates. J Exp Biol. 2012;215:1523–1532. doi: 10.1242/jeb.066910.
30. Schilstra C, van Hateren JH. Blowfly flight and optic flow. I. Thorax kinematics and flight dynamics. J Exp Biol. 1999;202:1481–1490. doi: 10.1242/jeb.202.11.1481.
31. Lister JA, Robertson CP, Lepage T, Johnson SL, Raible DW. nacre encodes a zebrafish microphthalmia-related protein that regulates neural-crest-derived pigment cell fate. Development. 1999;126:3757–3767. doi: 10.1242/dev.126.17.3757.
32. Ahrens MB, et al. Brain-wide neuronal dynamics during motor adaptation in zebrafish. Nature. 2012;485:471–477. doi: 10.1038/nature11057.
33. O’Malley DM, et al. Optical physiology and locomotor behaviors of wild-type and nacre zebrafish. Methods Cell Biol. 2004;76:261–284. doi: 10.1016/s0091-679x(04)76013-6.
34. Antinucci P, Hindges R. A crystal-clear zebrafish for in vivo imaging. Sci Rep. 2016;6:29490. doi: 10.1038/srep29490.
35. Lange M, et al. Inter-individual and inter-strain variations in zebrafish locomotor ontogeny. PLoS ONE. 2013;8:e70172. doi: 10.1371/journal.pone.0070172.
36. Liu Y, et al. Statistical Analysis of Zebrafish Locomotor Response. PLoS ONE. 2015;10:e0139521. doi: 10.1371/journal.pone.0139521.
37. Barker AJ, Baier H. Sensorimotor Decision Making in the Zebrafish Tectum. Curr Biol. 2015;25:2804–2814. doi: 10.1016/j.cub.2015.09.055.
38. Thibos LN, Still DL, Bradley A. Characterization of spatial aliasing and contrast sensitivity in peripheral vision. Vision Res. 1996;36:249–258. doi: 10.1016/0042-6989(95)00109-d.
39. Reynolds CW. Flocks, Herds and Schools: A Distributed Behavioral Model. In: Proceedings of the 14th Annual Conference on Computer Graphics and Interactive Techniques. ACM; 1987. pp. 25–34.
40. Couzin ID, Krause J, Franks NR, Levin SA. Effective leadership and decision-making in animal groups on the move. Nature. 2005;433:513–516. doi: 10.1038/nature03236.
41. Couzin ID, et al. Uninformed Individuals Promote Democratic Consensus in Animal Groups. Science. 2011;334:1578–1580. doi: 10.1126/science.1210280.
42. Ioannou CC, Guttal V, Couzin ID. Predatory fish select for coordinated collective motion in virtual prey. Science. 2012;337:1212–5. doi: 10.1126/science.1218919.
43. Naumann EA, Kampff AR, Prober DA, Schier AF, Engert F. Monitoring neural activity with bioluminescence during natural behavior. Nat Neurosci. 2010;13:513–20. doi: 10.1038/nn.2518.
44. Randlett O, et al. Whole-brain activity mapping onto a zebrafish brain atlas. Nat Methods. 2015;12:1039–1046. doi: 10.1038/nmeth.3581.
45. Szuts TA, et al. A wireless multi-channel neural amplifier for freely moving animals. Nat Neurosci. 2011;14:263–269. doi: 10.1038/nn.2730.
46. Ziv Y, et al. Long-term dynamics of CA1 hippocampal place codes. Nat Neurosci. 2013;16:264–266. doi: 10.1038/nn.3329.
47. Bastien R, et al. KymoRod: a method for automated kinematic analysis of rod-shaped plant organs. Plant J. 2016;88:468–475. doi: 10.1111/tpj.13255.
48. D’Azzo JJ. Linear Control System Analysis and Design: Conventional and Modern. McGraw-Hill Science, Engineering & Mathematics; 1995.
49. Fenk LM, Poehlmann A, Straw AD. Asymmetric processing of visual motion for simultaneous object and background responses. Curr Biol. 2014;24:2913–2919. doi: 10.1016/j.cub.2014.10.042.
50. Svoboda T, Martinec D, Pajdla T. A convenient multi-camera self-calibration for virtual environments. PRESENCE: Teleoperators and Virtual Environments. 2005;14:407–422.