PLOS Biology. 2020 Jul 14;18(7):e3000712. doi: 10.1371/journal.pbio.3000712

PiVR: An affordable and versatile closed-loop platform to study unrestrained sensorimotor behavior

David Tadres 1,2,3, Matthieu Louis 1,2,4,*
Editor: Tom Baden
PMCID: PMC7360024  PMID: 32663220

Abstract

Tools enabling closed-loop experiments are crucial to delineate causal relationships between the activity of genetically labeled neurons and specific behaviors. We developed the Raspberry Pi Virtual Reality (PiVR) system to conduct closed-loop optogenetic stimulation of neural functions in unrestrained animals. PiVR is an experimental platform that operates at high temporal resolution (70 Hz) with low latencies (<30 milliseconds), while being affordable (<US$500) and easy to build (<6 hours). Through extensive documentation, this tool was designed to be accessible to a wide public, from high school students to professional researchers studying systems neuroscience. We illustrate the functionality of PiVR by focusing on sensory navigation in response to gradients of chemicals (chemotaxis) and light (phototaxis). We show how Drosophila adult flies perform negative chemotaxis by modulating their locomotor speed to avoid locations associated with optogenetically evoked bitter taste. In Drosophila larvae, we use innate positive chemotaxis to compare behavior elicited by real- and virtual-odor gradients. Finally, we examine how positive phototaxis emerges in zebrafish larvae from the modulation of turning maneuvers to orient in virtual white-light gradients. Besides its application to study chemotaxis and phototaxis, PiVR is a versatile tool designed to bolster efforts to map and to functionally characterize neural circuits.


Unravelling the logic of neural circuits underlying behavior requires precise control of sensory inputs based on the animal's behavior. This article presents Raspberry Pi Virtual Reality (PiVR) as an affordable tool designed to enable both academic labs and citizen-science projects to conduct such functional analysis by immersing freely-moving animals in virtual realities.

Introduction

Since the advent of molecular tools to map and manipulate the activity of genetically targeted neurons [1,2], a major goal of systems neuroscience has been to unravel the neural computations underlying sensorimotor transformation [3,4]. Because of the probabilistic nature of behavior [5–7], probing sensorimotor functions requires stimulating an animal with reproducible patterns of sensory input that can be conditioned by the behavioral history of the animal. These conditions can be achieved by immersing animals in virtual realities [8].

A virtual reality paradigm consists of a simulated sensory environment perceived by an animal and updated based on a readout of its behavior. Historically, virtual realities have been introduced to study optomotor behavior in tethered flies and bees [9,10]. For several decades, sophisticated computer-controlled methods have been developed to produce ever-more-realistic immersive environments. In FreemoVR, freely moving flies avoid collisions with fictive tridimensional obstacles, and zebrafish engage in social interactions with artificial peers [11]. Spatial learning has been studied in tethered flies moving on a treadmill in 2D environments filled with geometrical objects projected on a visual display [12]. The same technology has been used to record the neural activity of mice exploring a virtual space [13].

Immobilized zebrafish larvae have been studied while hunting virtual prey [14], adapting their motor responses to fictive changes in water flow speed [15], and aligning themselves with a moving visual stimulus [16]. Although virtual realities were initially engineered to study visual behavior, they have been generalized to other sensory modalities, such as touch and olfaction. In a treadmill system, navigation has been studied in mice directed by localized stimulations of their whiskers [17]. The combination of closed-loop tracking and optogenetic stimulation of genetically targeted sensory neurons has enabled a quantitative analysis of chemotaxis in freely moving Caenorhabditis elegans and in Drosophila larvae immersed in virtual-odor gradients [18,19].

Virtual reality assays aim to reproduce the natural feedback that binds behavior to sensation [8]. First, the behavior of an animal must be accurately classified in real time. In tethered flying flies, wing beat patterns have been used to deduce turning maneuvers [20]. Likewise, the movement of a tethered walking fly or a mouse can be inferred from the rotation of the spherical trackball of a treadmill [13,21]. In immobilized zebrafish larvae, recordings from the motor neuron axons have been used to infer intended forward swims and turns [16]. In freely moving C. elegans and Drosophila larvae, the posture of an animal and the position of specific body parts—the head and tail, for instance—have been tracked during motion in 2D arenas [18,19]. Variables related to the behavioral readout—most commonly, the spatial coordinates—are then mapped onto a virtual sensory landscape to update the stimulus intensity [13]. A similar methodology that enables the tracking of individual sensory organs (left and right eyes) has been recently proposed as an open-source package dedicated to zebrafish larvae [22].

The effectiveness of the virtual reality paradigm is determined by the overall temporal delay between the animal’s behavior and the update of the stimulus. The shorter this delay, the more authentic the virtual reality appears to the animal. As a result, the methodology deployed to create efficient virtual realities with closed-loop tracking relies on advanced technology that makes behavioral setups costly and often difficult to adopt by nonspecialists. There is scope to complement the existing collection of sophisticated assays with tools that are affordable, easily built, and accessible to most laboratories. The fly “ethoscope” proposes a hardware solution that exploits 3D printing to study the behavior of adult flies [23]. In this system, tracking and behavioral classification are implemented by a portable and low-cost computer, the Raspberry Pi. Although this system can implement a feedback loop between real-time behavioral tracking and stimulus delivery (e.g., the physical rotation of an assay to disrupt sleep), it was not conceived to create refined virtual reality environments using optogenetic stimulations.

Here, we present the Raspberry Pi Virtual Reality (PiVR) platform enabling the presentation of virtual realities to freely moving small animals. This closed-loop tracker was designed to be accessible to a wide range of researchers by keeping the construction costs low and by maintaining the basic operations simple and customizable to suit the specificities of new experiments. We benchmark the performance of PiVR by studying navigation behavior in the Drosophila larva. We then reveal how adult flies adapt their speed of locomotion to avoid areas associated with the activation of bitter-sensing neurons. Finally, we show that zebrafish larvae approach a virtual-light source by modifying their turn angle in response to temporal changes in light intensity.

Results

PiVR permits high-performance closed-loop tracking and optogenetic stimulations

The PiVR system enables high-resolution, optogenetic, closed-loop experiments with small, freely moving animals. In its standard configuration, PiVR is composed of a behavioral arena, a camera, a Raspberry Pi microcomputer, a light-emitting diode (LED) controller, and a touch screen (Fig 1A). The platform is controlled via a user-friendly graphical interface (Fig 1B). Given that PiVR does not require the use of an external computer, the material for one setup amounts to less than US$500, with the construction costs decreasing to about US$350 when several units are built in parallel (S1 Table). As shown in S2 Table, PiVR is significantly cheaper than published alternatives that are capable of operating at equally high frame rates [11,22]. In spite of its affordability, PiVR runs customized software (S1–S6 Figs) that automatically identifies semitransparent animals such as Drosophila larvae behaving in a static background (Fig 2) and that monitors movement at a frame rate sufficient to accurately track rapidly moving animals such as walking adult flies and zebrafish larvae (Figs 3 and 4).

Fig 1. Virtual realities created by PiVR.

Fig 1

(A) Picture of the standard PiVR setup. The animal is placed on the light diffuser and illuminated from below using infrared LEDs and recorded from above. The Raspberry Pi computer and the LED controller are attached to the touch screen, which permits the user to interface with the PiVR setup. (B) Screenshot of the GUI while running a virtual reality experiment. The GUI has been designed to be intuitive and easy to use while presenting all important experimental parameters that can be modified. (C) Virtual realities are created by updating the intensity of a homogeneous light background based on the current position of a tracked animal mapped onto a predefined landscape shown at the center. (Center) Predefined virtual gradient with a Gaussian geometry. (Left) Trajectory of an unconstrained animal moving in the physical arena. (Right) The graph indicates the time course of the light intensity experienced by the animal during the trajectory displayed in the left panel. Depending on the position of the animal in the virtual-light gradient, the LEDs are turned off (t = 1) or turned on at an intermediate (t = 2) or maximum intensity (t = 3). GUI, graphical user interface; LED, light-emitting diode; PiVR, Raspberry Pi Virtual Reality.

Fig 2. Benchmarking PiVR performance by eliciting larval chemotaxis in virtual-odor gradients.

Fig 2

(A) Drosophila larva with a pair of single Or42a-functional OSNs (red dots). Illustration of the identification of different body parts of a moving larva by PiVR. (B) Illustrative trajectory of a larva in a Gaussian virtual-odor gradient elicited by light stimulation. Arrowheads and numbers indicate lateral head movements (casts), and the time points are congruent with the arrowheads shown in (C). Panel D shows the behavior of Drosophila larvae directed by Or42a OSNs in a gradient of IAA (green color). (E–F) Behavior of larvae expressing the light-gated ion channel CsChrimson in the Or42a OSNs evoking a virtual-odor gradient. In panel E, the virtual-odor gradient (red) has a geometry similar to the real-odor gradient (green) presented in (D). The “volcano” virtual-odor landscape presented in (F) highlights that the information conveyed by the Or42a OSN alone is sufficient for larvae to chemotax with high accuracy along the rim of the gradient. Thick lines in panels Diii, Eiii, and Fiii indicate the median distances to the source, and the light traces indicate individual trials. All data used to create this figure are available from https://doi.org/10.25349/D9ZK50. IAA, isoamyl acetate; LED, light-emitting diode; Or42a, odorant receptor 42a; OSN, olfactory sensory neuron; PiVR, Raspberry Pi Virtual Reality; Sens. exp., sensory experience; VR, virtual reality.

Fig 3. Adult fruit flies avoid activation of bitter-sensing neurons by modulating their locomotion speed.

Fig 3

(A) Adult Drosophila expressing CsChrimson in Gr66a bitter-sensing neurons (red circles). Illustration of the identification of different body parts of a moving fly by PiVR. (B) Illustrative trajectories of flies in a virtual checkerboard pattern and the corresponding ethogram. Flies were behaving in a petri dish. (D) The ethogram reports the time spent by individual animals (rows) in the dark (white) and lit (red) squares. Panel C displays a quantification of the avoidance of virtual bitter taste through a preference index: PI = (T_ON − T_OFF) / (T_ON + T_OFF), where T_ON and T_OFF are the times spent on the ON and OFF quadrants, respectively (Mann–Whitney U test, p < 0.001). (E) Median locomotion speeds of individual animals as a function of the exposure to light. (F) Quantification of locomotion speeds across experimental conditions (Dunn’s multiple comparisons test, different letters indicate at least p < 0.01). Statistical procedures are detailed in the Methods section. Statistical significances are indicated with lowercase letters. All data used to create this figure are available from https://doi.org/10.25349/D9ZK50. Gr66a, gustatory receptor 66a; PiVR, Raspberry Pi Virtual Reality.

Fig 4. Zebrafish larvae adapt their turn dynamics to stay close to a virtual-light source.

Fig 4

(A) Illustration of the identification of a moving zebrafish larva by PiVR. (B) Illustrative trajectory of a zebrafish larva in a virtual-light gradient having a Gaussian geometry. Panel C displays the time course of the speed and the white-light intensity during the trajectory shown in panel B. Yellow vertical lines indicate automatically detected bouts. (D) Trajectories of 13 fish tested in the virtual-light gradient (left) and 11 fish in the control (right). Red circles indicate 10-, 20-, 30-, and 40-mm distances to the center of the virtual-light source (see Methods). (E) Thick lines indicate the time courses of the median distances to the virtual-light source. The light lines indicate individual trials. (F) Illustration of the discretization of a trajectory segment into bouts: yellow vertical lines indicate the position at which the animal stops, reorients, and starts the next bout. Black dashed lines indicate movement of the fish. The turn angle (θ) and change in light intensity (ΔI) are calculated for every pair of consecutive bouts (see Methods). The bottom of panel F illustrates a swim bout oriented up-gradient (purple, ΔI > 0) and down-gradient (green, ΔI < 0). (G) Relationship between θ and the light intensity I experienced during the previous bout (independent two-sample t test, different letters indicate p < 0.001). (H) Turn angles θ of the virtual reality condition grouped according to negative (green) and positive (magenta) changes in intensity (ΔI) experienced during the previous bout (t test for paired samples, different letters indicate at least p < 0.05). (I) The turn index (β) is calculated from the average reorientation accuracy (βi) of the animal relative to the virtual-light source at the onset of each swim bout. (J) Turn index (β) as a function of stimulus intensity (Mann–Whitney U test, all groups p > 0.05). All reported statistical significances are Bonferroni corrected and indicated with lowercase letters. Statistical procedures are detailed in the Methods section. All data used to create this figure are available from https://doi.org/10.25349/D9ZK50. PiVR, Raspberry Pi Virtual Reality; VR, virtual reality.

We characterized the overall latency of PiVR in two ways. First, we measured the following three parameters: (1) image acquisition time, (2) image processing time, and (3) the time taken for commands issued by the Raspberry Pi to be actuated by the LED hardware (S1A–S1D Fig). We find that the image processing time is the main time-consuming step. Second, we measured the total latency between the moment an image starts being recorded and the update of the intensity of the LED system based only on the analysis of that image. We find that the total latency is shorter than 30 milliseconds (S1E–S1G Fig). Thus, PiVR is suited to perform online tracking of small animals at a frame rate of up to 70 Hz with a lag shorter than three frames and an accuracy suitable to create virtual olfactory realities in Drosophila larvae [18] and virtual visual realities in walking adult flies [21].

The PiVR software implements image acquisition, object tracking, and the update of background illumination for optogenetic stimulation. It is free, fully open-source, and written in the programming language Python. At the beginning of each experiment, an autodetect algorithm separates the moving object—the animal—from the background (S3 Fig). During the rest of the experiment, the tracking algorithm operates based on a principle of local background subtraction to achieve high frame rates (S4 Fig). Besides locating the position of the animal’s centroid, PiVR uses a Hungarian algorithm to tell apart the head from the tail positions (S5 Fig). For applications involving off-line tracking with a separate software [24,25], the online tracking module of PiVR can be disabled to record videos at 90 Hz in an open-loop mode.
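
As a concrete illustration of the head/tail disambiguation step, the sketch below shows how the two endpoint candidates of the current frame can be matched to the previous head and tail positions with the Hungarian algorithm, using scipy's linear_sum_assignment. It is a minimal example written for this description rather than an excerpt of the PiVR source code; the function name and the way endpoint candidates are extracted from the binary image are assumptions.

```python
# Minimal sketch of Hungarian head/tail assignment between consecutive frames.
# Not PiVR source code: assign_head_tail() and the endpoint extraction are
# illustrative assumptions.
import numpy as np
from scipy.optimize import linear_sum_assignment


def assign_head_tail(prev_head, prev_tail, candidates):
    """Label the two endpoints detected in the current frame as head and tail.

    prev_head, prev_tail: (x, y) positions from the previous frame.
    candidates: array-like of shape (2, 2), the two endpoints of the current
    frame in unknown order.
    """
    previous = np.asarray([prev_head, prev_tail], dtype=float)
    candidates = np.asarray(candidates, dtype=float)
    # Cost matrix: distance between each previous label and each candidate.
    cost = np.linalg.norm(previous[:, None, :] - candidates[None, :, :], axis=2)
    row, col = linear_sum_assignment(cost)  # minimizes the summed displacement
    new_head = candidates[col[row == 0][0]]
    new_tail = candidates[col[row == 1][0]]
    return new_head, new_tail


# Example: the candidate closest to the previous head keeps the "head" label.
head, tail = assign_head_tail((10.0, 10.0), (20.0, 10.0),
                              [[21.0, 11.0], [9.0, 9.0]])
```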

PiVR has been designed to create virtual realities by updating the intensity of a homogeneous stimulation backlight based on the current position of a tracked animal (Fig 1C, left panel) relative to a preset landscape (Fig 1C, middle panel, S1 Movie). Virtual sensory realities are generated by optogenetically activating sensory neurons of the peripheral nervous system [1]. In the present study, we focus on applications involving CsChrimson because the red activation spectrum of this light-gated ion channel is largely invisible to Drosophila [26]. Depending on the light-intensity range necessary to stimulate specific neurons, PiVR features a light pad emitting stimulation light at low-to-medium (2 μW/mm2) or high (22 μW/mm2) intensities (S2C and S2D Fig). A key advantage of the closed-loop methodology of PiVR is that it permits the creation of virtual realities with arbitrary properties free of the physical constraints of real stimuli. In Fig 2Fi, we illustrate the use of PiVR by immersing Drosophila larvae in a virtual-odor gradient that has the shape of a volcano—a geometry that challenges the sensorimotor responses of larvae in a predictable way [18].
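
The closed-loop rule described above amounts to a simple lookup: the tracked position is mapped onto a precomputed landscape, and the background LED is set to the intensity found at that position. The sketch below illustrates this logic for a Gaussian virtual gradient; it is not taken from the PiVR code base, and the set_led_intensity callback is a hypothetical placeholder for the LED-controller interface.

```python
# Minimal sketch of the position-to-intensity lookup behind a virtual reality.
# The set_led_intensity callback is a hypothetical stand-in for the LED driver.
import numpy as np


def gaussian_landscape(width, height, peak_xy, sigma, max_intensity=100.0):
    """Precompute a 2D Gaussian virtual gradient over the camera field of view."""
    x, y = np.meshgrid(np.arange(width), np.arange(height))
    d2 = (x - peak_xy[0]) ** 2 + (y - peak_xy[1]) ** 2
    return max_intensity * np.exp(-d2 / (2.0 * sigma ** 2))


def update_stimulus(landscape, animal_xy, set_led_intensity):
    """Look up the intensity at the animal's current position and apply it."""
    col = int(round(animal_xy[0]))
    row = int(round(animal_xy[1]))
    intensity = landscape[row, col]
    set_led_intensity(intensity)  # e.g., a PWM duty cycle between 0 and 100
    return intensity


landscape = gaussian_landscape(640, 480, peak_xy=(320, 240), sigma=80)
update_stimulus(landscape, animal_xy=(100.0, 200.0), set_led_intensity=print)
```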

The possibility to 3D print components that once required sophisticated machining has empowered the “maker movement” in our scientific community [27]. Inspired by this philosophy, PiVR is built from hardware parts that are 3D printed. Thus, the modular design of the setup can be readily adapted to accommodate the experimental needs of traditional model organisms (larvae, adult flies, and zebrafish) as well as less conventional small animals (S2 and S6 Figs). For example, we adapted PiVR to acquire movies of 10 fruit fly larvae simultaneously with an image quality sufficient to permit off-line multiple-animal tracking with the idtracker.ai software [25]. To achieve this, we modified the design of the arena to allow illumination from the side instead of from the bottom (S2B Fig). This adaptation of the illumination setup was necessary to enhance contrast in the appearance of individual larvae for idtracker.ai to detect idiosyncratic differences between larvae (S2Bi Fig). The versatility of PiVR was also illustrated by tracking various arthropods and a vertebrate with diverse body structures and locomotor properties (Figs 2–4 and S6 Fig).

Benchmarking PiVR performances by eliciting larval chemotaxis in virtual-odor gradients

To benchmark the performances of PiVR, we turned to the navigation behavior evoked by odor gradients (chemotaxis) in the Drosophila larva [28,29]. Larval chemotaxis relies on a set of well-characterized sensorimotor rules [30]. To ascend an attractive odor gradient, larvae modulate the alternation of relatively straight runs and reorientation maneuvers (turns). Stops are predominantly triggered when the larva undergoes negative changes in odor concentration during down-gradient runs. Following a stop, turning is directed toward the gradient through an active sampling process that involves lateral head movements (head casts). A key advantage of the larva as a model organism for chemotaxis is that robust orientation responses can be directed by a functionally reduced olfactory system. The odorant receptor gene 42a (Or42a) is expressed in a pair of bilaterally symmetric olfactory sensory neurons (OSNs) [31,32] that are sufficient to direct chemotaxis [33]. We exploited this property to compare reorientation performances elicited by real- and virtual-odor stimulations of the Or42a OSNs.

We started by applying the computer-vision algorithm of PiVR to track larvae with a single functional Or42a-expressing OSN (Fig 2A). Individual animals were introduced in a rectangular arena comprising a gradient of isoamyl acetate (IAA) at its center (Fig 2D). Once exposed to the odor gradient, Or42a-functional larvae quickly identified the position of the odor source, and they remained in the source’s vicinity (Fig 2D and S2 Movie). The trajectories of consecutively tested larvae were analyzed by quantifying the time course of the distance between the head of the larva and the center of the odor source. The navigation of the Or42a-functional larvae yielded an average distance to the source significantly lower than that observed in the presence of the solvent alone (Fig 2Diii). This result is consistent with the behavior of wild-type larvae in response to attractive odors [34]. It establishes that PiVR can automatically detect and accurately track animals in real time.

Next, we tested the ability of PiVR to create virtual olfactory realities by optogenetically stimulating the larval olfactory system. In past work, robust chemotaxis was elicited in light gradients by expressing the blue-light-gated ion channel channelrhodopsin in the Or42a-expressing OSN of blind larvae [18]. In these experiments, the light stimulus was delivered as a point of LED light kept focused on the larva. Using a closed-loop paradigm, the intensity of the LED light was updated at a rate of 30 Hz, based on the position of the larva’s head mapped onto a landscape predefined by the user [18]. PiVR was built on the same principle with the following modifications: (1) the spatial resolution of PiVR was reduced because the field of view of the camera captures the whole arena and not just the larva, (2) optogenetic stimulation was achieved through homogeneous background illumination instead of a light spot that must follow the larva, and (3) we favored the red-light-gated ion channel CsChrimson [26] over channelrhodopsin to minimize the innate photophobic response of larvae to the blue-light range [35].

The simplified hardware design of PiVR produced precise tracking of the head position of a larva exposed to a fictive light gradient (Fig 2B). The spatiotemporal resolution of the tracking is illustrated in Fig 2C, in which surges in head speed are associated with scanning movements of the head on a timescale shorter than 500 milliseconds [36]. Head “casts” induced transient changes in light intensity I(t) [18]. These changes in stimulus intensity correspond to spikes in the relative sensory experience of the larva (1/I × dI/dt) (Fig 2C, arrowheads in purple trace) [30]. The tracking resolution of PiVR enabled recording periodic patterns in tail speed (Fig 2C, top green trace) that reflect consecutive cycles of acceleration/deceleration during forward peristalsis [37]. Or42a-functional larvae displayed strong chemotaxis in response to a point-source virtual-odor gradient (Fig 2E and S3 Movie) with a level of attraction comparable to the behavior evoked by a real-odor gradient (Fig 2D). Moreover, PiVR recapitulated the meandering trajectories along the rim of a volcano-shaped virtual-odor gradient (Fig 2F and S4 Movie) [18]. Together, the results of Fig 2 and S7 Fig validate that PiVR has the tracking accuracy and closed-loop performances necessary to elicit genuine navigation in virtual-odor gradients.
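
For readers who wish to reproduce the purple trace of Fig 2C, the sketch below shows one way to compute the relative sensory experience (1/I × dI/dt) from a recorded stimulus time series. It is an illustration of the formula rather than the bundled analysis script; the small offset that prevents division by zero is an assumption.

```python
# Minimal sketch of the relative sensory experience (1/I)(dI/dt); the epsilon
# guarding against division by zero is an assumption, not part of the paper.
import numpy as np


def relative_sensory_experience(intensity, frame_rate, eps=1e-6):
    """Return (1/I) * dI/dt for a 1D stimulus intensity time series."""
    intensity = np.asarray(intensity, dtype=float)
    dt = 1.0 / frame_rate
    dI_dt = np.gradient(intensity, dt)  # temporal derivative of I(t)
    return dI_dt / (intensity + eps)


# Example: light intensity sampled at 30 Hz while a larva casts its head.
I = [5.0, 5.2, 6.0, 7.5, 7.0, 6.1]
print(relative_sensory_experience(I, frame_rate=30))
```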

Adult flies avoid activation of bitter-sensing neurons by modulating their locomotion speed

After having established the capability of PiVR to create virtual realities in Drosophila larvae, we sought to generalize the application of this tool to other small model organisms. Adult Drosophila are covered by a thick and opaque cuticle. Consequently, activating light-gated ion channels expressed in the sensory neurons of adult flies requires higher light intensities than in semitransparent larvae [6,26]. The background illumination system of PiVR was modified to deliver light intensities as high as 50 μW/mm2 to penetrate the adult-fly cuticle [38]. Despite the roughly 10-fold higher locomotion speed of adult flies relative to larvae (peak speeds of 12 mm/s and 1.6 mm/s, respectively), PiVR accurately monitored the motion of adult fruit flies for the entire duration of 5-minute trials (Fig 3A and 3B). We turned to gustation to test the ability of PiVR to evoke orientation behavior in adult flies stimulated by virtual chemical gradients.

Drosophila demonstrates a strong innate aversion to bitter taste [39]. This behavior is mediated by a set of sensory neurons expressing the gustatory receptor gene 66a (Gr66a). Optogenetic activation of the Gr66a-expressing neurons alone is sufficient to elicit aversive responses [38]. Using the closed-loop tracking capabilities of PiVR (Fig 3A), we examined taste-driven responses of flies expressing the red-light-gated ion channel CsChrimson in their Gr66a-expressing neurons. Because we reasoned that navigation in response to taste might be less directed than navigation in response to airborne odors, we presented flies with a 2D landscape emulating a checkerboard (Fig 3B). In this virtual checkerboard, quadrants were associated with either virtual bitter taste (light “ON”) or no taste (light “OFF”). Flies adapted their motion to avoid squares paired with virtual bitter taste (Fig 3B–3D and S5 Movie). This result generalized the field of application of PiVR to fast-moving small animals, such as walking adult flies.

To determine how flies actively avoid being exposed to bitter-tasting squares, we interrogated the spatial trajectories recorded by PiVR (Fig 3B) and correlated stimulus input with behavioral output [40]. This quantitative analysis of the relationship between stimulus dynamics and behavior highlighted that flies modulate their locomotion speed in response to excitation of their bitter-tasting neurons. When flies were located in a lit square eliciting virtual bitter taste, they moved significantly faster than when located in a dark square with no bitter taste. When flies encountered sensory relief in a dark square, they frequently stopped (Fig 3E and 3F). In summary, our results establish that PiVR is suitable to track and immerse adult flies in virtual sensory realities. Moreover, computational quantification of the behavioral data produced by PiVR suggests that flies escape virtual bitter taste by modulating their locomotion speed, thereby reducing the time spent in the illuminated squares [41,42]. The contribution of other orientation mechanisms that integrate spatial information remains to be examined in future work. For instance, it is possible that flies implement directed turns upon their entry into a bitter-tasting quadrant (Fig 3B and S5 Movie).

Zebrafish larvae adapt their turn amplitude to stay close to a virtual-light source

Because of its transparency and amenability to molecular genetics, the zebrafish Danio rerio has emerged as a tractable model system to study how sensory representations and sensorimotor transformations arise from the activity in neural ensembles in vertebrates [43,44]. Zebrafish are innately attracted by real sources of white light [45]. Here, we show that PiVR is suitable to study the organization of orientation behavior of zebrafish larvae immersed in a virtual 2D light gradient (Fig 4A and 4B). Already at the larval stage, individuals are capable of staying confined to virtual disks of white light [46]. Although 5-days-postfertilization (dpf) zebrafish larvae stay in nearly constant motion, tracking with PiVR established that larvae have the navigational capabilities to stay near the peak of a virtual white-light gradient (Fig 4B and S6 Movie), constantly returning to this position for the duration of the entire trial (Fig 4D and 4E, S8A Fig).

The elementary motor patterns (actions) underlying the behavioral repertoire of zebrafish larvae can be decomposed into stops, slow swims, and rapid swims [47]. By analyzing the time series of the centroid speed recorded by PiVR, we observed periodic increases in swim speed (Fig 4C). These episodes correspond to bursts, or “bouts,” of swimming [48]. To examine the orientation strategy used by zebrafish larvae, we discretized trajectories into bouts. As described in previous work [47], each bout was found to be approximately straight (Fig 4B, cyan segments delimited by the circles). At low light intensities, significant reorientation occurred between consecutive swim bouts compared with the controls (Fig 4G).

Given that the virtual landscape produced by PiVR resulted from a temporal update of the intensity of isotropic light stimulation, we can rule out the detection of binocular differences in stimulus intensity [49]. Thus, reorientation in the virtual-light landscape of Fig 4 could only result from the detection of temporal changes in light intensity. Using the data set recorded with PiVR, we conducted a correlative analysis between the stimulus history and the reorientation maneuvers to define the visual features eliciting an increase in turning at low light intensity. More specifically, we analyzed the turn angle as a function of the light intensity for bouts associated with positive (up-gradient) and negative (down-gradient) changes in light intensity. When zebrafish larvae moved up-gradient, the turn angle was not modulated by the absolute intensity of the stimulus (Fig 4H, purple). By contrast, the turn amplitude increased for swim bouts oriented down-gradient at low stimulus intensity (Fig 4H, green). Therefore, we conclude that the turn amplitude of zebrafish larvae is determined by a combination of the absolute light intensity and the sign of the change in stimulus intensity.

Given the ability of zebrafish larvae to efficiently return to the peak of a light gradient (Fig 4D), we asked whether zebrafish larvae can also bias their turns toward the light source. To this end, we defined the turn index (β) to quantify the percentage of turns directed toward the light gradient (Fig 4I and Methods). This metric leads us to conclude that zebrafish larvae do not bias their turns toward the source more often than the control (Fig 4J). In summary, our results demonstrate that PiVR can track zebrafish larvae at a sufficiently high spatiotemporal resolution to characterize individual swim bouts. We show that zebrafish achieve positive phototaxis by increasing the amplitude of their turns when they are moving down-gradient and experiencing low light intensities. This result is consistent with a recent in-depth study of the sensorimotor strategy controlling zebrafish phototaxis in a virtual reality paradigm [50]. We conclude that phototaxis in zebrafish is at least partially controlled by an increase in the amplitude of turns when two conditions are met: (1) the animal must detect a negative change in light intensity, and (2) the absolute light intensity must be low enough. The combination of the previous two conditions appears sufficient to generate positive phototaxis, even in the absence of turning biases toward the light gradient.

Discussion

Anyone seeking to unravel the neural logic underlying a navigation process—whether it is the response of a cell to a morphogen gradient or the flight of a seabird guided by the Earth’s magnetic field—faces the need to characterize the basic orientation strategy before speculating about its molecular and cellular underpinnings. PiVR is a versatile closed-loop experimental platform devised to create light-based virtual sensory realities by tracking the motion of unconstrained small animals subjected to a predefined model of the sensory environment (Fig 1). It was created to assist the study of orientation behavior and neural circuit functions by scholars who might not have extensive background in programming or instrumentation to customize existing tools.

An experimental platform that is inexpensive and customizable

Prior to the commercialization of consumer-oriented 3D printers and cheap microcomputers such as the Raspberry Pi, virtual reality paradigms necessitated custom setups costing several thousand to over one hundred thousand dollars [11,13,18,19]. PiVR is an affordable (<US$500) alternative that enables laboratories without advanced technical expertise to conduct high-throughput virtual reality experiments. The procedure to build PiVR is visually illustrated in S7 Movie. All construction steps are intended to be tractable by virtually any user who has access to a soldering station and a 3D printer. A detailed step-by-step protocol is available on a dedicated website (www.pivr.org).

The creation of realistic immersive virtual realities critically depends on the update frequency and the update latency of the closed-loop system. The maximal update frequency corresponds to the maximal frame rate that the system can sustain. The update latency is the delay between an action of the tested subject and the implementation of the corresponding change in the virtual reality environment. In normal working conditions, PiVR can be used with an update frequency of up to 70 Hz with a latency below 30 milliseconds (S2 Fig). These characteristics are similar to those routinely used to test optomotor responses with visual display streaming during walking behavior in insects [21], thereby ensuring the suitability of PiVR for a wide range of applications.

Different optogenetic tools require excitation at different wavelengths ranging from blue to deep red [26,51,52]. The modularity of PiVR enables the experimenter to customize the illumination system to any wavelength range. Additionally, different animals demand distinct levels of light intensities to ensure adequate light penetration in transparent and opaque tissues. Although 5 μW/mm2 of red light had been used to activate neurons located in the leg segments of adult flies with CsChrimson [38], 1 μW/mm2 is enough to activate OSNs of the semitransparent Drosophila larva (Fig 2). In its standard version, PiVR can emit red-light intensities as high as 2 μW/mm2 and white-light intensities up to 6,800 Lux—a range sufficient for most applications in transparent animals (Figs 2 and 4). For animals with an opaque cuticle, we devised a higher-power version of the backlight illumination system that delivers intensities up to 22 μW/mm2 (525 nm) and 50 μW/mm2 (625 nm) (Fig 3 and S1D Fig). Given that an illumination of 50 μW/mm2 is approaching the LED eye safety limits (International Electrotechnical Commission: 62471), it is unlikely that experimenters will want to exceed this range for common applications in the lab.

Exploring orientation behavior through virtual reality paradigms

By capitalizing on the latest development of molecular genetics and bioengineering [1], virtual sensory stimuli can be created by expressing optogenetic tools in targeted neurons of the peripheral nervous system of an animal. In the present study, PiVR was used to immerse Drosophila in virtual chemosensory gradients (Figs 2 and 3). In a first application, we stimulated one OSN of Drosophila larvae with CsChrimson. The attractive search behavior elicited by a single source of a real odor (Fig 2Di) was reproduced in an exponential light gradient (Fig 2Ei). To reveal the precision with which larvae orient their turns toward the gradient, larvae were tested in a virtual-odor landscape with a volcano shape (Fig 2Fi).

Adult flies move approximately 10 times faster than larvae. We established the ability of PiVR to immerse freely moving flies in a virtual bitter-taste gradient (Fig 3B). Unlike the attractive responses elicited by appetitive virtual odors in the larva (positive chemotaxis), bitter taste produced strong aversive behavior (S5 Movie). A correlative analysis of data recorded with PiVR revealed that the aversive behavior of adult flies is at least partly conditioned by a modulation of the animal’s locomotor speed: random search is enhanced upon detection of (virtual) bitter, whereas locomotion is drastically reduced upon sensory relief. As illustrated in Fig 3, PiVR offers a framework to explore the existence of other mechanisms contributing to the orientation of flies experiencing taste gradients. This approach adds to earlier studies of the effects of optogenetic stimulation of the bitter taste system on spatial navigation [38] and feeding behavior [53].

Zebrafish move through discrete swim bouts. The loop time of PiVR was sufficiently short to rise to the tracking challenge posed by the discrete nature of fish locomotion (Fig 4B). Genuine phototactic behavior was elicited in zebrafish larvae for several minutes based on pure temporal changes in light intensity devoid of binocular differences and a panoramic component. By synthetically recreating naturalistic landscapes [45], the behavior recorded by PiVR in Gaussian light gradients (Fig 4) complements previous studies featuring the use of discrete light disks [46] and local asymmetric stimulations [54]. A correlative analysis of the sensory input and the behavioral output corroborated the idea that positive phototaxis in fish can emerge from a modulation of the turn rate by the detected changes in light intensity (Fig 4G and 4H) without necessarily involving a turning bias toward the light gradient (Fig 4J). This orientation strategy shares similarities with the nondirectional increase in locomotor activity (“dark photokinesis”) that follows a sudden loss of illumination [55,56]. Taken together, our results establish that PiVR is suitable to conduct a detailed analysis of the sensorimotor rules directing attractive and aversive orientation behavior in small animals with distinct body plans and locomotor properties.

Outlook

Although this study focused on sensory navigation, PiVR is equally suited to investigate neural circuits [4] through optogenetic manipulations. To determine the connectivity and function of circuit elements, one typically performs acute functional manipulations during behavior. PiVR permits the time-dependent or behavior-dependent presentation of light stimuli to produce controlled gain-of-function manipulations. The use of multiple setups in parallel is ideal to increase the throughput of behavioral screens—a budget of US$2,000 is sufficient to build more than five setups. If the experimenter wishes to define custom stimulation rules—triggering a light flash whenever an animal stops moving, for instance—this rule can be readily implemented by PiVR on single animals (see the sketch below). For patterns of light stimulation that do not depend on the behavior of an animal—stimulations with a regular series of brief light pulses, for instance—groups of animals can be recorded at the same time. In its standard configuration (Fig 1A), the resolution of videos recorded with PiVR is sufficient to achieve individual tracking of group behavior through off-line analysis with specialized algorithms such as idtracker.ai [25] (S2Bi Fig). Even for small animals such as flies, videos of surprisingly good quality can be recorded by outfitting the charge-coupled device (CCD) camera of PiVR with appropriate optics (S8 Movie).
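
As an example of the kind of custom rule mentioned above (a light flash whenever the animal stops moving), the sketch below shows how such logic could be expressed inside a tracking loop. The set_led callback and the speed threshold are hypothetical placeholders, not part of the PiVR interface.

```python
# Minimal sketch of a behavior-dependent stimulation rule: flash the LED when
# the animal stops moving. set_led() and the threshold are hypothetical.
import numpy as np


def stop_triggered_flash(prev_xy, curr_xy, set_led,
                         speed_threshold=0.2, frame_rate=30.0):
    """Turn the LED fully on when the instantaneous speed drops below threshold."""
    speed = np.hypot(curr_xy[0] - prev_xy[0],
                     curr_xy[1] - prev_xy[1]) * frame_rate  # distance units per s
    set_led(100 if speed < speed_threshold else 0)
    return speed
```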

Until recently, systems neuroscientists had to design and build their own setups to examine the function of neural circuits, or they had to adapt existing systems that were often expensive and complex. Fortunately, our field has benefited from the publication of a series of customizable tools to design and conduct behavioral analysis. The characteristics of the most representative tools are reviewed in S2 Table. The ethoscope is a cost-efficient solution based on the use of a Raspberry Pi computer [23], but it was not designed to create virtual realities. Several other packages can be used to track animals in real time on an external computer platform. For instance, Bonsai is an open-source visual programming framework for the acquisition and online processing of data streams to facilitate the prototyping of integrated behavioral experiments [57]. FreemoVR can produce impressive tridimensional virtual visual realities [11]. Stytra is a powerful open-source software package designed to carry out behavioral experiments specifically in zebrafish with real-time tracking capabilities [22]. As illustrated by the comparisons in S2 Table, PiVR complements these tools by proposing a versatile solution to carry out closed-loop tracking with low-latency performance. One limitation of PiVR is that it produces purely homogeneous temporal changes in light intensity without any spatial component. Because of its low production costs, the simplicity of its hardware design, and its detailed documentation, PiVR can be readily assembled by any laboratory or group of high school students with access to a 3D printer. For these reasons, PiVR represents a tool of choice to make light-based virtual reality experiments accessible to experimentalists who might not be technically inclined.

We anticipate that the performance (S1 Fig) and the resolution of PiVR (Methods) will keep improving in the future. Historically, a new and faster version of the Raspberry Pi computer has been released every 2 years. In the near future, the image processing time of PiVR might decrease to just a few milliseconds, pushing the frequency to well above 70 Hz. Following the parallel development of transgenic techniques in nontraditional genetic model systems, it should be possible to capitalize on the use of optogenetic tools in virtually any species. Although PiVR was developed for animals measuring no more than a few centimeters, it should be easily scalable to accommodate experiments with larger animals such as mice and rats. Together with FlyPi [58] and the ethoscope [23], PiVR represents a low-barrier technology that should empower many labs to characterize new behavioral phenotypes and study neural circuit functions with minimal investment in time and research funds.

Methods

Hardware design

We designed all 3D-printed parts using 3D Builder (Microsoft Corporation). An Ultimaker 3 (Ultimaker, Geldermalsen, the Netherlands) with 0.8-mm print cores was used to print all parts. We used 2.85-mm PLA (B01EKFVAEU) as building material and 2.85-mm PVA (HY-PVA-300-NAT) as support material. The STL files were converted to gCode using Ultimaker’s Cura software (https://ultimaker.com/en/products/ultimaker-cura-software). The printed circuit boards (PCBs) were designed using Fritzing (http://fritzing.org/home/) and printed by AISLER BV (Lemiers, the Netherlands).

Hardware parts

Raspberry Pi components were bought from Newark element14 (Chicago, United States) and Adafruit Industries (New York, US). The 850-nm long-pass filter was bought from Edmund Optics (Barrington, US). Other electronics components were obtained from Mouser Electronics (Mansfield, US), Digi-Key Electronics (Thief River Falls, US), and Amazon (Seattle, US). Hardware was obtained from McMaster (Elmhurst, US). A complete bill of materials is available in S1 Table. Updates will be available on www.pivr.org and https://gitlab.com/louislab/pivr_publication.

Building and using PiVR

Detailed instructions on how to build a PiVR and how to use it can be found in S1 HTML. Any updates will be available on www.pivr.org.

Data analysis and statistics

All data and scripts have been deposited as a data package on Dryad: https://doi.org/10.25349/D9ZK50 [59]. Data analysis was performed using custom written analysis codes, which are bundled with the data package. In addition, the data analysis scripts are available from https://gitlab.com/LouisLab/PiVR.

Latency measurements of PiVR

To estimate the time between an animal performing an action and PiVR presenting the appropriate light stimulus, the following elements were taken into account: (1) image acquisition time, (2) image processing and VR calculation latency, and (3) software-to-hardware latency. To measure image acquisition time (S1B Fig), the camera was allowed to set optimal exposure at each light intensity before measuring the shutter speed. To measure image processing and VR calculation latency (S1Ci Fig), time was measured using the non-real-time Python time.time() function, which is documented to have an uncertainty on the order of a few microseconds. To confirm these measurements, we also recorded the timestamps given by the real-time graphical processing unit (GPU), as reported in S1Cii Fig. Together, these measurements show that although the image processing time has a median of around 8–10 milliseconds, a few frames take longer than 20 milliseconds to process, which, at high frame rates, leads to frames being skipped or dropped (arrows in S1Ci Fig and S1Cii Fig). Finally, to measure the software-to-hardware latency, we measured how long it takes to turn the general-purpose input/output (GPIO) pins ON and OFF during an experiment (S1D Fig). The pins are connected to a transistor with rise and fall times on the order of microseconds (https://eu.mouser.com/datasheet/2/308/FQP30N06L-1306227.pdf). LEDs tend to have a latency on the order of 100 ns. To confirm these measurements, we used PiVR to measure the total time between movement in the field of view of the camera and the LED being turned on. The standard tracking software was modified to operate according to the following rules: (1) turn the LED “ON” at the 50th frame (and multiples thereof); (2) compare the pixel intensity of a fixed region of interest of the image to the previous image, and if the value is above a threshold, count the frame as a “flash” and turn the LED OFF again; (3) if the LED was turned OFF in the previous frame, turn it ON again. The complete code can be found on the PiVR GitLab repository on the branch “LED_flash_test.” This paradigm allowed us to quantify the maximal latency between animal behavior in the arena and the update of the LED. We found that the LED was always turned ON while the next image was being collected, as the time to detection depended on the location of the monitored region within the frame (S1E and S1G Fig). Therefore, the maximal latency is <30 milliseconds.
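
The three rules of the modified tracking software can be summarized in a single per-frame update, sketched below. This is a schematic reconstruction of the logic described above, not the code on the “LED_flash_test” branch; the region of interest, the threshold, and the state bookkeeping are illustrative assumptions.

```python
# Schematic reconstruction of the LED flash test rules; ROI, threshold, and
# state handling are assumptions, not the actual "LED_flash_test" code.
import numpy as np


def flash_test_step(frame_idx, curr_frame, prev_frame, state,
                    roi=(slice(0, 20), slice(0, 20)), threshold=30.0):
    """Apply the three flash-test rules to one frame; returns True on a flash."""
    flash = False
    if frame_idx % 50 == 0:                      # (1) LED ON at every 50th frame
        state["led_on"] = True
    elif state.get("just_off", False):           # (3) LED OFF last frame -> ON again
        state["led_on"], state["just_off"] = True, False
    if prev_frame is not None:                   # (2) detect the flash in a fixed ROI
        diff = curr_frame[roi].astype(float) - prev_frame[roi].astype(float)
        if diff.mean() > threshold:
            flash = True
            state["led_on"], state["just_off"] = False, True
    return flash
```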

Video recording performance of PiVR

When PiVR is used as a video recorder, the resolution (px) limits the maximal frame rate (fps): at 640 × 480 px, the frame rate can be set up to 90 fps, at 1,296 × 972 up to 42 fps, at 1,920 × 1,080 up to 30 fps, and at 2,592 × 1,944 a frame rate up to 15 fps can be used. PiVR is compatible with a wide variety of M12 lenses, allowing for high-quality video recordings, depending on the experimental needs (S8 Movie).
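
For reference, open-loop recording at one of the listed resolution/frame-rate combinations can be scripted directly with the standard picamera library on the Raspberry Pi, as sketched below. This is a generic picamera example, not PiVR's own recording module; the file name and duration are arbitrary.

```python
# Generic picamera example (not PiVR's recording module): record 60 seconds of
# H.264 video at 640 x 480 px and 90 fps.
import picamera

with picamera.PiCamera(resolution=(640, 480), framerate=90) as camera:
    camera.start_recording("experiment.h264")
    camera.wait_recording(60)   # keep recording (and surface errors) for 60 s
    camera.stop_recording()
```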

Fruit fly larval experiments

For the experiments using fruit fly larvae (Fig 2), animals were raised on standard cornmeal medium at 22°C on a 12-hour day/night cycle. Third instar larvae were placed in 15% sucrose for 20–120 minutes prior to the experiment. Experiments were conducted on 2% agarose (Genesee, 20–102). In the experiments described in Fig 2, the arena was a 100-mm-diameter petri dish (Fisher Scientific, FB0875712).

For the experiments featuring a real-odor gradient, IAA (Sigma Aldrich, 306967–100 ML) was diluted in paraffin oil (Sigma Aldrich, 18512-1L) to produce a 1 M solution. A single source of the odor dilution was tested in larvae with the following genotype: w;Or42a-Gal4;UAS-Orco,Orco-/-. The control consisted of solvent (paraffin oil) devoid of odor. For the virtual reality experiments, the following genotype was used: w;Or42a-Gal4,UAS-CsChrimson;UAS-Orco,Orco-/-. In Fig 2, the same genotype was used in controls, but the tests were conducted without any light stimulations. Larvae expressing CsChrimson were grown in complete darkness in 0.5 M all-trans retinal (R2500, MilliporeSigma, MO, USA).

Adult fruit fly experiments

Male flies with the Gr66a-Gal4 transgene (Bloomington stock number: 57670) [60] were crossed to virgin females carrying 20xUAS-CsChrimson-mVenus [26] integrated into the attP40 landing site. The flies were grown in complete darkness on standard cornmeal medium with 0.5M all-trans retinal at 25°C. Female flies between 1 and 7 days after eclosion were selected after putting the vial on ice for a few seconds. The experiment was conducted in a 100-mm-diameter petri dish (Fisher Scientific, FB0875712) under a white-light condition.

Zebrafish larva experiments

In the experiments shown in Fig 4, we used AB casper [61] as parents. Only pigmented larvae were used for the experiments. The larvae were reared at 28.5°C and a 14:10 light cycle. The experiments were run at 26°C. The arena was a 100-mm petri dish (Fisherbrand, 08-757-12PK). All ambient light was blocked. The maximum white-light intensity provided by PiVR was measured to be approximately 6,800 Lux (Extech Instruments Light Meter 401025).

Data analysis and statistical procedures

All data analysis was performed using Python. The scripts used to create the plots shown in the figures (including all the data necessary to recreate the plots) can be found at https://doi.org/10.25349/D9ZK50. Generally, data sets were tested for normal distribution (Lilliefors test) and for homogeneity of variance (Levene’s test) [62]. Depending on the result, the parametric t test or the nonparametric Mann–Whitney U rank-sum test was used. To compare multiple groups, either the Bonferroni correction was applied to the pairwise comparisons or Dunn’s test was used [62]. Below, the analysis and the statistical tests applied throughout the manuscript are addressed separately for each figure. To estimate the peak movement speed of different animals, the median of the 90th percentile of maximum speed per experiment was calculated.
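
The decision rule described above (normality and equal-variance checks followed by a parametric or nonparametric comparison) can be expressed compactly, as sketched below using scipy and statsmodels. This illustrates the logic only; it is not the analysis code bundled with the data package, and the significance level is an assumption.

```python
# Minimal sketch of the test-selection logic (not the bundled analysis code).
import numpy as np
from scipy import stats
from statsmodels.stats.diagnostic import lilliefors


def compare_two_groups(a, b, alpha=0.05):
    """Pick a t test or Mann-Whitney U test based on Lilliefors and Levene checks."""
    a = np.asarray(a, dtype=float)
    b = np.asarray(b, dtype=float)
    normal = lilliefors(a)[1] > alpha and lilliefors(b)[1] > alpha
    equal_var = stats.levene(a, b)[1] > alpha
    if normal and equal_var:
        return "t test", stats.ttest_ind(a, b)[1]
    return "Mann-Whitney U", stats.mannwhitneyu(a, b, alternative="two-sided")[1]
```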

Data analysis of Fig 2

To calculate movement speed, the x and y coordinates were first filtered using a triangular rolling filter with a window size equal to the frame rate (30 fps) divided by the high bound on the speed of the animal (1 mm/s) times the pixel-per-millimeter value of the experiment. Depending on the exact distance between the camera and the arena (the pixel-per-millimeter value), the window size of the filter was typically 0.3 seconds. Speed was calculated using the Euclidean distance. The time series of the speed was smoothed using a triangular rolling filter with a window size of 1 second. To calculate the sensory experience, the stimulus intensity time course was filtered using a boxcar rolling filter with a window size of 1 second (Fig 2C). To calculate the distance to source for the IAA gradient, the source location was manually defined using the PiVR software. In the Gaussian virtual-odor gradient, the coordinates with the maximum intensity value were defined as the source. In the volcano-shaped virtual gradient, the point of the highest-intensity circle nearest to the animal was defined as the source (Fig 2F). At 4 minutes into the experiment, the distance to source was compared between the experimental and control conditions. As the data were not normally distributed (Lilliefors test), the Mann–Whitney U test was used (S7A–S7C Fig).
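
The smoothing and speed computation described above can be sketched as follows with numpy and pandas; the window arithmetic follows the text, but the default pixel-per-millimeter value and the handling of edge frames are assumptions made for illustration, not the bundled analysis script.

```python
# Minimal sketch of the speed computation: triangular smoothing of the
# coordinates, Euclidean step length, then smoothing of the speed trace.
# Default calibration values are illustrative assumptions.
import numpy as np
import pandas as pd


def movement_speed(x, y, frame_rate=30, px_per_mm=4.0, speed_bound_mm_s=1.0):
    """Return a smoothed speed trace (mm/s) from raw pixel coordinates."""
    # Window ~ number of frames needed to move one pixel at the speed bound.
    window = max(3, int(round(frame_rate / (speed_bound_mm_s * px_per_mm))))
    xs = pd.Series(x).rolling(window, center=True, win_type="triang",
                              min_periods=1).mean()
    ys = pd.Series(y).rolling(window, center=True, win_type="triang",
                              min_periods=1).mean()
    step_mm = np.hypot(np.diff(xs), np.diff(ys)) / px_per_mm   # mm per frame
    speed = pd.Series(step_mm * frame_rate)                    # mm per second
    return speed.rolling(frame_rate, center=True, win_type="triang",
                         min_periods=1).mean()
```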

Data analysis of Fig 3

Each experiment lasted 5 minutes. The preference index for each animal was calculated by subtracting the time spent by an animal in the squares without light from the time spent in the squares with light (squares eliciting virtual bitter taste). This difference was then divided by the total experimental time (Fig 3C). The Mann–Whitney U test was used to compare preference between genotypes because the distribution of the preference indices was not normal (Lilliefors test). Speed was calculated as described for Fig 2: first, the x and y coordinates were smoothed using a triangular rolling filter with a window size equal to the frame rate (30 fps) divided by the approximate walking speed of flies (approximately 5 mm/s) times the pixel-per-millimeter ratio of the experiment, which was usually 0.06–0.09 seconds. Locomotion speed was calculated using the Euclidean distance. The time series of the speed itself was filtered using a triangular rolling filter with a window size equal to the frame rate (30 fps). To test for statistical differences in speed between genotypes and between the light-ON and light-OFF conditions, Dunn’s multiple comparison test was implemented through the scikit-posthocs library of Python.
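
Expressed in code, the preference index reduces to the formula given in the legend of Fig 3, PI = (T_ON − T_OFF) / (T_ON + T_OFF). The sketch below assumes a per-frame boolean indicating whether the fly was on a lit square; variable names are illustrative.

```python
# Minimal sketch of the preference index PI = (T_ON - T_OFF) / (T_ON + T_OFF);
# the per-frame boolean input is an illustrative convention.
import numpy as np


def preference_index(on_lit_square, frame_rate=30.0):
    """Compute PI from a boolean trace marking frames spent on lit squares."""
    on_lit = np.asarray(on_lit_square, dtype=bool)
    t_on = np.count_nonzero(on_lit) / frame_rate      # seconds on lit squares
    t_off = np.count_nonzero(~on_lit) / frame_rate    # seconds on dark squares
    return (t_on - t_off) / (t_on + t_off)
```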

Data analysis of Fig 4

Each experiment lasted 5 minutes. Each trial started with the animal facing the virtual-light source. To focus the analysis on animals that demonstrated significant movement during the experiment, the final data set was based on trials fulfilling the following criteria: (1) an animal had to move at least 20 mm over the course of the experiment, (2) only movements above 1 mm per frame were recorded (due to camera/detection noise, the centroid in resting animals can move), and (3) the animal must have been tracked for at least 3 out of 5 minutes.

Locomotion speed was calculated by first smoothing the x and y centroid coordinates with a half-triangular rolling filter with a window size of 1 second. The speed was calculated using the Euclidean distance and filtered again using a 0.3-second triangular rolling filter. Swim bouts were identified from the time course of the movement speed by using the “find_peaks” function of the scipy library [63] with the following parameters: a minimum speed of 2.5 mm/s and a minimum time between two consecutive peaks (bouts) of five frames (0.165 seconds) (Fig 4C). The same filter (half-triangular rolling filter with a window size of 1 second) was applied to calculate the distance to source, defined as the distance to the position of maximum intensity in the virtual reality landscape. To calculate the distance to source for the controls, the same virtual reality was assigned relative to the animal’s starting position, and the distance to this simulated landscape was calculated (Fig 4E). In S8A Fig, the distance to source was compared across experimental conditions at 4 minutes into the experiment. We used the Mann–Whitney U test to compare the distance to source because the variance between samples could not be assumed to be equal (Levene’s test) (S8A Fig). Bouts (see above) were then used to discretize the trajectory. The reorientation angle θ was calculated for each pair of consecutive bouts by comparing the animal’s coordinates 0.66 seconds (the approximate duration of a bout) before and after the local peak in locomotor speed. To filter out occasional (<10%) misdetections of the animal’s reflection when it was located near the wall of the petri dish, we only considered bouts with a reorientation angle θ smaller than 135° (Fig 4F). The light intensity experienced by the animal was used to bin the reorientation angles into the four groups shown in Fig 4G and 4H. To compare the control versus the experimental condition for the four light intensity bins, Student’s t test was used after checking for normality (Lilliefors test) and for equality of variance (Levene’s test). Bonferroni correction was used to correct for multiple comparisons (Fig 4G).
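
The bout-detection step maps directly onto scipy.signal.find_peaks with the parameters quoted above, as sketched below; the trace is assumed to be already smoothed as described.

```python
# Minimal sketch of swim-bout detection with scipy.signal.find_peaks, using
# the stated minimum peak speed (2.5 mm/s) and minimum separation (5 frames).
import numpy as np
from scipy.signal import find_peaks


def detect_bouts(speed_mm_s, min_speed=2.5, min_separation_frames=5):
    """Return the frame indices of swim-bout peaks in a smoothed speed trace."""
    peaks, _ = find_peaks(np.asarray(speed_mm_s, dtype=float),
                          height=min_speed, distance=min_separation_frames)
    return peaks
```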

To compare reorientation (turn) angles θ associated with positive and negative visual sensory experiences, the change in light intensity experienced during the previous bout (ΔI) was used to split the experimental data into two groups: ΔI > 0 (up-gradient) and ΔI < 0 (down-gradient). The data were binned according to stimulus intensity as in Fig 4G. To compare turn angles in the two groups, Student’s t test for paired samples was used after checking for normality (Lilliefors test) and for equality of variance (Levene’s test). Bonferroni correction was used to adjust for multiple comparisons (Fig 4H).
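
The grouping by the sign of ΔI amounts to two boolean masks over the per-bout arrays, as in the short sketch below (written for illustration, with assumed array-per-bout inputs).

```python
# Minimal sketch of splitting per-bout turn angles by the sign of the change
# in light intensity (delta_I) experienced during the previous bout.
import numpy as np


def split_by_gradient_direction(turn_angles, delta_I):
    """Return turn angles for up-gradient (dI > 0) and down-gradient (dI < 0) bouts."""
    turn_angles = np.asarray(turn_angles, dtype=float)
    delta_I = np.asarray(delta_I, dtype=float)
    return turn_angles[delta_I > 0], turn_angles[delta_I < 0]
```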

The turn index β was calculated based on the angular difference α between the heading of the animal at a given bout and the bearing with respect to the white-light source. This angular difference α indicates whether the animal approaches the source (angles near 0°) or swims away from it (angles near 180°). We defined 90° as the boundary between swimming toward and swimming away from the source. The turn index β was then defined as the number of bouts directed toward the source minus the number of bouts directed away from the source, normalized by the total number of bouts. A turn index was calculated for each trial. We compared each group using the Mann–Whitney U test because the data were not normally distributed (Lilliefors test). Bonferroni correction was used to adjust for multiple testing (Fig 4J). To bin the reorientation (turn) angle θ according to the distance to source, the distance was calculated for each bout and binned into four groups according to S8C Fig. We compared each group using the Mann–Whitney U test because the data were not always normally distributed (Lilliefors test). Bonferroni correction was used to adjust for multiple comparisons.
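
In code, the turn index corresponds to β = (N_toward − N_away) / N_total, with a bout counted as directed toward the source when α < 90°. The sketch below assumes one α value (in degrees) per bout and is written for illustration only.

```python
# Minimal sketch of the turn index beta = (N_toward - N_away) / N_total, with
# alpha (degrees) the per-bout angle between heading and bearing to the source.
import numpy as np


def turn_index(alpha_deg):
    """Positive values indicate a bias of swim bouts toward the light source."""
    alpha = np.abs(np.asarray(alpha_deg, dtype=float))
    toward = np.count_nonzero(alpha < 90.0)
    away = np.count_nonzero(alpha > 90.0)
    return (toward - away) / alpha.size
```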

Supporting information

S1 Fig. Timing performance of PiVR.

(A) Illustration of the three parameters measured to estimate overall loop time and latency. (B) Dependence of image acquisition time on the infrared background illumination strength. (Ci) To measure image processing and VR computation time, the non-real-time Python 3.5 time.time() function is used. At 50, 60, and 70 fps, some frames (6/5,598, 25/7,198, and 16/8,398, respectively) take longer than the period of the frame rate, which leads to dropped frames. (Cii) To confirm these measurements, we also recorded timestamps of the images assigned by the real-time clock of the GPU. (D) To estimate the software-to-hardware latency during a full update cycle of the tracker, we measured the time between the GPIO pin being instructed to turn ON and the GPIO pin reporting being turned ON. (E) Cameras with a rolling shutter take images by reading out lines of pixels from top to bottom (E, left). This can be illustrated in a simple x/y plot (E, right). To estimate the maximum latency between the animal position and the update of the LED intensity, we used an LED flash paradigm while PiVR was tracking a dummy object at 70 Hz. Our latency results depend on the ROI associated with the location of the dummy object. When PiVR tracks an ROI located at the top of the frame (Fi, left), it detects the LED only two frames later (Fii, left). By contrast, when PiVR tracks an ROI located at the bottom of the frame (Fi, right), it detects the LED in the next possible frame (Fii, right). If PiVR tracks an ROI in the center of the image (Fi, center), it either detects the LED during the next frame or two frames later. We conclude that the LED is being turned ON while the next image is being formed (1.34 milliseconds). For a full LED ON–LED OFF–LED ON sequence, we find that there are three frames between the LED being turned ON when PiVR tracks the top of the image (Fiii, left), whereas it takes two frames when PiVR tracks the bottom of the image (Fiii, right). Because the time between two light flashes contains the frame during which the LED is turned OFF, the image acquisition and processing time corresponds to one or two frames. This is summarized in the timeline of panel G. Taken together, this shows that the time necessary to detect the movement of an animal and to update the LED intensity amounts to a maximum of two frames plus the time necessary to take the picture, which corresponds to a maximum of 30 milliseconds at a frame rate of 70 Hz. All data used to create these plots are available from https://doi.org/10.25349/D9ZK50. GPIO, general-purpose input/output; GPU, graphical processing unit; LED, light-emitting diode; PiVR, Raspberry Pi Virtual Reality; ROI, region of interest; VR, virtual reality.

(TIF)
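The per-frame timing measurement summarized in panel Ci can be reproduced with a simple loop that timestamps every closed-loop iteration and flags frames exceeding the frame period. The sketch below is a generic illustration, assuming a placeholder process_frame() function that stands in for image acquisition, tracking, and virtual-reality update; it is not the PiVR implementation itself.

import time

def count_dropped_frames(process_frame, n_frames=7200, fps=70):
    # Time each closed-loop iteration with time.time() (non-real-time clock,
    # as in S1 Fig Ci) and count frames whose processing exceeds the frame
    # period, i.e., dropped frames.
    period = 1.0 / fps
    dropped = 0
    durations = []
    for _ in range(n_frames):
        t0 = time.time()
        process_frame()
        dt = time.time() - t0
        durations.append(dt)
        if dt > period:
            dropped += 1
    return dropped, durations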

S2 Fig. Modularity of PiVR (hardware).

(A) PiVR is highly modular. The standard version (shown in Fig 1A) can easily be adapted to allow for side (or top) illumination using the same LED controller system. If high light intensities are needed, a high-power LED controller can be used in combination with high-power LEDs. Panel B shows an example of a side illuminator. Side illumination increases contrast on the surface of the animal. (Bi) This side illuminator was used to collect videos with sufficient detail for idtracker.ai [25] to track 10 fruit fly larvae while retaining their identity. (C) The standard PiVR illuminator consists of at least two different 12-V LED strips: one for background illumination (850 nm) and another color (here, 625 nm) to stimulate the optogenetic tool of choice. (D) The high-power stimulation arena uses an LED controller that keeps the current to the high-power LEDs constant and can drive a maximum of 12 high-power LEDs. (Ci and Di) The intensity of the stimulation light was measured using a photodiode (Thorlabs Inc., S130VC). Each pixel is 2 cm². The black circles indicate petri dishes used as behavioral arenas. LED, light-emitting diode; PiVR, Raspberry Pi Virtual Reality.

(TIF)

S3 Fig. Flow diagram of the automatic animal detection and background reconstruction.

(A) After placing the animal and pressing “start tracking” on the graphical user interface, the software grabs the first image. All images are immediately filtered using a Gaussian kernel with a sigma depending on the size of the animal (user defined) to reduce camera noise. (B) Then a second image is taken. The mean of the images taken so far is calculated. (C) The current image (in this example, the second image of the experiment) is then subtracted from the mean image shown in panel B. (D) The histogram of the subtracted image shows most gray-scale values to be 0. For the region where the animal has moved to since the first frame, the pixel values are negative (magenta). For the region that the animal has left since the first frame, the pixel values are positive (green). (E) The threshold value is calculated based on the histogram: it is the mean of the subtracted image minus 4 (an optimal value defined by trial and error). (F) The threshold value is used to binarize the subtracted image shown in panel C. If there is no blob, or more than one blob, with the minimal area (defined by the user in the animal parameters file), the loop restarts at step (B). (G) If there is exactly one blob, an area around the blob (defined by the user in the animal parameters file) is defined as the current region of interest. (H) The region of interest is the area where movement has been detected. The algorithm now restricts the search for the animal to this region. (I) The histogram of this small area of the first image (A) shows that the few pixels defining the animal are distinct from the background. (J) To find the optimal local threshold for binarizing the image, the threshold is adjusted if more than one blob is detected (top, green arrow). As soon as only one blob with the characteristics of the animal (defined by the user in the animal parameters file) has been detected, the local threshold value is set, and the shape of the identified animal in the first frame is saved (bottom). (K) Using the local threshold, each new image is binarized and then subtracted from the first image as shown in panel (J, bottom). If the identical blob that was detected in panel J (bottom) is found in any of the new subtracted binary images (cyan arrow), the animal is considered to have left its original position, and the algorithm continues. (L) The region occupied by the animal is then copied from the latest image and (M) pasted into the first image. (N) The resulting image does not contain the animal and is used as the background image for the tracking algorithm (S4 Fig). All data used to create these plots are available from https://doi.org/10.25349/D9ZK50.

(TIF)
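The movement-detection logic of S3 Fig (panels A–G) can be summarized in a few lines of Python. The sketch below uses NumPy and SciPy as stand-ins for the image-processing steps; the function name and parameter values (sigma, min_area) are placeholders rather than PiVR defaults, and the sketch covers only the detection step, not the subsequent background reconstruction.

import numpy as np
from scipy.ndimage import gaussian_filter, label

def detect_moved_animal(first_img, current_img, sigma=2.0, min_area=50):
    # Smooth both frames, subtract the current frame from the running mean,
    # threshold at (mean of the subtracted image - 4), and accept the result
    # only if exactly one blob of sufficient area is found (S3A-G Fig).
    f = gaussian_filter(first_img.astype(float), sigma)
    c = gaussian_filter(current_img.astype(float), sigma)
    mean_img = (f + c) / 2.0
    diff = mean_img - c                      # new animal position -> strongly negative (S3D Fig)
    threshold = diff.mean() - 4.0            # S3E Fig
    binary = diff < threshold
    labels, n_blobs = label(binary)
    if n_blobs != 1 or np.sum(labels == 1) < min_area:
        return None                          # restart the detection loop (S3F Fig)
    ys, xs = np.nonzero(labels == 1)
    return ys.min(), ys.max(), xs.min(), xs.max()   # bounding box used as ROI (S3G Fig)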

S4 Fig. Flow diagram of the animal tracking algorithm.

(A) At the start of the experiment, the ROI is defined during animal detection (S3G Fig). During the experiment, the current ROI is defined using the previous frame. The ROI of the current image (C) is then subtracted from the ROI of the background (B). The fact that the tracking algorithm only considers a subsample of the image is central to the temporal performance (short processing time) of PiVR. (D) In the resulting image, the animal clearly stands out relative to the background. (E) The histogram of the image indicates that whereas the background consists mostly of values around 0, the animal has pixel intensity values that are negative. (F) The threshold is defined as three standard deviations away from the mean. (G) This threshold is used to binarize the subtracted ROI. The largest blob with animal characteristics (defined by the animal parameters) is defined to be the animal. (H) The image of the detected animal is saved, and the next ROI is designated (defined by the animal parameters). All data used to create these plots are available from https://doi.org/10.25349/D9ZK50. PiVR, Raspberry Pi Virtual Reality; ROI, region of interest.

(TIF)
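The per-frame tracking step of S4 Fig can likewise be sketched with NumPy and SciPy. The function below mirrors the flow diagram (ROI subtraction, thresholding at three standard deviations from the mean, selection of the largest blob) but is an illustrative approximation, not the exact PiVR code; the function name and arguments are placeholders.

import numpy as np
from scipy.ndimage import label, center_of_mass

def track_in_roi(background_roi, current_roi):
    # Subtract the current ROI from the background ROI, threshold at three
    # standard deviations from the mean (S4E-F Fig), binarize, and keep the
    # largest blob as the animal (S4G Fig).
    diff = background_roi.astype(float) - current_roi.astype(float)
    threshold = diff.mean() - 3.0 * diff.std()
    binary = diff < threshold
    labels, n_blobs = label(binary)
    if n_blobs == 0:
        return None, None
    sizes = np.bincount(labels.ravel())[1:]   # blob areas, ignoring background label 0
    animal_label = int(np.argmax(sizes)) + 1
    animal_mask = labels == animal_label
    return animal_mask, center_of_mass(animal_mask)   # mask and (row, col) centroid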

S5 Fig. Head/tail classification using a Hungarian algorithm.

(A) During tracking, head/tail classification starts with the binarized image (S4G Fig). (B) The binary image is used to calculate the morphological skeleton, which in turn is used to identify the two endpoints, one of which must be the head and the other the tail. (C) The Euclidean distance between the tail position in the previous frame and each of the endpoints is calculated. If the tail was not defined in the previous frame, the centroid position is used instead. (D) Whichever endpoint has the smaller distance is defined as the tail (here v). The other endpoint is defined as the head.

(TIF)
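A compact version of the head/tail assignment described above is sketched below using scikit-image and SciPy: the binary animal is skeletonized, the two skeleton endpoints are extracted, and the endpoint closest to the previous tail (or to the centroid on the first frame) is labeled as the tail. Function and argument names are placeholders, and this is an approximation of the procedure in S5 Fig rather than the exact PiVR implementation.

import numpy as np
from scipy.ndimage import convolve
from skimage.morphology import skeletonize

def assign_head_tail(animal_mask, previous_tail):
    # Skeletonize the binary animal and find endpoints: skeleton pixels with
    # exactly one 8-connected skeleton neighbor (S5B Fig).
    skeleton = skeletonize(animal_mask.astype(bool))
    neighbors = convolve(skeleton.astype(int), np.ones((3, 3), dtype=int),
                         mode='constant') - skeleton.astype(int)
    endpoints = np.argwhere(skeleton & (neighbors == 1))
    if len(endpoints) != 2:
        return None, None                    # ambiguous skeleton, skip this frame
    # The endpoint closest to the previous tail position is the tail (S5C-D Fig).
    d = [np.linalg.norm(p - np.asarray(previous_tail, dtype=float)) for p in endpoints]
    tail, head = (endpoints[0], endpoints[1]) if d[0] < d[1] else (endpoints[1], endpoints[0])
    return head, tail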

S6 Fig. Capability of PiVR to track a wide variety of small animals.

PiVR is able to detect, track, and assign head and tail positions to a variety of invertebrate species with different body plans: (A) kelp fly, (B) jumping spider, (C) firefly, and (D) pill bug. PiVR, Raspberry Pi Virtual Reality.

(TIF)

S7 Fig. Distance to real- and virtual-odor sources in larval chemotaxis to real- and virtual-odor gradients.

(A) Distance to the real-odor source for larvae tested with isoamyl acetate (n = 30) and with the solvent alone (paraffin oil, n = 30), (B) distance to the peak of the Gaussian-shaped virtual-odor reality for experimental larvae (n = 31) and controls (n = 26), and (C) distance to the local maximum (rim of the volcano) for experimental larvae (n = 29) and controls (n = 26). Distances were measured 4 minutes into the experiment (Mann–Whitney U test, p < 0.001). All data used to create these plots are available from https://doi.org/10.25349/D9ZK50.

(TIF)

S8 Fig. Zebrafish larvae in a virtual white-light gradient.

(A) Distance to the virtual-light source for the control (black) and experimental (red) conditions at 4 minutes into the experiment. (B) Relationship between the turn angle θ and the distance to the virtual-light source (Mann–Whitney U test; different letters indicate p < 0.01; n = 11 and 13). All reported p-values are Bonferroni corrected. All data used to create these plots are available from https://doi.org/10.25349/D9ZK50.

(TIF)

S1 Movie. Illustration of homogeneous illumination creating a virtual checkerboard reality.

The behavior of a freely moving fly is shown in the petri dish (left) and in the virtual checkerboard recorded by PiVR (middle). The corresponding time course of the homogeneous illumination intensity is shown in the right panel. PiVR, Raspberry Pi Virtual Reality.

(MP4)

S2 Movie. Sample trajectory of a Drosophila larva expressing Orco only in the Or42a olfactory sensory neuron behaving in a quasistatic isoamyl acetate gradient.

The odor gradient was reconstructed for visualization and an estimation of the experienced odor intensity as described before [33] (see also Methods section). Or42a, odorant receptor 42a.

(MP4)

S3 Movie. Illustrative trajectory of a Drosophila larva expressing the optogenetic tool CsChrimson in the Or42a olfactory sensory neuron behaving in a Gaussian-shaped virtual-odor reality.

Or42a, odorant receptor 42a.

(MP4)

S4 Movie. Illustrative trajectory of a Drosophila larva expressing the optogenetic tool CsChrimson in the Or42a olfactory sensory neuron in a volcano-shaped virtual-odor reality.

Or42a, odorant receptor 42a.

(MP4)

S5 Movie. Illustrative trajectory of an adult Drosophila expressing the optogenetic tool CsChrimson in the Gr66a bitter-sensing neurons in a checkerboard-shaped virtual gustatory reality.

Gr66a, gustatory receptor 66a.

(MP4)

S6 Movie. Illustrative trajectory of a zebrafish larva in a virtual visual reality.

(MP4)

S7 Movie. Time-lapse video illustrating the production and assembly of a PiVR setup ending with the start of an experiment.

For a detailed step-by-step protocol, please visit www.pivr.org. PiVR, Raspberry Pi Virtual Reality.

(MP4)

S8 Movie. Video sequences of an adult fly in a dish recorded with PiVR outfitted with three different lenses.

Comparison of the image quality between the standard 3.6-mm, F1.8 lens that comes with the Raspberry Pi camera; a 6-mm, F1.8 lens (US$25, PT-0618MP); and a higher-magnification 16-mm, F1.8 lens (US$21, PT-1618MP). PiVR, Raspberry Pi Virtual Reality.

(MP4)

S1 Table. Bill of materials for the standard version of PiVR.

PiVR, Raspberry Pi Virtual Reality.

(CSV)

S2 Table. Comparative analysis of prevalent tracking systems used for behavioral analysis.

(1) Optogenetic activation in closed-loop experiments. FreemoVR and Stytra are designed to present complex 3D and 2D visual stimuli, respectively. (2) Maximal frequency of the closed-loop stimulus. (3) Maximal time between an action of the tracked animal and the update of the hardware presenting the virtual reality. (4) Most tracking algorithms monitor the position of the centroid. Some tools will automatically detect other features of the tracked animal. (5) Some tracking algorithms are designed to specifically identify animals with a stereotypic shape, size, and movement. Other tracking algorithms are more flexible. (6) Assessed based on published information. (7) Bonsai was used in the optoPAD system, which uses optogenetics [53]. The information presented in this comparative table relies on the following publications: FlyPi [58]; Ethoscope [23]; FreemoVR [11]; Bonsai [57]; PiVR, present manuscript. PiVR, Raspberry Pi Virtual Reality.

(PDF)

S1 HTML. Copy of the www.pivr.org website, which includes information on how to build a standard PiVR setup.

PiVR, Raspberry Pi Virtual Reality.

(ZIP)

Acknowledgments

We thank Ellie Heckscher, Primoz Ravbar, and Andrew Straw for comments on the manuscript. We are grateful to Ajinkya Deogade for work performed during the initial phase of the project (development of a do-it-yourself open-loop tracker based on FlyPi), for creating a preliminary draft of 3D-printed parts, for measuring the white-light intensities used in Fig 4, and for discussions. We thank Tanya Tabachnik for technical advice about the assay construction. We are grateful to Stella Glasauer for collecting the kelp fly tracked in S6A Fig, for taking the picture of the spider in S6B Fig, for designing the PiVR logo, and for comments on the manuscript. We are indebted to Tyler Sizemore for collecting the firefly and pill bug tracked in S6 Fig. We thank Igor Siwanowicz for providing the pictures of the firefly and pill bug of S6 Fig. We are grateful to Minoru Koyama and Jared Rouchard for rearing and providing the fish used in Fig 4. The development and optimization of PiVR greatly benefited from feedback received during the Drosophila Neurobiology: Genes, Circuits & Behavior Summer School in 2017, 2018, and 2019 at the Cold Spring Harbor Laboratory.

Abbreviations

CCD, charge-coupled device

dpf, days postfertilization

Gr66a, gustatory receptor 66a

IAA, isoamyl acetate

LED, light-emitting diode

Or42a, odorant receptor 42a

OSN, olfactory sensory neuron

PiVR, Raspberry Pi Virtual Reality

Data Availability

All data files and scripts are available from the Dryad database (accession DOI: https://doi.org/10.25349/D9ZK50).

Funding Statement

This work was funded by the National Institutes of Health (R01-NS113048-01) and by the University of California, Santa Barbara (startup funds). This work was also supported by the National Science Foundation under Grant No. NSF PHY-1748958, Grant No. IOS-1523125, NIH Grant No. R25GM067110, and the Gordon and Betty Moore Foundation Grant No. 2919.01. The funders had no role in study design, data collection and analysis, decision to publish, or preparation of the manuscript.

References

  • 1.Simpson JH, Looger LL. Functional imaging and optogenetics in Drosophila. Genetics. 2018;208(4):1291–309. 10.1534/genetics.117.300228 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 2.Venken KJ, Simpson JH, Bellen HJ. Genetic manipulation of genes and cells in the nervous system of the fruit fly. Neuron. 2011;72(2):202–30. 10.1016/j.neuron.2011.09.021 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 3.Kristan WB. Neuronal decision-making circuits. Current Biology. 2008;18(19):R928–R32. 10.1016/j.cub.2008.07.081 [DOI] [PubMed] [Google Scholar]
  • 4.Olsen SR, Wilson RI. Cracking neural circuits in a tiny brain: new approaches for understanding the neural circuitry of Drosophila. Trends in neurosciences. 2008;31(10):512–20. 10.1016/j.tins.2008.07.006 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 5.Gordus A, Pokala N, Levy S, Flavell SW, Bargmann CI. Feedback from network states generates variability in a probabilistic olfactory circuit. Cell. 2015;161(2):215–27. 10.1016/j.cell.2015.02.018 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 6.Inagaki HK, Jung Y, Hoopfer ED, Wong AM, Mishra N, Lin JY, et al. Optogenetic control of Drosophila using a red-shifted channelrhodopsin reveals experience-dependent influences on courtship. Nature methods. 2014;11(3):325 10.1038/nmeth.2765 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 7.Mueller JM, Ravbar P, Simpson JH, Carlson JM. Drosophila melanogaster grooming possesses syntax with distinct rules at different temporal scales. PLoS Comput Biol. 2019;15(6):e1007105 10.1371/journal.pcbi.1007105 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 8.Dombeck DA, Reiser MB. Real neuroscience in virtual worlds. Current opinion in neurobiology. 2012;22(1):3–10. 10.1016/j.conb.2011.10.015 [DOI] [PubMed] [Google Scholar]
  • 9.von Holst E, Mittelstaedt H. The principle of reafference: Interactions between the central nervous system and the peripheral organs. Perceptual processing: Stimulus equivalence and pattern recognition. 1971:41–72. [Google Scholar]
  • 10.Srinivasan M, Zhang S, Lehrer M, Collett T. Honeybee navigation en route to the goal: visual flight control and odometry. Journal of Experimental Biology. 1996;199(1):237–44. [DOI] [PubMed] [Google Scholar]
  • 11.Stowers JR, Hofbauer M, Bastien R, Griessner J, Higgins P, Farooqui S, et al. Virtual reality for freely moving animals. Nature methods. 2017;14(10):995 10.1038/nmeth.4399 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 12.Haberkern H, Basnak MA, Ahanonu B, Schauder D, Cohen JD, Bolstad M, et al. Visually guided behavior and optogenetically induced learning in head-fixed flies exploring a virtual landscape. Current Biology. 2019;29(10):1647–59.e8. 10.1016/j.cub.2019.04.033 [DOI] [PubMed] [Google Scholar]
  • 13.Dombeck DA, Harvey CD, Tian L, Looger LL, Tank DW. Functional imaging of hippocampal place cells at cellular resolution during virtual navigation. Nature neuroscience. 2010;13(11):1433 10.1038/nn.2648 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 14.Bianco IH, Engert F. Visuomotor transformations underlying hunting behavior in zebrafish. Current biology. 2015;25(7):831–46. 10.1016/j.cub.2015.01.042 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 15.Ahrens MB, Li JM, Orger MB, Robson DN, Schier AF, Engert F, et al. Brain-wide neuronal dynamics during motor adaptation in zebrafish. Nature. 2012;485(7399):471–7. 10.1038/nature11057 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 16.Ahrens MB, Huang K-H, Narayan S, Mensh BD, Engert F. Two-photon calcium imaging during fictive navigation in virtual environments. Frontiers in neural circuits. 2013;7:104 10.3389/fncir.2013.00104 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 17.Sofroniew NJ, Cohen JD, Lee AK, Svoboda K. Natural whisker-guided behavior by head-fixed mice in tactile virtual reality. Journal of Neuroscience. 2014;34(29):9537–50. 10.1523/JNEUROSCI.0712-14.2014 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 18.Schulze A, Gomez-Marin A, Rajendran VG, Lott G, Musy M, Ahammad P, et al. Dynamical feature extraction at the sensory periphery guides chemotaxis. Elife. 2015;4:e06694. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 19.Kocabas A, Shen C-H, Guo ZV, Ramanathan S. Controlling interneuron activity in Caenorhabditis elegans to evoke chemotactic behaviour. Nature. 2012;490(7419):273 10.1038/nature11431 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 20.Reiser MB, Dickinson MH. A modular display system for insect behavioral neuroscience. Journal of neuroscience methods. 2008;167(2):127–39. 10.1016/j.jneumeth.2007.07.019 [DOI] [PubMed] [Google Scholar]
  • 21.Seelig JD, Chiappe ME, Lott GK, Dutta A, Osborne JE, Reiser MB, et al. Two-photon calcium imaging from head-fixed Drosophila during optomotor walking behavior. Nature methods. 2010;7(7):535 10.1038/nmeth.1468 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 22.Štih V, Petrucco L, Kist AM, Portugues R. Stytra: an open-source, integrated system for stimulation, tracking and closed-loop behavioral experiments. PLoS Comput Biol. 2019;15(4):e1006699 10.1371/journal.pcbi.1006699 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 23.Geissmann Q, Rodriguez LG, Beckwith EJ, French AS, Jamasb AR, Gilestro GF. Ethoscopes: An open platform for high-throughput ethomics. PLoS Biol. 2017;15(10):e2003026 10.1371/journal.pbio.2003026 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 24.Branson K, Robie AA, Bender J, Perona P, Dickinson MH. High-throughput ethomics in large groups of Drosophila. Nature methods. 2009;6(6):451 10.1038/nmeth.1328 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 25.Romero-Ferrero F, Bergomi MG, Hinz RC, Heras FJ, de Polavieja GG. idtracker. ai: tracking all individuals in small or large collectives of unmarked animals. Nature methods. 2019;16(2):179 10.1038/s41592-018-0295-5 [DOI] [PubMed] [Google Scholar]
  • 26.Klapoetke NC, Murata Y, Kim SS, Pulver SR, Birdsey-Benson A, Cho YK, et al. Independent optical excitation of distinct neural populations. Nature methods. 2014;11(3):338 10.1038/nmeth.2836 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 27.Baden T, Chagas AM, Gage G, Marzullo T, Prieto-Godino LL, Euler T. Open Labware: 3-D printing your own lab equipment. PLoS Biol. 2015;13(3):e1002086 10.1371/journal.pbio.1002086 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 28.Gershow M, Berck M, Mathew D, Luo L, Kane EA, Carlson JR, et al. Controlling airborne cues to study small animal navigation. Nature methods. 2012;9(3):290 10.1038/nmeth.1853 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 29.Gomez-Marin A, Stephens GJ, Louis M. Active sampling and decision making in Drosophila chemotaxis. Nat Commun. 2011;2:441 Epub 2011/08/25. 10.1038/ncomms1455 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 30.Gomez-Marin A, Louis M. Active sensation during orientation behavior in the Drosophila larva: more sense than luck. Current opinion in neurobiology. 2012;22(2):208–15. 10.1016/j.conb.2011.11.008 [DOI] [PubMed] [Google Scholar]
  • 31.Fishilevich E, Domingos AI, Asahina K, Naef F, Vosshall LB, Louis M. Chemotaxis behavior mediated by single larval olfactory neurons in Drosophila. Curr Biol. 2005;15(23):2086–96. Epub 2005/12/08. 10.1016/j.cub.2005.11.016 . [DOI] [PubMed] [Google Scholar]
  • 32.Kreher SA, Kwon JY, Carlson JR. The molecular basis of odor coding in the Drosophila larva. Neuron. 2005;46(3):445–56. 10.1016/j.neuron.2005.04.007 [DOI] [PubMed] [Google Scholar]
  • 33.Louis M, Huber T, Benton R, Sakmar TP, Vosshall LB. Bilateral olfactory sensory input enhances chemotaxis behavior. Nat Neurosci. 2008;11(2):187–99. Epub 2007/12/25. 10.1038/nn2031 . [DOI] [PubMed] [Google Scholar]
  • 34.Tastekin I, Khandelwal A, Tadres D, Fessner ND, Truman JW, Zlatic M, et al. Sensorimotor pathway controlling stopping behavior during chemotaxis in the Drosophila melanogaster larva. Elife. 2018;7:e38740 10.7554/eLife.38740 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 35.Humberg T-H, Bruegger P, Afonso B, Zlatic M, Truman JW, Gershow M, et al. Dedicated photoreceptor pathways in Drosophila larvae mediate navigation by processing either spatial or temporal cues. Nature communications. 2018;9(1):1260 10.1038/s41467-018-03520-5 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 36.Gomez-Marin A, Louis M. Multilevel control of run orientation in Drosophila larval chemotaxis. Frontiers in behavioral neuroscience. 2014;8:38 10.3389/fnbeh.2014.00038 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 37.Heckscher ES, Lockery SR, Doe CQ. Characterization of Drosophila larval crawling at the level of organism, segment, and somatic body wall musculature. Journal of Neuroscience. 2012;32(36):12460–71. 10.1523/JNEUROSCI.0222-12.2012 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 38.Shao L, Saver M, Chung P, Ren Q, Lee T, Kent CF, et al. Dissection of the Drosophila neuropeptide F circuit using a high-throughput two-choice assay. Proceedings of the National Academy of Sciences. 2017;114(38):E8091–E9. 10.1073/pnas.1710552114 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 39.Moon SJ, Köttgen M, Jiao Y, Xu H, Montell C. A taste receptor required for the caffeine response in vivo. Current biology. 2006;16(18):1812–7. 10.1016/j.cub.2006.07.024 [DOI] [PubMed] [Google Scholar]
  • 40.Benhamou S, Bovet P. Distinguishing between elementary orientation mechanisms by means of path analysis. Animal Behaviour. 1992;43(3):371–7. [Google Scholar]
  • 41.Bell WJ, Tobin TR. Chemo‐orientation. Biological Reviews. 1982;57(2):219–60. [Google Scholar]
  • 42.Fraenkel GS, Gunn DL. The Orientation of Animals. New York: Dover Publications; 1961. [Google Scholar]
  • 43.Ahrens MB, Engert F. Large-scale imaging in small brains. Current opinion in neurobiology. 2015;32:78–86. 10.1016/j.conb.2015.01.007 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 44.Friedrich RW. Neuronal computations in the olfactory system of zebrafish. Annual review of neuroscience. 2013;36:383–402. 10.1146/annurev-neuro-062111-150504 [DOI] [PubMed] [Google Scholar]
  • 45.Burgess HA, Schoch H, Granato M. Distinct retinal pathways drive spatial orientation behaviors in zebrafish navigation. Current biology. 2010;20(4):381–6. 10.1016/j.cub.2010.01.022 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 46.Chen X, Engert F. Navigational strategies underlying phototaxis in larval zebrafish. Frontiers in systems neuroscience. 2014;8:39 10.3389/fnsys.2014.00039 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 47.Budick SA, O'Malley DM. Locomotor repertoire of the larval zebrafish: swimming, turning and prey capture. Journal of Experimental Biology. 2000;203(17):2565–79. [DOI] [PubMed] [Google Scholar]
  • 48.Kalueff AV, Gebhardt M, Stewart AM, Cachat JM, Brimmer M, Chawla JS, et al. Towards a comprehensive catalog of zebrafish behavior 1.0 and beyond. Zebrafish. 2013;10(1):70–86. 10.1089/zeb.2012.0861 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 49.Orger MB, de Polavieja GG. Zebrafish behavior: opportunities and challenges. Annual review of neuroscience. 2017;40:125–47. 10.1146/annurev-neuro-071714-033857 [DOI] [PubMed] [Google Scholar]
  • 50.Karpenko S, Wolf S, Lafaye J, Le Goc G, Panier T, Bormuth V, et al. From behavior to circuit modeling of light-seeking navigation in zebrafish larvae. eLife. 2020;9. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 51.Boyden ES, Zhang F, Bamberg E, Nagel G, Deisseroth K. Millisecond-timescale, genetically targeted optical control of neural activity. Nature neuroscience. 2005;8(9):1263 10.1038/nn1525 [DOI] [PubMed] [Google Scholar]
  • 52.Mauss AS, Busch C, Borst A. Optogenetic neuronal silencing in Drosophila during visual processing. Scientific reports. 2017;7(1):13823 10.1038/s41598-017-14076-7 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 53.Moreira J-M, Itskov PM, Goldschmidt D, Baltazar C, Steck K, Tastekin I, et al. optoPAD, a closed-loop optogenetics system to study the circuit basis of feeding behaviors. Elife. 2019;8:e43924 10.7554/eLife.43924 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 54.Karpenko S, Wolf S, Lafaye J, Le Goc G, Panier T, Bormuth V, Candelier R, Debrégeas G. From behavior to circuit modeling of light-seeking navigation in zebrafish larvae. Elife. 2020. January 2;9:e52882 10.7554/eLife.52882 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 55.Fernandes AM, Fero K, Arrenberg AB, Bergeron SA, Driever W, Burgess HA. Deep brain photoreceptors control light-seeking behavior in zebrafish larvae. Current Biology. 2012;22(21):2042–7. 10.1016/j.cub.2012.08.016 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 56.Horstick EJ, Bayleyen Y, Sinclair JL, Burgess HA. Search strategy is regulated by somatostatin signaling and deep brain photoreceptors in zebrafish. BMC biology. 2017;15(1):4 10.1186/s12915-016-0346-2 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 57.Lopes G, Bonacchi N, Frazão J, Neto JP, Atallah BV, Soares S, et al. Bonsai: an event-based framework for processing and controlling data streams. Frontiers in neuroinformatics. 2015;9:7 10.3389/fninf.2015.00007 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 58.Chagas AM, Prieto-Godino LL, Arrenberg AB, Baden T. The €100 lab: A 3D-printable open-source platform for fluorescence microscopy, optogenetics, and accurate temperature control during behaviour of zebrafish, Drosophila, and Caenorhabditis elegans. PLoS Biol. 2017;15(7):e2002702 10.1371/journal.pbio.2002702 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 59.Tadres D, Louis M. Data from PiVR: an affordable and versatile closed-loop platform to study unrestrained sensorimotor behavior. 2020. Dryad Digital Repository. Available from: 10.25349/D9ZK50. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 60.Kwon JY, Dahanukar A, Weiss LA, Carlson JR. Molecular and cellular organization of the taste system in the Drosophila larva. Journal of Neuroscience. 2011;31(43):15300–9. 10.1523/JNEUROSCI.3363-11.2011 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 61.White RM, Sessa A, Burke C, Bowman T, LeBlanc J, Ceol C, et al. Transparent adult zebrafish as a tool for in vivo transplantation analysis. Cell stem cell. 2008;2(2):183–9. 10.1016/j.stem.2007.11.002 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 62.Zar JH. Biostatistical analysis. Pearson Education India; 1999. [Google Scholar]
  • 63.Oliphant TE. Python for Scientific Computing. Computing in Science & Engineering. 2007;9(3):10–20. 10.1109/MCSE.2007.58 [DOI] [Google Scholar]

Decision Letter 0

Ines Alvarez-Garcia

19 Dec 2019

Dear Matthieu,

Thank you for submitting your manuscript entitled "PiVR: an affordable and versatile closed-loop platform to study unrestrained sensorimotor behavior" for consideration as a Methods and Resources by PLOS Biology.

Your manuscript has now been evaluated by the PLOS Biology editorial staff as well as by an academic editor with relevant expertise and I am writing to let you know that we would like to send your submission out for external peer review.

However, before we can send your manuscript to reviewers, we need you to complete your submission by providing the metadata that is required for full assessment. To this end, please login to Editorial Manager where you will find the paper in the 'Submissions Needing Revisions' folder on your homepage. Please click 'Revise Submission' from the Action Links and complete all additional questions in the submission questionnaire.

Please re-submit your manuscript within two working days, i.e. by Dec 23 2019 11:59PM.

Login to Editorial Manager here: https://www.editorialmanager.com/pbiology

During resubmission, you will be invited to opt-in to posting your pre-review manuscript as a bioRxiv preprint. Visit http://journals.plos.org/plosbiology/s/preprints for full details. If you consent to posting your current manuscript as a preprint, please upload a single Preprint PDF when you re-submit.

Once your full submission is complete, your paper will undergo a series of checks in preparation for peer review. Once your manuscript has passed all checks it will be sent out for review.

***Please be aware that, due to the voluntary nature of our reviewers and academic editors, manuscripts may be subject to delays due to their limited availability during the holiday season. Please also note that the journal office will be closed entirely 21st- 29th December inclusive, and 1st January 2020. Thank you for your patience.***

Feel free to email us at plosbiology@plos.org if you have any queries relating to your submission.

Kind regards,

Ines

--

Ines Alvarez-Garcia, PhD

Senior Editor

PLOS Biology

Carlyle House, Carlyle Road

Cambridge, CB4 3DN

+44 1223–442810

Decision Letter 1

Ines Alvarez-Garcia

31 Jan 2020

Dear Matthieu,

Thank you very much for submitting your manuscript "PiVR: an affordable and versatile closed-loop platform to study unrestrained sensorimotor behavior" for consideration as a Methods and Resources at PLOS Biology. Thank you also for your patience as we completed our editorial process, and please accept my apologies for the delay in providing you with our decision. Your manuscript has been evaluated by the PLOS Biology editors, an Academic Editor with relevant expertise, and by three independent reviewers.

As you will see, the reviewers feel that the system you have developed is novel and useful for the scientific community, however they also raise several issues that would need to be addressed before we can consider your manuscript further for publication. You should make a comprehensive comparison with other methods already available (such as Bonsai, Stytra or FlyPI) and state clearly the advantages and limitations of PiVR. In addition, you should streamline the text and follow the reviewers’ suggestions to improve the structure of the manuscript.

In light of the reviews (attached below), we are pleased to offer you the opportunity to address the [comments/remaining points] from the reviewers in a revised version that we anticipate should not take you very long. We will then assess your revised manuscript and your response to the reviewers' comments and we may consult the reviewers again.

We expect to receive your revised manuscript within 1 month.

Please email us (plosbiology@plos.org) if you have any questions or concerns, or would like to request an extension. At this stage, your manuscript remains formally under active consideration at our journal; please notify us by email if you do not intend to submit a revision so that we may end consideration of the manuscript at PLOS Biology.

**IMPORTANT - SUBMITTING YOUR REVISION**

Your revisions should address the specific points made by each reviewer. Please submit the following files along with your revised manuscript:

1. A 'Response to Reviewers' file - this should detail your responses to the editorial requests, present a point-by-point response to all of the reviewers' comments, and indicate the changes made to the manuscript.

*NOTE: In your point by point response to the reviewers, please provide the full context of each review. Do not selectively quote paragraphs or sentences to reply to. The entire set of reviewer comments should be present in full and each specific point should be responded to individually.

You should also cite any additional relevant literature that has been published since the original submission and mention any additional citations in your response.

2. In addition to a clean copy of the manuscript, please also upload a 'track-changes' version of your manuscript that specifies the edits made. This should be uploaded as a "Related" file type.

*Resubmission Checklist*

When you are ready to resubmit your revised manuscript, please refer to this resubmission checklist: https://plos.io/Biology_Checklist

To submit a revised version of your manuscript, please go to https://www.editorialmanager.com/pbiology/ and log in as an Author. Click the link labelled 'Submissions Needing Revision' where you will find your submission record.

Please make sure to read the following important policies and guidelines while preparing your revision:

*Published Peer Review*

Please note while forming your response, if your article is accepted, you may have the opportunity to make the peer review history publicly available. The record will include editor decision letters (with reviews) and your responses to reviewer comments. If eligible, we will contact you to opt in or out. Please see here for more details:

https://blogs.plos.org/plos/2019/05/plos-journals-now-open-for-published-peer-review/

*PLOS Data Policy*

Please note that as a condition of publication PLOS' data policy (http://journals.plos.org/plosbiology/s/data-availability) requires that you make available all data used to draw the conclusions arrived at in your manuscript. If you have not already done so, you must include any data used in your manuscript either in appropriate repositories, within the body of the manuscript, or as supporting information (N.B. this includes any numerical values that were used to generate graphs, histograms etc.). For an example see here: http://www.plosbiology.org/article/info%3Adoi%2F10.1371%2Fjournal.pbio.1001908#s5

*Protocols deposition*

To enhance the reproducibility of your results, we recommend that if applicable you deposit your laboratory protocols in protocols.io, where a protocol can be assigned its own identifier (DOI) such that it can be cited independently in the future. For instructions see: https://journals.plos.org/plosbiology/s/submission-guidelines#loc-materials-and-methods

Thank you again for your submission to our journal. We hope that our editorial process has been constructive thus far, and we welcome your feedback at any time. Please don't hesitate to contact us if you have any questions or comments.

Sincerely,

Ines

--

Ines Alvarez-Garcia, PhD

Senior Editor

PLOS Biology

Carlyle House, Carlyle Road

Cambridge, CB4 3DN

+44 1223–442810

-----------------------------------------------------

Reviewers’ comments

Rev. 1:

The present study by Tadres and Louis presents a useful and affordable open-source virtual reality setup based on the Raspberry Pi system. The authors present solid data using virtual chemotaxis and phototaxis experiments, confirming that their system works accurately for such purposes. I find this work valuable to the community, especially for the effort to design systems that can be implemented at low cost as well as for teaching platforms.

I have a few comments that I believe would improve the presentation and message of this manuscript and one concern regarding the proposed speed of the system.

First, I have my doubts that the overall closed-loop reaction time is correct. The shutter speeds and the image processing time were measured. However, due to the limitations of the bus, there is always a lag between image acquisition and the start of image processing. I have no experience with the Raspberry Pi system, but I have encountered these limitations in top-of-the-line computing systems. Thus, I would test this directly. My suggestion is to use their system at maximal acquisition speed, image a short LED flash, and use their computing platform to detect this flash and turn the same LED on for a second time. By looking at the number of frames between the two flashes, the authors could test the speed of the entire system. Although the sampling rate will be limited by the acquisition speed of the camera, the authors will be able to prove whether their proposed 20-ms processing time is accurate. This is important since the authors correctly claim that "the shorter the delay, the more authentic the virtual reality is perceived." I don't see issues with the experiments presented, but new users will be warned in case the system is slower than the speeds required for their purposes.

My second suggestion is regarding the current structure. The main claim of this paper is to have a VR system that is affordable and reliable, not the experimental results used to benchmark the system. Thus, I would remove the experimental sections from the discussion (Defining the nature of taste-driven responses in adult flies; Exploring the ability of zebrafish [larvae] to orient with minimal graded visual inputs; Exploring the ability of the fruit fly…) and merge them in a minimal form into the experimental results.

Finally, although I truly appreciate and am convinced that the system is useful for particular experiments, it won't be for all. There are limitations on the FOV, camera resolution, speed, etc., that make it unavoidable for many labs to engineer their own systems. Thus, I would recommend adding this perspective to the discussion. I believe that this will be helpful for labs that, thanks to the affordable design, will start doing these kinds of VR experiments but don't have the experience to judge all details.

Minor:

I would disagree with the first sentence: "Behavior emerges from the conversion of sensory input into motor output". Organisms are not automata, e.g.; there is a whole world of internal states that modulate behavior. Please rephrase.

I would also make a stronger point on why VR systems are important (line 41). You don't need a VR system to stimulate an animal repeatedly. VR systems allow combining the stimulus to the behavior, enabling a precise exploration of response dynamics to sensory stimuli dependent on particular behavioral conditions.

- The paper sometimes uses colloquial terms, e.g., "crack the neuronal code," etc. I would change that language.

- Hz and frames per second are used, I would stick to Hz.

- Reference, e.g., page 213: Schulze, Gomez-Marin (17) -> Schulze et al. (17). Also line 307.

- Get rid of popular in line 220 and 256. Model organisms already imply that they are used widely.

- Line 600 - 608, it is not clear that the (Ci-iii) refers to the supplemental figure.

- Line 195, "field of the view", remove the.

Rev. 2:

In this manuscript, Tadres & Louis present PiVR, a novel hardware designed to perform closed-loop light-based stimulations on small animals. The authors present data based on optogenetic activation of genetically targeted neurons in Drosophila larvae and imagos; they also present a proof of principle using visible light on zebrafish larvae.

PiVR is certainly an interesting device and I can see it becoming popular in the field of Drosophila neuroscience, especially among people interested in developing new paradigms of learning.

The device has some clear strengths and some limitations. The strengths are discussed appropriately and fairly. The limitations not so much and it may be useful to have a more rounded discussion of both so that readers can immediately recognise whether this tool is appropriate for their uses.

Amongst the strengths I would count:

1. The documentation is outstanding.

2. The machine is relatively inexpensive.

3. The basic usage of the machine seems to be easy to implement

Amongst the weaknesses:

1. Some of the proposed usages of PiVR are suboptimal. It is stressed several times in the manuscript that PiVR can be used to acquire videos "offline", for them to be tracked by different software (such as idtracker.ai). I doubt this setup can compete, in terms of resolution, speed, and even cost, with the simpler solution of having an industrial CCD connected to an existing computer.

2. While, in principle, the R in VR stands for any kind of Reality, in fact it is commonly associated with complex visual representations such as projections of objects, patterns, and scenarios. PiVR is limited to providing different light intensities, and therefore its main use case is going to be optogenetics.

3. It is not clear to me why a reader should prefer PiVR over other already existing alternatives, and it may be useful to stress, perhaps in a table, the pros and cons of PiVR vs the "competition". FlyPi, Bonsai, and FreemoVR are all alternative products with functionalities that overlap those of PiVR; the manuscript would benefit from a more direct comparison with them.

I do not have specific comments on the manuscript, besides the fact that I found it perhaps too long and repetitive. I think the strength of this tool is that it does one thing and it does it nicely, but all this somehow gets lost in a manuscript that is so discursive and repetitive. The discussion, in particular, is not well focused.

The figures are well presented and clear but perhaps not very focused. Figures 2 and 3 read more like a demonstration that "optogenetics works". It would have been more useful to focus on PiVR's versatility and show several different use cases rather than only those two very basic ones. Somehow the focus of the figures is optogenetics, not PiVR.

I also had trouble understanding part of Figure 4 (F-J) and almost all of Figure 5. Perhaps the concepts of slow vs fast dynamic VR could be explained in greater detail with a cartoon, and examples of why and how an experimenter may want to use slow or fast dynamic VR could be provided.

Rev. 3:

This paper describes an open, low-cost platform to perform closed-loop behavioral experiments. The intention is to make such experiments more accessible or scalable by removing barriers of cost and development. The utility of the system is thoroughly demonstrated through example experiments in larval and adult Drosophila and zebrafish, which are analyzed to reveal new biological insights.

The resources described in the paper are well designed and described and should be straightforward to apply to many current experimental questions. However, there are several existing platforms with overlapping aims, and many of them are not cited or discussed in this paper. Some of these allow much more experimental flexibility than the generation of virtual environments through modulation of a one-dimensional signal that is used in the current paper. Therefore, it is not clear that the method fulfills the criteria required for a resource in PLOS Biology, in that it would enable experiments that were not possible using existing methods. In particular, it would be helpful to cite the following two resources and discuss them in comparison with the PiVR system:

1) Stytra

http://www.portugueslab.com/pages/resources.html

Stytra is a python-based package, which can be used for animal tracking and closed-loop presentation of stimuli. Implementations with low-cost hardware are possible. Štih V*, Petrucco L*, Kist AM, Portugues R (2019)

Stytra: An open-source, integrated system for stimulation, tracking and closed-loop behavioral experiments. PLOS Computational Biology, doi:10.1371/journal.pcbi.1006699

2) Bonsai.

http://www.kampff-lab.org/bonsai

Bonsai is a software framework that is widely used for designing closed-loop behavior experiments, and can be used in conjunction with low cost hardware.

Bonsai: An event-based framework for processing and controlling data streams. Lopes G, Bonacchi N, Frazão J, Neto JP, Atallah BV, Soares S, Moreira L, Matias S, Itskov PM, Correia PA, Medina RE, Calcaterra L, Dreosti E, Paton JJ, Kampff AR. Frontiers in Neuroinformatics. 2015; 9:7.

Some minor issues:

1) In the introductory discussion on virtual reality in neuroscience, it could be worth citing the productive use of virtual reality in work in zebrafish, in the context of motor adaptation, prey capture and social behavior.

2) The closed-loop experiments based on optogenetic stimulation of the gustatory system in flies have technical similarities with experiments in the following paper, which might be appropriate to cite:

Moreira, J.-M., Itskov, P. M., Goldschmidt, D., Baltazar, C., Steck, K., Tastekin, I., et al. (2019). optoPAD, a closed-loop optogenetics system to study the circuit basis of feeding behaviors. eLife, 8. http://doi.org/10.7554/eLife.43924

3) The terminology used in the zebrafish swimming experiments is not aligned with common usage in the literature and could be confusing. Individual bursts of movement of the larvae are typically called 'bouts', and the word 'scoots' is used to refer to a particular type of bout, where the tail oscillations are mostly confined to the caudal region and which propels the larva forward at a slow speed, without major reorientation or head yaw (also often called 'slow swims'). Routine turns, by contrast, start with a larger tail movement that reorients the larva, which can be followed by a propulsive phase. Therefore, what are called 'scoots' in this paper may encompass both 'scoots' and 'turns' in the terminology of other studies.

4) In the discussion of phototaxis in zebrafish, there are some other papers that perhaps are relevant to cite, because they either describe virtual reality based assays for phototaxis, or discuss the choice of turns vs scoots and the direction of turns. Ahrens, M. B., Huang, K. H., Narayan, S., Mensh, B. D., & Engert, F. (2013). Frontiers in Neural Circuits, 7, 104.

Fernandes, A. M., Fero, K., Arrenberg, A. B., Bergeron, S. A., Driever, W., & Burgess, H. A. (2012). Current Biology, 22(21), 2042-2047. Horstick, E. J., Bayleyen, Y., Sinclair, J. L., & Burgess, H. A. (2017). BMC Biology, 1-16.

Decision Letter 2

Ines Alvarez-Garcia

20 Mar 2020

Dear Matthieu,

Thank you for submitting your revised Methods and Resources entitled "PiVR: an affordable and versatile closed-loop platform to study unrestrained sensorimotor behavior" for publication in PLOS Biology. I have now discussed the revision with the team of editors and obtained advice from the original Academic Editor.

We're delighted to let you know that we're now editorially satisfied with your manuscript. However before we can formally accept your paper and consider it "in press", we also need to ensure that your article conforms to our guidelines. A member of our team will be in touch shortly with a set of requests. As we can't proceed until these requirements are met, your swift response will help prevent delays to publication. Please also make sure to address the data and other policy-related requests noted at the end of this email.

*Copyediting*

Upon acceptance of your article, your final files will be copyedited and typeset into the final PDF. While you will have an opportunity to review these files as proofs, PLOS will only permit corrections to spelling or significant scientific errors. Therefore, please take this final revision time to assess and make any remaining major changes to your manuscript.

NOTE: If Supporting Information files are included with your article, note that these are not copyedited and will be published as they are submitted. Please ensure that these files are legible and of high quality (at least 300 dpi) in an easily accessible file format. For this reason, please be aware that any references listed in an SI file will not be indexed. For more information, see our Supporting Information guidelines:

https://journals.plos.org/plosbiology/s/supporting-information

*Published Peer Review History*

Please note that you may have the opportunity to make the peer review history publicly available. The record will include editor decision letters (with reviews) and your responses to reviewer comments. If eligible, we will contact you to opt in or out. Please see here for more details:

https://blogs.plos.org/plos/2019/05/plos-journals-now-open-for-published-peer-review/

*Early Version*

Please note that an uncorrected proof of your manuscript will be published online ahead of the final version, unless you opted out when submitting your manuscript. If, for any reason, you do not want an earlier version of your manuscript published online, uncheck the box. Should you, your institution's press office or the journal office choose to press release your paper, you will automatically be opted out of early publication. We ask that you notify us as soon as possible if you or your institution is planning to press release the article.

*Protocols deposition*

To enhance the reproducibility of your results, we recommend that if applicable you deposit your laboratory protocols in protocols.io, where a protocol can be assigned its own identifier (DOI) such that it can be cited independently in the future. For instructions see: https://journals.plos.org/plosbiology/s/submission-guidelines#loc-materials-and-methods

*Submitting Your Revision*

To submit your revision, please go to https://www.editorialmanager.com/pbiology/ and log in as an Author. Click the link labelled 'Submissions Needing Revision' to find your submission record. Your revised submission must include a cover letter, a Response to Reviewers file that provides a detailed response to the reviewers' comments (if applicable), and a track-changes file indicating any changes that you have made to the manuscript.

Please do not hesitate to contact me should you have any questions.

Best wishes,

Ines

--

Ines Alvarez-Garcia, PhD

Senior Editor

PLOS Biology

Carlyle House, Carlyle Road

Cambridge, CB4 3DN

+44 1223–442810

------------------------------------------------------------------------

DATA POLICY:

You may be aware of the PLOS Data Policy, which requires that all data be made available without restriction: http://journals.plos.org/plosbiology/s/data-availability. I can see that you have deposited your data in Dryad (DOI: https://doi.org/10.25349/D9ZK50), but the link doesn't seem to be active, and I can't check whether the data we need before the manuscript enters production are available. Please either activate it or follow the instructions stated below.

Note that we do not require all raw data (for more information, please also see this editorial: http://dx.doi.org/10.1371/journal.pbio.1001797). Rather, we ask that all individual quantitative observations that underlie the data summarized in the figures and results of your paper be made available in one of the following forms:

1) Supplementary files (e.g., excel). Please ensure that all data files are uploaded as 'Supporting Information' and are invariably referred to (in the manuscript, figure legends, and the Description field when uploading your files) using the following format verbatim: S1 Data, S2 Data, etc. Multiple panels of a single or even several figures can be included as multiple sheets in one excel file that is saved using exactly the following convention: S1_Data.xlsx (using an underscore).

2) Deposition in a publicly available repository. Please also provide the accession code or a reviewer link so that we may view your data before publication.

Regardless of the method selected, please ensure that you provide the individual numerical values that underlie the summary data displayed in the following figure panels as they are essential for readers to assess your analysis and to reproduce it:

Fig. 3C, F; Fig. 4G, H, J; Fig. S1B, C, D; Fig. S3E, D, I, J; Fig. S4E, F; Fig. S7A, B, C and Fig. S8A, B

NOTE: the numerical data provided should include all replicates AND the way in which the plotted mean and errors were derived (it should not present only the mean/average values).

Please also ensure that figure legends in your manuscript include information on where the underlying data can be found, and ensure your supplemental data file/s has a legend.

Please ensure that your Data Statement in the submission system accurately describes WHERE YOUR DATA CAN BE FOUND.

Decision Letter 3

Ines Alvarez-Garcia

9 Jun 2020

Dear Dr Louis,

On behalf of my colleagues and the Academic Editor, Tom Baden, I am pleased to inform you that we will be delighted to publish your Methods and Resources in PLOS Biology.

The files will now enter our production system. You will receive a copyedited version of the manuscript, along with your figures for a final review. You will be given two business days to review and approve the copyedit. Then, within a week, you will receive a PDF proof of your typeset article. You will have two days to review the PDF and make any final corrections. If there is a chance that you'll be unavailable during the copy editing/proof review period, please provide us with contact details of one of the other authors whom you nominate to handle these stages on your behalf. This will ensure that any requested corrections reach the production department in time for publication.

Early Version

The version of your manuscript submitted at the copyedit stage will be posted online ahead of the final proof version, unless you have already opted out of the process. The date of the early version will be your article's publication date. The final article will be published to the same URL, and all versions of the paper will be accessible to readers.

PRESS

We frequently collaborate with press offices. If your institution or institutions have a press office, please notify them about your upcoming paper at this point, to enable them to help maximise its impact. If the press office is planning to promote your findings, we would be grateful if they could coordinate with biologypress@plos.org. If you have not yet opted out of the early version process, we ask that you notify us immediately of any press plans so that we may do so on your behalf.

We also ask that you take this opportunity to read our Embargo Policy regarding the discussion, promotion and media coverage of work that is yet to be published by PLOS. As your manuscript is not yet published, it is bound by the conditions of our Embargo Policy. Please be aware that this policy is in place both to ensure that any press coverage of your article is fully substantiated and to provide a direct link between such coverage and the published work. For full details of our Embargo Policy, please visit http://www.plos.org/about/media-inquiries/embargo-policy/.

Thank you again for submitting your manuscript to PLOS Biology and for your support of Open Access publishing. Please do not hesitate to contact me if I can provide any assistance during the production process.

Kind regards,

Alice Musson

Publishing Editor,

PLOS Biology

on behalf of

Ines Alvarez-Garcia,

Senior Editor

PLOS Biology

Associated Data

    This section collects any data citations, data availability statements, or supplementary materials included in this article.

    Supplementary Materials

    S1 Fig. Timing performance of PiVR.

    (A) Illustration of the three parameters measured to estimate overall loop time and latency. (B) Dependence of image acquisition time on the infrared background illumination strength. (Ci) To measure image processing and VR computation time, the non-real-time python 3.5 time.time() function is used. At 50, 60, and 70 fps some frames (6/5,598, 25/7,198 and 16/8,398, respectively) take longer than the period of the frame rate, which leads to dropped frames. (Cii) To confirm these measurements, we also recorded timestamps of the images assigned by the real-time clock of the GPU. (D) To estimate the software-to-hardware latency during a full update cycle of the tracker, we measured the time between the GPIO pin being instructed to turn ON and the GPIO pin reporting being turned ON. (E) Cameras with a rolling shutter take images by reading out lines of pixels from top to bottom (E, left). This can be illustrated in a simple x/y plot (E, right). To estimate maximum latency between the animal position and the update of the LED intensity, we used an LED flash paradigm while PiVR was tracking a dummy object at 70 Hz. Our latency results depend on the ROI associated with the location of the dummy object. When PiVR tracks an ROI located at the top of the frame (Fi, left), it detects the LED only two frames later (Fii, left). By contrast, when PiVR tracks an ROI located at the bottom of the frame (Fi, right), it detects the LED in the next possible frame (Fii, right). If PiVR tracks an ROI in the center of the image (Fi, center), it either detects the LED during the next frame or two frames later. We conclude that the LED is being turned ON while the next image is being formed (1.34 milliseconds). For a full LED ON–LED OFF–LED ON sequence, we find that there are three frames between the LED being turned ON when PiVR tracks the top of the image (Fiii, left), whereas it takes two frames when PiVR tracks the bottom of the image (Fiii, right). Because the time between two light flashes contains the frame during which the LED is turned OFF, the image acquisition and processing time corresponds to one or two frames. This is summarized in the timeline of panel G. Taken together, this shows that the time necessary to detect the movement of an animal and to update the LED intensity takes a maximum of two frames plus the time necessary to take the picture, which amounts to a maximum of 30 milliseconds at a frame rate of 70 Hz. All data used to create these plots are available from https://doi.org/10.25349/D9ZK50. GPIO, general-purpose input/output; GPU, graphical processing unit; LED, light-emitting diode; PiVR, Raspberry Pi Virtual Reality; ROI, region of interest; VR, virtual reality.

    (TIF)
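The software-to-hardware latency described in panel D can be estimated with a few lines of Python. The sketch below is a minimal illustration under our own assumptions (BCM pin 18 as the LED control pin, RPi.GPIO as the GPIO library); it is not the script used to generate the panel.

```python
# Minimal sketch of the panel-D measurement: time between instructing a GPIO pin
# to turn ON and the pin reporting being ON. The pin number (BCM 18) is an assumption.
import time
import RPi.GPIO as GPIO

LED_PIN = 18
GPIO.setmode(GPIO.BCM)
GPIO.setup(LED_PIN, GPIO.OUT, initial=GPIO.LOW)

latencies = []
for _ in range(1000):
    t_start = time.time()
    GPIO.output(LED_PIN, GPIO.HIGH)          # instruct the pin to turn ON
    while GPIO.input(LED_PIN) == GPIO.LOW:   # poll until the pin reports being ON
        pass
    latencies.append(time.time() - t_start)
    GPIO.output(LED_PIN, GPIO.LOW)
    time.sleep(0.001)                        # let the pin settle before the next trial

latencies.sort()
print("median software-to-hardware latency: %.3f ms"
      % (1e3 * latencies[len(latencies) // 2]))
GPIO.cleanup()
```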

    S2 Fig. Modularity of PiVR (hardware).

    (A) PiVR is highly modular. The standard version (shown in Fig 1A) can easily be adapted to allow for side (or top) illumination using the same LED controller system. If high light intensities are needed, a high-power LED controller can be used in combination with high-power LEDs. Panel B shows an example of a side illuminator. Side illumination increases contrast on the surface of the animal. (Bi) This side illuminator was used to collect videos with sufficient detail for idtracker.ai [25] to track 10 fruit fly larvae while retaining their identities. (C) The standard PiVR illuminator consists of at least two different 12-V LED strips: one for background illumination (850 nm) and one of another color (here, 625 nm) to stimulate the optogenetic tool of choice. (D) The high-power stimulation arena uses an LED controller that keeps the current to the high-power LEDs constant and can drive a maximum of 12 high-power LEDs. (Ci and Di) The intensity of the stimulation light was measured using a photodiode (Thorlabs Inc., S130VC). Each pixel is 2 cm². The black circles indicate the petri dishes used as behavioral arenas. LED, light-emitting diode; PiVR, Raspberry Pi Virtual Reality.

    (TIF)
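For orientation, one common way to grade the intensity of a stimulation LED channel from the Raspberry Pi is to pulse-width modulate the control signal sent to the LED driver. The snippet below is an illustrative sketch only (the pin number and PWM frequency are our assumptions), not a description of the PiVR LED-controller firmware.

```python
# Illustrative sketch: dim a stimulation LED channel by PWM on a GPIO pin.
# BCM pin 23 and the 1-kHz software-PWM carrier are assumptions for this example;
# a hardware-PWM library (e.g., pigpio) would give a more stable output.
import time
import RPi.GPIO as GPIO

STIM_PIN = 23
GPIO.setmode(GPIO.BCM)
GPIO.setup(STIM_PIN, GPIO.OUT)

pwm = GPIO.PWM(STIM_PIN, 1000)   # 1-kHz PWM, well above the camera frame rate
pwm.start(0)                     # start with the stimulation light OFF

for duty_cycle in (0, 25, 50, 75, 100):
    pwm.ChangeDutyCycle(duty_cycle)   # 0-100% of maximal light intensity
    time.sleep(1.0)

pwm.stop()
GPIO.cleanup()
```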

    S3 Fig. Flow diagram of the automatic animal detection and background reconstruction.

    (A) After placing the animal and pressing “start tracking” on the graphical user interface, the software grabs the first image. All images are immediately filtered using a Gaussian kernel with a sigma depending on the size of the animal (user defined) to reduce camera noise. (B) A second image is then taken, and the mean of all images taken so far is calculated. (C) The current image (in this example, the second image of the experiment) is then subtracted from the mean image shown in panel B. (D) The histogram of the subtracted image shows that most gray-scale values are 0. For the region to which the animal has moved since the first frame, the pixel values are negative (magenta). For the region that the animal has left since the first frame, the pixel values are positive (green). (E) The threshold value is calculated from the histogram: it is the mean of the subtracted image minus 4 (an optimal value defined by trial and error). (F) The threshold value is used to binarize the subtracted image shown in panel C. If there is no blob, or more than one blob, with the minimal area (defined by the user in the animal parameters file), the loop restarts at step (B). (G) If there is exactly one blob, an area around the blob (defined by the user in the animal parameters file) is defined as the current region of interest. (H) The region of interest is the area where movement has been detected. The algorithm now restricts the search for the animal to this region. (I) The histogram of this small area of the first image (A) shows that the few pixels defining the animal are distinct from the background. (J) To find the optimal local threshold for binarizing the image, the threshold is adjusted whenever more than one blob is detected (top, green arrow). As soon as only one blob with the characteristics of the animal (defined by the user in the animal parameters file) is detected, the local threshold value is set, and the shape of the animal identified in the first frame is saved (bottom). (K) Using the local threshold, each new image is binarized and then subtracted from the first image as shown in panel (J, bottom). If the same blob that was detected in panel J (bottom) is found in any of the new subtracted binary images (cyan arrow), the animal is considered to have left its original position, and the algorithm continues. (L) The region occupied by the animal is then copied from the latest image and (M) pasted into the first image. (N) The resulting image does not contain the animal and is used as the background image for the tracking algorithm (S4 Fig). All data used to create these plots are available from https://doi.org/10.25349/D9ZK50.

    (TIF)
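To make the flow diagram concrete, the following sketch reimplements the core of panels A–G (movement detection by comparing the newest frame with the mean of all frames acquired so far) with NumPy/SciPy. Function and parameter names are ours, and the code is a simplified illustration, not the PiVR source.

```python
# Simplified sketch of panels A-G: detect the single moving blob by comparing the
# newest frame with the mean of all frames acquired so far. Names are illustrative.
import numpy as np
from scipy.ndimage import gaussian_filter, label

def detect_moving_animal(frames, sigma=2.0, min_area=50):
    """frames: list of raw grayscale images; return a boolean mask of the moving blob or None."""
    smoothed = [gaussian_filter(f.astype(float), sigma) for f in frames]  # panel A: denoise
    mean_img = np.mean(smoothed, axis=0)                                  # panel B: mean image
    diff = mean_img - smoothed[-1]                                        # panel C: subtract current from mean
    threshold = diff.mean() - 4                                           # panel E: mean minus 4
    binary = diff < threshold                                             # panel F: keep negative (moved-to) pixels
    labels, n_blobs = label(binary)
    areas = np.bincount(labels.ravel())[1:]                               # blob areas, background excluded
    candidates = np.flatnonzero(areas >= min_area)
    if len(candidates) != 1:                                              # panel F: need exactly one blob,
        return None                                                       # otherwise take another frame (panel B)
    return labels == (candidates[0] + 1)                                  # panel G: ROI is drawn around this blob
```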

    S4 Fig. Flow diagram of the animal tracking algorithm.

    (A) At the start of the experiment, the ROI is defined during animal detection (S3G Fig). During the experiment, the current ROI is defined using the previous frame. The ROI of the current image (C) is then subtracted from the ROI of the background (B). The fact that the tracking algorithm only considers a subsample of the image is central to the temporal performance (short processing time) of PiVR. (D) In the resulting image, the animal clearly stands out relative to the background. (E) The histogram of the image indicates that whereas the background consists mostly of values around 0, the animal has negative pixel intensity values. (F) The threshold is defined as three standard deviations away from the mean. (G) This threshold is used to binarize the subtracted ROI. The largest blob with animal characteristics (defined by the animal parameters) is defined to be the animal. (H) The image of the detected animal is saved, and the next ROI is designated (defined by the animal parameters). All data used to create these plots are available from https://doi.org/10.25349/D9ZK50. PiVR, Raspberry Pi Virtual Reality; ROI, region of interest.

    (TIF)
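The per-frame tracking step can be summarized in a few lines. The sketch below follows panels A–H under the sign convention of the legend (the animal appears as negative values in the difference image); function and variable names are ours, not those of the PiVR code base.

```python
# Simplified sketch of panels A-H: background subtraction restricted to a small ROI.
import numpy as np
from scipy.ndimage import label

def locate_animal_in_roi(background_roi, current_roi, min_area=50):
    """Return the (row, col) centroid of the animal inside the ROI, or None if not found."""
    diff = background_roi.astype(float) - current_roi.astype(float)  # panels B-D: animal is negative (panel E)
    threshold = diff.mean() - 3 * diff.std()                         # panel F: 3 SD from the mean
    binary = diff < threshold                                        # panel G: binarize the subtracted ROI
    labels, n_blobs = label(binary)
    if n_blobs == 0:
        return None
    areas = np.bincount(labels.ravel())[1:]                          # blob areas, background excluded
    largest_idx = int(np.argmax(areas))                              # panel G: largest blob = animal
    if areas[largest_idx] < min_area:
        return None
    rows, cols = np.nonzero(labels == largest_idx + 1)
    return rows.mean(), cols.mean()                                  # centroid used to place the next ROI (panel H)
```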

    S5 Fig. Head/tail classification using a Hungarian algorithm.

    (A) During tracking, head/tail classification starts with the binarized image (S4G Fig). (B) The binary image is used to calculate the morphological skeleton, which in turn is used to identify the two endpoints, one of which must be the head and the other the tail. (C) The Euclidean distance between the tail position in the previous frame and each of the endpoints is calculated. If the tail was not defined in the previous frame, the centroid position is used instead. (D) The endpoint with the smaller distance is defined as the tail (here, v); the other endpoint is defined as the head.

    (TIF)
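The assignment logic of panels A–D can be sketched as follows. The names are illustrative and the code is a simplified reimplementation, not the PiVR source; if the tail was undefined in the previous frame, the previous centroid would be passed instead (panel C).

```python
# Simplified sketch of panels A-D: identify the two skeleton endpoints and call the
# one closer to the previous tail (or previous centroid) the tail.
import numpy as np
from scipy.ndimage import convolve
from skimage.morphology import skeletonize

def assign_head_tail(binary_animal, previous_tail):
    """Return (head, tail) pixel coordinates, or None if the skeleton is ambiguous."""
    skeleton = skeletonize(binary_animal)                                    # panel B: morphological skeleton
    neighbor_count = convolve(skeleton.astype(int), np.ones((3, 3)), mode="constant")
    endpoints = np.argwhere(skeleton & (neighbor_count == 2))                # endpoint = pixel with one neighbor
    if len(endpoints) != 2:
        return None                                                          # skeleton is not a simple curve
    distances = [np.linalg.norm(p - np.asarray(previous_tail)) for p in endpoints]  # panel C
    if distances[0] <= distances[1]:
        tail, head = endpoints[0], endpoints[1]
    else:
        tail, head = endpoints[1], endpoints[0]
    return head, tail                                                        # panel D: nearer endpoint is the tail
```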

    S6 Fig. Capability of PiVR to track a wide variety of small animals.

    PiVR is able to detect, track, and assign head and tail positions to a variety of invertebrate species with different body plans: (A) kelp fly, (B) jumping spider, (C) firefly, and (D) pill bug. PiVR, Raspberry Pi Virtual Reality.

    (TIF)

    S7 Fig. Distance to real- and virtual-odor sources in larval chemotaxis to real- and virtual-odor gradients.

    (A) Distance to the real-odor source (isoamyl acetate, n = 30) and to the solvent control (paraffin oil, n = 30), (B) distance to the virtual-odor source in the Gaussian-shaped virtual-odor reality (n = 31) and in the control (n = 26), and (C) distance to the local maximum (rim of the volcano, n = 29) and in the control (n = 26). All distances were measured 4 minutes into the experiment (Mann–Whitney U test, p < 0.001). All data used to create these plots are available from https://doi.org/10.25349/D9ZK50.

    (TIF)

    S8 Fig. Zebrafish larvae in a virtual-light gradient.

    (A) Distance to the virtual-light source for the control (black) and experimental (red) conditions at 4 minutes into the experiment. (B) Relationship between the turn angle θ and the distance to the virtual-light source (Mann–Whitney U test, different letters indicate p < 0.01, n = 11 and 13). All reported p-values are Bonferroni corrected. All data used to create these plots are available from https://doi.org/10.25349/D9ZK50.

    (TIF)
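For readers who want to reproduce this type of comparison on the deposited data, the statistics reduce to a two-sided Mann–Whitney U test followed by Bonferroni correction. The snippet below is a generic sketch (the variable names, placeholder values, and number of comparisons are ours), not the analysis script used for the figure.

```python
# Generic sketch: Mann-Whitney U test with Bonferroni correction for multiple comparisons.
# 'distances_control' and 'distances_experimental' stand in for per-animal distances
# to the virtual-light source at minute 4 (placeholder values below).
import numpy as np
from scipy.stats import mannwhitneyu

rng = np.random.default_rng(0)
distances_control = rng.normal(60.0, 10.0, size=11)        # placeholder data (mm)
distances_experimental = rng.normal(40.0, 10.0, size=13)   # placeholder data (mm)

n_comparisons = 3                                           # e.g., number of conditions tested
stat, p_value = mannwhitneyu(distances_control, distances_experimental,
                             alternative="two-sided")
p_bonferroni = min(1.0, p_value * n_comparisons)            # Bonferroni-corrected p-value
print(f"U = {stat:.1f}, corrected p = {p_bonferroni:.4f}")
```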

    S1 Movie. Illustration of homogeneous illumination creating a virtual checkerboard reality.

    The behavior of a freely moving fly is shown in the petri dish (left) and within the virtual checkerboard recorded by PiVR (middle). The corresponding time course of the homogeneous illumination intensity is shown in the right panel. PiVR, Raspberry Pi Virtual Reality.

    (MP4)
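Conceptually, a virtual reality of this kind is a lookup from the tracked position to the light intensity delivered to the whole arena. The sketch below illustrates the idea for a checkerboard landscape; the square size and intensity levels are placeholders of our own, not the settings used in the movie.

```python
# Illustrative sketch: map the animal's (x, y) position onto a checkerboard intensity
# landscape. Square size (20 px) and intensity levels (0% / 100%) are placeholders.
def checkerboard_intensity(x, y, square_size=20, low=0.0, high=100.0):
    """Return the stimulation intensity (percent) for a position on a virtual checkerboard."""
    bright_square = ((int(x) // square_size) + (int(y) // square_size)) % 2 == 0
    return high if bright_square else low

# In a closed-loop cycle, the tracker's centroid would be fed into this lookup and the
# result used to set the LED duty cycle (see the PWM sketch after S2 Fig above).
centroid_x, centroid_y = 137.4, 82.9          # placeholder centroid from the tracker
led_percent = checkerboard_intensity(centroid_x, centroid_y)
```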

    S2 Movie. Sample trajectory of a Drosophila larva expressing Orco only in the Or42a olfactory sensory neuron behaving in a quasistatic isoamyl acetate gradient.

    The odor gradient was reconstructed for visualization and to estimate the experienced odor intensity, as described previously [33] (see also Methods section). Or42a, odorant receptor 42a.

    (MP4)

    S3 Movie. Illustrative trajectory of a Drosophila larva expressing the optogenetic tool CsChrimson in the Or42a olfactory sensory neuron behaving in a Gaussian-shaped virtual-odor reality.

    Or42a, odorant receptor 42a.

    (MP4)

    S4 Movie. Illustrative trajectory of a Drosophila larva expressing the optogenetic tool CsChrimson in the Or42a olfactory sensory neuron in a volcano-shaped virtual-odor reality.

    Or42a, odorant receptor 42a.

    (MP4)

    S5 Movie. Illustrative trajectory of an adult Drosophila expressing the optogenetic tool CsChrimson in the Gr66a bitter-sensing neurons in a checkerboard-shaped virtual gustatory reality.

    Gr66a, gustatory receptor 66a.

    (MP4)

    S6 Movie. Illustrative trajectory of a zebrafish larva in a virtual visual reality.

    (MP4)

    S7 Movie. Time-lapse video illustrating the production and assembly of a PiVR setup ending with the start of an experiment.

    For a detailed step-by-step protocol, please visit www.pivr.org. PiVR, Raspberry Pi Virtual Reality.

    (MP4)

    S8 Movie. Video sequences of an adult fly in a dish recorded with PiVR outfitted with three different lenses.

    Comparison of the image quality between the standard 3.6-mm, F1.8 lens that comes with the Raspberry Pi camera; a 6-mm, F1.8 lens (US$25, PT-0618MP); and a higher-magnification 16-mm, F1.8 lens (US$21, PT-1618MP). PiVR, Raspberry Pi Virtual Reality.

    (MP4)

    S1 Table. Bill of materials for the standard version of PiVR.

    PiVR, Raspberry Pi Virtual Reality.

    (CSV)

    S2 Table. Comparative analysis of prevalent tracking systems used for behavioral analysis.

    (1) Optogenetic activation in closed-loop experiments. FreemoVR and Stytra are designed to present complex 3D and 2D visual stimuli, respectively. (2) Maximal frequency of the closed-loop stimulus. (3) Maximal time between an action of the tracked animal and the update of the hardware presenting the virtual reality. (4) Most tracking algorithms monitor the position of the centroid. Some tools automatically detect other features of the tracked animal. (5) Some tracking algorithms are designed to specifically identify animals with a stereotypic shape, size, and movement. Other tracking algorithms are more flexible. (6) Assessed based on published information. (7) Bonsai was used in the optoPAD system, which uses optogenetics [53]. The information presented in this comparative table relies on the following publications: FlyPi [58]; Ethoscope [23]; FreemoVR [11]; Bonsai [57]; PiVR, present manuscript. PiVR, Raspberry Pi Virtual Reality.

    (PDF)

    S1 HTML. Copy of the www.pivr.org website, which includes information on how to build a standard PiVR setup.

    PiVR, Raspberry Pi Virtual Reality.

    (ZIP)

    Data Availability Statement

    All data files and scripts are available from the Dryad database (accession DOI: https://doi.org/10.25349/D9ZK50).

