Summary
Locomotion requires a balance between mechanical stability and movement flexibility to achieve behavioral goals despite noisy neuromuscular systems, but rarely is it considered how this balance is orchestrated. We combined virtual reality tools with quantitative analysis of behavior to examine how Drosophila uses self-generated visual information (reafferent visual feedback) to control gaze during exploratory walking. We found that flies execute distinct motor programs coordinated across the body to maximize gaze stability. However, the presence of inherent variability in leg placement relative to the body jeopardizes fine control of gaze due to posture-stabilizing adjustments that lead to unintended changes in course direction. Surprisingly, whereas visual feedback is dispensable for head-body coordination, we found that self-generated visual signals tune postural reflexes to rapidly prevent turns rather than to promote compensatory rotations, a long-standing idea for visually guided course control. Together, these findings support a model in which visual feedback orchestrates the interplay between posture and gaze stability in a manner that is both goal dependent and motor-context specific.
Keywords: Drosophila, motor control, locomotion, visually guided walking, gaze stabilization, posture control, visuomotor processing, interlimb coordination, virtual reality
Graphical abstract
Highlights
• Exploratory walking in Drosophila is configured to maximize gaze stability
• Gaze stability is jeopardized by inevitable rotations induced by postural control
• Visual feedback supports gaze stability by tuning down postural reflexes
• Self-generated visual motion limits rotations without promoting compensatory turns
A balance between mechanical stability and movement flexibility is necessary to achieve behavioral goals despite noisy sensorimotor systems. Cruz et al. combine virtual reality with quantitative analysis of behavior to describe an unexplored role of visual feedback in orchestrating the interplay between posture and gaze stability.
Introduction
Every step we take is defined by an interplay between behavioral goals, the reasons why movement is executed in the first place, and mechanical stability mediated by postural reflexes, which keep our body balanced as we move through complex and unpredictable environments.1, 2, 3 This interplay requires the brain to estimate self-generated movements to determine whether the motor programs at work are executed as intended and to correct them otherwise.4 In addition, postural reflexes ought to be tuned flexibly within some permissive range if behavioral goals are to succeed.5 The brain estimates self-motion by integrating internal motor-related activity with mechanical and visual information.4,6 Yet how the brain uses this internal estimate to balance behavioral goals with postural stability requirements is poorly understood.
Visual animals move their eyes and head in a structured manner following a combination of fast gaze shifts (saccades) and fixations that maintain gaze stable.7, 8, 9, 10, 11, 12, 13 In vertebrates, gaze saccades and fixations are controlled by distinct motor programs.12,14, 15, 16, 17, 18 Saccades are initiated centrally by a common command to rotate the eyes and neck together, whereas eye fixations are controlled by compensatory eye rotations initiated by proprioceptive or vestibular signals in the cervical-ocular and vestibular-ocular reflexes.19, 20, 21, 22, 23, 24 Likewise, in walking insects, saccades are characterized by syndirectional yaw-rotations of head and body,25,26 whereas gaze stabilization during roll and pitch movements of the body is based on compensatory head rotations dependent on proprioceptive systems.27, 28, 29, 30, 31 These structured eye and/or head rotations are complemented with body movements that control course and further reduce motion blur.1,12,14,15,17 At the same time, noise within sensorimotor systems32 and uncertainties in the interaction between limbs and the environment2 can cause postural perturbations. Such perturbations lead to rapid (stepwise) readjustments in body posture33, 34, 35, 36 that may affect gaze stability goals. Thus, the trade-off between gaze stability, required for accurate visual and spatial perception,8,12 and posture control during exploratory walking provides an opportunity to examine how self-motion estimation tunes this trade-off.
While vision is the primary beneficiary of gaze stabilization, it is also an important contributor to its control, complementing non-visual stabilization mechanisms.30,37,38 In both vertebrates and invertebrates, experiments in which visual motion stimuli are presented uncoupled from the animal’s behavior induce directed rotations of the eyes, known as optokinetic responses (OKRs), or of the head and body, known as optomotor responses (OMRs).39, 40, 41, 42 Several models have proposed that OKRs and OMRs operate via a feedback control system in which visual error signals are minimized by compensatory turns, thereby reducing gaze and course deviations,41,43,44 although it remains unclear how OMRs coordinate movement across the body for gaze stability. Moreover, much less is known about how vision controls gaze stability in the absence of external visual perturbations, when visual input is self-generated (visual feedback).45, 46, 47, 48, 49 In insects, the sufficiency of OMR mechanisms to guarantee gaze stability under self-generated conditions has been challenged.48 OMRs depend on visual velocity signals that must be integrated to keep the orientation of the head and body fixed.48,50 Indeed, OMRs are modeled by a proportional-integral feedback control system.44,49,51 The integral component of the controller is prone to error accumulation and may not be accurate and fast enough for smooth control of gaze in the face of rapid postural adjustments (Figure 1A). Alternatively, menotaxis52 or other object-based responses53 may support gaze stabilization.
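The proportional-integral control scheme invoked here, and why its integral term is vulnerable to error accumulation, can be illustrated with a minimal discrete-time sketch; the gains, the error sequence, and the units are illustrative assumptions, not values fitted to fly behavior:

```python
import numpy as np

def pi_heading_controller(errors, kp=0.5, ki=0.1):
    """Discrete-time proportional-integral (PI) controller.

    errors: sequence of visual heading-error samples (deg).
    Returns the corrective turn command at each time step.
    """
    integral = 0.0
    commands = []
    for e in errors:
        integral += e  # the integrator sums all past error
        commands.append(kp * e + ki * integral)
    return np.array(commands)
```

With even a small constant bias in the error estimate, the integral term grows without bound while the proportional term stays fixed, which is the accumulation problem that makes a purely integrative controller ill-suited to tracking rapid, stepwise postural adjustments.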
Here, we examine how Drosophila melanogaster uses visual feedback to control gaze stability during exploratory walking. We first established an experimental paradigm in which the fly displays regular gaze stabilization behavior and in which we control and manipulate visual feedback precisely (Figure 1). By combining virtual reality tools with quantitative analyses of movement, we found that flies execute distinct motor programs coordinated across the body to maximize gaze stability. Visual feedback is critical for gaze stabilization without affecting the accompanying head-body coordination. Instead, visual feedback prevents rotations induced by postural reflexes rather than promoting compensatory turns. Altogether, our findings suggest that visual feedback orchestrates the interplay between behavioral goals—gaze stability—and postural stability requirements in an exploring fly.
Results
An experimental paradigm to study the role of visual feedback in exploratory walking
To study the role of visual feedback in gaze control, we sought to establish a paradigm in which three conditions were met. First, the fly should display frequent walking paths with stable gaze. Second, movements across the body should be tracked in freely walking individuals. Third, visual feedback should be precisely manipulated. To this end, we developed FlyVRena,54 a virtual reality system for flies walking freely in a circular arena (Figures 1B and 1C). A heated wall prevents flies from following the perimeter of the arena (Figure 1D). To track movements of the head, body, and legs, we recorded high-magnification videos of the walking fly using a galvo-based high-resolution system coupled to positional information from a low-magnification tracking system55 (latency <16 ms). Visual stimuli were projected onto the arena’s floor in closed loop with the position of the fly (maximum latency of 40 ms). To measure the fly’s ability to detect FlyVRena’s stimuli, we compared behavior under natural versus reversed gain (RG) conditions (Figures S1A and S1B). In RG a leftward turn of the fly results in a leftward instead of a rightward rotation of the visual world, which induces continuous turning or circling behavior56 (Figure S1C). We define visual influence as the probability of circling under natural versus RG, a metric of the specific effect a virtual world has on the behavior of an individual fly (Figure S1D). Thus, FlyVRena allowed us to examine how the fly uses specific visual feedback to control gaze.
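As a concrete reading of the visual-influence metric, one can compare the probability of circling under RG versus natural gain. The following sketch assumes a simple net-rotation criterion for labeling a walking bout as circling and takes the RG-minus-natural difference as the score; both choices are illustrative simplifications, not the paper's exact definition:

```python
import numpy as np

def is_circling(yaw_velocity, dt, min_net_rotation=720.0):
    """Label a bout as circling if the net rotation (deg) exceeds a
    threshold (two full turns here; the criterion is an assumption)."""
    return abs(np.sum(yaw_velocity) * dt) >= min_net_rotation

def visual_influence(bouts_natural, bouts_rg, dt=1 / 60):
    """Difference in circling probability between reversed-gain and
    natural-gain bouts; near 1 means the virtual world strongly drives
    the fly's behavior, near 0 means it is effectively ignored."""
    p_nat = np.mean([is_circling(b, dt) for b in bouts_natural])
    p_rg = np.mean([is_circling(b, dt) for b in bouts_rg])
    return p_rg - p_nat
```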
To examine the potential contribution of yaw visual feedback on gaze stabilization, we decouple the rotational and translational components of the fly’s visual feedback (Figure 1E; Video S1). Using the low-magnification system, the position of the visual stimuli was updated with the fly’s position. This configuration minimized translational visual feedback and facilitated keeping the angular size of objects on the fly’s retina constant during walking. Thus, in our experimental paradigm, flies explore a visually controlled virtual world where we can track movement across the body at high resolution and precisely manipulate self-generated visual information.
Exploratory walking paths reveal two distinct motor contexts
To elucidate the organization of body movement underlying the exploratory paths (Figure 2A; Video S1), we obtained an unbiased description of critical body kinematics linked to the motor programs at play57 (Figure S2A). The continuous character of locomotion poses a challenge for an unbiased segmentation of body kinematics. Therefore, we implemented an unsupervised segmentation of multivariate time series using a principled adaptive approach58 (Figure S2A; STAR Methods). Briefly, we modeled local dynamics in variable time windows with first-order linear models, using the forward and angular velocities and respective accelerations as inputs. In line with previous work,57 body kinematics were segmented into a set of windows ranging from 180 to 500 ms (Figure 2B). To group similar dynamics, we ran pairwise comparisons across all the segments to construct a dissimilarity matrix. Subsequently, we applied hierarchical clustering followed by a principled merging method via a logistic regression model (5-fold cross-validated binary classification, separability >99%; Figure 2A; STAR Methods). Applying this method to our dataset (84 flies, 441 min of recordings) led to 10 clusters representing body kinematics during continuous walking that were organized into three main branches (Figure 2C; for other behaviors, see Figure S2B). Two of these branches contained kinematics with prominent angular velocity profiles accompanied by forward velocity profiles displaying decelerations followed by accelerations (Figure 2D, cyan, magenta, dark violet, and dark green). In addition, these branches contained clusters with prominent angular bias and high forward velocity profiles (Figure 2D, black, blue, orange, and spring green). By contrast, the third branch contained clusters that on average lacked prominent angular velocity profiles and were accompanied by high and steady forward velocity profiles (Figure 2D, dodger blue and lime clusters).
Thus, body kinematics can be organized into regular sets of dynamics across different timescales revealing characteristic properties of the fly’s exploratory walking.
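A greatly simplified, greedy stand-in for the adaptive segmentation (ref. 58 describes the principled version) can be sketched as follows; the window bounds and residual tolerance are illustrative, and in the paper the inputs are the forward and angular velocities and their accelerations:

```python
import numpy as np

def fit_linear_dynamics(X):
    """Least-squares fit of x[t+1] ~ A @ x[t] + b for a window X (T x D).
    Returns (A, b, mean squared residual)."""
    Xt, Xt1 = X[:-1], X[1:]
    Z = np.hstack([Xt, np.ones((len(Xt), 1))])  # ones column absorbs b
    W, *_ = np.linalg.lstsq(Z, Xt1, rcond=None)
    A, b = W[:-1].T, W[-1]
    mse = np.mean((Xt1 - Z @ W) ** 2)
    return A, b, mse

def adaptive_segments(X, min_len=11, max_len=30, tol=1e-2):
    """Grow each window while a single first-order linear model still
    explains it (residual below tol); a tail shorter than min_len is
    dropped. Returns (start, end) index pairs."""
    bounds, start = [], 0
    while start < len(X) - min_len:
        end = start + min_len
        while end - start < max_len and end < len(X):
            _, _, mse = fit_linear_dynamics(X[start:end + 1])
            if mse > tol:
                break
            end += 1
        bounds.append((start, end))
        start = end
    return bounds
```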
To examine the temporal organization of clusters, we identified common transition patterns between them (Figure 2E). We performed a singular-value decomposition (SVD) on the one-step probability transition matrix between clusters and obtained a ranked description of the most relevant transitions (transition modes)59 (Figure 2E; STAR Methods). Because the first transition mode contains mainly steady-state transitions (Figure S2D; STAR Methods), we focused our analysis on the subsequent modes. The highest contribution to the second mode corresponds to transitions between angular peak-like rotations and reductions in the angular velocity (Figure 2E, right). The third and fourth modes’ highest contributors include transitions from the steady forward velocity clusters to either the same clusters or to initiations of peak-like rotations, respectively (Figure 2E, right). Altogether, these findings indicated the presence of two main distinctive body motor programs, characterized either by angular spike-like events resembling body saccades25,26 (angular peak plus relaxation), accounting for about 56% of the total time spent walking (14,978 segments), or by segments of steady, high forward velocity (forward runs), accounting for about 44% (11,744 segments). Thus, the organization of the body kinematics during exploration resembles a fixation-saccade strategy in which the fly separates turning from forward displacements.
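The transition-mode analysis reduces to an SVD of the one-step transition probability matrix between cluster labels. A minimal sketch (the cluster labels here are synthetic):

```python
import numpy as np

def transition_matrix(labels, n_states):
    """Row-normalized one-step transition probabilities estimated from a
    sequence of cluster labels."""
    T = np.zeros((n_states, n_states))
    for a, b in zip(labels[:-1], labels[1:]):
        T[a, b] += 1
    row = T.sum(axis=1, keepdims=True)
    return np.divide(T, row, out=np.zeros_like(T), where=row > 0)

def transition_modes(P):
    """SVD of the transition matrix; the k-th mode is the rank-1 term
    s[k] * outer(U[:, k], Vt[k]) and highlights a ranked group of
    source-to-target transitions."""
    U, s, Vt = np.linalg.svd(P)
    return U, s, Vt
```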
The structure of exploratory walking is configured to maximize gaze stability
A fixation-saccade strategy orchestrates specific motor programs across the body to maintain gaze stable.11,12,16,18 We examined movement coordination between the head and body with FlyVRena’s high-magnification system (Figure 3A; Video S2). Because the computational load of the unsupervised analysis grows exponentially with data size, we leveraged the information acquired from this analysis and developed a faster, supervised identification of body saccades (Figures 3A, 3B, S3, S4A, and S4B; STAR Methods). During body saccades,26 the head and body moved initially in sync regardless of light conditions (8 ms resolution; Figure 3C). At the peak of the body saccade, the head initiated a counter-rotation (Figures 3D and S4C) that helped to stabilize gaze as the body completed its rotation. Unlike in vertebrates,17,18 the head’s contribution to gaze shifts was minor relative to that of the body (Figure 3E), perhaps due to the head’s limited movement range (<20°) and the body’s fast maneuverability. By contrast, and like the vertebrate cervical-ocular reflex,22 during forward runs gaze stability is increased by opposite rotations of the head and body, with the head following the body by 66 ms regardless of light conditions (Figures 3F and 3G). Together, these findings reveal a structural organization of exploration that is independent of visual conditions and that is configured to increase gaze stability while restricting the duration of gaze shifts.
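A head-body delay such as the 66 ms reported above is the kind of estimate a cross-correlation lag analysis yields. A minimal sketch with synthetic signals, assuming a sampling rate matching the 8 ms resolution mentioned in the text:

```python
import numpy as np

def lag_ms(body, head, fs):
    """Delay of `head` relative to `body` in ms (positive = head lags),
    estimated from the peak of the normalized cross-correlation."""
    body = (body - body.mean()) / body.std()
    head = (head - head.mean()) / head.std()
    xc = np.correlate(head, body, mode="full")
    lag = np.argmax(xc) - (len(body) - 1)
    return 1000.0 * lag / fs
```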
Visual feedback rapidly controls course stability during forward runs
The limited contribution of the head relative to the body on gaze control (Figure 3E) suggests that gaze stability increases with course stability.47,49 To examine course stability, we calculated path straightness per forward run (Figures 3A and 4A; STAR Methods). Consistent with gaze stabilization, walking paths were straighter in light versus darkness or with weak visual feedback (Figures 4A–4D and S1; see also Robie et al.60). The increase in path straightness was associated with a decrease in deviations from a stable body orientation (body angular deviations) (Figure 4B). If the fly uses the projected stimuli for course control, then the stronger the visual influence of the stimuli, the straighter the forward runs should be. To examine this prediction, we designed visual worlds with variable influence on the behavior of individual flies (Figures 4C and S1). We found that although size or density of dots influenced behavior non-monotonically across the population, visual influence and walking performance were always strongly correlated, i.e., the larger the visual influence, the straighter the forward run (Figure 4E). This finding indicates that self-generated visual signals reduce the magnitude of body rotations and straighten forward runs when motor programs across the body are coordinated to maintain gaze stable.
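Path straightness per forward run can be computed as the ratio of net displacement to total path length, a standard straightness index (the paper's exact definition may differ; see STAR Methods):

```python
import numpy as np

def path_straightness(xy):
    """Net displacement divided by total path length for a trajectory
    xy of shape (T, 2); 1 means a perfectly straight path."""
    steps = np.diff(xy, axis=0)
    path_len = np.linalg.norm(steps, axis=1).sum()
    net = np.linalg.norm(xy[-1] - xy[0])
    return net / path_len if path_len > 0 else np.nan
```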
To test how fast visual feedback affects course control, we calculated path straightness as a function of the duration of forward runs across virtual worlds (Figure 4F). Forward runs lasting only 300 ms were straighter under strong versus weak visual feedback. Moreover, short body kinematic segments (∼180 ms) within clusters linked to forward runs displayed reduced body angular deviations under strong visual feedback (Figure 4G). Altogether, these observations suggest that visual feedback controls course stability at a timescale close to the step cycle of the walking fly.61
Visual feedback prevents pairwise interlimb correlations underlying postural adjustments
To understand how visual feedback controls path straightness at fast timescales, we first examined the relationship between interlimb coordination and course control. To characterize interlimb coordination, we used machine learning tools to track each leg’s tip position in two dimensions62,63 (Figure 5A; Video S3; STAR Methods). We labeled forward runs across visual conditions based on the cumulative body angular deviations normalized by the traveled distance (path deviation; Figure 5B). Regardless of the magnitude and direction of the path deviation, on average, flies walked with tripod gait (Figures 5C and S5A), characteristic of high-speed walking (Figures 2 and S2; see also Mendes et al.61 and Wosnitza et al.64). Thus, path deviations are not caused by changes in gait structure; instead, they may be caused by changes in leg kinematics.
To examine whether variations in single-leg kinematics affect body angular deviations, we extracted the location orthogonal (lateral) and parallel (longitudinal) to the body axis of the anterior extreme position (AEP) and posterior extreme position (PEP) of each leg relative to the body.61 The AEP and PEP can vary up to 0.3 body lengths (BLs), irrespective of the magnitude and direction of path deviations (Figure 5D). Interestingly, the lateral AEP of front legs changed systematically with path deviations (Figure 5D). This effect was clearer at a single-step level, where correlations between lateral leg placement and body angular deviations were more prominent for front than middle legs, and it was not observed in hind legs (Figure 5E). Front legs showed no correlation between the longitudinal AEP and body angular deviations, whereas the middle and hind legs showed variable correlations (Figure S5B). These findings indicate a specific relationship between each leg’s placement and body deviations, strongly suggesting that variability in limb kinematics causes self-paced path deviations during forward runs.
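Extracting AEP and PEP from leg-tip trajectories in the body frame can be sketched as follows, taking touchdown at local maxima and lift-off at local minima of the longitudinal position; this is a simplification of threshold-based swing/stance detection, for illustration only:

```python
import numpy as np

def aep_pep(long_pos, lat_pos):
    """Per-step anterior (AEP) and posterior (PEP) extreme positions of a
    leg tip in the body frame, as (lateral, longitudinal) pairs.
    Touchdown is taken at local maxima of the longitudinal coordinate,
    lift-off at local minima."""
    d = np.diff(long_pos)
    aep_idx = np.where((d[:-1] > 0) & (d[1:] <= 0))[0] + 1
    pep_idx = np.where((d[:-1] < 0) & (d[1:] >= 0))[0] + 1
    aep = [(lat_pos[i], long_pos[i]) for i in aep_idx]
    pep = [(lat_pos[i], long_pos[i]) for i in pep_idx]
    return aep, pep
```

The lateral AEP values returned per step are the quantities correlated with body angular deviations in the analysis above.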
Notably, the lateral placement of a front leg in a single step (FLx) is not sufficient to explain the body angular deviations. In consecutive steps, if an initial displacement is not followed by a syndirectional FLx from the contralateral leg, body angular deviations do not occur (Figure 5F). By contrast, if the initial displacement is followed by a correlated FLx, the magnitude of the correlation directly relates to the magnitude of the body angular deviation (Figure 5F). Changes in FLx over a step can displace the fly’s center of stability away from its center of mass (Figure 5G), thereby momentarily decreasing posture stability.65,66 To recover stability, posture control systems could either adjust lateral foot placement in the next step, like in humans,67 or dynamically engage viscoelastic properties of the musculoskeletal system, as observed in the cockroach.68 Our data are consistent with the first possibility. Initial FLx displacements were typically followed by contralateral FLx displacements in the same direction (Figures 5G and S5E). The strength of these front-leg pairwise correlations closely tracks body angular deviations regardless of visual conditions (Figure S5D). However, the degree of the interlimb correlation was stronger in darkness and gradually reduced with stronger visual feedback (Figures S5E and S5F). Consistently, posture stability recovery was more robust in darkness than when visual feedback was available (Figures 5H and S5G). These findings show that posture control induces body angular deviations, thereby shifting gaze when the fly intends to keep it stable (Figure 3). Our data show that visual feedback controls gaze stability by tuning down rapid postural adjustments.
External visual perturbations induce body rotations with specific interlimb coordination
Next, we asked whether visual perturbations leading to body rotations induced similar interlimb coordination. We induced visual perturbations either by OMR visual stimuli (Figure S6A) or by RG (Figure S1A). During OMR, flies turn at high forward velocity or with saccades (data not shown) in the direction of the stimulus rotation (Figure S6A). In these biased forward runs, the body angular deviations were correlated with the front legs’ lateral AEP and anti-correlated with the hind legs’ lateral AEP (Figure S6B). Under RG, flies circle56 with near-perfect tripod gait (Figures S1 and S6C–S6E; Video S4). As with the OMR, the body angular deviations were correlated with the front legs’ lateral AEP and anti-correlated with the hind legs’ lateral AEP (Figure S6F). By contrast, the longitudinal AEP was unaffected under RG (Figure S6G). The consistent anti-correlation between the hind legs’ lateral AEP and the body angular deviations during open- and closed-loop visual perturbations was never observed in natural conditions (Figures 5 and S6B, static condition). Furthermore, we found that head-body coordination under RG differs from the anti-correlation found under natural conditions (Figure S6I). Altogether, our data suggest that, regardless of the gait structure (Figures S6D, S6E, and S6H), a different movement pattern across the body emerges under visual perturbations.
Motion-sensitive circuits are crucial for the rapid tuning of posture by visual feedback
To determine the specific visual circuits tuning down postural adjustments, we silenced the activity of T4/T5 cells, the first population of neurons with direction-selective visual-motion responses,69 via the selective expression of the inward-rectifier potassium channel Kir2.1 (Figure 6A). These experimental flies displayed characteristic exploratory paths (Video S4), suggesting that visual motion information is not required for the fixation-saccade structure of exploratory walking (Figures 6B and 6C, left).
Silencing the activity of T4/T5 cells did not affect the dynamics of saccades (Figures 6D and S7D). However, walking paths during forward runs under strong visual feedback were markedly less straight in experimental flies compared with controls, resembling walking in darkness. This effect was not explained by an impairment in head-body coordination (Figures 6B and 6C, right; Figures S7D and S7E). Consistent with this observation, we found that front-leg pairwise correlations were more prominent in experimental than in control flies (Figure 6E). This finding strongly suggests that pathways postsynaptic to T4/T5 cells mediate the tuning of postural reflexes to maintain walking straight and gaze stable.
Discussion
Gaze stability is fundamental for visual and spatial perception.70 Here, we demonstrate that, during exploratory walking, Drosophila attempt to keep gaze stable by moving forward in fixed directions while coordinating head and body rotations in antiphase (Figures 2 and 3). However, maintaining gaze stable during walking is not trivial because of internal and external sources of noise. We identified one component of this noise, the variability of leg placement, that momentarily decreases posture stability and triggers rapid limb adjustments that induce changes in course direction, thereby impairing gaze stabilization (Figure 5). We further show that the fly uses visual feedback to stabilize gaze at the expense of posture (Figures 4 and 5). Mechanistically, self-generated visual motion cues prevent rather than promote compensatory body rotations (Figures 5 and 6), a long-standing idea for visually guided course control71 (Figure 7A). Moreover, external visual disturbances induced a different leg movement pattern from the one observed under natural conditions. Taken together, these findings establish that self- and externally generated visual information engages different mechanisms to control goal-directed and visually guided locomotion. Overall, our work suggests a critical role for visual feedback in orchestrating the interplay between posture and gaze stability in a manner that is both goal dependent and motor-context specific.
Active, self-paced visual control of walking
In flies, course control is thought to depend on compensatory systems.42,43,49,71, 72, 73, 74 The proposed models rely on a combination of directional and non-directional components,43,75, 76, 77 which together act via compensatory turns until an internal error signal is minimized.43,44,49,72 In flight, the directional (optomotor) component depends on a feedback system with integrative components.43,44,49 Several lines of evidence show that these compensatory systems cannot fully explain our findings during walking. First, the fly’s course control critically depends on visual motion pathways, in contrast to position-based oriented behaviors.75,77 Furthermore, while flies follow discrete high-contrast edges (Figure S7C), they seem not to fixate on single dots (Figure S7A). Second, we found no evidence for the presence of actively generated small-amplitude turns, suggested to constitute an operant trial-and-error strategy for the optomotor system under closed-loop conditions.49,72 Third, visual feedback does not generate compensatory turns, but rather prevents body rotations. Fourth, while optomotor mechanisms could support gaze stability in flying insects, the gain of the OMR, in our case on average 0.56, is too low to guarantee straight walking.48,78
Reversing the gain of the virtual reality system or presenting an OMR stimulus induced additional components of limb placement absent under natural conditions, which likely contribute to compensatory turns. Altogether, we propose that parallel mechanisms exist for course control that depend on self- versus externally generated visual motion. While compensation for external perturbations, such as a gust of wind or asymmetries in the locomotor apparatus induced by acute limb damage, relies on optomotor mechanisms for proper course control, the fast action of visual feedback reported here is critical in the context of self-paced variability during walking, where every step is prone to multiple sources of noise.3
Neural mechanisms involved in visual tuning of posture control
It takes ∼35 ms, or half the stance period,61,62 (Figure S5C) for visual motion signals to excite T4/T5 postsynaptic neurons (T. Fujiwara, personal communication). Thus, to control leg placement, visual feedback could act upon the swinging leg if descending signals to the ventral nerve cord (VNC) travel fast with minimal delay. T4/T5 cells connect to visual projection neurons,79, 80, 81 many of which receive extra-retinal82,83 information that further accelerates visual processing.84,85 In addition, visual projection neurons project to regions densely innervated by descending neurons.86 In insects, descending neurons carry task-specific signals to modulate reactive forces, interlimb coordination, and leg placement.87,88 At the same time, local circuits within the VNC perform stepwise adjustments to control posture in the presence of internally and externally generated noise.33, 34, 35, 36 Here, we show that posture reflexes respond to variability in leg placement via spatial interlimb correlations, which cause small changes in body orientation while recovering posture stability (Figure 5). Although the mechanisms underlying these spatial correlations remain unclear, we speculate that information about leg landing position in contralateral pairs contributes to postural adjustments, likely via the activity of interneurons integrating information across segments within the VNC88 and whose gain can be gated by visual feedback (Figure 7B). Alternatively, visual descending pathways may act in a step-by-step manner on leg sensory-motor systems89 (Figure 7C). Very little is known about the identity of the VNC neurons that modulate leg placement, the mechanisms by which they do so, or how visual descending circuits interact with such populations of interneurons. The findings reported here constitute a possible framework to examine these important questions in the context of active vision and its role in goal-directed walking.
Anyone who has tried to walk blindfolded knows that vision is fundamental for course control. There is evidence supporting a role for vision in human leg placement,46 suggesting that a mechanism similar to the one we demonstrate in flies may underlie course control in humans.67 Investigating the role of visual feedback under naturalistic conditions in a system with a tractable number of neurons and a comprehensive array of anatomic, genetic, and physiological tools will produce a conceptual framework to elucidate the interplay between visual feedback and posture control during locomotion suitable for comparative studies across species and scenarios.
STAR★Methods
Key resources table
REAGENT or RESOURCE | SOURCE | IDENTIFIER |
---|---|---|
Experimental models: Organisms/strains | ||
Drosophila: Top Banana wild type | Michael Dickinson | N/A |
Drosophila: w1118; Otd-nls:FLPo; + | Kenta Asahina | N/A |
Drosophila: w1118; +; pJFRC56-10xUAS > myrtdTomato-SV40 > eGFPKir2.1 | Gerald Rubin | N/A |
Drosophila: w1118; R59E08-p65ADZp;R42F06ZpGdbd | Michael Reiser | N/A |
Reagents | ||
Vectashield | Vector Laboratories | H-1200 |
Normal Goat Serum | Life Technologies | PCN5000 |
PFA | Electron Microscopy Sciences | 15710 |
Antibodies | ||
Mouse monoclonal anti-nc82 | DSHB | RRID: AB_2314866 |
Rabbit polyclonal anti-GFP | Thermo Fisher Scientific | Cat#A6455; AB_221570 |
Alexa Fluor 633 conjugated goat anti-mouse | Thermo Fisher Scientific | Cat#A21050; RRID: AB_2535718 |
Alexa Fluor 488 conjugated goat anti-rabbit | Thermo Fisher Scientific | Cat#A11008; RRID: AB_143165 |
Software and algorithms | ||
FlyVRena | This paper | github, https://github.com/ChiappeLab |
Code for behavior analysis | This paper | github, https://github.com/ChiappeLab |
MATLAB | The MathWorks, Inc. | N/A |
Python version 3.7 | Python Software Foundation | N/A |
Microsoft Visual Studio | Microsoft Corporation | N/A |
Deposited data | ||
Raw and Pre-processed Data | This paper | Mendeley, https://doi.org/10.17632/tvw5k48b6p.2 |
Equipment | ||
Infrared Cameras | IDS | IDS UI-3240CP-NIR/ IDS UI-3360CP-NIR-GL |
Infrared LEDs | Osram | 850nm SFH 4235-Z |
LED current power supply | TENMA | 72-10480 |
LED projector | Texas Instruments | DLP Lightcrafter 4500 |
Telecentric lens | Telecentric Lens | 220mm WD CompactTL |
Galvo mirrors | Thorlabs | GVS012/M - 2D |
Resource availability
Lead contact
Further information and requests for resources and reagents should be directed to and will be fulfilled by the lead contact, Eugenia Chiappe (eugenia.chiappe@neuro.fchampalimaud.org).
Materials availability
This study did not generate new unique reagents.
Experimental model and subject details
Fly Strains
Drosophila melanogaster were reared on a standard fly medium and kept on a 12 hr light/12 hr dark cycle. Experiments were performed with 2- to 4-day-old flies with wings clipped from eclosion. Although at the beginning of this study we examined both females and males, under our experimental conditions we found that males display more robust and continuous walking behavior compared to females. Thus, in this study, we focused our analysis on male flies. We used the recently derived “Top Banana” WT line (TP, a gift from the Dickinson lab) because they engaged in exploration more readily than other WT lines (Figures 1, 2, and 3, comparisons across WT lines are not shown). Experiments were performed uniformly throughout the day, since time of day could not explain relevant behavioral variability (data not shown). An intersectional approach designed with Otd-nls:FLPo90 restricted the effector expression to the population of neurons of interest. The complete set of transgenic flies and their specific sources are listed in the Key resources table.
Method details
Immunostaining
Isolated brains were fixed for 30 min at room temperature in 4% paraformaldehyde in PBS, rinsed in PBT (PBS, 0.5% Triton X-100 and 10 mg/ml BSA), and blocked in PBT + 10% NGS for 15min. Brains were incubated in primary antibodies (1:25 mouse nc82 and 1:1000 rabbit antibody to GFP) at 4°C for three days. After several washes in PBT, brains were incubated with secondary antibodies (1:500 goat-anti rabbit: Alexa Fluor 488 and 1:500 goat-anti mouse: Alexa Fluor 633) for three days at 4°C. Brains were mounted in Vectashield, and confocal images were acquired with a Zeiss LSM710 scope with a 40× oil-immersion or 25× multi-immersion objective lens.
Virtual Reality setup for exploratory walking
Single flies walked freely in a 90mm circular arena with walls heated to 42°C by an insulated nichrome wire (Pelican Wire P2128N60TFEWT). To prevent walking on the ceiling, the arena was covered with a glass plate pre-treated with Sigmacote (Sigma).91 The fly was video recorded using a near-infrared camera (IDS UI-3240CP-NIR) at a frame rate of 60Hz and a resolution of 900x900 pixels. The camera was fitted with a 2x expander (Computar EX2C), a wide-field lens (Computar 5mm 1:1.4 ½), and a filter blocking visible light (Thorlabs AC254-100-A-ML). Illumination was provided from beneath the arena by IR LEDs (850nm SFH 4235-Z) powered by a current power supply (TENMA 72-10480). The centroid position and orientation of the fly were determined via real-time tracking. The tracking algorithm first performed a background subtraction using an image of an empty arena, followed by the application of a pixel intensity threshold. From the distribution of pixels within the contour of the thresholded image, the 2D centroid and main orientation were estimated. The final estimate of the position and orientation of the fly was given by a Kalman filter, which combined the recorded data with prior knowledge about the system to statistically minimize the difference between the real and estimated values.
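The final filtering step can be illustrated with a minimal sketch: a constant-velocity Kalman filter applied per axis to the 2D centroid. The function name `kalman_track` and the noise parameters `q` and `r` are illustrative choices, not values from the actual tracker.

```python
import numpy as np

def kalman_track(measurements, dt=1 / 60, q=1.0, r=4.0):
    """Constant-velocity Kalman filter for a 2D centroid track.

    measurements: (T, 2) array of noisy centroid positions (px).
    Returns the (T, 2) filtered positions.
    """
    # State per axis: [position, velocity]; q, r set process/measurement noise.
    F = np.array([[1.0, dt], [0.0, 1.0]])       # state transition
    H = np.array([[1.0, 0.0]])                  # only position is observed
    Q = q * np.array([[dt**4 / 4, dt**3 / 2],   # white-acceleration noise model
                      [dt**3 / 2, dt**2]])
    R = np.array([[r]])

    out = np.zeros_like(measurements, dtype=float)
    for axis in range(2):
        x = np.array([measurements[0, axis], 0.0])  # initial state
        P = np.eye(2) * 10.0                        # initial uncertainty
        for t, z in enumerate(measurements[:, axis]):
            # Predict
            x = F @ x
            P = F @ P @ F.T + Q
            # Update with the new measurement
            S = H @ P @ H.T + R
            K = P @ H.T @ np.linalg.inv(S)
            x = x + K @ (np.array([z]) - H @ x)
            P = (np.eye(2) - K @ H) @ P
            out[t, axis] = x[0]
    return out
```

Because the process noise is small relative to the measurement noise here, the filter heavily smooths the raw centroid trace, which is the intended effect for a slowly maneuvering walker.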
Visual stimuli were projected onto a rear-projection material (Da-Lite High Contrast DA-Tex) attached to the floor of the arena. We used a small LED projector (DLP Lightcrafter 4500, Texas Instruments) at a frame rate of 60Hz and a pixel size of 160px/mm. Custom software (FlyVRena, https://github.com/tlcruz/FlyVRena)54 generated the virtual worlds and implemented the real-time tracking of the position and orientation of the fly. FlyVRena uses computer-game development libraries to render virtual objects and update their behavior accordingly (closed loop). The delay between the fly's movement and the update of the world is about 40ms (as measured using a photodiode, Vishay BPW21R), arising mainly from the lag of the projection system. Virtual worlds were 2D environments with sets of textured square-based random dots (Figures 1, 2, 3, 4, 5, and 6 and S1–S7; Video S1) or textured square-based bars (“windmill environment”; Figure S7). The models of the textured squares were designed in Blender (https://www.blender.org), the textures were generated in MATLAB and Adobe Illustrator, and both were rendered using the FlyVRena software.
Following FlyMad,55 high-resolution images of the fly were recorded at 120fps and 1024x544 pixel resolution with a near infrared camera (IDS UI-3360CP-NIR-GL) and a telecentric lens (Edmund Optics 1x, 220mm WD CompactTL Telecentric Lens). The camera pointed to a pair of galvo mirrors (Thorlabs GVS012/M - 2D Large Beam (10mm) Diameter Galvo System) whose orientation was controlled by the FlyVRena software based on the instantaneous position of the fly to maintain the fly centered in the high-resolution frame (16ms lag; Video S2).
The size of our experimental arena (> 50BL), together with its aversive walls, circumvented the fly’s innate centrophobism92 and promoted longer excursions throughout the arena (Figure 1D). Under these conditions, we found that flies organize their behavior in two distinctive phases with characteristics that resemble what has been described as saccades in walking flies26 and fixation-saccade in patrolling and exploratory flying flies.93,94 Under our experimental conditions, the fly may be structuring its behavior similarly while searching for an exit or a cool spot within the aversive landscape generated by the hot walls. Therefore, we refer to the behavior as exploratory walking.
Data pre-processing for free walking assays
After each experiment, the change in the position of the fly was transformed into forward and side velocities, whereas the change in its orientation was transformed into angular velocity. Jump events were detected by a threshold on the fly's acceleration (200mm/s2) and eliminated from subsequent analysis. Heading direction was inferred from the persistence of the forward velocity: if the average forward speed between two jumps was negative, the orientation was rotated by 180° and the velocities were re-calculated. All velocity signals within a window of 166ms centered on a jump were set to zero. The speeds were smoothed with a lowess algorithm with a window of 100ms. Activity was defined as translational speed greater than 0.5mm/s, or angular speed greater than 20°/s, sustained for at least 166ms. Conversely, inactivity bouts shorter than 333ms were counted as activity. Walking bouts were the subgroup of activity bouts in which a sequence of movement lasted for at least 333ms.
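The activity criteria above can be sketched as a small segmentation routine. `activity_mask` is a hypothetical helper applying the stated thresholds (0.5mm/s, 20°/s, 166ms minimum activity, 333ms gap bridging); the order in which short runs are dropped and short gaps are bridged is an assumption.

```python
import numpy as np

def activity_mask(trans_speed, ang_speed, fps=60,
                  trans_thr=0.5, ang_thr=20.0,
                  min_active_s=0.166, max_gap_s=0.333):
    """Label active frames: translational speed > 0.5 mm/s or angular
    speed > 20 deg/s, sustained for at least ~166 ms; inactive gaps
    shorter than ~333 ms are bridged and counted as activity."""
    raw = (trans_speed > trans_thr) | (ang_speed > ang_thr)

    def _runs(mask):
        # Start indices and exclusive end indices of consecutive True runs.
        edges = np.diff(np.r_[0, mask.astype(int), 0])
        return np.flatnonzero(edges == 1), np.flatnonzero(edges == -1)

    act = raw.copy()
    # Drop active runs that are too short to count as activity.
    for s, e in zip(*_runs(act)):
        if (e - s) / fps < min_active_s:
            act[s:e] = False
    # Bridge brief pauses between active runs.
    for s, e in zip(*_runs(~act)):
        if (e - s) / fps < max_gap_s:
            act[s:e] = True
    return act
```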
Unsupervised segmentation of movement dynamics
After pre-processing the data (84 flies, total number of walking segments = 64467, total time spent walking = 441 minutes, percentage of time spent walking = 54 ± 17%), the description of the kinematics of walking behavior consisted of time series of the angular and forward velocities and their respective derivatives (accelerations). Under our experimental conditions, side velocities were minimal during forward runs and highly coupled to angular velocity during turns. We used an adaptive methodology proposed in Costa et al.58 to segment these input time series into locally linear dynamical systems of varying length. Briefly, in an iterative manner, the algorithm searches for “dynamical breaks,” that is, points at which there would be a need to switch from one model’s dynamics to another to best fit the data, among an arbitrarily defined set of candidate windows W(k), W(k+1). A window is modeled by locally fitting a first-order autoregressive process (AR) defined as follows
x(t+1) = a + A x(t) + ε(t),  ε(t) ∼ N(0, Σ)  (1)
where the set of parameters (a, A, Σ) is estimated by least-squares regression, with the coupling matrix A, the covariance matrix of the error, Σ, and the intercept, a. We denote W(k) as a window of length k samples and W(k+1) as a window of length k+1 samples, with Θ(k) and Θ(k+1) their respective models. A dynamical break is found if Θ(k) cannot model the dynamics in W(k+1), in which case W(k) is chunked and the search for dynamical breaks starts again. Otherwise, a comparison with a new pair of candidate windows starts, i.e., W(k+1) and W(k+2). The algorithm finishes when it has chunked all the input time series. For our analysis, we considered a minimum candidate window of 10 frames at 60 Hz (∼167ms), which corresponds to about 2 steps under fast tripod walking and a lower boundary for consistent movement in freely walking flies.57 Thus, the choice of 10 frames as the starting candidate window gives us confidence that we capture most chunks of consistent dynamics in the dataset.
An analysis of the timescales of walking behavior (Figure 2B) revealed that the distribution of the duration of continuous locomotion-related dynamics decayed quickly after 350ms. Given that, our final analysis considered candidate windows from 167ms to 350ms (10-21 samples). Furthermore, we set the percentile of rejection, under the null hypothesis that the smaller window's model also models the larger window, to 98.5, and the number of surrogates used to construct the null distribution to 1000. Varying these parameters did not affect the overall distribution of segments (data not shown).
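The model-fitting step can be sketched as a plain least-squares fit of the first-order AR process in Equation 1 (the surrogate-based break statistics of Costa et al.58 are not reproduced here); `fit_ar1` and `log_likelihood` are illustrative names.

```python
import numpy as np

def fit_ar1(X):
    """Least-squares fit of x(t+1) = a + A x(t) + eps, eps ~ N(0, Sigma).

    X: (T, d) window of the multivariate velocity time series.
    Returns intercept a (d,), coupling matrix A (d, d), residual covariance Sigma.
    """
    past, future = X[:-1], X[1:]
    design = np.hstack([past, np.ones((len(past), 1))])   # regress on [x(t), 1]
    coef, *_ = np.linalg.lstsq(design, future, rcond=None)
    A, a = coef[:-1].T, coef[-1]
    resid = future - design @ coef
    return a, A, np.cov(resid.T)

def log_likelihood(X, a, A, Sigma):
    """Gaussian log-likelihood of a window under a fitted AR(1) model."""
    resid = X[1:] - (X[:-1] @ A.T + a)
    Sigma = np.atleast_2d(Sigma)
    inv = np.linalg.inv(Sigma)
    _, logdet = np.linalg.slogdet(Sigma)
    # Quadratic form of each residual under the model's noise covariance.
    quad = np.einsum('ti,ij,tj->t', resid, inv, resid)
    d = X.shape[1]
    return -0.5 * np.sum(quad + logdet + d * np.log(2 * np.pi))
```

A dynamical break would then be declared when the log-likelihood of the larger window under the smaller window's model falls below a surrogate-derived rejection threshold.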
Clustering of movement dynamics
After segmenting the multivariate time series, we obtained 64467 segments of linear dynamics. A dissimilarity matrix was constructed by performing a pairwise comparison between segments. Given two segments, A and B, and their corresponding models Θ(A) and Θ(B), their distance (or dissimilarity) is computed by asking how well the model of the concatenation of segments A and B, defined as Θ(AB), can model the individual segments. Equation 2 shows the measure of distance, where ℓ is a log-likelihood function. The output of this procedure is a symmetric square matrix of 64467 rows (Figure S2A).
d(A, B) = −[ℓ(A; Θ(AB)) + ℓ(B; Θ(AB))]  (2)
Then, hierarchical clustering using Ward linkage was applied to the dissimilarity matrix. To define the final clusters, we performed an iterative merging procedure. To start, we used the lowest level of the dendrogram obtained from the hierarchical clustering, defining several “microclusters.” Next, we compared pairs of neighboring microclusters (clusters that share the same immediate common branch in the dendrogram) by first denoising the samples within each microcluster with principal component analysis, and then selecting the first two components (explaining approx. 75% of variance) to reconstruct them in the time domain. Then, for each pair of microclusters, we computed a pairwise comparison between the samples to generate a dissimilarity matrix, using the log-likelihood distance described in the unsupervised segmentation. The labels for each row of the matrix are the two microclusters being compared. A logistic regression model was used to perform a binary classification (Figure S2A) such that two clusters were merged if the binary classifier’s ability to separate them fell below a threshold of performance, measured by the area under the ROC curve (AUC). To train and validate the binary classifier we randomly used 70% of the dataset, with the remaining 30% held out as the test set. For the validation procedure, we used five-fold cross-validation and a random grid search over the regularization parameter C on a logarithmic scale. The best estimator of this search was used to classify the test set. The algorithm finishes when neighboring clusters cannot be merged further. For our dataset, we chose a threshold of 0.99 AUC.
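The merging criterion can be sketched as follows. For simplicity, this sketch scores samples by projecting onto the difference of cluster means rather than training the logistic-regression classifier used in the paper, and computes the AUC with the Mann-Whitney rank statistic; `should_merge` and its threshold handling are illustrative.

```python
import numpy as np

def auc_rank(scores_pos, scores_neg):
    """Area under the ROC curve via the Mann-Whitney U statistic
    (assumes continuous scores, i.e., no ties)."""
    s = np.concatenate([scores_pos, scores_neg])
    ranks = np.argsort(np.argsort(s)) + 1.0       # 1-based ranks
    r_pos = ranks[:len(scores_pos)].sum()
    n_p, n_n = len(scores_pos), len(scores_neg)
    return (r_pos - n_p * (n_p + 1) / 2) / (n_p * n_n)

def should_merge(feats_a, feats_b, auc_threshold=0.99):
    """Merge two microclusters if a simple linear score cannot separate
    them (AUC below threshold). The score is the projection onto the
    difference of cluster means, a stand-in for the paper's classifier."""
    w = feats_a.mean(axis=0) - feats_b.mean(axis=0)
    auc = auc_rank(feats_a @ w, feats_b @ w)
    # A classifier that is wrong consistently still separates the clusters,
    # so consider the better of auc and 1 - auc.
    return max(auc, 1 - auc) < auc_threshold
```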
Analysis of transitions between clusters
To examine how the behavior, discretized into clusters, unfolds in time, we computed a one-step probability transition matrix between continuous locomotion-related clusters. To account for the underlying structure of behavioral transitions, we performed a singular value decomposition (SVD) on the one-step probability transition matrix. This factorization of the matrix allowed us to obtain a ranked description of the most important transitions between clusters. The SVD decomposes the one-step transition matrix into a summation of transition modes. Therefore, the transition matrix T with n states can be expressed as follows:
T = σ₁u₁v₁ᵀ + σ₂u₂v₂ᵀ + … + σₙuₙvₙᵀ  (3)
where the outer product of each pair of left and right singular vectors, uᵢ and vᵢ, together with the associated singular value σᵢ, generates a rank-1 matrix that we here refer to as a transition mode. The larger the associated singular value, the slower the timescale of the system reflected in the transition mode. The leading singular value contains information about long timescales (the steady state of the system, where there is little predictability of the current state) and therefore reflects the average probability of transitioning into any other behavioral state.59,95 Because the largest clusters are those containing no angular bias (on average, “lime” and “dodge blue”), the first mode is characterized mainly by transitions from any cluster to these high-forward-velocity clusters (Figure S2E). Thus, we focused our analysis on the subsequent modes. The second transition mode accounts for local transitions between spike-like rotations followed by relaxations, in both directions. The third and fourth modes organize transitions from forward segments to either the initiation of a rotation or the continuation of high-forward-velocity segments. This analysis reveals two main motor contexts between which the fly transitions on a one-step basis: fast angular rotations resembling body saccades26 (spikes plus relaxations) and high-forward-velocity segments.
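Equation 3 amounts to summing rank-1 outer products of singular vectors, as in this short numpy sketch (`transition_modes` is an illustrative name):

```python
import numpy as np

def transition_modes(T):
    """Decompose a one-step transition matrix into rank-1 transition
    modes: T = sum_i s_i * outer(u_i, v_i), ranked by singular value."""
    U, s, Vt = np.linalg.svd(T)
    modes = [s[i] * np.outer(U[:, i], Vt[i]) for i in range(len(s))]
    return s, modes
```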
Identification of spikes in the fly’s angular velocity
Spike-like dynamics, described by the combination of angular velocity spike and relaxation clusters, corresponded to fast and stereotyped rotations of the fly (Figure 2). To classify events as spike-like quickly, we took advantage of this stereotypy and used a continuous wavelet transform strategy.96 For each walking bout, the continuous wavelet transform of the angular velocity was computed using Gaussian wavelets, and a signal with power in the 10-15Hz range was extracted (frequency signal). Note that while the selection of the frequency band slightly affects the spike detection, it is not a determinant factor in subsequent analysis (Figure S3B). Next, the local maxima of the frequency signal, and of the absolute value of the angular velocity, were calculated. Coincident maxima across the two sets were labeled as putative spike turns (Figure S3A). Putative spike turns were filtered first to remove small local maxima (< 200°/s) that did not disrupt the forward velocity of the fly (variance of forward speed < 3mm/s), and then to roughly match the signal with a template obtained via PCA on a subset of pronounced, peak-aligned, spike-like events (the first principal component explained ∼90% of the shape variance in a 500ms window and was used as the template; Figures 6D and S4A). Putative spike turns in which the squared distance between the scaled shape and the template was smaller than a cutoff of 0.15 were labeled as angular spike events (Figure S3A). The free parameters of the classifier did not affect the results over a wide range of possible parameters (Figure S3B).
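A simplified sketch of the detector: it stands in complex Morlet wavelets for the Gaussian wavelets used in the paper and applies only the 10-15Hz power criterion and the 200°/s amplitude filter; the wavelet support and the number of sampled frequencies are arbitrary choices.

```python
import numpy as np

def spike_candidates(ang_vel, fps=60, f_lo=10.0, f_hi=15.0, amp_thr=200.0):
    """Flag putative angular-velocity spikes: frames where 10-15 Hz
    band power has a local maximum and |angular velocity| exceeds
    200 deg/s at the same frame."""
    t = np.arange(-0.5, 0.5, 1.0 / fps)           # 1-s wavelet support
    power = np.zeros(len(ang_vel))
    for f in np.linspace(f_lo, f_hi, 4):
        sigma = 1.0 / f                           # ~1-cycle Gaussian envelope
        wavelet = np.exp(2j * np.pi * f * t) * np.exp(-t**2 / (2 * sigma**2))
        wavelet /= np.sum(np.abs(wavelet))
        power += np.abs(np.convolve(ang_vel, wavelet, mode='same'))
    # Local maxima of the band-power signal.
    interior = (power[1:-1] > power[:-2]) & (power[1:-1] > power[2:])
    is_peak = np.r_[False, interior, False]
    return is_peak & (np.abs(ang_vel) > amp_thr)
```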
To identify forward walking segments (“forward runs”), angular spike events were removed from the walking bout. The remaining segments longer than 333ms and with an average forward velocity greater than 6mm/s were defined as forward runs throughout this study.
Local curvature, path straightness and visual influence
Local curvature was calculated with a running window of 333ms centered on each point of the fly’s path: within a 333ms window centered on each point of a forward segment, we computed the distance between the central data point and the line defined by the edges of the window (the instantaneous deviation from an ideal straight path). The path straightness of a forward run was defined as the sum of travelled distances within the windows divided by the sum of deviations. Path straightness per fly is the weighted average of the path straightness of all forward runs, with weights given by the total distance walked in each forward run (Figures 4 and 6).
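A sketch of the straightness computation under the definitions above; the handling of window edges and of perfectly straight runs (returned as infinite straightness) are assumptions.

```python
import numpy as np

def path_straightness(xy, fps=60, window_s=0.333):
    """Path straightness: summed travelled distance within running
    333 ms windows divided by the summed deviations of each central
    point from the chord joining its window's edges."""
    w = max(int(round(window_s * fps)) // 2, 1)   # half-window in frames
    travelled, deviations = 0.0, 0.0
    for i in range(w, len(xy) - w):
        p0, p1, c = xy[i - w], xy[i + w], xy[i]
        seg = xy[i - w:i + w + 1]
        travelled += np.sum(np.linalg.norm(np.diff(seg, axis=0), axis=1))
        chord = p1 - p0
        norm = np.linalg.norm(chord)
        if norm > 0:
            # Perpendicular distance from the central point to the chord.
            deviations += abs(chord[0] * (c - p0)[1]
                              - chord[1] * (c - p0)[0]) / norm
    return travelled / deviations if deviations > 0 else np.inf
```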
Under natural-gain conditions, when the fly rotates to the left, the world rotates to the right. In contrast, under the reversed-gain (RG) condition, when the fly rotates to the left, the world also rotates to the left. The RG condition induces a persistent rotation of the fly, a behavior known as circling56 (Video S4). Visual influence was defined as the difference between the probabilities of circling under RG versus natural-gain conditions (Figure S1). Under the dark condition, we artificially divided the dataset into the same trial structure as in the random-dot stimulus experiment and calculated visual influence in the same way.
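The two gain conditions reduce to the sign of the coupling between the fly's rotation and the world's rotation, as in this minimal sketch (`world_update` is an illustrative name, not FlyVRena's API):

```python
def world_update(world_angle, fly_dtheta, gain=-1.0):
    """Closed-loop update of the virtual-world orientation per frame.

    gain = -1: natural gain (fly turns left -> world rotates right).
    gain = +1: reversed gain (world co-rotates with the fly), the RG
    condition that induces circling.
    """
    return world_angle + gain * fly_dtheta
```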
Due to the delay between the estimate of the fly’s position and orientation and the update of the visual environment (∼40ms), when the fly stopped walking under the reversed-gain condition, the visual environment continued rotating with the fly's previous angular velocity for three consecutive frames. This brief “open-loop” segment was sufficient to induce a head optomotor response (Figure S7). Typically, after the stimulus stops moving, the head remains at an offset position driven by the stimulation until the fly initiates a behavior.
Head tracking
High-resolution images were filtered with a Gaussian filter (width = 16 pixels) and background subtracted. Pixels belonging to the fly were estimated with a combination of edge-detection operations and morphological transformations until a connected object with dimensions similar to those of the fly was detected. The centroid and orientation of the object were used to translate and rotate the frame to keep a vertical fly at the center of the frame. Next, a window around the head was cropped from the frame. The top of the thorax was masked out in the cropped frame, and the head was segmented using a combination of edge-detection operations and morphological transformations. Head rotation was calculated via cross-correlation with a template upright head.95 The head tracking error is the squared error between the template and the rotated head. The head angle trace was smoothed using a lowess algorithm with a window of 83ms. Frames in which the head tracking error was high (top 2.5% of frames) were removed from the original time series.
As an alternative strategy, we used DeepLabCut to estimate the rotation of the head with respect to the body of the fly.63 We trained a neural network with a subset of images where the fly was in the upright position (see above) with four labels to track: Left Eye, Right Eye, Neck, and Thorax ending. After applying the tracking network to the full dataset, we transformed the positions of the labeled points into head angles. No significant differences were observed when comparing the two methods (Video S2).
Leg position extraction
High-resolution video frames were background subtracted. Pixels belonging to the fly were estimated with a combination of edge-detection operations and morphological transformations until a connected object with dimensions similar to those of the fly was detected. The centroid and orientation of the object were used to translate and rotate the frame to keep an upright vertical fly at the center of the frame. Next, a cropped window of 400x400px around the fly was stored in a new aligned video. We used DeepLabCut to estimate the location of the following features from the high-resolution aligned videos: top of the head, left eye, right eye, neck, tip of the thorax, tip of the abdomen, tip of the left front leg, tip of the right front leg, tip of the left middle leg, tip of the right middle leg, tip of the left hind leg, and tip of the right hind leg. These features were thresholded based on the estimation-accuracy parameters and aligned to the low-resolution pre-processed data.
Inter-leg phase was calculated by dividing the time delay of the peak in the cross-covariance between leg position signals by the period measured from the autocovariance of those signals. Swing-stance transitions for each leg were detected as the local maxima (swing to stance, PEP) and local minima (stance to swing, AEP) of the leg position aligned to the body orientation. Landing position was taken as the transition from swing to stance, and a step was defined by the swing of a single tripod.
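The phase computation can be sketched as below; `interleg_phase` is an illustrative name, and the use of the first autocovariance peak as the period estimate is an assumption about the procedure.

```python
import numpy as np

def interleg_phase(leg_a, leg_b):
    """Relative phase between two legs: delay of the cross-covariance
    peak between their position signals, divided by the stepping period
    estimated from the autocovariance of the first signal."""
    a = leg_a - leg_a.mean()
    b = leg_b - leg_b.mean()
    n = len(a)
    xcov = np.correlate(a, b, mode='full')
    lag = int(np.argmax(xcov)) - (n - 1)          # delay at the xcov peak
    acov = np.correlate(a, a, mode='full')[n - 1:]
    # Period = lag of the first local maximum of the autocovariance.
    peaks = np.flatnonzero((acov[1:-1] > acov[:-2]) & (acov[1:-1] > acov[2:])) + 1
    period = int(peaks[0])
    return (lag / period) % 1.0
```

For two legs of the same tripod the phase is near 0, while legs of opposite tripods sit near 0.5, the anti-phase relationship expected from tripod coordination.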
Single step analysis
For each individual leg, we used the AEP and PEP of the leg position signal to identify swing, stance, landing, and lift-off. We calculated the deviations in body angle (the same as path deviations) as the integrated angular velocity between consecutive lift-offs. We correlated body angular deviations with the lateral and longitudinal leg displacement between lift-off and landing, and with the swing and stance times between those same consecutive lift-offs (Figures 5, S5, and S6). Leg placement correlation was calculated by multiplying leg lateral placement in consecutive steps (Figures 5, 6, and S6).
High quality steps
Only steps with identifiable stance and swing phases lasting between 16ms and 125ms, and with swing amplitudes smaller than 0.75BL, were considered for analysis, to remove tracking errors that could affect the identification of the landing position. High-quality steps comprised 68% of all detected steps.
Center of stability calculation
We calculated the landing positions of all legs and considered for analysis only steps in which all legs fulfilled the previously established criteria for step quality. Legs were considered to belong to the same triangle when they landed within 16ms of each other. The center of stability was defined as the geometric center of the triangle formed by the landing positions of the legs (Figure 5F). The reduction in posture stability was calculated as the distance, in the direction orthogonal to the fly's orientation, between the centroid of the fly and the center of stability in a particular step (Figure 5F).
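A sketch of the lateral-offset computation (Figure 5F); `stability_offset` is a hypothetical helper, and the sign convention of the orthogonal axis is an arbitrary choice.

```python
import numpy as np

def stability_offset(landing_positions, body_centroid, body_orientation):
    """Lateral offset between the body centroid and the center of
    stability (geometric center of the three landing positions of a
    tripod), measured orthogonally to the fly's orientation (radians)."""
    center_of_stability = landing_positions.mean(axis=0)
    # Unit vector orthogonal to the body axis.
    ortho = np.array([-np.sin(body_orientation), np.cos(body_orientation)])
    return float(np.dot(center_of_stability - body_centroid, ortho))
```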
Quantification and statistical analysis
We performed a two-sided Wilcoxon signed-rank test for paired groups, a two-sided Wilcoxon rank-sum test for comparisons between two independent groups, and a t test for the correlation analysis. In addition, for statistics in the cross-covariance analysis, we performed a resampling (bootstrapping-based) method in which new velocity signals were created by shuffling the temporal order of different walking bouts, distorting the signal without affecting its general statistics. Each shuffled signal underwent the same cross-covariance analysis, and the procedure was repeated 1000 times to obtain the shuffled distributions in Figures 3E and 3F.
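The bout-shuffling null can be sketched as follows; the signal names, the normalization of the cross-covariance, and the use of its peak as the test statistic are illustrative assumptions.

```python
import numpy as np

def shuffled_xcov_null(sig_a, sig_b, bout_edges, n_shuffles=1000, seed=0):
    """Null distribution for the peak cross-covariance between two
    velocity signals, obtained by shuffling the temporal order of the
    walking bouts of the first signal."""
    rng = np.random.default_rng(seed)
    bouts = [slice(s, e) for s, e in bout_edges]
    null_peaks = np.zeros(n_shuffles)
    for i in range(n_shuffles):
        order = rng.permutation(len(bouts))
        shuffled = np.concatenate([sig_a[bouts[j]] for j in order])
        n = min(len(shuffled), len(sig_b))
        a = shuffled[:n] - shuffled[:n].mean()
        b = sig_b[:n] - sig_b[:n].mean()
        # Peak of the normalized cross-covariance over all lags.
        null_peaks[i] = np.max(np.correlate(a, b, mode='full')) / n
    return null_peaks
```

The observed peak cross-covariance is then compared against, e.g., the 95th percentile of this null distribution.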
Acknowledgments
We thank Antonio Costa and Greg Stephens for assistance with the unsupervised algorithm for behavior analysis, Adrien Jouary for assistance with the SVD analysis, John Tuthill and Stephen Huston for earlier discussions on this work, and Michael Reiser and members of the lab for comments on the manuscript. We thank Kenta Asahina, Michael Dickinson, Michael Reiser, and Gerald Rubin for kindly sharing flies. This work was supported by the Champalimaud Foundation and the research infrastructure Congento, LISBOA-01-0145-FEDER-022170, co-financed by Fundação para a Ciência e Tecnologia (Portugal) and Lisboa2020, under the PORTUGAL 2020 Agreement (European Regional Development Fund). This work was also supported by Fundação para a Ciência e Tecnologia FCT PD/BD/105947/2014 (T.L.C.), by the Marie Curie Career Integration Grant PCIG13-GA-2013-618854 (M.E.C.), and by the European Research Council Starting Grant ERC-2017-STG-759782 (M.E.C.).
Author contributions
Conceptualization, T.L.C. and M.E.C.; methodology, T.L.C. and S.M.P.; investigation, T.L.C. and S.M.P.; software, T.L.C.; formal analysis, T.L.C. and S.M.P.; supervision, M.E.C.; writing, T.L.C. and M.E.C., with input from S.M.P.
Declaration of interests
M.E.C. is an advisory board member at Current Biology.
Inclusion and diversity
One or more of the authors of this paper self-identifies as a member of the LGBTQ+ community.
Published: September 8, 2021
Footnotes
Supplemental information can be found online at https://doi.org/10.1016/j.cub.2021.08.041.
Supplemental information
Data and code availability
All data reported in this paper can be found at Mendeley Data: https://doi.org/10.17632/tvw5k48b6p.2.
All original code used for behavior analysis is available at https://github.com/ChiappeLab.
Any additional information required to reanalyze the data reported in this work is available upon request from the lead contact, Eugenia Chiappe (eugenia.chiappe@neuro.fchampalimaud.org).
References
- 1.Matthis J.S., Yates J.L., Hayhoe M.M. Gaze and the Control of Foot Placement When Walking in Natural Terrain. Curr. Biol. 2018;28:1224–1233.e5. doi: 10.1016/j.cub.2018.03.008. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 2.Dickinson M.H., Farley C.T., Full R.J., Koehl M.A., Kram R., Lehman S. How animals move: an integrative view. Science. 2000;288:100–106. doi: 10.1126/science.288.5463.100. [DOI] [PubMed] [Google Scholar]
- 3.Bernstein N. Pergamon; 1967. The Coordination and Regulation of Movements. [Google Scholar]
- 4.Haith A.M., Krakauer J.W. In: Gollhofer A., Taube W., Nielsen J.B., editors. Routledge; 2012. Theoretical models of motor control and motor learning; pp. 7–28. (Routledge Handbook of Motor Control and Motor Learning). [Google Scholar]
- 5.Scott S.H. Optimal feedback control and the neural basis of volitional motor control. Nat. Rev. Neurosci. 2004;5:532–546. doi: 10.1038/nrn1427. [DOI] [PubMed] [Google Scholar]
- 6.Crapse T.B., Sommer M.A. Corollary discharge across the animal kingdom. Nat. Rev. Neurosci. 2008;9:587–600. doi: 10.1038/nrn2457. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 7.Koenderink J.J., van Doorn A.J. Facts on optic flow. Biol. Cybern. 1987;56:247–254. doi: 10.1007/BF00365219. [DOI] [PubMed] [Google Scholar]
- 8.Lehrer M., Srinivasan M.V., Zhang S.W., Horridge G.A. Motion cues provide the bee’s visual world with a third dimension. Nature. 1988;332:356–357. [Google Scholar]
- 9.Yarbus A.L. Springer US; 1967. Eye Movements and Vision. [Google Scholar]
- 10.Stryker M., Blakemore C. Saccadic and disjunctive eye movements in cats. Vision Res. 1972;12:2005–2013. doi: 10.1016/0042-6989(72)90054-5. [DOI] [PubMed] [Google Scholar]
- 11.Collett T.S., Land M.F. Visual control of flight behaviour in the hoverfly Syritta pipiens L. J. Comp. Physiol. A Neuroethol. Sens. Neural Behav. Physiol. 1975;99:1–66. [Google Scholar]
- 12.Land M.F. Motion and vision: why animals move their eyes. J. Comp. Physiol. A Neuroethol. Sens. Neural Behav. Physiol. 1999;185:341–352. doi: 10.1007/s003590050393. [DOI] [PubMed] [Google Scholar]
- 13.Hardcastle B.J., Krapp H.G. Evolution of Biological Image Stabilization. Curr. Biol. 2016;26:R1010–R1021. doi: 10.1016/j.cub.2016.08.059. [DOI] [PubMed] [Google Scholar]
- 14.Necker R. Head-bobbing of walking birds. J. Comp. Physiol. A Neuroethol. Sens. Neural Behav. Physiol. 2007;193:1177–1183. doi: 10.1007/s00359-007-0281-3. [DOI] [PubMed] [Google Scholar]
- 15.Imai T., Moore S.T., Raphan T., Cohen B. Interaction of the body, head, and eyes during walking and turning. Exp. Brain Res. 2001;136:1–18. doi: 10.1007/s002210000533. [DOI] [PubMed] [Google Scholar]
- 16.Collewijn H. Eye- and head movements in freely moving rabbits. J. Physiol. 1977;266:471–498. doi: 10.1113/jphysiol.1977.sp011778. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 17.Meyer A.F., O’Keefe J., Poort J. Two Distinct Types of Eye-Head Coupling in Freely Moving Mice. Curr. Biol. 2020;30:2116–2130.e6. doi: 10.1016/j.cub.2020.04.042. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 18.Bizzi E., Kalil R.E., Tagliasco V. Eye-head coordination in monkeys: evidence for centrally patterned organization. Science. 1971;173:452–454. doi: 10.1126/science.173.3995.452. [DOI] [PubMed] [Google Scholar]
- 19.Waespe W., Henn V. Gaze stabilization in the primate. The interaction of the vestibulo-ocular reflex, optokinetic nystagmus, and smooth pursuit. Rev. Physiol. Biochem. Pharmacol. 1987;106:37–125. [PubMed] [Google Scholar]
- 20.Meiry J.L. In: Bach-y-Rita P., Collins C., Hyde D.J.E., editors. Academic Press; 1971. Vestibular and proprioceptive stabilization of eye movement; pp. 483–496. (The Control of Eye Movements). [Google Scholar]
- 21.Hikosaka O., Maeda M. Cervical effects on abducens motoneurons and their interaction with vestibulo-ocular reflex. Exp. Brain Res. 1973;18:512–530. doi: 10.1007/BF00234135. [DOI] [PubMed] [Google Scholar]
- 22.Goldberg J., Peterson B.W. Reflex and mechanical contributions to head stabilization in alert cats. J. Neurophysiol. 1986;56:857–875. doi: 10.1152/jn.1986.56.3.857. [DOI] [PubMed] [Google Scholar]
- 23.Angelaki D.E. In: Masland R.H., Albright T.D., editors. Academic Press; 2008. The VOR: A Model for Visual-Motor Plasticity; pp. 359–370. (The Senses: A comprehensive reference. Volume 2). [Google Scholar]
- 24.Cullen K.E. Vestibular processing during natural self-motion: implications for perception and action. Nat. Rev. Neurosci. 2019;20:346–363. doi: 10.1038/s41583-019-0153-1. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 25.Blaj G., van Hateren J.H. Saccadic head and thorax movements in freely walking blowflies. J. Comp. Physiol. A Neuroethol. Sens. Neural Behav. Physiol. 2004;190:861–868. doi: 10.1007/s00359-004-0541-4. [DOI] [PubMed] [Google Scholar]
- 26.Geurten B.R.H., Jähde P., Corthals K., Göpfert M.C. Saccadic body turns in walking Drosophila. Front. Behav. Neurosci. 2014;8:365. doi: 10.3389/fnbeh.2014.00365. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 27.Huston S.J., Krapp H.G. Nonlinear integration of visual and haltere inputs in fly neck motor neurons. J. Neurosci. 2009;29:13097–13105. doi: 10.1523/JNEUROSCI.2915-09.2009. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 28.Preuss T., Hengstenberg R. Structure and kinematics of the prosternal organs and their influence on head position in the blowfly, Calliphora erythrocephala. J. Comp. Physiol. A Neuroethol. Sens. Neural Behav. Physiol. 1992;264:483–493. [Google Scholar]
- 29.Horn E., Lang H.G. Positional head reflexes and the role of the prosternal organ in the walking fly, Calliphora erythrocephala. J. Comp. Physiol. A Neuroethol. Sens. Neural Behav. Physiol. 1978;126:137–146. [Google Scholar]
- 30.Hengstenberg R. Multisensory control in insect oculomotor systems. Rev. Oculomot. Res. 1993;5:285–298. [PubMed] [Google Scholar]
- 31.Kress D., Egelhaaf M. Head and body stabilization in blowflies walking on differently structured substrates. J. Exp. Biol. 2012;215:1523–1532. doi: 10.1242/jeb.066910. [DOI] [PubMed] [Google Scholar]
- 32.Faisal A.A., Selen L.P.J., Wolpert D.M. Noise in the nervous system. Nat. Rev. Neurosci. 2008;9:292–303. doi: 10.1038/nrn2258. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 33.Laurent G. Sensory control of locomotion in insects. Curr. Opin. Neurobiol. 1991;1:601–604. doi: 10.1016/s0959-4388(05)80035-2. [DOI] [PubMed] [Google Scholar]
- 34.Pearson K.G. Generating the walking gait: role of sensory feedback. Prog. Brain Res. 2004;143:123–129. doi: 10.1016/S0079-6123(03)43012-4. [DOI] [PubMed] [Google Scholar]
- 35.Tuthill J.C., Wilson R.I. Mechanosensation and Adaptive Motor Control in Insects. Curr. Biol. 2016;26:R1022–R1038. doi: 10.1016/j.cub.2016.06.070. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 36.Büschges A., Manira A.E. Sensory pathways and their modulation in the control of locomotion. Curr. Opin. Neurobiol. 1998;8:733–739. doi: 10.1016/s0959-4388(98)80115-3. [DOI] [PubMed] [Google Scholar]
- 37.Collewijn H., Van der Steen J. In: Glickstein M., Yeo C., Stein J., editors. Plenum Press; 1986. Visual Control of the Vestibulo-ocular Reflex in the Rabbit. A Multi-level Interaction; pp. 277–292. (Cerebellum and neural plasticity). [Google Scholar]
- 38.Schwyn D.A., Heras F.J.H., Bolliger G., Parsons M.M., Krapp H.G., Tanaka R.J. IFAC; 2011. Interplay between feedback and feedforward control in fly gaze stabilization. [Google Scholar]
- 39.Kretschmer F., Tariq M., Chatila W., Wu B., Badea T.C. Comparison of optomotor and optokinetic reflexes in mice. J. Neurophysiol. 2017;118:300–316. doi: 10.1152/jn.00055.2017. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 40.Orger M.B., Smear M.C., Anstis S.M., Baier H. Perception of Fourier and non-Fourier motion by larval zebrafish. Nat. Neurosci. 2000;3:1128–1133. doi: 10.1038/80649. [DOI] [PubMed] [Google Scholar]
- 41.Robinson D.A. The use of control systems analysis in the neurophysiology of eye movements. Annu. Rev. Neurosci. 1981;4:463–503. doi: 10.1146/annurev.ne.04.030181.002335. [DOI] [PubMed] [Google Scholar]
- 42.Götz K.G., Wenking H. Visual control of locomotion in the walking fruitflyDrosophila. J. Comp. Physiol. 1973;85:235–266. [Google Scholar]
- 43.Poggio T., Reichardt W. A theory of the pattern induced flight orientation of the fly Musca domestica. Kybernetik. 1973;12:185–203. doi: 10.1007/BF00270572. [DOI] [PubMed] [Google Scholar]
- 44.Roth E., Reiser M.B., Dickinson M.H., Cowan N.J. IEEE; 2012. A task-level model for optomotor yaw regulation in Drosophila melanogaster: A frequency-domain system identification approach; pp. 3721–3726. (2012 IEEE 51st Proceedings of the IEEE Conference Decision and Control). [Google Scholar]
- 45.Götz K.G. The optomotor equilibrium of the Drosophila navigation system. J. Comp. Physiol. A Neuroethol. Sens. Neural Behav. Physiol. 1975;99:187–210. [Google Scholar]
- 46.Patla A.E. Understanding the roles of vision in the control of human locomotion. Gait Posture. 1997;5:54–69. [Google Scholar]
- 47.Warren W.H., Jr., Kay B.A., Zosh W.D., Duchon A.P., Sahuc S. Optic flow is used to control human walking. Nat. Neurosci. 2001;4:213–216. doi: 10.1038/84054. [DOI] [PubMed] [Google Scholar]
- 48.Collett T., Nalbach H.O., Wagner H. Visual stabilization in arthropods. Rev. Oculomot. Res. 1993;5:239–263. [PubMed] [Google Scholar]
- 49.Wolf R., Heisenberg M. Visual control of straight flight in Drosophila melanogaster. J. Comp. Physiol. A Neuroethol. Sens. Neural Behav. Physiol. 1990;167:269–283. doi: 10.1007/BF00188119. [DOI] [PubMed] [Google Scholar]
- 50.Skavenski A.A., Robinson D.A. Role of abducens neurons in vestibuloocular reflex. J. Neurophysiol. 1973;36:724–738. doi: 10.1152/jn.1973.36.4.724. [DOI] [PubMed] [Google Scholar]
- 51.Schnell B., Weir P.T., Roth E., Fairhall A.L., Dickinson M.H. Cellular mechanisms for integral feedback in visually guided behavior. Proc. Natl. Acad. Sci. USA. 2014;111:5700–5705. doi: 10.1073/pnas.1400698111. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 52.Green J., Vijayan V., Mussells Pires P., Adachi A., Maimon G. A neural heading estimate is compared with an internal goal to guide oriented navigation. Nat. Neurosci. 2019;22:1460–1468. doi: 10.1038/s41593-019-0444-x. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 53.Linneweber G.A., Andriatsilavo M., Dutta S.B., Bengochea M., Hellbruegge L., Liu G., Ejsmont R.K., Straw A.D., Wernet M., Hiesinger P.R. A neurodevelopmental origin of behavioral individuality in the Drosophila visual system. Science. 2020;367:1112–1119. doi: 10.1126/science.aaw7182. [DOI] [PubMed] [Google Scholar]
- 54.Cruz T.L. Development and test of a virtual reality system for tethered walking Drosophila. Master's thesis. Técnico Lisboa; 2013.
- 55.Bath D.E., Stowers J.R., Hörmann D., Poehlmann A., Dickson B.J., Straw A.D. FlyMAD: rapid thermogenetic control of neuronal activity in freely walking Drosophila. Nat. Methods. 2014;11:756–762. doi: 10.1038/nmeth.2973. [DOI] [PubMed] [Google Scholar]
- 56.von Holst E., Mittelstaedt H. The principle of reafference: Interactions between the central nervous system and the peripheral organs. Naturwissenschaften. 1950;37:464–476. [Google Scholar]
- 57.Katsov A.Y., Freifeld L., Horowitz M., Kuehn S., Clandinin T.R. Dynamic structure of locomotor behavior in walking fruit flies. eLife. 2017;6:1–32. doi: 10.7554/eLife.26410. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 58.Costa A.C., Ahamed T., Stephens G.J. Adaptive, locally linear models of complex dynamics. Proc. Natl. Acad. Sci. USA. 2019;116:1501–1510. doi: 10.1073/pnas.1813476116. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 59.Mearns D.S., Donovan J.C., Fernandes A.M., Semmelhack J.L., Baier H. Deconstructing Hunting Behavior Reveals a Tightly Coupled Stimulus-Response Loop. Curr. Biol. 2020;30:54–69.e9. doi: 10.1016/j.cub.2019.11.022. [DOI] [PubMed] [Google Scholar]
- 60.Robie A.A., Straw A.D., Dickinson M.H. Object preference by walking fruit flies, Drosophila melanogaster, is mediated by vision and graviperception. J. Exp. Biol. 2010;213:2494–2506. doi: 10.1242/jeb.041749. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 61.Mendes C.S., Bartos I., Akay T., Márka S., Mann R.S. Quantification of gait parameters in freely walking wild type and sensory deprived Drosophila melanogaster. eLife. 2013;2:e00231. doi: 10.7554/eLife.00231. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 62.DeAngelis B.D., Zavatone-Veth J.A., Clark D.A. The manifold structure of limb coordination in walking Drosophila. eLife. 2019;8:1–34. doi: 10.7554/eLife.46409. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 63.Mathis A., Mamidanna P., Cury K.M., Abe T., Murthy V.N., Mathis M.W., Bethge M. DeepLabCut: markerless pose estimation of user-defined body parts with deep learning. Nat. Neurosci. 2018;21:1281–1289. doi: 10.1038/s41593-018-0209-y. [DOI] [PubMed] [Google Scholar]
- 64.Wosnitza A., Bockemühl T., Dübbert M., Scholz H., Büschges A. Inter-leg coordination in the control of walking speed in Drosophila. J. Exp. Biol. 2013;216:480–491. doi: 10.1242/jeb.078139. [DOI] [PubMed] [Google Scholar]
- 65.Szczecinski N.S., Bockemühl T., Chockley A.S., Büschges A. Static stability predicts the continuum of interleg coordination patterns in Drosophila. J. Exp. Biol. 2018;221:jeb189142. doi: 10.1242/jeb.189142. [DOI] [PubMed] [Google Scholar]
- 66.Ting L.H., Blickhan R., Full R.J. Dynamic and static stability in hexapedal runners. J. Exp. Biol. 1994;197:251–269. doi: 10.1242/jeb.197.1.251. [DOI] [PubMed] [Google Scholar]
- 67.Bauby C.E., Kuo A.D. Active control of lateral balance in human walking. J. Biomech. 2000;33:1433–1440. doi: 10.1016/s0021-9290(00)00101-9. [DOI] [PubMed] [Google Scholar]
- 68.Jindrich D.L., Full R.J. Dynamic stabilization of rapid hexapedal locomotion. J. Exp. Biol. 2002;205:2803–2823. doi: 10.1242/jeb.205.18.2803. [DOI] [PubMed] [Google Scholar]
- 69.Maisak M.S., Haag J., Ammer G., Serbe E., Meier M., Leonhardt A., Schilling T., Bahl A., Rubin G.M., Nern A. A directional tuning map of Drosophila elementary motion detectors. Nature. 2013;500:212–216. doi: 10.1038/nature12320. [DOI] [PubMed] [Google Scholar]
- 70.Henderson J.M. Gaze Control as Prediction. Trends Cogn. Sci. 2017;21:15–23. doi: 10.1016/j.tics.2016.11.003. [DOI] [PubMed] [Google Scholar]
- 71.Borst A. Fly visual course control: behaviour, algorithms and circuits. Nat. Rev. Neurosci. 2014;15:590–599. doi: 10.1038/nrn3799. [DOI] [PubMed] [Google Scholar]
- 72.Strauss R., Schuster S., Götz K.G. Processing of artificial visual feedback in the walking fruit fly Drosophila melanogaster. J. Exp. Biol. 1997;200:1281–1296. doi: 10.1242/jeb.200.9.1281. [DOI] [PubMed] [Google Scholar]
- 73.Tammero L.F., Frye M.A., Dickinson M.H. Spatial organization of visuomotor reflexes in Drosophila. J. Exp. Biol. 2004;207:113–122. doi: 10.1242/jeb.00724. [DOI] [PubMed] [Google Scholar]
- 74.Varju D. Stationary and dynamic responses during visual edge fixation by walking insects. Nature. 1975;255:330–332. doi: 10.1038/255330a0. [DOI] [PubMed] [Google Scholar]
- 75.Wolf R., Heisenberg M. Visual orientation in motion-blind flies is an operant behaviour. Nature. 1986;323:154–156. [Google Scholar]
- 76.Pick B. Visual flicker induces orientation behavior in the fly Musca. Z. Naturforsch. Naturwiss. 1974;29:310–312. [Google Scholar]
- 77.Bahl A., Ammer G., Schilling T., Borst A. Object tracking in motion-blind flies. Nat. Neurosci. 2013;16:730–738. doi: 10.1038/nn.3386. [DOI] [PubMed] [Google Scholar]
- 78.Lönnendonker U., Scharstein H. Fixation and optomotor response of walking colorado beetles: interaction with spontaneous turning tendencies. Physiol. Entomol. 1991;16:65–76. [Google Scholar]
- 79.Borst A., Haag J. Neural networks in the cockpit of the fly. J. Comp. Physiol. A Neuroethol. Sens. Neural Behav. Physiol. 2002;188:419–437. doi: 10.1007/s00359-002-0316-8. [DOI] [PubMed] [Google Scholar]
- 80.Wu M., Nern A., Williamson W.R., Morimoto M.M., Reiser M.B., Card G.M., Rubin G.M. Visual projection neurons in the Drosophila lobula link feature detection to distinct behavioral programs. eLife. 2016;5:1–43. doi: 10.7554/eLife.21022. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 81.Panser K., Tirian L., Schulze F., Villalba S., Jefferis G.S.X.E., Bühler K., Straw A.D. Automatic Segmentation of Drosophila Neural Compartments Using GAL4 Expression Data Reveals Novel Visual Pathways. Curr. Biol. 2016;26:1943–1954. doi: 10.1016/j.cub.2016.05.052. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 82.Fujiwara T., Cruz T., Bohnslav J., Chiappe M.E. A faithful internal representation of walking movements in the Drosophila visual system. Nat. Neurosci. 2017;20:72–81. doi: 10.1038/nn.4435. [DOI] [PubMed] [Google Scholar]
- 83.Kim A.J., Fenk L.M., Lyu C., Maimon G. Quantitative Predictions Orchestrate Visual Signaling in Drosophila. Cell. 2017;168:280–294. doi: 10.1016/j.cell.2016.12.005. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 84.Seelig J.D., Chiappe M.E., Lott G.K., Dutta A., Osborne J.E., Reiser M.B., Jayaraman V. Two-photon calcium imaging from head-fixed Drosophila during optomotor walking behavior. Nat. Methods. 2010;7:535–540. doi: 10.1038/nmeth.1468. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 85.Chiappe M.E., Seelig J.D., Reiser M.B., Jayaraman V. Walking modulates speed sensitivity in Drosophila motion vision. Curr. Biol. 2010;20:1470–1475. doi: 10.1016/j.cub.2010.06.072. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 86.Namiki S., Dickinson M.H., Wong A.M., Korff W., Card G.M. The functional organization of descending sensory-motor pathways in Drosophila. eLife. 2018;7:1–50. doi: 10.7554/eLife.34272. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 87.Cruse H. What mechanisms coordinate leg movement in walking arthropods? Trends Neurosci. 1990;13:15–21. doi: 10.1016/0166-2236(90)90057-h. [DOI] [PubMed] [Google Scholar]
- 88.Dean J. Control of leg protraction in the stick insect: a targeted movement showing compensation for externally applied forces. J. Comp. Physiol. A Neuroethol. Sens. Neural Behav. Physiol. 1984;155:771–781. [Google Scholar]
- 89.Berg E.M., Hooper S.L., Schmidt J., Büschges A. A leg-local neural mechanism mediates the decision to search in stick insects. Curr. Biol. 2015;25:2012–2017. doi: 10.1016/j.cub.2015.06.017. [DOI] [PubMed] [Google Scholar]
- 90.Asahina K., Watanabe K., Duistermars B.J., Hoopfer E., González C.R., Eyjólfsdóttir E.A., Perona P., Anderson D.J. Tachykinin-expressing neurons control male-specific aggressive arousal in Drosophila. Cell. 2014;156:221–235. doi: 10.1016/j.cell.2013.11.045. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 91.Ofstad T.A., Zuker C.S., Reiser M.B. Visual place learning in Drosophila melanogaster. Nature. 2011;474:204–207. doi: 10.1038/nature10131. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 92.Götz K.G., Biesinger R. Centrophobism in Drosophila melanogaster. J. Comp. Physiol. A Neuroethol. Sens. Neural Behav. Physiol. 1985;156:329–337. [Google Scholar]
- 93.Tammero L.F., Dickinson M.H. The influence of visual landscape on the free flight behavior of the fruit fly Drosophila melanogaster. J. Exp. Biol. 2002;205:327–343. doi: 10.1242/jeb.205.3.327. [DOI] [PubMed] [Google Scholar]
- 94.Zeil J. The territorial flight of male houseflies (Fannia canicularis L.) Behav. Ecol. Sociobiol. 1986;19:213–219. [Google Scholar]
- 95.Berman G.J., Choi D.M., Bialek W., Shaevitz J.W. Mapping the stereotyped behaviour of freely moving fruit flies. J. R. Soc. Interface. 2014;11:1–9. doi: 10.1098/rsif.2014.0672. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 96.Arthur B.J., Sunayama-Morita T., Coen P., Murthy M., Stern D.L. Multi-channel acoustic recording and automated analysis of Drosophila courtship songs. BMC Biol. 2013;11:11. doi: 10.1186/1741-7007-11-11. [DOI] [PMC free article] [PubMed] [Google Scholar]
Data Availability Statement
All data reported in this paper can be found at Mendeley Data: https://doi.org/10.17632/tvw.
All original code used for behavior analysis is available at https://github.com/ChiappeLab.
Any additional information required to reanalyze the data reported in this work is available from the Lead Contact, Eugenia Chiappe (eugenia.chiappe@neuro.fchampalimaud.org), upon request.