Significance
Many animals, including humans, move their eyes, head, and/or body to stabilize visual gaze. Revealing how the nervous system relays visual information to control the eyes, head, and/or body together is critical to understanding gaze stabilization. Here, we studied how tethered, flying fruit flies combine head movements and wing steering efforts to stabilize gaze. By combining experimental and theoretical approaches we show that head movements shape visual information, which in turn increases the strength of wing steering responses. Moreover, head movements preceded and coordinated downstream wing steering responses, establishing a temporal order. Taken together, head movements and their influence on visual inputs must be considered to fully appreciate the potential of visual information processing for flight control.
Keywords: Drosophila, active sensing, control, synergy, sensorimotor integration
Abstract
Animals use active sensing to respond to sensory inputs and guide future motor decisions. In flight, flies generate a pattern of head and body movements to stabilize gaze. How the brain relays visual information to control head and body movements, and how active head movements influence downstream motor control, remain elusive. Using a control theoretic framework, we studied the optomotor gaze stabilization reflex in tethered flight and quantified how head movements stabilize visual motion and shape wing steering efforts in fruit flies (Drosophila). By shaping visual inputs, head movements increased the gain of wing steering responses and coordination between stimulus and wings, pointing to a tight coupling between head and wing movements. Head movements followed the visual stimulus in as little as 10 ms—a delay similar to the human vestibulo-ocular reflex—whereas wing steering responses lagged by more than 40 ms. This timing difference suggests a temporal order in the flow of visual information such that the head filters visual information eliciting downstream wing steering responses. Head fixation significantly decreased the mechanical power generated by the flight motor by reducing wingbeat frequency and overall thrust. By simulating an elementary motion detector array, we show that head movements shift the effective visual input dynamic range onto the sensitivity optimum of the motion vision pathway. Taken together, our results reveal a transformative influence of active vision on flight motor responses in flies. Our work provides a framework for understanding how to coordinate moving sensors on a moving body.
Sensing is often an active process in which sensors themselves move. For example, primate eyes perform saccades to center the visual field onto the fovea (1). Active sensing can be broadly defined as sensing that uses self-generated energy and therefore includes body, head, and/or eye movements in visual animals, whether of reflexive or volitional origin (2). Insects such as flies actively sample a scene with a pattern of head and body movements that together control gaze (3–5). Since flies cannot move their eyes independently of the head—with the notable exception of rhabdomere contractions (6, 7)—flies must rotate their whole head and/or rotate their body to shift gaze. In flying insects, compensatory head movements maintain visual equilibrium via wide-field image stabilization, enabling postural control and flight stability (8–10). In addition, head movements are involved in object tracking and search tasks (11–13). For instance, during the presentation of simultaneous figure and ground stimuli, fruit fly (Drosophila) head movements compensate for wide-field, ground motion whereas wing steering responses correlate with both wide- and small-field motion (14, 15)—“small-field” here referring to a small moving figure distinct from wide-field, ground motion. Additionally, on a static background Drosophila and Calliphora (blow flies) will track small-field motion via head movements (15, 16). More recent work in Drosophila showed that head and wing steering responses are correlated during the presentation of a wide-field stimulus (17–19). While we have a good understanding of stabilizing head movement reflexes due to concomitant body movements in both tethered and free flight (9), a functional understanding of how head motor control shapes wing motor control—and therefore body movement—is lacking.
Flying flies must somehow relay a single stream of visual information between two motor outputs (neck and wings) to stabilize their heading. What does the visual input shaped by head movements do for downstream motor control? Visual signals contributing to neck and wing steering responses must be integrated for effective gaze stabilization in the presence of self-generated feedback (9, 20). Elucidating the coordination between the head and wing steering responses is critical because it shapes the visual inputs that enter the brain. As these visuomotor processes are inescapably closed-loop, control theoretic approaches can provide a valuable computational framework (21, 22).
Here we use such a control theoretic framework to study the coordination between the head and wings during gaze stabilization in tethered Drosophila flight. We quantified how head movements are modulated by visual motion and are coordinated with wing steering responses. We abolished head movements by mechanically fixing the head and compared wing motor outputs under head-free and head-fixed conditions. We modeled the gaze stabilization reflex as a single-input–multiple-output (SIMO) system which incorporates movement of the visual scene as the input and head and wing movements as outputs. By presenting tethered flies with oscillating gratings of varying amplitudes and frequencies, we generated empirical frequency response functions (FRFs) that quantify how head and wing responses are modulated by visual inputs. Here individual FRFs make no implicit assumption of linearity and time invariance (LTI) but rather graphically represent describing functions approximating the full nonlinear system. By comparing head-free and head-fixed wing steering responses, we revealed that head movements increased wing gain and strongly coordinated wing steering responses. The wing steering responses consistently lagged the head response, thereby pointing to a head-to-wing temporal order in the visuomotor loop. Taken together, our results show that head movements in flies profoundly shape and coordinate wing steering responses in flight.
Results
Head Movements Shape the Visual Stimulus.
We began by quantifying the extent to which head movements compensate for visual motion during yaw gaze stabilization where both the wings and head are free to move but the body is fixed (Fig. 1A). We predicted that the head should compensate for a moving panorama (15, 18) but that it may only operate within a narrow range of frequencies (or mean angular velocities). We employed a range of visual stimuli to test this prediction (Materials and Methods). As expected, both the head and wings responded to the visual input across all experiments (Fig. 1 B, i, SI Appendix, Figs. S1, S3, and S4, and Movie S1).
Fig. 1.
Head movements shape visual inputs. (A) Virtual reality flight arena for visuomotor system identification experiments. The fly is rigidly tethered to a tungsten pin and placed in the center of the arena. Wing steering responses were measured by projecting a shadow of the wings over two infrared sensors. Head movements were acquired with a high-speed camera (200 frames per second) placed above the fly. The video frame shows the head vector (blue) and left and right WBAs (red). (B, i) Mean head responses (blue) to SOS stimulus (black). (B, ii) Mean wing steering responses (red) to the same SOS stimulus for head-free (Top) and head-fixed (Bottom) flies. Head-free: n = 10 flies; head-fixed: n = 9 flies; 10 s of the 20-s trial is shown. (C) Mean visual position error (red) during chirp presentation (black) due to head movements (blue). Error = stimulus position (S) − head angle (H). Amplitude of chirp = 15°. n = 13 flies. (D) Same as C but in frequency domain. (E) Compensation error magnitude for SS experiments vs. mean temporal frequency for four different amplitudes (7.5 to 18.75°). Dotted line: perfect compensation with gain = 1 (error = 0) and phase = 0°. Amplitude = 7.5°, n = 11 flies. Amplitude = 11.25°, n = 11 flies. Amplitude = 15°, n = 10 flies. Amplitude = 18.75°, n = 11 flies. Shaded region: ±1 SE.
One way to quantify the contribution of head movements in compensating for the visual stimulus is to plot the error signal E, which we define as the visual stimulus S minus the head movements H. An error of zero would indicate that the head fully compensates for the stimulus. Since the error was nonzero, head movements stabilized gaze partially and within a defined range of frequencies (Fig. 1 C–E). For periodic stimuli, we can define the quality of compensation by quantifying gain and phase. Gain represents the ratio of amplitudes between input and output in the frequency domain (Materials and Methods). For example, a gain of 0.5 for a stimulus oscillating ±15° means the fly head is oscillating at ±7.5°; a gain of 1 corresponds to perfect compensation. Phase represents the difference in timing between two signals. We can also quantify compensation performance by combining gain and phase into a single metric, complex gain (SI Appendix, Fig. S2 A–F), with a gain and phase of 1 and 0, respectively, indicating perfect compensation:
[1] G*(f) = g(f)·e^{iφ(f)},
where i is the imaginary number, g the gain, and φ the phase. To measure how flies deviate from perfect compensation, we can compute a compensation error, which is defined as
[2] ε(f) = |1 − G*_H(f)|,
where G*_H represents the complex gain of the fly head movements (23). Drosophila stabilized the stimulus with a compensation error of 0.4 at best, again suggesting that the head partially stabilizes visual motion (Fig. 1E). The head was most effective within a range of mean temporal frequency of ∼1 to 4 Hz, or at low to intermediate velocities. For constant-velocity visual motion, the temporal frequency c is expressed as c = v/λ, where v is the speed and λ the spatial wavelength of the pattern. For sinusoidal inputs, the speed of visual motion is not constant over time, so we normalized based on the mean temporal frequency, expressed as c = 4Af/λ, where A is the amplitude of the sinusoid and f is the frequency.
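The quantities above can be made concrete in a few lines. The Python sketch below (our own minimal illustration; variable names are not from the original analysis code) implements the complex gain of Eq. 1, the compensation error of Eq. 2, and the mean-temporal-frequency normalization c = 4Af/λ:

```python
import numpy as np

def complex_gain(gain, phase_deg):
    """Combine gain and phase into a single complex number (Eq. 1)."""
    return gain * np.exp(1j * np.deg2rad(phase_deg))

def compensation_error(gain, phase_deg):
    """Distance from perfect compensation, gain = 1 and phase = 0 (Eq. 2)."""
    return np.abs(1 - complex_gain(gain, phase_deg))

def mean_temporal_frequency(amplitude_deg, freq_hz, wavelength_deg=30.0):
    """Mean temporal frequency c = 4*A*f/lambda for a sinusoidal stimulus."""
    return 4.0 * amplitude_deg * freq_hz / wavelength_deg

# Perfect compensation: error = 0
print(compensation_error(1.0, 0.0))        # 0.0
# No head movement at all (gain = 0): error = 1
print(compensation_error(0.0, 0.0))        # 1.0
# A 15-deg sinusoid at 2 Hz on a 30-deg grating has a mean temporal frequency of 4 Hz
print(mean_temporal_frequency(15.0, 2.0))  # 4.0
```

Note that a head with perfect gain but a large phase lag still yields a nonzero compensation error, which is why gain and phase must be combined before assessing compensation.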
To further quantify the contribution of head movements in stabilizing gaze, we calculated the FRF across stimulus frequencies (Fig. 2 A and B). The computed FRF for sum-of-sines (SOS) experiments showed a relatively flat head gain across frequencies (Fig. 2B). In addition, the head was strongly phase-locked to the stimulus, with the phase increasing with increasing frequencies (Fig. 2B and SI Appendix, Fig. S5A). At face value, the head FRF resembles the response of a high-pass filter (Fig. 2B). However, the increase in head gain with frequency may instead be indicative of amplitude sensitivity of the head, supporting the possibility of nonlinearities in the head response (Materials and Methods). To reveal potential nonlinearities of the head response, we presented chirp and single-sine (SS) stimuli spanning multiple oscillation amplitudes. The FRF gain computed from chirp and SS stimuli had the highest gain between 1 and 4 Hz and a sharp drop-off beyond this range, as expected due to visual bandwidth limits (SI Appendix, Figs. S2–S4) (24). The chirp and SS experiments thereby provide an estimate of the full bandwidth of the head response. The chirp and SS FRFs were sensitive to stimulus amplitude, suggesting nonlinearities in the head response (SI Appendix, Figs. S3B and S4B). These nonlinearities can be partly attributed to the angular excursion limits of the neck (∼±15°). Additionally, the fly’s visual system response is attenuated at high velocities—with a cutoff near 500°⋅s−1 for yaw steering responses (24)—thus the frequency roll-offs of the FRFs were dependent on the amplitude of the signal. The average time domain response of the head and wings corroborated the input–output relationship between the visual stimulus and head/wing motor outputs (SI Appendix, Figs. S1, S3A, and S4A). Taken together, our results suggest that Drosophila can reduce retinal slip speed via head movements in yaw by up to 60%.
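An empirical FRF of this kind can be estimated by taking the ratio of the output and input Fourier coefficients at each frequency contained in the stimulus. The sketch below (an illustrative reconstruction under our own assumptions, not the authors' analysis pipeline) recovers the gain and phase of a synthetic response that is attenuated by 0.5 and delayed by 25 ms:

```python
import numpy as np

def frf_at_freqs(stimulus, response, fs, freqs_hz):
    """Estimate gain and phase (deg) of response relative to stimulus at
    selected frequencies using the discrete Fourier transform."""
    n = len(stimulus)
    f = np.fft.rfftfreq(n, d=1.0 / fs)
    S = np.fft.rfft(stimulus)
    R = np.fft.rfft(response)
    gains, phases = [], []
    for f0 in freqs_hz:
        k = np.argmin(np.abs(f - f0))   # bin nearest the stimulus frequency
        H = R[k] / S[k]                 # complex frequency response at f0
        gains.append(np.abs(H))
        phases.append(np.angle(H, deg=True))
    return np.array(gains), np.array(phases)

# Synthetic check: response is the stimulus scaled by 0.5 and delayed by 25 ms
fs, T = 200.0, 20.0
t = np.arange(0, T, 1.0 / fs)
stim = 15.0 * np.sin(2 * np.pi * 1.0 * t)
resp = 7.5 * np.sin(2 * np.pi * 1.0 * (t - 0.025))
gain, phase = frf_at_freqs(stim, resp, fs, [1.0])
print(round(gain[0], 2), round(phase[0], 1))  # 0.5 -9.0 (25 ms lag at 1 Hz)
```

For an SOS stimulus, the same ratio is evaluated only at the frequencies deliberately placed in the input, which is what makes the SOS design efficient for system identification.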
Fig. 2.
Head movements increase wing gain and stimulus–wing coherence. (A) Control diagram for head-fixed (Top) and head-free (Bottom). In both cases, the body is fixed. Head-fixed flies operate in open-loop and experience the visual stimulus S. In contrast, head-free flies can shape the visual stimulus S by active head movements (H), generating an error signal E. In both cases, the wing steering responses are open-loop. M: mechanics when head is fixed. M′: mechanics when the head is free. Dotted line: replay error from closed-loop to open-loop system. (B) Head (Left) and wing (Right) FRFs for head-free and head-fixed flies measured from SOS experiments. The head and wings display similar tuning in gain, but the wings lag the head in phase. Head movements increase the open-loop gain of the wings (E → W). Shaded region: ±1 SE. Head-free: n = 10 flies. Head-fixed: n = 9 flies. (C) Stimulus–head (black), stimulus–wing (ΔWBA) coherence for head-free (green) and head-fixed (red) flies and head–wing coherence (blue) measured from SOS experiments. The head displays consistently higher coherence than the wings and head fixation decreases wing coherence across all frequencies. Error bars: ±1 SE. (D) Head–wing coherence for SOS (brown) and static (blue) stimuli. Visual motion increases coherence at the frequencies present in the SOS stimulus relative to the static stimulus. Shaded region: ±1 SE. For C and D: head-free: n = 10 flies, head-fixed: n = 9 flies, static stimulus: n = 9 flies.
Head Movements Increase Wing Gain and Coordinate Wing Steering Responses.
Head movements partially stabilize gaze, but do they influence downstream motor centers? To address this question, we measured the wing FRFs of head-free and head-fixed flies (Fig. 2B). When the head is fixed, head movements are abolished such that the visual stimulus-to-wing transform is simply the multiplication of individual systems (visual system V, neural controller C, and mechanics M) in the frequency domain (VCM; Materials and Methods) (Fig. 2A). We can refer to the head-fixed, rigid tether condition as “open-loop,” meaning that any movement of the tethered fly’s wings in response to a visual stimulus has no effect on the visual input (Fig. 2A). By contrast, the visual input experienced by a head-free fly is filtered or shaped by head movements into an error signal E (Fig. 2A). In this case head movements are “closed loop”—even though the body itself is rigidly tethered and therefore open-loop—and this system is mathematically distinct from the head-fixed case (Materials and Methods).
To determine whether head movements influence downstream wing steering responses, we compared the stimulus-to-wing (S → W) transforms, with the change in wingbeat amplitude (ΔWBA) as the wing output, for head-free and head-fixed flies. We hypothesized S → W to be different for head-free and head-fixed flies because head movements could be shaping (filtering) the visual input and thereby modulating S → W. Interestingly, the S → W gain changed modestly between conditions, suggesting that there is some influence of head movements on wing steering efforts. The neural controller C (brain) could somehow be modified between the head-free and head-fixed case, suggesting a nonlinearity introduced by head movements (Fig. 2B and SI Appendix, Fig. S5B). To identify this possible nonlinearity, we quantified how the head might shape the input S. For the head-free condition, one might be tempted to compute S → W directly, but doing so ignores the role of the head in shaping S. Indeed, as head movements shape S into the error E (Figs. 1D and 2A), wing steering responses might more closely follow the error signal E rather than the stimulus S itself, particularly if the wing output lags the head output in time (discussed below). To test this hypothesis, we computed E → W directly, which mathematically is an open-loop transform as W does not influence the visual input. E → W doubled the gain when compared to S → W, suggesting that active head movements increase the gain of downstream wing steering responses. In other words, computing the S → W gain underestimates the actual wing gain in the sensorimotor loop. Because the head is shifting the visual stimulus into the lower stimulus velocity range across a range of frequencies (Fig. 1 D and E), the input E in the E → W FRF is therefore smaller. Because gain is defined as the ratio of output to input amplitude, a smaller value of E results in an increased gain. With respect to phase, head movements caused wing motion to be marginally more in-phase with the stimulus, particularly at higher frequencies (or velocities).
These results were consistent for both chirp and SS stimuli (SI Appendix, Fig. S5 B and C). The FRFs of wing steering responses to SS and chirp stimuli resembled a band-pass filter, consistent with previous findings (25), as chirp and SS FRFs were sensitive to stimulus velocity (SI Appendix, Figs. S3 and S4). Altogether, these results show that head movements have a significant influence on downstream flight motor centers.
To further characterize the influence of head movements on wing steering responses, we computed the coherence between the stimulus and the head or wing steering responses for both head-free and head-fixed flies. Coherence is a measure of power transfer between input and output signals and can be used to estimate correlation between signals in the frequency domain (see Materials and Methods for an explanation of coherence). Interestingly, stimulus–head (S → H) coherence was much higher than stimulus–wing (S → W) coherence, suggesting that head movements are more linearly correlated with the stimulus than ΔWBA (Fig. 2C). Head movements significantly increased S → W coherence compared to the head-fixed case: head fixation approximately halved the coherence of wing steering responses with the stimulus (Fig. 2C). Head–wing (H → W) coherence was marginally higher than S → W coherence, suggesting that wing steering responses are more correlated with head movements than with the stimulus itself. These results were consistent across chirp and SS stimuli (SI Appendix, Fig. S6 A and B). To further establish the relationship between head and wing steering responses, we computed head–wing coherence under a moving and a static (stationary) background with the same spatial wavelength (30°). In the presence of a static background, flies move their head and steer their body spontaneously (12). Overall, a moving stimulus significantly increased head–wing coherence at frequencies present within the SOS stimulus, suggesting that visual motion facilitates more correlated head–wing motion (Fig. 2D). Additionally, the variability in head–wing coherence was much greater in the presence of a static background, pointing to strong head–wing coupling in the presence of wide-field motion. Therefore, a moving stimulus can significantly increase the coupling between the head and wing responses. This suggests that the role of the head and wing movements may be highly task-dependent.
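Magnitude-squared coherence of this kind is routinely computed with Welch-style spectral estimates. The synthetic example below (our own illustration; the signal and noise levels are arbitrary choices, not measured values) shows why an output that tracks the stimulus linearly scores high coherence while a noisier output scores low:

```python
import numpy as np
from scipy.signal import coherence

rng = np.random.default_rng(1)
fs, T = 200.0, 60.0
t = np.arange(0, T, 1.0 / fs)
stim = 15.0 * np.sin(2 * np.pi * 2.0 * t)   # 2 Hz stimulus component

# Two hypothetical outputs: one strongly linearly related (scaled, delayed,
# little noise), one weakly related (small response buried in noise)
head = 7.5 * np.sin(2 * np.pi * 2.0 * (t - 0.01)) + 0.5 * rng.standard_normal(t.size)
wing = 1.5 * np.sin(2 * np.pi * 2.0 * (t - 0.04)) + 3.0 * rng.standard_normal(t.size)

f, c_sh = coherence(stim, head, fs=fs, nperseg=1024)  # stimulus-head
f, c_sw = coherence(stim, wing, fs=fs, nperseg=1024)  # stimulus-wing
k = np.argmin(np.abs(f - 2.0))  # bin nearest the 2 Hz stimulus component
print(c_sh[k] > c_sw[k])        # True: the cleaner output is more coherent
```

Because coherence is insensitive to a fixed gain and delay but degraded by noise and nonlinearity, it separates "linearly correlated with the stimulus" from "responding at all," which is the distinction drawn above.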
To determine whether the influence of head fixation on wing movements measured in the FRFs could be due to a saturation nonlinearity as a result of a change in the stimulus range—that is, when the head is fixed the stimulus is perceived to move at higher amplitude compared to the head-free case—we replayed the error signal from a representative head-free experiment to head-fixed flies (Materials and Methods, Fig. 2A, and SI Appendix, Fig. S6 C and D). For an LTI system a replay experiment should not change the FRFs between input and output (26). The head-fixed replay FRFs did not differ significantly from head-fixed (Fig. 2B), providing some assurance that the effect of active head movements on wing steering responses is not due to a change in stimulus range. However, since our system is nonlinear—as suggested by its sensitivity to amplitude (SI Appendix, Figs. S3 and S4)—these findings must be interpreted with appropriate caution. Furthermore, for an LTI system FRFs for head-free and head-fixed (replay) experiments should be qualitatively similar. As the head-free gain is approximately twice the head-fixed (replay) gain (Fig. 2B), these results further point to a nonlinearity due to head movements, that is, strong head–wing coordination that is abolished when the head is fixed.
Wing Steering Responses Lag Head Movements at a Fixed Time Interval.
To further investigate the influence of the head on wing motor output, we quantified the FRF from head to wing steering responses (Fig. 3A). The nonzero gain of this FRF further supports the notion that the head and wings are strongly coupled. The flat gain curve indicates that the head and wings have similar tuning across frequencies (Fig. 3A). However, although both the head and wings showed a similar phase response, the wings consistently lagged the head (Fig. 3A and SI Appendix, Fig. S7A). For the visuomotor reflex, this timing differs from volitional, goal-directed flight maneuvers in other insects, such as saccades in blowflies, in which thorax rotation initiates before head rotation (27), or wasp roll maneuvers, in which head roll rotations occur at the same time as body movements (28). We confirmed this time lag by cross-correlating the head and wing signals for SOS experiments in the time domain (Fig. 3B). Cross-correlation yielded a time difference of ∼30 ms when pooled across all frequencies, which is within the range of previously measured visuomotor delays in Drosophila (29). This result suggests that, for the same visual stimulus, head and wing motor outputs have different delays. The lag between the head and wing motor responses can likely be attributed to differences in sensorimotor delays and in delays due to inertia and damping, which together make up the total delay (Fig. 3A). Differences in transmission distance, number of synapse crossings, and muscle activation, as well as differences in the inertia and damping of the head and wings, would influence their respective total delays. Furthermore, the variability in phase pooled across frequencies was much greater for wing steering responses than for head movements, suggesting that the head can compensate for the stimulus more reliably (Fig. 3C).
Taken together, the timing difference between head movements and wing steering responses as well as the influence of head fixation on wing coherence suggests a temporal order. The temporal order is such that the stimulus S causes head movements H which in turn influence wing steering responses W.
Fig. 3.
Head movements lead wing steering responses and increase flight power output. (A) Mean head → wing FRF computed with head movements as the input and wing steering responses (ΔWBA) as the output for SOS experiments. (A, Top) The gain curve shows the relative tuning of the head and wings, which is similar across all frequencies. (A, Middle) The phase curve shows the difference in timing between head movements and wing steering responses, where the wings consistently lag the head. (A, Bottom) Time difference between stimulus and head or wings. Shaded region: ±1 SE. (B) Normalized cross-correlation between the head and wings for SOS experiments. Black line: mean across all trials and animals. Gray area: ±1 STD. (C) Histogram of phase SD for head movements and wing steering responses for SOS experiments. SD was pooled across all five frequencies. (D) Comparison of WBF for head-free (green) and head-fixed (red) flies for SOS experiments. (E, Left) Sum of all video frames in a head-free and head-fixed trial for different flies. (E, Right) Same as D but for thrust (∑WBA). For A–E, n = 10 head-free flies and n = 9 head-fixed flies. ***P < 0.001. For boxplots, the central line is the median, the bottom and top edges of the box are the 25th and 75th percentiles, and the whiskers extend to ±2.7 SD.
Head Fixation Decreases Flight Mechanical Power.
To assess the effect of head movements on flight mechanical power in Drosophila (30), we compared the wingbeat frequency (WBF) and the generated thrust, approximated by summing the left and right WBAs (∑WBA), between head-free and head-fixed flies. Head-fixed flies showed a small but nonetheless significant decrease in both metrics for both SOS and chirp stimuli (Fig. 3 D and E and SI Appendix, Fig. S7B). These results are consistent with the previously measured attenuation of the wing impulse response function (15) and the decrease of ∑WBA in head-fixed Drosophila (18) presented with wide-field motion stimuli. WBF and ∑WBA can be used to approximate mechanical power output in tethered flight, as changes in WBA and WBF are linearly correlated with wing velocity (30). As the total aerodynamic force generated by a wing is proportional to the square of its velocity, multiplying WBF and ∑WBA yields an estimate of the average angular velocity of a wing during a stroke cycle, and thus an estimate of mechanical power proportional to (∑WBA × WBF)³ (25, 30). This method provides a reasonable approximation for assessing relative changes in quasi-steady mechanical power—but not absolute power—between head-free and head-fixed conditions. Using median values of WBF and ∑WBA yielded a 27% decrease in mechanical power output due to head fixation. This result suggests that the head has a strong influence on a proxy for tethered flight mechanical power.
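The power-proxy arithmetic can be sketched as follows. The numbers here are hypothetical, normalized stand-ins chosen only to illustrate the cubic relationship (the reported medians are in the figures); the point is that a modest drop in the product ∑WBA × WBF compounds to a much larger drop in the power proxy:

```python
def power_proxy(sum_wba, wbf):
    """Relative quasi-steady mechanical power proxy: (sum_WBA x WBF)^3."""
    return (sum_wba * wbf) ** 3

# Normalized, hypothetical medians: head fixation reduces the product
# sum_WBA x WBF by ~10%, which cubes to a ~27% drop in the power proxy.
free = power_proxy(1.00, 1.00)          # head-free baseline (normalized)
fixed = power_proxy(0.90, 1.00)         # 10% lower product when head-fixed
print(round(100 * (1 - fixed / free)))  # 27 (% decrease)
```

Because the proxy is cubic in wing velocity, it amplifies small changes in WBA and WBF, which is why it is suited to detecting relative, but not absolute, changes in power output.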
Head Movements Map the Visual Input onto the Sensitivity Optimum of the Motion Vision Pathway.
To explore the influence of head movements on visual processing, we simulated the output of a Hassenstein–Reichardt elementary motion detector (EMD) under head-fixed and head-free conditions (31) (Fig. 4 A and B). Classically, models of the Drosophila yaw optomotor response have assumed that the photoreceptor array is fixed (31). However, the head can partially stabilize gaze (15, 18) (Fig. 1), effectively reducing the retinal slip of the stimulus within a certain range of speeds. We predicted that compensatory head yaw movements would shift the EMD output range, as shown for head roll movements in hawkmoths (10). To quantify this, we simulated the same EMD model as in the head-fixed case but included head movements, which generated an error signal (stimulus–head movements) feeding into the EMD model (Fig. 4B).
Fig. 4.
Head movements map the input dynamic range onto the sensitivity optimum of the motion vision pathway. (A) Model of the eye. The visual stimulus rotates about the eye. For the head-free simulation, the eye moves according to the experimental head response. Each photoreceptor is modeled by a Gaussian kernel defined by an acceptance angle Δρ (Inset), spaced by an interommatidial angle Δφ. The stimulus grating has the same spatial wavelength used in experiments (30°). (B) Simulation of the visual system for head-fixed (Top) and head-free (Bottom) conditions. For the head-free simulation, the error (stimulus − head angle) was fed as the input. (Inset) Hassenstein–Reichardt EMD model. S: spatial filter; PH: photoreceptor filter; LP: temporal first-order, low-pass filter; ×: multiplication; ∑: summation; Δφ: interommatidial angle. (C) Simulated EMD output for head-fixed (black) and head-free (blue) conditions. The head-free simulation was generated from the experimental head gain and phase from pure-sinusoid experiments reported in SI Appendix, Fig. S3B (frequencies = 0.5, 1, 2, 3.5, 6.5, and 12 Hz) at an intermediate amplitude of 15°. (D) Phase of the head-fixed (black) and head-free (blue) simulations. Magenta: experimental head phase from SS stimuli (n = 10 flies; same as SI Appendix, Fig. S3B).
We first simulated the head-fixed system using relevant parameters for the visual system of Drosophila (Materials and Methods) and by feeding 15°-amplitude sinusoids—corresponding to our experimental data (SI Appendix, Fig. S3B)—into the EMD model (32). As the EMD model is nonlinear, we reconstructed the EMD output using sinusoid fits (10, 32) (SI Appendix, Fig. S8A). Using these methods, we recapitulated the classic EMD output for Drosophila with a peak near a temporal frequency of 1 Hz (Fig. 4C) (33). In comparison, simulation of the head-free system revealed that the EMD output as a function of stimulus temporal frequency shifted toward higher speeds and was broader than the head-fixed EMD output. These results are consistent with the notion that head movements slow down the visual input stimulus (Fig. 1D). To determine whether a nonlinearity due to an amplitude change of the sine inputs for head-fixed vs. head-free conditions would influence our conclusion, we compared the head-fixed and head-free gain (SI Appendix, Fig. S8B). This nonlinearity had little influence on our conclusion. Furthermore, our conclusion was not sensitive to the choice of first-order low-pass filter delay within the EMD model or to removing the temporal filter properties of the photoreceptors, although these parameters had some influence on gain and phase (SI Appendix, Fig. S8 C and D). While the differences in head-fixed and head-free EMD outputs resemble a change in internal parameter of the EMD, i.e., changing low-pass filter delay (SI Appendix, Fig. S8C), the internal parameters of the EMD are the same in both simulations; what has changed is the application of a head filter in the head-free simulation. Interestingly, head movements appear to map the dynamic range of the visual stimulus onto the sensitivity optimum of the motion vision pathway (e.g., horizontal system cells) of flies during active locomotion (34, 35).
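The head-fixed vs. head-free comparison can be made concrete with a deliberately minimal, single-correlator sketch. This is our own simplification under stated assumptions: it omits the Gaussian acceptance-angle blurring, the photoreceptor temporal filter, and the full ommatidial array of the actual model; τ = 35 ms and Δφ = 5° are illustrative parameter choices, and the 50% head compensation is a round-number stand-in for the measured head gain:

```python
import numpy as np

def lowpass(x, tau, dt):
    """Discrete first-order low-pass filter (the EMD delay element)."""
    y = np.zeros_like(x)
    a = dt / (tau + dt)
    for n in range(1, len(x)):
        y[n] = y[n - 1] + a * (x[n] - y[n - 1])
    return y

def emd_magnitude(stim_angle, dt, wavelength=30.0, d_phi=5.0, tau=0.035):
    """Response magnitude of one Hassenstein-Reichardt correlator viewing a
    sinusoidal grating displaced over time by stim_angle(t) (degrees)."""
    # Luminance at two inputs separated by the interommatidial angle d_phi
    p1 = np.sin(2 * np.pi * stim_angle / wavelength)
    p2 = np.sin(2 * np.pi * (stim_angle - d_phi) / wavelength)
    # Delay-and-correlate with mirror-symmetric subtraction (direction-selective)
    out = lowpass(p1, tau, dt) * p2 - lowpass(p2, tau, dt) * p1
    return out[int(0.5 / dt):].std()  # discard the filter transient

dt = 1e-3
t = np.arange(0.0, 5.0, dt)
stim = 15.0 * np.sin(2 * np.pi * 2.0 * t)   # 15-deg, 2-Hz oscillation
head_fixed = emd_magnitude(stim, dt)        # full stimulus reaches the eye
head_free = emd_magnitude(0.5 * stim, dt)   # head cancels ~half the motion
print(head_fixed, head_free)  # the two conditions drive the EMD differently
```

Even in this stripped-down form, feeding the error signal rather than the raw stimulus into the correlator changes its effective drive, which is the mechanism by which head movements remap the input dynamic range in the full simulation.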
Phase is another important indicator of how well flies can stabilize a visual stimulus. We computed the EMD output phase for the head-fixed simulation. In this condition, the EMD output consistently lagged the stimulus (Fig. 4D), as would be characteristic of a system containing a low-pass filter. The head-free EMD output lagged the input slightly more (Fig. 4D). Even though the experimental head filter has a positive phase relative to the stimulus at low frequencies (Fig. 2B), subtracting head movements from the stimulus generated a greater phase lag due to sine wave interference. The simulated EMD output phases greatly lagged the experimentally measured head response phase (Fig. 4D). One possible explanation for the discrepancy between simulation and experiment is that the simulation does not consider the full closed-loop dynamics of the experimental head-free system. Indeed, the addition of a controller in closed-loop can shift the phase of a phase-lagging, open-loop system (36). Closing the loop would therefore seem to have a transformative effect on visuomotor processing, decreasing the overall phase of the head optomotor response.
Discussion
We discovered that head movements in flies shape and strongly coordinate wing steering responses (Fig. 5). Head movements shifted the dynamic range of the input stimulus to better match the operating range of the motion vision pathway. This shaping of the visual stimulus by head movements concomitantly increased the gain of the wing steering responses. Furthermore, head movements increased stimulus–wing coordination and led the wing steering response in time by more than 30 ms, thereby establishing a temporal order between head movements and wing steering responses (Fig. 5). Wing steering efforts were to some extent more correlated with head movements than the visual stimulus, suggesting that head movements are a better predictor of wing movements during gaze stabilization. Through EMD simulations, we showed that head movements reduce the effective input dynamic range, mapping visual inputs onto the sensitivity optimum of the motion vision pathway. Altogether, our results demonstrate a transformative influence of head movements on flight motor control in insect flight.
Fig. 5.
Summary of findings. Head movements shape and coordinate wing steering responses. The head loop generates an error signal which coordinates head and wing steering responses. Because of distinct delays between the stimulus, head, and wings there is a temporal order in the yaw gaze stabilization reflex in tethered flight. The head filter (magnitude vs. velocity, Inset) is modified from Fig. 1D. As an example, the input is an SOS stimulus as shown in Fig. 1B.
Active Visual Sensing Shapes Downstream Motor Control.
Flies can move their head about the yaw, pitch, and roll axes while steering their body in flight. Seminal work by Hengstenberg (8) in blowflies demonstrated the critical importance of head roll movements in stabilizing external visual motion and perturbations of the whole body, pointing to rapid visuomotor (inner-loop) reflexes. Whereas head roll movements appear primarily reflexive to support rapid stabilization in a number of insects including locusts (37, 38), flies (22, 39), and dragonflies (40), head yaw movements in Drosophila can be controlled independently of stabilization reflexes—for example, to track a visual feature—suggesting that multiple pathways are involved in the control of head yaw motion (15). For instance, using compound figure and background stimuli, previous work demonstrated that head movements are necessary for stabilizing wide-field background motion and that wing steering responses can be uncoupled from head movements (15). Therefore, it would seem that head roll and pitch movements are critical for maintaining a level gaze, whereas head yaw movements may be more related to general orientation and guidance. In Drosophila, motion of a visual panorama about yaw elicits a head optomotor response, which is influenced by both the temporal frequency and the contrast of the visual pattern (15, 18). Previous measurements of wing steering responses following head fixation showed a small, nonsignificant decrease in response magnitude over a range of visual contrasts (18).
Here, we show that closed-loop head yaw movements stabilize gaze at low to intermediate velocities, which in turn modifies the stimulus-to-wing gain and the correlation between stimulus and wing steering responses (Figs. 1 C and D and 2 B and C). Our results suggest that Drosophila can reduce retinal slip via head movements in yaw by up to 60% (Fig. 1E). Furthermore, head movements coordinated wing steering responses with the visual motion stimulus (Fig. 2C). Interestingly, visual motion entrained head movements and wing steering responses (Fig. 2D), indicative of a strong coupling between head movements and wing steering responses. Fixing the head decreased the correlation between the stimulus and wing steering responses (Fig. 2C) and decreased flight mechanical power (Fig. 3 D and E).
In studies employing virtual reality rigid tether paradigms, a common practice is to fix the head within a stationary frame of reference, thereby opening the loop between visual motion and head movements (Fig. 2A). For insect visual physiology, head fixation is a prerequisite to maintain stable conditions for electro- or optophysiological preparations. Head fixation, however, abolishes natural closed-loop head movements, which has important consequences for understanding the functional principles underlying gaze stabilization. For instance, in moths, head roll movements can reduce the effective range of input velocities nearly fourfold compared to the head-fixed case (10). Our results provide an opportunity to begin studying comparative head control across flying insects, and they go beyond previous work by showing how head movements directly influence downstream wing steering responses.
Our results may provide an explanation for conflicting head-free and head-fixed results in the literature. For instance, a recent study showed that head-fixed Drosophila rarely take off and have degraded optomotor responses in free flight (41). Our finding that head fixation reduces mechanical power by 27% could explain the low take-off frequency of head-fixed flies (Fig. 3 D and E). Similarly, the degraded optomotor responses could be due to disruption of the coordination between head and wing movements (Fig. 2). Interestingly, the same study showed that tethered, head-fixed flies have optomotor responses similar to those of tethered, head-free flies (41), thereby not corroborating their free-flight results. This discrepancy between free and tethered flight is somewhat puzzling, as we discovered that head movements in tethered flight have a strong influence on wing steering responses and stimulus–wing coordination (Figs. 2 and 3). One possible explanation for this discrepancy is that the constant-velocity ramp stimulus used to study the optomotor response in tethered flight does not sufficiently recapitulate the complex visual dynamics experienced during free flight or the broadband, variable-velocity stimulus set presented here (e.g., SOS). Another possibility is that ramp stimuli cause the head to rapidly reach its anatomical (saturation) limit due to its limited angular range about yaw (∼±15°). Therefore, while tethered flight experiments can generate novel insights into head-mediated flight stabilization mechanisms, these insights may be influenced by the choice of visual stimuli.
Gaze stabilization is an inherently multisensory reflex, combining both mechanosensory and visual information (8, 9). While we have focused on vision, mechanosensory information can also modulate the head–wing system. For instance, haltere stimulation can induce roll movements of the head in less than 5 ms (8, 42). System identification experiments in blowflies showed that the halteres influence the gain and phase of the head response at higher frequencies during body roll perturbations (9, 22). During such body roll perturbations, halteres trigger fast compensatory head roll movements, reducing retinal slip speed and thereby shifting the visual input into the operating range of the motion vision pathway (22). Intriguingly, the combination of haltere (feedforward) and visual (feedback) signals appears to serve the same function as the head yaw movements described here, that is, to shift visual inputs into the operating range of the motion vision pathway. At present, it is unclear how much the halteres influence head and wing yaw responses, as halteres appear least sensitive to yaw movements of the body (24), although intriguingly haltere inputs can influence both head and wing visuomotor responses in rigidly tethered flight (43, 44).
Volitional and Reflexive Gaze Stabilization.
Whereas our study has focused on reflexive, inner-loop gaze stabilization due to externally generated visual stimuli, flies also volitionally orient in flight (goal-directed or outer-loop) (9). A limitation of tethered flight paradigms is that volitional maneuvers are more difficult to study and are often exaggerated due to modification of feedback loops (21). Indeed, goal-directed behavior such as target tracking must alter inner-loop equilibrium reflexes to generate a coherent behavioral output (9). While we have identified a temporal order in head–wing reflexive gaze stabilization, it remains unclear whether head yaw movements precede body yaw rotations during nonsaccadic, volitional flight maneuvers. Interestingly, for head roll control in freely flying wasps, the delay between head and body movements appears to be near zero (28). One possible explanation for the near-zero delay is that wasps use a feedforward component in which the head compensates for volitional rotation of the body; however, free-flight video recordings at high frame rates will be necessary to establish whether this is a principle across flying insects. Feedforward mechanisms are likely critical for volitional flight maneuvers in insects, as they are for vertebrate gaze stabilization reflexes (1).
Head and Body Motor Coordination.
Studying flight with the body rigidly tethered enabled us to parameterize stimuli for the analysis of head movements and wing steering responses, but this paradigm also raises important questions about the influence of constraining body dynamics. Indeed, in free flight the head and body can move independently and together influence gaze. As the head can respond more than three times faster than the wings (Fig. 3A), in free flight the head may dominate rapid gaze stabilization tasks, while the body could generate slower movements to redirect gaze, particularly as the head provides a limited range of motion about yaw. One interesting possibility is that the head is recruited for small-amplitude visual stabilization tasks but that larger-amplitude visual stimuli require coordinated head and body movements. Revealing the mechanisms that coordinate head and body movements will require experimental paradigms with more natural conditions. While ΔWBA can be used as a rough proxy for body movements as it scales with body torque (45), ΔWBA signals are exaggerated in tethered flight and therefore must be interpreted with appropriate caution (46). Furthermore, it is also possible that head yaw movement amplitude is exaggerated in tethered flight. Nonetheless, our results suggest that head movements should be coordinated with body movements in free flight. This prediction is consistent with work showing that externally imposed yaw movements of the head generate scaled body torque (47). In free flight both head and body movements could realign via proprioceptive feedback at the neck–thorax joint (e.g., via prosternal organs) (8).
Gaze stabilization in human and nonhuman primate vision is enabled by rapid eye reflexes that may be analogous to the fly head movements described here. For instance, optokinetic nystagmus (OKN) is functionally analogous to the fly compensatory head response, both serving to minimize retinal slip during wide-field motion (1). However, the human OKN has a longer delay (∼50 to 100 ms) and is tuned to much lower velocities (up to ∼60°⋅s−1) than the fly head (Fig. 3 A and B) (48). In humans, the slower OKN complements the faster vestibulo-ocular reflex (VOR), which counterrotates the eyes during head movement to keep gaze fixed. Here, fly head and wing movements together do not resemble the VOR in function, as the head and wings generate a codirectional turning effort that could align the fly body with the head following head movements (Fig. 1B).
Neural Evidence for Head and Wing Coupling.
Here we reported a strong behavioral coordination between head and wing yaw movements, which is broadly consistent with reported neuroanatomical and physiological data in flies. The output of lobula plate tangential cells (LPTCs) is connected to the thoracic motor center via descending neurons (DNs) running through the ventral nerve cord (VNC) (49). The first and second thoracic motor centers (T1 and T2) include distinct neck and wing motor neuropils, respectively (49). LPTCs also project directly to neck motor neurons, bypassing the VNC, and share similar optic flow tuning properties (50, 51). Interestingly, many dorsal DNs respond to both small- and wide-field motion moving horizontally or vertically and terminate in areas that innervate neck and wing motor neurons (52). Three different DNs that innervate neck and wing neuropils are similarly tuned to different axes of wide-field optic flow (17). Genetic silencing of horizontal system (HS) and vertical system (VS) cells weakens head movements but appears to have only a small effect on wing movements, suggesting that other lobula plate neurons or parallel visual pathways may drive wing steering responses (19). Nonetheless, there is strong evidence that HS and VS cells are important to both head and wing motor control (17, 51, 53–55). As wing steering responses were more correlated with head movements than the stimulus itself (Fig. 2C), our results suggest that the same upstream neural signals may control both head and wing movements during the visual task studied here.
Phase Leading at Low Frequencies Suggests Velocity-Based Control.
We found that the head, and to a lesser extent the wings, led the stimulus at low frequencies, suggesting a predictive or velocity-based component in the response (Fig. 2). This result was corroborated for multiple stimuli in both the time and frequency domains (SI Appendix, Figs. S3 and S4). Phase-leading behavior has been observed in previous flight experiments for wings in an object tracking task (26), wide-field stabilization in head-fixed (56) and head-free flies (25), and tethered hawkmoth visuomotor responses (57). For LTI control systems, a phase-advanced response is generally indicative of a predictive element within the controller, such as derivative (D) or velocity-based control (e.g., a high-pass filter). Velocity-based control is consistent with the notion that neurons thought to mediate the optomotor reflex in the lobula plate are themselves sensitive to velocity (58). Derivative or feedforward control elements could potentially explain how flies perform such rapid compensatory head movements (Fig. 2B).
Integrating Hybrid Head Dynamics.
The system identification tools used here work well with signals that are continuous. However, flight is composed of smooth and saccadic movements, of both the body and the head (18, 59), which together compose a hybrid system (59). Saccades manifest as rapid shifts in body and/or head orientation. Although visual influences on flight saccade dynamics and control have been described (59, 60), less is known about how head saccades influence smooth gaze stabilization. Previous work showed that the head tends to “reset” gaze, rapidly moving the head back to its starting point during the optomotor response, in a way analogous to optokinetic nystagmus in primate vision (5, 18).
Simulation of an EMD with Compensatory Head Movements Is Consistent with Behavioral Tuning during Active Locomotion.
Our results show that when head movements are considered, the EMD simulation output peak and range more than double, suggesting that compensatory head movements can significantly reduce the effective range of input velocities in Drosophila (Fig. 4C). The classic EMD model predicts response properties of LPTCs, which exhibit a temporal frequency peak near 1 Hz in quiescent flies (61). However, changes in locomotor state modulate the response of LPTCs (62, 63), such that during active locomotion the peak in LPTC tuning shifts to 3 to 7 Hz (34, 35). Intriguingly, the behavioral tuning during active locomotion overlaps with the tuning of the head-free simulation (Fig. 4C), suggesting that head movements map visual inputs onto the sensitivity optimum of the motion vision pathway. Fascinatingly, with head movements, flies could limit the dynamic range of the visual input to the velocity range with the highest signal-to-noise ratio of LPTC neurons (64). Altogether, the shaping of visual input dynamics by head movements must be considered in models of visuomotor control in fly flight.
Materials and Methods
Control Theoretic Framework.
Here we model the fly neuromechanics using a control theoretic framework and quantify the flow of information from stimulus to behavior via FRFs. Individual FRFs represent describing functions of a nonlinear system (36). We focus on frequency domain representation of the different sensorimotor systems and omit the complex frequency s when reporting individual system variables. For all cases, the gaze stabilization response is modeled as a smooth pursuit system. Although previous work has shown the presence of head and body saccades in gaze stabilization tasks (59, 65), these hybrid dynamics do not play a large role in the confines of our framework as we limit the amplitude of our stimuli to ∼±15° in yaw, near the maximum head yaw angular position in Drosophila (15). In the rigid tether paradigm, the body control torque is proportional to the difference between left and right WBA (ΔWBA) (45). The physical unit of ΔWBA has little meaning because we do not know how ΔWBA contributes to gaze stabilization. Within this framework, we use two metrics to quantify input/output relationships: gain and phase, which define the FRF of a transform between two signals. Here we use “transform” rather than “transfer function” since our system is nonlinear (SI Appendix, Figs. S2–S4).
In the case of head fixation (open-loop), we can model the flow of information from visual stimulus (S) to ΔWBA (W) as the transform $W/S$, which is equal to $W/E$ since the head is fixed ($E = S$), where $E$ is the error going into the visual system (Fig. 2A). Whereas error is typically defined as velocity error (retinal slip) in the vision literature (units of degrees per second), for frequency domain analysis position and velocity error signals yield similar FRFs as the time-derivative, complex frequency terms cancel out for FRF calculation (although taking the time derivative of position introduces additional noise in the velocity-based FRFs). For the head-fixed case, both body and head dynamics are restricted, so the fly has no control over the error entering the visual system. The resulting open-loop transform is simply the multiplication of the individual systems,
$$\frac{W}{S} = \frac{W}{E} = VCM \qquad [3]$$
where V is the visual system, C is the neural controller (“brain”), and M is the mechanics. In the case of head fixation, the mechanics would include the aerodynamics of flapping flight since both the head and body are fixed.
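To make the cascade of Eq. 3 concrete, the open-loop composition can be sketched numerically. The corner frequencies, controller gain, and frequency set below are placeholder assumptions for illustration, not dynamics fitted to our data:

```python
import numpy as np

# Sketch: in the head-fixed, open-loop case the stimulus-to-wing FRF is the
# elementwise product of the subsystem FRFs, W/S = V*C*M, at each frequency.
f = np.array([0.5, 1.0, 2.0, 4.0, 8.0])   # test frequencies (Hz), illustrative
s = 1j * 2 * np.pi * f                    # complex frequency axis

V = 1.0 / (s / (2 * np.pi * 10) + 1)      # visual system: low-pass, 10-Hz corner (assumed)
C = 0.8                                   # neural controller: static gain (placeholder)
M = 1.0 / (s / (2 * np.pi * 25) + 1)      # mechanics: low-pass, 25-Hz corner (assumed)

W_over_S = V * C * M                      # open-loop transform (Eq. 3)
gain = np.abs(W_over_S)                   # gain rolls off with frequency
phase = np.rad2deg(np.angle(W_over_S))    # phase lag accumulates across subsystems
```

Because each subsystem here is low-pass, the composite gain decreases monotonically and the phase lags grow with frequency, mirroring the qualitative behavior expected of the cascade.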
When the head is free to move (henceforth referred to as "closed-loop" since head movements influence, or feed back onto, visual motion sensing; Fig. 2A), the system now has two outputs: head movements H and wing steering responses W. The error E is measured by the visual system V and then sent to the brain C, which determines how to split, or demultiplex, control effort between the neck and wings. Head movements H shape the visual stimulus S into the error $E = S - H$, generating feedback (closed-loop), whereas W does not feed back (open-loop). Since H feeds back, we can no longer assume that $E = S$, but we can derive a series of transforms for the closed-loop SIMO system. For the head transform $H/S$, the corresponding gain is a dimensionless quantity that can be interpreted physically. The closed-loop gain represents the portion of the visual stimulus compensated by the head as
$$\frac{H}{S} = \frac{VCM}{1 + VCM} \qquad [4]$$
where $M$ here represents the head mechanics and wing aerodynamics, which act as a filter in the sensorimotor loop. This transform shows mathematically that head movements have a transformative effect on the sensorimotor system.
For the wings, the open-loop transform between error and ΔWBA is
$$\frac{W}{E} = VCM' \qquad [5]$$
where M′ represents the addition of the head mechanics. The wing response is a control signal while the head response is a control output, and thus without a transform from wings to body state the wing response alone does not reveal the body's contribution to compensation, only the relative tuning across frequencies. As a result, comparing absolute gain magnitudes between $W/E$ and $H/S$ is not possible.
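A numerical sketch of the closed-loop head transform of Eq. 4, with an assumed, illustrative open-loop gain rather than fitted dynamics, shows how the portion of the stimulus compensated by the head rolls off with frequency:

```python
import numpy as np

# Sketch of H/S = VCM/(1 + VCM) under placeholder loop dynamics (assumed
# first-order low-pass with DC gain 2 and a 5-Hz corner; not fitted values).
f = np.array([0.5, 1.0, 2.0, 4.0, 8.0])
s = 1j * 2 * np.pi * f

VCM = 2.0 / (s / (2 * np.pi * 5) + 1)     # open-loop gain of the head loop
H_over_S = VCM / (1 + VCM)                # closed-loop head response (Eq. 4)
compensated = np.abs(H_over_S)            # portion of the stimulus cancelled by the head
residual = np.abs(1 - H_over_S)           # error (retinal slip) remaining after head motion
```

At low frequencies the head cancels most of the stimulus (gain near VCM/(1+VCM) at DC), and the residual error grows as loop gain drops, which is the sense in which the head "shapes" the input reaching the visual system.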
Frequency Response Function Calculations.
The input and output signals were transformed to the frequency domain using a discrete Fourier transform (MATLAB fft function). The magnitude and phase of the complex frequency domain signals were extracted using the MATLAB functions abs and angle, respectively. Gain was computed by taking the ratio of the output magnitude to the input magnitude, and phase difference was computed by subtracting the input phase from the output phase. We computed an FRF for every trial, averaged within each animal, and then averaged the individual animal means to determine the grand mean response. In the case of phase, we computed the circular mean and variance. Averaging gain and phase data separately (Fig. 2) or together in the complex plane (SI Appendix, Fig. S2) provided similar results and thereby did not change our conclusions, as also shown in system identification studies in moths (66). For frequency domain analysis of data associated with chirp experiments, the gain and phase were extracted from the complex numbers at each frequency, for each trial. The 20-s chirp experiments yielded a frequency resolution of 0.05 Hz. Because chirp experiments are inherently noisier than SS and SOS, we averaged the complex FFT in equally spaced 0.4-Hz bins. The gain and phase were then calculated for each bin. In the case of unequal sampling rates between the camera (200 Hz) and wingbeat analyzer (5,000 Hz), the signal with the higher sampling rate was downsampled and synced to the time base of the signal with the lower sampling rate.
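The gain and phase extraction described above (performed in MATLAB in our analysis) can be sketched in Python for a single-sine trial against a known, synthetic system; the sample rate, duration, and test gain/phase below are arbitrary choices for validation:

```python
import numpy as np

# Sketch of the FRF calculation: FFT input and output, then take the
# magnitude ratio and phase difference at the stimulus frequency bin.
fs, T, f0 = 200.0, 10.0, 2.0                  # sample rate (Hz), duration (s), stimulus freq (Hz)
t = np.arange(0, T, 1 / fs)
x = 15 * np.sin(2 * np.pi * f0 * t)           # input: sinusoidal stimulus position (deg)
true_gain, true_phase = 0.5, -np.pi / 4       # synthetic system for validation
y = true_gain * 15 * np.sin(2 * np.pi * f0 * t + true_phase)

X, Y = np.fft.fft(x), np.fft.fft(y)
k = int(f0 * T)                               # FFT bin of the stimulus frequency (df = 1/T)
gain = np.abs(Y[k]) / np.abs(X[k])            # output magnitude over input magnitude
phase = np.angle(Y[k]) - np.angle(X[k])       # output phase minus input phase (rad)
```

With an integer number of stimulus cycles in the record there is no spectral leakage, so the extracted gain and phase recover the synthetic system exactly.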
Coherence.
The coherence of two time-varying signals $x(t)$ and $y(t)$ is defined as
$$C_{xy}(f) = \frac{\left|G_{xy}(f)\right|^{2}}{G_{xx}(f)\,G_{yy}(f)} \qquad [6]$$
where $G_{xx}$ and $G_{yy}$ are the autospectral densities of $x$ and $y$, respectively, and $G_{xy}$ is the cross-spectral density between $x$ and $y$. All terms are a function of frequency $f$. Coherence estimates the linear correlation across frequencies between input and output, where $0 \le C_{xy} \le 1$. $C_{xy} = 1$ indicates a high spectral power transfer, whereas $C_{xy} = 0$ indicates no apparent spectral power transfer. Coherence is influenced by noise, by nonlinearities between input and output, and by unaccounted-for dynamics, for example multiple inputs influencing a single output. High coherence implies linearity, but it is not a sufficient condition to establish linearity in a system: Other properties such as scaling and superposition must also be examined (23).
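As a sketch of Eq. 6 in practice (Welch-averaged spectral estimates via SciPy; all signal parameters below are illustrative), a noiseless linear relation yields coherence near 1, while additive output noise pulls it toward 0:

```python
import numpy as np
from scipy.signal import coherence

# Coherence between an input and a linearly related output, with and
# without additive output noise.
rng = np.random.default_rng(0)
fs = 200.0
t = np.arange(0, 60, 1 / fs)
x = np.sin(2 * np.pi * 2 * t) + 0.5 * np.sin(2 * np.pi * 5 * t)
y_clean = 0.7 * x                              # pure gain: perfectly linear relation
y_noisy = 0.7 * x + 5.0 * rng.standard_normal(t.size)

f, C_clean = coherence(x, y_clean, fs=fs, nperseg=1024)
_, C_noisy = coherence(x, y_noisy, fs=fs, nperseg=1024)
k2 = np.argmin(np.abs(f - 2.0))                # bin nearest the 2-Hz component
```

At the 2-Hz stimulus bin the noiseless pair is fully coherent, whereas output noise reduces the estimated linear power transfer, exactly the behavior described above.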
Animal Preparation.
We prepared the animals for each experiment according to a protocol that has been previously described (67). Briefly, we cold-anesthetized 3- to 5-d-old female flies by cooling them on a Peltier stage maintained at ∼4 °C. We fixed flies to tungsten pins (A-M Systems) by applying a small drop of ultraviolet (UV)-activated glue (XUVG-1; Newall) to the thorax. The pin was placed on the thorax projecting forward at an angle of ∼90°. After a minimum of 30-min recovery, flies were used for experiments. For head-fixed flies, a small drop of UV-activated glue was placed between the thorax and head. Flies were examined under a microscope to ensure the head was properly restrained.
Flight Arena.
The rigid-tether arena has been described elsewhere (67). Briefly, the display consists of an array of 96 × 32 light-emitting diodes (LEDs, each subtending 3.75° on the eye) that wrap around the fly, subtending 360° horizontally and 112° vertically (Fig. 1A). Flies were placed inside the arena and illuminated from above with a single 940-nm LED (Fig. 1A). The wingbeat analyzer has been described elsewhere in detail (68). Briefly, the shadow of each wing was projected onto two separate photodiodes. The wingbeat analyzer conditioned photodiode signals proportional to the shadow size of each wing. The voltages were proportional to the left and right WBA, which can be used to estimate ΔWBA. We recorded videos of the fly from above at 200 frames per second with an infrared-sensitive camera (acA640-120gm; Basler).
Experimental Protocol.
After preparation, all flies were attached to a rigid rod and placed in the arena at an angle near the natural flight angle (∼45° in pitch). Flies were then put through a closed-loop bar fixation period of approximately 10 s to confirm that the wings were being accurately measured and that the flies were not visually impaired. Any flies that failed to fixate the bar were not used for experiments. All experimental trials were interspersed with closed-loop bar fixation bouts of 5 s to keep flies motivated throughout the experiment. During trials, video frames were collected using an external trigger and synced to the start of the visual motion. We acquired signals (wingbeat analyzer ΔWBA and camera trigger) at 5 kHz.
Data Analysis.
Our recorded images enabled custom computer-vision software (MATLAB; MathWorks) to track the angle of the head relative to the body, as well as to measure the left and right WBA in degrees (17). In short, we manually defined the head rotation point on the neck and the initial angle of the head. Then, we automatically tracked features on the fly head (base of the antennae) and measured the change in angle of the feature relative to the rotation point for every frame. To measure the median left and right WBA of a trial, we applied a median filter in time across all frames and fit a sigmoid function to the intensity gradient of each wing. For system identification experiments involving the wings, we measured the left and right WBA using the wingbeat analyzer described above. The wingbeat analyzer measures the WBA at each dorsal flip of the wings, thus providing a measurement during every wing stroke and a higher temporal resolution than the video-based system. For some trials, we also recorded ΔWBA using a video-based system (Movie S1).
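The geometry of the head-angle measurement can be sketched as follows; the pivot location, feature coordinates, and function name are hypothetical stand-ins, since our actual tracker operated on video frames:

```python
import numpy as np

# Toy sketch: head angle as the angle of the tracked feature (e.g., antenna
# base) about the manually defined neck rotation point, zeroed at the
# manually defined initial angle.
def head_angle(feature_xy, pivot_xy, initial_angle_deg):
    """Angle (deg) of feature relative to pivot, minus the initial angle."""
    dx = feature_xy[..., 0] - pivot_xy[0]
    dy = feature_xy[..., 1] - pivot_xy[1]
    return np.rad2deg(np.arctan2(dy, dx)) - initial_angle_deg

pivot = np.array([0.0, 0.0])
s15, c15 = np.sin(np.deg2rad(15)), np.cos(np.deg2rad(15))
frames = np.array([[0.0, 1.0],    # feature straight ahead (90 deg in these coordinates)
                   [s15, c15],    # 15 deg to one side
                   [-s15, c15]])  # 15 deg to the other side
angles = head_angle(frames, pivot, initial_angle_deg=90.0)
```

The three synthetic frames map to head angles of 0 and ∓15°, spanning roughly the fly's anatomical yaw range.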
Visual Stimuli.
Previous work showed that when background and foreground object movement are present simultaneously, the head primarily compensates for background motion whereas the wings (ΔWBA) primarily track the object (15). Alternatively, when background or object movement is present individually, head movements will track either object or background motion. To generalize our framework for independent background movements, we chose to move a visual pattern consisting of a vertical grating of on–off bars with a spatial wavelength of 30°. The lit pixels had a luminance of 72 cd⋅m−2, whereas the off pixels had effectively zero luminance, corresponding to a Michelson contrast near 100% (67). The pattern was presented in three different forms: SS, SOS, and frequency ramps (chirps), all oscillating about the yaw axis within the head's constrained range of motion. All three stimuli have qualities that provide insights into the tuning of the system but also come with certain limitations.
SS stimuli, which consist of fixed-frequency sinusoids, yield a gain and phase at a single frequency with a high signal-to-noise ratio. We constructed an SS FRF by extracting gain and phase over a range of frequencies. Chirp stimuli, which consist of a signal with time-varying frequency, provide an efficient method to measure the FRF over a wide frequency range but with a lower signal-to-noise ratio. The SOS stimuli, which consist of the sum of multiple fixed-frequency sinusoids, yield FRFs at distinct frequencies. The SOS individual sinusoids were normalized to have similar peak velocities, which allowed the FRF to be computed while remaining in the velocity bandwidth of the fly's visual system (up to ∼300°⋅s−1) (24). It was not possible to achieve this with chirps or SS because of limitations of both the fly mechanics (head range of motion limited to ±15°) and the experimental apparatus (the LED arena has a resolution of 3.75° per pixel). With these constraints, the maximum frequency we could achieve with chirp and SS stimuli while maintaining a velocity within the visual bandwidth was limited to ∼12 Hz.
SS stimuli consisted of six logarithmically spaced frequencies from 0.5 to 12 Hz. Chirp stimuli consisted of a logarithmic, upswept-frequency cosine spanning 0.1 to 12 Hz. Both SS and chirps were presented at four constant amplitudes, corresponding to two, three, four, and five pixels on our LED array, or 7.5, 11.25, 15, and 18.75°, respectively. SOS stimuli consisted of five linearly spaced frequencies from 1 to 9.6 Hz, with individual sinusoids bounded by a peak velocity between 35 and 100°⋅s−1 (SI Appendix, Fig. S5A). We chose these bounds because flies had very low stimulus–head and stimulus–ΔWBA coherence below 1 Hz (SI Appendix, Fig. S6 A and B) and the phase was extremely noisy below 1 Hz. Individual frequencies were selected to avoid interference from harmonics (23). The SOS was bounded by an angular velocity of 300°⋅s−1, which is near the peak yaw wing steering response of Drosophila (24).
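The velocity normalization of the SOS components can be sketched as follows. The frequency set and the single 60°⋅s−1 target below are illustrative; the experimental components used peak velocities between 35 and 100°⋅s−1:

```python
import numpy as np

# Sketch of an SOS stimulus with velocity-normalized components: each
# sinusoid's position amplitude is chosen so its peak velocity hits a target,
# keeping every component within the visual-system velocity bandwidth.
fs, T = 200.0, 20.0
t = np.arange(0, T, 1 / fs)
freqs = np.array([1.0, 3.1, 5.3, 7.4, 9.6])   # spacing chosen to avoid shared harmonics
v_peak = 60.0                                  # target peak velocity per component (deg/s)
amps = v_peak / (2 * np.pi * freqs)            # position amplitude giving that peak velocity
pos = np.sum(amps[:, None] * np.sin(2 * np.pi * freqs[:, None] * t), axis=0)
vel = np.sum(v_peak * np.cos(2 * np.pi * freqs[:, None] * t), axis=0)
```

Each component's peak velocity equals the target by construction, and the worst-case summed velocity is the number of components times the target (here 300°⋅s−1, when all cosines align at t = 0).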
Replay Experiment.
To replay visual inputs filtered by head movements to head-fixed flies, we subtracted head movements from the visual stimulus. This recreated the visual input (error) experienced by head-free flies. We computed the mean and SD of the time domain error response for all flies and all trials for ±15° SOS experiments (SI Appendix, Fig. S6 C and D). We chose the trial that was closest to the mean error for the duration of the experiment. This trial was replayed to naïve, head-fixed flies.
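The selection of the replayed trial can be sketched as picking the recorded error trace nearest the grand-mean trace. Synthetic data stand in here for the measured error time series:

```python
import numpy as np

# Sketch: among all recorded error traces, choose the single trial closest
# (in the RMS sense) to the mean error across flies and trials.
rng = np.random.default_rng(1)
trials = rng.standard_normal((30, 4000))      # stand-in for 30 error time series
mean_err = trials.mean(axis=0)                # grand-mean error trace
rms_dist = np.sqrt(np.mean((trials - mean_err) ** 2, axis=1))
replay_idx = int(np.argmin(rms_dist))         # trial replayed to naive head-fixed flies
```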
EMD Model and Simulation.
We implemented a Hassenstein–Reichardt EMD model to simulate the visual system (Fig. 4 A and B) (69, 70). We used an image I consisting of a square wave grating with a spatial wavelength of 30°, corresponding to our experimental stimulus (Fig. 4A). To simulate Drosophila ommatidia optics, we spatially filtered the image I using the Gaussian kernel
$$L(\theta) = \exp\!\left(-\frac{4\ln(2)\,\theta^{2}}{\Delta\rho^{2}}\right) \qquad [7]$$
where $\Delta\rho$ is the acceptance angle of the Drosophila photoreceptor and $\theta$ is the array of photoreceptor angular positions around the virtual “eye” defined from the visual midline (Fig. 4A) (71). We used an interommatidial angle $\Delta\phi = 4.5°$, consistent with the average interommatidial angle along the azimuth at the level of the eye equator in Drosophila (72). The size of the photoreceptor array is defined by the angular span divided by the interommatidial angle, in this case yielding 80 photoreceptors wrapped around the eye (Fig. 4A). We convolved the image I with L. For each photoreceptor, the spatially filtered image was then convolved with the photoreceptor temporal dynamics (T). The dynamics of this filter have been measured in the form of its impulse response function, given by
$$T(t) = \exp\!\left(-\frac{\left[\ln\left(t/t_{p}\right)\right]^{2}}{2\sigma^{2}}\right) \qquad [8]$$
where $t_{p}$ is the time to peak, $\sigma$ is the width of the response for Drosophila, and $t$ is the discrete time (69, 73). The signal was then passed through a first-order low-pass filter (D) of the form
$$D(t) = \frac{1}{\tau}\,e^{-t/\tau} \qquad [9]$$
where $\tau$ is the time constant for Drosophila (69). The output of two adjacent photoreceptors A and B was defined as
$$r_{AB}(t) = A(t)\,B_{d}(t) - B(t)\,A_{d}(t) \qquad [10]$$
where $A$ and $B$ are the outputs of the two photoreceptors and $A_{d}$ and $B_{d}$ are the outputs of the same photoreceptors delayed by the first-order filter. We computed the EMD response by summing across all simulated photoreceptors. We fed ±15°-amplitude sine inputs to the EMD model, identical to the inputs used in our experiments. Because we simulated sinusoid inputs entering the EMD, which is itself a nonlinear model, we reconstructed the output using best-fit approximations (10, 32) (SI Appendix, Fig. S8A). Specifically, after passing sinusoidal visual motion through the EMD model, the output was fit with a sinusoid of the same frequency as the input (SI Appendix, Fig. S8A). For each simulation, the gain and phase were extracted and an R² value was computed to quantify goodness of fit. While canonical EMD models use image temporal frequency along the x axis to display tuning curves, sine inputs have time-varying temporal frequency. Thus, we plot the mean temporal frequency of each sine wave.
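A compact, self-contained sketch of the EMD array is given below. For brevity it substitutes a simple first-order low-pass "delay" for the full photoreceptor temporal dynamics of Eq. 8, and all parameter values (acceptance angle, time constant, drift speed) are placeholders rather than the values used in our simulations:

```python
import numpy as np

def emd_response(stim_angle, dt=1e-3, n_photo=80, wavelength=30.0,
                 accept=4.5, tau=0.035):
    """Summed Hassenstein-Reichardt correlator output for a drifting grating."""
    dphi = 360.0 / n_photo                      # interommatidial angle (deg)
    photo_pos = np.arange(n_photo) * dphi       # receptor viewing directions (deg)
    # Gaussian spatial acceptance (Eq. 7 form), sampled at receptor spacing
    offsets = np.arange(-2 * accept, 2 * accept + dphi, dphi)
    kernel = np.exp(-4 * np.log(2) * offsets ** 2 / accept ** 2)
    kernel /= kernel.sum()
    half = kernel.size // 2

    alpha = dt / (tau + dt)                     # discrete first-order low-pass coefficient
    delayed = np.zeros(n_photo)
    out = np.zeros(stim_angle.size)
    for i, a in enumerate(stim_angle):
        # square-wave grating displaced by the current stimulus angle
        image = np.sign(np.sin(2 * np.pi * (photo_pos - a) / wavelength))
        # spatial blur across the wrap-around receptor array
        blurred = np.array([
            np.dot(kernel, np.take(image, np.arange(j - half, j + half + 1),
                                   mode='wrap'))
            for j in range(n_photo)])
        delayed += alpha * (blurred - delayed)  # low-pass "delay" line (Eq. 9 stand-in)
        # opponent correlation of adjacent receptors, summed over the eye (Eq. 10)
        out[i] = np.sum(blurred * np.roll(delayed, 1)
                        - np.roll(blurred, 1) * delayed)
    return out

t = np.arange(0, 1.0, 1e-3)
resp_pos = emd_response(30.0 * t)               # grating drifting one way at 30 deg/s
resp_neg = emd_response(-30.0 * t)              # same grating, opposite direction
```

Summing over receptors spanning many pattern wavelengths cancels the pattern-phase ripple, leaving a direction-selective mean response: reversing the drift direction flips its sign, which is the defining opponency of the correlator.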
Supplementary Material
Acknowledgments
We thank two anonymous referees, Mark Frye, and Wael Salem for comments on the manuscript. This material is based on work supported by the Air Force Office of Scientific Research under Award FA9550-20-1-0084.
Footnotes
The authors declare no competing interest.
This article is a PNAS Direct Submission.
This article contains supporting information online at https://www.pnas.org/lookup/suppl/doi:10.1073/pnas.1920846117/-/DCSupplemental.
Data Availability.
All code for data analysis is available on our laboratory GitHub repository (https://github.com/bmslpsu). All data are available in The Pennsylvania State University ScholarSphere public data repository (https://doi.org/10.26207/gvzr-5409).
References
- 1. Angelaki D. E., Cullen K. E., Vestibular system: The many facets of a multimodal sense. Annu. Rev. Neurosci. 31, 125–150 (2008).
- 2. Nelson M. E., MacIver M. A., Sensory acquisition in active sensing systems. J. Comp. Physiol. A Neuroethol. Sens. Neural Behav. Physiol. 192, 573–586 (2006).
- 3. Land M. F., Head movement of flies during visually guided flight. Nature 243, 299–300 (1973).
- 4. Land M. F., Motion and vision: Why animals move their eyes. J. Comp. Physiol. A Neuroethol. Sens. Neural Behav. Physiol. 185, 341–352 (1999).
- 5. Land M. F., Nilsson D.-E., Animal Eyes (Oxford University Press, ed. 2, 2012).
- 6. Hengstenberg R., “Eye movements in the housefly Musca domestica” in Information Processing in the Visual Systems of Arthropods, Wehner R., Ed. (Springer, Berlin, 1972), pp. 93–96.
- 7. Burtt E. T., Patterson J. A., Internal muscle in the eye of an insect. Nature 228, 183–184 (1970).
- 8. Hengstenberg R., Multisensory control in insect oculomotor systems. Rev. Oculomot. Res. 5, 285–298 (1993).
- 9. Hardcastle B. J., Krapp H. G., Evolution of biological image stabilization. Curr. Biol. 26, R1010–R1021 (2016).
- 10. Windsor S. P., Taylor G. K., Head movements quadruple the range of speeds encoded by the insect motion vision system in hawkmoths. Proc. Biol. Sci. 284, 20171622 (2017).
- 11. Reichardt W., Poggio T., Figure-ground discrimination by relative movement in the visual system of the fly: Part I: Experimental results. Biol. Cybern. 35, 81–100 (1979).
- 12. Geiger G., Poggio T., On head and body movements of flying flies. Biol. Cybern. 25, 177–180 (1977).
- 13. Taylor G. K., Krapp H. G., Sensory systems and flight stability: What do insects measure and why? Adv. Insect Physiol. 34, 231–316 (2007).
- 14. Fox J. L., Aptekar J. W., Zolotova N. M., Shoemaker P. A., Frye M. A., Figure-ground discrimination behavior in Drosophila. I. Spatial organization of wing-steering responses. J. Exp. Biol. 217, 558–569 (2014).
- 15. Fox J. L., Frye M. A., Figure-ground discrimination behavior in Drosophila. II. Visual influences on head movement behavior. J. Exp. Biol. 217, 570–579 (2014).
- 16. Hengstenberg R., Gaze control in the blowfly Calliphora: A multisensory, two-stage integration process. Semin. Neurosci. 3, 19–29 (1991).
- 17. Suver M. P., Huda A., Iwasaki N., Safarik S., Dickinson M. H., An array of descending visual interneurons encoding self-motion in Drosophila. J. Neurosci. 36, 11768–11780 (2016).
- 18. Duistermars B. J., Care R. A., Frye M. A., Binocular interactions underlying the classic optomotor responses of flying flies. Front. Behav. Neurosci. 6, 6 (2012).
- 19. Kim A. J., Fenk L. M., Lyu C., Maimon G., Quantitative predictions orchestrate visual signaling in Drosophila. Cell 168, 280–294.e12 (2017).
- 20. Kim A. J., Fitzgerald J. K., Maimon G., Cellular evidence for efference copy in Drosophila visuomotor processing. Nat. Neurosci. 18, 1247–1255 (2015).
- 21. Roth E., Sponberg S., Cowan N. J., A comparative approach to closed-loop computation. Curr. Opin. Neurobiol. 25, 54–62 (2014).
- 22. Schwyn D. A., et al., Interplay between feedback and feedforward control in fly gaze stabilization. IFAC Proc. Vol. 44, 9674–9679 (2011).
- 23. Roth E., Zhuang K., Stamper S. A., Fortune E. S., Cowan N. J., Stimulus predictability mediates a switch in locomotor smooth pursuit performance for Eigenmannia virescens. J. Exp. Biol. 214, 1170–1180 (2011).
- 24. Sherman A., Dickinson M. H., A comparison of visual and haltere-mediated equilibrium reflexes in the fruit fly Drosophila melanogaster. J. Exp. Biol. 206, 295–302 (2003).
- 25. Duistermars B. J., Reiser M. B., Zhu Y., Frye M. A., Dynamic properties of large-field and small-field optomotor flight responses in Drosophila. J. Comp. Physiol. A Neuroethol. Sens. Neural Behav. Physiol. 193, 787–799 (2007).
- 26. Roth E., Reiser M. B., Dickinson M. H., Cowan N. J., “A task-level model for optomotor yaw regulation in Drosophila melanogaster: A frequency-domain system identification approach” in 51st IEEE Conference on Decision and Control (CDC) (IEEE, 2012), pp. 3721–3726.
- 27. Schilstra C., van Hateren J. H., Stabilizing gaze in flying blowflies. Nature 395, 654 (1998).
- 28. Viollet S., Zeil J., Feed-forward and visual feedback control of head roll orientation in wasps (Polistes humilis, Vespidae, Hymenoptera). J. Exp. Biol. 216, 1280–1291 (2013).
- 29. Elzinga M. J., Dickson W. B., Dickinson M. H., The influence of sensory delay on the yaw dynamics of a flapping insect. J. R. Soc. Interface 9, 1685–1696 (2012).
- 30. Lehmann F. O., Dickinson M. H., The changes in power requirements and muscle efficiency during elevated force production in the fruit fly Drosophila melanogaster. J. Exp. Biol. 200, 1133–1143 (1997).
- 31. Hassenstein B., Reichardt W., Systemtheoretische Analyse der Zeit-, Reihenfolgen- und Vorzeichenauswertung bei der Bewegungsperzeption des Rüsselkäfers Chlorophanus. Z. Naturforsch. B 11, 513–524 (1956).
- 32. Egelhaaf M., Reichardt W., Dynamic response properties of movement detectors: Theoretical analysis and electrophysiological investigation in the visual system of the fly. Biol. Cybern. 56, 69–87 (1987).
- 33. Borst A., Haag J., Reiff D. F., Fly motion vision. Annu. Rev. Neurosci. 33, 49–70 (2010).
- 34. Jung S. N., Borst A., Haag J., Flight activity alters velocity tuning of fly motion-sensitive neurons. J. Neurosci. 31, 9231–9237 (2011).
- 35. Chiappe M. E., Seelig J. D., Reiser M. B., Jayaraman V., Walking modulates speed sensitivity in Drosophila motion vision. Curr. Biol. 20, 1470–1475 (2010).
- 36. Åström K. J., Murray R. M., Feedback Systems: An Introduction for Scientists and Engineers (Princeton University Press, 2010).
- 37. Miall R. C., Visual control of steering in locust flight: The effects of head movement on responses to roll stimuli. J. Comp. Physiol. A Neuroethol. Sens. Neural Behav. Physiol. 166, 735–744 (1990).
- 38. Hensler K., Robert D., Compensatory head rolling during corrective flight steering in locusts. J. Comp. Physiol. A Neuroethol. Sens. Neural Behav. Physiol. 166, 685–693 (1990).
- 39. Hengstenberg R., Sandeman D. C., Hengstenberg B., Compensatory head roll in the blowfly Calliphora during flight. Proc. Biol. Sci. 227, 455–482 (1986).
- 40. Mittelstaedt H., Physiologie des Gleichgewichtsinnes bei fliegenden Libellen. Z. Vergl. Physiol. 32, 422–463 (1950).
- 41. Stowers J. R., et al., Virtual reality for freely moving animals. Nat. Methods 14, 995–1002 (2017).
- 42. Sandeman D. C., Markl H., Head movements in flies (Calliphora) produced by deflexion of the halteres. J. Exp. Biol. 85, 43–60 (1980).
- 43. Mureli S., Fox J. L., Haltere mechanosensory influence on tethered flight behavior in Drosophila. J. Exp. Biol. 218, 2528–2537 (2015).
- 44. Mureli S., Thanigaivelan I., Schaffer M. L., Fox J. L., Cross-modal influence of mechanosensory input on gaze responses to visual motion in Drosophila. J. Exp. Biol. 220, 2218–2227 (2017).
- 45. Tammero L. F., Frye M. A., Dickinson M. H., Spatial organization of visuomotor reflexes in Drosophila. J. Exp. Biol. 207, 113–122 (2004).
- 46. Dickinson M. H., Muijres F. T., The aerodynamics and control of free flight manoeuvres in Drosophila. Philos. Trans. R. Soc. B Biol. Sci. 371, 20150388 (2016).
- 47. Liske E., The influence of head position on the flight behaviour of the fly, Calliphora erythrocephala. J. Insect Physiol. 23, 375–379 (1977).
- 48. Goldberg J. M., et al., The Vestibular System (Oxford University Press, 2012).
- 49. Namiki S., Dickinson M. H., Wong A. M., Korff W., Card G. M., The functional organization of descending sensory-motor pathways in Drosophila. eLife 7, e34272 (2018).
- 50. Strausfeld N. J., Seyan H. S., Milde J. J., The neck motor system of the fly Calliphora erythrocephala. J. Comp. Physiol. A Neuroethol. Sens. Neural Behav. Physiol. 160, 205–224 (1987).
- 51. Huston S. J., Krapp H. G., Visuomotor transformation in the fly gaze stabilization system. PLoS Biol. 6, e173 (2008).
- 52. Gronenberg W., Strausfeld N. J., Descending neurons supplying the neck and flight motor of Diptera: Physiological and anatomical characteristics. J. Comp. Neurol. 302, 973–991 (1990).
- 53. Haag J., Wertz A., Borst A., Integration of lobula plate output signals by DNOVS1, an identified premotor descending neuron. J. Neurosci. 27, 1992–2000 (2007).
- 54. Haikala V., Joesch M., Borst A., Mauss A. S., Optogenetic control of fly optomotor responses. J. Neurosci. 33, 13927–13934 (2013).
- 55. Wertz A., Haag J., Borst A., Integration of binocular optic flow in cervical neck motor neurons of the fly. J. Comp. Physiol. A Neuroethol. Sens. Neural Behav. Physiol. 198, 655–668 (2012).
- 56. Heisenberg M., Wolf R., Reafferent control of optomotor yaw torque in Drosophila melanogaster. J. Comp. Physiol. A Neuroethol. Sens. Neural Behav. Physiol. 163, 373–388 (1988).
- 57. Windsor S. P., Bomphrey R. J., Taylor G. K., Vision-based flight control in the hawkmoth Hyles lineata. J. R. Soc. Interface 11, 20130921 (2013).
- 58. Eckert H., Functional properties of the H1-neurone in the third optic ganglion of the blowfly, Phaenicia. J. Comp. Physiol. A Neuroethol. Sens. Neural Behav. Physiol. 135, 29–39 (1980).
- 59. Mongeau J.-M., Frye M. A., Drosophila spatiotemporally integrates visual signals to control saccades. Curr. Biol. 27, 2901–2914.e2 (2017).
- 60. Mongeau J.-M., Cheng K. Y., Aptekar J., Frye M. A., Visuomotor strategies for object approach and aversion in Drosophila melanogaster. J. Exp. Biol. 222, jeb193730 (2019).
- 61. Schnell B., et al., Processing of horizontal optic flow in three visual interneurons of the Drosophila brain. J. Neurophysiol. 103, 1646–1657 (2010).
- 62. Maimon G., Straw A. D., Dickinson M. H., Active flight increases the gain of visual motion processing in Drosophila. Nat. Neurosci. 13, 393–399 (2010).
- 63. Longden K. D., Krapp H. G., State-dependent performance of optic-flow processing interneurons. J. Neurophysiol. 102, 3606–3618 (2009).
- 64. Haag J., Borst A., Encoding of visual motion information and reliability in spiking and graded potential neurons. J. Neurosci. 17, 4809–4819 (1997).
- 65. Boeddeker N., Mertes M., Dittmar L., Egelhaaf M., Bumblebee homing: The fine structure of head turning movements. PLoS One 10, e0135020 (2015).
- 66. Dyhr J. P., Morgansen K. A., Daniel T. L., Cowan N. J., Flexible strategies for flight control: An active role for the abdomen. J. Exp. Biol. 216, 1523–1536 (2013).
- 67. Reiser M. B., Dickinson M. H., A modular display system for insect behavioral neuroscience. J. Neurosci. Methods 167, 127–139 (2008).
- 68. Dickinson M. H., Lehmann F. O., Götz K. G., The active control of wing rotation by Drosophila. J. Exp. Biol. 182, 173–189 (1993).
- 69. Dickson W. B., Straw A. D., Dickinson M. H., Integrative model of Drosophila flight. AIAA J. 46, 2150–2164 (2008).
- 70. Tuthill J. C., Chiappe M. E., Reiser M. B., Neural correlates of illusory motion perception in Drosophila. Proc. Natl. Acad. Sci. U.S.A. 108, 9685–9690 (2011).
- 71. Buchner E., “Behavioural analysis of spatial vision in insects” in Photoreception and Vision in Invertebrates, Ali M., Ed. (Plenum Press, New York, 1984), pp. 561–621.
- 72. Gonzalez-Bellido P. T., Wardill T. J., Juusola M., Compound eyes and retinal information processing in miniature dipteran species match their specific ecological demands. Proc. Natl. Acad. Sci. U.S.A. 108, 4224–4229 (2011).
- 73. Juusola M., Hardie R. C., Light adaptation in Drosophila photoreceptors: I. Response dynamics and signaling efficiency at 25 degrees C. J. Gen. Physiol. 117, 3–25 (2001).