Journal of Insect Science. 2021 Nov 26; 21(6): 14. doi: 10.1093/jisesa/ieab079

Three-Dimensional Tracking of Multiple Small Insects by a Single Camera

Ching-Hsin Chen 1,2, Ann-Shyn Chiang 2,3,4,5,6,7, Hung-Yin Tsai 1,2
Editor: Yuko Ulrich
PMCID: PMC8633622  PMID: 34850033

Abstract

Many systems to monitor insect behavior have been developed recently. Yet most of them detect only two-dimensional (2D) behavior, for ease of analysis, and thus exclude activities such as jumping or flying. The development of a three-dimensional (3D) monitoring system is therefore necessary to investigate the 3D behavior of insects. Multiple-camera setups are often used for this purpose. Here, a system that tracks small insects in 3D space with a single camera is proposed, eliminating the synchronization problems that typically arise when multiple cameras are used. In this setup, two additional views are obtained via mirrors fixed at other viewing angles. Using the proposed algorithms, the tracking accuracy for five individual drain flies, Clogmia albipunctata (Williston) (Diptera: Psychodidae), flitting about in a spherical arena (78 mm in diameter) is as high as 98.7%, and the accuracy for 10 individuals is 96.3%. With the proposed method, 3D trajectory monitoring experiments on insects can be performed more efficiently.

Keywords: behavior monitoring, image processing, multiple insects tracking, single camera, three-dimensional analysis/tracking


What triggers a particular behavior? Researchers have a great interest in identifying the causes of various behaviors, and experiments to answer this question are often carried out manually. Many quantitative measurement tools have been devised to observe animal behavior, for example in the mouse (Giancardo et al. 2013), housefly (Nasir and Mat 2019), dragonfly (Tofilski 2004), and fruit fly (Martin 2004). Insects are often used in physiology and neuroscience investigations; for instance, Drosophila is a popular model organism because of its short life cycle and inexpensive breeding cost. Numerous systems can monitor and quantitatively analyze the movement of insects in two-dimensional (2D) space (Charabidze et al. 2008, Robie et al. 2010, Ofstad et al. 2011, Donelson et al. 2012, Zabala et al. 2012, Wu et al. 2014, Brooks et al. 2016, Hughson et al. 2018). Some studies have also used marker-based methods to track insects (Mersch et al. 2013, Crall et al. 2015, Blut et al. 2017, Gernat et al. 2018, Ulrich et al. 2018). However, compared with the number of 2D insect-tracking studies, far fewer tracking systems provide three-dimensional (3D) spatial information. This lack of 3D tracking tools is most likely attributable to the difficulty of constructing a robust trajectory in 3D space (Guo et al. 2018, Nasir and Mat 2019, Sun and Gaydecki 2021).

Monitoring tiny animals in 3D space raises four issues beyond those of the 2D case: 1) a nonuniform lighting environment; 2) a long view depth; 3) synchronization of multiple cameras; and 4) matching the same object across different viewing angles.

To date, 3D tracking systems have employed at least two cameras (Straw et al. 2010, 2011; Ardekani et al. 2013; van Breugel and Dickinson 2014; Wilkinson et al. 2014; Sinhuber et al. 2019). Some studies used as many as six cameras to simultaneously monitor a 1.5 × 0.3 × 0.3 m wind tunnel and obtain precise movement information (Straw et al. 2010). To track small animals or insects accurately in such a large arena, the use of two or more cameras is unavoidable, and when two or more cameras observe 3D behavior, their synchronization becomes a critical consideration. Beyond this difficulty, the cost of the extra equipment, namely additional cameras and lenses, may also hinder the development of 3D tracking systems.

To address this problem, mirrors have often been used in place of additional cameras on the same baseline, with target positions computed from geometric constraints. For example, mirrors replaced two cameras on the same baseline for bird tracking (de Margerie et al. 2015), and two mirrors were positioned behind a tube of flies to obtain three viewing angles in the same plane (Kohlhoff et al. 2011). The present study likewise integrates mirrors into the observation system, but with a new design aspect: two mirrors are placed to create two additional viewing angles in different planes. This setup allows the height of targets to be measured without complex geometric calculations.

In addition to the problem of camera synchronization, it is harder to set up a homogeneously illuminated environment for a 3D monitoring system than for a 2D one. An uneven lighting environment makes it challenging to identify the true positions of insects, especially tiny moving targets. For these reasons, stereo tracking systems, although crucial, have seldom been used to monitor insects (Fry et al. 2004, Wu et al. 2011, van Breugel and Dickinson 2012). The system described and tested in this study resolves both problems: the stereo trajectories of insects in a transparent ball arena can be obtained with only one camera, together with an integrated image processing method. With our system, users can automatically obtain the 3D trajectories of multiple insects while bypassing camera synchronization problems.

Materials and Methods

The schematic of the tracking system is shown in Fig. 1A. Set atop the system to monitor the arena (78 mm in diameter) is a CCD (charge-coupled device) image sensor (Vieworks VH-4MC-M/C 20, Korea; 20 frames per second; 2,048 × 2,048 pixels) with a telecentric lens (Optoengineering TC4M240F, Italy). Such lenses provide high-quality inspection results: in contrast to regular industrial lenses, they yield images with very low distortion and perspective error, enabling the exact location of each insect to be identified. The frame rate of 20 frames per second is sufficient to capture the motion of drain flies, Clogmia albipunctata (Williston) (Diptera: Psychodidae). In general, successful motion sensing requires an image sensor with a frame rate appropriate to the insect species.

Fig. 1. Experimental setup of the 3D multiple-insect tracking system. (A) Overview of the system. A single camera is used to monitor insects in the bio-ball. (B) Detailed view of the inspection zone. The two mirrors convey the height information of the moving insects. Three LED lighting plates illuminate the bio-ball. (C) Demonstration of the equivalent virtual cameras of this designed system.

Figure 1B shows the details of the inspection zone. The bio-ball, which serves as the arena, is placed in the middle. It is built from two hemispheres; hence, their edges will cause a ring-shaped joint to appear in the image. Surrounding the bio-ball are two mirrors and three white LED (light-emitting diode) lighting plates, the latter serving as the light source in this system. One LED lighting plate is placed below the bio-ball, while the other two are placed on the side, perpendicular to each other, thereby providing two additional angles of view.

Figure 1C shows the equivalent virtual cameras of this system. The views of the two equivalent virtual cameras and the real camera lie in different, mutually perpendicular planes. This mutually perpendicular-view design has a key advantage: the extracted coordinates of insects in any view can be read directly along any two major axes (i.e., the XY, YZ, or XZ coordinates) without additional calculation, making it more efficient to match the coordinates of insects across views. Note that the optical-lever effect doubles the impact of mirror tilt errors; in our study, the tolerated angle error of 0.5° can cause a 0.8- to 1.5-mm displacement.

The position and the angle of lens and mirrors were well-calibrated and fixed by a digital angle gauge, with a resolution of 0.01° and an accuracy of ±0.05°. Two mirrors are fixed at angles of 45 ± 0.05° with respect to the ground, and the angle of their stands is 90 ± 0.05°. These two angle errors are thus no greater than 0.1°, ensuring the equivalent virtual cameras are mutually perpendicular to each other; any distortion of the mirror image is verified by using Zhang’s camera calibration procedure (Zhang 2000).

Figure 2 is the flowchart of the offline image processing algorithms we used, showing the steps for tracking insects in each zone and frame. Figure 3A demonstrates how the location of a given insect is identified by matching the coordinates in different zones based on its position in zone III. In Fig. 4A, the distinct areas are labeled by quadrant order, a convention widely used in mathematics and physics. In our system, zone III is an aerial view of the bio-ball, and we define the XY coordinates in zone III as the XY values of the overall 3D coordinates. Zones II and IV are the images generated by the two mirrors (fixed at a 45° angle with respect to the ground). The height information, called the Z-value, is determined by matching the corresponding information in zones II and IV, which provide the XZ and YZ planes, respectively. In this way, each insect can be correctly located in 3D space.

Fig. 2. Position extraction flowchart.

Fig. 3. (A) Flowchart for the height matching, and (B) schematic of the process to generate the background.

Fig. 4. Coordinate definitions and schematic of the matching process. (A) System coordinates. The blue circle denotes the origin, the red arrow indicates the positive y-axis direction, and the yellow arrow indicates the positive x-axis direction. (B) X- and Y-value matching with the reference point in zone III. (C) Height matching.

Tracking Insects

Background Extraction

The background subtraction method was used to extract the regions in which insects are present. Theoretically, the easiest way to obtain the background image is to capture an image before every experiment. However, when the arena is moved slightly or the luminance on the bio-ball changes slightly, even minor differences will create 'noise' that makes the image processing more difficult. We therefore propose a background extraction method that automatically generates a good-quality background from a video sequence with insects present in the bio-ball. As shown in Fig. 3B, frames in the video are sampled at intervals of 500. The intensity of each pixel is compared with the corresponding pixels in the other sampled frames, and the maximum intensity across frames is taken as the value of that background pixel. The background created by this method is robust even in a nonuniform lighting environment and reduces noise, so the insect positions can be reliably extracted.
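As a concrete illustration, the following is a minimal sketch in Python with OpenCV and NumPy of the max-intensity background rule described above. The function name, the grayscale conversion, and the use of `cv2.VideoCapture` are our own illustrative choices, not the authors' released code.

```python
import cv2
import numpy as np

def extract_background(video_path: str, interval: int = 500) -> np.ndarray:
    """Estimate the background as the per-pixel maximum intensity across
    frames sampled every `interval` frames. Because the insects appear
    darker than the illuminated bio-ball, the per-pixel maximum removes
    them from the background estimate."""
    cap = cv2.VideoCapture(video_path)
    background, index = None, 0
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        if index % interval == 0:
            gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
            background = gray if background is None else np.maximum(background, gray)
        index += 1
    cap.release()
    return background
```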

ROI (Region of Interest) Detection

To increase the efficiency and accuracy of the image processing, the Hough transform was used to detect the ROI circle in the image. This circle defines the region of the bio-ball and thus restricts the processing zone. After reducing the processing area through ROI detection, the computational time is greatly shortened. Moreover, the application of ROIs here can also mitigate the interference caused by reflections of insects that occur when their bodies shift to the edge of the bio-ball, thereby improving the final tracking result.
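A sketch of the ROI step is given below, assuming OpenCV's Hough circle transform. The parameter values (accumulator settings and the radius range for a 78-mm ball imaged on a 2,048 × 2,048 sensor) are illustrative guesses that would need tuning against real footage.

```python
def detect_roi_circles(background: np.ndarray) -> np.ndarray:
    """Find the circular outlines of the bio-ball in the three zones."""
    blurred = cv2.medianBlur(background, 5)          # suppress speckle before Hough
    circles = cv2.HoughCircles(blurred, cv2.HOUGH_GRADIENT, dp=1, minDist=500,
                               param1=100, param2=30, minRadius=300, maxRadius=600)
    if circles is None:
        raise RuntimeError("bio-ball not found")
    return circles[0].astype(int)                    # rows of (x_center, y_center, radius)

def roi_mask(shape: tuple, circles: np.ndarray) -> np.ndarray:
    """Binary mask that keeps only pixels inside the detected circles."""
    mask = np.zeros(shape, dtype=np.uint8)
    for x, y, r in circles:
        cv2.circle(mask, (int(x), int(y)), int(r), 255, thickness=-1)
    return mask
```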

Dynamic Thresholding

When insects move to the dark edge or the ring, a fixed threshold value cannot extract their positions, because their grayscale values become very similar to those of the background. Thus, two threshold values are used in this study. Under normal conditions, the higher threshold value (10.24) is employed to locate the insects. When insects move into the dark edge or under the ring-shaped joint, the number of regions detected after noise reduction may be less than the actual number of insects in the bio-ball; in that case, a lower threshold value (7.68) is applied to re-extract the insect positions. Although the lower threshold helps to extract low-contrast objects, it is not used as the default because it admits more noise.
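A sketch of this thresholding step follows, under the assumption that segmentation operates on the absolute difference between the extracted background and the current grayscale frame; the helper names are ours, while the two threshold values are the paper's.

```python
HIGH_THRESHOLD = 10.24   # default threshold (paper's value)
LOW_THRESHOLD = 7.68     # fallback for insects at the dark edge or under the ring

def segment_insects(frame, background, mask, threshold):
    """Binarize the background-subtracted image inside the ROI."""
    diff = cv2.absdiff(background, frame)            # insects appear darker than background
    diff = cv2.bitwise_and(diff, diff, mask=mask)    # restrict to the bio-ball ROI
    _, binary = cv2.threshold(diff, threshold, 255, cv2.THRESH_BINARY)
    return binary
```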

Noise Reduction

Morphological processing is applied to remove the noise. Tiny objects (regions of 180 pixels or less in this study) are regarded as noise and removed entirely. In addition, an ellipse is fitted to each region, and the ratio of its major axis length to the minor axis length is calculated. Those regions having a ratio >10 are defined as noise and removed.
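The two noise filters (the 180-pixel size filter and the >10 axis-ratio filter) could be implemented as in the sketch below, using connected components via OpenCV contours; approximating the pixel count by the contour area is our simplification.

```python
def find_regions(binary, min_area=180, max_axis_ratio=10.0):
    """Return the centroids of foreground regions that survive both filters."""
    kernel = cv2.getStructuringElement(cv2.MORPH_ELLIPSE, (3, 3))
    opened = cv2.morphologyEx(binary, cv2.MORPH_OPEN, kernel)   # morphological cleanup
    contours, _ = cv2.findContours(opened, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    centroids = []
    for c in contours:
        if cv2.contourArea(c) <= min_area:
            continue                                  # 180 px or less: treated as noise
        if len(c) >= 5:                               # fitEllipse needs at least 5 points
            (_, _), (d1, d2), _ = cv2.fitEllipse(c)
            if min(d1, d2) == 0 or max(d1, d2) / min(d1, d2) > max_axis_ratio:
                continue                              # overly elongated: reflection/noise
        m = cv2.moments(c)
        centroids.append((m["m10"] / m["m00"], m["m01"] / m["m00"]))
    return centroids
```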

Checking the Number of Regions and Detection of Touching Events

After the background noise is removed, individual insects can be identified by our system. Successful detection is checked by comparing the number of detected regions with the number of expected individuals; if the two numbers differ, extra procedures are applied as follows. The number of detected regions exceeds the number of expected individuals only when noise is present. Because noise does not appear continuously across frames, the system removes regions lacking temporal continuity; this calculation is made in the identification step, which eliminates the noise interference.

The number of detected regions is less than the number of expected individuals under only two conditions: 1) an individual lands on the junction of the bio-ball's two hemispheres, or 2) two individuals are touching each other in the field of view. In regular processing, the higher threshold is the default. When the system finds that the number of detected regions does not equal the number of expected individuals, the lower threshold value is automatically applied to re-extract all the individuals; under this lower threshold, individuals in the shaded parts of the bio-ball can be found. If the numbers still do not match after lowering the threshold, a touching event is deduced, and the K-means method is applied to separate the touching foreground, as sketched below.
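The per-frame decision flow can be summarized as follows, tying together the earlier sketches; `split_touching_kmeans` is sketched in the next subsection, and all names are illustrative.

```python
def extract_positions(frame, background, mask, n_insects, prev_centroids):
    """Try the high threshold, fall back to the low one, then split
    touching regions with K-means if regions are still missing."""
    binary = segment_insects(frame, background, mask, HIGH_THRESHOLD)
    centroids = find_regions(binary)
    if len(centroids) < n_insects:        # insect hidden by dark edge or ring joint
        binary = segment_insects(frame, background, mask, LOW_THRESHOLD)
        centroids = find_regions(binary)
    if len(centroids) < n_insects:        # still short: a touching event has occurred
        centroids = split_touching_kmeans(binary, prev_centroids)
    # Surplus regions (noise) are discarded later, in the identification
    # step, which enforces trajectory continuity across frames.
    return centroids
```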

Centroid Calculation and Initialization of the K-Means Method

The centroid of each region is defined as the position of each insect. When two individuals are touching each other, applying the K-means method can distinguish each individual in the overlapping regions. The working principle of the K-means method is introduced in the Supp File (online only).

We applied the K-means method to all regions of foreground, rather than specifically to the region where the touching event occurs. With an adequate frame rate, the centroids of individuals in the previous frame, that is, the last frame when they were identified as separate individuals, can be used to identify whether a touching event has occurred. These centroids are defined as the initial clustering centroid of the K-means method.
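A self-contained sketch of this seeding strategy follows: a plain K-means loop over all foreground pixels, initialized with the previous frame's centroids. This is our own illustrative implementation, not the authors' code.

```python
def split_touching_kmeans(binary, prev_centroids, n_iter=20):
    """Cluster foreground pixels with K-means seeded by the centroids from
    the last frame in which all individuals were separate."""
    ys, xs = np.nonzero(binary)
    points = np.column_stack([xs, ys]).astype(np.float64)
    centers = np.array(prev_centroids, dtype=np.float64)
    for _ in range(n_iter):
        # assign each foreground pixel to its nearest clustering centroid
        dists = np.linalg.norm(points[:, None, :] - centers[None, :, :], axis=2)
        labels = dists.argmin(axis=1)
        new_centers = np.array([points[labels == k].mean(axis=0)
                                if np.any(labels == k) else centers[k]
                                for k in range(len(centers))])
        if np.allclose(new_centers, centers):
            break                                     # clustering has converged
        centers = new_centers
    return [tuple(c) for c in centers]
```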

Figure 5 demonstrates the process flow of the K-means method in this study, where red, blue, and green regions denote different identities. Figure 5A shows the three insect centroids before the touching event occurs (i.e., in the N − 1th frame); these serve as the initial clustering centroids in the first iteration of the Nth frame. The initial clustering centroids are crowded near the touching region (red and blue crosses), whereas a single initial clustering centroid lies in the nontouching region (green cross). Therefore, the touching region is separated by the densely packed initial clustering centroids, while the nontouching region remains a single cluster. Figure 5B displays the touching event (Nth frame) and the clustering result at each iteration; the clustering converges at the fifth iteration. Through this process, the identities of both individuals remain unchanged while they touch each other.

Fig. 5. Demonstration of the process flow of the K-means method when a touching event happened in this study. Red, blue, and green regions denote different identities. (A) Three centroids of insects before a touching event (N − 1th frame) are the initial clustering centroids in the first iteration on the Nth frame. (B) The touching event (Nth frame) and clustering result in each iteration.

Identification

There are many solutions to the identification problem, for example, the Kalman filter (Straw et al. 2011) or the particle filter (Pistori et al. 2010). The complex procedures used in those studies are not necessary here, so a much simpler tracking strategy is proposed.

Because the moving path of an insect is continuous and the period between two successive frames (0.1 s) is very short, we may assume that the travel distance and direction of an insect from frame t − 1 to frame t are almost the same as those from frame t − 2 to frame t − 1. Therefore, an estimated position is calculated by assuming that the distance and direction traveled over the two previous frames remain constant. Each insect is identified by assigning to it the identity of the individual closest to its estimated position. This procedure also helps eliminate noise, which appears at random.
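A minimal sketch of this constant-velocity prediction with nearest-neighbour assignment follows; the greedy matching order and all names are our own simplifications of the strategy described above.

```python
def predict_positions(tracks):
    """Constant-velocity prediction: each insect is assumed to repeat the
    displacement it made between the two previous frames."""
    return [2 * np.asarray(t[-1], dtype=float) - np.asarray(t[-2], dtype=float)
            for t in tracks]

def assign_identities(tracks, detections):
    """Append each track's nearest detection to that track; leftover
    detections are treated as transient noise and dropped."""
    remaining = [np.asarray(d, dtype=float) for d in detections]
    for track, pred in zip(tracks, predict_positions(tracks)):
        if not remaining:
            break
        j = int(np.argmin([np.linalg.norm(pred - d) for d in remaining]))
        track.append(tuple(remaining.pop(j)))
    return tracks
```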

Matching the Coordinates in Different Zones

Both the definition of the X and Y coordinates as well as the origin of the image are presented in Fig. 4A. The coordinates of the points in zone III are considered the references for matching the points in zones II and IV. The purpose of matching these coordinates is to determine the height of each insect in the bio-ball.

Creation of the Pending Match Pool

To create the pending match pool, we collect all centroid data (X and Y coordinates) from zones II and IV and place them into a pool. At this stage, all points are ready to be matched with the reference points, although the corresponding coordinates in different zones may not be exactly equal. Hence, the tolerance for a corresponding match was set to 30 pixels.

X-Value Matching

The points in zone II whose X-direction distance to a reference point is less than 30 pixels are selected as candidate points. If exactly one candidate point results, that individual's X coordinate has been successfully identified (Fig. 4B). The candidate is then removed from the pool, and matching proceeds with the remaining reference points.

Y-Value Matching

Similar to the X-value matching, we look for candidates with a Y-value distance to the reference point of less than 30 pixels. If just one candidate is chosen, the match between zone IV and the reference point is confirmed (Fig. 4B) and that candidate is also removed from the pool.

Height (Z-Value) Matching

After the X- and Y-value matching is complete, every reference point corresponds to two matching points from the pool, unless the X- or Y-value matching has failed. If a reference point corresponds to only one matching point, the height matching step is performed next.

For such one-point-match cases, the height can still be obtained from the single matched point. Using this height information, the pool is searched for a point whose height differs by less than 30 pixels (Fig. 4C); if exactly one candidate is found, the match is confirmed and the candidate is removed from the pool. The X-value, Y-value, and height matching are run iteratively until the matching result of an iteration is identical to that of the previous one.
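The X-, Y-, and height-matching steps above can be combined as in the following sketch. The coordinate conventions (which image axis of each mirror view encodes the height) and the simplification to a single non-iterative pass are our assumptions; the 30-pixel tolerance is the paper's.

```python
TOLERANCE = 30  # pixels; the matching tolerance used in this study

def match_zones(refs_iii, pool_ii, pool_iv, tol=TOLERANCE):
    """For each zone-III reference (x, y), find the unique zone-II point
    sharing its X value and the unique zone-IV point sharing its Y value
    within `tol` pixels; the mirror views then supply the Z coordinate.
    Assumed conventions: zone-II points are (x, z), zone-IV points are (z, y)."""
    pool_ii, pool_iv = list(pool_ii), list(pool_iv)
    matches = []
    for x, y in refs_iii:
        cand_ii = [p for p in pool_ii if abs(p[0] - x) < tol]   # X-value matching
        cand_iv = [p for p in pool_iv if abs(p[1] - y) < tol]   # Y-value matching
        if len(cand_ii) == 1 and len(cand_iv) == 1:
            z_ii, z_iv = cand_ii[0][1], cand_iv[0][0]
            if abs(z_ii - z_iv) < tol:                          # height consistency
                matches.append((x, y, (z_ii + z_iv) / 2))
                pool_ii.remove(cand_ii[0])
                pool_iv.remove(cand_iv[0])
                continue
        matches.append((x, y, None))    # unresolved: left for continuity matching
    return matches
```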

Continuity Matching

The remaining points in the pool, which could not be matched in the previous steps, are matched to the reference points using trajectory continuity. The matching algorithm used for this is the same as that mentioned in the section Identification. After carrying out all the matching processes, the points in the three zones have been matched up successfully. In other words, each reference point corresponds to two matching points, one in zone II and one in zone IV, with no remaining points in the pool. At this stage, the 3D position of every insect in the bio-ball has been determined for 3D behavior monitoring.

Flying Movement Detection

To define the threshold for flying detection, some fundamental experiments were first conducted: 100 flying bouts and 100 walking bouts were manually labeled. The flying speed (average 145 mm/s, minimum 128 mm/s) proved much faster than the walking speed (average 6 mm/s), so the type of activity can be discerned from the moving speed alone. Because the minimum flying speed is 128 mm/s, this study defines the flying behavior of a drain fly as movement at 120 mm/s or greater. An additional 1,000 bouts of behavior were examined with this threshold, and the accuracy of the flying behavior detection was 100%.
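A sketch of this speed-threshold classifier over a 3D trajectory is given below; the trajectory units (mm) and the inter-frame period parameter are assumptions, with the 0.1 s default taken from the Identification section.

```python
FLY_THRESHOLD = 120.0   # mm/s; flying/walking boundary from the labeled bouts

def classify_motion(track_mm, dt=0.1):
    """Label each step of a 3D trajectory (in mm) as flying or walking by
    thresholding the instantaneous speed between successive frames."""
    track = np.asarray(track_mm, dtype=float)                 # shape (n_frames, 3)
    speeds = np.linalg.norm(np.diff(track, axis=0), axis=1) / dt
    return ["flying" if s >= FLY_THRESHOLD else "walking" for s in speeds]
```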

Ethical Note

The drain flies were captured in the wild. After 10 individuals were subdued with 98% CO2, they were loaded into the bio-ball (78 mm in diameter, transparent acrylic) and allowed to move freely in the arena for 10 min before the monitoring experiment began. After the 10-min experimental period, all drain flies were immediately released into the wild; the total captive time of each drain fly was less than 30 min.

Results and Discussion

Many kinds of small insects are compatible with this tracking system, such as fruit flies, houseflies, bees, and mosquitoes. The highly active drain fly, whose movement is characterized by flitting, was selected to demonstrate the proposed system's performance. In our experiment, a camera frame rate of 20 fps was more than adequate to reliably track the drain flies at their maximum speed of 160 mm/s when they jumped, making them ideal experimental subjects for testing the proposed system. Both sexes of drain fly were used in this study.

Uneven Lighting Environment

Figure 6A presents a frame taken by this newly developed system, in which 10 drain flies had been placed in the bio-ball. On examination, several 3D tracking problems are evident: for example, the uneven lighting environment and the dark edge created by the bio-ball hinder the tracking of the insects.

Fig. 6. Challenges to extracting the positions of insects. (A) An example image of insects in the bio-ball. Here, circles B–D show insects that are difficult to locate because the contrast between them and the background is very low. (B and C) Close-up views of insects under the ring of the bio-ball. (D) Close-up view of an insect that has moved to the dark edge of the bio-ball.

Maintaining uniform lighting is much harder in a 3D environment than in a 2D one. As seen in Fig. 6A, an example of poor lighting uniformity, the luminance clearly differs between zones. Moreover, the dark edge of the bio-ball and the ring where its two hemispheres join appear in every view. These artifacts cannot be prevented, which makes distinguishing insects in those regions a daunting task. Enlarged views are shown in Fig. 6B–D: a drain fly concealed by the ring appears in both Fig. 6B and C, while Fig. 6D shows the dark edge of the bio-ball. Under these critical conditions, color features and a simple threshold value are insufficient to distinguish the insects from the background.

Even under a variable lighting environment, the tracking algorithm we developed can still evaluate and elucidate the positioning of insects. We use one frame from a monitoring video with 10 drain flies, as an example (Fig. 7A), to demonstrate the outcome of each process.

Fig. 7. Background subtraction process and the result of ROI detection. (A) Ten drain flies placed in the bio-ball. (B) Background created from a video sequence. (C) Result of the background subtraction. (D) Red circles indicating the ROI generated by this system.

Background Extraction

Because of the large pixel-intensity difference between the insects and the background, taking the maximum pixel value across frames is a robust and efficient way to generate a clean background. Figure 7B shows the background image created from the insect-monitoring video by the method described in Materials and Methods (section Background Extraction). The quality of the background subtraction result (Fig. 7C) is excellent, with a low level of noise.

ROI Detection

The red circles in Fig. 7D show the ROI detected by the Hough transform; their accurate fit to the edge of the bio-ball confirms that detection was successful.

Dynamic Thresholding

Figure 8A shows the binarized image after applying the higher threshold value; drain fly no. 5 is covered by the ring of the bio-ball in the red-circled area, so extraction of this fly fails at the high threshold. Because the foreground then contains fewer regions than insects, the system automatically applies the lower threshold value and repeats the binarization. As shown in Fig. 8B, the drain fly covered by the ring is revealed by the lower threshold. The marks in Fig. 8C denote the centroids of the individual insects.

Fig. 8. The outcome of each step in the position extraction method. The red circles show drain fly no. 5 covered by the ring of the bio-ball. (A) Drain fly no. 5 cannot be extracted when using the higher threshold value. (B) Drain fly no. 5 is successfully detected after the system thresholds the image using the lower threshold value. (C) Different marks indicate the centroids of each drain fly individual. (D) Different colors denote the individual matching of the drain flies in the different zones.

Centroid Calculation and Tracking

After the centroids of the drain flies in the three zones have been verified, the points in these zones are matched to obtain the height information, as illustrated in Fig. 8D. For two-individual touching events, this study applied the K-means method to cluster the overlapping regions. The accuracy of identity assignment after touching events, verified manually, was 99.4% (994 correct identifications out of 1,000 touching events).

Computation Time

A desktop system with an eight-core processor (Intel Core i7-9700F) running at 3.00 GHz and 16 GB of random-access memory was used as the evaluation platform. The average computation time is 0.08 s per frame, so analyzing 10 min of insect behavior monitoring (12,000 frames at 20 fps) took just 16 min. The code is freely available for download at: https://doi.org/10.5061/dryad.qrfj6q5gm.

Flying Movement Detection

The two supplementary videos generated from our same experimental trial show the detection of flying behavior captured by video and its corresponding 3D trajectory (Supp Video 1 and Video 2 [online only]). Flying movements are marked with red circles. Both videos demonstrate that this system can distinguish the flying behavior from walking activity of drain flies.

Method for a Tracking Accuracy Test

Conducting a tracking accuracy test is difficult because the true 3D coordinates of an insect cannot be measured independently. However, the position and angle of the lens and mirrors were well calibrated in this study, so the geometric relationship between them is precise and stable. Consequently, if the position of an insect can be fully matched across zones II, III, and IV, the 3D coordinate calculated by this system is deemed 'correct', and knowledge of the true 3D coordinates is not needed to check the system's tracking accuracy.

Two tracking experiments were conducted, with 5 and 10 drain flies, respectively, to demonstrate the feasibility of the proposed methodology. After these experiments, the accuracy of the tracking system was examined manually by reviewing whether each extracted insect was matched correctly across zones II, III, and IV. Accuracy is thus defined as the number of insects correctly extracted and matched divided by the total number tested.

The accuracy of 3D tracking was 98.7% for 5 drain flies (1,481 correct out of 1,500 tracking instances) and 96.3% for 10 drain flies (2,889 correct out of 3,000). These high values demonstrate that the proposed methodology can successfully extract the 3D trajectories of multiple insects with a single camera. Figure 9 presents the estimated 3D trajectories, with different colors and numbers representing the identities of individual insects.

Fig. 9. Map of 3D trajectories. Differing colors and numbers correspond to different identities of the monitored drain fly individuals.

This study proposed a methodology for tracking multiple insects with a single camera in an uneven lighting environment. The novel arrangement of two mirrors at 45° eliminates the problem of synchronizing multiple cameras, and with it the need for a complex experimental design. The mirrors provide three viewing angles on mutually perpendicular planes, so the height of the insects can be obtained via the geometric relationships through simple image processing and calculation, with no complex mapping procedures. The system can successfully track multiple small insects and determine their 3D positions; both the position extraction method and the algorithm for matching individual insects among the three zones are effective and robust. The accuracy of 3D tracking for 5 and 10 drain flies is 98.7% and 96.3%, respectively. Even in critical situations, such as when one or more insects move to the dark edge or ring of the bio-ball, our experimental results show that the system continues to track successfully.

In closing, we anticipate that 3D trajectory monitoring experiments in studies of insect behavior and neuroscience could be performed more efficiently with this proposed system and its methodology.

Supplementary Material

ieab079_suppl_Supplementary_Material
ieab079_suppl_Supplementary_Video_1
ieab079_suppl_Supplementary_Video_2

Acknowledgments

We thank Chi-Lien Yang for discussion and comments on this project. We thank the Vienna Drosophila Resource Center and Bloomington Stock Center for fly stocks. This work was supported by the Brain Research Center under the Higher Education Sprout Project, cofunded by the Ministry of Education and the Ministry of Science and Technology in Taiwan. This work was also supported by research projects (grant 105-2628-E-007-002-MY3, 106-2811-E-007-031, 107-2923-E-007-004, 110-2634-F-007-025) funded by the Ministry of Science and Technology in Taiwan.

Author Contributions

C.-H.C.: Methodology, software, investigation, data curation, visualization, writing—original draft preparation. A.-S.C.: Resources, writing—reviewing and editing. H.-Y.T.: Methodology, software, supervision, visualization, writing—original draft preparation, writing—reviewing and editing, project administration, funding acquisition.

References Cited

  1. Ardekani, R., Biyani A., Dalton J. E., Saltz J. B., Arbeitman M. N., Tower J., Nuzhdin S., and Tavaré S. 2013. Three-dimensional tracking and behaviour monitoring of multiple fruit flies. J. R. Soc. Interface 10: 20120547.
  2. Blut, C., Crespi A., Mersch D., Keller L., Zhao L., Kollmann M., Schellscheidt B., Fülber C., and Beye M. 2017. Automated computer-based detection of encounter behaviours in groups of honeybees. Sci. Rep. 7: 17663.
  3. van Breugel, F., and Dickinson M. H. 2012. The visual control of landing and obstacle avoidance in the fruit fly Drosophila melanogaster. J. Exp. Biol. 215: 1783–1798.
  4. van Breugel, F., and Dickinson M. H. 2014. Plume-tracking behavior of flying Drosophila emerges from a set of distinct sensory-motor reflexes. Curr. Biol. 24: 274–286.
  5. Brooks, D. S., Vishal K., Kawakami J., Bouyain S., and Geisbrecht E. R. 2016. Optimization of wrMTrck to monitor Drosophila larval locomotor activity. J. Insect Physiol. 93-94: 11–17.
  6. Charabidze, D., Bourel B., Leblanc H., Hedouin V., and Gosset D. 2008. Effect of body length and temperature on the crawling speed of Protophormia terraenovae larvae (Robineau-Desvoidy) (Diptera: Calliphoridae). J. Insect Physiol. 54: 529–533.
  7. Crall, J. D., Gravish N., Mountcastle A. M., and Combes S. A. 2015. BEEtag: a low-cost, image-based tracking system for the study of animal behavior and locomotion. PLoS One 10: e0136487.
  8. Donelson, N., Kim E. Z., Slawson J. B., Vecsey C. G., Huber R., and Griffith L. C. 2012. High-resolution positional tracking for long-term analysis of Drosophila sleep and locomotion using the "tracker" program. PLoS One 7: e0037250.
  9. Fry, S. N., Müller P., Baumann H. J., Straw A. D., Bichsel M., and Robert D. 2004. Context-dependent stimulus presentation to freely moving animals in 3D. J. Neurosci. Methods 135: 149–157.
  10. Gernat, T., Rao V. D., Middendorf M., Dankowicz H., Goldenfeld N., and Robinson G. E. 2018. Automated monitoring of behavior reveals bursty interaction patterns and rapid spreading dynamics in honeybee social networks. Proc. Natl. Acad. Sci. USA 115: 1433–1438.
  11. Giancardo, L., Sona D., Huang H., Sannino S., Managò F., Scheggia D., Papaleo F., and Murino V. 2013. Automatic visual tracking and social behaviour analysis with multiple mice. PLoS One 8: e0074557.
  12. Guo, Y. Y., He D. J., and Liu C. 2018. Target tracking and 3D trajectory acquisition of cabbage butterfly (P. rapae) based on the KCF-BS algorithm. Sci. Rep. 8: 9622.
  13. Hughson, B. N., Anreiter I., Jackson Chornenki N. L., Murphy K. R., Ja W. W., Huber R., and Sokolowski M. B. 2018. The adult foraging assay (AFA) detects strain and food-deprivation effects in feeding-related traits of Drosophila melanogaster. J. Insect Physiol. 106: 20–29.
  14. Kohlhoff, K. J., Jahn T. R., Lomas D. A., Dobson C. M., Crowther D. C., and Vendruscolo M. 2011. The iFly tracking system for an automated locomotor and behavioural analysis of Drosophila melanogaster. Integr. Biol. (Camb) 3: 755–760.
  15. de Margerie, E., Simonneau M., Caudal J. P., Houdelier C., and Lumineau S. 2015. 3D tracking of animals in the field using rotational stereo videography. J. Exp. Biol. 218: 2496–2504.
  16. Martin, J. R. 2004. A portrait of locomotor behaviour in Drosophila determined by a video-tracking paradigm. Behav. Processes 67: 207–219.
  17. Mersch, D. P., Crespi A., and Keller L. 2013. Tracking individuals shows spatial fidelity is a key regulator of ant social organization. Science 340: 1090–1093.
  18. Nasir, N., and Mat S. 2019. An automated visual tracking measurement for quantifying wing and body motion of free-flying houseflies. Measurement 143: 267–275.
  19. Ofstad, T. A., Zuker C. S., and Reiser M. B. 2011. Visual place learning in Drosophila melanogaster. Nature 474: 204–207.
  20. Pistori, H., Viana Aguiar Odakura V. V., Oliveira Monteiro J. B., Gonçalves W. N., Roel A. R., de Andrade Silva J., and Machado B. B. 2010. Mice and larvae tracking using a particle filter with an auto-adjustable observation model. Pattern Recognit. Lett. 31: 337–346.
  21. Robie, A. A., Straw A. D., and Dickinson M. H. 2010. Object preference by walking fruit flies, Drosophila melanogaster, is mediated by vision and graviperception. J. Exp. Biol. 213: 2494–2506.
  22. Sinhuber, M., van der Vaart K., Ni R., Puckett J. G., Kelley D. H., and Ouellette N. T. 2019. Three-dimensional time-resolved trajectories from laboratory insect swarms. Sci. Data 6: 190036.
  23. Straw, A. D., Lee S., and Dickinson M. H. 2010. Visual control of altitude in flying Drosophila. Curr. Biol. 20: 1550–1556.
  24. Straw, A. D., Branson K., Neumann T. R., and Dickinson M. H. 2011. Multi-camera real-time three-dimensional tracking of multiple flying animals. J. R. Soc. Interface 8: 395–409.
  25. Sun, C., and Gaydecki P. 2021. A visual tracking system for honey bee (Hymenoptera: Apidae) 3D flight trajectory reconstruction and analysis. J. Insect Sci. 21: 1–12.
  26. Tofilski, A. 2004. DrawWing, a program for numerical description of insect wings. J. Insect Sci. 4: 105.
  27. Ulrich, Y., Saragosti J., Tokita C. K., Tarnita C. E., and Kronauer D. J. C. 2018. Fitness benefits and emergent division of labour at the onset of group living. Nature 560: 635–638.
  28. Wilkinson, D. A., Lebon C., Wood T., Rosser G., and Gouagna L. C. 2014. Straightforward multi-object video tracking for quantification of mosquito flight activity. J. Insect Physiol. 71: 114–121.
  29. Wu, H. S., Zhao Q., Zou D., and Chen Y. Q. 2011. Automated 3D trajectory measuring of large numbers of moving particles. Opt. Express 19: 7646–7663.
  30. Wu, M. C., Chu L. A., Hsiao P. Y., Lin Y. Y., Chi C. C., Liu T. H., Fu C. C., and Chiang A. S. 2014. Optogenetic control of selective neural activity in multiple freely moving Drosophila adults. Proc. Natl. Acad. Sci. USA 111: 5367–5372.
  31. Zabala, F., Polidoro P., Robie A., Branson K., Perona P., and Dickinson M. H. 2012. A simple strategy for detecting moving objects during locomotion revealed by animal-robot interactions. Curr. Biol. 22: 1344–1350.
  32. Zhang, Z. 2000. A flexible new technique for camera calibration. IEEE Trans. Pattern Anal. Mach. Intell. 22: 1330–1334.
