Abstract
Artificial intelligence (AI) and machine learning (ML) present revolutionary opportunities to enhance our understanding of animal behaviour and conservation strategies. Using elephants, crucial species in Africa's and Asia's protected areas, as our focal point, we delve into the role of AI and ML in their conservation. Given the increasing amounts of data gathered from a variety of sensors such as cameras, microphones, geophones, drones and satellites, the challenge lies in managing and interpreting these vast datasets. New AI and ML techniques offer solutions to streamline this process, helping us extract vital information that might otherwise be overlooked. This paper focuses on the different AI-driven monitoring methods and their potential for improving elephant conservation. Collaborative efforts between AI experts and ecological researchers are essential in leveraging these innovative technologies for enhanced wildlife conservation, setting a precedent for numerous other species.
Keywords: wildlife monitoring, elephant conservation, machine learning, artificial intelligence, elephant olfaction, elephant management
1. Introduction
Elephants hold significant intrinsic, biological, ecological and human cultural value. They possess unique genetic and physiological features [1], exhibit high levels of individual cognitive and emotional intelligence [2–4], maintain complex social behaviours and structures [5–9], serve as keystone species in their respective ecosystems [10–12], are highly valued by their local communities [13] and serve as flagship species for animal conservation [14].
Unfortunately, the three extant species of elephants, African savannah elephants (Loxodonta africana), African forest elephants (L. cyclotis) and Asian elephants (Elephas maximus), face multiple continued threats to their survival including habitat loss [15,16], human–elephant conflict/coexistence (HEC) [17] and poaching [18,19]. This paper outlines how conservationists can leverage advances in the fields of artificial intelligence (AI) and machine learning (ML) to develop new solutions and capabilities for studying, monitoring and managing elephants and their habitats to ensure their survival.
Elephant conservation challenges have grown more acute, varying across different elephant populations. It is essential to note that while all elephant populations face a multitude of problems, certain challenges have become predominant for specific populations. Forest elephants, especially in the Congo basin, face severe threats from poaching due to the allure of their hard ivory [18,19]. Savannah elephants are grappling with habitat fragmentation caused by human expansion, disrupting their traditional migration routes [20–23] and exacerbating HECs, particularly near farmland borders [17,20,24]. Meanwhile, Asian elephants, heavily impacted by habitat fragmentation, suffer from reduced genetic diversity due to limited mobility and are frequently compelled to relocate from shrinking habitats, often resulting in heightened conflicts [15,16]. As conservationists continue to tackle these complex issues, it is imperative that they are armed with state-of-the-art tools and monitoring techniques. This includes models and methods for enhancing dataset collection and annotation, particularly in countries where elephant conservation receives limited funding.
Conservation efforts have traditionally relied on a variety of techniques for monitoring elephant populations. These include direct methods, such as aerial and ground surveys [25,26], as well as indirect methods like analysing dung samples [27] and using camera traps [28]. However, these methods often have limitations. Aerial and ground surveys are labour-intensive, costly and affected by weather conditions and visibility [29–32]. Dung counts, on the other hand, may provide limited information on population demographics and are prone to errors and biases associated with the variation in dung decay rates across different parts of the elephant range. With advances in monitoring technology, wildlife ecologists can acquire more informative datasets for conservation efforts at potentially lower cost. As data collection becomes easier, the volume of data to be collected and analysed grows correspondingly.
The fields of AI and ML, which deal with high-level semantic reasoning and data-driven pattern recognition, are well suited to leverage this modern volume of data and to develop new solutions and capabilities for conservation science. Comprehensive reviews on AI/ML's application in wildlife ecology and conservation can be found in the works of Farley et al. [33], Christin [34] and Tuia et al. [35]. A recurring emphasis of these articles is the need for cross-disciplinary collaboration between conservationists and technologists. Here, we build on these calls by homing in on species-specific conservation priorities, surveying existing applications of AI techniques and sketching out problem spaces where this class of approaches may be well suited. This paper is geared towards elephant researchers and conservationists, but it may also be useful for AI/ML practitioners seeking entry points into this field, giving them a tangible sense of the technological hurdles standing between current methods and key elephant conservation goals.
In the following sections, we introduce AI/ML applications by measurement modality, i.e. imaging, acoustics, seismic and molecular. In doing so, we highlight the potential of AI/ML in addressing specific monitoring needs for elephants and discuss current and upcoming AI/ML techniques suitable for conservationists. The final section dives into the broader implications of AI in the field of elephant monitoring, focusing on strategies to reduce costs and to cope with massive new datasets.
2. Image and video monitoring
Imaging and video monitoring have long been key tools in elephant conservation. Until recently, these technologies required a person to analyse the recorded data to annotate elephant sightings, count elephants and record elephant behaviour. However, these tasks can be greatly facilitated by recent advances in AI monitoring.
2.1. Elephant detection
AI-assisted camera traps such as those partnered with the SMARTParks monitoring system [36], WildEye [37], EarthRanger [38] and Mbaza AI [39] have been increasingly used in parks and protected wildlands for elephant monitoring. These systems employ AI algorithms to detect and identify different species captured by camera traps, thereby helping human researchers sift through and process vast amounts of video data. As a result, they enable identification of elephants entering the camera trap's field of view. Though the technical details of the SMART and WildEye systems remain undisclosed, recent publications on species classification using camera trap images report classification accuracies of around 90% [40–44]. Mbaza AI, in particular, classifies 25 species with 96% accuracy [39]. While the performance of these systems is impressive, they have been reported to struggle when transferred to new environments [44], and the limited visible range of camera traps constrains their coverage as a census technique.
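To make the classification step concrete, the following is a minimal sketch of how a fine-tuned CNN might triage camera-trap images, assuming a trained checkpoint is available; the checkpoint path and label set are hypothetical placeholders rather than details of any system cited above.

```python
# Hedged sketch: CNN species triage for camera-trap images.
# "camera_trap_classifier.pt" and CLASSES are assumed placeholders.
import torch
import torchvision.transforms as T
from torchvision.models import resnet50
from PIL import Image

CLASSES = ["elephant", "buffalo", "empty"]  # hypothetical label set

model = resnet50(num_classes=len(CLASSES))
model.load_state_dict(torch.load("camera_trap_classifier.pt"))  # assumed checkpoint
model.eval()

preprocess = T.Compose([
    T.Resize((224, 224)),
    T.ToTensor(),
    T.Normalize(mean=[0.485, 0.456, 0.406], std=[0.229, 0.224, 0.225]),
])

def classify(image_path: str) -> tuple[str, float]:
    """Return the predicted species and its softmax confidence."""
    x = preprocess(Image.open(image_path).convert("RGB")).unsqueeze(0)
    with torch.no_grad():
        probs = torch.softmax(model(x), dim=1)[0]
    conf, idx = probs.max(dim=0)
    return CLASSES[int(idx)], float(conf)
```

In deployed systems, a detection stage typically precedes classification so that empty frames, which dominate camera-trap data, are discarded cheaply.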
AI techniques have also been integrated with aerial surveys for enhanced elephant monitoring. Cameras mounted on survey aircraft capture images, and AI algorithms are used to count and track elephants in the recorded data [31,45,46]. Often these tools are used to detect the sparse appearance of elephants within a dataset of hundreds of thousands of images. AI detection techniques eliminate the need for human observers on the aircraft and for manual inspection of the images after capture, and reduce the chances of miscounting due to human error and fatigue.
Recently, AI techniques have been used to count elephants and monitor their habitat use from high-resolution satellite images [47]. This approach shows great promise for wildlife monitoring, particularly in areas where elephants are not obscured by canopy and are sparsely dispersed across large home ranges, such as desert environments. However, several technical challenges limit the widespread adoption of this method, including the high cost of acquiring satellite images, the lack of high-resolution satellite coverage and the inability to detect elephants obscured under canopy. Although the growing number of high-resolution satellites and improving AI techniques for elephant detection may make this approach more feasible in the future, the authors suggest that oblique camera counts are currently more practical for detecting elephants, as they provide higher resolution images and allow for detection of elephants under tree canopies.
Finally, in the context of HECs, elephant detection techniques using camera traps have been employed to detect elephants approaching farmlands, serving as a warning to farmers about impending crop raids [48]. When these sensors activate, local farmers are informed and in some cases the elephants can be driven away.
2.2. Individual identification
Deep learning has achieved promising initial results on individual elephant identification; however, higher accuracy is needed before these methods are practical in the field. Korschens et al. [49] trained a convolutional neural network (CNN) to recognize individual elephants from a group of 276 with a 74% accuracy, using only a few training images per individual. Other studies have trained various neural network architectures to examine images or the contours of elephant ears for individual recognition, with the best results yielding an 88% accuracy [50,51]. Recent work has combined an ensemble of current elephant identification techniques into a tool called ElephantBook [52]. With this ensemble, the system achieved 93% accuracy in correctly identifying individuals within the top 15 matches.
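As an illustration of the matching step behind such tools, the hedged sketch below ranks known individuals by cosine similarity between embeddings, in the spirit of ElephantBook's top-15 ranking; the embedding network and reference gallery are assumed to exist and are not drawn from the cited systems.

```python
# Hedged sketch: rank known individuals by embedding similarity.
# The gallery of reference embeddings is assumed to come from a CNN
# trained with a metric-learning loss (e.g. triplet loss).
import numpy as np

def top_k_matches(query_emb: np.ndarray,
                  gallery: dict[str, np.ndarray],
                  k: int = 15) -> list[tuple[str, float]]:
    """Return the k most similar known individuals to a query embedding."""
    q = query_emb / np.linalg.norm(query_emb)
    scores = {name: float(q @ (e / np.linalg.norm(e)))
              for name, e in gallery.items()}
    return sorted(scores.items(), key=lambda kv: -kv[1])[:k]
```

A human reviewer then confirms the match from the short list, which is how semi-automated systems keep accuracy requirements practical.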
Improving the accuracy and robustness of individual identification from images would strengthen elephant monitoring, enabling individual elephants to be tracked across extensive time-frames and helping to recognize repeat offenders in HECs [53]. ML methods to identify individuals have been demonstrated in small animals using whole-body morphology, e.g. in fish [54], or body patterns, e.g. in cheetahs, tigers and giraffes [55,56]. However, the sheer size of elephants makes it difficult to capture their whole body, particularly in forested settings where visual obstruction by dense vegetation requires cameras to be placed close to anticipated sites of elephant crossing. Additionally, elephants lack consistent distinguishing patterning variation; instead, expert reviewers typically use morphological features based on ears (such as folds, lobe shapes and tears or holes), tusk or tush characteristics (such as angle and symmetry), tail length and brush type, and back slope to identify adult elephants [53,57,58]. Focused development of methods to characterize these morphological features in concert would greatly improve automatic identification of individuals. Techniques from AI-based human facial recognition [59] can also be adapted to body morphology recognition; they have been used with success for individual face recognition in bears [60,61], primates [62] and giant pandas [63].
2.3. Automated behaviour analysis
An emerging area with important conservation impacts is the study of individual elephant behaviour and its application in detecting perceived threats by elephants, understanding individual and group decision-making processes, and mitigating HECs. Direct field observations [64,65] and long-term ethological studies [66,67] continue to provide the foundation for understanding the rich and complex behaviours exhibited by elephants. These studies have built up invaluable collective knowledge of the form, function and contexts of elephant behaviour, and have accumulated into living multimedia resources such as The Elephant Ethogram [68]. Maturation of machine vision methods, coupled with advances in remote and sustained observation of elephant behaviour, offers a way to extend this knowledge and quantify individual behaviour at greater scales through automated behaviour analysis (ABA).
2.3.1. Behaviour observation and representation
Automated learning of visual-based behaviours analyses body posture representations over time. As such, this approach depends on several other technically challenging tasks, namely video acquisition that balances spatial resolution with field of view, followed by robust postural estimation. ABA methods assume that a significant portion of the variability in their input data is due to behavioural variability rather than imaging or environmental variability; thus, it is equally important to understand the capabilities and existing challenges of these preceding technical components.
Video-camera traps and camera-equipped unmanned aerial vehicles (UAVs) are offering researchers and conservationists more sustained and previously inaccessible views of elephant behaviour at high spatio-temporal resolution. Video-camera traps, which remain vital sensors for forest-dwelling elephants and cause minimal behavioural disruption, have been used successfully to classify behaviour by manual human review [69]. These systems are in the same technology class as camera traps and thus share similar methodological and logistical considerations, including camera placement, sensor sensitivity and resolution and environmental resilience [70]. The review by Dell et al. [71] on the challenges of image-based tracking in the field and their call to developers remain relevant even in light of the rapid progress in machine vision that has occurred in subsequent years.
One significant advance has been that of automatic pose estimation. Postural representations capture the relevant features of an animal's pose, such as appendage configuration using keypoint-based representations or soft-tissue and body condition using dense point cloud or surface mesh representations. We refer readers to Mathis & Mathis [72] and Jiang et al. [73] for a review of animal pose estimation. A persistent challenge is the standardization of pose and robustness to visual occlusion, irrespective of an animal's distance or orientation to the camera. This is an inherent challenge in observing with a single camera and representing pose in two dimensions, and can lead to corrupted downstream behavioural inferences that would otherwise be resolved with three-dimensional (3D) pose representations [74]. Multiple cameras may be used to resolve this inherent depth ambiguity, but this introduces additional data synchronization challenges and increases system cost. Research in ML methods for single-view (or monocular) 3D pose estimation provides a promising alternative solution [75]. The most feasible of these approaches is the use of shape and skeletal priors, whose utility has been demonstrated by existing methods in 3D animal pose estimation [76–78].
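To ground the keypoint-based representations discussed above, the sketch below decodes 2D keypoints from the per-joint heatmaps that many pose-estimation networks output; the network itself is assumed, and the confidence threshold is an illustrative choice.

```python
# Hedged sketch: decode (x, y, confidence) keypoints from predicted
# heatmaps, marking low-confidence (e.g. occluded) joints as missing.
import numpy as np

def decode_keypoints(heatmaps: np.ndarray, min_conf: float = 0.2) -> np.ndarray:
    """heatmaps: (K, H, W), one channel per body keypoint.
    Returns a (K, 3) array of (x, y, confidence), with NaN coordinates
    for keypoints below the confidence threshold."""
    K, H, W = heatmaps.shape
    out = np.full((K, 3), np.nan)
    for k in range(K):
        flat_idx = heatmaps[k].argmax()
        y, x = divmod(int(flat_idx), W)
        conf = float(heatmaps[k, y, x])
        out[k, 2] = conf
        if conf >= min_conf:
            out[k, 0], out[k, 1] = x, y
    return out
```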
UAVs provide unprecedented fields of view of animal groups and the mobility to follow them over difficult terrain in open landscapes such as the African savannah [79,80]. We note that overhead observation, as will be discussed, is limited in its ability to capture elephant behaviour occurring under trees, and even thermal imaging cameras do not have enough contrast or resolution during the daytime due to ambient temperatures. UAVs collecting oblique video footage overcome the fixed nature of ground-based video-camera traps and can be flexibly positioned to control the level of observation detail versus field of view (including at an angle to observe elephants under trees), but they otherwise face many of the same pose estimation challenges described above. A much simpler imaging condition is that of UAVs collecting overhead video footage, where steady altitude, relatively large distance to subjects and complete view of gross body position (e.g. orientation and heading) circumvent the need for fuller, 3D descriptions of pose. The trade-off is naturally in the detail of behavioural observation, but if the focus is on how cumulative behaviour and interactions influence group-level behaviour, then these observations are typically sufficient and already richer than the sparse point observations previously provided by telemetry-based systems [81]. A key development has been the robust and high-resolution georeferencing of free-ranging animals and landscape reconstruction from these aerial videos using a combination of data collection protocols, data fusion and machine vision [80]. This is particularly relevant for studying elephant behaviour over large areas such as migratory routes or when using multiple drones to capture diffuse group behaviour. We refer readers to Corcoran et al. [82] and Schad & Fischer [79] for more in-depth reviews on the use of drones for studying animal behaviour, and Koger et al. [80] for an accessible methodology guide spanning data collection, georeferencing and individual tracking and pose estimation. In spite of noise reduction technology, however, one persisting drawback of UAV-based monitoring is flight noise, which impacts elephant behaviour. The recognized utility of this observational approach has spurred active research by ecologists and conservationists in developing protocols to minimize UAV disturbance and improve habituation of elephants and other wildlife to UAVs [47,83,84]. Future behavioural and other conservation insights from aerial-based monitoring may catalyse further hardware development of quieter UAVs for conservation purposes.
2.3.2. Behaviour analysis
ML-based behavioural analysis methods have successfully expedited the automated annotation of video data in behavioural assays [85–87] and social interactions [88–90], and have been used to discover the structure and dynamics of expressed behaviour [91–93] and to reveal previously undescribed movement phenotypes in neurodevelopmental models [91,94]. Collective behaviour analysis has also been applied to many animal species to understand how decisions are made in a group [95]. This is an active area of research in the fields of neuroscience and behavioural biology, and we refer readers to Anderson & Perona [96], Valletta et al. [97], Datta et al. [98] and Couzin & Heins [99] for pedagogical introductions, comprehensive overviews and nuanced reviews of the field.
We consider the field of automatic behavioural analysis methods for elephant monitoring in terms of supervised approaches, sometimes referred to as automatic behaviour detection or action recognition, and unsupervised approaches, sometimes referred to as deep behavioural phenotyping. When the detection or study of well-defined behaviours is required, researchers can curate a labelled subset of video frames during which the behaviours of interest are exhibited, and use supervised approaches [85–87,100]. Such studies have demonstrated human-level accuracy in detecting whole-body behaviours in mice, such as floating, rearing and nose-to-nose contact in paired individuals, and more localized behaviours such as head dipping, grooming and scratching [85,100]. Applying these methods to automatically detect heightened attentiveness behaviours in elephants, such as fence touching, ear flapping and swaying, and to quantify their number of occurrences and duration would be useful in evaluating and sustaining the effectiveness of boundary deterrents such as beehive fences [69,101]; a sketch of this supervised route follows below. Automatic behavioural detection methods may also be useful in evaluating proxies of elephant psychological state, such as relaxed versus perceived threat, at key locations such as water sources or wildlife crossings. When a more extensive set of behaviours, patterns or open-ended ethological questions are of interest, unsupervised deep phenotyping [90–92,102–104] provides a means to identify behavioural sequences from the statistics of the individual and group postural data alone. The consistency of discovered sequences with ethologically relevant behaviour has been experimentally validated via optogenetic stimulation [74] and pharmaceutical induction [105], and has demonstrated sensitivity high enough to identify previously undescribed pausing and head-bobbing behavioural phenotypes in genetically modified mice [91]. Deep phenotyping could be used to study the differences in the type and structure of social behaviour between orphaned elephants and elephants from intact families [106–108], or translocated versus resident elephants [109,110], in order to improve re-introduction and social integration efforts.
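A minimal sketch of that supervised route, assuming keypoint trajectories and human-annotated behaviour labels are already available; the window sizes, features and labels are illustrative assumptions rather than a published elephant pipeline.

```python
# Hedged sketch: supervised behaviour detection on pose time series.
# Windowed statistics over keypoint trajectories feed a standard
# classifier; labels such as 'ear_flap' or 'sway' are assumed.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

def window_features(pose_seq: np.ndarray, win: int = 30, step: int = 15):
    """pose_seq: (T, K, 2) keypoint trajectories. Yields simple
    per-window statistics: mean/std of positions and mean joint speed."""
    for start in range(0, len(pose_seq) - win, step):
        w = pose_seq[start:start + win]                      # (win, K, 2)
        speed = np.linalg.norm(np.diff(w, axis=0), axis=2)   # (win-1, K)
        yield np.concatenate([w.mean(axis=(0, 2)),
                              w.std(axis=(0, 2)),
                              speed.mean(axis=0)])

# X: stacked window features; y: majority behaviour label per window.
# clf = RandomForestClassifier(n_estimators=200).fit(X, y)
```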
We note that, when studying behaviour in natural settings, preference should be given to ABA methods that operate on postural representations, which were previously introduced, rather than single-stage methods that operate directly on pixels. Single-stage methods, such as [85,91,111], directly infer behaviour from changes in pixel values and provide a streamlined ML pipeline. These methods work well in controlled imaging environments with simple, consistent backgrounds such as laboratory settings, where the changes in pixel values are almost entirely attributable to a behaving animal. However, these methods will suffer when changes in pixel values are due to non-behavioural variables, such as imaging conditions, changing backgrounds and visual appearance. These challenges are faced even in laboratory settings [74,78]. On the other hand, if videos are cropped to tightly bound the relevant subjects in order to reduce image-based variability, a few studies have demonstrated high performance using single-stage behavioural inference in more controlled but still complex settings, such as in an enriched, multi-level home cage of group-housed rhesus macaques [112] and in a natural forest clearing of a resident wild chimpanzee home range using handheld video recorders [113]. These demonstrations suggest that single-stage behavioural inference may still find application when fine-tuned for specific camera views and imaging conditions.
To the best of our knowledge, automated vision-based behavioural analysis has not yet been conducted in free-ranging elephants, and only minimally in other wild free-ranging mammals [113] (e.g. detecting nut cracking and passing food to mouth in wild chimpanzees [114], or detecting standing, ambling and galloping gaits in wild felines). A handful of papers have applied automated pose estimation to ground- and aerial-based videos ([115]: wild apes, [80]: ungulates), but have not explicitly conducted behavioural analyses. This may reflect the challenge, mentioned previously, of translating automated behavioural analyses from the laboratory into the wild. Even after constraining the variability of ABA inputs, be it pose or pixels, to just behavioural variability, challenges and limitations of existing methods remain. Automated behaviour detection faces the standard challenges associated with supervised learning approaches, such as data labelling time, annotator inconsistency and sampling and temporal bias. The ethological relevance of the behavioural sequence classes discovered by deep phenotyping requires manual human validation and semantic assignment in practice. These methods also produce more sequence classes than can be distinguished by eye, and these are typically agglomerated either by using a model variant that explicitly accounts for additional sources of behavioural variability, such as speed [116], or post hoc via manual curation [94].
Additional challenges in modelling individual and group elephant behaviour arise simply from its complexity and species-specificity, and the development of behavioural methods to capture these nuances has yet to be pursued. For example, elephant interactions can be indirect and unfold over tens of minutes; laboratory-motivated methods, focused on sub-second behavioural precision, are generally not designed to perform inference at these timescales, nor do they have the representational capacity to study such interactions. Moreover, elephants have complex, non-visual modes of signalling, explored in later sections, that are still only beginning to be understood and which may complicate the inference of indirect interactions within a group. Unprecedented group-level behaviour observations with individual behavioural resolution will require translation of theories from movement ecology [117,118] in order to better understand the group decision-making processes of socially hierarchical elephants.
The application of these techniques to behaviour patterns relevant to conservation is gradually being realized with advances in machine vision and laboratory-based ABA. For example, quantifying the 'personality' of individual elephants to assess their likelihood of engaging in crop raiding [53,119,120] would allow the efficient allocation of GPS collars and other resources to monitor high-risk individuals. Remote evaluation of elephant behaviour could draw from studies demonstrated in humans, such as gait tracking [121,122] and action patterns such as driving patterns [123], to inform wildlife managers about elephant health, wellness, intrinsic states and evolved behavioural strategies. Dedicated work by AI/ML practitioners will be required to robustly translate the latest AI and ML methods to continuous, variable observations of elephants in their natural settings; scientific and field experts are needed to guide the behavioural questions and metrics of highest conservation priority; and dedicated investments from interdisciplinary researchers will be necessary to develop novel methods for studying elephant-specific, evolved and emergent behavioural strategies.
3. Acoustic monitoring
Auditory communication plays a pivotal role in the social lives and survival of elephants. Using a diverse repertoire of calls, including infrasonic rumbles, trumpets and roars, elephants convey essential information related to group cohesion, reproductive status and alarm signals [67,124]. This audio communication contributes significantly to the complex social structure of elephants, facilitating coordination among group members, resource acquisition and the transmission of knowledge and social learning [124,125]. Consequently, the intricate acoustic communication system of elephants is a fundamental aspect of their ecology and behaviour.
Audio-based monitoring has emerged as a promising tool in elephant conservation efforts. By enabling tools such as elephant presence detection and localization, or individual recognition by vocal fingerprint, AI can offer valuable insights into elephant behaviour, communication and movement patterns.
3.1. Passive acoustic monitoring
Bio-acoustic data gathered from a strategically placed grid of microphones can be employed to detect the presence of elephants over a significantly wider range than camera traps. This approach, termed passive acoustic monitoring (PAM), possesses distinct advantages over imaging techniques. Whereas imaging is constrained by directionality and is ineffective at capturing occluded subjects, PAM can identify acoustic signals within a radius surrounding the microphone, even through obstructions such as dense foliage. This renders PAM particularly advantageous in forested terrains where visibility is compromised. Moreover, the low frequency of infrasonic elephant rumbles can be detected over 3 km from the source [126–128], facilitating a large potential radius of measurement. However, this radius is strongly influenced by factors such as foliage density, source amplitude and ambient noise levels.
PAM presents technical challenges, as working with real-world bio-acoustic data is complex due to background noise interfering with event detection and species classification. Moreover, separating individual calls from numerous simultaneous calls, often referred to as the ‘cocktail party problem’ in human audio analysis, is a crucial aspect of determining the number of elephants present at any given moment and it remains a formidable task. Another limitation of PAM is that elephants can only be detected when they are producing sound, which may result in some elephants going undetected.
Recent research from the Elephant Listening Project at Cornell University [129–132] has explored the application of AI to PAM for elephant call detection in the Congo Basin. In one study, the detector had reasonable performance (0.8 recall) in identifying when an elephant rumble occurred in a continuous wild recording [129], while another study achieved very similar performance (82% true positive score) in identifying when elephant calls occurred in audio recordings [132]. Furthermore, these techniques have been implemented on a portable embedded system, enabling on-board AI audio processing and the subsequent transmission of results [133], thus creating a low-power monitoring system for potential long-term measurements in the field. Other notable work on elephant detection includes a hidden Markov model that detected elephants from continuous infrasonic rumble recordings with 97.6% accuracy in the presence of noise [134], and Venter's work on elephant rumble detection [135], which used a voice activity detector to achieve 90.5% detection accuracy.
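For orientation, the sketch below shows perhaps the simplest non-ML baseline for flagging candidate rumbles: thresholding energy in the infrasonic band of a spectrogram. The band edges and threshold are assumptions for illustration; the published detectors above rely on learned models precisely because fixed thresholds fare poorly against real-world noise.

```python
# Hedged sketch: band-energy baseline for candidate infrasonic rumbles.
# Band edges (10-35 Hz) and the +10 dB threshold are assumed values.
import numpy as np
from scipy.signal import spectrogram

def candidate_rumbles(audio: np.ndarray, sr: int,
                      band=(10.0, 35.0), thresh_db: float = 10.0):
    """Return segment times (s) where infrasonic-band energy exceeds
    the recording's median band energy by thresh_db."""
    f, t, S = spectrogram(audio, fs=sr, nperseg=sr * 4)  # 4 s windows
    in_band = (f >= band[0]) & (f <= band[1])
    band_db = 10 * np.log10(S[in_band].sum(axis=0) + 1e-12)
    return t[band_db > np.median(band_db) + thresh_db]
```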
For future work on elephant presence detection via PAM, data preprocessing methods can be applied to enhance species classification performance. More robust denoising techniques, for instance, help eliminate non-critical events such as weather, making it easier to detect events of interest. Such techniques have demonstrated improved performance in the classification of whale clicks [136] and songs [137]. Additionally, when an elephant call is detected, multiple calls are sometimes present simultaneously. To potentially use PAM for census purposes [129], it is essential to determine the number of elephants present during an elephant call event. Deep learning source separation techniques can filter the data to provide separate audio streams for calls from different sources, showing promise across various bio-acoustic datasets [138]. Unsupervised source separation techniques have also demonstrated success in classification improvement [139], and unsupervised techniques have also been used to improve event detection [140]. The benefits of unsupervised learning are discussed more in §6.2.
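As one hedged example of such preprocessing, the sketch below applies simple spectral gating: it estimates a per-frequency noise floor from the recording itself and suppresses time-frequency bins near that floor. Production pipelines would use learned denoising or source separation, but the principle is the same.

```python
# Hedged sketch: spectral-gating denoising. The percentile noise-floor
# estimate and 6 dB gate margin are illustrative assumptions.
import numpy as np
from scipy.signal import stft, istft

def spectral_gate(audio: np.ndarray, sr: int,
                  floor_percentile: float = 20.0,
                  gate_db: float = 6.0) -> np.ndarray:
    f, t, Z = stft(audio, fs=sr, nperseg=2048)
    mag = np.abs(Z)
    floor = np.percentile(mag, floor_percentile, axis=1, keepdims=True)
    keep = mag > floor * 10 ** (gate_db / 20)   # bins well above the floor
    _, clean = istft(Z * keep, fs=sr, nperseg=2048)
    return clean
```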
Not only can this technology be used to detect elephants, but it can also aid elephant conservation by detecting gunshots and chainsaws, which are indicative of potential threats to wildlife and their habitats. Some work exists in this field [141], including approaches using CNNs [142,143]; however, the current accuracies of these algorithms make it impractical to rely on them alone. One notable exception is the work by Wrege et al. [129], which achieved impressive performance on gunshot detection (0.94 recall).
3.2. Individual identification
There is strong evidence that elephants can identify other individuals from their calls [5,6], and researchers have had some success in individual classification from audio data as well. Clemins et al. [144] demonstrated 83% accuracy in identifying individuals by employing a hidden Markov model with feature extraction. Interestingly, more advanced AI techniques have successfully identified individual lions from their roar recordings, achieving impressive accuracy of up to 98% [145,146]. These advanced techniques could readily be adapted to elephant data, likely enhancing the accuracy of individual identification from audio. Because this work depends on training on a known population with audio data labelled by individual caller, it is best suited for monitoring in areas where the home population has sufficient data.
3.3. Full-spectrum ensemble audio monitoring
AI applications in PAM tend to concentrate on specific call types or frequencies, such as rumbles or trumpets. To improve detection of elephants near monitoring stations, it is beneficial to integrate a broad set of elephant vocalizations and frequencies into a single classification system. However, given the challenges in estimating call production rates, this method is better suited for detecting elephant presence rather than accurate population censusing.
4. Seismic monitoring
Seismic signals, the transmission of low-frequency vibrations through the ground, constitute a potential component of elephant communication with significant implications for conservation efforts. As previously discussed, elephants produce low-frequency vocalizations, known as infrasound, that can travel long distances [126], easily up to 3 km [127]. Concurrently, the emission of these infrasonic calls generates seismic waves that propagate through the ground [147]. Elephants have been shown to detect these seismic vibrations through bone conduction and specialized mechano-receptors in their feet [148], potentially allowing them to communicate with one another over vast expanses of their habitat, conveying information about potential threats, herd movements, resource utilization and reproductive status [149,150]. Moreover, elephants can differentiate the same seismic call type from different individual elephants, owing to their ability to discriminate frequency changes within a narrow bandwidth in the low-frequency spectrum [149].
Seismic monitoring presents several compelling opportunities in the field of conservation. First, it provides an innovative way to detect elephant presence, which could be highly advantageous for potential future applications such as population censusing, monitoring, corridor planning and early warning systems to prevent HEC. Second, a deeper understanding of the social behaviour surrounding these seismic signals can be gained with long-distance monitoring techniques that are minimally disruptive to the animal’s environment. Finally, considering elephants’ ability to discern caller identity from the same call, it is plausible that additional information on elephants can be extracted with the analysis of seismic data. This may potentially include details such as the size of the caller, or the caller’s identity.
4.1. Elephant detection
While initial studies primarily focused on analysing elephant behaviour and responses to these seismic signals through geophone measurement [149], seismic monitoring has been proposed [150–152] and implemented [153] as a method for elephant population monitoring, encompassing both censusing and tracking. Non-ML techniques have been employed to detect elephant presence from seismic data for censusing purposes, achieving 85% accuracy [154], while continuous wavelet transforms reached 90% accuracy in detecting forest elephants [155]. Within the realm of ML, elephant calls have been classified from seismic measurements using support vector machines (SVMs) with 73% accuracy [153], neural networks with 87% accuracy [156] and CNNs attaining 80–90% accuracy at up to 100 m away [157].
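To illustrate the SVM route in the spirit of [153], the sketch below classifies geophone windows from spectral band energies; the feature choice and window handling are assumptions, not the cited study's exact pipeline.

```python
# Hedged sketch: SVM classification of seismic windows. Band-energy
# features and the RBF kernel are illustrative choices.
import numpy as np
from scipy.signal import welch
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

def band_energies(window: np.ndarray, sr: int, n_bands: int = 16) -> np.ndarray:
    """Summarize a geophone window as log energies in equal-width bands."""
    f, psd = welch(window, fs=sr, nperseg=min(len(window), 1024))
    return np.log([b.sum() + 1e-12 for b in np.array_split(psd, n_bands)])

# X: band-energy features per window; y: 1 = elephant call, 0 = other.
# clf = make_pipeline(StandardScaler(), SVC(kernel="rbf")).fit(X, y)
```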
Using AI techniques for species classification and localization within seismic data holds promise for enhancing the accuracy of elephant censusing. As noted above, elephant infrasonic calls are accompanied by corresponding seismic waves, and each modality propagates differently: depending on environmental factors, either the audio or the seismic signal may travel more effectively [158]. By integrating techniques for elephant detection across both audio and seismic modalities, it is possible to reduce noise and more accurately distinguish elephant sounds from background interference. This is particularly beneficial when the signal-to-noise level is low in both measurements, a scenario frequently encountered at extensive measurement radii [147,149]. Such bi-modal detection can not only extend the effective detection range beyond that achievable by either audio or seismic monitoring alone but also elevate the signal-to-noise ratio (SNR) without compromising the measurement radius. This advantage becomes especially pronounced in forested or densely vegetated regions where acoustic waves face substantial attenuation [147].
Localization of elephants is another essential task. Recent work has compared seismic and acoustic localization [150], revealing improved localization using seismic data. Moreover, the multimodal combination of seismic and acoustic measurements can provide location information with a limited array, as seismic and acoustic waves travel at different speeds, allowing distance to be determined by the delay between the two signals if the soil composition is known [159].
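The delay-based ranging idea admits a back-of-the-envelope formulation, sketched below. Since both waves leave the source together, the distance satisfies d/v_slow - d/v_fast = delay. Wave speeds vary strongly with soil composition and conditions; the values here are illustrative assumptions only.

```python
# Hedged sketch: range to a calling elephant from the arrival-time
# delay between the seismic and airborne components of the same call.
V_AIR = 343.0      # speed of sound in air (m/s), approx. 20 C
V_SEISMIC = 250.0  # assumed surface-wave speed for this soil (m/s)

def distance_from_delay(delay_s: float) -> float:
    """d/v_slow - d/v_fast = delay, so d = delay / (1/v_slow - 1/v_fast)."""
    v_fast, v_slow = max(V_AIR, V_SEISMIC), min(V_AIR, V_SEISMIC)
    return delay_s / (1.0 / v_slow - 1.0 / v_fast)

# With these assumed speeds, a 1.1 s delay implies roughly 1 km.
```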
In the context of behaviour monitoring, there is considerable potential for analysing call types and long-distance elephant communication [126], even though much remains uncharted. O'Connell-Rodwell's research group has significantly contributed to identifying the objectives of certain calls, such as when elephants are leaving a resource or are in oestrus [149]. Other researchers have effectively automated the classification of distinct behaviours through seismic measurements, including walking, running and rumbles [160]. The application of AI to behaviour monitoring within this domain continues to offer substantial opportunities for further exploration and advancement.
4.2. Future of seismic and acoustic monitoring
The potential also exists for acquiring additional information from seismic or low-frequency sound data through frequency analysis. Studies in whale bio-acoustics suggest that individual characteristics, such as body size, weight, age and sex, can be inferred through the analysis of whale songs, as the caller's unique size and vocal cord properties generate distinctive sounds associated with various phenotypes and traits [161]. Given that elephants can discern between familiar and unfamiliar callers based on their seismic signals, and that the frequency characteristics of the seismic wave an individual can transmit are influenced by their body size [149], it is plausible that an individual's traits can be deduced from their seismic call. These traits may include body size, individual identity, sex and age. Further research into the application of AI in this area will prove immensely beneficial for monitoring efforts.
5. Olfactory monitoring
Olfaction, or the sense of smell, plays a critical role in the lives of elephants as well as in conservation efforts. Elephants have an extraordinary sense of smell, which they rely on for various purposes, such as foraging, locating water sources [162], identifying herd members [163], communicating with other elephants [164] and navigating [165]. The elephant's olfactory system is highly developed, featuring a large number of olfactory receptor genes and a sophisticated vomeronasal organ [166,167]. These adaptations enable elephants to detect and interpret a wide range of chemical signals in their environment. By studying these chemical signals, researchers can better understand elephant behaviour, social structure and reproductive strategies. As our understanding of elephant olfaction and measurement technology improve, new opportunities will arise for monitoring chemical signals and how they affect elephant behaviour.
This section explores how AI analysis of olfactory measurements during elephant behaviour studies can be used to understand elephant social interactions and decision making. The implications of this work are substantial and can lead to very useful insights, such as identifying cheap and effective odour deterrents for mitigating HECs. However, the authors urge caution in applying these findings until a thorough understanding of olfactory sensing in elephants is obtained. This will ensure the appropriate use of odour deterrents and prevent any unintended consequences for the behaviour and welfare of elephants.
Traditionally, olfactory measurements have been difficult to make in the field due to hardware constraints; even measuring the presence of a known particulate in the air is difficult with current technology. The literature surveyed suggests that most studies on elephant olfaction either expose elephants to a known chemical and assess their reactions, or present the same scenario to elephants both with and without the chemical [162,163,165]. This means that in order to conduct this work, the researcher must first identify a particular chemical of interest and then test it, which is a very costly way to do research. Incorporating AI into molecular sensing technology can provide researchers with a more extensive dataset for understanding elephant behaviour.
5.1. AI-enhanced chemosensors
An electronic nose (e-Nose) or chemosensor is a device designed to mimic the biological olfactory system, capable of detecting and identifying various chemical compounds and odours in the surrounding environment. Though technology exists to mimic the actual biological sensors in noses [167], discerning meaning from these sensors is quite difficult. Recent work has applied AI to interpreting measurements from these chemosensors to infer the chemicals present in the air, with reasonable results [168–170], although most work focuses on inanimate sensors based on metal oxides or polymers, which are limited in what they are able to detect. Future work applying ML techniques to data from biological chemosensors, sensors whose sensing mechanism is based on the same biological materials as mammalian noses [167], shows considerable promise. Google has also spun out a startup researching AI for olfactory sensing [171]. The technology emerging from this field should lead to more portable systems, such as those seen in [168], with higher specificity in identifying airborne particulates.
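In ML terms the interpretation problem is a standard multi-channel classification task, as the hedged sketch below suggests; the sensor count, compound labels and placeholder data are entirely hypothetical.

```python
# Hedged sketch: classify e-Nose array readings into compound labels.
# The 32-channel array, three-class label set and random placeholder
# data are hypothetical; real work would use recorded sensor responses.
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 32))    # 200 samples x 32 sensor channels (placeholder)
y = rng.integers(0, 3, size=200)  # e.g. 0 = background, 1 = water, 2 = chilli

clf = GradientBoostingClassifier()
print(cross_val_score(clf, X, y, cv=5).mean())  # chance level on random data
```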
This increased precision in detecting airborne particulates will enable researchers to delve deeper into the role of olfactory cues in influencing elephant decisions, and help to understand elephant responses. One potential application of such high-precision measurements is discerning the influence of olfactory cues on elephants' migratory patterns. While elephants have demonstrated the ability to detect water through scent [162], other species are known to use chemical cues for navigation [172]. Improved olfactory sensing using AI can provide a tool for understanding whether elephants also rely on similar cues for their migratory decisions.
5.2. Olfactory deterrents for human–elephant conflict mitigation
Once certain chemicals are identified, these can be used in elephant conservation. Elephants have an aversion to certain smells, which can be used as olfactory deterrents to minimize HECs. Some deterrents, such as bees and chilli peppers, have been used to create barriers around crops to deter elephants from raiding [173–175]. These methods can help reduce crop damage and protect both humans and elephants from potentially dangerous encounters. However, some elephants may habituate to such deterrents over time, reducing their effectiveness. Moreover, the success of these deterrents can be influenced by factors such as the local environment, weather conditions and the availability of alternative food sources for elephants. Research into understanding the elephant olfactory universe can lead to far more subtle ways to reduce HECs.
6. Discussion
In this paper, we have reviewed various AI and ML methods that can benefit elephant monitoring and conservation. The key challenge for future work in this field is fostering a consistent and long-lasting collaboration between the fields of conservation and AI. The pace at which each field advances is quite different: AI is rapidly evolving, with new advancements emerging frequently; while elephant studies require prolonged periods to establish trust with subjects, observe them and adapt experiments to suit the population and environmental constraints. Collaborative efforts should not be limited to merely gathering data and applying AI tools. Instead, they should involve cyclical processes of data exploration, transformation, analysis, interpretation and communication. From a conservation perspective, this requires clear goal-setting, statements of the research objectives and keeping up to date on technological advancements relevant to the work. For AI professionals, early involvement in discussions on data collection is crucial, as well as gaining an understanding of the practical challenges and timelines associated with field work.
From the discussions in this review, it is evident that AI holds substantial potential for applications in elephant research and monitoring. These applications span from data collection and curation to deriving meaningful representations of data and generating novel scientific hypotheses. However, it is important to clarify the specific contexts where AI excels and its inherent limitations. Typically, AI and ML methods thrive in scenarios with extensive datasets and well-defined research questions. Their core strength resides in pattern recognition. For instance, supervised learning targets predefined patterns that researchers aim to detect, while unsupervised learning endeavours to understand the inherent structure, pattern or characteristics of a dataset. Additionally, AI is particularly efficient in scenarios requiring consistent and repetitive computation, especially when working with standardized or normalized datasets. By contrast, AI may not be the ideal solution for more ambiguous or heterogeneous data situations.
6.1. Transfer learning
In numerous video and imaging classification tasks for elephant recognition, the developed models often face difficulties in transferring to new environments due to varying environmental background features and the presence of unfamiliar species [176,177]. Adapting these networks to new environments can be expensive, as it necessitates collecting and annotating new data. Several approaches can be employed to enhance a model's transferability without requiring the direct measurement of new datasets in the field. First, AI model developers may use simulated data to broaden the training dataset, which has demonstrated promising results [178]. Another strategy involves creating more general models that exhibit robustness across multiple environments, followed by fine-tuning for specific tasks of interest [176]; a sketch of this strategy follows below. A notable example of this is MegaDetector, a tool used to detect animals in camera trap images with an impressive ability to transfer to new environments easily [179]. Finally, self-supervised or unsupervised methods can be employed for training models, enabling dataset expansion without the need for labels.
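The fine-tuning strategy mentioned above is straightforward in practice, as this minimal sketch shows: freeze a general-purpose pretrained backbone and retrain only a small head on the new site's labelled images. The two-class head is an illustrative assumption.

```python
# Hedged sketch: fine-tune a pretrained backbone for a new environment.
# Only the replacement head is trained; the backbone stays frozen.
import torch
import torch.nn as nn
from torchvision.models import resnet50, ResNet50_Weights

model = resnet50(weights=ResNet50_Weights.DEFAULT)  # generic pretrained backbone
for p in model.parameters():
    p.requires_grad = False                          # freeze the backbone
model.fc = nn.Linear(model.fc.in_features, 2)        # new head: elephant / not

optimizer = torch.optim.Adam(model.fc.parameters(), lr=1e-3)
loss_fn = nn.CrossEntropyLoss()
# A standard training loop over the new site's labelled images follows;
# only model.fc is updated, so far less data and compute are needed.
```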
6.2. Self-supervised and unsupervised methods
A significant portion of AI work in the field relies on the collection of vast amounts of data. Furthermore, the majority of techniques reviewed in this paper use supervised training, requiring extensive data annotation for effective model training. The combined requirements of data collection and labour-intensive annotation lead to considerable development costs for these tools. In a resource-constrained field, such as conservation, these costs pose substantial obstacles to the ongoing development and application of AI solutions.
Self-supervised and unsupervised learning methods are AI techniques that do not rely on labelled data, allowing them to automatically discover patterns and structure within datasets without needing humans to mark or annotate the data first. This area of research is currently experiencing significant activity. Notable examples include self-supervision in images to learn valuable information from the images and videos automatically [180,181], and using unsupervised methods to improve and automate bio-acoustics analysis of PAM datasets without labelling [139,140]. Moreover, innovative techniques have emerged for animal behaviour monitoring without the need for labels using unsupervised techniques [91,103]. As these methods continue to advance, they will not only reduce the costs associated with data collection but also facilitate novel insights into animal behaviour that may elude human consideration.
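To give a flavour of how such methods learn without labels, the sketch below implements the contrastive (SimCLR-style) objective common in self-supervised image learning: embeddings of two augmented views of the same image are pulled together and all other pairings pushed apart. The encoder producing the embeddings and the augmentation pipeline are assumed.

```python
# Hedged sketch: NT-Xent contrastive loss over two views of a batch.
import torch
import torch.nn.functional as F

def nt_xent(z1: torch.Tensor, z2: torch.Tensor, tau: float = 0.5) -> torch.Tensor:
    """z1, z2: (N, D) embeddings of two augmented views of the same N images."""
    z = F.normalize(torch.cat([z1, z2]), dim=1)   # (2N, D) unit vectors
    sim = z @ z.T / tau                           # scaled cosine similarities
    sim.fill_diagonal_(float("-inf"))             # exclude self-similarity
    n = z1.shape[0]
    # Row i's positive is its other view: i+n for the first half, i-n after.
    targets = torch.cat([torch.arange(n, 2 * n), torch.arange(n)])
    return F.cross_entropy(sim, targets)
```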
7. Conclusion
AI techniques have emerged as powerful tools for enhancing elephant monitoring and conservation efforts. AI-based techniques in imaging, video, audio, seismic and olfactory modalities have shown promising advancements, offering more sophisticated and efficient alternatives to traditional monitoring methods. These monitoring techniques have the potential to improve the accuracy and efficiency of elephant behaviour studies, automate data labelling and analysis, and detect nuanced behaviours or patterns that may enhance the capabilities of human observers. Furthermore, the ongoing development of novel data capture hardware systems, such as drone monitoring and olfactory measurement, generates a vast array of data that can be harnessed to create powerful AI analysis tools.
Advancing these AI models requires consistent collaboration between AI specialists and conservationists, and some methods, tools and analyses will need to be uniquely tailored for elephant research. These tools will arise primarily from needs within that field, for example, identifying individual elephants. For this work, input from elephant experts is essential; relying solely on adjusting general models will not be adequate for these specialized needs.
Additionally, challenges remain in transferring AI models to new environments, considering the high cost of data collection and annotation and the limitations of supervised learning techniques. Future research should focus on strategies such as using simulated data, fine-tuning pretrained models and employing self-supervised or unsupervised techniques to address these data shortcomings and promote wider adoption of AI in elephant conservation efforts.
Finally, while each monitoring modality has been addressed individually in this overview, integrating various modalities in future work can provide a comprehensive understanding of the stimuli that elephants encounter and their perception of their environment, and can aid in understanding how elephants make decisions.
We conclude that AI offers significant advancements in animal monitoring, with elephants being particularly apt for the application of these techniques. As AI applications in this domain evolve and challenges are addressed, our understanding and protection of elephants will improve. This not only supports the longevity of elephants and their habitats but also paves the way for technologies that could aid broader conservation initiatives.
Acknowledgements
We would like to thank Matt James for his extensive knowledge and guidance in the field of elephant conservation. His direction was invaluable in connecting us with the right experts, which significantly informed our work and understanding. We also wish to acknowledge the use of AI language services which contributed to the editing and improvement of the readability of this paper. As befits the subject, an AI service was used for its editing, while the content of the paper was independently generated by the only too human authors.
Data accessibility
This article has no additional data.
Declaration of AI use
Yes, we have used AI-assisted technologies in creating this article.
Authors' contributions
L.B.: conceptualization, data curation, formal analysis, investigation, methodology, project administration, writing—original draft, writing—review and editing; L.Z.: conceptualization, formal analysis, investigation, writing—review and editing; F.V.: conceptualization, investigation, writing—review and editing; I.D.-H.: conceptualization, writing—review and editing; A.J.T.: conceptualization, supervision, writing—review and editing.
All authors gave final approval for publication and agreed to be held accountable for the work performed therein.
Conflict of interest declaration
We declare we have no competing interests.
Funding
This work was internally funded by Colossal Laboratories & Biosciences.
References
- 1. Dagenais P, Hensman S, Haechler V, Milinkovitch MC. 2021. Elephants evolved strategies reducing the biomechanical complexity of their trunk. Curr. Biol. 31, 4727-4737. (10.1016/j.cub.2021.08.029)
- 2. Chevalier-Skolnikoff S, Liska J. 1993. Tool use by wild and captive elephants. Anim. Behav. 46, 209-219. (10.1006/anbe.1993.1183)
- 3. Plotnik JM, De Waal FB, Reiss D. 2006. Self-recognition in an Asian elephant. Proc. Natl Acad. Sci. USA 103, 17 053-17 057. (10.1073/pnas.0608062103)
- 4. Goldenberg SZ, Wittemyer G. 2020. Elephant behavior toward the dead: a review and insights from field observations. Primates 61, 119-128. (10.1007/s10329-019-00766-5)
- 5. McComb K, Reby D, Baker L, Moss C, Sayialel S. 2003. Long-distance communication of acoustic cues to social identity in African elephants. Anim. Behav. 65, 317-329. (10.1006/anbe.2003.2047)
- 6. McComb K, Moss C, Sayialel S, Baker L. 2000. Unusually extensive networks of vocal recognition in African elephants. Anim. Behav. 59, 1103-1109. (10.1006/anbe.2000.1406)
- 7. Wittemyer G, Douglas-Hamilton I, Getz WM. 2005. The socioecology of elephants: analysis of the processes creating multitiered social structures. Anim. Behav. 69, 1357-1371. (10.1016/j.anbehav.2004.08.018)
- 8. de Silva S, Ranjeewa AD, Kryazhimskiy S. 2011. The dynamics of social networks among female Asian elephants. BMC Ecol. 11, 17. (10.1186/1472-6785-11-17)
- 9. Lee PC, Moss CJ. 2014. African elephant play, competence and social complexity. Anim. Behav. Cogn. 1, 144-156. (10.12966/abc.05.05.2014)
- 10. Santiapillai C. 2004. The living elephants: evolutionary ecology, behaviour, and conservation.
- 11. Haynes G. 2012. Elephants (and extinct relatives) as earth-movers and ecosystem engineers. Geomorphology 157–158, 99-107. (10.1016/j.geomorph.2011.04.045)
- 12. Guldemond RAR, Purdon A, van Aarde RJ. 2017. A systematic review of elephant impact across Africa. PLoS ONE 12, e0178935. (10.1371/journal.pone.0178935)
- 13. Kahindi O. 2001. Cultural perceptions of elephants by the Samburu people in northern Kenya. Unpublished master's dissertation, University of Strathclyde, UK.
- 14. Bowen-Jones E, Entwistle A. 2002. Identifying appropriate flagship species: the importance of culture and local contexts. Oryx 36, 189-195. (10.1017/S0030605302000261)
- 15. Santiapillai C. 1997. The Asian elephant conservation: a global strategy. Gajah 18, 21-39.
- 16. Saaban S, Othman NB, Yasak MNB, Burhanuddin M, Zafir A, Campos-Arceiz A. 2011. Current status of Asian elephants in Peninsular Malaysia. Gajah 35, 67-75.
- 17. Perera B. 2009. The human-elephant conflict: a review of current status and mitigation methods. Gajah 30, 41-52.
- 18. Blake S, et al. 2007. Forest elephant crisis in the Congo Basin. PLoS Biol. 5, e111. (10.1371/journal.pbio.0050111)
- 19. Wittemyer G, Northrup JM, Blanc J, Douglas-Hamilton I, Omondi P, Burnham KP. 2014. Illegal killing for ivory drives global decline in African elephants. Proc. Natl Acad. Sci. USA 111, 13 117-13 121. (10.1073/pnas.1403984111)
- 20. Gross EM, Pereira JG, Shaba T, Bilério S, Kumchedwa B, Lienenlüke S. 2022. Exploring routes to coexistence: developing and testing a human–elephant conflict-management framework for African elephant-range countries. Diversity 14, 525. (10.3390/d14070525)
- 21. Di Minin E, Slotow R, Fink C, Bauer H, Packer C. 2021. A pan-African spatial assessment of human conflicts with lions and elephants. Nat. Commun. 12, 2978. (10.1038/s41467-021-23283-w)
- 22. Hoare R. 2000. African elephants and humans in conflict: the outlook for co-existence. Oryx 34, 34-38. (10.1046/j.1365-3008.2000.00092.x)
- 23. Sach F, Dierenfeld ES, Langley-Evans SC, Watts MJ, Yon L. 2019. African savanna elephants (Loxodonta africana) as an example of a herbivore making movement choices based on nutritional needs. PeerJ 7, e6260. (10.7717/peerj.6260)
- 24. Wenborn M, et al. 2022. Analysis of records from community game guards of human-elephant conflict in Orupupa Conservancy, northwest Namibia. Namib. J. Environ. 6, A-100.
- 25. Chase M, Schlossberg S, Sutcliffe R, Seonyatseng E. 2018. Dry season aerial survey of elephants and wildlife in northern Botswana, July-October 2018. Gaborone, Botswana: Department of Wildlife and National Parks.
- 26. Dunham K, Mackie C, Nyaguse G, Zhuwau C. 2015. Aerial survey of elephants and other large herbivores in the Sebungwe (Zimbabwe): 2014. Seattle, WA: Great Elephant Census.
- 27. Barnes R, Beardsley K, Michelmore F, Barnes K, Alers M, Blom A. 1997. Estimating forest elephant numbers with dung counts and a geographic information system. J. Wildl. Manag. 61, 1384-1393. (doi:10.2307/3802142)
- 28. Smit J, Pozo RA, Cusack JJ, Nowak K, Jones T. 2019. Using camera traps to study the age-sex structure and behaviour of crop-using elephants Loxodonta africana in Udzungwa Mountains National Park, Tanzania. Oryx 53, 368-376. (doi:10.1017/S0030605317000345)
- 29. Schlossberg S, Chase MJ, Griffin CR. 2016. Testing the accuracy of aerial surveys for large mammals: an experiment with African savanna elephants (Loxodonta africana). PLoS ONE 11, e0164904. (doi:10.1371/journal.pone.0164904)
- 30. Koneff MD, Royle JA, Otto MC, Wortham JS, Bidwell JK. 2008. A double-observer method to estimate detection rate during aerial waterfowl surveys. J. Wildl. Manag. 72, 1641-1649. (doi:10.2193/2008-036)
- 31. Lamprey R, Ochanda D, Brett R, Tumwesigye C, Douglas-Hamilton I. 2020. Cameras replace human observers in multi-species aerial counts in Murchison Falls, Uganda. Remote Sens. Ecol. Conserv. 6, 529-545. (doi:10.1002/rse2.154)
- 32. Lamprey R, Pope F, Ngene S, Norton-Griffiths M, Frederick H, Okita-Ouma B, Douglas-Hamilton I. 2020. Comparing an automated high-definition oblique camera system to rear-seat-observers in a wildlife survey in Tsavo, Kenya: taking multi-species aerial counts to the next level. Biol. Conserv. 241, 108243. (doi:10.1016/j.biocon.2019.108243)
- 33. Farley SS, Dawson A, Goring SJ, Williams JW. 2018. Situating ecology as a big-data science: current advances, challenges, and solutions. BioScience 68, 563-576. (doi:10.1093/biosci/biy068)
- 34. Christin S. 2019. Applications for deep learning in ecology. Methods Ecol. Evol. 10, 1632-1644. (doi:10.1111/2041-210X.13256)
- 35. Tuia D, et al. 2022. Perspectives in machine learning for wildlife conservation. Nat. Commun. 13, 792. (doi:10.1038/s41467-022-27980-y)
- 36. SMART Partnership. n.d. SMART Parks. See https://smartconservationtools.org/.
- 37. Elephant survey system. 2023. See https://wildeyeconservation.org/elephant-survey-system/.
- 38. Vulcan Inc. n.d. EarthRanger. See https://www.earthranger.com/.
- 39. Alim AN. 2021. Stop the illegal wildlife trade: how artificial intelligence has become the latest conservation tool. See https://www.independent.co.uk/stop-the-illegal-wildlife-trade/ai-technology-wildlife-conservation-gabon-b1783812.html.
- 40. Tabak MA, et al. 2019. Machine learning to classify animal species in camera trap images: applications in ecology. Methods Ecol. Evol. 10, 585-590. (doi:10.1111/2041-210X.13120)
- 41. Vecvanags A, Aktas K, Pavlovs I, Avots E, Filipovs J, Brauns A, Done G, Jakovels D, Anbarjafari G. 2022. Ungulate detection and species classification from camera trap images using RetinaNet and Faster R-CNN. Entropy 24, 353. (doi:10.3390/e24030353)
- 42. Willi M, Pitman RT, Cardoso AW, Locke C, Swanson A, Boyer A, Veldthuis M, Fortson L. 2019. Identifying animal species in camera trap images using deep learning and citizen science. Methods Ecol. Evol. 10, 80-91. (doi:10.1111/2041-210X.13099)
- 43. Villa AG, Salazar A, Vargas F. 2017. Towards automatic wild animal monitoring: identification of animal species in camera-trap images using very deep convolutional neural networks. Ecol. Inform. 41, 24-32. (doi:10.1016/j.ecoinf.2017.07.004)
- 44. Schneider S, Greenberg S, Taylor GW, Kremer SC. 2020. Three critical factors affecting automated image species recognition performance for camera traps. Ecol. Evol. 10, 3503-3517. (doi:10.1002/ece3.6147)
- 45. Gonzalez LF, Montes GA, Puig E, Johnson S, Mengersen K, Gaston KJ. 2016. Unmanned aerial vehicles (UAVs) and artificial intelligence revolutionizing wildlife monitoring and conservation. Sensors 16, 97. (doi:10.3390/s16010097)
- 46. Artificial intelligence and elephant conservation. 2020. See https://www.4elephants.org/blog/article/artificial-intelligence-and-elephant-conservation.
- 47. Duporge I, Spiegel MP, Thomson ER, Chapman T, Lamberth C, Pond C, Macdonald DW, Wang T, Klinck H. 2021. Determination of optimal flight altitude to minimise acoustic drone disturbance to wildlife using species audiograms. Methods Ecol. Evol. 12, 2196-2207. (doi:10.1111/2041-210X.13691)
- 48. Premarathna KSP, Rathnayaka RMKT. 2020. CNN based image detection system for elephant directions to reduce human-elephant conflict. In 13th Int. Research Conf., General Sir John Kotelawala Defence University, Computing Sessions.
- 49. Körschens M, Barz B, Denzler J. 2018. Towards automatic identification of elephants in the wild. (http://arxiv.org/abs/1812.04418)
- 50. De Silva M, Kumarasinghe P, De Zoysa K, Keppitiyagama C. 2022. Reidentifying Asian elephants from ear images using a cascade of convolutional neural networks and explaining with GradCAM. SN Comput. Sci. 3, 192. (doi:10.1007/s42979-022-01057-5)
- 51. Weideman H, et al. 2020. Extracting identifying contours for African elephants and humpback whales using a learned appearance model. In Proc. IEEE/CVF Winter Conf. on Applications of Computer Vision, Snowmass Village, CO, pp. 1276-1285.
- 52. Kulits P, Wall J, Bedetti A, Henley M, Beery S. 2021. ElephantBook: a semi-automated human-in-the-loop system for elephant re-identification. In ACM SIGCAS Conf. on Computing and Sustainable Societies, virtual event, Australia, pp. 88-98.
- 53. Srinivasaiah N, Kumar V, Vaidyanathan S, Sukumar R, Sinha A. 2019. All-male groups in Asian elephants: a novel, adaptive social strategy in increasingly anthropogenic landscapes of southern India. Sci. Rep. 9, 8678. (doi:10.1038/s41598-019-45130-1)
- 54. Ditria EM, Lopez-Marcano S, Sievers M, Jinks EL, Brown CJ, Connolly RM. 2020. Automating the analysis of fish abundance using object detection: optimizing animal ecology with deep learning. Front. Mar. Sci. 7, 429. (doi:10.3389/fmars.2020.00429)
- 55. Cheema GS, Anand S. 2017. Automatic detection and recognition of individuals in patterned species. In Machine Learning and Knowledge Discovery in Databases: European Conf., ECML PKDD 2017, Skopje, Macedonia, 18-22 September 2017, Proceedings, Part III, pp. 27-38. Springer.
- 56. Shi C, Liu D, Cui Y, Xie J, Roberts NJ, Jiang G. 2020. Amur tiger stripes: individual identification based on deep convolutional neural network. Integr. Zool. 15, 461-470. (doi:10.1111/1749-4877.12453)
- 57. Foley CA, Faust LJ. 2010. Rapid population growth in an elephant Loxodonta africana population recovering from poaching in Tarangire National Park, Tanzania. Oryx 44, 205-212. (doi:10.1017/S0030605309990706)
- 58. Montero-De La Torre S, Jacobson SL, Chodorow M, Yindee M, Plotnik JM. 2023. Day and night camera trap videos are effective for identifying individual wild Asian elephants. PeerJ 11, e15130. (doi:10.7717/peerj.15130)
- 59. Taskiran M, Kahraman N, Erdem CE. 2020. Face recognition: past, present and future (a review). Digit. Signal Process. 106, 102809. (doi:10.1016/j.dsp.2020.102809)
- 60. Clapham M, Miller E, Nguyen M, Darimont CT. 2020. Automated facial recognition for wildlife that lack unique markings: a deep learning approach for brown bears. Ecol. Evol. 10, 12883-12892. (doi:10.1002/ece3.6840)
- 61. Clapham M, Miller E, Nguyen M, Van Horn RC. 2022. Multispecies facial detection for individual identification of wildlife: a case study across ursids. Mamm. Biol. 102, 921-933. (doi:10.1007/s42991-021-00168-5)
- 62. Shukla A, Cheema GS, Anand S, Qureshi Q, Jhala Y. 2019. Primate face identification in the wild. In PRICAI 2019: Trends in Artificial Intelligence: 16th Pacific Rim Int. Conf. on Artificial Intelligence, Cuvu, Yanuca Island, Fiji, 26-30 August 2019, Proceedings, Part III, pp. 387-401. Springer.
- 63. Hou J, et al. 2020. Identification of animal individuals using deep learning: a case study of giant panda. Biol. Conserv. 242, 108414. (doi:10.1016/j.biocon.2020.108414)
- 64. Kahl MP, Armstrong BD. 2002. Visual displays of wild African elephants during musth. Mammalia 66, 159-172.
- 65. Douglas-Hamilton I, Bhalla S, Wittemyer G, Vollrath F. 2006. Behavioural reactions of elephants towards a dying and deceased matriarch. Appl. Anim. Behav. Sci. 100, 87-102. (doi:10.1016/j.applanim.2006.04.014)
- 66. Turkalo AK, Wrege PH, Wittemyer G. 2013. Long-term monitoring of Dzanga Bai forest elephants: forest clearing use patterns. PLoS ONE 8, e85154. (doi:10.1371/journal.pone.0085154)
- 67. Moss CJ, Croze H, Lee PC. 2011. The Amboseli elephants: a long-term perspective on a long-lived mammal. Chicago, IL: University of Chicago Press.
- 68. Poole J, Granli P. 2021. The elephant ethogram: a library of African elephant behaviour. Pachyderm 62, 105-111.
- 69. van de Water A, et al. 2020. Beehive fences as a sustainable local solution to human-elephant conflict in Thailand. Conserv. Sci. Pract. 2, e260. (doi:10.1111/csp2.260)
- 70. O’Connell AF, Nichols JD, Karanth KU (eds). 2011. Camera traps in animal ecology. Tokyo, Japan: Springer Japan.
- 71. Dell AI, et al. 2014. Automated image-based tracking and its application in ecology. Trends Ecol. Evol. 29, 417-428. (doi:10.1016/j.tree.2014.05.004)
- 72. Mathis MW, Mathis A. 2020. Deep learning tools for the measurement of animal behavior in neuroscience. Curr. Opin. Neurobiol. 60, 1-11. (doi:10.1016/j.conb.2019.10.008)
- 73. Jiang L, Lee C, Teotia D, Ostadabbas S. 2022. Animal pose estimation: a closer look at the state-of-the-art, existing gaps and opportunities. Comput. Vis. Image Underst. 222, 103483. (doi:10.1016/j.cviu.2022.103483)
- 74. Günel S, Rhodin H, Morales D, Campagnolo J, Ramdya P, Fua P. 2019. DeepFly3D, a deep learning-based approach for 3D limb and appendage tracking in tethered, adult Drosophila. eLife 8, e48571. (doi:10.7554/eLife.48571)
- 75. Liu W, Bao Q, Sun Y, Mei T. 2022. Recent advances of monocular 2D and 3D human pose estimation: a deep learning perspective. ACM Comput. Surv. 55, 1-41. (doi:10.1145/3524497)
- 76. Zuffi S, Kanazawa A, Jacobs DW, Black MJ. 2017. 3D menagerie: modeling the 3D shape and pose of animals. In Proc. IEEE Conf. on Computer Vision and Pattern Recognition, Honolulu, HI, pp. 6365-6373.
- 77. Zhang L, Dunn T, Marshall J, Olveczky B, Linderman S. 2021. Animal pose estimation from video data with a hierarchical von Mises-Fisher-Gaussian model. In Proc. 24th Int. Conf. on Artificial Intelligence and Statistics (eds A Banerjee, K Fukumizu), Proc. Machine Learning Research vol. 130, 13-15 April 2021, pp. 2800-2808. PMLR.
- 78. Hu B, et al. 2023. 3D mouse pose from single-view video and a new dataset. Sci. Rep. 13, 13554. (doi:10.1038/s41598-023-40738-w)
- 79. Schad L, Fischer J. 2023. Opportunities and risks in the use of drones for studying animal behaviour. Methods Ecol. Evol. 14, 1864-1872. (doi:10.1111/2041-210X.13922)
- 80. Koger B, Deshpande A, Kerby JT, Graving JM, Costelloe BR, Couzin ID. 2023. Quantifying the movement, behaviour and environmental context of group-living animals using drones and computer vision. J. Anim. Ecol. 92, 1357-1371. (doi:10.1111/1365-2656.13904)
- 81. Costa-Pereira R, Moll RJ, Jesmer BR, Jetz W. 2022. Animal tracking moves community ecology: opportunities and challenges. J. Anim. Ecol. 91, 1334-1344. (doi:10.1111/1365-2656.13698)
- 82. Corcoran E, Winsen M, Sudholz A, Hamilton G. 2021. Automated detection of wildlife using drones: synthesis, opportunities and constraints. Methods Ecol. Evol. 12, 1103-1114. (doi:10.1111/2041-210X.13581)
- 83. Hartmann WL, Fishlock V, Leslie A. 2021. First guidelines and suggested best protocol for surveying African elephants (Loxodonta africana) using a drone. Koedoe 63, 1-9. (doi:10.4102/koedoe.v63i1.1687)
- 84. van Vuuren M, Silverberg LM, Manning J, Pacifici K, Dorgeloh W, Campbell J. 2023. Ungulate responses and habituation to unmanned aerial vehicles in Africa’s savanna. PLoS ONE 18, e0288975. (doi:10.1371/journal.pone.0288975)
- 85. Bohnslav JP, et al. 2021. DeepEthogram, a machine learning pipeline for supervised behavior classification from raw pixels. eLife 10, e63377. (doi:10.7554/eLife.63377)
- 86. Kabra M, Robie AA, Rivera-Alba M, Branson S, Branson K. 2013. JAABA: interactive machine learning for automatic annotation of animal behavior. Nat. Methods 10, 64-67. (doi:10.1038/nmeth.2281)
- 87. Gabriel CJ, et al. 2022. BehaviorDEPOT is a simple, flexible tool for automated behavioral detection based on markerless pose tracking. eLife 11, e74314. (doi:10.7554/eLife.74314)
- 88. Nilsson SR, et al. 2020. Simple Behavioral Analysis (SimBA)—an open source toolkit for computer classification of complex social behaviors in experimental animals. bioRxiv. (doi:10.1101/2020.04.19.049452)
- 89. Segalin C, Williams J, Karigo T, Hui M, Zelikowsky M, Sun JJ, Perona P, Anderson DJ, Kennedy A. 2021. The Mouse Action Recognition System (MARS) software pipeline for automated analysis of social behaviors in mice. eLife 10, e63720. (doi:10.7554/eLife.63720)
- 90. Klibaite U, Berman GJ, Cande J, Stern DL, Shaevitz JW. 2017. An unsupervised method for quantifying the behavior of paired animals. Phys. Biol. 14, 015006. (doi:10.1088/1478-3975/aa5c50)
- 91. Wiltschko AB, Johnson MJ, Iurilli G, Peterson RE, Katon JM, Pashkovski SL, Abraira VE, Adams RP, Datta SR. 2015. Mapping sub-second structure in mouse behavior. Neuron 88, 1121-1135. (doi:10.1016/j.neuron.2015.11.031)
- 92. Berman GJ, Choi DM, Bialek W, Shaevitz JW. 2014. Mapping the stereotyped behaviour of freely moving fruit flies. J. R. Soc. Interface 11, 20140672. (doi:10.1098/rsif.2014.0672)
- 93. Werkhoven Z, Bravin A, Skutt-Kakaria K, Reimers P, Pallares LF, Ayroles J, De Bivort BL. 2021. The structure of behavioral variation within a genotype. eLife 10, e64988. (doi:10.7554/eLife.64988)
- 94. Klibaite U, Kislin M, Verpeut JL, Bergeler S, Sun X, Shaevitz JW, Wang SSH. 2022. Deep phenotyping reveals movement phenotypes in mouse neurodevelopmental models. Mol. Autism 13, 12. (doi:10.1186/s13229-022-00492-8)
- 95. Couzin ID, Krause J, Franks NR, Levin SA. 2005. Effective leadership and decision-making in animal groups on the move. Nature 433, 513-516. (doi:10.1038/nature03236)
- 96. Anderson DJ, Perona P. 2014. Toward a science of computational ethology. Neuron 84, 18-31. (doi:10.1016/j.neuron.2014.09.005)
- 97. Valletta JJ, Torney C, Kings M, Thornton A, Madden J. 2017. Applications of machine learning in animal behaviour studies. Anim. Behav. 124, 203-220. (doi:10.1016/j.anbehav.2016.12.005)
- 98. Datta SR, Anderson DJ, Branson K, Perona P, Leifer A. 2019. Computational neuroethology: a call to action. Neuron 104, 11-24. (doi:10.1016/j.neuron.2019.09.038)
- 99. Couzin ID, Heins C. 2022. Emerging technologies for behavioral research in changing environments. Trends Ecol. Evol. 38, 346-354. (doi:10.1016/j.tree.2022.11.008)
- 100. Sturman O, et al. 2020. Deep learning-based behavioral analysis reaches human accuracy and is capable of outperforming commercial solutions. Neuropsychopharmacology 45, 1942-1952. (doi:10.1038/s41386-020-0776-y)
- 101. King LE, Lala F, Nzumu H, Mwambingu E, Douglas-Hamilton I. 2017. Beehive fences as a multidimensional conflict-mitigation tool for farmers coexisting with elephants. Conserv. Biol. 31, 743-752. (doi:10.1111/cobi.12898)
- 102. Hsu AI, Yttri EA. 2021. B-SOiD, an open-source unsupervised algorithm for identification and fast prediction of behaviors. Nat. Commun. 12, 5188. (doi:10.1038/s41467-021-25420-x)
- 103. Luxem K, Mocellin P, Fuhrmann F, Kürsch J, Miller SR, Palop JJ, Remy S, Bauer P. 2022. Identifying behavioral structure from deep variational embeddings of animal motion. Commun. Biol. 5, 1267. (doi:10.1038/s42003-022-04080-7)
- 104. Weinreb C, et al. 2023. Keypoint-MoSeq: parsing behavior by linking point tracking to pose dynamics. bioRxiv. (doi:10.1101/2023.03.16.532307)
- 105. Wiltschko AB, et al. 2020. Revealing the structure of pharmacobehavioral space through motion sequencing. Nat. Neurosci. 23, 1433-1443. (doi:10.1038/s41593-020-00706-3)
- 106. Goldenberg SZ, Wittemyer G. 2017. Orphaned female elephant social bonds reflect lack of access to mature adults. Sci. Rep. 7, 14408. (doi:10.1038/s41598-017-14712-2)
- 107. Stokes H, Perera V, Jayasena N, Silva-Fletcher A. 2017. Nocturnal behavior of orphaned Asian elephant (Elephas maximus) calves in Sri Lanka. Zoo Biol. 36, 261-272. (doi:10.1002/zoo.21360)
- 108. Garaï ME, Boult VL, Zitzer HR. 2023. Identifying the effects of social disruption through translocation on African elephants (Loxodonta africana), with specifics on the social and ecological impacts of orphaning. Animals 13, 483. (doi:10.3390/ani13030483)
- 109. Pinter-Wollman N, Isbell LA, Hart LA. 2009. Assessing translocation outcome: comparing behavioral and physiological aspects of translocated and resident African elephants (Loxodonta africana). Biol. Conserv. 142, 1116-1124. (doi:10.1016/j.biocon.2009.01.027)
- 110. Hörner F, Oerke AK, Müller DW, Westerhüs U, Azogu-Sepe I, Hruby J, Preisfeld G. 2021. Monitoring behaviour in African elephants during introduction into a new group: differences between related and unrelated animals. Animals 11, 2990. (doi:10.3390/ani11102990)
- 111. Batty E, Whiteway M, Saxena S, Biderman D, Abe T, Musall S, Gillis W, Markowitz J, Churchland A, Cunningham JP, Datta SR, Linderman S, Paninski L. 2019. BehaveNet: nonlinear embedding and Bayesian neural decoding of behavioral videos. In Advances in Neural Information Processing Systems (eds H Wallach, H Larochelle, A Beygelzimer, F d’Alché-Buc, E Fox, R Garnett), vol. 32. Vancouver, Canada: Curran Associates, Inc.
- 112. Marks M, Jin Q, Sturman O, von Ziegler L, Kollmorgen S, von der Behrens W, Mante V, Bohacek J, Yanik MF. 2020. SIPEC: the deep-learning Swiss knife for behavioral data analysis. bioRxiv. (doi:10.1101/2020.10.26.355115)
- 113. Bain M, et al. 2021. Automated audiovisual behavior recognition in wild primates. Sci. Adv. 7, eabi4883. (doi:10.1126/sciadv.abi4883)
- 114. Feng L, Zhao Y, Sun Y, Zhao W, Tang J. 2021. Action recognition using a spatial-temporal network for wild felines. Animals 11, 485. (doi:10.3390/ani11020485)
- 115. Wiltshire C, Lewis-Cheetham J, Komedová V, Matsuzawa T, Graham KE, Hobaiter C. 2023. DeepWild: application of the pose estimation tool DeepLabCut for behaviour tracking in wild chimpanzees and bonobos. J. Anim. Ecol. 92, 1560-1574. (doi:10.1111/1365-2656.13932)
- 116. Costacurta JC, Duncker L, Sheffer B, Gillis W, Weinreb C, Markowitz JE, Datta SR, Williams AH, Linderman S. 2022. Distinguishing discrete and continuous behavioral variability using warped autoregressive HMMs. In Advances in Neural Information Processing Systems (eds AH Oh, A Agarwal, D Belgrave, K Cho). New Orleans, LA: Curran Associates.
- 117. Strandburg-Peshkin A, Farine DR, Couzin ID, Crofoot MC. 2015. Shared decision-making drives collective movement in wild baboons. Science 348, 1358-1361. (doi:10.1126/science.aaa5099)
- 118. Ozogány K, Vicsek T. 2015. Modeling the emergence of modular leadership hierarchy during the collective motion of herds made of harems. J. Stat. Phys. 158, 628-646. (doi:10.1007/s10955-014-1131-7)
- 119. Mumby HS, Plotnik JM. 2018. Taking the elephants’ perspective: remembering elephant behavior, cognition and ecology in human-elephant conflict mitigation. Front. Ecol. Evol. 6, 122. (doi:10.3389/fevo.2018.00122)
- 120. Hoare R. 1999. Determinants of human-elephant conflict in a land-use mosaic. J. Appl. Ecol. 36, 689-700. (doi:10.1046/j.1365-2664.1999.00437.x)
- 121. Han J, Bhanu B. 2005. Individual recognition using gait energy image. IEEE Trans. Pattern Anal. Mach. Intell. 28, 316-322. (doi:10.1109/TPAMI.2006.38)
- 122. Teepe T, Khan A, Gilg J, Herzog F, Hörmann S, Rigoll G. 2021. GaitGraph: graph convolutional network for skeleton-based gait recognition. In 2021 IEEE Int. Conf. on Image Processing (ICIP), Anchorage, AK, pp. 2314-2318. IEEE.
- 123. Enev M, Takakuwa A, Koscher K, Kohno T. 2016. Automobile driver fingerprinting. Proc. Priv. Enhancing Technol. 2016, 34-50. (doi:10.1515/popets-2015-0029)
- 124. Poole JH, Tyack PL, Stoeger-Horwath AS, Watwood S. 2005. Elephants are capable of vocal learning. Nature 434, 455-456. (doi:10.1038/434455a)
- 125. Byrne RW, Bates LA, Moss CJ. 2009. Elephant cognition in primate perspective. Comp. Cogn. Behav. Rev. 4, 65-79. (doi:10.3819/ccbr.2009.40009)
- 126. Mortimer B, Rees WL, Koelemeijer P, Nissen-Meyer T. 2018. Classifying elephant behaviour through seismic vibrations. Curr. Biol. 28, R547-R548. (doi:10.1016/j.cub.2018.03.062)
- 127. Thompson ME, Schwager SJ, Payne KB, Turkalo AK. 2010. Acoustic estimation of wildlife abundance: methodology for vocal mammals in forested habitats. Afr. J. Ecol. 48, 654-661. (doi:10.1111/j.1365-2028.2009.01161.x)
- 128. Hedwig D, DeBellis M, Wrege PH. 2018. Not so far: attenuation of low-frequency vocalizations in a rainforest environment suggests limited acoustic mediation of social interaction in African forest elephants. Behav. Ecol. Sociobiol. 72, 1-11. (doi:10.1007/s00265-018-2451-4)
- 129. Wrege PH, Rowland ED, Keen S, Shiu Y. 2017. Acoustic monitoring for conservation in tropical forests: examples from forest elephants. Methods Ecol. Evol. 8, 1292-1301. (doi:10.1111/2041-210X.12730)
- 130. Bjorck J, Rappazzo BH, Chen D, Bernstein R, Wrege PH, Gomes CP. 2019. Automatic detection and compression for passive acoustic monitoring of the African forest elephant. In Proc. AAAI Conf. on Artificial Intelligence, vol. 33, pp. 476-484.
- 131. Sethi SS, Jones NS, Fulcher BD, Picinali L, Clink DJ, Klinck H, Orme CDL, Wrege PH, Ewers RM. 2020. Characterizing soundscapes across diverse ecosystems using a universal acoustic feature set. Proc. Natl Acad. Sci. USA 117, 17049-17055. (doi:10.1073/pnas.2004702117)
- 132. Keen SC, Shiu Y, Wrege PH, Rowland ED. 2017. Automated detection of low-frequency rumbles of forest elephants: a critical tool for their conservation. J. Acoust. Soc. Am. 141, 2715-2726. (doi:10.1121/1.4979476)
- 133. Schwartz D, Selman JMG, Wrege P, Paepcke A. 2021. Deployment of embedded edge-AI for wildlife monitoring in remote regions. In 2021 20th IEEE Int. Conf. on Machine Learning and Applications (ICMLA), pp. 1035-1042. IEEE.
- 134. Wijayakulasooriya JV. 2011. Automatic recognition of elephant infrasound calls using formant analysis and hidden Markov model. In 2011 6th Int. Conf. on Industrial and Information Systems, pp. 244-248. IEEE.
- 135. Venter PJ. 2009. Recording and automatic detection of African elephant (Loxodonta africana) infrasonic rumbles. PhD thesis, University of Pretoria, South Africa.
- 136. Bermant PC, Bronstein MM, Wood RJ, Gero S, Gruber DF. 2019. Deep machine learning techniques for the detection and classification of sperm whale bioacoustics. Sci. Rep. 9, 12588. (doi:10.1038/s41598-019-48909-4)
- 137. Allen AN, Harvey M, Harrell L, Jansen A, Merkens KP, Wall CC, Cattiau J, Oleson EM. 2021. A convolutional neural network for automated detection of humpback whale song in a diverse, long-term passive acoustic dataset. Front. Mar. Sci. 8, 607321. (doi:10.3389/fmars.2021.607321)
- 138. Bermant PC. 2021. BioCPPNet: automatic bioacoustic source separation with deep neural networks. Sci. Rep. 11, 23502. (doi:10.1038/s41598-021-02790-2)
- 139. Wisdom S, Tzinis E, Erdogan H, Weiss R, Wilson K, Hershey J. 2020. Unsupervised sound separation using mixture invariant training. Adv. Neural Inf. Process. Syst. 33, 3846-3857.
- 140. Bermant PC, Brickson L, Titus AJ. 2022. Bioacoustic event detection with self-supervised contrastive learning. bioRxiv. (doi:10.1101/2022.10.12.511740)
- 141. Van der Merwe J, Jordaan J. 2013. Comparison between general cross correlation and a template-matching scheme in the application of acoustic gunshot detection. In 2013 AFRICON, pp. 1-5. IEEE.
- 142. Katsis LK, Hill AP, Pina-Covarrubias E, Prince P, Rogers A, Doncaster CP, Snaddon JL. 2022. Automated detection of gunshots in tropical forests using convolutional neural networks. Ecol. Indic. 141, 109128. (doi:10.1016/j.ecolind.2022.109128)
- 143. Hrabina M, Sigmund M. 2015. Acoustical detection of gunshots. In 2015 25th Int. Conf. Radioelektronika (RADIOELEKTRONIKA), pp. 150-153. IEEE.
- 144. Clemins PJ, Johnson MT, Leong KM, Savage A. 2005. Automatic classification and speaker identification of African elephant (Loxodonta africana) vocalizations. J. Acoust. Soc. Am. 117, 956-963. (doi:10.1121/1.1847850)
- 145. Wijers M, Trethowan P, Du Preez B, Chamaillé-Jammes S, Loveridge AJ, Macdonald DW, Markham A. 2021. Vocal discrimination of African lions and its potential for collar-free tracking. Bioacoustics 30, 575-593. (doi:10.1080/09524622.2020.1829050)
- 146. Trapanotto M, Nanni L, Brahnam S, Guo X. 2022. Convolutional neural networks for the identification of African lions from individual vocalizations. J. Imaging 8, 96. (doi:10.3390/jimaging8040096)
- 147. Günther RH, O’Connell-Rodwell CE, Klemperer SL. 2004. Seismic waves from elephant vocalizations: a possible communication mode? Geophys. Res. Lett. 31, L11602.
- 148. Bouley D, Alarcon C, Hildebrandt T, O’Connell-Rodwell CE. 2007. The distribution, density and three-dimensional histomorphology of Pacinian corpuscles in the foot of the Asian elephant (Elephas maximus) and their potential role in seismic communication. J. Anat. 211, 428-435. (doi:10.1111/j.1469-7580.2007.00792.x)
- 149. O’Connell-Rodwell CE. 2007. Keeping an ‘ear’ to the ground: seismic communication in elephants. Physiology 22, 287-294. (doi:10.1152/physiol.00008.2007)
- 150. Reinwald M, Moseley B, Szenicer A, Nissen-Meyer T, Oduor S, Vollrath F, Markham A, Mortimer B. 2021. Seismic localization of elephant rumbles as a monitoring approach. J. R. Soc. Interface 18, 20210264. (doi:10.1098/rsif.2021.0264)
- 151. Wood JD, O’Connell-Rodwell CE, Klemperer SL. 2005. Using seismic sensors to detect elephants and other large mammals: a potential census technique. J. Appl. Ecol. 42, 587-594. (doi:10.1111/j.1365-2664.2005.01044.x)
- 152. Anni DJS, Sangaiah AK. 2015. An early warning system to prevent human elephant conflict and tracking of elephant using seismic sensors. In Emerging ICT for Bridging the Future: Proc. 49th Annual Convention of the Computer Society of India (CSI), vol. 1, pp. 595-602. Hyderabad, India: Springer.
- 153. Parihar D, Ghosh R, Akula A, Kumar S, Sardana H. 2022. Variational mode decomposition of seismic signals for detection of moving elephants. IEEE Trans. Instrum. Meas. 71, 1-8. (doi:10.1109/TIM.2022.3178465)
- 154. Wood JD, O’Connell-Rodwell CE, Klemperer SL. 2005. Seismic census technique for African elephants. In AGU Fall Meeting Abstracts, vol. 2005, abstract B51D-0247. San Francisco, CA.
- 155. Parihar D, Ghosh R, Akula A, Kumar S, Sardana HK. 2021. Seismic signal analysis for the characterisation of elephant movements in a forest environment. Ecol. Inform. 64, 101329. (doi:10.1016/j.ecoinf.2021.101329)
- 156. Fernando P, Perera K, Dissanayake P, Jayakody J, Wijekoon JL, Wijesundara M. 2020. Gaja-Mithuru: smart elephant monitoring and tracking system. In 2020 11th IEEE Annual Information Technology, Electronics and Mobile Communication Conf. (IEMCON), virtual conference, pp. 0461-0467. IEEE.
- 157. Szenicer A, Reinwald M, Moseley B, Nissen-Meyer T, Mutinda Muteti Z, Oduor S, McDermott-Roberts A, Baydin AG, Mortimer B. 2022. Seismic savanna: machine learning for classifying wildlife and behaviours using ground-based vibration field recordings. Remote Sens. Ecol. Conserv. 8, 236-250. (doi:10.1002/rse2.242)
- 158. Mortimer B. 2017. Biotremology: do physical constraints limit the propagation of vibrational information? Anim. Behav. 130, 165-174. (doi:10.1016/j.anbehav.2017.06.015)
- 159. O’Connell-Rodwell C, Arnason B, Hart L. 2000. Seismic properties of Asian elephant (Elephas maximus) vocalizations and locomotion. J. Acoust. Soc. Am. 108, 3066-3072. (doi:10.1121/1.1323460)
- 160. Nissen-Meyer T, Mortimer B, Rees W, Koelemeijer P. 2018. Classifying elephant behavior with seismic detection and modeling. In AGU Fall Meeting Abstracts, vol. 2018, abstract S41B-04.
- 161. Taylor AM, Reby D. 2010. The contribution of source-filter theory to mammal vocal communication research. J. Zool. 280, 221-236. (doi:10.1111/j.1469-7998.2009.00661.x)
- 162. Wood M, Chamaillé-Jammes S, Hammerbacher A, Shrader AM. 2022. African elephants can detect water from natural and artificial sources via olfactory cues. Anim. Cogn. 25, 53-61. (doi:10.1007/s10071-021-01531-2)
- 163. Bates LA, Sayialel KN, Njiraini NW, Poole JH, Moss CJ, Byrne RW. 2008. African elephants have expectations about the locations of out-of-sight family members. Biol. Lett. 4, 34-36. (doi:10.1098/rsbl.2007.0529)
- 164. Rasmussen L, Schulte B. 1998. Chemical signals in the reproduction of Asian (Elephas maximus) and African (Loxodonta africana) elephants. Anim. Reprod. Sci. 53, 19-34. (doi:10.1016/S0378-4320(98)00124-9)
- 165. Allen CR, Brent LJ, Motsentwa T, Croft DP. 2021. Field evidence supporting monitoring of chemical information on pathways by male African elephants. Anim. Behav. 176, 193-206. (doi:10.1016/j.anbehav.2021.04.004)
- 166. Rasmussen L, Munger BL. 1996. The sensorineural specializations of the trunk tip (finger) of the Asian elephant, Elephas maximus. Anat. Rec. 246, 127-134.
- 167. Cave JW, Kenneth J, Mitropoulos AN. 2019. Progress in the development of olfactory-based bioelectronic chemosensors. Biosens. Bioelectron. 123, 211-222. (doi:10.1016/j.bios.2018.08.063)
- 168. Meléndez F, Arroyo P, Gómez-Suárez J, Palomeque-Mangut S, Suárez JI, Lozano J. 2022. Portable electronic nose based on digital and analog chemical sensors for 2,4,6-trichloroanisole discrimination. Sensors 22, 3453. (doi:10.3390/s22093453)
- 169. Gardner J, Hines E, Wilkinson M. 1990. Application of artificial neural networks to an electronic olfactory system. Meas. Sci. Technol. 1, 446. (doi:10.1088/0957-0233/1/5/012)
- 170. Fu J, Li G, Qin Y, Freeman WJ. 2007. A pattern recognition method for electronic noses based on an olfactory neural network. Sens. Actuators B 125, 489-497. (doi:10.1016/j.snb.2007.02.058)
- 171. Google Cloud Blog. 2023. How Osmo is bringing AI to aromas. See https://cloud.google.com/blog/products/ai-machine-learning/how-osmo-is-bringing-ai-to-aromas.
- 172. Papi F. 1989. Pigeons use olfactory cues to navigate. Ethol. Ecol. Evol. 1, 219-231. (doi:10.1080/08927014.1989.9525511)
- 173. Davies TE, Wilson S, Hazarika N, Chakrabarty J, Das D, Hodgson DJ, Zimmermann A. 2011. Effectiveness of intervention methods against crop-raiding elephants. Conserv. Lett. 4, 346-354. (doi:10.1111/j.1755-263X.2011.00182.x)
- 174. King LE, Douglas-Hamilton I, Vollrath F. 2007. African elephants run from the sound of disturbed bees. Curr. Biol. 17, R832-R833. (doi:10.1016/j.cub.2007.07.038)
- 175. King LE, Lawrence A, Douglas-Hamilton I, Vollrath F. 2009. Beehive fence deters crop-raiding elephants. Afr. J. Ecol. 47, 131-137. (doi:10.1111/j.1365-2028.2009.01114.x)
- 176. Beery S, Morris D, Yang S. 2019. Efficient pipeline for camera trap image review. (http://arxiv.org/abs/1907.06772)
- 177. Beery S, Van Horn G, Perona P. 2018. Recognition in terra incognita. In Proc. European Conf. on Computer Vision (ECCV), pp. 456-473.
- 178. Beery S, Liu Y, Morris D, Piavis J, Kapoor A, Joshi N, Meister M, Perona P. 2020. Synthetic examples improve generalization for rare classes. In Proc. IEEE/CVF Winter Conf. on Applications of Computer Vision, pp. 863-873.
- 179. Beery S, Morris D, Yang S. 2019. Efficient pipeline for camera trap image review. See http://github.com/ecologize/CameraTraps.
- 180. Caron M, Touvron H, Misra I, Jégou H, Mairal J, Bojanowski P, Joulin A. 2021. Emerging properties in self-supervised vision transformers. In Proc. IEEE/CVF Int. Conf. on Computer Vision, pp. 9650-9660.
- 181. Grill JB, et al. 2020. Bootstrap your own latent: a new approach to self-supervised learning. Adv. Neural Inf. Process. Syst. 33, 21271-21284.
Data Availability Statement
This article has no additional data.