Abstract
Main
In recent years, substantial advances in social neuroscience have been realized, including the generation of numerous rodent models of autism spectrum disorder. Still, it can be argued that the methods currently used to analyze animal social behavior create a bottleneck that significantly slows progress in this field. Indeed, the bulk of research still relies on a small number of simple behavioral paradigms, the results of which are assessed without considering behavioral dynamics. Moreover, only a few variables are examined in each paradigm, thus overlooking a significant portion of the complexity that characterizes social interaction between two conspecifics and hindering our understanding of the neural mechanisms governing different aspects of social behavior. We demonstrate these constraints by discussing the most commonly used paradigm for assessing rodent social behavior, the three-chamber test. We also point to the fact that although emotions greatly influence human social behavior, we lack reliable means for assessing the emotional state of animals during social tasks. As such, we discuss current evidence supporting the existence of pro-social emotions and emotional cognition in animal models. We further suggest that adequate social behavior analysis requires a novel multimodal approach that employs automated and simultaneous measurements of multiple behavioral and physiological variables at high temporal resolution in socially interacting animals. We accordingly describe several computerized systems and computational tools for acquiring and analyzing such measurements. Finally, we address several behavioral and physiological variables that can be used to assess socio-emotional states in animal models and thus elucidate intricacies of social behavior, so as to attain deeper insight into the brain mechanisms that mediate such behaviors.
Conclusions
In summary, we suggest that combining automated multimodal measurements with machine-learning algorithms will help define socio-emotional states and determine their dynamics during various types of social tasks, thus enabling a more thorough understanding of the complexity of social behavior.
Keywords: Animal models, Autism spectrum disorder, Behavioral phenotyping, Emotional states, Social behavior, Social vocalizations, Three-chamber test
Introduction
Social behavior is a broad term that can be defined as any communication or interaction between two conspecifics of a given species [1]. Flexible and dynamic social behavior is necessary for adapting to new environments and thus for ensuring survival and reproductive success [2, 3]. Beyond these survival-related benefits, various forms of social interaction in both humans [4, 5] and rodents [6–10] engage the dopaminergic mesolimbic (reward) system, indicating that social interaction is itself rewarding.
Social behavior encompasses many forms and can be aggressive, mutualistic, cooperative, altruistic, or parental in nature [11–13]. It entails the active and ongoing detection of cues by multiple sensory modalities and the continual reshaping of the individual's behavioral response according to the behavior of the other conspecifics that comprise the perceived social environment [1, 2, 11]. More than any other aspect of cognition and behavior, social interactions of any type can never be dissociated from the accompanying emotional context, which influences the affective state of the individual as well as that of the others with whom that individual interacts [14–16]. Such characteristics make scientific exploration of the neurobiological mechanisms underlying social behavior highly challenging and have thus delayed progress in this field for many decades, relative to other fields of neuroscience [17, 18].
The study of social behavior is relevant not only for revealing the cognitive and neural processes governing its normal expression, but also for understanding how those mechanisms may malfunction to produce atypical social behavior. Abnormalities in social cue identification, impaired social skills and difficulties in maintaining social relationships are distinctive features of several psychiatric (e.g., schizophrenia), neurodevelopmental (NDD; e.g., Autism Spectrum Disorder), and neurodegenerative disorders (e.g., dementia) [19–21]. Still, the exact neural substrates and biological mechanisms underpinning abnormal social behavior in pathological conditions, which may serve as possible targets for pharmacological intervention, remain elusive [11, 22].
Given that research tools and manipulations amenable to human subjects are fairly limited, the study of the neural and molecular mechanisms underlying social deficits in various neurodevelopmental, psychiatric, and neurodegenerative diseases relies heavily on animal experimentation and thus requires the generation of appropriate animal models of these pathological conditions [23, 24]. Accordingly, given their high degree of genetic similarity to humans, ease of maintenance, and rich social lives, rodents, especially rats and mice, are widely used in research on neurodevelopmental disorders [25, 26]. The social behavior of mice and rats differs greatly, with rats showing a broader and more complex repertoire of social behaviors [25–28], being less aggressive [29], and finding social interactions more rewarding [30, 31]. Still, most ongoing research into the etiology and underlying mechanisms of NDDs is conducted on mice, given the larger genetic toolbox available for this species, which enables the generation of mouse models carrying genetic alterations that mimic those found in humans [26].
Yet, despite the remarkable progress rodent models have allowed us to realize in the context of NDDs and their high construct validity, our understanding of rodent social behavior remains too limited to allow direct comparisons with that of humans. In the case of Autism Spectrum Disorder (ASD), for example, impaired altruism or lack of Theory of Mind (i.e., inferring the feelings and intentions of others) were long thought to be uniquely human traits and hence difficult to parallel in rodents [32]. However, accumulating evidence now shows that both rats and mice have more sophisticated emotional cognition than once believed [33–35]. Moreover, while some symptoms of ASD, specifically those related to speech and linguistic communication (e.g., lack of prosody or inability to understand sarcasm) [36], cannot be mimicked by the rodent brain, given that brain substrates and mechanisms parallel to those mediating human language skills are not found in these animals [32, 36], rats and mice nonetheless do seem to use auditory cues for communication. Such cues comprise unique structures of vocalization (mainly ultrasonic) emitted in (but not exclusively in) certain social contexts, which also appear to be abnormally altered in models of autism [37–39]. However, we still do not fully comprehend the exact behavioral significance of these vocalizations, nor what communicational deficits their altered profiles in ASD rodent models might signify [40, 41]. It is important to note that the scope of knowledge regarding rodent social behavior is limited by the availability of tools for accessing such traits, as well as by our interpretations. Therefore, behavior we deem "less complex" may not necessarily be so, making it even more challenging to draw direct links between genetic alterations and behavioral impairments.
Nevertheless, substantial advances in the field of social neuroscience have been made in recent years with the development of cutting-edge methods for labeling, recording, and manipulating the activity of neural circuits, which have helped to reveal an increasing number of brain mechanisms and neural circuits involved in social behavior [1, 2, 14, 42–49]. In parallel, an ever-growing body of human genomic and transcriptomic studies has expanded the repertoire of genes linked to social behavior in general, and specifically to disorder-associated social deficits [50, 51]. This has allowed the generation of an enormous number of genetically modified animal lines, which may serve as animal models of pathologies associated with mutations in these genes [23, 52]. Still, the analysis of social behavior in animal models remains a bottleneck that significantly slows progress on this vibrant research frontier. Despite technological achievements that allow more elaborate and inclusive paradigms to be tested, the majority of research in social neuroscience still relies on a small number of very simple behavioral set-ups [22, 24]. Moreover, these paradigms are usually employed with only one or two monitored variables, thus masking the complex dynamics of social behavior and sacrificing the ecological relevance of the collected data [29, 53–55]. While such reductionist approaches are attractive because of the high level of control over experimental conditions they provide and their low labor demands, narrowing the scope of examined behaviors and over-simplifying the behavioral paradigm nonetheless pose a significant risk to translational validity. Furthermore, such strategies limit the possibility of generalizing findings made in animal models to the human brain and behavior, as well as the ability to reliably assess the potential of drugs designed to treat relevant symptoms in humans [24, 29, 32]. Increasing the number of examined variables and the complexity of behavioral assessment used for phenotypic profiling of animal models requires developing new methodologies, analyzing the dynamics of social behavior in detail, and collecting multimodal data sets [29, 53]. This approach will help to increase the translational value of animal research into social behavior without compromising the reproducibility and accuracy of the obtained findings.
To elucidate the need for a more thorough and inclusive examination of social behavior, the three-chamber test, which is considered the gold standard for assessing social behavior in both mouse and rat models, is analyzed below as an example.
A closer look at the three-chamber test
What is the three-chamber test?
The three-chamber test is one of the most commonly used methods for evaluating social behavior in mouse models of ASD. It is used to assess an animal's preference for a social over a non-social environment (termed social preference or sociability) and its preference for a novel over a familiar conspecific (termed social novelty preference) [56]. In this task, the subject mouse is first placed in the medial chamber of a three-chambered apparatus for habituation. A novel same-sex conspecific placed under a wire cup serves as a "social stimulus" in one lateral chamber, while an empty wire cup located in the other lateral chamber serves as a "non-social stimulus." After habituation, the walls separating the chambers are raised, allowing the subject to move freely between chambers. "Sociability" in the context of the three-chamber test is defined as the propensity of the subject to spend more time in the "social" chamber containing the conspecific than in the other chamber containing the empty cup. To assess social novelty preference, a second test is carried out immediately following the first, with one chamber containing the same conspecific from the previous test, now serving as a "familiar stimulus", and the other chamber containing a novel conspecific serving as an "unfamiliar stimulus." "Social novelty preference", in this context, is defined as the propensity to spend more time in the chamber containing the novel conspecific than in the chamber containing the familiar conspecific [32, 57, 58]. Placing the social stimuli under wire cups prevents the stimulus animals from moving freely in the arena and restricts their direct physical contact with the subject. This, in turn, attenuates the expression of aggressive and sexual behavior while still allowing the subject to explore and detect sensory cues (namely, smell, sight, sound, and touch). Under these conditions, the subject mouse is solely in control of actively seeking and investigating the social stimulus [53, 57].
The three-chamber test thus provides an elegant but simple design with high experimental control and offers easy objective scoring, as compared to a social interaction test involving two freely moving animals [53, 59]. Indeed, the three-chamber test represented a breakthrough in the field of social neuroscience and now serves as a fundamental instrument that offers a wide range of applications. These include phenotyping social deficits in transgenic and environmental mouse models of ASD [60–66], investigation of social deficits during development [8, 67, 68], comparing distinct strains and genotypes [56, 58, 59, 69, 70], as well as testing the effects of pharmacological treatments and other manipulations on social behavior [38, 63, 71–74].
Limitations of the three-chamber test
Despite its tempting simplicity and high degree of experimental control, the three-chamber test suffers from multiple limitations. These are discussed below.
Limited number of monitored variables
The assessment of social behavior by the three-chamber test is usually restricted to monitoring one or two variables, typically the time spent in each chamber and/or the time spent in proximity to each wire cup. However, evaluating social behavior using only one or two variables reduces its complexity to a single dimension. Moreover, estimating sociability by "time spent in chamber" may not reliably reflect social propensity, given that time spent in a given chamber does not necessarily entail active and direct investigation of the stimulus in that chamber. Furthermore, social interactions are reflected by multiple behavioral variables that are in constant interplay and change dynamically over time. In the context of the three-chamber test, such variables include the number and length of individual bouts of interaction with the stimuli, the rate of transition between stimuli, the progression of stimulus investigation over the duration of the test, and the periods spent by the subject in non-social activities, such as grooming and resting. Modeling social behavior in the social preference test with several of these variables revealed two distinct phases of subject mouse behavior during the test, namely, an "exploratory phase", in which the subject's investigation is driven mainly by curiosity and exploration, and an "interaction phase", in which the subject begins to show an increased tendency to interact with the stimuli [75]. Therefore, future assessments of rodent social behavior should incorporate more nuanced variables, such as accurate detection of body posture, combined with an analysis of behavioral dynamics during the paradigm being employed. Such an approach is not only crucial for enhancing the translational validity of behavioral testing but is also important for differentiating between distinct aspects of social behavior. Such aspects may be mediated by distinct molecular and neuronal mechanisms that cannot be identified by relying solely on the "time spent in chamber" variable [53].
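To make this concrete, the sketch below shows, in Python, how both the classic "time in chamber" readout and several of the dynamics-oriented variables mentioned above can be derived from a per-frame label sequence produced by any tracking tool. It is a minimal illustration under assumed conventions (label names, frame rate), not a description of any published pipeline.

```python
# Minimal sketch: deriving static and dynamic variables from a per-frame label sequence.
# `labels` is assumed to hold one entry per video frame ("social", "object", or "none"),
# as produced by any tracking tool; names, labels, and the frame rate are illustrative.
import numpy as np

def bouts(labels, target):
    """Return (start, end) frame indices of contiguous bouts spent at `target`."""
    hits = np.array([l == target for l in labels], dtype=int)
    edges = np.diff(np.concatenate(([0], hits, [0])))
    return list(zip(np.where(edges == 1)[0], np.where(edges == -1)[0]))

def summarize(labels, fps=30.0):
    social, obj = bouts(labels, "social"), bouts(labels, "object")
    summary = {
        # classic three-chamber readout: total investigation time per stimulus
        "social_time_s": sum(e - s for s, e in social) / fps,
        "object_time_s": sum(e - s for s, e in obj) / fps,
        # dynamics-oriented variables discussed above
        "social_bouts": len(social),
        "mean_social_bout_s": float(np.mean([e - s for s, e in social])) / fps if social else 0.0,
        "transitions": sum(a != b for a, b in zip(labels[:-1], labels[1:])),
    }
    # Time course of social investigation (seconds per minute), which can expose the
    # "exploratory" versus "interaction" phases described in the text.
    frames_per_min = int(fps * 60)
    summary["social_time_per_min_s"] = [
        sum(1 for l in labels[i:i + frames_per_min] if l == "social") / fps
        for i in range(0, len(labels), frames_per_min)
    ]
    return summary
```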
Non-standardized protocols
Despite the straightforward approach of the three-chamber test, there is still no consensus regarding a standardized protocol for its use. Variations among protocols include differences in the habituation period, ranging from 5 min [57, 58, 60, 69, 74, 76] to 10 min [32, 38, 59, 77–82], and even 20 min [63, 83, 84]. Protocols also differ in what constitutes a "non-social" stimulus. While some protocols introduce a novel object placed under a wire cup in the non-social compartment [38, 68, 77, 78, 85], others place a wire cup with nothing underneath [32, 57, 59, 60, 63, 69, 76, 80–84]. Another variation concerns the portion of the arena to which the subject animal has access during the habituation phase; some protocols limit the habituation space to the central chamber of the three-chamber apparatus [32, 57, 58, 60, 74, 80], while others allow the animal to explore the entire arena [38, 69, 76–78, 81, 82]. Given that the main variable used to estimate sociability is time spent in each chamber during the testing phase, pre-test habituation to the central chamber alone introduces confounding variables irrelevant to social tendency, such as spatial preference, anxiety, and novelty-seeking, that may drive the animal to spend more time in one chamber over the other. CD1 outbred mice, for example, failed to exhibit social preference when they were habituated to the central chamber only, yet showed intact social preference when exposed to all three chambers during the habituation phase [79].
The variety of testing methods in use has also contributed to discrepancies in the phenotyping of several ASD mouse models, with varying conclusions regarding levels of sociability, including Shank2-KO [81, 82], Cntnap2-KO [63, 77, 83, 86], and 16p11.2+/− mice [87, 88]. In one attempt to examine the effect of this procedural variability on detected deficits in social behavior, Rein et al. [56] tested two versions of the three-chamber test. One version included a pre-test phase in which subjects were introduced to two identical objects within the cups so as to familiarize them with objects being presented inside the cups. This was designed to minimize variability caused by novelty-driven interactions with the cup and to prevent "muddying" of detected preferences for the social versus the non-social stimulus. The other version compared the subject's interaction with a social stimulus placed in one chamber versus an empty cup serving as the "non-social" stimulus in the other chamber. The two versions yielded different sensitivities in detecting social preference deficits in multiple ASD mouse lines, including Shank3+/ΔC, Cul3f/−, and 16p11.2+/−, demonstrating that protocols using an inanimate object as a non-social stimulus, rather than an empty cup, are more sensitive to social preference deficits in ASD models [56]. Taken together, such findings highlight the need for standardized protocols that allow consistent phenotyping of ASD models, and for more reliable measurements that consistently and accurately capture changes in social behavior while remaining insensitive to variations in the protocol used.
The need for multiple tests
Even when social deficits are detected using the three-chamber test (or any other social task, for that matter), a single test cannot provide the sole basis for concluding whether a given mouse model is socially impaired. The 16p11.2+/− line used to model ASD, for instance, displayed intact social preference in multiple studies [83, 84, 88], yet exhibited deficits in other behavioral tasks, including sex preference and emission of mating calls [39, 88], social recognition memory and habituation [83, 84], and the degree of direct social interaction [87]. Another example is the Iqsec2 A350V line, which showed intact social preference and social novelty preference in the three-chamber test, yet displayed deficits in sex and emotional state preference [73].
The need to employ multiple behavioral assays for the phenotyping and exploration of NDD animal models was recently highlighted in a review by Silverman et al. [27]. ASD, for instance, is a heterogeneous disorder with high rates of comorbidity and multiple associated symptoms occurring in subsets of autistic individuals, such as seizures, anxiety, intellectual disability, hyper- or hypo-reactivity to sensory stimulation, and motor abnormalities [36, 80]. Such associated symptoms are a major source of potential artifacts that can confound the interpretation of the phenotype revealed in a mouse model. For example, a mouse model of ASD might exhibit abnormal social approach and social recognition because of innately high anxiety levels, impaired sensory perception, or motor defects, and not necessarily because of altered social motivation. Thus, phenotyping of mouse models of NDDs should not be limited to "single tasks" that examine core features of the disorder, such as lack of sociability or repetitive behavior, but should also include a battery of tests that address associated symptoms like anxiety, sensory functioning, and motor fitness, which may offer alternative explanations for behavioral abnormalities [27].
Confounding variables
In addition to potentially being influenced by spatial-related processes, as mentioned earlier, the social behavior of a tested subject may be affected by other confounding variables that can bias or mask the results of any social task. Such factors include social rank and aggressiveness [22, 53, 89], strain [58, 59, 69], experimental settings, such as lighting conditions and the use of a novel testing arena [52, 59, 89], housing conditions, including the size, genotype, and gender composition of the litter [56, 90], and even individual differences in temperament and personality [91, 92].
It is important to note that the limitations discussed above are not exclusive to the three-chamber test and are relevant to a wide variety of gold-standard tests in other fields. For example, two of the most common tests for assessing anxiety levels and anxiety-related behaviors in both mice and rats are the Elevated Plus Maze (EPM) and the Open Field (OF) tests, both of which rely on the innate conflict in rodents between the drive to explore a new space and the fear of open spaces [93]. The EPM consists of a plus-shaped maze with two open arms and two closed arms inter-connected by a central platform elevated above the ground. Anxiety is measured by calculating the percentage of time subject animals spend in the open versus the closed arms, with more time spent in the closed arms indicating higher anxiety levels. In the OF test, the subject animal is placed in a box-shaped arena with walls and its trajectory is tracked during the session. Anxiety is then inferred by calculating the percentage of time spent along the walls of the arena versus in the center, with more time spent along the walls indicating higher levels of anxiety [25, 93]. While these tests have high ecological value, an economical and straightforward design, and sensitivity to anxiogenic and anxiolytic pharmacological treatments [94], they suffer from multiple caveats similar to those of the three-chamber test. For example, both tests, like the three-chamber test, have no single, agreed-upon standardized protocol adopted across laboratories [94, 95]. Performance in these tests has also been shown to be influenced by multiple confounding variables, including species, strain, gender, age, housing conditions, prior handling and exposure to stress, illumination levels, and prior test experience [93–95]. Furthermore, the behavior of tested subjects in the EPM was shown to differ on a minute-to-minute basis [94], indicating the inadequacy of estimating anxiety from one or two variables, like the number of entries into, or total time spent in, the open or closed arms. Lastly, reliance on a single test, like the EPM or OF, was found to be unreliable for determining anxiety levels in rats, given that performance in one test did not correlate with performance in other anxiety tests. This again stresses the need for profiling anxiety by applying multiple tests [96].
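For illustration only, the short sketch below computes the two classic summary measures just described (percent time in the EPM open arms and in the OF center) from a centroid trajectory; the zone geometry and dimensions are assumed examples rather than standardized definitions.

```python
# Illustrative sketch: classic EPM and OF summary measures from a centroid trajectory.
# `xy` is an (n_frames, 2) array of positions in cm, with the apparatus centered at (0, 0);
# all dimensions below are hypothetical placeholders.
import numpy as np

def percent_time(mask):
    return 100.0 * np.count_nonzero(mask) / len(mask)

def epm_open_arm_percent(xy, arm_half_width=5.0):
    # Assume the open arms run along the x-axis and the closed arms along the y-axis.
    x, y = xy[:, 0], xy[:, 1]
    in_open = (np.abs(y) <= arm_half_width) & (np.abs(x) > arm_half_width)
    return percent_time(in_open)

def of_center_percent(xy, arena_size=40.0, center_frac=0.5):
    # Center zone: the middle `center_frac` of each dimension of a square arena.
    half = arena_size / 2.0
    margin = arena_size * (1.0 - center_frac) / 2.0
    in_center = np.all(np.abs(xy) <= (half - margin), axis=1)
    return percent_time(in_center)
```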
Affective states
Finally, currently used behavioral paradigms for phenotyping social deficits in ASD models often overlook a key component of any form of social behavior, namely, the emotional state of the subjects involved. Accumulating evidence shows that rodents possess higher levels of emotional cognition than once believed, and that their emotional state can affect their behavior by generating cognitive biases and inducing differential responses to the same stimuli [97, 98]. Despite the robust influence of emotional states on behavior, their assessment in animal models still relies on a fairly limited set of tools, creating a gap that prevents reliable comparison between the deficits in social behavior exhibited by human patients and the social deficits characterized in animal models. Validated measures of affective state can help in developing improved models and treatment options for human emotional disorders and may provide additional information regarding the neural mechanisms underlying such complex forms of behavior [97]. The following section discusses this requirement and offers possible solutions.
Socio-emotional states in rodents
Although there is no consensus definition of emotions, they can be viewed as central states that are triggered by intrinsic or extrinsic stimuli, are processed in particular neural circuits, and drive behavioral, cognitive, somatic, and physiological responses [99]. Such a definition of emotions does not require subjective consciousness as a prerequisite for experiencing emotions. Therefore, basic and more primitive forms of emotional states can be found in animals [99–101]. This comes as no surprise, since research in rodents has already established the existence of negative emotional states, like fear and stress [102], that elicit distinct behavioral and physiological changes spanning multiple modalities [100, 101]. Emotional states in rodents can be recognized as complex and flexible reactions to environmental events that can persist for some time, influence other aspects of cognition, and affect subsequent behavioral decisions [99, 101]. Rats' emotional states, for example, were found to influence their decision-making behavior, as seen in the Ambiguous-Cue Interpretation Test, in which animals are first trained to associate one cue with a rewarding outcome and a second cue with avoiding punishment or a less rewarding outcome. Animals experiencing a negative affective state were found to exhibit a "pessimistic" judgment of an ambiguous cue and responded to it as if it predicted the negative/less rewarding outcome, while animals experiencing a positive affective state displayed an "optimistic" cognitive bias that led them to treat the ambiguous cue as predictive of a rewarding outcome [103, 104].
Moreover, both mice and rats communicate emotional content using multiple modalities, as shown in Fig. 1 for mice. These include postures like freezing, vocalizations of varying frequencies [41, 105, 106], scent marking and the release of pheromonal cues that communicate social or sexual status as well as affective states [107–109], and distinct facial expressions in response to emotionally salient events [110]. Thus, the emotional state experienced by a mouse or rat can influence its social behavior toward a conspecific. Evidence supporting the expression of complex forms of socio-emotional behavior, like pro-social and empathic behaviors, in rodents has recently begun to accumulate [91]. Prairie voles were shown to engage in higher levels of allogrooming of their partner when reunited after a 24 min separation if their partner had received an electric shock during the period of separation [34]. Exposure of rats to a stressed and fear-conditioned cage-mate increased allogrooming of that cage-mate, facilitated the acquisition of avoidance behavior in the training phase of a fear conditioning paradigm, and increased fear memory retention [16, 19]. Rats were also shown to modify their behavior and refrain from pressing a food-delivering lever that also delivered a foot shock to a cage-mate [111, 112]. Moreover, rats introduced to a trapped cage-mate quickly and consistently freed their cage-mate, and when given a choice between pressing a lever to obtain chocolate or pressing a lever to release a trapped cage-mate, the rats preferred to free the trapped cage-mate [113]. This willingness to cooperate with other conspecifics was influenced by the value of previous benefits received from those conspecifics: female Norway rats were more willing to provide cereal flakes to a partner who had previously provided them with a piece of banana than to a partner who had earlier provided them with a carrot [114]. However, rats still exhibited pro-social behavior and chose to provide food rewards to a cage-mate even without any direct self-benefit resulting from their choice [115].
In mice, evidence for such complex forms of pro-social behavior is scarcer. Still, available findings indicate that mice possess the ability to recognize and respond to the distress of other conspecifics. For instance, exposure to a cage-mate in pain increased pain behaviors in observer mice that were also experiencing pain and induced mechanical and thermal hypersensitivity to nociceptive stimulation [35, 116]. Moreover, mice exposed to a shocked cage-mate displayed increased social approach and allogrooming toward the stressed mouse, indicative of an emotional response [117].
While the existence of high-level social cognition that can mediate abilities, such as empathy and Theory of Mind, in rodents is still under debate, it is well-accepted that rodents do show emotional contagion, as evident in tasks such as social transfer of fear, pain, and food preference [14, 35, 118, 119]. In emotional contagion, the subject’s attention to the “state” of another automatically activates the same state in the observer, thus increasing the probability of behavior driven by that emotion and allowing for rapid adaptation to environmental challenges [15, 29]. An essential component of emotional contagion is the ability to detect, recognize, and react to the emotional state or arousal of other conspecifics. To assess affective state discrimination ability in rodents, a behavioral paradigm was recently developed by Scheggia et al. [120], in which emotional state recognition (also termed ‘affective state discrimination’) was estimated by comparing the time an “observer” subject spent investigating a “demonstrator” in a neutral state versus one under an arousing affective state (positive or negative). C57BL/6J mice of both sexes preferred the emotionally aroused conspecific experiencing either a positive or a negative affective state over a neutral conspecific. This ability depends on oxytocin signaling in the paraventricular nucleus (PVN)-Central Amygdala (CeA) pathway, and on inhibition mediated by somatostatin-expressing (SOM) interneurons in the pre-frontal cortex (PFC) [20, 120]. These behavioral observations imply that demonstrators transmit cues about their affective state, which are then detected by observers using different sensory modalities. These sensory cues may eventually converge on a common neural circuit dedicated to processing emotional cues, encompassing areas like the PVN, CeA, insular cortex, and PFC [118]. Paradigms such as these may be used to characterize deficits in affective state discrimination in animal models of human pathological conditions. This type of examination seems to be especially relevant to ASD, a condition in which disruption of emotional cognition and Theory of Mind-related processes is a core feature [119, 121]. Such abilities can be independently hindered regardless of the general social propensity of the tested subjects. Iqsec2 A350V-mutated mice, for example, exhibit deficits in specific social interactions that include emotionally-arousing stimuli [73]. Therefore, the application of behavioral tests designed for assessing emotion-cognition-related processes might yield greater insight into behavioral deficits related to ASD.
The proper assessment of emotional states in animals will require multi-dimensional methods aimed at defining complex signatures that reflect a subject's emotional state and are able to capture how this state affects social behavior. One such approach is discussed below.
A multidimensional approach for phenotyping social behavior
Socio-emotional states involve various neuronal, hormonal, physiological, and behavioral processes that interact to enhance the survival and success of an individual in any social context [65, 99, 101, 122]. Fearful situations in humans, for example, elicit a wide range of physiological (e.g., increases in heart rate, blood pressure, respiration, and sweating), hormonal (e.g., HPA-axis activation, secretion of cortisol, and increased adrenaline levels), and behavioral changes (e.g., facial expressions, body posture, and freezing or fleeing) [100, 123, 124]. Therefore, to identify socio-emotional states in an ethologically valid manner and distinguish between them, one can rely on the complex signatures of such states across different modalities. This approach requires capturing and correlating as many aspects of a subject's behavior and physiology as possible. Such multimodal analysis provides a detailed and wide perspective that is expected to be much more informative than commonly used measures such as "time spent in chamber". Moreover, subtle differences in specific behaviors or physiological variables may reflect unique emotional states of a subject or its responses to the affective states of other conspecifics. Such findings may also help to delineate differences in the neural circuitry responsible for varying behavioral responses during distinct types of social interactions [98, 118].
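As a purely conceptual sketch of this multimodal idea (not a validated pipeline, and with hypothetical feature names), the snippet below standardizes a set of behavioral, vocal, and physiological variables collected per session and applies unsupervised clustering to look for candidate socio-emotional states, in the spirit of the machine-learning approach advocated in the Conclusions.

```python
# Conceptual sketch: cluster multimodal per-session features into putative socio-emotional
# states. Feature names are hypothetical placeholders; the data below are simulated.
import numpy as np
from sklearn.preprocessing import StandardScaler
from sklearn.cluster import KMeans

feature_names = [
    "investigation_time_s",     # behavioral
    "transition_rate_per_min",  # behavioral dynamics
    "usv_rate_per_min",         # vocal
    "heart_rate_bpm",           # physiological
    "grooming_time_s",          # self-directed behavior
]

def putative_states(sessions: np.ndarray, n_states: int = 3, seed: int = 0):
    """sessions: (n_sessions, n_features) matrix, one row per animal/session."""
    z = StandardScaler().fit_transform(sessions)   # put all modalities on a common scale
    km = KMeans(n_clusters=n_states, n_init=10, random_state=seed).fit(z)
    return km.labels_, km.cluster_centers_

# Demonstration on simulated data only:
rng = np.random.default_rng(0)
demo = rng.normal(size=(60, len(feature_names)))
state_labels, state_centers = putative_states(demo)
```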
In the following section, we will detail several “fronts”, including some behavioral and physiological variables of different modalities, that have been shown to be involved in social behavior. The section will also describe the latest methodological advances for measuring and analyzing these variables and discuss how such variables may serve as good candidates for providing meaningful information on complex aspects of social interaction, like the socio-emotional state of the subject animal.
Systems for automated behavioral analysis
Social interactions usually involve multiple individuals displaying a dynamic and high-dimensional repertoire of behaviors influenced by their own motivational and emotional states, as well as those of their partners [3, 125, 126]. As such, accurate and thorough quantification of social behavior is needed to understand the exact neural basis mediating its intricacies [127, 128]. While human analysis and manual annotation have their benefits, like the ability to differentiate between closely similar behaviors that are otherwise prone to faulty classification by automated analysis methods, manual analysis of social behavior still has drawbacks. Besides being time-consuming and tedious, human analysis is limited by the observer's ability to visually perceive and follow complex sequential behavior, making it prone to error, bias, and inconsistency [2, 125]. Accordingly, numerous attempts have been made to develop objective computerized tracking systems for analyzing animal behavior, a task that has proven highly complex. Table 1 lists some presently available tracking, pose-estimation, and classification computerized tools for rodent behavioral analysis.
Table 1.
| Name | Function | Number of subjects | General description and relevant features | Measured variables | Citation |
|---|---|---|---|---|---|
| UMA tracker | Tracking | Multiple subjects (without identity preservation) | An image-based tracking algorithm; allows the application of multiple image-processing algorithms so as to choose the most suitable one; option for manual correction of tracking and trajectory-swapping errors; requires arenas with high contrast | Trajectory, interaction times in regions of interest (ROI) | [132] |
| Rodent arena tracker (RAT) | Tracking | Individual subjects | A machine-vision tracking device that is inexpensive, has low battery demand, and does not require a tethered computer; requires a high-contrast arena; real-time online image processing; can be synchronized with other devices for pellet dispensing, optogenetic stimulation, etc. | Trajectory and speed | [133] |
| Mousemove | Tracking | Individual subjects | Software for centroid-based tracking; thus does not offer orientation-dependent information; requires high-contrast circular arenas; restricted to a video resolution of 320 × 240 [151]; batch-processing option [129] | Trajectory, traveled distance, speed, turning, and curvature in the entire arena or within a ROI | [134] |
| Mouse tracking | Tracking | Individual subjects | Neural network-based tracker for long periods (days) in multiple, complex, and dynamic environments; option to train a new network suited to the user's needs with minimal training data (minimum of 2500 annotated images); indifferent to coat color or animal size | Traveled distance, speed | [126] |
| Automated rodent tracker (ART) | Tracking | Individual subjects | A rule-based system for tracking a rodent's nose and body points with minimal user interference; detects the orientation and head direction of subjects; requires high-contrast arenas; option for batch processing of multiple videos | Frequency of certain behaviors (exploration, moving forward, turning, interacting with a ROI), locomotion variables (speed, distance), and body-size estimation | [135] |
| Janelia automatic animal behavior annotator (JAABA) | Behavior classification | Single or multiple subjects | Machine-learning algorithm for neural network-based behavioral classification using animal trajectories; users annotate a small set of video frames to create classifiers for detecting behaviors of interest in screen-scale data sets; can operate on the output of multiple tracking systems (e.g., Ctrax, MoTr) | | [127] |
| MoTr | Tracking | Multiple subjects | Software for long-duration tracking (days) in a home-cage environment, with identity preservation; identity is assigned by coloring subjects with distinct bleach patterns that are detected and learned by the tracking software; detects the position and orientation of the animal based on previous frames by applying an Expectation–Maximization algorithm; suitable for quantifying social behaviors with long-scale dynamics (dominance, aggression, courtship) | Preferred location, preferred associates, following rate and duration, speed | [142] |
| DeepLabCut | Pose estimation | | Based on transfer learning of deep neural networks, with minimal training data needed for classifier creation [193]; can be used for detecting the pose, orientation, and posture changes of body parts of multiple freely interacting mice; option for retraining the network for fine-tuning to a specific need/task by providing labeled data on annotated body-part locations | Trajectories, traveled distance, and location of annotated points | [149] |
| MiceProfiler | Tracking | Two subjects | Requires no specific tagging of subject animals, implementing a physics engine capable of maintaining identity even after occlusions and with hidden body parts; can detect the orientation of the mouse's head (oral-oral, oral-genital, side-by-side interactions); limited by its need for supervision and correction by an expert [143] | Frequency, duration, type (follow, escape, investigation), and temporal evolution of pre-determined behavioral events; the identity of the animal initiating an action (follower/leader) and the response of the other conspecific | [125] |
| 3D-video-based computerized analysis of social and sexual interactions in rats | Tracking | Two subjects | Detects behavioral events that include vertical changes in posture (rearing, mounting) by using four depth cameras positioned at different viewpoints to extract a 3D image of two freely moving rats; the merged extracted image is fitted to a "skeleton" model by a physics-based algorithm to estimate the location of four body parts (head, neck, trunk, hip) and identify spatio-temporal patterns of these parts; may need manual corrections for identity swaps and dislocated body parts | Frequency, latency, and duration of dynamic behavioral events like rearing, head-head/hip contact, approach, follow, and mount | [144] |
| RFID-assisted socialscan | Tracking | Multiple subjects | A system for long-term tracking (days) in ethologically relevant environments, with identity preservation; identity preservation is obtained through radio-frequency identification: each subject is implanted with an RFID chip that transmits a unique radio frequency detected by RFID antennas placed underneath the arena, which is then synchronized with the video frames for identity assignment; option for the user to adjust or add new detection parameters; animal identity is preserved even when out of frame, enabling the attachment of other components to the arena (nests and enrichments) | Detection of specific social events (identified by built-in rules), like approach, contact, follow, and leave, and locomotor activity within ROIs | [141] |
| Autotyping | Tracking | Individual subjects | A toolbox for locating and measuring time spent in ROIs in multiple behavioral tasks, including open field, fear conditioning, elevated zero maze, Y/T-maze, spatial/novel object recognition, and the three-chamber task; requires high-contrast arenas | Interaction time/time spent in a given location, number of exploratory bouts, approach angle during bouts of interaction, distance traveled | [137] |
| ToxTrac and ToxId | Tracking | Multiple subjects | Open-source software for image-based tracking, with high processing speed (less than 25 frames per second), integrated distortion correction and camera calibration, and identity preservation of multiple subjects; can be used in multiple arenas; the ToxId algorithm can be implemented in ToxTrac and enables identity preservation of multiple "untagged" animals by linking trajectory segments using their intensity and Hu-moments, with no training, complex configuration steps, or access to past and future frames needed | Average speed, acceleration, distance traveled, time spent near/in a ROI | [140, 151] |
| Mouse action recognition system (MARS) | Pose estimation and behavior classification | Two subjects | Automated pipeline and software tools (deep-learning based) for supervised training and evaluation of novel pose estimation, behavior classification, and joint visualization of neural and behavioral data; subjects need to have different coat colors (one black, one white); includes three pre-trained supervised classifiers trained to detect attack, mounting, and close-investigation events; option for training MARS pose and behavior models to create user-specific classifiers from manually annotated videos (minimum of 1500 annotated frames needed for training); suitable for head-mounted animals with implants; includes an open-source interface, the Behavior Ensemble and Neural Trajectory Observatory (BENTO), for synchronous display, navigation, and analysis of behavior annotations, audio recordings, and recorded neural activity | Frequency of and time spent in specific behavioral events (attack, mounting, close investigation); can detect orientation-sensitive behaviors like face/anogenital-directed sniffing | [128] |
| TrackRodent | Tracking | Up to two subjects (without identity preservation) | Suitable for rats and mice; requires high-contrast arenas; suitable for implanted animals; options for multiple tracking algorithms based on species (mouse/rat), coat color, head- or body-based tracking, and head implantation | Total time of investigation of a ROI, bouts of investigation and their distribution, transitions between defined regions, investigation over time, and intervals between investigation bouts | [75] |
| Idtracker.ai | Tracking | Multiple subjects | An algorithm and software for extracting the trajectories of freely moving and unmarked animals in high-contrast arenas; based on two convolutional networks, one for detecting events of animals touching or colliding and one for assigning identities to the detected animals using classification analysis; can be used for detecting multiple subjects of various species in different environments, but requires large training data to adapt to new animals and experimental settings; its ability to maintain the identity of multiple mice in a long recording, given their deformable geometric shape, is yet to be established [143]; high computational demands [131] | Trajectories of detected animals | [148] |
| Simple behavioral analysis (SimBA) | Behavior classification | Two subjects | An open-source software package that uses pose estimation to create supervised machine-learning predictive classifiers of rodent social behavior; requires different coat coloring; uses labelling interfaces and pose-estimation data from DeepLabCut and DeepPoseKit to annotate body parts on subject animals | Durations and frequencies of classified behaviors | [150] |
| Video-RFID tracking system | Tracking | Multiple subjects | A system for automated location tracking within a semi-naturalistic setting and for long periods of time (minutes to several weeks); identity preservation is obtained through radio-frequency identification: each subject is implanted with an RFID chip that transmits a unique radio frequency detected by RFID antennas, which is then synchronized with the video frames for identity assignment; integrates MiceProfiler algorithms to improve identity detection and define mouse-body orientation for more sensitive behavioral characterization | Locomotion (travelled distance and time spent running, walking, hiding, sleeping, and staying still), and the number of social events (including avoiding, being avoided, chasing, being chased) that can later be used to quantify social dominance | [147] |
| Live mouse tracker | Tracking and behavior classification | Multiple subjects | Real-time tracking software combining computer vision, machine learning, and RFID identification for tracking and classifying behavior in a home-like environment for up to several days; tracking is possible with any coat color, wired animals, and enriched environments; the identity of subject mice is assigned using RFID and machine-learning algorithms, and the user can monitor tracking quality live during the experiment; the setup includes a depth camera that enables extraction of the animals' orientation (head-tail) and detection of various head parts, like ears and nose; option to synchronize behavioral tracking with USV recording through the LMT USV Toolbox, enabling the investigation of spontaneously emitted USVs in home-like environments (the system does not record audio continuously, but only when a certain power threshold is crossed, and then uses a machine-learning-based classifier to filter out noise files; extracted USVs are then correlated with behavioral events detected by LMT; cannot assign the identity of the emitter) | Based on changes in shape geometry, the system detects and classifies up to 35 different behavioral events related to individual behavior, social dyadic events, dynamic events (escape and follow), and subgroup configuration events | [143] |
One core requirement of any tracking system is the ability to locate an animal's position and separate it from its surroundings [129]. To locate an animal within a given frame, some tracking systems employ computer vision algorithms for background subtraction followed by segmentation, techniques that usually require simplified and fixed arenas with a high level of contrast between the background and the target for adequate separation [2, 129–131]. Such demands constrain behavioral testing to specific simplified arenas, which may compromise the translational validity of the testing environment, increase the anxiety levels of the tested animals [89], and limit the repertoire of possible subject strains, depending on their coat color [132–137]. In addition, some tracking systems track the position of an animal by locating its center of mass (centroid), reducing the tracked animal to a single point and thereby providing information limited to the subject's location, with no information regarding the orientation and/or directionality of behavior [131, 133, 134, 138]. Neglecting directionality in behavioral analysis overlooks a rich source of valuable information in the context of social behavior, given that some social behaviors require identification of the animal's orientation. In rats, for example, anogenital sniffing is considered an affiliative act of social investigation, while face-to-face investigation might increase the probability of attacking a subordinate rat [139]. Another example comes from the work of Hong et al. [2], in which pose estimation of freely interacting mice revealed a significant reduction in the time spent at short (< 4 cm) head-body distances by BTBR subjects investigating a BALB/c stimulus, as compared to C57BL/6J subjects.
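To make the centroid-based, background-subtraction approach (and its single-point limitation) concrete, here is a minimal OpenCV sketch; the video path, subtractor parameters, and morphological clean-up are illustrative placeholders rather than the settings of any published system.

```python
# Minimal sketch of background subtraction followed by centroid extraction with OpenCV.
# Note that the animal is reduced to a single point, so orientation is lost, exactly the
# limitation discussed above. "arena_video.avi" and all parameters are placeholders.
import cv2
import numpy as np

cap = cv2.VideoCapture("arena_video.avi")
bg = cv2.createBackgroundSubtractorMOG2(history=500, varThreshold=16, detectShadows=False)
centroids = []

while True:
    ok, frame = cap.read()
    if not ok:
        break
    fg = bg.apply(frame)                                        # foreground mask (animal vs. arena)
    fg = cv2.morphologyEx(fg, cv2.MORPH_OPEN, np.ones((5, 5), np.uint8))
    contours, _ = cv2.findContours(fg, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    if contours:
        largest = max(contours, key=cv2.contourArea)            # assume the animal is the largest blob
        m = cv2.moments(largest)
        if m["m00"] > 0:
            centroids.append((m["m10"] / m["m00"], m["m01"] / m["m00"]))
            continue
    centroids.append((np.nan, np.nan))                          # detection failure or occlusion

cap.release()
trajectory = np.array(centroids)                                # (n_frames, 2) centroid track
```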
Furthermore, while some tracking systems are adequate for tracking a single animal [133–138], free social interactions involving at least two animals introduce the challenge of tracking multiple subjects and maintaining their identities over the course of analysis, including during periods of close physical proximity (huddling) or following occlusions [129, 140]. Possible solutions to this issue include various forms of "artificial marking" of tested animals, whether by coloring an animal with distinct dye patterns or by implanting the animals with RFID chips that emit radio-frequency signals unique to each subject [141–143]. Another innovative solution to the identity-preservation problem is the use of multiple depth cameras covering multiple viewpoints for 3D depth filming of social interactions [144].
Another aspect of social behavior that presents a challenge for computerized behavior analysis is group dynamics. While most social tests focus on dyadic interactions between two mice [36, 145], the behavior of animals in a group cannot be predicted by models based solely on the behavior of the individual or the behavior of pairs, indicating that social behavior is determined by relatively complex interactions that include more than one other animal [146]. For capturing group behavioral dynamics, some tracking systems have the ability to track animal activity across days in a semi-natural habitat while maintaining identities (see Fig. 2 for several examples), offering a relevant tool for investigating social behavior with long-scale progressions, such as dominance, sexual courting, and the identification of persistent personality traits [126, 142, 143, 146, 147].
Advances in machine learning and neural network training have led to the development of tracking algorithms capable of detecting multiple untagged and freely moving animals while maintaining their identities. One example is idtracker.ai [148], an algorithm and software package that implements two convolutional networks, one for detecting collision events between subjects and one for assigning identities to the detected animals using classification analysis, with no need for any artificial "tagging." Deep learning approaches have also enabled the development of pose-estimation systems for detecting and tracking changes in the postures and positions of user-defined body parts, allowing a closer look at the fine motor changes involved in the performance of certain behaviors [131, 149]. A leading example of a deep-learning-based pose estimator is DeepLabCut, which employs transfer learning of neural networks to simultaneously estimate the body positions of multiple animals. DeepLabCut is a deep convolutional network pre-trained for object recognition on images from the ImageNet dataset. Owing to transfer learning, the network needs only minimal training data of manually labeled and annotated frames to fine-tune its weights so as to detect and classify events relevant to the specific needs of the user [149].
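For orientation, a typical DeepLabCut workflow proceeds roughly as sketched below; the function names follow the published toolbox, while the project name, video path, and body-part labels are hypothetical, and exact arguments may differ between versions.

```python
# Rough outline of a DeepLabCut session (paths and labels are placeholders).
import deeplabcut

videos = ["/data/social_dyad.mp4"]                 # hypothetical example video
config = deeplabcut.create_new_project("social-test", "lab", videos)

deeplabcut.extract_frames(config)                  # select frames to annotate
deeplabcut.label_frames(config)                    # GUI: label nose, ears, tail base, etc.
deeplabcut.create_training_dataset(config)
deeplabcut.train_network(config)                   # transfer learning from a pre-trained backbone
deeplabcut.evaluate_network(config)

deeplabcut.analyze_videos(config, videos)          # writes per-frame body-part coordinates
deeplabcut.create_labeled_video(config, videos)    # visual sanity check of the tracking
```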
Machine learning-based approaches have also contributed to programs like JAABA [127] and SimBA [150], which classify behavior through supervised learning into user-specified categories by training classifiers on manually annotated data. Such programs can be beneficial in the context of social behavior and for quantifying distinct behavioral events like grooming, attacking, and mounting that might otherwise be missed by conventional top-view, 2D position-based tracking.
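The toy example below illustrates the general supervised-classification idea behind such tools, here with a random forest (the classifier family used by SimBA) trained on placeholder pose-derived features and simulated annotations; it is a schematic sketch, not a re-implementation of either program.

```python
# Schematic supervised classification of behavior from pose-derived features.
# X would normally hold per-frame features computed from pose estimates (inter-animal
# distance, relative orientation, speeds, etc.) and y the human annotations; here both
# are simulated placeholders.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import classification_report

rng = np.random.default_rng(1)
X = rng.normal(size=(5000, 12))                                   # placeholder feature matrix
y = rng.choice(["attack", "mount", "other"], size=5000)           # placeholder annotations

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, random_state=0)
clf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_tr, y_tr)
print(classification_report(y_te, clf.predict(X_te)))             # held-out check before scoring new videos
```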
Thus, software that combines animal tracking, pose estimation, and machine-learning-based options for supervised behavioral classification will provide high-resolution insight into the nuances of social behavior and facilitate the study of the underlying neural circuits and genes. It is noteworthy that most available tracking systems are better suited to tracking target animals in simple postures. Therefore, identifying and interpreting a subject's affective state and the meaning of each displayed change in behavior or movement may be hard to achieve by relying on vision-based tracking alone.
Vocalizations
Vocalizations emitted by rodents serve as a communication tool that varies in frequency range according to the emotional context [90, 105, 152, 153]. Pups emit 30–60 kHz calls with varying acoustic features when separated from the dam, resulting in approach and retrieval behaviors and reducing attack and rough handling by the dam [105]. Since pup call emission is modulated by maternal cues and affected by anxiolytic/anxiogenic drugs, ultrasonic vocalization (USV) analysis in pups can be considered a suitable model for studying the development of emotionality in rodents [90, 105].
USVs were also shown to convey emotional content in adult rats [105] (Fig. 3A, B). These animals emit USVs at a 22 kHz frequency in negative emotional contexts like exposure to predators, threats, pain, and during withdrawal from drugs, such as benzodiazepines and psycho-stimulants. In contrast, 50 kHz USVs are emitted in more affiliative contexts, including play solicitation, sexual interactions, social exploration, and drug-induced reward states [10, 29, 91, 105, 152, 154]. The emission of 22 kHz alarm calls was found to elicit freezing behavior in ‘listeners’ who had previous experience with the aversive stimulus used to elicit the calls, accompanied by increased activity of brain regions regulating fear and anxiety, including the amygdala, periaqueductal gray (PAG), and hypothalamus [10, 155]. The emission of 50 kHz USVs, on the other hand, was found to encourage social approach and cooperative behavior and to establish and maintain social contact. These events were accompanied by decreased activation of the amygdala and increased activation of brain regions implicated in reward, like the nucleus accumbens, mediated by increased dopaminergic signaling in the ventral tegmental area (VTA) [10, 152, 153]. Both the 22 and 50 kHz USVs were affected by social experience, with prolonged social isolation decreasing the emission of the 22 kHz call and increasing the emission of 50 kHz calls during play interaction and in anticipation of tickling, indicating an increase in social motivation [10, 152].
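As a schematic illustration of how such calls can be pulled out of a raw recording and sorted into the two frequency families described above, the sketch below thresholds band-limited spectrogram power and labels each detected event by its peak frequency; the file name, thresholds, and band boundaries are assumptions for illustration, not parameters of any published detector.

```python
# Schematic USV detection and coarse 22 kHz / 50 kHz classification for a rat recording.
# All file names, thresholds, and band edges are illustrative assumptions.
import numpy as np
from scipy.io import wavfile
from scipy.signal import spectrogram

fs, audio = wavfile.read("rat_session.wav")          # hypothetical recording (e.g., 250 kHz sampling)
f, t, sxx = spectrogram(audio, fs=fs, nperseg=512, noverlap=256)

usv_band = (f >= 18_000) & (f <= 90_000)             # coarse ultrasonic range of interest
band_power = sxx[usv_band].sum(axis=0)
active = band_power > np.median(band_power) * 5      # crude noise-adaptive threshold

edges = np.diff(np.concatenate(([0], active.astype(int), [0])))
onsets, offsets = np.where(edges == 1)[0], np.where(edges == -1)[0]

calls = []
for on, off in zip(onsets, offsets):
    if t[off - 1] - t[on] < 0.005:                   # drop very short events
        continue
    segment = sxx[:, on:off][usv_band]
    peak_freq = f[usv_band][np.argmax(segment.sum(axis=1))]
    family = "22kHz-type" if peak_freq < 32_000 else "50kHz-type"
    calls.append((t[on], t[off - 1], peak_freq, family))

print(f"{len(calls)} candidate calls detected")
```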
Mice were also found to emit a variety of vocalizations spanning a wide range of frequencies that can be divided into non-USVs, comprising low-frequency harmonic calls (LFHs) and mid-frequency vocalizations (MFVs), and USVs. LFHs, or "squeaks", are composed of harmonic complexes with power audible to humans and frequencies below 5 kHz. These vocalizations are emitted mainly in aversive contexts, like pain, agitation, and fighting [106]. MFVs are a non-USV category identified by Grimsley et al. [41], encompassing vocalizations with a frequency range between 9 and 15 kHz. They are emitted under different types of restraint, reflecting a negative emotional state of the emitter and eliciting stress and anxiety in the listener [106] (see Fig. 1). However, mice predominantly emit USVs in social contexts, with frequencies above 20 kHz. Mouse USVs present many structurally and temporally complex acoustic features that can vary across developmental stages [10, 90, 105, 156], genetic strains [157, 158], gender [159–162], and social context [163–165] (Fig. 3C). Mouse USVs can be classified into different syllables, defined as units of sound composed of one or more tones and separated by silent pauses [166–168]. Syllables can be categorized based on their acoustic variables, i.e., bandwidth, duration, amplitude, and shape [158]. Still, there is no universally accepted classification of mouse USV syllables [158, 167–171]. To capture and analyze USVs in rodents at different levels of complexity, multiple tools have been developed for USV detection and classification, some of which are summarized in Table 2. It is important to note that while some attempts have been made to understand the role of the different ultrasonic vocalizations in mediating behavior [169, 172, 173] and the context in which they are produced [164, 170, 174], the functional extent and exact meanings of the varying features of mouse USVs in social interactions remain unknown [171].
Table 2. Tools for the detection, classification, and analysis of rodent ultrasonic vocalizations

| Name | General description and relevant features | References |
|---|---|---|
| WAV-file Automated Analysis of Vocalizations Environment Specific (WAAVES) | An automated USV assessment program that utilizes MATLAB's Signal and Image Processing Toolboxes and customized filters to separate USV calls from noise and to assign each USV to one of two categories: 50–55 kHz or 22–28 kHz USVs. Appropriate for rat call analysis. Different test environments (e.g., operant chamber, open field, home cage) require customized separation criteria. | [195] |
| Automatic mouse ultrasound detector (AMUD) | An algorithm for the automatic detection and extraction of USV syllables that runs on the STx acoustic software. The de-noising steps are amplitude-sensitive. Provides information on the frequency, amplitude, and time variables of each detected element. Detects USVs no shorter than 10 ms. | [196] |
| Vocal inventory clustering engine (VoICE) | A classification software that utilizes acoustic similarity relationships between vocal events to generate high-dimensional similarity matrices, which are then subjected to hierarchical clustering based on mean frequency and each note's slope, duration, and curvature. Based on pre-defined rules, syllables are clustered into a limited number (9–12) of named categories. Includes quantification of syntactical similarity to detect changes in syllable patterns across conditions. An independent method is needed to detect and “clip” each syllable into a separate wav file. | [190] |
| Mouse song analyzer (MSA) | A custom MATLAB program [163] modified from code written by [167] and further developed by [164] for automated, rule-based categorization of syllable shapes. Multi-note syllables are classified based on the number and direction of frequency (pitch) jumps, but not on the duration, slope, or curvature of each note. Detected syllables are categorized into a limited number (4–15) of named categories based on pre-defined rules. Other measured variables include syllable duration, inter-syllable interval, standard deviation of the pitch distribution, mean pitch frequency, frequency modulation, and spectral purity [164]. Offers syntax composition and probability analysis to determine the probability of transitioning between different syllable types within a given context, an analysis that enables the identification of repeated syllable patterns (e.g., songs). | [163, 164, 167] |
| Mouse Ultrasonic Profile ExTraction (MUPET) | An open-access MATLAB tool for data-driven analysis of USVs by measuring, learning, and comparing syllable types. MUPET uses an automated and unsupervised algorithmic approach for the detection and clustering of syllable types, summarized by the following features: syllable detection by isolating and measuring spectro-temporal syllable variables, followed by analysis of overall vocalization features (syllable number, rate and duration, spectral density, and fundamental frequency); unsupervised machine learning based on k-means clustering to build a “syllable repertoire” from the dataset, comprising up to several hundred of the most represented syllable types based on spectral-shape similarities within that dataset (a minimal code sketch of this clustering step appears after the table); similarity measurement between the syllable types of two different repertoires using frequency-independent rank-order comparisons; centroid-based (k-medoids) cluster analysis of syllable types from the repertoires of different datasets to measure the frequency of use of different syllable types across conditions or strains and to identify shared and unique shapes; automated time-stamps of syllable events for synchronized analysis with behavior; and user control over noise reduction, minimum and maximum syllable duration, minimum total and peak syllable energy, and the minimum inter-syllable interval needed to separate rapidly successive notes into distinct syllables. Cannot detect USVs below 30 kHz [197]. | [166] |
| DeepSqueak | A USV detection and analysis software suite based on a regional convolutional neural network architecture to detect and categorize USV syllables. Packaged with four default detection networks: one general-purpose network, one for mouse USVs, one for short rat USVs, and one for long 22 kHz rat USVs. USVs are detected by a region-proposal network, which segments the filtered sonogram image into proposed areas of interest with possible USVs; these are then passed to the classification network to determine whether the image contains a call or background noise. Detected USVs are saved to a detection file along with call variables and classification confidence scores. Offers the option of creating and training custom secondary de-noising networks (by manual annotation of noise vs. call) to identify noises that might be specific to certain experiments/set-ups. For syllable clustering, the user can determine which USV features are most important and adjust three weighted, contour-based input features: shape, frequency, and duration (clustering is thus amplitude-invariant). The number of clusters can be determined by the user using supervised neural networks, or by unsupervised, data-based clustering using k-means on perceptually relevant dimensions of the extracted contour to place calls into a predefined number of clusters. | [197] |
| USVSEG | A program for detecting USV segments (syllables) in continuous sound data containing background noise from several rodent species. Output contains segmented sound files, image files, and spectral-peak feature data that can be used for clustering, classification, or behavioral assessment with other toolkits. | [198] |
| VocalMat | A software tool that uses image-processing and differential-geometry approaches to detect USVs in spectrograms, thus eliminating the need for user-defined parameters or custom training of a neural network. VocalMat uses computational vision and machine learning, training a convolutional neural network to classify detected USVs into 11 distinct USV categories or noise. | [199] |
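As noted in the MUPET entry above, syllable repertoires can be built without manual labels. The following is a minimal sketch of that general k-means step, assuming syllables have already been detected and summarized as fixed-length frequency-contour vectors; the synthetic data, the contour representation, and the cluster count are illustrative assumptions and do not reproduce MUPET's actual implementation.

```python
# Minimal sketch of unsupervised syllable-repertoire building in the spirit of the
# k-means step described for MUPET above. Each row of the input matrix is assumed to
# be one detected syllable described by a fixed-length frequency contour (kHz); the
# data here are synthetic stand-ins.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)

# 300 synthetic syllables x 32 contour points drawn from three shape families
flat = 60 + rng.normal(0, 1, (100, 32))
up   = np.linspace(50, 80, 32) + rng.normal(0, 1, (100, 32))
down = np.linspace(80, 50, 32) + rng.normal(0, 1, (100, 32))
contours_khz = np.vstack([flat, up, down])

scaler = StandardScaler().fit(contours_khz)
X = scaler.transform(contours_khz)                 # normalize each contour point

n_repertoire_units = 3                             # illustrative; real repertoires use many more
km = KMeans(n_clusters=n_repertoire_units, n_init=10, random_state=0).fit(X)

# "Repertoire": cluster centroids (back in kHz) plus how often each unit is used
centroid_contours_khz = scaler.inverse_transform(km.cluster_centers_)
usage = np.bincount(km.labels_) / km.labels_.size
print("syllable-type usage:", usage.round(2))
```

Repertoires built in this way from two datasets (e.g., two strains) could then be compared unit by unit, which is the role of the rank-order and k-medoids steps described in the table.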
While mouse USVs do not depict the emotional state of the emitter as clearly as rat USVs do [168], they still serve a communicative function that modulates interactions in social contexts among both males and females [158]. USVs of male mice have mainly been investigated in the context of reproduction [175]. When exposed to a female, male mice emit USV songs composed of different types of syllables repeated in regular, temporally structured, non-random sequences [167] (Fig. 3D, E). Although USV emission in mice is innate [176], it is highly influenced by social experience. For example, the acoustic features and syllable variables of male USVs directed at females are affected by the receptivity of the female (specifically, its estrous state) [169], the state of the female (i.e., awake, anesthetized, or urine only) [164], female presence [169], and prior sexual [177] and social experience [161, 165, 178]. Male courtship USVs were found to be mediated by a distinct neural population in the PAG connected to downstream premotor vocal-respiratory neurons in the nucleus retroambiguus, which control the temporal and spectral features of the emitted USVs [179]. In a playback study, female mice were also shown to favor male songs over pup vocalizations and noise [180]. Together, these findings demonstrate that the USVs of adult male mice facilitate the attraction of females and promote reproduction. However, adult male mice were found to emit USVs in other social contexts with same-sex stimuli (Fig. 3F–H), namely, in response to a male intruder [175] and during interactions with a male stimulus following social isolation [165, 170, 174]. Males also emit low-frequency USVs (≤ 60 kHz) when held in a restrainer with a nearby male conspecific [174], indicating that USVs in adult male mice serve a broader social function than merely courtship calls.
As for female mice, earlier studies showed that interactions between devocalized males and intact females abolished the detected USVs, while interactions between intact males and devocalized females had little effect on the number of detected USVs, leading to the conclusion that USVs in male–female interactions are mainly emitted by the male [181]. However, later research found that female mice do vocalize during interactions with males, albeit to a lesser extent, accounting for 15–18% of the total USVs recorded [159, 160, 182, 183], and with acoustic features different from those of male USVs [159, 162, 183]. Females also emit USVs in female–female interactions [184, 185] and in response to an awake or anesthetized female intruder in the resident-intruder test [165, 186] (Fig. 3I–K). USVs of females in same-sex interactions are influenced by their motivational state and sexual receptivity, age, familiarity [187], and prior social isolation [161, 165]. Therefore, USVs in females appear to serve many roles, including territorial calls [186], indexing familiarity [187], and facilitating approach [165, 185].
It should be pointed out, however, that some experimental designs include isolating subjects prior to the experiment so as to induce emission of a greater number of USVs [159, 167, 184, 187], which may compromise the generality of the results by introducing the confounding effects of isolation on USV emission and social behavior in both males [162, 170, 174] and females [165]. To overcome such limitations, de Chaumont et al. [37] recorded same-sex pairs of mice over three days in a home-like environment to examine differences in spontaneously emitted USVs without the contribution of prior isolation or constrained interaction in limited recording sessions. This method uncovered changes in USVs that were dependent on age, sex, genotype, and social context, signifying a possible role for USVs as an indicator of higher arousal states in social interactions.
Research on ultrasonic vocalizations is currently hindered by technical challenges, including determining the identity of the vocalizer during interactions between two or more animals. Both males and females can emit USVs with similar features [182, 186], and mice do not show clear visual cues of their vocal behavior [183]. Attempts to overcome this challenge have included surgical interventions to devocalize one of the interacting animals, specifically by unilateral incision of the inferior laryngeal nerve [181, 188], anesthetizing the stimulus [170, 172, 186], or exposing the subject to urine or bedding collected from the stimulus instead of a conspecific [167, 177]. To determine vocalizer identity without any such intervention, Zala and colleagues [160] recorded USVs from subjects interacting with a stimulus through a plexiglass divider wall, with the compartment of the stimulus covered by a plexiglass lid to ensure that only USVs from the subject's compartment were recorded. While promising, this method entails placing a separator between the subject and the stimulus, thus limiting the extent of social interaction. In contrast, Neunuebel et al. [182] used a four-channel ultrasonic microphone array-based system combined with a sound source localization method for localizing the source of the recorded USVs in groups of freely behaving mice. By using four microphones, multiple estimates for a given sound signal were extracted and then averaged to pinpoint the location of the source. Combined with video tracking of mouse location, a probability index was then calculated for each mouse to assign the source of the sound. While this system allows for analysis of USVs in freely behaving animals, it is not without limitations, as the median error between the location of the actual mouse and the estimated sound source is 3.87 cm, with the vocalizer identified for 78.03% of the detected USVs. Heckman et al. [183] recorded USVs of two nose-to-nose-interacting mice placed on two separate platforms, using two microphones arranged on either side of the arena to accurately estimate the vocalizing mouse based on differences in sound arrival time (Fig. 4). A set-up with a similar premise was used by Rao et al. [189] to investigate how the interplay between facial touch and USVs modulates the activity of the auditory cortex. The set-up included a gap between the platforms of the subject and stimulus rats, allowing only close face-to-face interactions. USVs were recorded with four ultrasonic microphones, and the identity of the caller was assigned by intensity measurements, yielding a success rate of 80% of the detected USVs. Notably, both set-ups described in Heckman et al. [183] and Rao et al. [189] reduced the social interactions under investigation to only one dimension, thus limiting the extent of physical contact between the examined subjects, which might in turn have limited the repertoire of USVs emitted. On a related note, our lab is currently developing mini-microphones implanted directly into a subject's head for accurate recognition of the emitter's identity and more sensitive detection of a broader range of USVs than is usually captured by microphones placed above the arena.
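The arrival-time principle used in the two-microphone set-up described above can be illustrated with a short sketch: the delay of one channel relative to the other is estimated by cross-correlation, and its sign indicates which side the vocalizer is on. The geometry, sampling rate, thresholds, and toy signals below are illustrative assumptions and do not reproduce the published analyses.

```python
# Minimal sketch of assigning a USV to one of two animals from the difference in its
# arrival time at two microphones (the principle behind two-microphone set-ups such as
# that of Heckman et al.). Geometry, sampling rate, thresholds, and signals are all
# illustrative assumptions, not the published analysis.
import numpy as np
from scipy.signal import correlate, correlation_lags

fs = 250_000            # sampling rate (Hz), assumption
speed_of_sound = 343.0  # m/s at room temperature
mic_separation = 0.40   # microphones 40 cm apart, one over each platform (assumption)
max_tdoa = mic_separation / speed_of_sound   # largest physically plausible delay

def estimate_tdoa(sig_left, sig_right):
    """Delay (s) of the right-microphone signal relative to the left one."""
    xcorr = correlate(sig_right, sig_left, mode="full")
    lags = correlation_lags(len(sig_right), len(sig_left), mode="full")
    return lags[np.argmax(np.abs(xcorr))] / fs

def assign_vocalizer(sig_left, sig_right, min_tdoa=50e-6):
    """Sound reaches the nearer microphone first: a positive delay of the right
    channel means the source is closer to the left microphone, and vice versa."""
    tdoa = estimate_tdoa(sig_left, sig_right)
    if abs(tdoa) > max_tdoa or abs(tdoa) < min_tdoa:
        return "unassigned"
    return "left animal" if tdoa > 0 else "right animal"

# Toy example: the same burst reaches the right microphone 25 samples (100 us) later
rng = np.random.default_rng(1)
burst = rng.normal(0, 1, 2000)
left = np.concatenate([burst, np.zeros(100)])
right = np.concatenate([np.zeros(25), burst, np.zeros(75)])
print(assign_vocalizer(left, right))   # -> "left animal"
```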
Given their communicative function in social interactions, ultrasonic vocalizations seem to be worthy candidates for a translational endophenotype in studying socio-emotional interactions and in modeling neurodevelopmental disorders that involve deficits in communication, such as ASD. Indeed, abnormal emission of USVs among pups and adult subjects has been reported in multiple genetic models of ASD, including 16p11.2 [39, 84], Cntnap2 [38, 83, 190], Iqsec2 [73], Shank2 mice [191], and Shank3 mice [36, 37, 55, 192, 193] and rats [28], as well as BTBR mice [36, 175, 177, 194], emphasizing the translational and face validity of USVs as a model for socio-affective communication. Therefore, analysis of USVs in terms of frequency, sequence, number, and acoustic structure may provide greater insight into the state of the tested animal and how it is altered under various manipulations and abnormal conditions, as well as offer a potential platform for evaluating the efficacy of therapeutic interventions in the context of ASD.
Urinary scent-marking
Rodents use urinary scent marks (among other means) to communicate with conspecifics in many social contexts, including individual recognition, assertion of dominance, and assessment of reproductive status [36, 49, 109]. Urinary scents convey information regarding the sender's age, sex, strain, social status, health and fitness, and individual identity [108]. Communication through urinary scents is mediated mainly by two classes of proteins, namely major histocompatibility complex (MHC) proteins and major urinary proteins (MUPs) [108, 200]. Scent marking through urine changes over the course of development, with the increased urination shown by C57BL/6J mice upon exposure to a novel CD1 stimulus or a female stimulus appearing only after sexual maturation, at the age of 2–3 months [200]. In male-to-male interactions, urinary marking is used for territorial establishment and is influenced by dominance and suppressed by social defeat [177]. It also influences inter-male aggression and the display of attack behavior, an effect mediated by neural projections from the vomeronasal organ (VNO) via the bed nucleus of the stria terminalis (BNST) to dopaminergic neurons in the ventral pre-mammillary nucleus of the hypothalamus (PMv) [201]. Territorial urinary marking in the presence of urine from a novel conspecific was found to be influenced by social experience and enhanced by prior social isolation, an effect mediated by the activity of the lateral hypothalamus [202]. Urinary marking also conveys social memory and habituation to a given conspecific, since repeated exposure to the same conspecific is correlated with a reduction in urinary scent-marking in C57BL/6J male mice, which recovers upon the introduction of a novel stimulus [108]. Urinary scent detection also influences the arousal state and behavior of the receiver. For example, non-lactating female mice exposed to darcin, a major urinary protein present in male urine, emitted a greater number of USVs and showed an increase in scent-marking behavior communicating reproductive status, a behavioral effect mediated by the medial amygdala [109]. Interestingly, deficits in urinary marking by male subjects in response to urine from a female in estrus were found among BTBR [177] and Cntnap2−/− mice, but not among 16p11.2df/+ mice [83], indicating that urinary scent-marking is disrupted in some models of ASD but not in others.
Some of the methods presently used for tracking urination in mice rely on post hoc analysis of urine spots collected by placing absorbent paper underneath the subjects. Urine spots can then be detected and analyzed by fluorescence imaging, given that urine presents red-shifted fluorescent emission when excited with UV light [202–204], or by using ninhydrin spray for urine fixation [108, 200, 205]. Such methods can only provide information on the cumulative output of voiding behavior, such as void number, volume, and spatial distribution. They are incapable of detecting the exact time of each void or differentiating between two overlapping voids, and are poorly suited for combined analysis with other time-sensitive measurements, such as brain activity recordings [206, 207]. In contrast, thermal imaging offers a promising solution to these limitations, owing to its ability to detect voiding events on the basis of the distinctive thermal signature of urine: freshly deposited urine is close to body temperature and then cools down to below the ambient substrate temperature. Thermal imaging thus offers a valuable and highly informative tool for investigating the spatial and temporal dynamics of micturition behavior in social contexts, one that can be combined with other in vivo methods to uncover possible interplay with other sensory cues or to unmask the neural processing mechanisms underlying such behavior. For example, using thermal imaging, Miller et al. [206] were able to investigate the spatiotemporal dynamics of micturition in male mice and uncover changes in scent-marking behavior in response to different social contexts and to the outcome of prior social competition, changes that could not have been detected by other post hoc methods of urine spot analysis.
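As an illustration of the thermal-signature logic described above, the following minimal sketch flags new warm blobs appearing between consecutive calibrated thermal frames (degrees Celsius per pixel). The temperature thresholds and minimum spot size are illustrative assumptions, not the parameters used by Miller et al., and a real pipeline would first mask out the animal's own warm body using the tracked body position.

```python
# Minimal sketch of flagging urine-voiding events in calibrated thermal frames.
# A fresh void appears as a new warm (near body-temperature) blob on a cooler floor
# and then cools; all thresholds here are illustrative and are not the parameters
# used in the cited thermal-imaging studies. The animal itself must be masked out
# (e.g., using video tracking) before applying such a detector in practice.
import numpy as np
from scipy import ndimage

FLOOR_TEMP_C = 24.0     # ambient substrate temperature (assumption)
WARM_THRESH_C = 30.0    # pixels warmer than this are candidate fresh urine
MIN_SPOT_PIXELS = 5     # ignore tiny specks

def detect_new_warm_spots(prev_frame, curr_frame):
    """Return centroids of warm blobs present in curr_frame but not in prev_frame."""
    new_warm = (curr_frame > WARM_THRESH_C) & (prev_frame <= WARM_THRESH_C)
    labels, n = ndimage.label(new_warm)
    sizes = ndimage.sum(new_warm, labels, index=range(1, n + 1))
    keep = [i + 1 for i, s in enumerate(sizes) if s >= MIN_SPOT_PIXELS]
    return ndimage.center_of_mass(new_warm, labels, keep)

# Toy example: a warm 3x3 spot appears between two 64x64 frames
prev_frame = np.full((64, 64), FLOOR_TEMP_C)
curr_frame = prev_frame.copy()
curr_frame[40:43, 10:13] = 36.0          # freshly deposited urine near body temperature
print(detect_new_warm_spots(prev_frame, curr_frame))   # ~[(41.0, 11.0)]
```

Time-stamping each detected void in this way is what allows micturition events to be aligned with other time-sensitive measurements, such as brain activity recordings.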
Taken together, these findings indicate that scent-marking represents the active emission of a signal that serves a social function between conspecifics, with the ability to convey and induce changes in socio-affective behavior in various contexts, and thus merits further investigation.
Sniffing
Sniffing is an active respiratory behavior essential for acquiring and sampling odors; it is typically exhibited at frequencies higher than the resting respiration rate and is commonly displayed during motivated behaviors, such as social behavior [139, 154]. Highly aggressive rats were shown to display decreased sniffing during exploration of a novel context, accompanied by increased anxiety-related behaviors, indicating that sniffing can be used as a physiological marker of an animal's arousal state [208]. In addition, sniffs and other orofacial behaviors, like whisking and changes in head position, show oscillatory patterning at theta frequency (4–12 Hz), a frequency range that reflects arousal and is relevant to information exchange between brain areas [154, 209]. Abnormalities in sniffing were also correlated with reduced social [38] and sexual [39] drive in mice in which the Cntnap2 gene was knocked down in the PFC and in 16p11.2+/− mice, respectively. Practically, sniffing can be monitored by connecting a cannula implanted into the nasal cavity of the animal to a pressure sensor that monitors airflow [154, 209], allowing the detection of sniffing patterns at millisecond time resolution and enabling the integration of other analysis methods into the set-up. Therefore, research on the neural correlates of social behavior will benefit from examining sniffing patterns and their changes during specific events in social interactions, as well as from examining their relationships with other communication modalities.
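As an illustration of how such pressure-sensor recordings might be analyzed, the following minimal sketch band-passes a digitized nasal-pressure trace in the respiratory/theta range discussed above, detects inhalation peaks, and converts inter-peak intervals into an instantaneous sniff frequency. The filter settings, the 6 Hz sniffing criterion, and the synthetic trace are illustrative assumptions rather than a published pipeline.

```python
# Minimal sketch: estimating instantaneous sniff frequency from a digitized
# nasal-pressure trace (band-pass, then peak detection on inhalation deflections).
# Filter settings, the 6 Hz criterion, and the synthetic trace are assumptions.
import numpy as np
from scipy.signal import butter, filtfilt, find_peaks

fs = 1000.0                                   # pressure-sensor sampling rate (Hz), assumption
t = np.arange(0, 10, 1 / fs)

# Synthetic trace: 2 Hz resting respiration for 5 s, then a 9 Hz "sniffing bout"
resp = np.where(t < 5, np.sin(2 * np.pi * 2 * t), np.sin(2 * np.pi * 9 * t))
pressure = resp + 0.2 * np.random.default_rng(2).normal(size=t.size)

# Band-pass around the respiratory/sniffing range (1-15 Hz, illustrative)
b, a = butter(3, [1, 15], btype="bandpass", fs=fs)
filtered = filtfilt(b, a, pressure)

# Each peak is taken as one inhalation; inter-peak intervals give sniff frequency
peaks, _ = find_peaks(filtered, distance=int(0.05 * fs), prominence=0.5)
inst_freq = fs / np.diff(peaks)               # Hz, one value per respiratory cycle
bout = inst_freq > 6                          # sniffing defined here as > 6 Hz (assumption)
print(f"{bout.sum()} of {inst_freq.size} cycles exceed 6 Hz (putative sniffing)")
```

Sniff times extracted in this way can then be aligned with other simultaneously recorded variables, such as USV onsets or neural activity.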
Facial expressions
In humans, facial expressions offer a rich source of information for conveying one's subjective emotional experience and for inferring the emotional experience of others [100]. In rodents, however, the importance of facial expressions as a modality for intra-species communication might be less pronounced than it is in humans, given that rodents are olfactory creatures with weakly developed facial musculature and relatively limited changes in facial expression [54]. Still, recent evidence suggests that facial expressions in rodents convey information regarding the individual's emotional state. Mice, for example, display distinct changes in their facial expressions, including bulges in the nose and cheeks and changes in the positions of their ears and whiskers, in response to noxious stimuli, changes that were utilized to establish a mouse grimace scale for assessing pain responses in mice [210]. Mice also show tightened eyes and flattened ears in response to an intruder in a resident-intruder test, but not in response to cat odor [211]. Distinct facial expressions were also detected in positive contexts among rats, which showed significant changes in ear color and ear angle during tickling [212]. Recent and highly compelling evidence for the display of distinct facial expressions in mice was provided by Dolensek et al. [110], who generated an unsupervised algorithm to cluster and classify facial expressions. In this manner, distinct facial expressions mediated by “face-responsive” neurons in the insular cortex were detected and correlated with different emotional events, including disgust, pleasure, malaise, and active and passive fear.
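The general idea of grouping face images by appearance without labels can be sketched as follows. HOG descriptors and k-means are generic choices used here for illustration, not the specific pipeline of Dolensek et al., and the random arrays stand in for cropped, registered face frames.

```python
# Minimal sketch of grouping face-video frames by appearance without labels, in the
# general spirit of the unsupervised facial-expression clustering described above.
# HOG + k-means are generic choices; the synthetic frames stand in for real face crops.
import numpy as np
from skimage.feature import hog
from sklearn.cluster import KMeans

rng = np.random.default_rng(3)
frames = rng.random((200, 64, 64))            # 200 synthetic grayscale face crops

# One appearance descriptor per frame
descriptors = np.array([
    hog(frame, orientations=8, pixels_per_cell=(16, 16), cells_per_block=(1, 1))
    for frame in frames
])

# Cluster frames into a small number of putative "expression prototypes"
n_prototypes = 5                              # illustrative
labels = KMeans(n_clusters=n_prototypes, n_init=10, random_state=0).fit_predict(descriptors)
print(np.bincount(labels))                    # frames assigned to each prototype
```

In an actual experiment, the resulting cluster labels would be related to the stimuli or events presented, rather than interpreted on their own.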
While the communicative value of facial expressions and their importance in directing the behavior of the observer in rodents remain unclear, analyzing facial expressions can nevertheless provide unique insight into the affective state experienced by an animal and can thus be used to assess affective responses to given stimuli or treatments and how these responses are altered under pathological conditions.
The behavior of the stimulus
Social interactions in nature are rarely unilateral and often entail instantaneous changes in behavior and mutual feedback between multiple participants. Still, behavioral tasks used for estimating social behavior usually restrict the physical extent of the social interaction and focus on the behavior of an individual subject, neglecting the dyadic nature of social interactions and the contribution of the stimulus to driving the behavior of the subject. Vocalizations emitted by the stimulus can influence the behavior of the subject and vice versa [152, 153]. Rats, for instance, showed preference for and approach toward 50 kHz calls in playback studies [152], demonstrating the ability of the emitter to influence the behavior of the receiver. Interestingly, this pro-social effect of 50 kHz calls was found to be absent in male but not female Shank3−/− rats, indicating reduced social motivation [28]. Rats also display a cognitive bias in the ambiguous-cue interpretation test upon hearing USVs of certain frequencies: rats exposed to 50 kHz USVs showed an optimistic bias in the judgment of an ambiguous tone, while rats exposed to 22 kHz calls showed a pessimistic bias, indicating that USVs are capable of influencing the emotional state of the listener [213]. Mice were also shown to exhibit elevated corticosterone levels and anxiety-related behaviors when listening to mid-frequency calls in playback [106], further confirming that vocalizations emitted by the stimulus can alter the affective state of the subject. Also, mice exposed to multiple stimuli held in enclosures that allow varying levels of sensory cues to be detected by the subject showed an increased probability of investigating the stimulus held in the enclosure permitting the highest level of social cue complexity [3]. In addition, mice exposed to an anesthetized intruder in a resident-intruder test emitted USVs with acoustic structures differing, in terms of duration and number of frequency jumps, from those emitted when the subjects were introduced to an awake intruder, demonstrating that the USVs emitted by a subject are influenced by the state of the stimulus and the arousal level it induces [186]. These results indicate that the integration of multiple sensory cues emitted by a stimulus is important for a more salient representation of the stimulus and for driving the social behavior of the subject [3]. Another cue that can influence the behavior of a subject is the stimulus's movement. Stimuli differing in familiarity to a subject (i.e., a familiar cage-mate versus a novel mouse) were shown to exhibit different numbers of large movements, as measured by piezoelectric sensors, which in turn had a differential effect on the subject's social investigation of the stimulus [70]. Therefore, while it is important to capture changes in the behavior of a subject in response to a given social stimulus, examination of the stimulus's behavior and affective state can provide further information on the exact nature of the variables driving the observed changes in the behavior or neural activity of the subject. Overall, the contribution of the stimulus's behavior and affective state should also be considered when analyzing any social interaction.
Interaction between multimodal cues
Whereas isolating and focusing on one variable is important for a detailed understanding of its role and influence, it is also important to keep in mind that focusing on one pixel does not convey the whole picture. Indeed, modalities of social communication rarely work in isolation and are often synchronized with other modalities. One such example is sniffing. Wesson [139] showed that changes in sniffing behavior communicate social hierarchy and influence the latency to be attacked by a dominant subject. However, later work by Sirotin et al. [154] showed that active sniffing and ultrasonic vocalizations during social interactions bidirectionally influence one another: USVs are emitted strictly during periods of active sniffing, especially in the exhalation phase, and their emission in turn alters the sniffing cycle, which modulates the segmentation of ultrasound production into individual calls. Such findings suggest that alterations in sniffing can be caused by, or coupled with, the emission of USVs. Later, Alves et al. [209] showed that changes in sniffing are correlated with other orofacial behaviors, like head movements along the x- and y-axes, in a manner influenced by the walking speed of the animal. Therefore, a better and more inclusive understanding of a subject's emotional state requires not only strict analysis of a given variable during social behavior, but also examination of its interplay and correlation with other variables (Fig. 5).
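One way to quantify the sniffing–USV coupling described above is to ask at which phase of the sniff cycle each USV begins. The following minimal sketch assumes a nasal-pressure trace and USV onset times recorded on a common clock and uses a Hilbert-transform phase estimate, a generic choice rather than the specific analysis of Sirotin et al.

```python
# Minimal sketch of quantifying sniffing-USV coupling: the sniff-cycle phase at which
# each USV starts, assuming a nasal-pressure trace and USV onset times on a shared
# clock. The Hilbert-transform phase estimate is a generic choice, and all signals
# and onset times here are synthetic.
import numpy as np
from scipy.signal import butter, filtfilt, hilbert

fs = 1000.0                                    # pressure-signal sampling rate (Hz), assumption
t = np.arange(0, 20, 1 / fs)
sniff = np.sin(2 * np.pi * 8 * t)              # synthetic 8 Hz sniffing trace
usv_onsets_s = np.array([1.23, 4.56, 7.89, 12.01, 15.34])   # synthetic USV start times

# Band-pass the sniff signal, then take its instantaneous phase
# (phase = 0 at each peak of the filtered trace)
b, a = butter(3, [4, 12], btype="bandpass", fs=fs)
phase = np.angle(hilbert(filtfilt(b, a, sniff)))

onset_idx = (usv_onsets_s * fs).astype(int)
phases_at_onset = phase[onset_idx]

# Circular mean phase and resultant length (1 = all USVs at the same sniff phase)
mean_vector = np.exp(1j * phases_at_onset).mean()
print(f"mean phase = {np.angle(mean_vector):.2f} rad, coupling strength = {abs(mean_vector):.2f}")
```

The same logic extends to any pair of synchronized variables, such as USV emission and head movements or locomotion speed.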
Conclusions
Establishing the existence of social deficits in mouse models of any given disorder requires reliance on more than one behavioral paradigm or variable, given that any single paradigm is binary, context-dependent, and susceptible to contamination by various confounding variables, which could compromise the conclusiveness of the findings. Therefore, phenotyping behavioral deficits in animal models should involve the systematic use of a battery of behavioral tasks that address different aspects and contexts of social behavior. In addition, incorporating various methods for detailed and automated analysis of multiple physiological and behavioral variables during such tasks, and combining these variables with brain recordings and machine-learning algorithms, will allow socio-emotional states to be determined and characterized during social interactions of animal subjects encountering various types of social stimuli. Such an integrative approach to analyzing social behavior in rodents will not only accelerate investigation of the brain mechanisms involved, but will also enable a genuine comparison of deficits between human patients and animal models of pathological conditions.
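As a purely illustrative sketch of what such an integrative pipeline might look like in code, the following combines several hypothetical per-epoch behavioral measurements into one feature table and cross-validates a generic classifier on them. The feature set, the labels, and the random-forest choice are assumptions for demonstration only and do not constitute a validated method.

```python
# Minimal sketch of the integrative approach argued for here: several simultaneously
# measured behavioral/physiological variables are combined into one feature table and
# a generic classifier is cross-validated on putative socio-emotional states. All
# features, labels, and values are synthetic and illustrative.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(4)
n_epochs = 200

# One row per interaction epoch: [investigation time (s), 50 kHz USV rate (calls/min),
# sniff frequency (Hz), distance moved (cm)] -- synthetic values
X = np.column_stack([
    rng.gamma(2, 5, n_epochs),
    rng.poisson(12, n_epochs),
    rng.normal(8, 2, n_epochs),
    rng.normal(300, 60, n_epochs),
])
y = rng.integers(0, 2, n_epochs)   # e.g., 0 = neutral context, 1 = appetitive context

clf = RandomForestClassifier(n_estimators=200, random_state=0)
scores = cross_val_score(clf, X, y, cv=5)
print(f"cross-validated accuracy: {scores.mean():.2f}")   # ~chance on random labels, as expected
```

With random labels, as here, accuracy hovers around chance; the point is the structure of the pipeline, in which cross-validated decoding of context or state from multimodal features replaces reliance on any single behavioral readout.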
Acknowledgements
Not applicable.
Author contributions
RJ collected the data, prepared the figures and tables and wrote the initial draft of the manuscript. SN contributed to the conceptualization of the manuscript, and to the design of the figures. SW participated in the conceptualization and design coordination of the manuscript and helped write the final draft. All authors read and approved the final manuscript.
Funding
This study was supported by ISF-NSFC joint research program (Grant 3459/20), the Israel Science Foundation (Grant 1361/17), the Ministry of Science, Technology and Space of Israel (Grant 3-12068) and the United States-Israel Binational Science Foundation (Grant 2019186), all awarded to SW.
Availability of data and materials
Data sharing is not applicable to this article as no datasets were generated or analyzed during the current study.
Declarations
Ethics approval and consent to participate
Not applicable.
Consent for publication
Not applicable.
Competing interests
One of the authors (SW) is an associate editor of the journal Molecular Autism.
Footnotes
Publisher's Note
Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.
References
- 1.Chen P, Hong W. Neural circuit mechanisms of social behavior. Neuron. 2018;98(1):16–30. doi: 10.1016/j.neuron.2018.02.026. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 2.Hong W, Kennedy A, Burgos-artizzu XP, Zelikowsky M, Navonne SG, Perona P. Automated measurement of mouse social behaviors using depth sensing, video tracking, and machine learning. Proc Natl Acad Sci U S A. 2015;112(38):E5351–E5360. doi: 10.1073/pnas.1515982112. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 3.Contestabile A, Casarotto G, Girard B, Tzanoulinou S, Bellone C. Deconstructing the contribution of sensory cues in social approach. Eur J Neurosci. 2021;53(9):3199–3211. doi: 10.1111/ejn.15179. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 4.Bhanji JP, Delgado MR. The social brain and reward: social information processing in the human striatum. Wiley Interdiscipl Rev Cogn Sci. 2014;5:61–73. doi: 10.1002/wcs.1266. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 5.Krach S, Paulus FM, Bodden M, Kircher T. The rewarding nature of social interactions. Front Behav Neurosci. 2010 doi: 10.3389/fnbeh.2010.00022. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 6.Tan T, Wang W, Liu T, Zhong P, Conrow-Graham M, Tian X, et al. Neural circuits and activity dynamics underlying sex-specific effects of chronic social isolation stress. Cell Rep. 2021;34(12):108874. doi: 10.1016/j.celrep.2021.108874. [DOI] [PubMed] [Google Scholar]
- 7.Gunaydin LA, Grosenick L, Finkelstein JC, Kauvar IV, Fenno LE, Adhikari A, et al. Natural neural projection dynamics underlying social behavior. Cell. 2014;157(7):1535–1551. doi: 10.1016/j.cell.2014.05.017. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 8.Opendak M, Raineki C, Perry RE, Serrano PA, Wilson DA, Sullivan RM. Bidirectional control of infant rat social behavior via dopaminergic innervation of the basolateral amygdala. Neuron. 2021;109(24):4018–4035.e7. doi: 10.1016/j.neuron.2021.09.041. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 9.Arakawa H. Dynamic regulation of oxytocin neuronal circuits in the sequential processes of prosocial behavior in rodent models. Curr Res Neurobiol. 2021;2:100011. doi: 10.1016/j.crneur.2021.100011. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 10.Wöhr M, Schwarting RKW. Affective communication in rodents: ultrasonic vocalizations as a tool for research on emotion and motivation. Cell Tissue Res. 2013;354:81–97. doi: 10.1007/s00441-013-1607-9. [DOI] [PubMed] [Google Scholar]
- 11.Ko J. Neuroanatomical substrates of rodent social behavior: the medial prefrontal cortex and its projection patterns. Front Neural Circuits. 2017;11:1–16. doi: 10.3389/fncir.2017.00041. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 12.Liska A, Bertero A, Gomolka R, Sabbioni M, Galbusera A, Barsotti N, et al. Homozygous loss of autism-risk gene cntnap2 results in reduced local and long-range prefrontal functional connectivity. Cereb Cortex. 2018;28(4):1141–1153. doi: 10.1093/cercor/bhx022. [DOI] [PubMed] [Google Scholar]
- 13.Raam T, Hong W. Organization of neural circuits underlying social behavior: a consideration of the medial amygdala. Curr Opin Neurobiol. 2021;68:124–136. doi: 10.1016/j.conb.2021.02.008. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 14.Keum S, Shin H. Review neural basis of observational fear learning: a potential model of affective empathy. Neuron. 2019;104(1):78–86. doi: 10.1016/j.neuron.2019.09.013. [DOI] [PubMed] [Google Scholar]
- 15.Kim SW, Kim M, Shin HS. Affective empathy and prosocial behavior in rodents. Curr Opin Neurobiol. 2021;68:181–189. doi: 10.1016/j.conb.2021.05.002. [DOI] [PubMed] [Google Scholar]
- 16.Panksepp JB, Lahvis GP. Rodent empathy and affective neuroscience. Neurosci Biobehav Rev. 2011;35(9):1864–1875. doi: 10.1016/j.neubiorev.2011.05.013. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 17.Adolphs R. Review conceptual challenges and directions for social neuroscience. Neuron. 2010;65(6):752–767. doi: 10.1016/j.neuron.2010.03.006. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 18.Cacioppo JT, Decety J. Challenges and opportunities in social neuroscience. Ann N Y Acad Sci. 2012;1224(1):162–173. doi: 10.1111/j.1749-6632.2010.05858.x. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 19.Ferretti V, Papaleo F. Understanding others: emotion recognition in humans and other animals. Genes Brain Behav. 2019;18(1):1–12. doi: 10.1111/gbb.12544. [DOI] [PubMed] [Google Scholar]
- 20.Ferretti V, Maltese F, Contarini G, Nigro M, Bonavia A, Huang H, et al. Oxytocin signaling in the central amygdala modulates emotion discrimination in mice. Curr Biol. 2019;29(12):1938–1953.e6. doi: 10.1016/j.cub.2019.04.070. [DOI] [PubMed] [Google Scholar]
- 21.Bicks LK, Koike H, Akbarian S, Morishita H. Prefrontal cortex and social cognition in mouse and man. Front Psychol. 2015;6:1–15. doi: 10.3389/fpsyg.2015.01805. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 22.Zilkha N, Sofer Y, Beny Y, Kimchi T. From classic ethology to modern neuroethology: overcoming the three biases in social behavior research. Curr Opin Neurobiol. 2016;38:96–108. doi: 10.1016/j.conb.2016.04.014. [DOI] [PubMed] [Google Scholar]
- 23.Nestler EJ, Hyman SE. Animal models of neuropsychiatric disorders. Nat Neurosci. 2010;13(10):1161–1169. doi: 10.1038/nn.2647. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 24.Kas MJ, Glennon JC, Buitelaar J, Ey E, Biemans B, Crawley J, et al. Assessing behavioural and cognitive domains of autism spectrum disorders in rodents: current status and future perspectives. Psychopharmacology. 2014;231(6):1125–1146. doi: 10.1007/s00213-013-3268-5. [DOI] [PubMed] [Google Scholar]
- 25.Ellenbroek B, Youn J. Rodent models in neuroscience research: Is it a rat race? DMM Dis Model Mech. 2016;9(10):1079–1087. doi: 10.1242/dmm.026120. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 26.Silverman JL, Ellegood J. Behavioral and neuroanatomical approaches in models of neurodevelopmental disorders: opportunities for translation. Curr Opin Neurol. 2018;31(2):126–133. doi: 10.1097/WCO.0000000000000537. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 27.Silverman JL, Thurm A, Ethridge SB, Soller MM, Petkova SP, Abel T, et al. Reconsidering animal models used to study autism spectrum disorder: current state and optimizing future. Genes Brain Behav. 2022;21(5):1–13. doi: 10.1111/gbb.12803. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 28.Berg EL, Copping NA, Rivera JK, Pride MC, Careaga M, Bauman MD, et al. Developmental social communication deficits in the Shank3 rat model of phelan-mcdermid syndrome and autism spectrum disorder. Autism Res. 2018;11(4):587–601. doi: 10.1002/aur.1925. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 29.Kondrakiewicz K, Kostecki M, Szadzińska W, Knapska E. Ecological validity of social interaction tests in rats and mice. Genes Brain Behav. 2019;18(1):1–14. doi: 10.1111/gbb.12525. [DOI] [PubMed] [Google Scholar]
- 30.Kummer KK, Hofhansel L, Barwitz CM, Schardl A, Prast JM, Salti A, et al. Differences in social interaction- vs. cocaine reward in mouse vs. rat. Front Behav Neurosci. 2014;8:1–7. doi: 10.3389/fnbeh.2014.00363. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 31.Netser S, Meyer A, Magalnik H, Zylbertal A, de la Zerda SH, Briller M, et al. Distinct dynamics of social motivation drive differential social behavior in laboratory rat and mouse strains. Nat Commun. 2020;11(1):5908. doi: 10.1038/s41467-020-19569-0. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 32.Crawley JN. Designing mouse behavioral tasks relevant to autistic-like behaviors. Ment Retard Dev Disabil Res Rev. 2004;10:248–258. doi: 10.1002/mrdd.20039. [DOI] [PubMed] [Google Scholar]
- 33.Langford DJ, Crager SE, Shehzad Z, Smith SB, Sotocinal SG, Levenstadt JS, et al. Social modulation of pain as evidence for empathy in mice. Science. 2006;312(5782):1967–1970. doi: 10.1126/science.1128322. [DOI] [PubMed] [Google Scholar]
- 34.Burkett JP, Andari E, Johnson ZV, Curry DC, De Waal FBM, Young LJ. Oxytocin-dependent consolation behavior in rodents. Science. 2016;351(6271):375–378. doi: 10.1126/science.aac4785. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 35.Meyza KZ, Bartal IBA, Monfils MH, Panksepp JB, Knapska E. The roots of empathy: through the lens of rodent models. Neurosci Biobehav Rev. 2017;76:216–234. doi: 10.1016/j.neubiorev.2016.10.028. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 36.Silverman JL, Yang M, Lord C, Crawley JN. Behavioural phenotyping assays for mouse models of autism. Nat Rev Neurosci. 2010;11(7):490–502. doi: 10.1038/nrn2851. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 37.de Chaumont F, Lemière N, Coqueran S, Bourgeron T, Ey E. LMT USV toolbox, a novel methodological approach to place mouse ultrasonic vocalizations in their behavioral contexts—a study in female and male C57BL/6J Mice and in Shank3 mutant females. Front Behav Neurosci. 2021;15:1–18. doi: 10.3389/fnbeh.2021.735920. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 38.Sacai H, Sakoori K, Konno K, Nagahama K, Suzuki H, Watanabe T, et al. Autism spectrum disorder-like behavior caused by reduced excitatory synaptic transmission in pyramidal neurons of mouse prefrontal cortex. Nat Commun. 2020;11(1):1–15. doi: 10.1038/s41467-020-18861-3. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 39.Yang M, Mahrt EJ, Lewis F, Foley G, Portmann T, Dolmetsch RE, et al. 16P11.2 Deletion syndrome mice display sensory and ultrasonic vocalization deficits during social interactions. Autism Res. 2015;8(5):507–521. doi: 10.1002/aur.1465. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 40.Kazdoba TM, Leach PT, Yang M, Silverman JL, Solomon M, Crawley JN. Translational mouse models of autism: advancing toward pharmacological therapeutics. Curr Top Behav Neurosci. 2016;28:1–52. doi: 10.1007/7854_2015_5003. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 41.Grimsley JMS, Sheth S, Vallabh N, Grimsley CA, Bhattal J, Latsko M, et al. Contextual modulation of vocal behavior in mouse: newly identified 12 kHz “Mid-frequency” vocalization emitted during restraint. Front Behav Neurosci. 2016;10:1–14. doi: 10.3389/fnbeh.2016.00038. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 42.Wittmann MK, Lockwood PL, Rushworth MFS. Neural mechanisms of social cognition in primates. Annu Rev Neurosci. 2018;41:99–118. doi: 10.1146/annurev-neuro-080317-061450. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 43.Carcea I, Froemke RC. Biological mechanisms for observational learning. Curr Opin Neurobiol. 2019;54:178–185. doi: 10.1016/j.conb.2018.11.008. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 44.Fernández M, Mollinedo-Gajate I, Peñagarikano O. Neural circuits for social cognition: implications for autism. Neuroscience. 2018;370:148–162. doi: 10.1016/j.neuroscience.2017.07.013. [DOI] [PubMed] [Google Scholar]
- 45.Kohl J, Autry AE, Dulac C. The neurobiology of parenting: a neural circuit perspective. BioEssays. 2018;39(1):1–11. doi: 10.1002/bies.201600159. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 46.Li Y, Dulac C. Neural coding of sex-specific social information in the mouse brain. Curr Opin Neurobiol. 2018;53:120–130. doi: 10.1016/j.conb.2018.07.005. [DOI] [PubMed] [Google Scholar]
- 47.Lischinsky JE, Lin D. Neural mechanisms of aggression across species. Nat Neurosci. 2020 doi: 10.1038/s41593-020-00715-2. [DOI] [PubMed] [Google Scholar]
- 48.Matthews GA, Tye KM. Neural mechanisms of social homeostasis. Ann N Y Acad Sci. 2020;1457(1):5–25. doi: 10.1111/nyas.14016. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 49.Zhou T, Sandi C, Hu H. Advances in understanding neural mechanisms of social dominance. Curr Opin Neurobiol. 2018;49:99–107. doi: 10.1016/j.conb.2018.01.006. [DOI] [PubMed] [Google Scholar]
- 50.Fakhro KA. Genomics of autism. Adv Neurobiol. 2020;24:83–96. doi: 10.1007/978-3-030-30402-7_3. [DOI] [PubMed] [Google Scholar]
- 51.De Rubeis S, Buxbaum JD. Recent advances in the genetics of autism spectrum disorder. Curr Neurol Neurosci Rep. 2015;15(6):1–9. doi: 10.1007/s11910-015-0553-1. [DOI] [PubMed] [Google Scholar]
- 52.Kazdoba TM, Leach PT, Crawley JN. Behavioral phenotypes of genetic mouse models of autism. Genes Brain Behav. 2016;15(1):7–26. doi: 10.1111/gbb.12256. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 53.Peleh T, Ike KGO, Wams EJ, Lebois EP, Hengerer B. The reverse translation of a quantitative neuropsychiatric framework into preclinical studies: focus on social interaction and behavior. Neurosci Biobehav Rev. 2019;97:96–111. doi: 10.1016/j.neubiorev.2018.07.018. [DOI] [PubMed] [Google Scholar]
- 54.Salyha Y. Animal models of autism spectrum disorders and behavioral techniques of their examination. Neurophysiology. 2018 [Google Scholar]
- 55.Leach T, Crawley JN. Behavioral phenotypes of genetic mouse models of autism. Genes Brain Behav. 2016;15:7–26. doi: 10.1111/gbb.12256. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 56.Rein B, Ma K, Yan Z. A standardized social preference protocol for measuring social deficits in mouse models of autism. Nat Protoc. 2020 doi: 10.1038/s41596-020-0382-9. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 57.Kaidanovich-beilin O, Lipina T, Vukobradovic I, Roder J, Woodgett JR. Assessment of social interaction behaviors. J Vis Exp. 2011 doi: 10.3791/2473. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 58.Moy SS, Nadler JJ, Perez A. Sociability and preference for social novelty in five inbred strains: an approach to assess autistic-like behavior in mice. Genes Brain Behav. 2004;3:287–302. doi: 10.1111/j.1601-1848.2004.00076.x. [DOI] [PubMed] [Google Scholar]
- 59.Pearson BL, Defensor EB, Blanchard DC, Blanchard RJ. C57BL/6J mice fail to exhibit preference for social novelty in the three-chamber apparatus. Behav Brain Res. 2010;213(2):189–194. doi: 10.1016/j.bbr.2010.04.054. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 60.Guo B, Chen J, Chen Q, Ren K, Feng D, Mao H, et al. Anterior cingulate cortex dysfunction underlies social deficits in Shank3 mutant mice. Nat Neurosci. 2019;22(8):1223–1234. doi: 10.1038/s41593-019-0445-9. [DOI] [PubMed] [Google Scholar]
- 61.Sharon G, Cruz NJ, Kang DW, Gandal MJ, Wang B, Kim YM, et al. Human Gut microbiota from autism spectrum disorder promote behavioral symptoms in mice. Cell. 2019;177(6):1600–1618.e17. doi: 10.1016/j.cell.2019.05.004. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 62.Orefice LL, Mosko JR, Morency DT, Lehtinen MK, Feng G, et al. Targeting peripheral somatosensory neurons to improve tactile-related phenotypes in ASD models. Cell. 2019;178(4):867–886.e24. doi: 10.1016/j.cell.2019.07.024. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 63.Peñagarikano O, Lázaro MT, Lu XH, Gordon A, Dong H, Lam HA, et al. Exogenous and evoked oxytocin restores social behavior in the Cntnap2 mouse model of autism. Sci Transl Med. 2015 doi: 10.1126/scitranslmed.3010257. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 64.Jackson MR, Loring KE, Homan CC, Thai MHN, Määttänen L, Arvio M, et al. Heterozygous loss of function of IQSEC2/Iqsec2 leads to increased activated Arf6 and severe neurocognitive seizure phenotype in females. Life Sci Alliance. 2019;2(4):1–18. doi: 10.26508/lsa.201900386. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 65.Anthony TE, Dee N, Bernard A, Lerchner W, Heintz N, Anderson DJ. Control of stress-induced persistent anxiety by an extra-amygdala septohypothalamic circuit. Cell. 2014;156(3):522–536. doi: 10.1016/j.cell.2013.12.040. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 66.Rogers EJ, Jada R, Schragenheim-Rozales K, Sah M, Cortes M, Florence M, et al. An IQSEC2 mutation associated with intellectual disability and autism results in decreased surface AMPA receptors. Front Mol Neurosci. 2019;12:1–18. doi: 10.3389/fnmol.2019.00043. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 67.Gur TL, Vadodkar A, Rajasekera T, Allen J, Niraula A, Godbout J, et al. Prenatal stress disrupts social behavior, cortical neurobiology and commensal microbes in adult male offspring. Behav Brain Res. 2019;359:886–894. doi: 10.1016/j.bbr.2018.06.025. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 68.Grundwald NJ, Ben DP, Brunton PJ, Brunton PJ. Sex-dependent effects of prenatal stress on social memory in rats: a role for differential expression of central vasopressin-1a receptors neuroendocrinology. J Neuroendocrinol. 2016 doi: 10.1111/jne.12343. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 69.Sankoorikal GMV, Kaercher KA, Boon CJ, Lee JK, Brodkin ES. A mouse model system for genetic analysis of sociability: C57BL/6J versus BALB/cJ inbred mouse strains. Biol Psychiatry. 2006;59(5):415–423. doi: 10.1016/j.biopsych.2005.07.026. [DOI] [PubMed] [Google Scholar]
- 70.Netser S, Meyer A, Magalnik H, Zylbertal A, Haskal S, Zerda D, et al. Distinct dynamics of social motivation drive differential social behavior in laboratory rat and mouse strains. Nat Commun. 2020;11(1):5908. doi: 10.1038/s41467-020-19569-0. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 71.Matsumoto M, Yoshida M, Jayathilake BW, Inutsuka A, Nishimori K, Takayanagi Y, et al. Indispensable role of the oxytocin receptor for allogrooming toward socially distressed cage mates in female mice. J Neuroendocrinol. 2021 doi: 10.1111/jne.12980. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 72.Haskal de la Zerda S, Netser S, Magalnik H, Wagner S. Impaired sex preference, but not social and social novelty preferences, following systemic blockade of oxytocin receptors in adult male mice. Psychoneuroendocrinology. 2020;116:104676. doi: 10.1016/j.psyneuen.2020.104676. [DOI] [PubMed] [Google Scholar]
- 73.Jabarin R, Levy N, Abergel Y, Berman JH, Zag A, Netser S, et al. Pharmacological modulation of AMPA receptors rescues specific impairments in social behavior associated with the A350V Iqsec2 mutation. Transl Psychiatry. 2021;11(1):1–11. doi: 10.1038/s41398-021-01347-1. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 74.Kim J, Park K, Kang RJ, Gonzales ELT, Kim DG, Oh HA, et al. Pharmacological modulation of AMPA receptor rescues social impairments in animal models of autism. Neuropsychopharmacology. 2018 doi: 10.1038/s41386-018-0098-5. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 75.Netser S, Haskal S, Magalnik H, Wagner S. A novel system for tracking social preference dynamics in mice reveals sex- and strain-specific characteristics. Mol Autism. 2017;8(1):1–14. doi: 10.1186/s13229-017-0169-1. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 76.Fairless AH, Shah RY, Guthrie AJ, Li H, Brodkin ES. Deconstructing sociability, an autism-relevant phenotype, in mouse models. Anat Rec. 2011;294(10):1713–1725. doi: 10.1002/ar.21318. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 77.Levy DR, Tamir T, Kaufman M, Parabucki A, Weissbrod A, Schneidman E, et al. Dynamics of social representation in the mouse prefrontal cortex. Nat Neurosci. 2019;22(12):2013–2022. doi: 10.1038/s41593-019-0531-z. [DOI] [PubMed] [Google Scholar]
- 78.Sharon G, Cruz NJ, Kang D, Geschwind DH, Krajmalnik-Brown R, Mazmanian SK, et al. Human gut microbiota from autism spectrum disorder promote behavioral symptoms in mice. Cell. 2019;177(6):1600–1618.e17. doi: 10.1016/j.cell.2019.05.004. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 79.Hsieh LS, Wen JH, Miyares L, Lombroso PJ, Bordey A. Outbred CD1 mice are as suitable as inbred C57BL/6J mice in performing social tasks. Neurosci Lett. 2017;637:142–147. doi: 10.1016/j.neulet.2016.11.035. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 80.Yang M, Silverman JL, Crawley JN. Automated three-chambered social approach task for mice. Curr Protoc Neurosci. 2011;56:8–26. doi: 10.1002/0471142301.ns0826s56. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 81.Won H, Lee HR, Gee HY, Mah W, Kim JI, Lee J, et al. Autistic-like social behaviour in Shank2-mutant mice improved by restoring NMDA receptor function. Nature. 2012;486(7402):261–265. doi: 10.1038/nature11208. [DOI] [PubMed] [Google Scholar]
- 82.Schmeisser MJ, Ey E, Wegener S, Bockmann J, Stempel AV, Kuebler A, et al. Autistic-like behaviours and hyperactivity in mice lacking ProSAP1/Shank2. Nature. 2012;486(7402):256–260. doi: 10.1038/nature11015. [DOI] [PubMed] [Google Scholar]
- 83.Brunner D, Kabitzke P, He D, Cox K, Thiede L, Hanania T, et al. Comprehensive analysis of the 16p11.2 deletion and null cntnap2 mouse models of autism spectrum disorder. PLoS ONE. 2015;10(8):1–39. doi: 10.1371/journal.pone.0134572. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 84.Portmann T, Yang M, Mao R, Panagiotakos G, Ellegood J, Dolen G, et al. Behavioral abnormalities and circuit defects in the basal ganglia of a mouse model of 16p11.2 deletion syndrome. Cell Rep. 2014;7(4):1077–1092. doi: 10.1016/j.celrep.2014.03.036. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 85.Lee DK, Li SW, Bounni F, Friedman G, Jamali M, Strahs L, et al. Reduced sociability and social agency encoding in adult Shank3-mutant mice are restored through gene re-expression in real time. Nat Neurosci. 2021;24(9):1243–1255. doi: 10.1038/s41593-021-00888-4. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 86.Selimbeyoglu A, Kim CK, Inoue M, Lee SY, Hong ASO, Kauvar I, et al. Modulation of prefrontal cortex excitation/inhibition balance rescues social behavior in CNTNAP2-deficient mice. Sci Transl Med. 2017;9(401):eaah6733. doi: 10.1126/scitranslmed.aah6733. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 87.Wang W, Rein B, Zhang F, Tan T, Zhong P, Qin L, et al. Chemogenetic activation of prefrontal cortex rescues synaptic and behavioral deficits in a mouse model of 16p11.2 deletion syndrome. J Neurosci. 2018;38(26):5939–5948. doi: 10.1523/JNEUROSCI.0149-18.2018. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 88.Stoppel LJ, Kazdoba TM, Schaffler MD, Preza AR, Heynen A, Crawley JN, et al. R-baclofen reverses cognitive deficits and improves social interactions in two lines of 16p11.2 deletion mice. Neuropsychopharmacology. 2018;43(3):513–524. doi: 10.1038/npp.2017.236. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 89.Kim DG, Gonzales EL, Kim S, Kim Y, Adil KJ, Jeon SJ, et al. Social interaction test in home cage as a novel and ethological measure of social behavior in mice. Exp Neurobiol. 2019;28(2):247–260. doi: 10.5607/en.2019.28.2.247. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 90.Ricceri L, Moles A, Crawley J. Behavioral phenotyping of mouse models of neurodevelopmental disorders: relevant social behavior patterns across the life span. Behav Brain Res. 2007;176:40–52. doi: 10.1016/j.bbr.2006.08.024. [DOI] [PubMed] [Google Scholar]
- 91.Mogil JS. Mice are people too: increasing evidence for cognitive, emotional and social capabilities in laboratory rodents. Can Psychol. 2019;60(1):14–20. [Google Scholar]
- 92.Forkosh O, Karamihalev S, Roeh S, Alon U, Anpilov S, Touma C, et al. Identity domains capture individual differences from across the behavioral repertoire. Nat Neurosci. 2019;22(12):2023–2028. doi: 10.1038/s41593-019-0516-y. [DOI] [PubMed] [Google Scholar]
- 93.Ramos A. Animal models of anxiety: do I need multiple tests? Trends Pharmacol Sci. 2008;29(10):493–498. doi: 10.1016/j.tips.2008.07.005. [DOI] [PubMed] [Google Scholar]
Data Availability Statement
Data sharing is not applicable to this article as no datasets were generated or analyzed during the current study.