Abstract
Despite the recent increase in second-person neuroscience research, it is still hard to understand which neurocognitive mechanisms underlie real-time social behaviours. Here, we propose that social signalling can help us understand social interactions both at the single- and two-brain level in terms of social signal exchanges between senders and receivers. First, we show how subtle manipulations of being watched provide an important tool to dissect meaningful social signals. We then focus on how social signalling can help us build testable hypotheses for second-person neuroscience, using the examples of imitation and gaze behaviour. Finally, we suggest that linking neural activity to specific social signals will be key to fully understanding the neurocognitive systems engaged during face-to-face interactions.
Keywords: Second-person neuroscience, Social interaction, Social signal, Audience effects
Introduction
Interest in the neuroscience of social interactions has grown rapidly in the past decade. Influential opinion papers have called for a new “second-person neuroscience” and for the study of face-to-face dynamics (De Jaegher et al., 2010; Risko et al., 2016; Schilbach et al., 2013). Building on these, researchers have begun to develop paradigms where two or more people interact (Konvalinka et al., 2010; Sebanz et al., 2006) and where brain activity is captured using hyperscanning (Babiloni & Astolfi, 2014; Montague et al., 2002). However, it is still not easy to pin down specific cognitive models of the processes engaged when people take part in dynamic, real-time interactions. That is, what kind of neurocognitive models can we use to make sense of dynamic social interactions?
Here, we propose that a social signalling framework can help us understand social interactions at both the single- and two-brain level in terms of signal exchanges between senders and receivers. Social signalling takes an incremental approach to this problem, asking which factors change between a situation where one participant performs a task alone and the same situation where the participant performs that task while interacting with another person. In particular, we suggest that the simple manipulation of "being watched" or not by another person provides a core test of social signalling and may give us a robust and general theoretical framework in which to advance "second-person neuroscience."
First, we briefly review evidence that "being watched" matters to participants' behaviour and their brain activity. We then outline the social signalling framework for understanding these changes, and we detail how it can be applied to understand two cases of social behaviour—imitation and eye gaze. Finally, we review emerging evidence on the neural mechanisms of social signalling and set out future directions.
Being watched as a basic test of social interactions
There is a long tradition of research into the differences in our behaviour when we are alone versus when we are in the presence of others. A series of studies from Zajonc (1965) showed that cockroaches, rats, monkeys, and humans change their behaviour when in the presence of a conspecific, a phenomenon described as social facilitation. It has been proposed that the presence of conspecifics (regardless of whether they are watching) increases arousal and facilitates dominant behaviours, in both cognitive and motor tasks (Geen, 1985; Strauss, 2002; Zajonc & Sales, 1966).
In humans, the effect of being watched by another person goes beyond mere social facilitation and has been described as an audience effect (Hamilton & Lind, 2016). Being watched is one of the simplest and most basic social interactions, first studied by Triplett more than 100 years ago (Triplett, 1898), when he showed that children wind in a fishing reel faster when with another child than when alone. Since then, several studies have shown how the belief in being watched by an audience causes changes in behaviour and in underlying brain activity. The audience effect is most clearly induced by the physical presence of another person who is actively watching the participant, but it can also be induced by the feeling of being watched (e.g., via a camera); these different triggering conditions are reviewed below. For instance, participants tend to gaze less at the face of a live confederate than at the same confederate in a prerecorded video clip (Cañigueral & Hamilton, 2019a; Gobel et al., 2015; Laidlaw et al., 2011), and they smile more in the presence of a live friend or confederate (Fridlund, 1991; Hietanen et al., 2018). During economic games and social dilemmas, the belief in being watched leads to an increase in prosocial behaviour (Cañigueral & Hamilton, 2019a; Izuma et al., 2009; Izuma et al., 2011) and a decrease in risk-taking (Kumano et al., 2021), as well as more brain activity in regions linked to mentalizing and social reward processing (Izuma et al., 2009, 2010).
Several different cognitive mechanisms have been proposed to account for audience effects (Fig. 1). The response to being watched may draw on perceptual mentalizing (i.e., the attribution of perceptual states to other people; Teufel et al., 2010) to determine what the other person can see, and theory of mind (Tennie et al., 2010) to determine what they think. The presence of “watching eyes” could engage self-referential processing, which increases the sense of self-involvement in the interaction (Conty et al., 2016; Hazem et al., 2017). Furthermore, Bond (1982) proposed a self-presentation model of audience effects, whereby participants change their behaviour to present themselves positively to the audience. This also fits with the recent idea that being watched engages reputation management mechanisms—that is, changes in behaviour that aim to promote positive judgements in the presence of others (Cage, 2015; Izuma et al., 2009, 2010). However, there is still uncertainty about how to best interpret these findings and integrate them with other aspects of social neuroscience. Crucially, common to all these models is the idea that participants can send information about themselves to the watcher, that is, they can communicate.
From being watched to social signalling
To make sense of these very basic types of communication, we believe that it is helpful to draw on the extensive literature on signalling in animal behaviour (Stegmann, 2013). In this tradition, a signal is defined as a stimulus which sends information from one individual to another, and which is performed in order to benefit both the sender and the receiver. In contrast, a cue is a stimulus which only benefits the receiver. For example, the carbon dioxide which I breathe out is a cue to a mosquito that wants a meal but does not benefit me. In contrast, the bright colour of a butterfly's wings is a signal to birds not to eat the butterfly, which benefits the butterfly by helping it avoid predation. The concept of signalling described here is very broad, applying to all types of stimuli, from wing colour (defined by evolution) to particular behaviours (which only occur in particular contexts and might be learnt). Here, we take this general idea and apply it to the much more specific case of human nonverbal behaviour. In particular, we suggest that it is helpful to use the idea of signalling to define which human actions are used as signals and what those signals mean.
The social signalling framework proposes that if a human action is a social signal, it must meet two basic criteria. First, the sender must produce the action in order to influence the receiver. Second, the action must have a beneficial impact on the receiver. Importantly, to test the first criterion, we can vary the presence of an audience who can receive the signal. If an action is performed when the sender can be seen and is suppressed when the sender cannot be seen, we have evidence that the action is being used as a signal. To test the second criterion, we must evaluate how the receiver's behaviour or mental state changes when they perceive the action; that is, do receivers change their attitude to the sender when they receive a signal? Note that receiving a signal should benefit the receiver in the sense that they have gained information about the social world, even if that information is negatively valenced (e.g., learning that another person is hostile). However, we acknowledge that there are circumstances in which receiving additional social information via a signal may have negative consequences for the receiver (e.g., if the sender is lying and uses the signal to manipulate the receiver); such circumstances are beyond the scope of this paper.
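To make the logic of these two tests concrete, the sketch below expresses them as a toy decision rule in Python. The function name, the values, and the simple inequalities are hypothetical and purely illustrative; in a real study each comparison would be a statistical test across trials and participants rather than a single threshold check.

```python
def is_social_signal(rate_watched, rate_unwatched,
                     receiver_change_with_signal, receiver_change_without_signal):
    """Toy decision rule for the two criteria described above.

    Criterion 1 (sender): the action is produced more when it can be seen.
    Criterion 2 (receiver): perceiving the action changes the receiver's
    state or behaviour relative to not perceiving it.
    """
    sender_criterion = rate_watched > rate_unwatched
    receiver_criterion = receiver_change_with_signal != receiver_change_without_signal
    return sender_criterion and receiver_criterion

# Hypothetical example: an action occurs on 60% of trials when watched vs. 20%
# when alone, and receivers' liking ratings shift by +1.2 points after seeing it.
print(is_social_signal(0.60, 0.20,
                       receiver_change_with_signal=1.2,
                       receiver_change_without_signal=0.0))  # -> True
```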
Overall, the social signalling framework takes an incremental approach to understanding social interactions in terms of signal exchanges between senders and receivers. Comparing cognitive processes between a solo task (e.g., one participant responding to a computer) and a dynamic social interaction (e.g., two participants in conversation) is very complex. By studying the specific processes which change between a solo task and a solo task with an audience, we hope it will be possible to incrementally specify the different cognitive processes involved in different types of social behaviour.
Social signalling builds on the basic premise of second-person neuroscience, namely that engaging in social interactions involves additional neurocognitive processes and social dynamics compared with not being in an interaction (Redcay & Schilbach, 2019; Schilbach et al., 2013), and it sets out a concrete framework for establishing testable hypotheses in the context of two-person interactions. In particular, social signalling suggests that we need to study and understand the interactive behaviour of both performers (senders) and observers (receivers). By using the simple manipulation of being watched or not being watched (i.e., varying the presence of an audience/receiver), we can test which behaviours are used as signals and define what information content the signals might carry (Bacharach & Gambetta, 2001; Skyrms, 2010). In turn, by varying the presence of a social signal, we can test what effects, if any, this signal has on the receiver.
What counts as an audience?
To build a comprehensive account of social signalling, it is important to consider which manipulations count as being watched or not, and thus which engage the additional cognitive processes involved in audience effects. The extreme cases are the most clear cut: a participant engaged in a face-to-face conversation with another living person is clearly being watched, while a participant who views a cartoon of a pair of eyes on a computer alone in a room (with no cameras) might see a face-like image but is not being watched. We illustrate these examples in Fig. 2, where we divide the space of possible interactions in terms of the visible stimulus features (e.g., an image of eyes) on the x-axis, and top-down contextual knowledge on the y-axis. The face-to-face conversation includes both visual cues to another person and the knowledge that they are real and are watching (Fig. 2a), so an audience effect should be active. In contrast, the person viewing cartoon eyes has minimal visual cues together with the knowledge that they are not being watched (Fig. 2e), so the audience effect is not active.
However, there are many other possible contexts which are much more ambiguous, and where we might not know whether an audience effect is present or not; these are illustrated on the yellow diagonal (Fig. 2b–d). They include cases where a participant sees a camera and is told "you are being watched" but sees no visual cues of a person (Fig. 2b; e.g., Somerville et al., 2013). In other cases, a participant might see an image or video clip of a person who directly addresses them (Fig. 2d), which includes rich visual cues, but the participant still knows that this image or video clip cannot actually see them (e.g., Baltazar et al., 2014; Wang & Hamilton, 2012). Finally, a number of studies use a combination of instructions and visual stimuli to induce the belief that participants are or are not engaged in a social interaction, using fake video calls (Cañigueral & Hamilton, 2019a), fake mirrors (Hietanen et al., 2019), or virtual characters (Wilms et al., 2010; Fig. 2c).
In each of these cases, either the perceptual features of the stimulus or the instructions given by the experimenter may lead participants to feel as if they are being watched. In interpreting such studies, we make the assumption that the feeling of “I am watched by a person” is a categorical percept (Hari & Puce, 2010)—that is, in each case the participant either does feel watched or does not, with no feeling of being half watched. However, it is not easy to make a blanket rule for which stimuli in the “ambiguous zone” will be treated as “true watchers” by participants and which will not—subtle effects of context can have a large impact. Equally, it is possible that some participants interpret a particular context as “being watched” while other participants in the same study do not. Future studies that aim to define more clearly when an ambiguous stimulus is treated as an audience will be useful in understanding basic mechanisms involved in detecting humans and triggering audience effect processes.
In the following, we illustrate how a framework of social signalling can help us build testable hypotheses for second-person neuroscience by separately manipulating both the sender and the receiver in a two-person interaction. We particularly focus on the examples of imitation and gaze behaviour, although similar studies have examined other social behaviours including facial expressions (Crivelli & Fridlund, 2018), eye blinks (Hömke et al., 2017, 2018), and hand gestures (Holler et al., 2018; Mol et al., 2011).
Imitation as an example of social signalling
Imitation is a simple social behaviour in which the actions of one person match the actions of the other (Heyes, 2011). It is relatively easy to recognise (Thorndike, 1898; Whiten & Ham, 1992) but there are many theories of why people imitate (see Farmer et al., 2018, for a review). These include imitation to learn new skills (Flynn & Smith, 2012); imitation to improve our understanding of another person via simulation (Gallese & Goldman, 1998; Pickering & Garrod, 2013); imitation to affiliate with others (Over & Carpenter, 2013; Uzgiris, 1981); and imitation as a side effect of associative learning (Heyes, 2017). Here, we focus on the claim that imitation is used as a social signal in order to build affiliation with others, sometimes described as the social glue hypothesis (Dijksterhuis, 2005; Lakin et al., 2003; Wang & Hamilton, 2012). Note that other social signals need not be linked to affiliation, such as those behaviours aimed at signalling dominance or status (Burgoon & Dunbar, 2006).
The claim that imitation acts as a social glue assumes that two people engage in an imitation sequence as shown in Fig. 3a. Here, the woman performs a hand gesture and the man then imitates her action: this is represented in the figure as a pseudo-conversation, where red bubbles are the woman's action/cognition and blue bubbles are the man's action/cognition. When the man imitates the woman, his action is sending a signal to her. When she senses (probably implicitly) that she is being imitated, she receives the signal and may adjust her evaluation to like him more. If this interpretation of the imitation sequence under the social signalling framework is correct, we can set out two testable hypotheses. First, imitation should be produced when other people can see it (Fig. 3b), because there is no need to send a signal if the signal cannot be received. Second, being imitated should change the internal state or behaviour of the receiver, as the receiver has gained new information about the sender (Fig. 3c).
In a study of dyadic interaction, we found support for the first prediction of the social signalling framework of imitation (Krishnan-Barman & Hamilton, 2019). Pairs of naïve participants in an augmented reality space were assigned the roles of leader and follower in a cooperative game. On each trial, the leader learnt a block-moving sequence from the computer and demonstrated it to the follower, who was instructed to move the blocks in the same order; dyads received a score based on fast and accurate performance. Unbeknownst to the follower, the leader was given a secret instruction to move blocks using specific trajectories, including unusually high trajectories in some trials. In one half of the trials the leader watched the follower make their subsequent movements, while in the other half the leader had their eyes closed during the follower's turn. We found that, overall, followers tended to imitate the trajectories demonstrated by the leaders and, critically, they imitated with greater fidelity when they knew they were being watched by the leader (Fig. 3a).
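A minimal sketch of how such imitation fidelity might be quantified is shown below. The peak-height measure, the trial values, and the condition labels are hypothetical illustrations of the logic, not the actual analysis reported in Krishnan-Barman and Hamilton (2019).

```python
import numpy as np

def trajectory_error(leader_peaks, follower_peaks):
    """Mean absolute difference between leader and follower peak movement
    heights on matched trials; a smaller error means higher imitation fidelity."""
    return np.mean(np.abs(np.asarray(leader_peaks) - np.asarray(follower_peaks)))

# Hypothetical peak heights (cm) of block-moving trajectories on a few trials.
leader_watched, follower_watched = [22, 25, 21, 24], [21, 24, 22, 23]
leader_unwatched, follower_unwatched = [23, 26, 22, 25], [18, 19, 17, 18]

print("error when watched:  ", trajectory_error(leader_watched, follower_watched))
print("error when unwatched:", trajectory_error(leader_unwatched, follower_unwatched))
# A smaller error in the watched condition would be consistent with the
# prediction that imitation is produced for an audience.
```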
The finding that being watched increases imitation is also seen in other contexts. A previous study (Bavelas et al., 1986) showed that observers winced more at an experimenter sustaining a minor injury when the experimenter maintained eye contact with them than when the experimenter was looking elsewhere. Both toddlers (Vivanti & Dissanayake, 2014) and 4-month-old infants (de Klerk et al., 2018) show a greater propensity to imitate models in a video demonstration when cued with a direct rather than averted gaze. In a rapid reaction time study with adults, direct gaze enhances mimicry (Wang et al., 2011), and this effect is only seen if the gaze is present during the response period (Wang & Hamilton, 2014). However, in studies using direct gaze cues, it is hard to rule out arousal or alerting effects arising from the eyes (Senju & Johnson, 2009). Our recent paper (Krishnan-Barman & Hamilton, 2019) manipulated only the belief in being watched, because participants stood side by side and did not directly see each other's eyes. This suggests that the effect is not merely an epiphenomenon of arousal (Senju & Johnson, 2009) but is driven by the capacity to signal to the partner when the partner's eyes are open. Together, these results offer support for the hypothesis that imitation is a social signal initiated by the sender.
We now turn to the second hypothesis: if imitation is a social signal, then being imitated should (on some level) be detected by the receiver, and this new information should change the internal state or behaviour of the receiver. Several detailed reviews outline the downstream impacts of being mimicked (Chartrand & Lakin, 2012; Chartrand & van Baaren, 2009; Hale & Hamilton, 2016a; but see Hale & Hamilton, 2016b). Broadly, being mimicked appears to build rapport and increase our liking for other people (Chartrand & Bargh, 1999; Lakin & Chartrand, 2003; Stel & Vonk, 2010), and this effect is present from early in childhood (Meltzoff, 2007). Interestingly, this effect may persist even when the mimicker is a computer or virtual-reality agent (Bailenson & Yee, 2005; Suzuki et al., 2003; but see Hale & Hamilton, 2016b). In addition to building rapport, mimicking has also been shown to increase prosocial behaviour such as helping others (Müller et al., 2012) or increasing the tips that restaurant patrons give to waitresses (van Baaren et al., 2003). Thus, the positive behavioural consequences of being imitated seem well documented, though the precise neural and cognitive mechanisms which allow us to detect "being mimicked" are less well defined (Hale & Hamilton, 2016a). Taken together, these results support the hypothesis that imitation is a signal, produced by a sender when they are being watched, and resulting in changes in behaviour among the recipients of this signal.
One important question in a signalling account of imitation concerns the level of intentionality and awareness of the signals, in both the sender and the receiver. Like many nonverbal behaviours, people often seem to be unaware of when they imitate others and of when others imitate them. In fact, awareness of being imitated may reduce the social glue effect (Kulesza et al., 2016). Thus, the social signalling framework makes no claims that people consciously intend to send a signal or are explicitly aware of receiving a signal, and it is possible that all these sophisticated processes can occur without awareness, in the same way that a tennis player can hit a ball without awareness of their patterns of muscle activity. It will be an interesting question for future studies to explore how intentions and awareness interact with nonverbal social signalling behaviours.
Gaze as an example of social signalling
Gaze and eye movements are a particularly intriguing social behaviour, because the eyes are used to gather information about the world but can also signal information to others (Gobel et al., 2015). There is evidence that people change their gaze behaviour when they are being watched, which supports the first prediction of the social signalling framework. For instance, participants direct less gaze to the face of a live confederate than to the face of the same confederate in a prerecorded video clip (Laidlaw et al., 2011). Similarly, across two studies we recently showed that, when participants are in a live interaction or when they are (or believe they are) in a live video call, they gaze less to the other person than if they are seeing a prerecorded video clip (Cañigueral & Hamilton, 2019a; Cañigueral, Ward, et al., 2021).
Some studies have suggested that the gaze avoidance found in live contexts signals compliance with social norms (e.g., it is not polite to stare at someone; Foulsham et al., 2011; Gobel et al., 2015; Goffman, 1963) or reduces the arousal associated with eye contact in live interactions (Argyle & Dean, 1965; Kendon, 1967). However, other studies using tasks that involve conversation have shown that participants direct more gaze to the confederate when they are listening than when they are speaking (Cañigueral, Ward, et al., 2021; Freeth et al., 2013). Moreover, one study found that in contexts involving natural conversation participants direct more gaze to the confederate when they believe they are in a live video call (Mansour & Kuhn, 2019). Altogether, these findings suggest that, beyond being in a live interaction, the communicative context and one's role in the interaction also modulate gaze patterns (Cañigueral & Hamilton, 2019b).
In line with the second prediction of the social signalling framework, some studies show that live direct or averted gaze has effects on the receiver. Studies using pictures and virtual agents have shown that direct gaze can engage brain systems linked to reward (Georgescu et al., 2013; Kampe et al., 2001), but it can also be processed as a threat stimulus (Sato et al., 2004). In live contexts, seeing direct or averted gaze activates the approach or avoidance motivational brain system, respectively (Hietanen et al., 2008; Pönkänen et al., 2011). In conversation, eye gaze regulates turn-taking between speakers and listeners: speakers avert their gaze when they start to speak and when they hesitate, to indicate that they still have something to say, but give direct gaze to the listener when they are finishing an utterance, to indicate that they want to yield the turn (Ho et al., 2015; Kendon, 1967). Thus, the key role of eye gaze as a social signal emerges from its dual function as a cue of "being watched" and as a dynamic modulator of social interactions on a moment-by-moment basis (Cañigueral & Hamilton, 2019b).
Neural mechanisms for social signalling
Within the social signalling framework, the exchange of social signals will also modulate brain mechanisms engaged by senders and receivers. At the sender’s end, brain activity should change depending on the presence or absence of an audience, and several studies have used creative paradigms to test this hypothesis inside the fMRI scanner. For instance, using mirrors it has been shown that mutual eye contact with a live partner recruits the medial prefrontal cortex, a brain area involved in mentalizing and communication (Cavallo et al., 2015). Using a fake video-call paradigm, it has also been shown that the belief in being watched during a prosocial decision-making task recruits brain regions associated with mentalizing (medial prefrontal cortex) and reward processing (ventral striatum), which are two key processes for reputation management (Izuma et al., 2009, 2010). Similarly, the belief in being watched or that an audio feed is presented in real-time (versus prerecorded) engages mentalizing brain regions (Müller-Pinzler et al., 2016; Redcay et al., 2010; Somerville et al., 2013; Warnell et al., 2018), and the belief of chatting online with another human (versus a computer) engages reward processing areas (Redcay et al., 2010; Warnell et al., 2018). These studies all point to the idea that a particular network of brain regions previously linked to mentalizing and reward are also engaged when a participant feels they can be seen by or can communicate with another person.
Describing patterns of brain activity related to receiving social signals from other people might seem simple, in the sense that hundreds of studies have catalogued the brain regions involved in social perception. Such studies do not typically distinguish whether a particular behaviour was intended as a signal or not, but they have identified brain systems which respond to emotional faces, to gestures, to actions, and to observed gaze patterns (Andric & Small, 2012; Bhat et al., 2017; Diano et al., 2017; Pelphrey et al., 2004). When the finer distinction between a signal and a cue matters, it is helpful to consider studies which distinguish between ostensive and nonostensive behaviours. These have shown that ostensive communicative cues such as direct gaze, being offered an object, or hearing one's own name recruit brain areas related to the processing of communicative intent, mental states, and reward (Caruana et al., 2015; Kampe et al., 2003; Redcay et al., 2016; Schilbach et al., 2006; Schilbach et al., 2010; Tylén et al., 2012).
During social signalling, receivers also need to infer the intended message or "speaker meaning" embedded in a signal, which is strongly dependent on contextual information beyond the signal itself (e.g., based on assumptions about the senders' beliefs and intentions; Hagoort, 2019). For instance, sustained direct gaze between participants interacting face-to-face can result in either laughter or hostility, according to the context set by a preceding cooperative or competitive task (Jarick & Kingstone, 2015). Thus, at the receiver's end, social signalling may also recruit brain systems involved in inferring such "speaker meaning." Studies within the field of neuropragmatics have investigated this question in the context of spoken language, and have found that listening to irony, implicit answers or indirect evaluations and requests recruits the medial prefrontal cortex and temporoparietal junction (Bašnáková et al., 2014; Jang et al., 2013; Spotorno et al., 2012; van Ackeren et al., 2012). Moreover, when participants are the receivers of the indirect message, versus just overhearers, listening to indirect replies also recruits the anterior insula and pregenual anterior cingulate cortex (Bašnáková et al., 2015). These findings suggest that mentalizing and affective brain systems are required to understand the "speaker meaning" and communicative intent of a signal (Hagoort, 2019; Hagoort & Levinson, 2014).
Although the studies presented above have advanced our understanding of how the brain implements a variety of cognitive processes when being watched or when receiving a social signal, they rely on controlled laboratory settings that require participants to be alone and to stay still inside the fMRI scanner. This limits the researcher's ability to study brain systems recruited in social interactions, where participants naturally move their face, head, and body to communicate with others. Fortunately, these limitations can be overcome by techniques that allow much greater mobility (Czeszumski et al., 2020). For instance, although EEG has traditionally been highly sensitive to motion artifacts, recent developments have produced robust mobile EEG systems that can easily be used in naturalistic settings (Melnik et al., 2017). To a lesser extent, magnetoencephalography (MEG) has also been used successfully in studies involving natural conversation (Mandel et al., 2016). Finally, functional near-infrared spectroscopy (fNIRS) is a novel neuroimaging technique that can record haemodynamic signals in the brain during face-to-face interactions (Pinti et al., 2018). Crucially, the fact that EEG and fNIRS are silent and wearable means that they can easily be combined with other methodologies that capture natural social behaviours, such as motion capture (mocap), face-tracking, and eye-tracking systems.
For instance, by combining mocap and fNIRS to study imitation in a dyadic task, we recently found that being watched while performing an imitation task decreased activation in the right parietal region and the right temporoparietal junction (TPJ; Krishnan-Barman, 2021). In another study (Cañigueral, Zhang, et al., 2021), we simultaneously recorded pairs of participants (who were facing each other) with eye-tracking, face-tracking, and fNIRS to test how social behaviours and brain activity are modulated when sharing biographical information. Results showed that reciprocal interactions where information was shared recruited brain regions previously linked to reputation management (Izuma, 2012), particularly to mentalizing (TPJ) and strategic decision-making (dorsolateral prefrontal cortex [dlPFC]; Saxe & Kanwisher, 2003; Saxe & Wexler, 2005; Soutschek et al., 2015; Speitel et al., 2019).
Within the social signalling framework, it is necessary to link brain activity patterns to meaningful social signals in order to fully understand the neurocognitive systems engaged during social interactions. In an exploratory analysis, we investigated how the amount of facial displays relates to brain activity in face-to-face interactions (Cañigueral, Zhang, et al., 2021). We found that spontaneous production of facial displays (i.e., participants moving their own face) recruited the left supramarginal gyrus, whereas spontaneous observation of facial displays (i.e., participants seeing their partner move their face) recruited the right dlPFC. These brain regions have previously been linked to speech actions (Wildgruber et al., 1996) and to emotion inference from faces (A. Nakamura et al., 2014; K. Nakamura et al., 1999), respectively. Importantly, these findings also suggest that these brain regions track facial displays over time (as the interaction develops), and they reveal that there may be specific brain systems involved in the dynamic processing of social signals beyond those traditionally linked to motor control and face perception.
Other studies have taken advantage of EEG and fNIRS to study how two brains synchronize when two people interact face-to-face and exchange specific social signals. For instance, dual-brain and video recordings of hand movements show that cross-brain synchrony increases during spontaneous imitation of hand movements (Dumas et al., 2010). In combination with eye-tracking systems, it has also been shown that cross-brain synchrony between partners increases during moments of mutual eye contact (Hirsch et al., 2017; Leong et al., 2017; Piazza et al., 2020). To further test which specific aspects of social signalling modulate cross-brain synchrony, we combined behavioural and neural dyadic recordings with a novel analytical approach that carefully controls for task- and behaviour-related effects (cross-brain GLM; Kingsbury et al., 2019; Cañigueral, Zhang, et al., 2021). We found that, after controlling for task structure and social behaviours, cross-brain synchrony between mentalizing (right TPJ) and strategic decision-making regions (left dlPFC) increased when participants were sharing information. In line with the mutual prediction theory (Hamilton, 2021; Kingsbury et al., 2019), this finding suggests that cross-brain synchrony allows us to appropriately anticipate and react to each other's social signals in the context of an ongoing shared interaction.
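The general logic of such an analysis, regressing shared task structure out of each participant's signal and then correlating the residuals across brains, can be sketched as follows. This is a simplified illustration under our own assumptions (ordinary least squares, a single boxcar regressor, Pearson correlation of residuals), not the exact cross-brain GLM implemented by Kingsbury et al. (2019) or Cañigueral, Zhang, et al. (2021); all variable names and values are hypothetical.

```python
import numpy as np

def residualise(signal, regressors):
    """Regress task/behaviour regressors out of one brain signal (OLS)
    and return the residual time course."""
    X = np.column_stack([np.ones(len(signal)), regressors])  # add intercept
    beta, *_ = np.linalg.lstsq(X, signal, rcond=None)
    return signal - X @ beta

def cross_brain_synchrony(sig_a, sig_b, task_regressors):
    """Correlate the residual activity of two participants' channels after
    removing shared task structure from each signal separately."""
    return np.corrcoef(residualise(sig_a, task_regressors),
                       residualise(sig_b, task_regressors))[0, 1]

# Toy example: two simulated fNIRS channel time courses and one hypothetical
# boxcar regressor marking "sharing information" blocks.
rng = np.random.default_rng(0)
n = 600
task = np.repeat([0.0, 1.0], n // 2)       # block design (hypothetical)
coupling = rng.normal(size=n)              # coupling not explained by the task
sig_a = 0.8 * task + 0.5 * coupling + rng.normal(size=n)
sig_b = 0.6 * task + 0.5 * coupling + rng.normal(size=n)

print(cross_brain_synchrony(sig_a, sig_b, task[:, None]))
```

In a full analysis, the residual coupling would be estimated per condition and channel pair and then compared statistically across dyads.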
Altogether, these studies demonstrate how a multimodal approach to social interactions is crucial to fully understand the neurocognitive systems underlying social signalling. Simple manipulations of the belief in being watched or communicative context show that mentalizing, reward and decision-making brain systems are engaged when participants take part in a live interaction. Beyond this, specific brain regions related to speech production and emotion processing track the dynamic exchange of social signals, while cross-brain synchrony might index the participants’ ability to anticipate and react to these signals. Future studies that carefully manipulate the social context and combine novel technologies to capture both brain activity and social behaviours will be critical to discern the role of each of these mechanisms in social signalling, as well as how they are all coordinated to enable real-world face-to-face social interactions.
Taking the social signalling framework further
The ideas about social signalling outlined here provide a very minimal version of this framework. We suggest that the simple manipulation of "being watched" or not by another person provides a core test of social signalling, and that the two key features required to identify a signal are that the sender intends (on some level) to send a signal and that the receiver reacts (in some way) to the signal. Researchers in animal communication often examine further criteria. For example, persistence by the sender provides evidence that a signal is important: if the sender does not see any reaction from the recipient and the signal matters, the sender will keep sending it. Such behaviour implies that the sender has a goal of "she must get the message" and will persist until the goal is achieved. Other studies of animal behaviour consider whether a particular signal is honest or deceptive (Dawkins & Guilford, 1991), and how recipients can distinguish these. Although our basic framework does not include these additional features, testing for them could support a more complex and detailed approach to social signalling.
Similarly, it is important to consider how social signalling might be modulated along different dimensions. For instance, while in many situations social signals are overt, in some cases covert social signalling facilitates effective cooperation: covert signals can be accurately received by their intended audience to foster affiliation, while remaining unnoticed by others who might respond with dislike (Smaldino et al., 2018). From the point of view of the audience, social signalling also differs in its directedness; that is, signals can be directed to "me," or to a third person, or can be undirected (e.g., face-covering tattoos; Gambetta, 2009). Importantly, depending on their directedness, social signals will recruit different brain systems (Tylén et al., 2012). Another intriguing aspect is the context in which social signalling takes place, and how we adapt (or fail to adapt) social signals to each of these contexts. For example, nonverbal behaviours such as nodding or hand movements play a central role in face-to-face communication, conveying the full complexity of a spoken message (Kendon, 1967, 1970), yet we continue to perform them in a telephone conversation even though they can no longer be seen by the receiver. Thus, the social signalling framework should allow for the nuance present in real-world contexts when testing which behaviours count as social signals and which do not. Finally, inspired by fields like conversation analysis (Schegloff, 2007), investigating social interactions within a social signalling framework entails considering signals as components within sequences of interaction rather than as isolated entities, where prior and subsequent signals determine the relevance and meaning of the current signal. Acknowledging these dimensions when designing and interpreting studies will be critical to avoid an oversimplified view of social signalling.
It is also helpful to consider how this signalling framework relates to other approaches to the study of social interaction, and we briefly describe two rival approaches. First, some have suggested that we should focus entirely on the interaction and the emergent features of that situation (De Jaegher et al., 2010). Such dynamical systems approaches often eschew traditional descriptions of single-brain cognition, and of individuals as "senders" and "receivers." There may well be situations, such as understanding the dynamic coordination of pianists playing a duet, where dividing the interaction into two distinct roles does not help. However, as the examples above illustrate, there are many situations where it is useful to understand who is sending a signal, what that signal is, and how the recipient responds.
Second, some theories of social behaviour draw on studies of linguistic communication to interpret actions in terms of many different levels of relevance (Sperber & Wilson, 1995), where each action is tailored to the needs of the recipient. Such careful and detailed communication may be found for verbal behaviour, but it is not clear that the same models can be imported as a framework for all types of social interaction. Our approach here deliberately draws on work from animal cognition, which makes minimal assumptions about the complexity of the cognitive processes underlying social signalling. A key challenge is to understand if signalling can be driven by simple rules or if it requires the full complexity of linguistic communication.
In the present paper, we aim to highlight a “mid-level” type of explanation as a useful framework for interpreting current studies and guiding future studies. We suggest that a social signalling framework gives us a way to understand two-person interactions at the single-brain level, in the context of all our existing cognitive neuroscience. We aim to define precisely what signals each individual sends and receives during an interaction and the cognitive processes involved. These cognitive processes happen within a single brain but only in the context of a dynamic interaction with another person.
Concluding remarks
Recent calls for second-person neuroscience have resulted in a significant body of research focused on two-person interactions. However, it is not yet clear which neurocognitive mechanisms underlie these real-time dynamic social behaviours, or how novel interactive methods can relate to findings from traditional single-brain studies of social cognition. The social signalling framework proposes that communication is embodied in social behaviours, and so must be instantiated in the physical world via signals embedded in motor actions, eye gaze or facial expressions. We propose that a social signalling framework can help us make sense of face-to-face interactions by taking step-by-step advances from traditional one-person studies to novel two-person paradigms (e.g., subtle manipulations of being watched). Key to this work is to understand the details of the signals—to identify a specific signal, link it to a context, understand when it is produced, and understand what effect it has on the receiver. We believe that, without a detailed understanding of signalling behaviours, it will be hard to make sense of the new wave of data emerging from second-person neuroscience methods.
Funding
A.H. and R.C. were funded by the Leverhulme Trust on Grant No. RPG-2016-251 (PI: A.H.). S.K.B. was funded by an ESRC PhD studentship.
Data availability
Not applicable.
Code availability
Not applicable.
Declarations
Ethics approval
Not applicable.
Consent to participate
Not applicable.
Consent for publication
Not applicable.
Conflicts of interest/Competing interests
The authors have no financial or nonfinancial conflicts of interest to disclose.
Open practices statement
Not applicable.
References
- Andric M, Small SL. Gesture’s neural language. Frontiers in Psychology. 2012;3(99):1–12. doi: 10.3389/fpsyg.2012.00099. [DOI] [PMC free article] [PubMed] [Google Scholar]
- Argyle M, Dean J. Eye-Contact, Distance and Affilitation. Sociometry. 1965;28(3):289–304. doi: 10.2307/2786027. [DOI] [PubMed] [Google Scholar]
- Babiloni F, Astolfi L. Social neuroscience and hyperscanning techniques: Past, present and future. Neuroscience & Biobehavioral Reviews. 2014;44:76–93. doi: 10.1016/j.neubiorev.2012.07.006. [DOI] [PMC free article] [PubMed] [Google Scholar]
- Bacharach M, Gambetta D. Trust in signs. In: Cook KS, editor. Trust in society. Russell Sage Foundation; 2001. pp. 148–184. [Google Scholar]
- Bailenson JN, Yee N. Digital chameleons: Automatic assimilation of nonverbal gestures in immersive virtual environments. Psychological Science. 2005;16(10):814–819. doi: 10.1111/j.1467-9280.2005.01619.x. [DOI] [PubMed] [Google Scholar]
- Baltazar M, Hazem N, Vilarem E, Beaucousin V, Picq JL, Conty L. Eye contact elicits bodily self-awareness in human adults. Cognition. 2014;133(1):120–127. doi: 10.1016/j.cognition.2014.06.009. [DOI] [PubMed] [Google Scholar]
- Bašnáková J, van Berkum J, Weber K, Hagoort P. A job interview in the MRI scanner: How does indirectness affect addressees and overhearers? Neuropsychologia. 2015;76:79–91. doi: 10.1016/j.neuropsychologia.2015.03.030. [DOI] [PubMed] [Google Scholar]
- Bašnáková J, Weber K, Petersson KM, van Berkum J, Hagoort P. Beyond the language given: The neural correlates of inferring speaker meaning. Cerebral Cortex. 2014;24(10):2572–2578. doi: 10.1093/cercor/bht112. [DOI] [PubMed] [Google Scholar]
- Bavelas JB, Black A, Lemery CR, Mullett J. “I show how you feel”: Motor mimicry as a communicative act. Journal of Personality and Social Psychology. 1986;50(2):322–329. doi: 10.1037/0022-3514.50.2.322. [DOI] [Google Scholar]
- Bhat, A. N., Hoffman, M. D., Trost, S. L., Culotta, M. L., Eilbott, J., Tsuzuki, D., & Pelphrey, K. A. (2017). Cortical activation during action observation, action execution, and interpersonal synchrony in adults: A functional near-infrared spectroscopy (fNIRS) study. Frontiers in Human Neuroscience, 11(September). 10.3389/fnhum.2017.00431 [DOI] [PMC free article] [PubMed]
- Bond CFJ. Social facilitation: A self-presentational view. Journal of Personality and Social Psychology. 1982;42(6):1042–1050. doi: 10.1037/0022-3514.42.6.1042. [DOI] [Google Scholar]
- Burgoon JK, Dunbar NE. Nonverbal expressions of dominance and power in human relationships. In: Manusov V, Patterson ML, editors. The SAGE handbook of nonverbal communication. SAGE Publications; 2006. pp. 279–298. [Google Scholar]
- Cage EA. Mechanisms of social influence: Reputation management in typical and autistic individuals (Doctoral thesis) University of London; 2015. [Google Scholar]
- Cañigueral R, Hamilton A. F. de C. Being watched: Effects of an audience on eye gaze and prosocial behaviour. Acta Psychologica. 2019;195:50–63. doi: 10.1016/j.actpsy.2019.02.002. [DOI] [PMC free article] [PubMed] [Google Scholar]
- Cañigueral R, Hamilton A. F. de C. The Role of eye gaze during natural social interactions in typical and autistic people. Frontiers in Psychology. 2019;10(560):1–18. doi: 10.3389/fpsyg.2019.00560. [DOI] [PMC free article] [PubMed] [Google Scholar]
- Cañigueral, R., Ward, J. A., & Hamilton, A. F. de C. (2021). Effects of being watched on eye gaze and facial displays of typical and autistic individuals during conversation. Autism,25(1), 210–226. [DOI] [PMC free article] [PubMed]
- Cañigueral, R., Zhang, X., Noah, J. A., Tachtsidis, I., Hamilton, A. F. de C., & Hirsch, J. (2021). Facial and neural mechanisms during interactive disclosure of biographical information. NeuroImage, 226, 117572. 10.1016/j.neuroimage.2020.117572 [DOI] [PMC free article] [PubMed]
- Caruana N, Brock J, Woolgar A. A frontotemporoparietal network common to initiating and responding to joint attention bids. NeuroImage. 2015;108:34–46. doi: 10.1016/j.neuroimage.2014.12.041. [DOI] [PubMed] [Google Scholar]
- Cavallo A, Lungu O, Becchio C, Ansuini C, Rustichini A, Fadiga L. When gaze opens the channel for communication: Integrative role of IFG and MPFC. NeuroImage. 2015;119:63–69. doi: 10.1016/j.neuroimage.2015.06.025. [DOI] [PubMed] [Google Scholar]
- Chartrand TL, Bargh JA. The chameleon effect: The perception–behavior link and social interaction. Journal of Personality and Social Psychology. 1999;76(6):893–910. doi: 10.1037/0022-3514.76.6.893. [DOI] [PubMed] [Google Scholar]
- Chartrand TL, Lakin JL. The antecedents and consequences of human behavioral mimicry. Annual Review of Psychology. 2012;64(1):285–308. doi: 10.1146/annurev-psych-113011-143754. [DOI] [PubMed] [Google Scholar]
- Chartrand, T. L., & van Baaren, R. B. (2009). Human mimicry. In M. P. Zanna (Ed.), Advances in experimental social psychology (Vol. 41, pp. 219–274). 10.1016/S0065-2601(08)00405-X
- Conty L, George N, Hietanen JK. Watching Eyes effects: When others meet the self. Consciousness and Cognition. 2016;45:184–197. doi: 10.1016/j.concog.2016.08.016. [DOI] [PubMed] [Google Scholar]
- Crivelli C, Fridlund AJ. Facial displays are tools for social influence. Trends in Cognitive Sciences. 2018;22(5):388–399. doi: 10.1016/j.tics.2018.02.006. [DOI] [PubMed] [Google Scholar]
- Czeszumski A, Eustergerling S, Lang A, Menrath D, Gerstenberger M, Schuberth S, Schreiber F, Rendon ZZ, König P. Hyperscanning: A valid method to study neural inter-brain underpinnings of social interaction. Frontiers in Human Neuroscience. 2020;14(February):1–17. doi: 10.3389/fnhum.2020.00039. [DOI] [PMC free article] [PubMed] [Google Scholar]
- Dawkins MS, Guilford T. The corruption of honest signalling. Animal Behaviour. 1991;41(5):865–873. doi: 10.1016/S0003-3472(05)80353-7. [DOI] [Google Scholar]
- De Jaegher H, Di Paolo E, Gallagher S. Can social interaction constitute social cognition? Trends in Cognitive Sciences. 2010;14(10):441–447. doi: 10.1016/j.tics.2010.06.009. [DOI] [PubMed] [Google Scholar]
- de Klerk CCJM, Hamilton A. F. de C., Southgate V. Eye contact modulates facial mimicry in 4-month-old infants: An EMG and fNIRS study. Cortex. 2018;106:93–103. doi: 10.1016/j.cortex.2018.05.002. [DOI] [PMC free article] [PubMed] [Google Scholar]
- Diano M, Tamietto M, Celeghin A, Weiskrantz L, Tatu MK, Bagnis A, Duca S, Geminiani G, Cauda F, Costa T. Dynamic changes in amygdala psychophysiological connectivity reveal distinct neural networks for facial expressions of basic emotions. Scientific Reports. 2017;7(February):1–13. doi: 10.1038/srep45260. [DOI] [PMC free article] [PubMed] [Google Scholar]
- Dijksterhuis, A. (2005). Why are we social animals: The high road to imitation as social glue. In S. Hurley & N. Chater (Eds.), Perspectives on imitation: From neuroscience to social science (Vol. 2., pp. 207–220). MIT Press.
- Dumas, G., Nadel, J., Soussignan, R., Martinerie, J., & Garnero, L. (2010). Inter-brain synchronization during social interaction. PLOS ONE, 5(8), Article e12166. 10.1371/journal.pone.0012166 [DOI] [PMC free article] [PubMed]
- Freeth, M., Foulsham, T., & Kingstone, A. (2013). What affects social attention? Social presence, eye contact and autistic traits. PLOS ONE, 8(1). 10.1371/journal.pone.0053286 [DOI] [PMC free article] [PubMed]
- Farmer H, Ciaunica A, Hamilton A. F. de C. The functions of imitative behaviour in humans. Mind & Language. 2018;33(4):378–396. doi: 10.1111/mila.12189. [DOI] [PMC free article] [PubMed] [Google Scholar]
- Flynn E, Smith K. Investigating the mechanisms of cultural acquistion: How pervasive is overimitation in adults? Social Psychology. 2012;43(4):185–195. doi: 10.1027/1864-9335/a000119. [DOI] [Google Scholar]
- Foulsham T, Walker E, Kingstone A. The where, what and when of gaze allocation in the lab and the natural environment. Vision Research. 2011;51(17):1920–1931. doi: 10.1016/j.visres.2011.07.002. [DOI] [PubMed] [Google Scholar]
- Gambetta, D. (2009). Codes of the underworld: How criminals communicate. Princeton University Press.
- Fridlund AJ. Sociality of solitary smiling: Potentiation by an implicit audience. Journal of Personality and Social Psychology. 1991;60(2):229–240. doi: 10.1037/0022-3514.60.2.229. [DOI] [Google Scholar]
- Gallese V, Goldman A. Mirror neurons and the simulation theory of mind-reading. Trends in Cognitive Sciences. 1998;2(12):493–501. doi: 10.1016/S1364-6613(98)01262-5. [DOI] [PubMed] [Google Scholar]
- Goffman, E. (1963). Behavior in public places. Simon & Schuster.
- Geen RG. Evaluation apprehension and response withholding in solution of anagrams. Personality and Individual Differences. 1985;6(3):293–298. doi: 10.1016/0191-8869(85)90052-2. [DOI] [Google Scholar]
- Georgescu AL, Kuzmanovic B, Schilbach L, Tepest R, Kulbida R, Bente G, Vogeley K. Neural correlates of “social gaze” processing in high-functioning autism under systematic variation of gaze duration. NeuroImage: Clinical. 2013;3:340–351. doi: 10.1016/j.nicl.2013.08.014. [DOI] [PMC free article] [PubMed] [Google Scholar]
- Gobel MS, Kim HS, Richardson DC. The dual function of social gaze. Cognition. 2015;136:359–364. doi: 10.1016/j.cognition.2014.11.040. [DOI] [PubMed] [Google Scholar]
- Hagoort P. The neurobiology of language beyond single-word processing. Science. 2019;366(6461):55–58. doi: 10.1126/science.aax0289. [DOI] [PubMed] [Google Scholar]
- Hagoort P, Levinson SC. Neuropragmatics. In: Gazzaniga MS, Mangun GR, editors. The cognitive neurosciences. 5. MIT Press; 2014. pp. 667–674. [Google Scholar]
- Hale J, Hamilton A. F. de C. Cognitive mechanisms for responding to mimicry from others. Neuroscience and Biobehavioral Reviews. 2016;63:106–123. doi: 10.1016/j.neubiorev.2016.02.006. [DOI] [PubMed] [Google Scholar]
- Hale J, Hamilton A. F. de C. Testing the relationship between mimicry, trust and rapport in virtual reality conversations. Scientific Reports. 2016;6(1):35295. doi: 10.1038/srep35295. [DOI] [PMC free article] [PubMed] [Google Scholar]
- Hamilton, A. F. de C. (2021). Hyperscanning: Beyond the Hype. Neuron, 109(3), 404–407. 10.1016/j.neuron.2020.11.00 [DOI] [PubMed]
- Hamilton, A. F. de C., Lind, F. (2016). Audience effects: What can they tell us about social neuroscience theory of mind and autism? Culture and Brain, 4(2) 159–177. 10.1007/s40167-016-0044-5 [DOI] [PMC free article] [PubMed]
- Hari R, Puce A. The tipping point of animacy: How, when, and where we perceive life in a face. Psychological Science. 2010;21(12):1854–1862. doi: 10.1177/0956797610388044. [DOI] [PubMed] [Google Scholar]
- Hazem N, George N, Baltazar M, Conty L. I know you can see me: Social attention influences bodily self-awareness. Biological Psychology. 2017;124:21–29. doi: 10.1016/j.biopsycho.2017.01.007. [DOI] [PubMed] [Google Scholar]
- Heyes C. Automatic imitation. Psychological Bulletin. 2011;137(3):463–483. doi: 10.1037/a0022288. [DOI] [PubMed] [Google Scholar]
- Heyes C. When does social learning become cultural learning? Developmental Science. 2017;20(2):1–14. doi: 10.1111/desc.12350. [DOI] [PubMed] [Google Scholar]
- Hietanen JK, Helminen TM, Kiilavuori H, Kylliäinen A, Lehtonen H, Peltola MJ. Your attention makes me smile: Direct gaze elicits affiliative facial expressions. Biological Psychology. 2018;132:1–8. doi: 10.1016/j.biopsycho.2017.11.001. [DOI] [PubMed] [Google Scholar]
- Hietanen JK, Kylliäinen A, Peltola MJ. The effect of being watched on facial EMG and autonomic activity in response to another individual’s facial expressions. Scientific Reports. 2019;9(1):14759. doi: 10.1038/s41598-019-51368-6. [DOI] [PMC free article] [PubMed] [Google Scholar]
- Hietanen JK, Leppänen JM, Peltola MJ, Linna-aho K, Ruuhiala HJ. Seeing direct and averted gaze activates the approach-avoidance motivational brain systems. Neuropsychologia. 2008;46(9):2423–2430. doi: 10.1016/j.neuropsychologia.2008.02.029. [DOI] [PubMed] [Google Scholar]
- Hirsch J, Zhang X, Noah JA, Ono Y. Frontal temporal and parietal systems synchronize within and across brains during live eye-to-eye contact. NeuroImage. 2017;157(January):314–330. doi: 10.1016/j.neuroimage.2017.06.018. [DOI] [PMC free article] [PubMed] [Google Scholar]
- Ho S, Foulsham T, Kingstone A. Speaking and listening with the eyes: Gaze signaling during dyadic interactions. PLOS ONE. 2015;10(8):1–18. doi: 10.1371/journal.pone.0136905. [DOI] [PMC free article] [PubMed] [Google Scholar]
- Holler J, Kendrick KH, Levinson SC. Processing language in face-to-face conversation: Questions with gestures get faster responses. Psychonomic Bulletin & Review. 2018;25(5):1900–1908. doi: 10.3758/s13423-017-1363-z. [DOI] [PubMed] [Google Scholar]
- Hömke P, Holler J, Levinson SC. Eye blinking as addressee feedback in face-to-face conversation. Research on Language and Social Interaction. 2017;50(1):54–70. doi: 10.1080/08351813.2017.1262143. [DOI] [Google Scholar]
- Hömke, P., Holler, J., & Levinson, S. C. (2018). Eye blinks are perceived as communicative signals in human face-to-face interaction. PLOS ONE, 13(12), Article e0208030. 10.1371/journal.pone.0208030 [DOI] [PMC free article] [PubMed]
- Izuma K. The social neuroscience of reputation. Neuroscience Research. 2012;72(4):283–288. doi: 10.1016/j.neures.2012.01.003. [DOI] [PubMed] [Google Scholar]
- Izuma K, Matsumoto K, Camerer CF, Adolphs R. Insensitivity to social reputation in autism. Proceedings of the National Academy of Sciences. 2011;108(42):17302–17307. doi: 10.1073/pnas.1107038108. [DOI] [PMC free article] [PubMed] [Google Scholar]
- Izuma K, Saito DN, Sadato N. Processing of the incentive for social approval in the ventral striatum during charitable donation. Journal of Cognitive Neuroscience. 2009;22(4):621–631. doi: 10.1162/jocn.2009.21228. [DOI] [PubMed] [Google Scholar]
- Izuma K, Saito DN, Sadato N. The roles of the medial prefrontal cortex and striatum in reputation processing. Social Neuroscience. 2010;5(2):133–147. doi: 10.1080/17470910903202559. [DOI] [PubMed] [Google Scholar]
- Jang G, Yoon S, Lee S-E, Park H, Kim J, Ko JH, Park H-J. Everyday conversation requires cognitive inference: Neural bases of comprehending implicated meanings in conversations. NeuroImage. 2013;81:61–72. doi: 10.1016/j.neuroimage.2013.05.027. [DOI] [PubMed] [Google Scholar]
- Jarick M, Kingstone A. The duality of gaze: eyes extract and signal social information during sustained cooperative and competitive dyadic gaze. Frontiers in Psychology. 2015;6(September):1–7. doi: 10.3389/fpsyg.2015.01423. [DOI] [PMC free article] [PubMed] [Google Scholar]
- Kampe KKW, Frith CD, Dolan RJ, Frith U. Reward value of attractiveness and gaze. Nature: Brief. Communications. 2001;413:589. doi: 10.1038/35098149. [DOI] [PubMed] [Google Scholar]
- Kampe KKW, Frith CD, Frith U. “Hey John”: signals conveying communicative intention toward the self activate brain regions associated with “mentalizing”, regardless of modality. The Journal of Neuroscience. 2003;23(12):5258–5263. doi: 10.1523/JNEUROSCI.23-12-05258.2003. [DOI] [PMC free article] [PubMed] [Google Scholar]
- Kendon A. Some functions of gaze-direction in social interaction. Acta Psychologica. 1967;26:22–63. doi: 10.1016/0001-6918(67)90005-4. [DOI] [PubMed] [Google Scholar]
- Kendon A. Movement coordination in social interaction: Some examples described. Acta Psychologica. 1970;32:100–125. doi: 10.1016/0001-6918(70)90094-6. [DOI] [PubMed] [Google Scholar]
- Kingsbury L, Huang S, Wang J, Gu K, Golshani P, Wu YE, Hong W. Correlated neural activity and encoding of behavior across brains of socially interacting animals. Cell. 2019;178(2):429–446.e16. doi: 10.1016/j.cell.2019.05.022. [DOI] [PMC free article] [PubMed] [Google Scholar]
- Konvalinka I, Vuust P, Roepstorff A, Frith CD. Follow you, follow me: Continuous mutual prediction and adaptation in joint tapping. Quarterly Journal of Experimental Psychology. 2010;63(11):2220–2230. doi: 10.1080/17470218.2010.497843. [DOI] [PubMed] [Google Scholar]
- Krishnan-Barman S. Adults imitate to send a social signal. 2021. [DOI] [PubMed] [Google Scholar]
- Krishnan-Barman S, Hamilton A. F. de C. Adults imitate to send a social signal. Cognition. 2019;187:150–155. doi: 10.1016/j.cognition.2019.03.007. [DOI] [PubMed] [Google Scholar]
- Kulesza W, Dolinski D, Wicher P. Knowing that you mimic me: The link between mimicry, awareness and liking. Social Influence. 2016;11(1):68–74. doi: 10.1080/15534510.2016.1148072. [DOI] [Google Scholar]
- Kumano S, Hamilton A. F. de C., Bahrami B. The role of anticipated regret in choosing for others. Scientific Reports. 2021;11(1):12557. doi: 10.1038/s41598-021-91635-z. [DOI] [PMC free article] [PubMed] [Google Scholar]
- Laidlaw KEW, Foulsham T, Kuhn G, Kingstone A. Potential social interactions are important to social attention. Proceedings of the National Academy of Sciences. 2011;108(14):5548–5553. doi: 10.1073/pnas.1017022108. [DOI] [PMC free article] [PubMed] [Google Scholar]
- Lakin JL, Chartrand TL. Using nonconscious behavioral mimicry to create affiliation and rapport. Psychological Science. 2003;14(4):334–339. doi: 10.1111/1467-9280.14481. [DOI] [PubMed] [Google Scholar]
- Lakin JL, Jefferis VE, Cheng CM, Chartrand TL. The chameleon effect as social glue: Evidence for the evolutionary significance of nonconscious mimicry. Journal of Nonverbal Behavior. 2003;27(3):145–161. doi: 10.1023/A:1025389814290. [DOI] [Google Scholar]
- Leong V, Byrne E, Clackson K, Georgieva S, Lam S, Wass S. Speaker gaze increases information coupling between infant and adult brains. Proceedings of the National Academy of Sciences. 2017;114(50):13290–13295. doi: 10.1073/pnas.1702493114. [DOI] [PMC free article] [PubMed] [Google Scholar]
- Mandel A, Bourguignon M, Parkkonen L, Hari R. Sensorimotor activation related to speaker vs. listener role during natural conversation. Neuroscience Letters. 2016;614:99–104. doi: 10.1016/j.neulet.2015.12.054. [DOI] [PMC free article] [PubMed] [Google Scholar]
- Mansour H, Kuhn G. Studying “natural” eye movements in an “unnatural” social environment: The influence of social activity, framing, and sub-clinical traits on gaze aversion. Quarterly Journal of Experimental Psychology. 2019;72(8):1913–1925. doi: 10.1177/1747021818819094. [DOI] [PubMed] [Google Scholar]
- Melnik A, Legkov P, Izdebski K, Kärcher SM, Hairston WD, Ferris DP, König P. Systems, subjects, sessions: To what extent do these factors influence EEG data? Frontiers in Human Neuroscience. 2017;11(March):1–20. doi: 10.3389/fnhum.2017.00150. [DOI] [PMC free article] [PubMed] [Google Scholar]
- Meltzoff AN. “Like me”: A foundation for social cognition. Developmental Science. 2007;10(1):126–134. doi: 10.1111/j.1467-7687.2007.00574.x. [DOI] [PMC free article] [PubMed] [Google Scholar]
- Mol L, Krahmer E, Maes A, Swerts M. Seeing and being seen: The effects on gesture production. Journal of Computer-Mediated Communication. 2011;17(1):77–100. doi: 10.1111/j.1083-6101.2011.01558.x. [DOI] [Google Scholar]
- Montague PR, Berns GS, Cohen JD, McClure SM, Pagnoni G, Dhamala M, Wiest MC, Karpov I, King RD, Apple N, Fisher RE. Hyperscanning: Simultaneous fMRI during Linked Social Interactions. NeuroImage. 2002;16(4):1159–1164. doi: 10.1006/nimg.2002.1150. [DOI] [PubMed] [Google Scholar]
- Müller BCN, Maaskant AJ, van Baaren RB, Dijksterhuis AP. Prosocial consequences of imitation. Psychological Reports. 2012;110(3):891–898. doi: 10.2466/07.09.21.PR0.110.3.891-898. [DOI] [PubMed] [Google Scholar]
- Müller-Pinzler L, Gazzola V, Keysers C, Sommer J, Jansen A, Frässle S, Einhäuser W, Paulus FM, Krach S. Neural pathways of embarrassment and their modulation by social anxiety. NeuroImage. 2015;119:252–261. doi: 10.1016/j.neuroimage.2015.06.036. [DOI] [PMC free article] [PubMed] [Google Scholar]
- Nakamura A, Maess B, Knösche TR, Friederici AD. Different hemispheric roles in recognition of happy expressions. PLOS ONE. 2014;9(2):e88628. doi: 10.1371/journal.pone.0088628. [DOI] [PMC free article] [PubMed] [Google Scholar]
- Nakamura K, Kawashima R, Ito K, Sugiura M, Kato T, Nakamura A, Hatano K, Nagumo S, Kubota K, Fukuda H, Kojima S. Activation of the right inferior frontal cortex during assessment of facial emotion. Journal of Neurophysiology. 1999;82(3):1610–1614. doi: 10.1152/jn.1999.82.3.1610. [DOI] [PubMed] [Google Scholar]
- Over H, Carpenter M. The social side of imitation. Child Development Perspectives. 2013;7(1):6–11. doi: 10.1111/cdep.12006. [DOI] [Google Scholar]
- Pelphrey KA, Viola RJ, McCarthy G. When strangers pass: Processing of mutual and averted social gaze in the superior temporal sulcus. Psychological Science. 2004;15(9):598–603. doi: 10.1111/j.0956-7976.2004.00726.x. [DOI] [PubMed] [Google Scholar]
- Piazza EA, Hasenfratz L, Hasson U, Lew-Williams C. Infant and adult brains are coupled to the dynamics of natural communication. Psychological Science. 2020;31(1):6–17. doi: 10.1177/0956797619878698. [DOI] [PMC free article] [PubMed] [Google Scholar]
- Pickering MJ, Garrod S. An integrated theory of language production and comprehension. Behavioral and Brain Sciences. 2013;36(4):329–347. doi: 10.1017/S0140525X12001495. [DOI] [PubMed] [Google Scholar]
- Pinti P, Tachtsidis I, Hamilton A. F. de C., Hirsch J, Aichelburg C, Gilbert SJ, Burgess PW. The present and future use of functional near-infrared spectroscopy (fNIRS) for cognitive neuroscience. Annals of the New York Academy of Sciences. 2018;1464(1):5–29. doi: 10.1111/nyas.13948. [DOI] [PMC free article] [PubMed] [Google Scholar]
- Pönkänen LM, Peltola MJ, Hietanen JK. The observer observed: Frontal EEG asymmetry and autonomic responses differentiate between another person’s direct and averted gaze when the face is seen live. International Journal of Psychophysiology. 2011;82(2):180–187. doi: 10.1016/j.ijpsycho.2011.08.006. [DOI] [PubMed] [Google Scholar]
- Redcay E, Dodell-Feder D, Pearrow MJ, Mavros PL, Kleiner M, Gabrieli JDE, Saxe R. Live face-to-face interaction during fMRI: a new tool for social cognitive neuroscience. NeuroImage. 2010;50(4):1639–1647. doi: 10.1016/j.neuroimage.2010.01.052. [DOI] [PMC free article] [PubMed] [Google Scholar]
- Redcay E, Ludlum RS, Velnoskey KR, Kanwal S. Communicative signals promote object recognition memory and modulate the right posterior STS. Journal of Cognitive Neuroscience. 2016;28(1):8–19. doi: 10.1162/jocn_a_00875. [DOI] [PubMed] [Google Scholar]
- Redcay E, Schilbach L. Using second-person neuroscience to elucidate the mechanisms of social interaction. Nature Reviews Neuroscience. 2019;20(8):495–505. doi: 10.1038/s41583-019-0179-4. [DOI] [PMC free article] [PubMed] [Google Scholar]
- Risko EF, Richardson DC, Kingstone A. Breaking the fourth wall of cognitive science: real-world social attention and the dual function of gaze. Current Directions in Psychological Science. 2016;25(1):70–74. doi: 10.1177/0963721415617806. [DOI] [Google Scholar]
- Sato W, Yoshikawa S, Kochiyama T, Matsumura M. The amygdala processes the emotional significance of facial expressions: An fMRI investigation using the interaction between expression and face direction. NeuroImage. 2004;22(2):1006–1013. doi: 10.1016/j.neuroimage.2004.02.030. [DOI] [PubMed] [Google Scholar]
- Saxe R, Kanwisher N. People thinking about thinking people: The role of the temporo-parietal junction in “theory of mind”. NeuroImage. 2003;19:1835–1842. [DOI] [PubMed] [Google Scholar]
- Saxe R, Wexler A. Making sense of another mind: The role of the right temporo-parietal junction. Neuropsychologia. 2005;43(10):1391–1399. doi: 10.1016/j.neuropsychologia.2005.02.013. [DOI] [PubMed] [Google Scholar]
- Schegloff EA. Sequence organization in interaction. Cambridge University Press; 2007. [Google Scholar]
- Schilbach L, Timmermans B, Reddy V, Costall A, Bente G, Schlicht T, Vogeley K. Toward a second-person neuroscience. Behavioral and Brain Sciences. 2013;36(4):393–414. doi: 10.1017/S0140525X12000660. [DOI] [PubMed] [Google Scholar]
- Schilbach L, Wilms M, Eickhoff SB, Romanzetti S, Tepest R, Bente G, Shah NJ, Fink GR, Vogeley K. Minds made for sharing: Initiating joint attention recruits reward-related neurocircuitry. Journal of Cognitive Neuroscience. 2010;22(12):2702–2715. doi: 10.1162/jocn.2009.21401. [DOI] [PubMed] [Google Scholar]
- Schilbach L, Wohlschlaeger AM, Kraemer NC, Newen A, Shah NJ, Fink GR, Vogeley K. Being with virtual others: Neural correlates of social interaction. Neuropsychologia. 2006;44(5):718–730. doi: 10.1016/j.neuropsychologia.2005.07.017. [DOI] [PubMed] [Google Scholar]
- Sebanz N, Bekkering H, Knoblich G. Joint action: Bodies and minds moving together. Trends in Cognitive Sciences. 2006;10(2):70–76. doi: 10.1016/j.tics.2005.12.009. [DOI] [PubMed] [Google Scholar]
- Senju A, Johnson MH. The eye contact effect: Mechanisms and development. Trends in Cognitive Sciences. 2009;13(3):127–134. doi: 10.1016/j.tics.2008.11.009. [DOI] [PubMed] [Google Scholar]
- Skyrms B. Signals. Oxford University Press; 2010. doi: 10.1093/acprof:oso/9780199580828.001.0001. [DOI] [Google Scholar]
- Smaldino PE, Flamson TJ, McElreath R. The evolution of covert signaling. Scientific Reports. 2018;8(1):4905. doi: 10.1038/s41598-018-22926-1. [DOI] [PMC free article] [PubMed] [Google Scholar]
- Somerville LH, Jones RM, Ruberry EJ, Dyke JP. Medial prefrontal cortex and the emergence of self-conscious emotion in adolescence. Psychological Science. 2013;24(8):1554–1562. doi: 10.1177/0956797613475633. [DOI] [PMC free article] [PubMed] [Google Scholar]
- Soutschek A, Sauter M, Schubert T. The importance of the lateral prefrontal cortex for strategic decision making in the prisoner’s dilemma. Cognitive, Affective, & Behavioral Neuroscience. 2015;15(4):854–860. doi: 10.3758/s13415-015-0372-5. [DOI] [PubMed] [Google Scholar]
- Speitel C, Traut-Mattausch E, Jonas E. Functions of the right DLPFC and right TPJ in proposers and responders in the ultimatum game. Social Cognitive and Affective Neuroscience. 2019;14(3):263–270. doi: 10.1093/scan/nsz005. [DOI] [PMC free article] [PubMed] [Google Scholar]
- Sperber D, Wilson D. Relevance: Communication and cognition (2nd ed.). Blackwell; 1995. [Google Scholar]
- Spotorno N, Koun E, Prado J, Van Der Henst J-B, Noveck IA. Neural evidence that utterance-processing entails mentalizing: The case of irony. NeuroImage. 2012;63(1):25–39. doi: 10.1016/j.neuroimage.2012.06.046. [DOI] [PubMed] [Google Scholar]
- Stegmann UE. Animal communication theory: Information and influence. Cambridge University Press; 2013. [Google Scholar]
- Stel M, Vonk R. Mimicry in social interaction: Benefits for mimickers, mimickees, and their interaction. British Journal of Psychology. 2010;101(2):311–323. doi: 10.1348/000712609X465424. [DOI] [PubMed] [Google Scholar]
- Strauss B. Social facilitation in motor tasks: A review of research and theory. Psychology of Sport and Exercise. 2002;3(3):237–256. doi: 10.1016/S1469-0292(01)00019-X. [DOI] [Google Scholar]
- Suzuki N, Takeuchi Y, Ishii K, Okada M. Effects of echoic mimicry using hummed sounds on human–computer interaction. Speech Communication. 2003;40(4):559–573. doi: 10.1016/S0167-6393(02)00180-2. [DOI] [Google Scholar]
- Tennie C, Frith U, Frith CD. Reputation management in the age of the world-wide web. Trends in Cognitive Sciences. 2010;14(11):482–488. doi: 10.1016/j.tics.2010.07.003. [DOI] [PubMed] [Google Scholar]
- Teufel C, Fletcher PC, Davis G. Seeing other minds: Attributed mental states influence perception. Trends in Cognitive Sciences. 2010;14(8):376–382. doi: 10.1016/j.tics.2010.05.005. [DOI] [PubMed] [Google Scholar]
- Thorndike EL. Animal intelligence: An experimental study of the associative processes in animals. The Psychological Review: Monograph Supplements. 1898;2(4):i–109. [Google Scholar]
- Triplett N. The dynamogenic factors in pacemaking and competition. The American Journal of Psychology. 1898;9(4):507–533. doi: 10.2307/1412188. [DOI] [Google Scholar]
- Tylén K, Allen M, Hunter BK, Roepstorff A. Interaction vs. observation: Distinctive modes of social cognition in human brain and behavior? A combined fMRI and eye-tracking study. Frontiers in Human Neuroscience. 2012;6(331):1–11. doi: 10.3389/fnhum.2012.00331. [DOI] [PMC free article] [PubMed] [Google Scholar]
- Uzgiris IC. Two functions of imitation during infancy. International Journal of Behavioral Development. 1981;4(1):1–12. doi: 10.1177/016502548100400101. [DOI] [Google Scholar]
- van Ackeren MJ, Casasanto D, Bekkering H, Hagoort P, Rueschemeyer S-A. Pragmatics in action: Indirect requests engage theory of mind areas and the cortical motor network. Journal of Cognitive Neuroscience. 2012;24(11):2237–2247. doi: 10.1162/jocn_a_00274. [DOI] [PubMed] [Google Scholar]
- van Baaren RB, Holland RW, Steenaert B, van Knippenberg A. Mimicry for money: Behavioral consequences of imitation. Journal of Experimental Social Psychology. 2003;39(4):393–398. doi: 10.1016/S0022-1031(03)00014-3. [DOI] [Google Scholar]
- Vivanti G, Dissanayake C. Propensity to imitate in autism is not modulated by the model’s gaze direction: An eye-tracking study. Autism Research. 2014;7(3):392–399. doi: 10.1002/aur.1376. [DOI] [PubMed] [Google Scholar]
- Wang Y, Hamilton A. F. de C. Social top-down response modulation (STORM): A model of the control of mimicry in social interaction. Frontiers in Human Neuroscience. 2012;6(June):1–10. doi: 10.3389/fnhum.2012.00153. [DOI] [PMC free article] [PubMed] [Google Scholar]
- Wang Y, Hamilton A. F. de C. Why does gaze enhance mimicry? Placing gaze-mimicry effects in relation to other gaze phenomena. The Quarterly Journal of Experimental Psychology. 2014;67(4):747–762. doi: 10.1080/17470218.2013.828316. [DOI] [PubMed] [Google Scholar]
- Wang Y, Newport R, Hamilton A. F. de C. Eye contact enhances mimicry of intransitive hand movements. Biology Letters. 2011;7(1):7–10. [DOI] [PMC free article] [PubMed] [Google Scholar]
- Warnell KR, Sadikova E, Redcay E. Let’s chat: developmental neural bases of social motivation during real-time peer interaction. Developmental Science. 2018;21(April):1–14. doi: 10.1111/desc.12581. [DOI] [PMC free article] [PubMed] [Google Scholar]
- Whiten A, Ham R. On the nature and evolution of imitation in the animal kingdom: Reappraisal of a century of research. Advances in the Study of Behavior. 1992;21:239–283. doi: 10.1016/S0065-3454(08)60146-1. [DOI] [Google Scholar]
- Wildgruber D, Ackermann H, Klose U, Kardatzki B, Grodd W. Functional lateralization of speech production at primary motor cortex: An fMRI study. NeuroReport. 1996;7:2791–2795. doi: 10.1097/00001756-199611040-00077. [DOI] [PubMed] [Google Scholar]
- Wilms M, Schilbach L, Pfeiffer UJ, Bente G, Fink GR, Vogeley K. It’s in your eyes—Using gaze-contingent stimuli to create truly interactive paradigms for social cognitive and affective neuroscience. Social Cognitive and Affective Neuroscience. 2010;5(1):98–107. doi: 10.1093/scan/nsq024. [DOI] [PMC free article] [PubMed] [Google Scholar]
- Zajonc RB. Social facilitation. Science. 1965;149(3681):269–274. doi: 10.1126/science.149.3681.269. [DOI] [PubMed] [Google Scholar]
- Zajonc RB, Sales SM. Social facilitation of dominant and subordinate responses. Journal of Experimental Social Psychology. 1966;2(2):160–168. doi: 10.1016/0022-1031(66)90077-1. [DOI] [Google Scholar]
Data Availability Statement
Not applicable.