Abstract
We reanalyzed a data set consisting of a U.S. undergraduate sample (N = 212) from a previous study (Hertenstein et al. 2006a) showing that touch communicates distinct emotions between humans. In the current reanalysis, we found that anger was communicated at greater-than-chance levels only when at least one member of the communicating dyad was male. Sympathy was communicated at greater-than-chance levels only when at least one member of the dyad was female. Finally, happiness was communicated at greater-than-chance levels only when the entire dyad was female. The current analysis demonstrates gender asymmetries in the accuracy with which distinct emotions are communicated via touch between humans.
Keywords: Touch, Emotional communication, Tactile, Contact, Gender differences
Introduction
Touch is a rich medium of social exchange: through it, individuals form strong attachments and cooperative alliances, negotiate status differences, soothe and calm, and express sexual and romantic interest (Hertenstein et al. 2006a). Given the centrality of touch to social life, it is likely to be a highly gendered form of human communication.
There is a longstanding interest in exploring the magnitude and sources of gender differences in the expression of emotion. The predominant focus in this work has been on the face and voice (LaFrance et al. 2003; Scherer et al. 2003). In the present investigation, we examine gender differences in the communication of distinct emotions via touch in humans. We do so in the context of evolutionary and constructionist theories of gender, build upon previous work published in Sex Roles, and rely upon well-tested methodology in the field of emotion. Although the sample is limited to the U.S., the work has implications and raises questions for the communication of emotion via touch in other cultures. In relation to this point, Table 1 presents sample characteristics for the empirical articles cited.
Table 1. Sample characteristics for the empirical articles cited
Author (year) | Demographic of sample | Location |
---|---|---|
Bailenson et al. (2007) | Undergraduates | U.S. |
Banse and Scherer (1996) | Actors and undergraduates | Germany |
Becker et al. (2007) | Undergraduates | U.S. |
Birnbaum (1984) | Parents, undergraduates, preschoolers | U.S. |
Clark and Shields (1997) | High School: 14–19 yrs | U.S. |
Clynes and Nettheim (1982) | University students and staff | Australia |
Day and Carroll (2004) | Undergraduates | Canada |
Eisenberg et al. (1989) | Children (2nd and 5th grade) and undergraduates | U.S. |
Frank and Stennett (2001) | Wide age range across adulthood | Australia and U.S. |
Gohm and Clore (2000) | Undergraduates | U.S. |
Gross et al. (1994) | Undergraduates | U.S. |
Gross and John (1998) | Undergraduates | U.S. |
Halberstadt et al. (1988) | Undergraduates | U.S. |
Henley (1973) | Broad age range | U.S. |
Hertenstein and Campos (2001) | Infants | U.S. |
Hertenstein et al. (2009) | Undergraduates | U.S. |
Hertenstein et al. (2006a) | Undergraduates | U.S. & Spain |
Hess et al. (2004) | Undergraduates | U.S. and Canada |
Hess et al. (2000) | Undergraduates | Canada |
Jones and Yarbrough (1985) | Undergraduates | U.S. |
Kring and Gordon (1998) | Undergraduates | U.S. |
Mayer et al. (2000) | Broad age range | U.S. |
Plant et al. (2000) | Undergraduates | U.S. |
Robinson and Johnson (1997) | Undergraduates | U.S. |
Shiota et al. (2006) | Undergraduates | U.S. |
Simon and Nath (2004) | Broad age range | U.S. |
Smith and MacLean (2007) | University students | Canada |
Timmers et al. (2003) | Undergraduates | Netherlands |
Tracy et al. (2009) | Undergraduates | U.S. |
Weiss (1992) | Graduate students | U.S. |
Touch and Emotion in Human Communication
Some research indicates that touch predominantly communicates the hedonic tone of emotion, that is, overall warmth or distress (Hertenstein and Campos 2001; Jones and Yarbrough 1985; Knapp and Hall 1997), or that touch intensifies the meaning of emotional displays in other modalities (Knapp and Hall 1997). Recent studies by Hertenstein and colleagues have documented, however, that touch communicates several distinct emotions between humans (Hertenstein et al. 2006a; see Clynes and Nettheim 1982 for the association of distinct states with presses of a pressure-sensitive button). In this research, two strangers were placed in a room in which they were separated by a barrier. They could not see one another, but they could reach each other through an aperture in a curtain. One person touched the other on the forearm, in each instance trying to convey one of 12 emotions. After each touch, the person touched had to choose which emotion s/he thought the encoder was communicating by selecting a term in a modified forced-choice format (Frank and Stennett 2001).
In two different undergraduate samples from Spain and the U.S., participants accurately decoded anger, fear, disgust, love, gratitude, and sympathy at above-chance levels (Hertenstein et al. 2006a). Participants did not decode happiness, surprise, sadness, embarrassment, envy, and pride at above-chance levels. Accuracy rates ranged from 48% to 83% for the accurately decoded emotions, rates comparable to those observed in studies of facial displays and vocal communication with samples around the world (Elfenbein and Ambady 2002; Scherer et al. 2003). Specific tactile behaviors demonstrated by the U.S. sample were associated with each of the emotions. For example, sympathy was associated with stroking and patting, anger with hitting and squeezing, disgust with a pushing motion, gratitude with shaking of the hand, fear with trembling, and love with stroking.
Sympathy, Anger, Happiness and the Gendered Nature of Touch
The study of gender and emotion is one of the richest traditions in the field of emotion (for reviews, see Brody and Hall 2008; Citrin et al. 2004; LaFrance et al. 2003). Researchers interested in gender have focused on stereotypes (e.g., Hess et al. 2000; Robinson and Johnson 1997; Timmers et al. 2003), self-reported experience (e.g., Gross and John 1998; Hess et al. 2000; Simon and Nath 2004), verbalization of emotion (e.g., Roter et al. 2002), emotional expression (e.g., Halberstadt et al. 1988; Kring and Gordon 1998), physiology (e.g., Kring and Gordon 1998), nonverbal decoding of emotions (e.g., Hall 1990), and constructs such as emotional intelligence (e.g., Day and Carroll 2004; Mayer et al. 2000) and emotional competence (e.g., Gohm and Clore 2000). The empirical areas of inquiry within this tradition have established several regularities: women are stereotyped as being more emotional (e.g., Plant et al. 2000); women smile more (LaFrance et al. 2003) and cry more often (Gross et al. 1994); and women are more expressive of emotion in general and better at decoding emotion (Brody and Hall 2008). The present research contributes to this literature by exploring an understudied modality of emotion communication between humans—touch.
With respect to touch in human communication, scattered studies have yielded certain regularities. Males appear to initiate touch more than females (Henley 1973; although see Stier and Hall 1984). Studies also find gender differences in the perceived valence of a touch (Hall et al. 2005; Hertenstein et al. 2006). In this research, women are more likely than men to perceive touch from opposite-gender strangers as unpleasant and an invasion of privacy. Moreover, the more women perceive a touch from a male stranger as sexual, the less they perceive the touch as warm and friendly, whereas the more men perceive a touch from a female stranger as sexual, the more they perceive it as warm, pleasant, and friendly (for a review, see Hertenstein et al. 2006b).
In the present study, we tested predictions regarding gender differences in the accuracy with which individuals can communicate distinct emotions through touch, relying on previously published data (Hertenstein et al. 2006a). That study included the requisite four dyad groups (encoder–decoder): female–female, female–male, male–male, and male–female. Overall accuracy across the 12 emotions did not vary by gender, as reported in the original article. However, gender differences were not analyzed for each emotion separately. These aggregate analyses limited the inferences that could be made regarding gender differences in the communication of emotion via touch. In the present study, we analyzed each of the emotions separately, focusing on two emotions that evolutionary and social role accounts both suggest should vary by gender: sympathy and anger. We also focused on happiness, an emotion that has shown consistent gender differences (e.g., LaFrance et al. 2003).
Evolutionary and social role accounts suggest potential and consistent gender differences in the communication of sympathy and anger via touch between humans. Sympathy is a care-taking emotion that supports other-oriented, altruistic behavior (Eisenberg et al. 1989; Goetz and Keltner 2007). Within evolutionary accounts, it is assumed that women disproportionately take on the care-taking demands of raising offspring. Within social role accounts, it is well documented that central socialization practices—parental discourse, child rearing manuals, cultural stereotypes—amplify the place of sympathy in women’s psyches (e.g., Clark 1997; Clark and Shields 1997). Both accounts suggest that women should be more likely to experience and express sympathy. Consistent with this analysis, women report experiencing more sympathy than do men (Brody and Hall 2000; Shiota et al. 2006). In the present study we predicted that when females are in an experimental dyad, sympathy will be decoded at above-chance levels.
Anger, in contrast, promotes aggression (Berkowitz 1993). Given that anger produces assertive, competitive behavior in face-to-face interactions, anger is intertwined with status contests and affordances (e.g., Tiedens and Leach 2004). Evolutionary accounts (e.g., Daly and Wilson 1994; Kenrick et al. 2004) contend that men more readily enter into confrontational encounters to rise in hierarchies and gain preferential access to mates, and should more readily experience and express anger. Guided by this theorizing, one study recently found that participants could more accurately and more quickly detect male than female angry facial displays (Becker et al. 2007). Social role accounts likewise assume that anger is a gendered emotion, one more fitting with the stereotypical roles granted to men, revolving around self-assertion, competition, and status (Kring 2000). Guided by such theorizing, it has been found, for example, that while mothers talk more about most emotions to their young daughters than their young sons—to socialize them in the ways of care-taking—they talk more with their boys about anger (Fivush et al. 2006). And in adults, it has been found that men consistently report experiencing and expressing more anger than women (Brody and Hall 2000; Kring 2000; Plant et al. 2000). We therefore predicted that dyads involving males would communicate anger with touch at above-chance levels.
Finally, one of the most consistent gender differences identified in the literature on emotion relates to the stereotypes, experience, and expression of happiness (Hess et al. 2004). Women are assumed to experience and express happiness more than men in a variety of contexts (Brody and Hall 2008; Fischer 1993; Hall et al. 2000). Stereotypes such as this have been documented as early as toddlerhood and are thought to arise from socialization practices originating from television, parental stereotypes, differential reinforcement of emotional expression, and actual observations of emotionality (Birnbaum 1984).
Researchers have documented empirical support for such stereotypes (Brody and Hall 2000); women report experiencing more happiness than men (Brody 1993) and they smile more than men (Hall et al. 2002; LaFrance et al. 2003). A number of explanations have been proposed to explain these gender differences, some of which emphasize power and status (Henley 1977, 1995; LaFrance and Henley 1994), some of which emphasize the social roles of the genders (Brody and Hall 2000; Shields 2000), and some that combine these two explanations (LaFrance and Hecht 1999; LaFrance et al. 2003). Predicated upon theory and the empirical work demonstrating that women experience and express more positive emotionality than men, we predicted that dyads comprised solely of females would communicate happiness with touch at above-chance levels.
In summary, based on theoretical and empirical evidence reviewed above, we made predictions regarding three different emotions: sympathy, anger, and happiness. More specifically, we predicted that when females are in an experimental dyad, sympathy will be decoded at above-chance levels. We also predicted that dyads involving males would accurately communicate anger with touch. Finally, we predicted that dyads comprised solely of females would accurately communicate happiness with touch.
Although the primary goal of the current study was to address the decoding of emotion via touch, a subsidiary purpose was to describe the tactile signals used to communicate sympathy, anger, and happiness between humans. The field of emotion has been advanced by precise characterizations of emotion-specific signals in the face (Ekman 1993) and voice (Scherer et al. 2003). The two most common coding systems for the face are Izard’s (1979) maximally discriminative facial movement coding system (MAX) and Ekman and Friesen’s (1978) Facial Action Coding System (FACS). These are anatomically based coding systems that require frame-by-frame video analysis of muscle movements. Researchers have also devised techniques to analyze spectrograms of vocal expressions of emotion (Scherer et al. 2003). Researchers attend to a number of technical parameters when analyzing vocal expressions of emotion, including the mean, variability, and range of frequency, as well as vocal intensity and spectral noise (Scherer et al. 2003).
Researchers have sometimes, but not always, relied on bottom-up descriptions of emotion signals rather than making a priori predictions of what should be observed (Ekman 1993). Kagan (2007) has called for those in the affective sciences to take more of a Baconian, bottom-up approach, given that the field is still in its early stages of development. In line with these traditions, we provide tactile descriptions of emotion and do so with a more modest coding system than those available for the face and voice. We describe the duration of touch and the tactile behaviors most often used to communicate sympathy, anger, and happiness. Our coding system includes several qualities of touch (e.g., squeezing, stroking, tapping, trembling, hitting, scratching) and is based on a number of other systems used by researchers investigating touch in human communication (e.g., Argyle 1975; Jones and Yarbrough 1985; Weiss 1992).
Method
Participants
The sample consisted of 212 participants (106 unacquainted dyads) from a large public university who ranged in age from 18 to 40 years (M = 20.15 years, SD = 3.20). Participants received extra credit for an introductory psychology course for participating. The self-identified ethnic background of the sample was primarily Caucasian (34%), Chinese (30%), and Korean (12%). One member of the dyad was randomly assigned to the role of encoder, the other to the role of decoder. Like Banse and Scherer (1996), we use the terms encoding and decoding because they connote the research method; no inference should be made that a “code” exists in the emotional signal. The gender breakdown of the four possible dyads was as follows (encoder–decoder): female–female (n = 24), female–male (n = 27), male–male (n = 27), and male–female (n = 28) (Hertenstein et al. 2006a).
Procedure and Materials
Upon arrival, the encoder and decoder sat at a table and were separated by an opaque black curtain. The participants could neither see nor talk to each other during the experiment, to preclude the possibility that they might provide nontactile clues to the emotion being communicated. Twelve emotion words were displayed serially to the encoder on sheets of paper in a randomized order. The encoder was instructed to think about how he or she wanted to communicate each emotion and then to make contact with the decoder’s bare arm from the elbow to the end of the hand to signal each emotion, using any form of touch he or she deemed appropriate. The decoder could not see any part of the touch because his or her arm was positioned on the encoder’s side of the curtain. We restricted the location of the touch because we wanted to limit the possibility that participants would receive any cues about their partner’s height, gender, build, etc. Participants were not told their partner’s gender, and all tactile displays were video recorded. After each tactile display, the decoder was given a forced-choice response sheet reading “Please choose the term that best describes what this person is communicating to you.” The response sheet contained the following 13 response options: anger, disgust, fear, happiness, sadness, surprise, sympathy, embarrassment, love, envy, pride, and gratitude, as well as none of these terms are correct, which was included to reduce possible artificial inflation of accuracy rates (see Frank and Stennett 2001). These response options were listed in random order across participants (Hertenstein et al. 2006a).
Measures and Coding
The dependent measure of interest for our hypotheses was the proportion of participants selecting each response option when decoding the tactile stimuli. Thus, for each of the 12 target emotions, the proportion of participants choosing each response option was computed (e.g., the proportion of participants who chose sympathy when sympathy was intended to be communicated by the encoder). This measure is in line with a long tradition in the emotion literature in which forced-choice methodologies have been employed (e.g., Ekman 1972; Frank and Stennett 2001; Tracy et al. 2009). In addition to this measure, we asked decoders on a questionnaire at the end of the study the following: “Do you think a male or a female was touching you in this experiment?” The response options were male or female.
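As an illustration of this measure only, the following sketch (Python, pandas) computes the proportion of decoders choosing each response option for each dyad type and target emotion; the data frame and column names are hypothetical and not those used in the original study.

```python
import pandas as pd

# Hypothetical long-format data: one row per decoder judgment.
# Column names (dyad_type, target_emotion, chosen_response) are illustrative.
judgments = pd.DataFrame({
    "dyad_type":       ["male-male", "male-male", "female-female", "female-female"],
    "target_emotion":  ["anger",     "anger",     "happiness",     "happiness"],
    "chosen_response": ["anger",     "fear",      "happiness",     "gratitude"],
})

# For each dyad type and target emotion, compute the proportion of decoders
# choosing each of the 13 response options (12 emotions plus "none of these").
proportions = (
    judgments
    .groupby(["dyad_type", "target_emotion"])["chosen_response"]
    .value_counts(normalize=True)
    .rename("proportion")
    .reset_index()
)
print(proportions)
```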
All of the tactile displays were coded on a second-by-second basis by research assistants who were naive to the emotion being communicated. The coding system was informed by a survey of coding systems used by researchers investigating touch (e.g., Argyle 1975; Jones and Yarbrough 1985; Weiss 1992). The specific types of touch that were coded included holding the other, squeezing, stroking, rubbing, pushing, pulling, pressing, patting, tapping, shaking, pinching, trembling, poking, hitting, scratching, massaging, tickling, slapping, lifting, picking, finger interlocking, swinging, and tossing (i.e., tossing the decoder’s hand). In addition, the duration that each encoder touched the decoder for each emotion was calculated. Interrater agreement on all of the codes, based on 20% overlap in coders’ judgments, ranged from .83 to .99.
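The article reports agreement values without naming the index used; as a rough illustration only, the sketch below (Python, scikit-learn) computes Cohen's kappa on hypothetical second-by-second codes from two coders. Both the choice of kappa and the example codes are assumptions on our part.

```python
from sklearn.metrics import cohen_kappa_score

# Hypothetical second-by-second codes from two independent coders for the same
# subset of tactile displays; each element is the touch type coded for one second.
# Cohen's kappa is assumed here -- the article reports agreement values
# (.83 to .99) without naming the agreement index.
coder_a = ["pat", "pat", "stroke", "stroke", "squeeze", "pat"]
coder_b = ["pat", "pat", "stroke", "rub",    "squeeze", "pat"]

kappa = cohen_kappa_score(coder_a, coder_b)
print(f"Cohen's kappa: {kappa:.2f}")
```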
Results
Decoding of Emotions
In Table 2, we present the proportion of participants choosing the two most common response options for each of the target emotions of interest—sympathy, anger, and happiness. We display these data for each of the four gender dyad combinations. For example, among the all-male dyads, 70.4% of the decoders chose anger and 14.8% chose fear in the condition in which the encoder attempted to communicate anger.
Table 2. Percentage of decoders choosing the target emotion and the second most common response option for anger, happiness, and sympathy, by encoder–decoder dyad group
Encoder–decoder group | Anger: Target | Anger: 2nd | Happiness: Target | Happiness: 2nd | Sympathy: Target | Sympathy: 2nd |
---|---|---|---|---|---|---|
Male–Male | 70.4** | FE 14.8 | 18.5 | GR 33.3 | 40.7 | LO 33.3 |
Male–Female | 57.1** | FE 14.3 | 28.6 | GR 25.0 | 64.3** | GR 10.7 |
Female–Female | 37.5 | DI 20.8 | 50.0** | GR 16.7 | 62.5** | SA, LO, GR 8.3 |
Female–Male | 59.3** | DI 25.9 | 25.9 | GR 18.5 | 59.3** | LO 18.5 |
Values in the “Target” columns give the percentage of decoders who chose the target emotion as their first choice (e.g., anger chosen in the anger touch condition); values in the “2nd” columns give the next most common response option. Asterisks denote accuracy rates greater than chance (25%) as tested by binomial tests
DI disgust, FE fear, SA sadness, LO love, GR gratitude
** p < .01
For each of our predictions, we conducted four binomial tests—one for each of the four possible gender dyad combinations—for each emotion of interest (i.e., sympathy, anger, and happiness). Specifically, the proportion of participants who chose each response option was assessed against chance for all of the target emotions. Following Frank and Stennett (2001), we set chance at 25% (for a rationale, see Hertenstein et al. 2006a). This strategy allowed us to test if a given gender dyad type was capable of communicating a given emotion at above chance levels. In the example described above, the binomial tests indicated that anger was chosen at above-chance levels, whereas fear was not.
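As a rough illustration, the sketch below (Python, SciPy) runs this kind of binomial test on one reported cell. The decoder counts are back-computed from the published percentage and group size, the one-tailed alternative is our assumption, and the same test with chance set at 50% applies to the gender-identification analysis reported below.

```python
from scipy.stats import binomtest

# Example: male-male dyads, anger condition. The article reports 70.4% of the
# 27 male-male decoders choosing "anger"; 19 of 27 is our back-computed count.
result = binomtest(k=19, n=27, p=0.25, alternative="greater")  # chance = 25%
print(f"anger, male-male dyads: p = {result.pvalue:.2g}")      # far below .01

# The gender-identification question uses the same test with chance at 50%,
# e.g., female decoders judging male encoders (counts back-computed from the
# reported 79% in the same way).
```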
The data presented in Table 2 support our hypotheses. Our first prediction was that when females are in an experimental dyad, sympathy would be decoded at above-chance levels. Supporting this hypothesis, sympathy was communicated at greater-than-chance levels only when at least one member of the dyad was female (on average, 62% accuracy). Importantly, the second most commonly chosen response option for all of these dyad types never exceeded chance levels. Dyads consisting solely of males did not communicate sympathy at above-chance levels.
Our second prediction was that dyads involving males would communicate anger with touch at above-chance levels. Consistent with this hypothesis, anger was communicated at greater-than-chance levels only when at least one member of the dyad was male (on average, 62% accuracy). Moreover, the second most commonly chosen response option for all of these dyad types did not exceed chance. Dyads consisting only of females did not communicate anger at above-chance levels.
Finally, we predicted that dyads comprised solely of females would communicate happiness with touch at above-chance levels. Supporting this hypothesis, only dyads consisting solely of females communicated happiness at above-chance levels, and the second most common response option, gratitude, was not chosen above chance. This finding on happiness dovetails with studies showing that women smile more (LaFrance et al. 2003), share emotions more (Rimé et al. 2002), and experience more prosocial emotions (Shiota et al. 2006). It should be mentioned that the other target emotions investigated were communicated either by all four dyad types at greater-than-chance levels (fear, disgust, love, gratitude) or by none of the four dyad types (sadness, surprise, embarrassment, envy, and pride).
It is clearly possible that decoders could reliably infer the gender of the encoder, perhaps from the quality of the touch administered. This categorization, furthermore, could have influenced their judgments of the emotion-related touches (Hess et al. 2004). As indicated, the gender of the encoder was not verbally revealed to the decoder. However, was it possible for decoders to ascertain encoders’ gender via touch? To address this question, we computed the percentage of cases in which decoders accurately inferred the gender of the encoder. Setting chance at 50%, we conducted binomial tests and found that 79% of female decoders correctly identified male encoders and 96% correctly identified female encoders (both ps < .01). For male decoders, 70% (p = .052) correctly identified male encoders and 81% (p < .01) correctly identified female encoders. These results indicate that decoders were capable of accurately decoding the encoder’s gender.
Encoding of Emotions
Did the gender composition of the dyads influence the tactile actions associated with the communication of the different emotions? In Table 3, we present the average durations of tactile contact in the four dyads for the emotions of interest. One-way omnibus ANOVAs were performed for the three emotions of interest, with gender dyad type (4 levels) entered as the independent factor and the duration of tactile contact as the dependent variable. The duration of tactile behaviors did not differ between the gender dyad types for any of the emotions (all ps > .05). Post-hoc pair-wise comparisons were conducted for each emotion to examine whether there was any difference in duration between any two gender dyad combinations. These analyses yielded no statistically significant results (all ps > .05). These analyses indicate that the observed gender differences in the accuracy of communicating sympathy, anger, and happiness could not be attributed to differences in how long participants made tactile contact.
Table 3. Mean duration (in seconds) of tactile contact for anger, happiness, and sympathy, by encoder–decoder dyad group
Encoder–Decoder group | Anger: M | Anger: SD | Happiness: M | Happiness: SD | Sympathy: M | Sympathy: SD |
---|---|---|---|---|---|---|
Male–Male | 3.47 | (1.68) | 7.46 | (4.39) | 6.91 | (3.91) |
Male–Female | 5.14 | (3.25) | 8.00 | (5.06) | 8.33 | (6.28) |
Female–Female | 5.23 | (7.63) | 8.58 | (7.42) | 7.50 | (4.20) |
Female–Male | 4.38 | (2.58) | 8.43 | (3.50) | 7.50 | (5.83) |
Average | 4.54 | (3.74) | 8.10 | (5.55) | 7.57 | (5.21) |
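The omnibus and pairwise duration analyses described above could be reproduced along the following lines; this is a minimal sketch (Python, SciPy) with hypothetical duration values, since per-trial durations are not published, and with uncorrected pairwise t-tests as an assumption because the article does not name the post-hoc procedure used.

```python
from itertools import combinations
from scipy.stats import f_oneway, ttest_ind

# Hypothetical duration data (in seconds) for one emotion condition, one list
# per gender dyad type; the actual per-trial durations are not published.
durations = {
    "male-male":     [3.1, 2.8, 4.0, 3.9],
    "male-female":   [5.0, 4.7, 6.1, 4.8],
    "female-female": [5.5, 4.9, 5.3, 5.2],
    "female-male":   [4.2, 4.6, 4.1, 4.7],
}

# One-way omnibus ANOVA with dyad type (4 levels) as the factor and
# duration of tactile contact as the dependent variable.
f_stat, p_value = f_oneway(*durations.values())
print(f"omnibus ANOVA: F = {f_stat:.2f}, p = {p_value:.3f}")

# Pairwise comparisons between every pair of dyad types (uncorrected here).
for (name_a, a), (name_b, b) in combinations(durations.items(), 2):
    t_stat, p_pair = ttest_ind(a, b)
    print(f"{name_a} vs {name_b}: t = {t_stat:.2f}, p = {p_pair:.3f}")
```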
Table 4 presents data relevant to the more specific patterns of tactile behavior that women and men relied on to communicate the emotions of interest. The values reported indicate the percentage of time encoders used each quality of touch as a function of the total time touched for the trial. Here, one finds differences that might explain the decoding differences reported earlier. Sympathy was associated most with patting for all dyads, but the second most common behavior for all-male dyads was shaking, whereas this was not the case for the other dyads. Turning to anger, one finds that this emotion was associated with squeezing for all dyad groups except all-female dyads, the group that did not communicate anger at above-chance levels. In addition, pushing was not one of the most frequent types of touch for the male–female group, whereas it was for the other dyad types. Finally, for happiness, finger interlocking was among the most frequently employed tactile behaviors for every group except the one that accurately communicated happiness—the group comprised solely of females. Patting, however, was one of the most frequent types of touch used by the all-female dyads, whereas this was not true for the other dyad types.
Table 4. Five most common tactile behaviors (percentage of total touch time; M and SD) used to communicate each emotion in each encoder–decoder dyad group
Emotion | Male–Male behavior | M | SD | Male–Female behavior | M | SD | Female–Female behavior | M | SD | Female–Male behavior | M | SD |
---|---|---|---|---|---|---|---|---|---|---|---|---|
Anger | Squeeze | 40.44 | 47.07 | Hit | 30.31 | 43.23 | Shake | 22.22 | 44.10 | Tremble | 18.00 | 38.78 |
Anger | Hit | 27.17 | 43.75 | Squeeze | 16.31 | 33.89 | Press | 18.56 | 37.06 | Hit | 16.25 | 33.74 |
Anger | Tremble | 16.94 | 35.86 | Lift | 12.06 | 26.19 | Hit | 16.62 | 30.90 | Slap | 15.75 | 29.78 |
Anger | Push | 7.39 | 24.38 | Shake | 11.81 | 24.70 | Slap | 13.89 | 33.33 | Squeeze | 13.75 | 30.76 |
Anger | Slap | 5.11 | 15.98 | Slap | 8.31 | 25.80 | Push | 11.11 | 33.33 | Push | 9.44 | 25.80 |
Happiness | Shake | 40.00 | 52.92 | Swing | 58.00 | 40.25 | Swing | 66.32 | 48.36 | Swing | 63.71 | 36.73 |
Happiness | Swing | 33.33 | 57.74 | Shake | 17.63 | 28.75 | Shake | 14.08 | 25.78 | Finger interlocking | 14.29 | 37.80 |
Happiness | Slap | 26.67 | 46.19 | Lift | 7.63 | 7.46 | Pat | 8.33 | 20.77 | Shake | 6.00 | 12.37 |
Happiness | Lift | 13.33 | 23.09 | Finger interlocking | 7.63 | 21.57 | Tap | 8.33 | 28.87 | Slap | 3.86 | 10.21 |
Happiness | Finger interlocking | 4.39 | 9.28 | Stroke | 3.13 | 8.84 | Lift | 4.17 | 8.55 | Lift | 1.57 | 4.16 |
Sympathy | Pat | 38.36 | 36.77 | Pat | 34.78 | 38.67 | Pat | 39.57 | 29.25 | Pat | 27.87 | 34.20 |
Sympathy | Shake | 15.73 | 32.11 | Tremble | 11.67 | 20.08 | Stroke | 22.22 | 31.65 | Stroke | 17.00 | 30.68 |
Sympathy | Stroke | 10.18 | 16.52 | Stroke | 10.89 | 22.39 | Shake | 6.60 | 10.06 | Tremble | 8.87 | 26.61 |
Sympathy | Push | 6.45 | 21.41 | Squeeze | 9.28 | 21.88 | Rub | 4.33 | 10.29 | Rub | 8.80 | 26.07 |
Sympathy | Rub | 4.55 | 15.08 | Rub | 8.39 | 17.02 | Squeeze | 3.07 | 8.93 | Shake | 4.07 | 13.02 |
Mean values refer to the percentage of total touch time each touch quality was used by the encoder to communicate the given emotion. Values range from 0 to 100 and greater values reflect a greater percentage of time that a particular quality of touch was utilized by the encoder
No formal hypotheses were proposed for the tactile encoding behaviors, and the large number of touch qualities coded made inferential analyses unwieldy and difficult to interpret. It is also important to note that variability in the types of touch used by the dyads was marked. For these reasons, great inferential caution should be exercised when considering the links between encoding behaviors and decoding accuracy, as well as when making comparative statements between the gender dyad types. Moreover, the great variability in the data argues against there being a “prototype” expression of each emotion, as is often posited for the face and voice (Ekman 1993); instead, there seems to be a multitude of ways in which emotion can be communicated via touch between humans.
Discussion
Here, we documented gender differences in the communication of distinct emotions via touch between humans. Guided by evolutionary and social role accounts of emotion, as well as the empirical literature, we hypothesized that women would be able to communicate sympathy and happiness through brief touches to the arm of a stranger, whereas men would be able to communicate anger. The data from the present study supported these predictions. We observed no gender-related differences in the communication of disgust, fear, envy, embarrassment, sadness, pride, love, and gratitude.
Sympathy was communicated accurately through tactile contact to the arm only in dyads comprised of at least one female; dyads consisting of only males did not communicate sympathy at above-chance levels. This result is consistent with studies documenting gender differences in self-reports of compassion (Shiota et al. 2006), self-reports of empathy (Eisenberg and Lennon 1983), and interests in care-taking (Gilligan 1982). Whether similar gender differences in the communication of sympathy would be observed in studies of emotion-related facial display or vocalization is an important question, and one that would more fully characterize the extent to which women enjoy an advantage in communicating the quintessential care-taking emotion—sympathy.
Anger, in contrast, was communicated accurately only when the dyad contained at least one male; dyads comprised of only females did not communicate anger at above-chance levels. Interestingly, the most accurate dyad group was the one comprised solely of males. These findings dovetail with the well documented tendency for men to show more aggressive behavior than women (Daly and Wilson 1994), with stereotypes of men as more angry, and with recent evidence generated by Becker et al. (2007) in the realm of facial displays indicating that humans’ perceptual systems are tuned to be particularly sensitive to angry facial expressions by males. In their research, participants more quickly and accurately classified the word angry with male faces than female faces, and more quickly and accurately judged angry faces when they were displayed by males than females.
Finally, we found that the gender composition of the dyads also affected the communication of happiness. That is, only dyads comprised solely of females communicated happiness at greater-than-chance levels. As mentioned, this finding dovetails with studies showing that women smile more (LaFrance et al. 2003), share emotions more (Rimé et al. 2002), and experience more prosocial emotions (Shiota et al. 2006). The data are also consistent with Becker et al.’s (2007) work indicating that participants (a) thought of female facial displays more often than male displays when asked to spontaneously generate a mental image of a happy face, (b) more quickly and accurately classified the word happy with female faces than male faces, (c) more quickly and accurately judged happy faces when they were displayed by females than males, and (d) perceived faces as more happy when they were feminized.
The present study adopted the design of some traditional emotion recognition investigations in the field. Given that the study was not a true experiment, causal inferences must be made cautiously. However, several features of the paradigm increase our confidence in the findings (Hertenstein et al. 2006a). First, in most previous judgment studies, observers judged highly prototypical displays or displays posed by actors, whereas in our study people decoded emotion from the idiosyncratic tactile actions of other untrained participants (see Hertenstein 2010 for a discussion regarding bottom-up approaches to emotion; also see Clynes and Nettheim 1982 for a unique approach to studying button-pressing and the association of some emotions). Second, our response format included the response option none of these terms are correct, which reduced the likelihood of inflated accuracy rates (Frank and Stennett 2001). Finally, we restricted the tactile stimulation to one location on the body, thus eliminating one aspect of tactile communication—location on the body of the touch recipient—that is likely to provide additional information with respect to the emotion communicated.
What might explain the findings we observed in terms of gender and touch? Because decoders could not see the “tactile interaction” that transpired on the other side of the screen, they must have relied upon tactile cues alone to ascertain the gender of the encoder. Our data indicated that decoders accurately perceived encoders’ gender in 70% to 96% of cases, depending on the specific gender-dyad composition. As a result, it is possible that decoders were interpreting the encoder’s touch by means of gender stereotypes (Hess et al. 2004). If this were occurring, even at non-conscious levels, stereotypes may have inflated accuracy for the emotions most readily associated with gender stereotypes, such as anger and sympathy. According to Hess et al., women are expected to display more sadness and men to display more anger. Indeed, when subjects rated the likelihood that neutral facial displays exhibited various emotions, women’s faces were expected to display more stereotypically female emotions whereas men’s faces were expected to show more stereotypically male emotions (Hess et al. 2007). Although our study was not designed to investigate whether these findings hold in the tactile modality, similar processes may underlie our results. This points to a future area of investigation.
It is also possible that decoders’ knowledge of the encoders’ gender played a role in the meaning that they attributed to touch because of stereotypes (Brody and Hall 2008). For example, decoders may be more likely to interpret a particular tactile gestalt from a female as sympathy, whereas they may interpret the same tactile gestalt from a male as a different emotion. These explanations are consistent with the accounts of stereotype theorists (e.g., Biernat 2003), as well as with empirical studies (e.g., Hess et al. 2000), which indicate that membership in a stereotyped group—in this case, the gender of both the encoder and decoder—can drive ambiguous perceptions in the direction of the stereotypes.
The above explanations focus on the decoder, but the tactile behaviors used by the encoders may well have contributed to the observed gender differences in perception. There is evidence of gender differences in the behaviors used by encoders to communicate emotion (Hertenstein et al. 2009). In the current study, the gender of decoders was never verbally revealed to encoders by the experimenter. However, given that encoders administered the tactile stimulation on their side of the opaque screen, the encoders were able to see the morphology of the decoder’s arm. Although we did not ask encoders at the end of the study whether they believed they were touching a male or female decoder, we think it likely that encoders knew the gender of decoders. It is possible that the gender of the decoder influenced the tactile behaviors used by the encoder to communicate the emotions. Moreover, the gender of the decoder may have interacted with the gender of the encoder to influence the demonstrated behavior. Indeed, there is evidence of this in Table 4. For example, in the anger condition, dyads comprised entirely of males squeezed each other 40% of the time, whereas males squeezed females less than half as often. Also noteworthy in this condition is that squeezing was not among the five most common types of touch in dyads comprised entirely of females and was evident only 14% of the time in dyads comprised of female encoders and male decoders. Overall, there was evidence that the target’s gender influenced the behavior of the encoder and that the genders of both participants interacted. Again, this is consistent with the literature indicating that group membership norms—in this case, gender norms for displaying particular emotions—influence both the decoding and encoding of emotions (Kirouac and Hess 1999).
In sum, our study documents gender differences in the communication of distinct emotions between humans via touch. Studies have shown that females more accurately identify the meaning of a variety of non-verbal cues, including expressions of emotion (Brody and Hall 2008). However, Brody and Hall persuasively argue that the goal for current researchers is to identify and document specific variables that moderate and mediate the gender differences evident in non-verbal communication. Indeed, our study demonstrates that, in the tactile modality, there is not an overall female advantage in encoding and decoding emotion as is sometimes suggested; rather, any advantage is emotion-specific.
Our study suggests a number of important directions for future research. First, it will be important for research to identify the sources of the gender differences in the communication of emotion that we observed. We examined whether these differences might be related to differences in tactile behaviors. Experimental studies could directly document whether these behavioral differences produce differences in the decoding of sympathy, anger, and happiness. Second, studies of cultures that are more stratified or more egalitarian with respect to gender could more explicitly address whether these gender differences in sympathy, anger, and happiness hold where gender roles are more or less differentiated (Wood and Eagly 2002). This kind of research would document the deeper origins of likely gender differences in the communication of emotion via touch. Third, future research should examine why some paradigms used to study tactile human communication do not yield gender differences, at least in the decoding of emotion (e.g., Bailenson et al. 2007; Hertenstein et al. 2009). Fourth, investigations specifically designed to uncover potential gender differences in how emotions are communicated via haptic devices (e.g., Bailenson et al. 2007; Smith and MacLean 2007) and button-pressing (e.g., Clynes and Nettheim 1982) would be valuable. Finally, it will be important to examine possible gender differences from a developmental perspective (Hertenstein 2002). Our sample contained a few participants who extended its age range (four participants were over 30 years old), which may have influenced the findings. It will be important for future research to systematically examine how age interacts with gender in the communication of emotion via touch.
Acknowledgments
Open Access
This article is distributed under the terms of the Creative Commons Attribution Noncommercial License which permits any noncommercial use, distribution, and reproduction in any medium, provided the original author(s) and source are credited.
References
- Argyle M. Bodily communication. Oxford: International Universities Press; 1975.
- Bailenson JN, Yee N, Brave S, Merget D, Koslow D. Virtual interpersonal touch: Expressing and recognizing emotions through haptic devices. Human-Computer Interaction. 2007;22:325–353.
- Banse R, Scherer KR. Acoustic profiles in vocal emotion expression. Journal of Personality and Social Psychology. 1996;70:614–636. doi: 10.1037/0022-3514.70.3.614.
- Becker DV, Kenrick DT, Neuberg SL, Blackwell KC, Smith DM. The confounded nature of angry men and happy women. Journal of Personality and Social Psychology. 2007;92:179–190. doi: 10.1037/0022-3514.92.2.179.
- Berkowitz L. Aggression: Its causes, consequences, and control. New York: McGraw-Hill; 1993.
- Biernat M. Toward a broader view of social stereotyping. The American Psychologist. 2003;58:1019–1027. doi: 10.1037/0003-066X.58.12.1019.
- Birnbaum DW. The etiology of children’s stereotypes about sex differences in emotionality. Sex Roles. 1984;10:677–691. doi: 10.1007/BF00287379.
- Brody LR. On understanding gender differences in the expression of emotion: Gender roles, socialization and language. In: Ablon SL, Brown D, Khantzian EJ, Mack JE, editors. Human feelings: Exploration in affect development and meaning. England: Analytic; 1993. pp. 87–121.
- Brody LR, Hall JA. Gender, emotion, and expression. In: Lewis M, Haviland-Jones JM, editors. Handbook of emotions. 2. New York: Guilford; 2000. pp. 338–349.
- Brody LR, Hall JA. Gender and emotion in context. In: Lewis M, Haviland-Jones J, editors. Handbook of emotions. 3. New York: Guilford; 2008. pp. 395–408.
- Citrin LB, Roberts T, Fredrickson BL. Objectification theory and emotions: A feminist psychological perspective on gendered affect. In: Tiedens LZ, Leach CW, editors. The social life of emotions. New York: Cambridge University Press; 2004. pp. 203–223.
- Clark C. Misery and company: Sympathy in everyday life. Chicago: University of Chicago Press; 1997.
- Clark RD, Shields G. Family communication and delinquency. Adolescence. 1997;32:81–92.
- Clynes M, Nettheim N. The living quality of music: Neurobiologic patterns of communicating feeling. In: Clynes M, editor. Music, mind and brain: The neuropsychology of music. New York: Plenum; 1982. pp. 47–82.
- Daly M, Wilson M. Evolutionary psychology of male violence. In: Archer J, editor. Male violence. New York: Routledge; 1994. pp. 253–288.
- Day AL, Carroll SA. Using an ability-based measure of emotional intelligence to predict individual performance, group performance, and group citizenship behaviors. Personality and Individual Differences. 2004;36:1443–1458. doi: 10.1016/S0191-8869(03)00240-X.
- Eisenberg N, Lennon R. Sex differences in empathy and related capacities. Psychological Bulletin. 1983;94:100–131. doi: 10.1037/0033-2909.94.1.100.
- Eisenberg N, Fabes RA, Miller PA, Fultz J, Shell R, Mathy RM, et al. Relation of sympathy and personal distress to prosocial behavior: A multimethod study. Journal of Personality and Social Psychology. 1989;57:55–66. doi: 10.1037/0022-3514.57.1.55.
- Ekman P. Universals and cultural differences in facial expressions of emotion. In: Cole J, editor. Nebraska symposium on motivation 1971. Lincoln: University of Nebraska Press; 1972. pp. 207–283.
- Ekman P. Facial expressions of emotion. The American Psychologist. 1993;48:384–392. doi: 10.1037/0003-066X.48.4.384.
- Ekman P, Friesen WV. Facial action coding system: A technique for the measurement of facial movement. Palo Alto: Consulting Psychologists; 1978.
- Elfenbein HA, Ambady N. On the universality and cultural specificity of emotion recognition: A meta-analysis. Psychological Bulletin. 2002;128:203–235. doi: 10.1037/0033-2909.128.2.203.
- Fischer AH. Sex differences in emotionality: Fact or stereotype? Feminism & Psychology. 1993;3:303–318. doi: 10.1177/0959353593033002.
- Fivush R, Reese E, Haden CA. Elaborating on elaborations: Role of maternal reminiscing style in cognitive and socioemotional development. Child Development. 2006;77:1568–1588. doi: 10.1111/j.1467-8624.2006.00960.x.
- Frank MG, Stennett J. The forced-choice paradigm and the perception of facial expressions of emotion. Journal of Personality and Social Psychology. 2001;80:75–85. doi: 10.1037/0022-3514.80.1.75.
- Gilligan C. In a different voice: Psychological theory and women’s development. Cambridge: Harvard University Press; 1982.
- Goetz JL, Keltner D. Shifting meanings of self-conscious emotions across cultures: A social-functional approach. In: Tracy JL, Robins RW, Tangney JP, editors. The self-conscious emotions: Theory and research. New York: Guilford; 2007. pp. 153–173.
- Gohm CL, Clore GL. Individual differences in emotional experience: Mapping available scales to processes. Personality and Social Psychology Bulletin. 2000;26:679–697. doi: 10.1177/0146167200268004.
- Gross JJ, John OP. Mapping the domain of expressivity: Multimethod evidence for a hierarchical model. Journal of Personality and Social Psychology. 1998;74:170–191. doi: 10.1037/0022-3514.74.1.170.
- Gross JJ, Fredrickson BL, Levenson RW. The psychophysiology of crying. Psychophysiology. 1994;31:460–468. doi: 10.1111/j.1469-8986.1994.tb01049.x.
- Halberstadt AG, Hayes CW, Pike K. Gender and gender role differences in smiling and communication consistency. Sex Roles. 1988;19:589–604. doi: 10.1007/BF00289738.
- Hall JA. Nonverbal sex difference: Accuracy of communication and expressive style. Baltimore: Johns Hopkins University Press; 1990.
- Hall JA, Carter JD, Horgan TG. Gender differences in nonverbal communication of emotion. In: Fischer AH, editor. Gender and emotion: Social psychological perspectives. New York: Cambridge University Press; 2000. pp. 97–117.
- Hall JA, Carney DR, Murphy NA. Gender differences in smiling. In: Abel MH, editor. An empirical reflection on the smile. Lewiston: Edwin Mellen; 2002. pp. 155–185.
- Hall JA, Coats EJ, Smith LeBeau L. Nonverbal behavior and the vertical dimension of social relations: A meta-analysis. Psychological Bulletin. 2005;131:898–924. doi: 10.1037/0033-2909.131.6.898.
- Henley NM. Status and sex: Some touching observations. Bulletin of the Psychonomic Society. 1973;2:91–93.
- Henley N. Body politics: Power, sex and nonverbal communication. Englewood Cliffs: Prentice-Hall; 1977.
- Henley NM. Body politics revisited: What do you know today? In: Kalbfleisch PJ, Cody MJ, editors. Gender, power, and communication in human relationships. Hillsdale: Erlbaum; 1995. pp. 27–61.
- Hertenstein MJ. Touch: Its communicative functions in infancy. Human Development. 2002;45:70–94. doi: 10.1159/000048154.
- Hertenstein MJ. Cautions in the study of infant emotional displays. Emotion Review. 2010;2:130–131. doi: 10.1177/1754073909355005.
- Hertenstein MJ, Campos JJ. Emotion regulation via maternal touch. Infancy. 2001;2:549–566. doi: 10.1207/S15327078IN0204_09.
- Hertenstein MJ, Keltner D, App B, Bulleit BA, Jaskolka AR. Touch communicates distinct emotions. Emotion. 2006a;6:528–533. doi: 10.1037/1528-3542.6.3.528.
- Hertenstein MJ, Verkamp JM, Kerestes AM, Holmes RM. The communicative functions of touch in humans, nonhuman primates, and rats: A review and synthesis of the empirical research. Genetic, Social, and General Psychology Monographs. 2006b;132:5–94. doi: 10.3200/MONO.132.1.5-94.
- Hertenstein MJ, Holmes R, McCullough M, Keltner D. The communication of emotion via touch. Emotion. 2009;9:566–573. doi: 10.1037/a0016108.
- Hess U, Senécal S, Kirouac G, Herrera P, Philippot P, Kleck RE. Emotional expressivity in men and women: Stereotypes and self-perceptions. Cognition & Emotion. 2000;14:609–642. doi: 10.1080/02699930050117648.
- Hess U, Adams RB, Kleck RE. Facial appearance, gender, and emotion expression. Emotion. 2004;4:378–388. doi: 10.1037/1528-3542.4.4.378.
- Hess U, Adams R, Kleck R. When two do the same, it might not mean the same: The perception of emotional expressions shown by men and women. In: Hess U, Philippot P, editors. Group dynamics and emotional expression. New York: Cambridge University Press; 2007. pp. 33–50.
- Izard CE. The maximally discriminative facial movement coding system (MAX). Unpublished manuscript, available from the Instructional Resource Center, University of Delaware, Newark, DE; 1979.
- Jones SE, Yarbrough AE. A naturalistic study of the meanings of touch. Communication Monographs. 1985;52:19–56. doi: 10.1080/03637758509376094.
- Kagan J. What is emotion? History, measures, and meaning. New Haven: Yale University Press; 2007.
- Kenrick D, Trost M, Sundie J. Sex roles as adaptations: An evolutionary perspective on gender differences and similarities. In: Eagly AH, Sternberg RJ, Beall AE, editors. The psychology of gender. 2. New York: Guilford; 2004. pp. 65–91.
- Kirouac G, Hess U. Group membership and the decoding of nonverbal behavior. In: Philippot P, Feldman RS, Coats EJ, editors. The social context of nonverbal behavior. New York: Cambridge University Press; 1999. pp. 182–210.
- Knapp ML, Hall JA. Nonverbal communication in human interaction. 4. Fort Worth: Harcourt Brace College; 1997.
- Kring AM. Gender and anger. In: Fischer AH, editor. Gender and emotion: Social psychological perspectives. New York: Cambridge University Press; 2000. pp. 211–231.
- Kring AM, Gordon AH. Sex differences in emotion: Expression, experience, and physiology. Journal of Personality and Social Psychology. 1998;74:686–703. doi: 10.1037/0022-3514.74.3.686.
- LaFrance M, Henley NM. On oppressing hypotheses: Or differences in nonverbal sensitivity revisited. In: Radtke HL, Stam HJ, editors. Power/gender: Social relations in theory and practice. Thousand Oaks: Sage; 1994. pp. 287–311.
- LaFrance M, Hecht MA. Option or obligation to smile: The effects of power and gender on facial expression. In: Philippot P, Feldman RS, Coats EJ, editors. The social context of nonverbal behavior. New York: Cambridge University Press; 1999. pp. 45–70.
- LaFrance M, Hecht MA, Paluck EL. The contingent smile: A meta-analysis of sex differences in smiling. Psychological Bulletin. 2003;129:305–334. doi: 10.1037/0033-2909.129.2.305.
- Mayer JD, Caruso DR, Salovey P. Emotional intelligence meets traditional standards for an intelligence. Intelligence. 2000;27:267–298. doi: 10.1016/S0160-2896(99)00016-1.
- Plant EA, Hyde JS, Keltner D, Devine PG. The gender stereotyping of emotions. Psychology of Women Quarterly. 2000;24:81–92. doi: 10.1111/j.1471-6402.2000.tb01024.x.
- Rimé B, Corsini S, Herbette G. Emotion, verbal expression, and the social sharing of emotion. In: Fussell SR, editor. The verbal communication of emotions: Interdisciplinary perspectives. Mahwah: Erlbaum; 2002. pp. 185–208.
- Robinson MD, Johnson JT. Is it emotion or is it stress? Gender stereotypes and the perception of subjective experience. Sex Roles. 1997;36:235–258. doi: 10.1007/BF02766270.
- Roter DL, Hall JA, Aoki Y. Physician gender effects in medical communication: A meta-analytic review. Journal of the American Medical Association. 2002;288:756–764. doi: 10.1001/jama.288.6.756.
- Scherer KR, Johnstone T, Klasmeyer G. Vocal expression of emotion. In: Davidson RJ, Goldsmith HH, Scherer KR, editors. Handbook of affective sciences. New York: Oxford University Press; 2003. pp. 433–456.
- Shields S. Thinking about gender, thinking about theory: Gender and emotional experience. In: Fischer AH, editor. Gender and emotion: Social psychological perspectives. New York: Cambridge University Press; 2000. pp. 3–23.
- Shiota M, Keltner D, John O. Positive emotion dispositions differentially associated with Big Five personality and attachment style. The Journal of Positive Psychology. 2006;1:61–71. doi: 10.1080/17439760500510833.
- Simon RW, Nath LE. Gender and emotion in the United States: Do men and women differ in self-reports of feelings and expressive behavior? The American Journal of Sociology. 2004;109:1137–1176. doi: 10.1086/382111.
- Smith J, MacLean K. Communicating emotion through a haptic link: Design space and methodology. International Journal of Human Computer Studies. 2007;65:376–387. doi: 10.1016/j.ijhcs.2006.11.006.
- Stier DS, Hall JA. Gender differences in touch: An empirical and theoretical review. Journal of Personality and Social Psychology. 1984;47:440–459. doi: 10.1037/0022-3514.47.2.440.
- Tiedens LZ, Leach CW. The social life of emotions. New York: Cambridge University Press; 2004.
- Timmers M, Fischer AH, Manstead ASR. Ability versus vulnerability: Beliefs about men’s and women’s emotional behavior. Cognition & Emotion. 2003;17:41–63. doi: 10.1080/02699930302277.
- Tracy JL, Robins RW, Schriber RA. Development of a FACS-verified set of basic and self-conscious emotion expressions. Emotion. 2009;9:554–559. doi: 10.1037/a0015766.
- Weiss SJ. Measurement of the sensory qualities in tactile interaction. Nursing Research. 1992;41:82–86. doi: 10.1097/00006199-199203000-00005.
- Wood W, Eagly AH. A cross-cultural analysis of the behavior of women and men: Implications for the origins of sex differences. Psychological Bulletin. 2002;128:699–727. doi: 10.1037/0033-2909.128.5.699.