Author manuscript; available in PMC 2016 Sep 1. Published in final edited form as: J Nonverbal Behav. 2015 Sep 1;39(3):215–240. doi: 10.1007/s10919-015-0208-6

Sensitivity to Spatiotemporal Percepts Predicts the Perception of Emotion

Vanessa L Castro, R Thomas Boone
PMCID: PMC4554541  NIHMSID: NIHMS686367  PMID: 26339111

Abstract

The present studies examined how sensitivity to spatiotemporal percepts such as rhythm, angularity, configuration, and force predicts accuracy in perceiving emotion. In Study 1, participants (N = 99) completed a nonverbal test battery consisting of three nonverbal emotion perception tests and two perceptual sensitivity tasks assessing rhythm sensitivity and angularity sensitivity. Study 2 (N = 101) extended the findings of Study 1 with the addition of a fourth nonverbal test and two further sensitivity tasks assessing configural sensitivity and force sensitivity. Regression analyses across both studies revealed partial support for the association between perceptual sensitivity to spatiotemporal percepts and greater emotion perception accuracy. Results indicate that accuracy in perceiving emotions may be predicted by sensitivity to specific percepts embedded within channel- and emotion-specific displays. The significance of such research lies in understanding how individuals acquire emotion perception skill and the processes by which distinct features of percepts are related to the perception of emotion.

Keywords: emotion perception, emotion recognition, rhythm, angularity, force, spatiotemporal percept

Introduction

The communication of emotion is an organic process in both humans and animals (Buck, Savin, Miller, and Caul 1969; Darwin 1872; Miller, Murphy, and Mirsky 1959). The ability to accurately perceive emotions is related to enhanced socioemotional functioning across the lifespan (Halberstadt and Hall 1980; Halberstadt, Parker, and Castro 2013; Hodgins and Zuckerman 1990; Pitterman and Nowicki 2004; Sabatelli, Buck, and Dreyer 1982; Sternglanz and DePaulo 2004; Trentacosta and Fine 2010) while the inability to accurately perceive emotions has been implicated in several psychological impairments, including autism (Ashwin, Chapman, Colle, and Baron-Cohen 2006; Ingersoll 2010) and schizophrenia (Shean, Bell, and Cameron 2007; Toomey, Schuldberg, Corrigan, and Green 2002; Toomey, Wallace, Corrigan, Schuldberg, and Green 1997). Given the wide-ranging impact of this ability, the acquisition of skill in this domain merits extensive research.

To date, the research on emotion perception has revealed several interesting individual differences. Perhaps the most well-known finding is that of gender, with women outperforming men on emotion perception tasks (Hall and Matsumoto 2004; Rosip and Hall 2004). Age is also predictive of emotion perception, and the trajectory suggests an inverted U-shaped function with gains in emotion perception skill from infancy through childhood and adolescence, peak performance in middle age, and decline in older adulthood (for reviews see Halberstadt et al. 2013; Harris 2008). There is some evidence of an emotion perception personality profile, such that people scoring high on openness, conscientiousness, and empathy are more accurate at perceiving others' emotions (Hall, Andrzejewski, and Yopchick 2009; Matsumoto et al. 2000; Terracciano, Merritt, Zonderman, and Evans 2003). In addition, cultural group differences may afford perceptual advantages, as emotion perception is enhanced for in-group members as compared to out-group members (Elfenbein and Ambady, 2002; Kang and Lau 2012).

Although empirically we know a lot about such individual differences (i.e., gender, age, personality traits, and culture), less is known regarding other perceptual experience-dependent factors that are more proximal to and embedded within the process of perceiving emotion. Recent lines of research indicate that configural features play a significant role in the process of perceiving emotion. Several theorists posit that emotional meaning is communicated through abstract spatiotemporal forms or percepts, such as rhythm, angularity, geometric configuration, and force, which are embedded in the actions and acoustics of the nonverbal media (e.g., Aronoff 2006; Aronoff, Woike, and Hyman 1992; Atkinson, Tunstall, and Dittrich 2007; Boone 1996; Cunningham and Sterling 1988; DeMeijer 1989; Hunter, Schellenberg, and Schimmack 2008; Pavlova, Sokolov, and Sokolov 2005; Scherer and Oshinsky 1977). Because these percepts vary across space and time, we refer to them as spatiotemporal percepts.

For example, Aronoff (2006) demonstrated that threat was accurately conveyed to perceivers through a set of geometric patterns (such as lines and curves) as opposed to specific facial features; similar findings have been shown with anger and happiness as well. The main contribution from Aronoff's research is that distinct emotional cues, such as facial features and body postures, are not necessary to convey emotional meaning when the relevant abstract percepts are present. Similarly, other researchers have found evidence for the existence of abstract spatiotemporal percepts, including angularity (Aronoff et al. 1992; Pavlova et al. 2005), and kinematic properties such as rhythm (Atkinson et al. 2007; Boone 1996; Cunningham and Sterling 1988; Hunter et al. 2008), and force (DeMeijer 1989).

Building upon this initial line of research, we propose that emotion perception may be predicted by sensitivity to spatiotemporal percepts. If spatiotemporal percepts are a mechanism by which emotions are communicated across different nonverbal media, then greater sensitivity to the various percepts should predict greater nonverbal accuracy. Research indicates a strong association between experience in a domain and increased nonverbal accuracy (Vrij and Semin 1996). Thus, experience may amplify sensitivity to abstract elements of percepts that intersect throughout the nonverbal space, allowing us to make better inferences about emotion. For example, rhythm sensitivity may heighten awareness of rhythm in dynamic displays of emotion, and angularity sensitivity may heighten awareness of angularity in static displays of emotion. Depending on the medium, certain spatiotemporal percepts are made more salient, resulting in increased skill in perceiving emotion in some media more than others. We investigated this possibility in a series of studies.

Study 1

The first study sought to extend previous findings by examining whether individual differences in sensitivity to rhythm (a dynamic percept) and angularity (a static percept) are related to accuracy in perceiving emotions. Participants completed a rhythm sensitivity task, an angularity sensitivity task, and a nonverbal test battery, including two subtests of the Diagnostic Analysis of Nonverbal Accuracy (DANVA2; Nowicki and Duke 1994) and the Emotion Contrast Task (ECT). It was predicted that rhythm sensitivity and angularity sensitivity would predict the perception of emotion in the following ways:

H1: Sensitivity to rhythm will significantly predict the perception of emotion on the dynamic nonverbal decoding test. That is, participants who are able to recognize and maintain a rhythm will demonstrate greater accuracy in perceiving emotions on the ECT than participants who are unable to recognize and maintain a rhythm.
H2: Sensitivity to angularity will significantly predict the perception of emotion on the static nonverbal decoding tests. That is, participants who are better able to detect subtle differences between identical figures presented at two angles will demonstrate greater accuracy in perceiving emotions on the DANVA2-AF and the DANVA2-POS than participants who are less able to detect these angular changes.

Methods

Participants

A total of 99 volunteers (47 males, 52 females) participated in this study in exchange for course credit in an introductory psychology course. Participants ranged in age from 18 to 24 years (M = 19.41, SD = 1.31).

Nonverbal Test Battery

Participants were administered a nonverbal test battery consisting of three nonverbal emotion decoding tests. All tests were scored using an accuracy model. Participants received a value of 0 for incorrect responses and a value of 1 for correct responses. Accuracy was then averaged across all items to create a mean emotion perception accuracy score for each nonverbal test. Nonverbal tests were presented either on a computer monitor or via pre-recorded video, and answers were recorded with paper and pencil.

The Diagnostic Analysis of Nonverbal Accuracy for Adult Faces (DANVA2-AF; Nowicki and Duke 1994) is a measure of an individual's ability to identify emotion in adult facial expressions. The test is composed of 24 slides, each depicting a color photograph of a male or female expressing one of four discrete emotions (happy, sad, angry, fearful). There are six photographs of each emotion, counterbalanced by gender. Participants were shown each slide, were presented with the four emotion categories (happy, sad, angry, fearful), and were asked to identify the emotion being expressed by circling the answer on an answer sheet. The DANVA2-AF has an average Cronbach's coefficient alpha of .78 (Nowicki and Duke 1994). The DANVA2-AF has been shown to accurately predict job success (Byron, Terranova, and Nowicki 2007), and lower DANVA2-AF scores have been related to higher ratings of depression and lower ratings of relationship well-being in college students (Nowicki 2003).

The Diagnostic Analysis of Nonverbal Accuracy for Postures (DANVA2-POS; Pitterman and Nowicki 2004) is composed of 32 color photographs depicting various sitting or standing postures expressed by a male or female. Each standing and sitting posture displays one of four emotions (happy, sad, angry, or fearful), resulting in eight photographs of each emotion, counterbalanced by gender. The faces in the photographs are blurred such that no facial expression is observable. The photographs are presented individually on slides. Participants are presented with the four emotion categories and are asked to identify the emotion being expressed by circling their answers on a separate answer sheet. The DANVA2-POS has a Cronbach's coefficient alpha of .78 (Nowicki and Duke 1992). The DANVA2-POS has been shown to accurately predict specific interpersonal subscales having to do with lack of friends and suspiciousness (Shean et al. 2007).

The Emotion Contrast Task (ECT) consists of a series of 12 paired videotaped dance/expressive body movement performances. The performances consist of spontaneous body movements expressed by one female dancer and one male dancer. The ECT involves identifying which of two performances, simultaneously presented in a split-screen format, matches a target emotion (happiness, sadness, anger, or fear). Emotions were balanced across dancers. Performances within a pairing are matched by gender, resulting in male-male or female-female paired performances. Participants are asked to watch the paired presentation and then indicate on an answer sheet which of the presentations (left or right) correctly matches the target emotion label. Target emotions are listed on the prepared answer sheet directly above the responses to be selected (left or right). Responses were fully counterbalanced within both target and distractor. The ECT has been used successfully to map out children's emergent ability to recognize emotional meaning in body movement.

Sensitivity Tasks

Participants also completed two perceptual sensitivity tasks. One task measured sensitivity to rhythm while the other task measured sensitivity to angularity.

Rhythm Sensitivity

The rhythm sensitivity task consists of computer-presented patterns in which participants are asked to mimic a rhythm by pressing the space bar on a computer keyboard. The task was constructed on the premise that the ability to match and maintain a rhythm is a measure of perceptual sensitivity.

After a brief training period in which participants were shown how to perform the task and given a practice trial, participants were presented with a series of four rhythmic patterns. The first three patterns are visual, consisting of rectangles presented at a specific temporal interval. The fourth pattern is auditory, consisting of beeps presented at a specific temporal interval. Participants are instructed to match the rhythm of the stimulus presented by pressing the space bar, and to continue after the stimulus presentation ceases until instructed to stop by the presentation of a red "stop" sign on the computer. The test phase is the time between the cessation of the stimulus presentation and the presentation of the stop sign; it is during this time that participants must demonstrate their ability to maintain the rhythmic pattern as a measure of their rhythm sensitivity.

Angularity Sensitivity

The angularity sensitivity task is also presented on a computer and involves the simultaneous side-by-side comparison of two images. The task requires participants to judge, across a series of 24 paired trials, whether the two figures in each pair are the same or different. Each trial presented two figures of the same shape simultaneously in a split-screen format, such that one figure appeared on the left side of the screen and one figure appeared on the right side of the screen (see Fig 1). Initially, a more elaborate design had been attempted, including a pre-test, feedback, and post-test period; however, due to problems with the feedback segment, we elected to use only the first series of 24 figures, presented without feedback. Scores on the pre- and post-test periods were correlated, r(96) = .44, p < .001, suggesting moderate test-retest reliability. However, to reduce redundancy in the measures used, only data from the pre-test trials were analyzed.

Fig 1. Sample item from the angularity sensitivity task. In this depiction, the item on the left has been rotated three degrees counterclockwise relative to the item on the right. Participants were asked to discriminate between the two lines to determine whether they are the same or different.

Stimuli for the angularity sensitivity task were first developed using a word processing program to draw static figures varying in shape (oval, straight line, and S-shaped curved line). A total of 24 paired-figure trials were composed, and figure pairs varied across trials by shape, with equal numbers of trials for each shape. In half the trials, both figures were completely identical. For the other half of the trials, one figure differed systematically from the other in rotation and direction. Differences in angularity were created using a randomized series of rotations in which one of the two figures presented in each trial was rotated by up to 9 degrees. Rotations were chosen on the basis of difficulty, with the aim of including rotations that were more subtle and difficult to detect (e.g., a 3-degree rotation) and rotations that were less subtle and thus easier to detect (e.g., a 9-degree rotation). Both the direction (clockwise or counterclockwise) and side (left or right) of rotation were counterbalanced across the 24 trials to account for any biases relating to hemispheric presentation. Participants were presented with each paired trial on the computer and were asked to select with the mouse whether the figures were the same or different. Handedness was accommodated by moving the mouse to each participant's dominant side. Initial validity testing of the angularity sensitivity task indicated that participants were able to accurately discriminate the presence of discrepant angular orientations well above chance (50%, or 12 out of 24) on this measure (M = 17.88, SD = 2.70, t(98) = 21.69, p < .001).
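As a check, the reported statistic can be reproduced from these values: t = (17.88 − 12) / (2.70 / √99) ≈ 21.7, matching the reported t(98) = 21.69 within rounding.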

Procedure

Participants were individually administered the nonverbal test battery and sensitivity tests in counterbalanced blocks. One group of participants (N = 49) completed the sensitivity tasks first while the other group of participants (N = 50) completed the nonverbal test battery first. After the nonverbal and sensitivity testing, participants were asked to complete a brief questionnaire asking for information on participant age, gender, and college major.

Results

Data Manipulation

Prior to analysis, raw data were examined for normality and outliers. Mahalanobis distance values above 18.5 were examined based on the chi-square distribution with 4 df at the .001 critical value. Four participants were identified as multivariate outliers by Mahalanobis distance and were dropped from subsequent analyses. All other assumptions for linear regression were met.
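For readers who wish to reproduce this screening step, a minimal sketch (not the authors' code) follows in Python; the cutoffs of 18.5 (4 df) and 22.5 (6 df, used in Study 2) correspond to chi-square critical values at p = .001. Variable names are illustrative.

    import numpy as np
    from scipy.stats import chi2

    def mahalanobis_outliers(X, alpha=0.001):
        """X: (n_cases, n_vars) array; returns a boolean mask of outliers."""
        mu = X.mean(axis=0)
        cov_inv = np.linalg.inv(np.cov(X, rowvar=False))
        diffs = X - mu
        # squared Mahalanobis distance for each case
        d2 = np.einsum('ij,jk,ik->i', diffs, cov_inv, diffs)
        # chi2.ppf(0.999, 4) ~ 18.47; chi2.ppf(0.999, 6) ~ 22.46
        cutoff = chi2.ppf(1 - alpha, df=X.shape[1])
        return d2 > cutoff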

Sensitivity to rhythm was calculated by first finding the absolute deviation in response time for each rhythm sequence during the test phase (the period after the stimulus presentation ceased, during which participants attempted to hold the rhythm on their own). For example, the first test sequence rhythm has a target time interval of 2.05 secs. Deviations were calculated by taking the absolute value of the difference between participants' response times during the test phase and the target time interval. These absolute deviations were then averaged across all four rhythms to create an overall rhythm deviation variable. Greater values on this rhythm deviation variable denote less accurate rhythm identification and maintenance.
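A minimal sketch of this scoring scheme, assuming per-participant logs of inter-press intervals during the test phase (names are illustrative, not taken from the original task software):

    import numpy as np

    def rhythm_deviation(test_phase_intervals, target_intervals):
        """test_phase_intervals: one array of inter-press intervals (secs)
        per rhythm; target_intervals: the target interval for each rhythm."""
        per_rhythm = [np.mean(np.abs(np.asarray(resp) - target))
                      for resp, target in zip(test_phase_intervals, target_intervals)]
        # higher values = less accurate rhythm identification and maintenance
        return np.mean(per_rhythm)

    # Example: a participant holding the first rhythm (2.05-s target)
    score = rhythm_deviation([[2.10, 1.95, 2.20]], [2.05])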

A second rhythm sensitivity variable was created to measure the ability to recognize the third rhythm pattern, which is a syncopated rhythm. The syncopated rhythm was defined as a three-beat pattern with a silent third beat, resulting in a "short" pause between the first two beats and a "long" pause at the silent third beat (i.e., beat beat pause, beat beat pause; the spaces between the beats represent the short pause pattern and the pause at the end represents the long pause pattern). In this task, the syncopated rhythm has target time intervals of .967 secs (short pause pattern) and 1.883 secs (long pause pattern). Recognition of the syncopated rhythm was assessed by comparing intervals calculated at 1 SD above and below the mean for both the short and long rhythms. If there was no overlap in the intervals for the two rhythms, we judged that the syncopated rhythm was recognized. If instead the interval for the long rhythm overlapped with the interval for the short rhythm, suggesting that the participant did not perceive a difference in the length of the two pause patterns, we judged that the syncopated rhythm was not recognized. Thus, a dichotomous syncopation variable was produced, whereby the participant either recognized or did not recognize the syncopation.
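The recognition rule can be sketched as follows (illustrative names; the intervals are the mean ± 1 SD bands described above):

    import numpy as np

    def recognized_syncopation(short_pauses, long_pauses):
        """Arrays of a participant's short- and long-pause intervals (secs)."""
        s = np.asarray(short_pauses, dtype=float)
        l = np.asarray(long_pauses, dtype=float)
        s_upper = s.mean() + s.std(ddof=1)
        l_lower = l.mean() - l.std(ddof=1)
        # no overlap (short band entirely below long band) -> recognized
        return s_upper < l_lower

    # Targets reported above: .967 s (short pause) and 1.883 s (long pause)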

Sensitivity to angularity was calculated using the variable d′ from signal detection theory (for detailed information regarding the calculations used in signal detection see Macmillan and Creelman 1991; Stanislaw and Todorov 1999; Theodor 1972). Signal detection theory was selected as opposed to traditional measures of accuracy scoring used in emotion perception research due to the discriminatory nature of the task. Further, signal detection theory allows for the separation of the signal (e.g., discrimination between two choices) from noise (e.g., error) and accounts for response bias (e.g., bias towards selecting a particular answer choice) in addition to sensitivity (e.g., correct discrimination), thus providing an ideal performance variable for perceptual tasks (Stanislaw and Todorov 1999). Greater d′ values indicate greater ability to distinguish signal from noise whereas values closer to 0 indicate an inability to distinguish signal from noise.
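Concretely, d′ is the difference between the z-transformed hit and false-alarm rates (Stanislaw and Todorov 1999); treating "different" pairs as signal trials and "same" pairs as noise trials, a minimal sketch is:

    from scipy.stats import norm

    def d_prime(hits, misses, false_alarms, correct_rejections):
        # extreme rates (0 or 1) require a correction in practice
        hit_rate = hits / (hits + misses)
        fa_rate = false_alarms / (false_alarms + correct_rejections)
        return norm.ppf(hit_rate) - norm.ppf(fa_rate)

    # Example with 12 "different" and 12 "same" trials, as in the angularity
    # task: 10 hits and 3 false alarms give d' = z(.833) - z(.25) ~ 1.64,
    # close to the sample mean of 1.54 reported in Table 1.
    print(d_prime(10, 2, 3, 9))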

Data Analysis

Prior to testing the hypotheses, gender effects were considered using bivariate correlations and regression analyses. Gender was not significantly correlated with performance on any of the nonverbal decoding tests (ps > .213) nor was gender significantly related to rhythm or angularity sensitivity (ps > .091). In addition, gender remained a nonsignificant predictor at all model steps when entered in the first step of the regression models tested. To enhance parsimony, gender was omitted from subsequent analyses.

We then examined the intercorrelations among the rhythm and angularity sensitivity tasks. As shown in Table 1, there was a moderate correlation between the two rhythm variables. There were also small-to-moderate correlations between angularity d′ and both rhythm variables, suggesting that participants with greater angularity sensitivity also demonstrated greater rhythm sensitivity. As these correlations were not large in size, multicollinearity was not a concern (Tabachnick and Fidell 2007).

Table 1. Means, Standard Deviations, and Intercorrelations for DANVA2-AF, DANVA2-POS, ECT, Gender, Rhythm Sensitivity, and Angularity Sensitivity (Study 1, N = 95).

Variable                M     SD    1    2     3    4     5      6       7
1. DANVA2-AF            0.76  0.10  --   .22*  .07  .13  -.05    .21*    .11
2. DANVA2-POS           0.69  0.13       --    .03  .04  -.10    .10     .10
3. ECT                  0.88  0.14             --   .12  -.10    .26**   .00
4. Gender^a             0.54  0.50                  --    .11   -.05    -.17†
5. Rhythm Deviation     0.18  0.09                        --    -.49*** -.31**
6. Sync Recognition^b   0.82  0.39                               --      .20*
7. Angle d′             1.54  0.73                                       --

† p < .10. * p < .05. ** p < .01. *** p < .001.
^a Coded as 0 = male, 1 = female.
^b Coded as 0 = syncopation not recognized, 1 = syncopation recognized.

Intracorrelations were then calculated between the DANVA2-AF, the DANVA2-POS, and the ECT to determine the existence of a general emotion perception latent variable produced by overlap in the measures. While there was some overlap between the two DANVA tests, there was not sufficient overlap among the nonverbal tests to justify a multivariate approach. Subsequently, each nonverbal measure was examined separately in a series of univariate analyses.

Our hypotheses were tested via a series of hierarchical regressions with each of the nonverbal measures entered as a dependent variable. Absolute rhythm deviation and syncopation recognition were entered in the first model step, with angularity d′ in the second and final step. Results are reported for each dependent variable by model step in Table 2.
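As an illustration of this analytic strategy, a minimal sketch in Python using statsmodels (the data file and column names are hypothetical, not the authors' materials):

    import pandas as pd
    import statsmodels.formula.api as smf

    df = pd.read_csv("study1_scores.csv")  # hypothetical file of scored variables

    # Step 1: rhythm predictors; Step 2: add angularity d'
    step1 = smf.ols("ect ~ rhythm_deviation + sync_recognition", data=df).fit()
    step2 = smf.ols("ect ~ rhythm_deviation + sync_recognition + angle_d", data=df).fit()

    r2_change = step2.rsquared - step1.rsquared  # R2 change at the second step
    print(step1.rsquared, step2.rsquared, r2_change)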

Table 2. Hierarchical Regressions of DANVA2-AF, DANVA2-POS, and ECT onto Rhythm Deviation, Syncopation Recognition, and Angularity Sensitivity (Study 1).

                            DANVA2-AF         DANVA2-POS        ECT
Predictor variables         R2    β (SE)      R2    β (SE)      R2    β (SE)
Step 1                      .05               .01               .07*
  Rhythm Deviation                .06 (.13)         -.07 (.17)        .04 (.17)
  Syncopation Recognition         .24 (.03)*         .06 (.04)        .28 (.04)*
Step 2                      .05               .02               .07
  Rhythm Deviation                .09 (.14)         -.05 (.18)        .03 (.18)
  Syncopation Recognition         .23 (.03)          .06 (.04)        .29 (.04)*
  Angle d′                        .09 (.02)          .07 (.02)       -.05 (.02)

† p < .10. * p < .05.

DANVA2-AF

The overall model investigating accuracy of the DANVA2-AF was not significant, F(3,91) = 1.68, p = 0.177, R2Δ = 0.01. The predicted variable (angularity d′) failed to demonstrate a significant relationship to nonverbal accuracy in the DANVA2-AF.

DANVA2-POS

The overall model investigating accuracy of the DANVA2-POS was also not significant, F(3,91) = 0.53, p =.665, R2Δ = 0.01. The predicted variable (angularity d′) did not demonstrate a significant relationship to nonverbal accuracy in the DANVA2-POS.

ECT

The model investigating accuracy on the ECT yielded a nonsignificant overall model, F(3,91) = 2.36, p = .077, R2Δ = .00. However, the first model step was significant, F(2,92) = 3.46, p = .035, R2Δ = .07. The ability to recognize the syncopated rhythm was related to greater accuracy on the ECT, t(92) = 2.46, p = .016; that is, participants who were able to recognize a syncopated rhythm were significantly more accurate at perceiving emotions from dynamic body movements than were participants who failed to recognize a syncopated rhythm. This effect remained significant in the final model step. The effect of absolute rhythm deviation failed to achieve significance in relation to nonverbal accuracy on the ECT.

Discussion

The regression models revealed some support for the association between individual differences in perceptual sensitivity to spatiotemporal forms and increased nonverbal accuracy. As predicted, we found that rhythm sensitivity was related to performance on the dynamic measure of emotion perception, providing support for Hypothesis 1. Angularity sensitivity failed to predict accuracy on both static measures of emotion perception; thus, Hypothesis 2 was not supported. Results from Study 1 suggest that dynamic displays of emotion are related to individual differences in sensitivity to temporal percepts while the relations between spatial-percept sensitivities and static displays remain undetermined.

Interestingly, we failed to find a gender effect on emotion perception accuracy. A gender difference in nonverbal sensitivity, with women outperforming men, is well established in the literature (Hall and Matsumoto 2004; Rosip and Hall 2004), so similar gender differences were expected to emerge across all the nonverbal measures. However, it may be that the particular nonverbal emotion perception tasks selected for this study elicit smaller gender differences; indeed, recent meta-analytic estimates suggest that gender differences account for relatively small amounts of variance in nonverbal accuracy (Hall and Gunnery 2013).

Study 2

Study 1 provided partial support for a global spatiotemporal sensitivity-emotion perception association. These findings suggest that different nonverbal media accent different spatiotemporal percepts, that is, some percepts are more relevant in some media than others. To the extent that specific percepts are embedded within specific emotions, some emotions may be more salient or more readily perceived in some media than others. In Study 2, we aimed to refine the scope of sensitivity further by identifying additional percepts that may be more closely tied to and embedded within specific emotions within specific media.

One static spatiotemporal percept that is conceptually related to the percept of angularity is geometric configuration. Sensitivity to configuration should relate to the ability to discriminate between configural manipulations of geometric properties within a figure. To illustrate, picture an inverted "V" shape that consists of two lines and can be likened to the furrowed brows used to depict anger. Configural sensitivity would be demonstrated through the ability to discriminate between changes in the distance and intersection of such lines. Similarly, picture a curved line, which can depict the smile used to convey happiness or the frown used to convey sadness. Configural sensitivity for curved lines would be demonstrated through the ability to discriminate between changes in the height and width of the curve. Thus, we extended the relatively global measures of spatiotemporal sensitivity used in Study 1 to more specific percepts that are associated with specific facial features (i.e., eyebrows, smiles, and frowns) and embedded within specific emotions (i.e., anger, happiness, sadness).

In addition to rhythm, angularity, and configuration, research suggests that emotion perception may relate to another spatiotemporal feature: force. Force or forcefulness is often described in the literature as being strong or light, and greater force has been associated with better communication of anger in body movements (DeMeijer 1989). Force is a dynamic spatiotemporal percept much like tempo, pitch, and rhythm, and it can be manipulated to discriminate between emotions. The literature on emotion perception and nonverbal communication, however, has often overlooked the value of force as a spatiotemporal percept, particularly with regard to an individual's sensitivity to force.

Study 2 aimed to replicate and extend findings from Study 1 to further map out how individual differences in sensitivity to static and dynamic spatiotemporal percepts predict the perception of emotions from static and dynamic displays of body movements and facial expressions. Participants completed two static sensitivity tasks (angularity and configuration), two dynamic sensitivity tasks (rhythm and force), and a battery consisting of four nonverbal tests. Given the findings of Study 1, this nonverbal test battery was adjusted to include the DANVA2-AF, the DANVA2-POS, the ECT, and the Communication of Affect Receiving Ability Test (CARAT; Buck 1976). An attempt was made to select a balanced battery of nonverbal tests that included two static and two dynamic measures to allow for differential predictions between sensitivity tasks and modes of nonverbal presentation. Moreover, within each mode of nonverbal presentation, there is one facial expression test and one bodily expression test. Further, three of the four nonverbal tests allow for the examination of emotion-specific perception accuracies. Therefore, inclusion of all four tests together provides a more comprehensive study design.

It was predicted that individual differences in the identification of rhythm, angularity, configuration, and force would influence the perception of emotion in the following ways:

H1: Sensitivity to rhythm will significantly predict the perception of emotion on two dynamic nonverbal decoding tests. Participants who are better able to recognize and maintain a rhythm will demonstrate greater accuracy in perceiving emotions on the ECT and the CARAT than participants who are less able to recognize and maintain a rhythm.
H2: Sensitivity to angularity will significantly predict the perception of emotion on two static nonverbal decoding tests. Participants who are better able to detect subtle differences between identical figures presented at two angles will demonstrate greater accuracy in perceiving emotions on the DANVA2-AF and the DANVA2-POS than participants who are unable to discriminate differences in angularity.
H3: Sensitivity to configural manipulations within emotion-cue percepts will significantly predict the perception of emotion on two static nonverbal decoding tests. Participants who are able to detect subtle differences in orientation between two figures depicting straight lines, upward curves, and downward curves will demonstrate greater accuracy in perceiving emotions on the DANVA2-AF and DANVA2-POS than participants who are unable to discriminate differences in configuration.
H4: Sensitivity to force will significantly predict the perception of emotion on two dynamic nonverbal decoding tests. Participants with greater force sensitivity will demonstrate greater accuracy in perceiving emotions on the ECT and CARAT than participants with less sensitivity to force.

Methods

Participants

A total of 101 volunteers (46 males, 55 females) participated in this study in exchange for course credit in an introductory psychology course. Participants ranged in age from 17 to 31 years (M = 19.43, SD = 2.12).

Nonverbal Test Battery

Participants were administered a nonverbal test battery consisting of four nonverbal emotion decoding tests. All tests were scored using an accuracy model as in Study 1. Nonverbal tests were presented either on a computer monitor or pre-recorded video, and answers were recorded with paper and pencil.

The Diagnostic Analysis of Nonverbal Accuracy for Adult Faces (DANVA2-AF; Nowicki and Duke 1994) consists of 24 photographs representing the four emotions of happy, sad, angry, and fearful. The DANVA2-AF as described in Study 1 was used.

The Diagnostic Analysis of Nonverbal Accuracy for Postures (DANVA2-POS; Pitterman and Nowicki 2004) consists of 40 photographs representing the emotions of happy, sad, angry, fearful, and neutral. The DANVA2-POS as described in Study 1 was used.

The Emotion Contrast Task (ECT) consists of 12 video segments that depict happiness, sadness, anger, or fear. The ECT as described in Study 1 was used.

The Communication of Affect Receiving Ability Test (CARAT; Buck 1976) is a 32-item test consisting of silent video sequences in which a sender spontaneously responds to emotionally loaded slides varying in content (sexual, scenic, unpleasant, or unusual). Each video sequence shows the upper trunk and head of the sender. Receivers (the participants) are asked to judge the kind of slide being viewed by the sender. Thus, receivers do not see the slide that is viewed by the sender but rather must rely on the nonverbal cues expressed to infer the type of slide that the sender has viewed in each sequence. Participants were presented with each video sequence and asked to select from an answer sheet which of the four types of scenes the expresser viewed (sexual, scenic, unpleasant, or unusual). Accuracy was calculated as a sum across the 32 items. The CARAT differs from other forms of nonverbal sensitivity tests in that it utilizes spontaneous rather than posed expressions. Additionally, although the CARAT focuses on the face of the target, the presentation consists of video clips and as such represents dynamic stimuli. The CARAT has a normative Cronbach's alpha of .56 (Buck 1976). Performance on the CARAT has been significantly correlated with interpersonal perception of trust, such that people who did well on the CARAT also did well in making interpersonal judgments of trustworthiness (Boone and Buck 2004).

Sensitivity Tasks

Four tasks were administered to assess spatiotemporal sensitivity.

Angularity and Rhythm Sensitivities

Both the rhythm and angularity tasks described in Study 1 were administered. No changes were made to the rhythm sensitivity task. Given the use of only the first 24 items of the angularity test in Study 1, the angularity sensitivity task was shortened to include only those initial 24 items. Participants were administered the rhythm sensitivity task and angularity sensitivity task on the computer, and responses were recorded on the computer.

Configural Sensitivity

A third sensitivity task was created in an effort to measure individual differences in sensitivity to a spatiotemporal dimension that is more directly embedded in channel-specific displays of emotion. We extracted those features that are thought to be integral to the communication of emotion in displays of the face (e.g., eyebrows and mouths; Ekman and Friesen 1975) with special attention to the percepts used in each feature (e.g., lines and curves). People are able to accurately recognize emotion in schematic drawings of the face, which utilize such features, and this ability emerges at a relatively young age (Borke 1971; Denham and Couchoud 1990; MacDonald, Kirkpatrick, and Sullivan 1996). We were interested in extending this research to examine whether sensitivity to the percepts used to convey emotions in these features also relates to the perception of emotion. Thus, we created a measure focusing on the very basic elements of a schematic face (lines and curves) to generate stimuli for this task.

The configural sensitivity task was developed in a manner similar to development of the angularity task. Stimuli for the configural sensitivity task were first developed using a word processing program to draw static figures varying in shape. A total of three different shapes were used: line, smile (upturned curve), and frown (downturned curve). Shapes roughly corresponded to eyebrows and the curves of a smile and frown as would be presented in a schematic facial drawing (see Fig 2). Two figures were simultaneously presented in a split screen format for each trial resulting in a total of 36 paired-figure trials. Paired-figures varied across trials by shape with equal numbers of trials for each shape. In half the trials, both figures were completely identical in spatial configuration. For the other half of trials, one figure differed systematically from the other (see Fig 3 and Fig 4); differences were made by utilizing a randomized series of spatial manipulations where one of the two figures presented in each trial was horizontally manipulated (for lines, the distance between the lines was increased by .125 inch; for smiles and frowns the width was increased by .160 inch) and/or vertically manipulated (lines were elongated by .125 inch; smiles and frowns were elongated by .330 inch). Equal numbers of vertical, horizontal, and vertical/horizontal manipulations were made across the 36 paired-trials. Further, as in the angularity task, the side (left or right) of manipulation was counterbalanced across the 36 trials to account for any biases relating to hemispheric presentation.

Fig 2. Examples of the shapes used in the configural sensitivity task. Item a. represents a line, which roughly corresponds to the shape of eyebrows. Item b. represents an upward curved line, which roughly corresponds to the shape of a smile. Item c. represents a downward curved line, which roughly corresponds to the shape of a frown.

Fig 3. Sample trial from the configural sensitivity task. Two stimulus items that abstractly represent the eyebrows. Participants were asked to discriminate between item a. and item b. as to whether the items are the same or different. The items differ in their spatial manipulations, either through distance or width. The items pictured display a manipulation of distance: the figure on the bottom is closer together than the figure on the top.

Fig 4. Sample trial from the configural sensitivity task. Two stimulus items that abstractly represent the mouth. Participants were asked to discriminate between item a. and item b. as to whether the items are the same or different. The items differ in their spatial manipulations, either through distance or width. The items pictured display a manipulation of width: the figure on the bottom is more attenuated than the figure on the top.

Participants were presented with each paired trial on the computer and were asked to select with the mouse whether the paired figures were the same or different. Greater configural sensitivity is reflected by the ability to make fine discriminations between different configurations of the same figure.

Force Sensitivity

A fourth and final sensitivity task was developed to assess an individual's attunement to force. The force task was developed using GifBuilder (by Yves Piguet), a scripted utility that creates animated GIF files. Stimuli were modeled to resemble a basketball that bounces when thrown. Stimuli were created by combining a sequence of eight images in which the position of the basketball varied from image to image, so as to resemble the animated changes in force that occur as a basketball bounces (see Fig 5). Sequences were manipulated so that the position of the basketball reflected variations in force. For example, greater force would be reflected in a sequence of images where the position of the basketball changes rapidly from image to image. In contrast, less force would be reflected in a sequence of images where the position of the basketball changes more slowly from image to image. Two sequences were paired and presented simultaneously in a split-screen format, for a total of 12 paired-sequence trials. Participants viewed the left sequence, then the right sequence, within each paired trial. As in both the angularity and configural sensitivity tasks, the force manipulation was counterbalanced across the 12 trials to account for any biases relating to hemispheric presentation. That is, for half the trials the sequence on the left displayed greater force, and for half the trials the sequence on the right displayed greater force.

Fig 5. Sample item from the force sensitivity task. Each of the nine figures pictured was combined in numerical order (i.e., 1 to 9) to form a dynamic, animated sequence of movement. In this example, the "basketball" figure drops from the top and bounces against the bottom.

Participants were presented with each paired trial and were instructed to observe each presented sequence one at a time. This method of presentation was preferred over completely simultaneous viewing to ensure that issues relating to dual-task processing and attention would be minimized. In this way, participants were able to attend to each animated sequence separately. After viewing each sequence within a paired trial, participants were asked to select on an answer sheet which of the two sequences (left or right) depicted greater force. Force accuracy was scored by averaging correct identifications across all trials. The forcefulness task was presented visually on the computer, and answers were recorded with paper and pencil.
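To make the force manipulation concrete, the sketch below (not the original GifBuilder script; all values are illustrative) shows how frame-to-frame ball positions can encode apparent force by scaling the acceleration:

    def ball_positions(force_scale, n_frames=8, dt=0.1, g=9.8):
        """Vertical drop distances per frame; a larger force_scale yields
        larger frame-to-frame position changes (greater apparent force)."""
        positions, y, v = [], 0.0, 0.0
        for _ in range(n_frames):
            v += g * force_scale * dt  # acceleration scaled to vary apparent force
            y += v * dt
            positions.append(y)
        return positions

    strong = ball_positions(force_scale=2.0)  # rapid frame-to-frame change
    light = ball_positions(force_scale=0.5)   # slower frame-to-frame change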

The latter two sensitivity tests were specifically designed for their use of channel-embedded and dynamic cues, respectively, to round out the panel of sensitivity tasks. Embedded here refers to the integral utility of cues for a particular display of emotion in a specific nonverbal channel, namely the use of curves and lines in schematic facial expressions to convey happiness, sadness, anger, and fear, and the use of force in dynamic body movements to convey anger. Further, the addition of these two sensitivity tasks allowed for parallel comparisons among all test measures; there were two static sensitivity tasks and two static nonverbal decoding tasks, two dynamic sensitivity tasks and two dynamic nonverbal decoding tasks. Therefore, inclusion of all four sensitivity tasks together provides a more comprehensive and parallel study design.

Procedure

The same procedure was used as in Study 1. One group of participants (N = 51) completed the sensitivity tasks first while the other group of participants (N = 50) completed the nonverbal test battery first. Participants then reported their age, gender, and college major on a brief questionnaire.

Results

Data Manipulation

Prior to analysis, raw data were examined for normality and outliers. Mahalanobis distance values above 22.5 were examined based on the chi-square distribution with 6 df at the .001 critical value. Three participants were identified as multivariate outliers by Mahalanobis distance and were dropped from subsequent analyses. All other assumptions for linear regression were met.

Sensitivity to rhythm was calculated as in Study 1 using the absolute average deviation and syncopation recognition scores. Sensitivity to angularity was calculated as in Study 1 using d′. Sensitivity to configural manipulation was similarly calculated using d′ for each geometric shape, yielding three values: line d′, smile d′, and frown d′. Sensitivity to force was calculated as a mean accuracy score.

Data Analysis

As in Study 1, data were analyzed in several stages. We first considered gender effects using bivariate correlations and regression analyses. In this sample, gender was marginally correlated with performance on the DANVA2-AF, r(97) = .18, p = .072, indicating that females tended to be more accurate at recognizing emotions in static facial expressions as compared to males. No other correlations with gender approached significance. Thus, to enhance parsimony and maintain consistency between studies, gender was omitted from subsequent analyses.

We then examined intercorrelations between rhythm, angularity, configural, and force sensitivity variables (see Table 3). The two rhythm sensitivity variables (absolute rhythm deviation and syncopation recognition) were significantly correlated as were line d′ and smile d′; these correlations ranged from moderate to large. No other intercorrelations between sensitivity tasks were significant. Both syncopation recognition and smile d′ emerged as nonsignificant predictors in initial regression models and were subsequently omitted from final analyses (see below); thus, multicollinearity among predictors was not a concern (Tabachnick and Fidell 2007).

Table 3. Means, Standard Deviations, and Intercorrelations for DANVA2-AF, DANVA2-POS, ECT, CARAT, Gender, Rhythm Sensitivity, Angularity Sensitivity, Configural Sensitivity, and Force Sensitivity (Study 2, N = 98).

Variable               M      SD    1   2       3     4     5    6       7       8     9     10    11    12
1. DANVA2-AF           .72    .12   --  .39***  .17   .06   .18  -.09    .08     .12   .01  -.04   .07  -.03
2. DANVA2-POS          .70    .11       --      .22*  .30** .12  -.13    .16     .22*  .19   .09   .13  -.14
3. ECT                 .90    .12               --    .12   .13  -.23*   .05    -.00   .19  -.09  -.20   .07
4. CARAT               17.12  3.30                    --    .07  -.21*   .21*    .16   .16   .14  -.01  -.02
5. Gender              .53    .50                           --    .03   -.09    -.17   .03   .11   .12   .04
6. Rhythm Deviation    .17    .09                                 --    -.51***  .01  -.04   .10   .11  -.19
7. Sync Recognition^a  .86    .35                                        --      .20   .06   .08   .01   .12
8. Angle d′            1.47   .63                                                --    .18   .11  -.05  -.11
9. Line d′             1.28   .77                                                      --    .34** .14  -.03
10. Smile d′^a         1.11   .73                                                            --    .13  -.11
11. Frown d′           .32    .78                                                                  --   -.03
12. Force^a            .45    .15                                                                        --

† p < .10. * p < .05. ** p < .01. *** p < .001.
^a Dropped from subsequent regression models.

Intracorrelations were then calculated between the DANVA2-AF, the DANVA2-POS, the ECT, and the CARAT to determine the existence of a latent variable (see Table 3). As expected, the two subtests of the DANVA2 were moderately and significantly correlated, r(97) = .39, p < .001, with greater accuracy in the perception of emotion in static faces relating to greater accuracy in the perception of emotion in static body postures. Further, perception of static body postures was positively related to the perception of emotions in dynamic body movements on the ECT, r(95) = .22, p = .036, and dynamic, multimodal expressions on the CARAT, r(97) = .30, p = .003. These correlations are among the stronger ones reported in the literature and provide some evidence of a latent emotion perception construct; however, given that they are small to moderate in size, each nonverbal measure was analyzed independently.

Our hypotheses were again tested using a series of hierarchical regressions with each of the nonverbal measures entered as a dependent variable. Given the novelty of the sensitivity tasks, we began by testing an initial series of regression analyses with absolute rhythm deviation and syncopation recognition in the first model step, angularity d′ in the second step, line d′, smile d′, and frown d′ in the third step, and force sensitivity in the fourth step. In all model analyses, syncopation recognition, smile d′, and force sensitivity were nonsignificant, possibly due to low variability in these variables (see means and standard deviations reported in Table 3). Thus, to increase parsimony, models were trimmed to include the following sequence of predictors: absolute rhythm deviation in the first model step, angularity d′ in the second step, and line d′ and frown d′ in the third and final step. Each block was assessed for statistical significance, and within each block individual variables were evaluated; marginally significant findings are reported where appropriate given the exploratory nature of the data. Relative predictive power was assessed by significant changes in R2 values. Results are reported for each dependent variable by model step in Table 4.

Table 4. Hierarchical Regressions of DANVA2-AF, DANVA2-POS, ECT, and CARAT onto Angularity Sensitivity, Rhythm Sensitivity, and Configural Sensitivity (Study 2).

                      DANVA2-AF          DANVA2-POS         ECT                CARAT
Predictor variables   R2    β (SE)       R2    β (SE)       R2    β (SE)       R2    β (SE)
Step 1                .01                .02                .05*               .01*
  Rhythm Deviation          -.09 (.14)         -.13 (.13)         -.23 (.13)*        -.21 (3.74)*
Step 2                .02                .06*               .05                .08*
  Rhythm Deviation          -.09 (.14)         -.13 (.13)         -.23 (.14)*        -.21 (3.71)*
  Angle d′                   .12 (.02)          .22 (.02)*        -.00 (.02)          .17 (.52)
Step 3                .03                .10*               .13*               .10
  Rhythm Deviation          -.10 (.15)         -.14 (.13)         -.20 (.13)*        -.20 (3.75)*
  Angle d′                   .13 (.02)          .20 (.02)*        -.05 (.02)          .14 (.53)
  Line d′                   -.03 (.02)          .12 (.02)          .21 (.02)*         .12 (.44)
  Frown d′                   .10 (.02)          .13 (.01)         -.20 (.02)          .01 (.43)

† p < .10. * p < .05.

DANVA2-AF

The overall model investigating accuracy of the DANVA2-AF was not significant, F(4,93) = 0.77, p = 0.547, R2Δ = 0.01. None of the predicted variables (angularity d′, line d′, and frown d′) were significant.

DANVA2-POS

The overall model investigating accuracy of the DANVA2-POS was significant, F(4,93) = 2.65, p = 0.038, R2Δ = 0.04. The addition of angularity d′ in the second model step resulted in a significant change, F(2,95)= 3.27, p = 0.031, R2Δ = 0.05. Angularity d′ was significantly positively related to performance on the DANVA2-POS, t(95) = 2.19, p = 0.031; participants who were more able to detect subtle differences in angularity tended to be more accurate at perceiving emotions in static body postures than participants less sensitive to subtle differences in angularity. This effect remained significant in the final model step. No other predicted effects (line d′ and frown d′) were significant.

ECT

The overall model investigating accuracy of the ECT was significant, F(4,91) = 3.27, p = 0.015, R2Δ = .07. As predicted, absolute rhythm deviation was significantly negatively related to nonverbal accuracy on the ECT in the first model step, t(94) = -2.32, p = 0.022, indicating that participants who deviated less from the rhythms tended to be more accurate at perceiving emotions from dynamic body movements as compared to participants who deviated more. Absolute rhythm deviation remained a significant predictor in subsequent steps. Further, the addition of line d′ and frown d′ in the third model step resulted in a significant change, F(4,91) = 3.27, p = 0.028, R2Δ = .07. Line d′ was significantly positively related to performance on the ECT, t(91) = 2.11, p = .038; participants who were better able to detect subtle differences in the configuration of lines were more accurate at perceiving emotions in dynamic body movements than were participants less able to detect such differences. Frown d′ was marginally related to performance on the ECT, t(91) = -1.96, p = 0.053, though in the opposite direction; participants who were more sensitive to subtle differences in the configuration of frowns were less accurate at perceiving emotions in dynamic body movements as compared to participants with less sensitivity to the configuration of frowns. No other effects approached significance.

CARAT

The overall model investigating accuracy of the CARAT was marginally significant, F(4,93) = 2.14, p = 0.082, R2Δ = 0.02. As predicted, absolute rhythm deviation was significantly negatively related to performance on the CARAT in the first model step, t(96) = -2.06, p = 0.042; participants who deviated less from the rhythms were more accurate at perceiving emotions from a dynamic, multimodal display as compared to participants who deviated more. This effect remained significant in subsequent steps. The addition of angularity d′ in the second model step resulted in a marginally significant change, F(2,95) = 3.55, p = 0.098, R2Δ = 0.03. Angularity d′ was marginally significantly related to performance on the CARAT in the second model step, t(95) = 1.67, p = 0.098; participants who were better able to discriminate between subtle differences in angularity tended to be more accurate at perceiving emotions in dynamic, multimodal displays as compared to participants less sensitive to subtle differences in angularity. No other effects approached significance.

Exploratory analyses

In addition to testing our hypotheses, we conducted an exploratory analysis of the bivariate correlations between the four spatiotemporal sensitivity tasks and emotion-specific perception accuracies on the DANVA2-AF, DANVA2-POS, and ECT. The CARAT was not included in these analyses, as the CARAT does not allow for separation of accuracy by emotion (see Table 5). Associations that were expected to be significant as predicted by theory are noted in bold.

Table 5. Intercorrelations for Rhythm, Angularity, Configural, and Force Sensitivities and Happy, Sad, Anger, and Fear Accuracies on the DANVA2-AF, DANVA2-POS, and ECT.

                       DANVA2-AF                    DANVA2-POS                    ECT
Sensitivity variables  Happy  Sad   Angry  Fearful  Happy   Sad   Angry  Fearful  Happy  Sad   Angry  Fearful
Rhythm Deviation       .03   -.01  -.10   -.12      -.03   -.07  -.13   -.03      -.07  -.23* -.29** -.08
Angle d′               .04    .06   .14    .08       .08    .20*  .14    .15       .01   .02  -.04   -.01
Line d′                .11   -.06   .10   -.09       .04    .08   .26*   .00       .07   .03   .28**  .17
Smile d′               .05   -.02   .03   -.13       .09    .07   .08   -.06      -.16   .02   .10   -.10
Frown d′               .05    .09   .05    .01       .03    .18   .18   -.09      -.23* -.06   .06   -.18
Force                 -.02   -.04  -.08    .08      -.30** -.08   .02    .01       .06   .03   .14   -.01

Note. Bolded coefficients represent expected associations. † p < .10. * p < .05. ** p < .01.

Absolute rhythm deviation was negatively correlated with the perception of sadness and anger in dynamic body movements, suggesting that greater rhythm sensitivity, as measured by a more refined ability to maintain rhythm, was related to greater accuracy in perceiving sad and angry body movements. However, rhythm sensitivity was not correlated with the perception of happy or fearful body movements. Angularity sensitivity correlated only with the perception of sadness in static body postures, suggesting that greater sensitivity to angles was related to greater accuracy in perceiving sad but not happy, angry, or fearful body postures. Line d′ was significantly correlated only with the perception of anger in both static and dynamic body expressions. These correlations suggest that sensitivity to lines may be particularly salient in perceiving expressions of anger displayed by the body. Smile d′ was unrelated to the perception of happiness and sadness in both static facial and body expressions. Frown d′ was marginally positively related to the perception of sadness in static body postures. However, there were no significant correlations between frown d′ and sadness in facial expressions, or with the perception of happiness in either static facial or body expressions. Force sensitivity was unrelated to the perception of anger in dynamic body movements but was, surprisingly, negatively related to the perception of happiness in static body postures, suggesting that sensitivity to force may relate to lower accuracy in perceiving happy body postures.

Discussion

Study 2 revealed additional support for the association between individual differences in sensitivity to spatiotemporal percepts and increased nonverbal emotion perception accuracy. Consistent with Hypothesis 1, participants who demonstrated greater rhythm sensitivity, by deviating less from the rhythms, were more accurate at perceiving emotions in dynamic facial and body movements than were participants with less rhythm sensitivity. Consistent with Hypothesis 2, participants who demonstrated greater sensitivity to angles were more accurate at perceiving emotions in static body postures than were participants with less angularity sensitivity. Hypotheses 3 and 4 were not supported, as configuration sensitivity failed to predict the perception of emotion in static facial expressions and body postures, and force sensitivity failed to predict the perception of emotion in dynamic facial and body movements.

Interestingly, participants who demonstrated greater sensitivity to lines were more accurate at perceiving emotions in dynamic body movements. Moreover, participants with greater angularity sensitivity were more accurate at perceiving emotions in dynamic, multimodal expressions. Exploratory analyses revealed further interesting findings: sensitivity to rhythm and the configuration of lines appeared to relate to the perception of anger in body postures and movements, whereas sensitivity to angles and the configuration of frowns related to the perception of sad body postures. Neither force sensitivity nor sensitivity to the configuration of smiles correlated as expected.

This pattern of sensitivity to spatiotemporal percepts predicting accuracy on a specific subset of emotion perception measures suggests two important points: (1) some percepts are more relevant to some nonverbal media, and (2) relevancy may vary by the degree to which percepts are embedded within an emotional expression. To illustrate the first point, rhythm sensitivity was related to performance on the two dynamic measures of nonverbal emotion decoding, but not to static displays of emotion. Given that rhythm is a dynamic percept, it makes sense that the strongest relation would emerge for those tasks requiring perception of emotion in dynamic stimuli. A more complex pattern was observed for angularity and configural sensitivities. Angularity sensitivity was related to the perception of emotion in static body postures but not static facial expressions. There may be something specific or unique to the expression of emotion in body postures that utilizes angularity cues more so than static facial expressions. Indeed, cues related to angularity in body postures, such as hunched shoulders, a raised head, and a collapsed upper body, are predictive of two emotion-related dimensions: activation and approach (Wallbott 1998). We may then expect angularity to be especially relevant to static bodily expressions of emotion, as opposed to static facial expressions. Similarly, the association between configural sensitivity and emotion perception was specific to measures that focused on body expressions of emotion. To illustrate the second point, our exploratory analyses fell in line with theoretical expectations regarding spatiotemporal sensitivity-emotion perception associations, such that individual differences in sensitivity to specific spatiotemporal percepts (such as frowns) were associated with the perception of emotions in which these percepts are most likely embedded (e.g., sadness).

Interestingly, participants with greater sensitivity to the configuration of frowns tended to be less accurate at perceiving bodily expressions of emotion, with particularly low accuracy for the perception of happiness in dynamic body movements. The percepts embedded within the dynamic body-focused nonverbal test (ECT) may be more representative of the configural cues embedded within expressions of happiness than of sadness, though this possibility remains an empirical question. Given the specific cue set embedded in this test, sensitivity to frowns may work against the viewer, resulting in lower accuracy in perceiving emotions for which downward curves are not a relevant percept. The presence of certain shapes in the nonverbal emotion decoding tests may thus influence the extent to which sensitivity to certain spatiotemporal percepts predicts emotion perception, as well as the direction of these relationships. We would expect the strongest associations to emerge for those percepts that are embedded within both the emotional cue set and the nonverbal media.

Contrary to our predictions, individual differences in force sensitivity were not predictive of emotion perception on any of the nonverbal emotion decoding tests, nor was force sensitivity related to the perception of anger in dynamic body expressions in the exploratory analyses. As noted briefly above, these null effects may be due to low variability in the force sensitivity task; such restricted variance makes subtle effects more difficult to detect. Further, participants failed to detect differences in forcefulness at above-chance levels (M = .45, SD = .15), suggesting that they may have been guessing as to which figure displayed greater force. It was also difficult to construct the force sensitivity measure so that it captured optimal sensitivity. It is possible that the task actually assesses acceleration rather than true force, as acceleration was the primary variable manipulated to illustrate differences in force. It is also possible that the force sensitivity task taps additional qualities in the perceiver, such as affective biases, that are not embedded directly within the process of perceiving emotional cues but that influence this process nonetheless. For example, individuals with a negative affect bias may be more sensitive to stimulus cues that align with their bias but actively avoid cues that contradict it. In such a case, individual differences in force sensitivity might relate positively to negative affect biases, but might predict greater perceptual inaccuracy for stimuli that contradict this bias (i.e., happy expressions). Future studies would benefit from more extensive consideration of force sensitivity, including the creation of more sensitive stimuli that reflect the true percept of force, such as real videos of bouncing basketballs (as opposed to animated simulations).
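To make the chance-level comparison concrete: with two response alternatives, chance accuracy is .50, and a mean of .45 can be checked against that benchmark with a one-sample test. The sketch below illustrates such a check under stated assumptions; the simulated scores (matched to the reported M and SD) and the use of the Study 2 sample size are ours for illustration, and this is not the analysis reported in the paper.

```python
# Illustrative check of whether mean accuracy on a two-alternative force
# task differs from the .50 chance level. The simulated scores are an
# assumption; only M = .45 and SD = .15 come from the text.
import numpy as np
from scipy.stats import ttest_1samp

rng = np.random.default_rng(0)
n = 101  # Study 2 sample size
scores = np.clip(rng.normal(loc=0.45, scale=0.15, size=n), 0, 1)

t_stat, p_value = ttest_1samp(scores, popmean=0.50)
print(f"t({n - 1}) = {t_stat:.2f}, p = {p_value:.3f}")
```

A mean significantly below .50 would be consistent with guessing plus noise, reinforcing the interpretation that the task did not capture true force discrimination.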

In addition, none of the spatiotemporal percepts emerged as significant predictors of emotion perception on the DANVA2-AF. Although the DANVA2-AF subtest has been used widely in the nonverbal accuracy literature (e.g., Pitterman and Nowicki 2004; Trentacosta and Fine 2010), the task utilizes primarily prototypical expressions of emotion that may be more representative of a global emotion perception skill and may not be suited for research aimed at identifying fine discriminations in perceptual sensitivity.

Overall Discussion

Our results present preliminary evidence that individual differences in sensitivity to spatiotemporal percepts are related to emotion perception accuracy. In the first study, we demonstrated that sensitivity to rhythm was significantly related to the ability to accurately perceive emotions in dynamic body movements. In the second study, we extended these findings by demonstrating that (1) sensitivity to rhythm predicted accuracy in perceiving emotions in dynamic facial and body movements, (2) sensitivity to angularity predicted accuracy in perceiving emotions in static body postures, and (3) sensitivity to configural manipulations of lines predicted accuracy in perceiving emotions in dynamic body movements.

Our results add to the body of literature on individual differences in emotion perception accuracy and support findings illustrating that emotional meaning is communicated through spatiotemporal percepts (Aronoff 2006; Aronoff et al. 1992; Atkinson et al. 2007; Cunningham and Sterling 1988; Hunter et al. 2008; Scherer and Oshinsky 1977). We extend these findings by showing that sensitivity to such percepts relates to the perception of emotions. Our results also extend the literature on spatiotemporal percepts by identifying additional abstract forms, such as configuration, that heighten the awareness of cues and subsequently enhance nonverbal perceptual sensitivity. The ability to accurately perceive emotional information is thus related, in part, to attention to the abstract percepts embedded within nonverbal media.

Limitations

While novel, our results should be considered preliminary in that the angularity, configural, and force sensitivity measures were created specifically for these studies and have not been validated by other research. Furthermore, while the angularity sensitivity task was administered in both studies, the configural and force sensitivity tasks were administered only in the second study and thus have not been replicated. Future research should attempt to validate these measures by replicating the present studies with diverse samples.

Our effects are also relatively small in size. Collectively, the spatiotemporal percepts measured explain at most 13% of the variance in emotion perception accuracy, with more variance explained for the nonverbal tests assessing the perception of emotion in dynamic and static body expressions and less for the perception of static and dynamic facial expressions of emotion. In general, larger sample sizes allow for the detection of subtle effects but do not increase effect sizes; thus, our effects, though small, are likely not due to issues of power. Rather, the emergence of particular percepts in relation to emotion perception accuracy on particular nonverbal tests likely reflects the true nature of this skill. That is, the perception of emotion appears to be specific to the media in which cues are embedded, and the association between percept sensitivity and emotion perception therefore depends upon a match between percept and medium. We found small associations between sensitivity to spatial and temporal percepts and the perception of emotion on the tasks most likely to embed those percepts: rhythm sensitivity was predictive of perception accuracy for dynamic body movements, but not static postures; angularity sensitivity was predictive of perception accuracy for static body postures, but not dynamic movements; and configural sensitivity to specific percepts (lines) was predictive of perception accuracy for dynamic body movements, but not dynamic facial expressions. Further, exploratory correlational analyses revealed similar patterns: rhythm sensitivity emerged as relevant for the perception of dynamic, but not static, expressions of anger; angularity sensitivity was related to the perception of sadness in static body postures, but not movements; and sensitivity to the configuration of lines was related to the perception of anger in static and dynamic expressions of the body but not the face.
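For readers who want to connect the "percent variance explained" language above to the underlying analysis, the sketch below shows the conventional way an R-squared value is obtained from a multiple regression of decoding accuracy on percept-sensitivity predictors. The data frame, column names, and coefficients are hypothetical; only the general procedure is implied by the text.

```python
# Hypothetical illustration of how a "percent variance explained" figure
# is obtained: regress emotion-decoding accuracy on percept-sensitivity
# measures and inspect R-squared. Column names and data are invented.
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(1)
n = 101
df = pd.DataFrame({
    "rhythm_dev": rng.normal(size=n),    # absolute rhythm deviation
    "angularity": rng.normal(size=n),    # angularity sensitivity
    "line_dprime": rng.normal(size=n),   # configural (line) d'
})
# Fabricated outcome with a weak dependence on the predictors.
df["decoding_acc"] = (0.1 * df["angularity"] - 0.1 * df["rhythm_dev"]
                      + rng.normal(scale=1.0, size=n))

X = sm.add_constant(df[["rhythm_dev", "angularity", "line_dprime"]])
model = sm.OLS(df["decoding_acc"], X).fit()
print(f"R-squared: {model.rsquared:.3f}")  # proportion of variance explained
```

An R-squared of .13 corresponds to the upper bound reported above; values of this size are small but, as argued in the text, not necessarily artifacts of low power.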

Strengths

Although the effects are small in size, we believe they contribute to our general understanding of individual differences in emotion perception accuracy and extend the literature to include individual difference factors that are more proximal to the perceptual process, thus putting the "perception" back into nonverbal perception. Individual difference factors have a rich history in the field of nonverbal emotion perception (e.g., gender, age, personality, and culture; Elfenbein and Ambady 2002; Hall et al. 2009; Hall and Matsumoto 2004; Halberstadt et al. 2013; Harris 2008; Kang and Lau 2012; Matsumoto et al. 2000; Rosip and Hall 2004; Terracciano et al. 2003), yet such factors remain distal from the process of perceiving the percepts embedded within emotional displays. Our results suggest that people with greater sensitivity to these percepts are more accurate at perceiving emotions. This finding may be particularly important for programs aimed at remediating emotion perception skill: factors such as age, gender, personality, and culture cannot easily be changed, but sensitivity to spatiotemporal percepts may be trainable. Future research should examine the development of sensitivity to spatiotemporal percepts and whether such sensitivity is an experience-dependent ability that would facilitate training. For example, deliberate exposure to these percepts may increase sensitivity to them, thereby contributing to greater accuracy in perceiving emotion. Research has found that the perception of emotion in speech is enhanced by musical training (Strait, Kraus, Skoe, and Ashley 2009; Thompson, Schellenberg, and Husain 2004). In the realm of body cues, Pitterman and Nowicki (2004) demonstrated greater accuracy in the perception of emotion from body postures among dancers and athletes, and Brownlow, Dixon, Egbert, and Radcliffe (1997) found that dance experience led to finer discrimination of motion cues in body movements. Arguably, greater exposure to media-specific cues is likely to increase sensitivity to the spatiotemporal percepts embedded in nonverbal communication, particularly those embedded in the media of interest. The field would benefit from a closer examination of this possibility, as the identification and training of the relevant percepts offers a proximal solution for individuals with lower emotion perception skills.

In sum, the results presented here shed light on an ability that is integral to everyday functioning: the ability to accurately perceive emotions in others. Although we know a great deal about individual characteristics that predict emotion perception, we know little about the process of perceiving emotion and even less about the factors most proximal to this process. Recent research has highlighted the possibility that spatiotemporal cues are a mechanism by which emotions are communicated across different nonverbal media (e.g., Aronoff 2006). Our results suggest that greater sensitivity to these percepts predicts greater nonverbal emotion perception accuracy. By identifying such sensitivities, we shift the way in which nonverbal perception is conceptualized: emphasis moves to the perceptual sensitivity of the perceiver rather than to distal characteristics of the individual. This change may facilitate innovative methods of examining and training nonverbal perceptual accuracy.

References

  1. Aronoff J. How we recognize angry and happy emotion in people, places, and things. Cross-Cultural Research. 2006;40:83–105. doi: 10.1177/1069397105282597.
  2. Aronoff J, Woike BA, Hyman LM. Which are the stimuli in facial displays of anger and happiness? Configurational bases of emotion recognition. Journal of Personality and Social Psychology. 1992;62:1050–1066. doi: 10.1037/0022-3514.62.6.1050.
  3. Ashwin C, Chapman E, Colle L, Baron-Cohen S. Impaired recognition of negative basic emotions in autism: A test of the amygdala theory. Social Neuroscience. 2006;1:349–363. doi: 10.1080/17470910601040772.
  4. Atkinson AP, Tunstall ML, Dittrich WH. Evidence for distinct contributions of form and motion information to the recognition of emotions from body gestures. Cognition. 2007;104:59–72. doi: 10.1016/j.cognition.2006.05.005.
  5. Boone R. The attribution of emotion to body movements: The development of cue attunement. Doctoral dissertation; 1996. Retrieved from Dissertations Abstracts Online. (AAI9628658)
  6. Boone RT, Buck R. Emotion receiving ability: A new view of measuring individual differences in the ability to accurately judge others' emotions. In: Geher G, editor. Measuring emotional intelligence: Common ground and controversy. Hauppauge, NY: Nova Science Publishing; 2004. pp. 73–89.
  7. Borke H. Interpersonal perception of young children: Egocentrism or empathy? Developmental Psychology. 1971;5:263–269. doi: 10.1037/h0031267.
  8. Brownlow S, Dixon AR, Egbert CA, Radcliffe RD. Perception of movement and dancer characteristics from point-light displays of dance. The Psychological Record. 1997;47:411–421.
  9. Buck R. A test of nonverbal receiving ability: Preliminary studies. Human Communication Research. 1976;2:162–171. doi: 10.1111/j.1468-2958.1976.tb00708.x.
  10. Buck R, Savin V, Miller R, Caul W. Nonverbal communication of affect in humans. Proceedings of the Annual Convention of the American Psychological Association. 1969;4(Pt. 1):367–368.
  11. Byron K, Terranova S, Nowicki S. Nonverbal emotion recognition and salespersons: Linking ability to perceived and actual success. Journal of Applied Social Psychology. 2007;37:2600–2619. doi: 10.1111/j.1559-1816.2007.00272.x.
  12. Cunningham JG, Sterling R. Developmental analysis in the understanding of affective meaning in music. Motivation and Emotion. 1988;12:399–413. doi: 10.1007/BF00992362.
  13. Darwin C. The expression of emotions in man and animals. New York: Philosophical Library; 1872.
  14. DeMeijer M. The contribution of general features of body movement to the attribution of emotions. Journal of Nonverbal Behavior. 1989;13:247–268. doi: 10.1007/BF00990296.
  15. Denham SA, Couchoud EA. Young preschoolers' understanding of emotions. Child Study Journal. 1990;20:171–192.
  16. Ekman P, Friesen WV. Unmasking the face: A guide to recognizing emotions from facial clues. Englewood Cliffs, NJ: Prentice-Hall; 1975.
  17. Elfenbein HA, Ambady N. Is there an in-group advantage in emotion recognition? Psychological Bulletin. 2002;128:243–249. doi: 10.1037/0033-2909.128.2.243.
  18. Halberstadt AG, Hall JA. Who's getting the message? Children's nonverbal skill and their evaluation by teachers. Developmental Psychology. 1980;16:564. doi: 10.1037/0012-1649.16.6.564.
  19. Halberstadt AG, Parker AE, Castro VL. Nonverbal communication: Developmental perspectives. In: Hall JA, Knapp ML, editors. Handbook of communication science. Vol. 2. Berlin: Mouton de Gruyter; 2013. pp. 93–128.
  20. Hall JA, Andrzejewski SA, Yopchick JE. Psychosocial correlates of interpersonal sensitivity: A meta-analysis. Journal of Nonverbal Behavior. 2009;33:149–180. doi: 10.1007/s10919-009-0070-5.
  21. Hall JA, Gunnery SD. Gender differences in nonverbal communication. In: Hall JA, Knapp ML, editors. Handbook of communication science. Vol. 2. Berlin: Mouton de Gruyter; 2013. pp. 639–670.
  22. Hall JA, Matsumoto D. Gender differences in judgments of multiple emotions from facial expressions. Emotion. 2004;4:201–206. doi: 10.1037/1528-3542.4.2.201.
  23. Harris PL. Children's understanding of emotion. In: Lewis M, Haviland-Jones JM, Barrett LF, editors. Handbook of emotions. New York, NY: The Guilford Press; 2008. pp. 320–331.
  24. Hodgins H, Zuckerman M. The effect of nonverbal sensitivity on social interaction. Journal of Nonverbal Behavior. 1990;14:155–170. doi: 10.1007/BF00996224.
  25. Hunter PG, Schellenberg E, Schimmack U. Mixed affective responses to music with conflicting cues. Cognition and Emotion. 2008;22:327–352. doi: 10.1080/02699930701438145.
  26. Ingersoll B. Broader autism phenotype and nonverbal sensitivity: Evidence for an association in the general population. Journal of Autism and Developmental Disorders. 2010;40:590–598. doi: 10.1007/s10803-009-0907-0.
  27. Kang SM, Lau AS. Revisiting the out-group advantage in emotion recognition in a multicultural society: Further evidence for the in-group advantage. Emotion. 2012;13:203–215. doi: 10.1037/a0030013.
  28. MacDonald PM, Kirkpatrick SW, Sullivan LA. Schematic drawings of facial expressions for emotion recognition and interpretation by preschool-aged children. Genetic, Social, and General Psychology Monographs. 1996;122:373–388.
  29. Macmillan NA, Creelman CD. Detection theory: A user's guide. Cambridge, England: Cambridge University Press; 1991.
  30. Matsumoto D, LeRoux J, Wilson-Cohn C, Raroque J, Kooken K, Ekman P, ... Goh A. A new test to measure emotion recognition ability: Matsumoto and Ekman's Japanese and Caucasian Brief Affect Recognition Test (JACBERT). Journal of Nonverbal Behavior. 2000;24:179–209. doi: 10.1023/A:1006668120583.
  31. Miller R, Murphy J, Mirsky I. Relevance of facial expression and posture as cues in communication of affect between monkeys. Archives of General Psychiatry. 1959;1:480–488. doi: 10.1001/archpsyc.1959.03590050048006.
  32. Nowicki S Jr. Manual for the receptive tests of the Diagnostic Analysis of Nonverbal Accuracy 2. Unpublished manuscript; 2003.
  33. Nowicki S Jr, Duke MP. Helping the child who doesn't fit in. Atlanta: Peachtree Publishers; 1992.
  34. Nowicki S Jr, Duke MP. Individual differences in the nonverbal communication of affect: The Diagnostic Analysis of Nonverbal Accuracy Scale. Journal of Nonverbal Behavior. 1994;18:9–35. doi: 10.1007/BF02169077.
  35. Pavlova M, Sokolov AA, Sokolov A. Perceived dynamics of static images enables emotional attribution. Perception. 2005;34:1107–1116. doi: 10.1068/p5400.
  36. Pitterman H, Nowicki S. A test of the ability to identify emotion in human standing and sitting postures: The Diagnostic Analysis of Nonverbal Accuracy-2 Posture Test (DANVA2-POS). Genetic, Social, and General Psychology Monographs. 2004;130:146–162. doi: 10.3200/MONO.130.2.146-162.
  37. Rosip J, Hall J. Knowledge of nonverbal cues, gender, and nonverbal decoding accuracy. Journal of Nonverbal Behavior. 2004;28:267–286. doi: 10.1007/s10919-004-4159-6.
  38. Sabatelli R, Buck R, Dreyer A. Nonverbal communication accuracy in married couples: Relationship with marital complaints. Journal of Personality and Social Psychology. 1982;43:1088–1097. doi: 10.1037/0022-3514.43.5.1088.
  39. Scherer KR, Oshinsky JS. Cue utilization in emotion attribution from auditory stimuli. Motivation and Emotion. 1977;1:331–346. doi: 10.1007/BF00992539.
  40. Shean G, Bell E, Cameron C. Recognition of nonverbal affect and schizotypy. Journal of Psychology: Interdisciplinary and Applied. 2007;141:281–291. doi: 10.3200/JRLP.141.3.281-292.
  41. Stanislaw H, Todorov N. Calculation of signal detection theory measures. Behavior Research Methods. 1999;31:137–149. doi: 10.3758/BF03207704.
  42. Sternglanz R, DePaulo B. Reading nonverbal cues to emotions: The advantages and liabilities of relationship closeness. Journal of Nonverbal Behavior. 2004;28:245–266. doi: 10.1007/s10919-004-4158-7.
  43. Strait DL, Kraus N, Skoe E, Ashley R. Musical experience and neural efficiency: Effects of training on subcortical processing of vocal expressions of emotion. European Journal of Neuroscience. 2009;29:661–668. doi: 10.1111/j.1460-9568.2009.06617.x.
  44. Tabachnick BG, Fidell LS. Using multivariate statistics. Boston, MA: Allyn & Bacon/Pearson Education; 2007.
  45. Terracciano A, Merritt M, Zonderman AB, Evans MK. Personality traits and sex differences in emotion recognition among African Americans and Caucasians. Annals of the New York Academy of Sciences. 2003;1000:309–312. doi: 10.1196/annals.1280.032.
  46. Theodor LH. A neglected parameter: Some comments on "A table for the calculation of d′ and β". Psychological Bulletin. 1972;78:260–261. doi: 10.1037/h0033119.
  47. Thompson W, Schellenberg E, Husain G. Decoding speech prosody: Do music lessons help? Emotion. 2004;4:46–64. doi: 10.1037/1528-3542.4.1.46.
  48. Toomey R, Schuldberg D, Corrigan P, Green M. Nonverbal social perception and symptomatology in schizophrenia. Schizophrenia Research. 2002;53:83–91. doi: 10.1016/S0920-9964(01)00177-3.
  49. Toomey R, Wallace C, Corrigan P, Schuldberg D, Green M. Social processing correlates of nonverbal social perception in schizophrenia. Psychiatry: Interpersonal and Biological Processes. 1997;60:292–300. doi: 10.1080/00332747.1997.11024807.
  50. Trentacosta CJ, Fine SE. Emotion knowledge, social competence, and behavior problems in childhood and adolescence: A meta-analytic review. Social Development. 2010;19:1–29. doi: 10.1111/j.1467-9507.2009.00543.x.
  51. Vrij A, Semin G. Lie experts' beliefs about nonverbal indicators of deception. Journal of Nonverbal Behavior. 1996;20:65–80. doi: 10.1007/BF02248715.
  52. Wallbott HG. Bodily expression of emotion. European Journal of Social Psychology. 1998;28:879–896. doi: 10.1002/(SICI)1099-0992(1998110)28:6<879::AID-EJSP901>3.0.CO;2-W.
