BMC Psychology. 2025 Jan 7;13:12. doi: 10.1186/s40359-024-02301-8

Task difficulty modulates the effect of eye contact on word memory in females

Xinghe Feng 1,#, Qiqi Hu 2,#, Chaoxiong Ye 1,3,✉, Zhonghua Hu 1,3,✉
PMCID: PMC11705906  PMID: 39773223

Abstract

Background

The influence of eye contact on memory has been a topic of extensive study, yet its effects remain ambiguous. This inconsistency may be attributed to the varying levels of task difficulty encountered when conducting this type of research.

Methods

To explore this possibility, our study used a word memory task that also integrated eye gaze as a means of examining how task difficulty (easy or difficult) modulates the effect of eye contact on word memory. A total of 41 participants engaged in the memory task under varying eye contact conditions.

Results

Our findings revealed a significant interaction between task difficulty and eye contact: For easy tasks, memory accuracy was lower with eye contact, whereas for difficult tasks, accuracy was improved with eye contact. Intriguingly, this effect was predominantly observed in female participants. In easy tasks, eye contact appeared to hinder memory performance in females, whereas it enhanced performance in difficult tasks.

Conclusions

These results suggest that the impact of eye contact on memory is not uniformly positive or negative, but is instead contingent on task complexity and influenced by gender differences. This study contributes new insights into the fluctuating effects of eye contact on memory, thereby enriching our understanding of the relationship between nonverbal social cues and cognitive processes.

Supplementary Information

The online version contains supplementary material available at 10.1186/s40359-024-02301-8.

Keywords: Eye contact effect, Memory, Task difficulty, Gender differences

Introduction

Eye gaze, a crucial component of facial expressions, conveys nonverbal cues, including emotions, intentions, and attentional focus [1–3]. In social contexts, direct eye contact is often perceived as a manifestation of superiority, control, or dominance [4]. Concurrently, it can signify acceptance and empathy, thereby fostering a congenial atmosphere in interpersonal interactions [5]. The sensitivity of an individual to the signals transmitted through another’s gaze is noteworthy. Evidence from visual search tasks suggests a quicker identification of direct gaze compared to averted gaze [6, 7], highlighting the immediate attention-capturing property of direct gaze [8, 9]. In addition to its informative and sensitive aspects, direct eye contact has the potential to influence an individual’s cognitive processes [10].

The role of memory, as a key element in individual learning and development, intersects significantly with eye contact; consequently, understanding the interaction between eye contact and memory is vital for elucidating these cognitive processes. Previous studies have revealed a diverse impact of eye contact on memory task performance; for example, eye contact has been demonstrated to enhance memory, especially in tasks involving faces, names, or linguistic elements. Studies show that eye contact strengthens attention and the recall of faces [11–14] and aids in associating names with faces [15]. Additionally, it positively influences the recall of verbal information, thereby aiding in the retention of words and numbers [16–18] and enhancing the memory of detailed content, such as a product or story [2, 19–21]. This facilitation could be associated with the significance of eye contact, arousal level, and attentional focus. When individuals make eye contact with each other, it creates a stronger feeling of closeness [19]. Conversely, those who avoid eye contact may come across as defensive [22], evasive [23], or inattentive [24]. The development of this type of negative impression results in individuals being less likely to heed the words of others [19]. Moreover, eye contact serves as an arousing stimulus and a nonverbal cue that directs attention to essential information when paired with verbal communication, thereby enhancing focus and reducing distractions [17, 25, 26].

Other studies have suggested that eye contact may impede memory performance in working memory tasks, as individuals show improved performance in the no-eye-contact condition compared to the eye-contact condition [27, 28]. The potential reason for this phenomenon could be that the presence of social cues, such as eye contact, can potentially disrupt an individual’s concentration on tasks that require high cognitive resources and prompt processing of situational information, and therefore impede memory performance [27, 28]. Our review of the experimental tasks used in previous studies led us to speculate that the varied findings on the impact of eye contact on memory reported previously may be attributable to inconsistencies in the difficulty levels of the experimental tasks. The level of task difficulty could potentially influence how cognitive load and attentional resources are allocated during cognitive processing. Moreover, based on the Contextual Theory of Eye Gaze Processing [29], variations in task difficulty may influence an individual’s interpretation of eye contact, their motivational tendencies (approach or avoidance), and their emotional responses (positive or negative).

Our hypothesis is that the previous discrepant findings might arise from differences in task difficulty. However, none of the previous studies that investigated the influence of task difficulty and eye contact on memory revealed any modulating effect of task difficulty. In a story recall experiment, Otteson [20] observed no joint (interaction) effect of eye contact, task difficulty, and participant gender on memory performance; notably, when task difficulty was not considered, eye contact enhanced story retention for boys but not for girls. An experiment conducted by Sherwood [21] focusing on the recollection of subject knowledge revealed that eye contact improved memory performance for both easy and difficult information. Importantly, these previous studies did not systematically manipulate and measure real-time eye contact between individuals; therefore, uncertainty remains regarding the extent of eye contact established by the listener with the speaker [30]. Additionally, these studies neglected to consider the effects of participants’ language skills, teacher pressures, and the interaction between the participants’ gender and the experimenter’s gender. Hence, the processing of social cues, such as eye contact, during high-level cognitive tasks is complex, raising our first question: whether any effect of eye contact on memory is influenced by task difficulty.

Some studies have suggested that males show greater enhancement of memory performance through eye contact [2], whereas others have found contrasting effects based on gender, with negative impacts on male memory but positive implications for females [30]. How gender shapes the moderation of the eye contact effect by task difficulty therefore warrants further investigation. Given the observed gender-based differences in sensitivity to eye contact [30–32], a second aim of the present study is to examine whether gender influences the moderating role of task difficulty on the eye contact effect.

Previous studies have further indicated that an individual’s memory performance can be influenced by the gender of others, in addition to their own gender. For example, when the face maintaining eye contact is of a gender different from one’s own, the effect of eye contact on facial memory is amplified [33]. The memory performance for story details decreased when males made eye contact with females but increased when interacting with males [2]. The influence of sociocultural norms on the interpretation of eye contact between males and females may explain this phenomenon [34, 35]. The previous studies have shown a potential interaction between self-gender and model gender in cognitive tasks; therefore, a third aim of the present study is to also include model gender as a variable in the analyses to examine whether model gender can co-modulate, along with the gender of participants, the effect of eye contact on memory in tasks of varying difficulty levels.

Overall, previous research has not shown a significant and consistent effect of task difficulty on memory performance in relation to eye contact, and this may be attributable to variations in eye contact manipulation and gender interaction. To address this knowledge gap, the current study used a word memory task while maintaining precise control over eye contact and incorporating a second language as the experimental material. We opted for the second language because it offers greater flexibility in manipulating the variable ‘task difficulty’ within the experimental material and allows for sufficient variability in accuracy to evaluate the effects of eye contact and gender, whereas the native language may exhibit a ceiling effect in this regard. By manipulating memory difficulty levels, the study sought to examine the potential modulating role of task difficulty on the relationship between eye contact and memory performance while also considering the potential modulating effects of model gender and participant gender. In light of the absence of previous research incorporating both participant gender and model gender to investigate how task difficulty moderates the effect of eye contact on memory, this exploratory study aims to provide fresh insights into the intricate relationship among task difficulty, social cues, and memory.

Methods

Participants

A total of 41 participants were recruited (20 males, average age 20.56 years, standard deviation (SD) 1.84 years, age range 18–25 years) from Sichuan Normal University. All participants had no history of neurological or psychiatric disorders and possessed normal hearing and vision (or corrected to normal). The participants were all right-handed and had passed the National College English Test-4 (CET-4) in China. Prior to the experiment, the participants were unaware of its purpose. We calculated the a priori power of the experiment using G*Power 3.1 [36] to ensure it was adequately powered. We assumed a medium effect size (f = 0.25) and set α = 0.05. According to G*Power, we needed a sample size of at least 24 to obtain a significant interaction with an actual power of 0.96. In addition, when considering task difficulty, eye contact, the gender of participants, and the gender of models in prior studies, sample sizes varied from around 18 to 84 individuals [2, 20, 21, 30]. Considering the sample sizes in previous studies and potential data exclusions, we set our sample size at 41. Each participant signed an informed consent form before participating in the experiment. This study complied with the Helsinki Declaration and was approved by the Ethics Committee of Sichuan Normal University (Approval No: SCNU-220406).
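For readers without access to G*Power, the following is a minimal sketch of this kind of a priori calculation from the noncentral F distribution. The noncentrality formula follows the convention G*Power documents for within-subject (interaction) effects; the number of repeated measurements (m = 4) and the repeated-measures correlation (ρ = 0.5, G*Power's default) are our assumptions, not values reported above, so the output only approximates the figures in the text.

```python
# Minimal sketch of a G*Power-style a priori power check for a
# within-between interaction. m (measurements) and rho (correlation
# among repeated measures) are assumed values, not reported ones.
from scipy.stats import f as f_dist, ncf

def rm_interaction_power(n_total, f_effect=0.25, alpha=0.05,
                         k_groups=2, m=4, rho=0.5):
    lam = f_effect**2 * n_total * m / (1 - rho)   # noncentrality parameter
    df1 = (k_groups - 1) * (m - 1)                # interaction df
    df2 = (n_total - k_groups) * (m - 1)          # error df
    f_crit = f_dist.ppf(1 - alpha, df1, df2)      # critical F at alpha
    return 1 - ncf.cdf(f_crit, df1, df2, lam)     # power

for n in (16, 24, 32, 41):
    print(f"N = {n}: power ≈ {rm_interaction_power(n):.3f}")
```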

Materials

Linguistic material

The materials used in the study included 192 English sentences divided into two types: 96 easy statements drawn from keywords and example sentences of high school entrance examinations, and 96 more difficult statements drawn from vocabulary and example sentences of the CET-4. Each type of statement (easy or difficult) included four sentence lengths (6, 7, 8, or 9 words), with 24 sentences of each length. The 96 easy sentences were randomly divided into two sets of 48 sentences each, and the same was done for the 96 difficult sentences. One set of easy sentences was subsequently combined with one set of difficult sentences to form two lists, each comprising 48 easy sentences and 48 difficult sentences. These lists featured distinct vocabulary and example sentences but maintained an equivalent level of difficulty (see Supplementary Information Table S1).
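As an illustration only, the sketch below shows one way the two counterbalanced lists could be assembled as described; the placeholder sentence labels and variable names are ours, not the authors' materials.

```python
# Minimal sketch of the list construction described above.
# Placeholder labels stand in for the actual 192 English sentences.
import random

easy = [f"easy_{i:02d}" for i in range(96)]       # 96 easy sentences
difficult = [f"diff_{i:02d}" for i in range(96)]  # 96 difficult sentences

random.seed(1)              # fixed seed for a reproducible random split
random.shuffle(easy)
random.shuffle(difficult)

# Halve each pool, then pair one easy half with one difficult half.
list_a = easy[:48] + difficult[:48]
list_b = easy[48:] + difficult[48:]
assert len(list_a) == len(list_b) == 96
assert not set(list_a) & set(list_b)   # the two lists share no sentences
```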

Video material

To create the video content, eight models (4 males) were chosen, and 10 participants (5 males; mean age ± SD = 25.3 ± 1.06 years) were enlisted to evaluate the facial images of those models based on a 9-point scale that measured valence, arousal, and attractiveness. Based on the ratings, two models (one male and one female) with moderate levels of attractiveness and valence, as well as low arousal, were chosen from the pool of candidates. No significant differences were found in the paired samples t-tests for attractiveness, valence, and arousal between the two models, ps > 0.05 (see Table 1).

Table 1.

The results of the model face assessment (M ± SD)

                 Male model    Female model   t      p
Attractiveness   4.15 ± 1.38   5.00 ± 1.47    1.780  0.109
Valence          4.65 ± 0.85   5.60 ± 0.97    1.956  0.082
Arousal          3.05 ± 1.19   3.75 ± 1.74    1.210  0.257
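A minimal sketch of the rating check reported in Table 1: one paired-samples t-test per dimension. The rating arrays below are hypothetical 9-point scores standing in for the 10 raters' actual data.

```python
# Minimal sketch of the paired t-tests behind Table 1.
# The rating arrays are hypothetical placeholders, not the real data.
import numpy as np
from scipy.stats import ttest_rel

ratings = {
    "attractiveness": (np.array([4, 5, 3, 6, 4, 2, 5, 4, 3, 5]),   # male model
                       np.array([5, 6, 4, 7, 5, 3, 6, 5, 4, 5])),  # female model
    # "valence" and "arousal" would be tested the same way
}
for dim, (male, female) in ratings.items():
    t, p = ttest_rel(male, female)
    print(f"{dim}: t({len(male) - 1}) = {t:.3f}, p = {p:.3f}")
```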

The videos, which were filmed using Canon cameras, featured the two chosen models in white tops against a backdrop of blue curtains. All videos were recorded in a brightly illuminated room. While filming, the camera lens was aligned with the model’s eyes. In each video, the model (either male or female) could be seen speaking an English sentence. The models were instructed to sustain their gaze either toward the camera lens or downward at the screen while reading the English sentence, without shifting their gaze. Both models were from the English department of Sichuan Normal University. Each model read aloud all 96 sentences from one list in the two gaze forms. Consequently, each sentence produced two versions: one with direct eye contact with the camera and one with a downward gaze. The participants were instructed to maintain their attention on the models throughout the experiment, thereby creating conditions with and without eye contact. The video content was edited and cropped in Adobe Premiere Pro 2022 (https://www.adobe.com/products/premiere.html). The specifications included videos being 800 × 480 pixels, 5 s in duration, with a frame rate of 25 frames per second, and audio set to two-channel stereo (44,100 Hz and 128 kbps).

Word material

The words used in the word judgment task were selected from the Compulsory English Vocabulary for the high school entrance examination (easy) and the CET-4 (difficult), ensuring that their difficulty levels matched those of the video sentences.

Design and procedure

This study was conducted in a laboratory setting. The experiment utilized E-prime 2.0 software (http://pstnet.com/products/e-prime/), and the stimuli (videos and words) were presented on an 18-inch LCD monitor (1024 × 768 pixels, refresh rate 85 Hz), with participants positioned 60 cm away. The experiment employed a mixed design with four factors: two levels of task difficulty (easy and difficult), two eye contact conditions (with and without eye contact), two genders of models (male and female), and two genders of participants (male and female). Task difficulty, eye contact conditions, and the gender of models were the within-subject variables, while the gender of participants was a between-subject variable. The dependent variables were reaction time and accuracy of the word judgments in the task.

Before the experiment, the participants’ CET-4 scores were recorded. To ensure that participants understood the experimental procedure, a practice session was conducted. This session was similar to the formal experiment but consisted of only three trials and used very simple sentences from middle school English vocabulary and example sentences. The formal experiment proceeded only after each participant had achieved 100% accuracy in the practice session. Before the formal experiment, participants were also instructed to maintain visual focus on the model’s eyes during the entire video presentation phase.

In the formal experiment, each trial consisted of two stages: watching a video and judging words. Each trial began with a 5 s video during which the participants were instructed to focus on the model’s eyes and pay attention to the model’s speech. Following the video, the participants were required to quickly and accurately judge whether each of the five words displayed next had appeared in the preceding speech (see Fig. 1). Participants pressed “1” if the word had appeared in the video, or “2” if it had not. During the word recall phase, the five words displayed after each video were always different. In half of the trials, the set consisted of two old words (previously shown in the video) and three new words; in the other half, it consisted of three old words and two new words. The experiment comprised four blocks, each exclusively dedicated to one of the four experimental conditions: easy–male model, easy–female model, difficult–male model, and difficult–female model. The eye contact conditions were randomly presented within each block. Each block contained 48 trials, 24 with eye contact and 24 without. The block order was balanced among the participants. In total, 192 trials (24 trials per condition) were completed in the experiment. Following each block, we asked whether the participants had complied with the directive to gaze into the model’s eyes throughout the experiment.

Fig. 1.

Fig. 1

(a) Schematic diagram of the event sequence in a single trial of the experiment. (b) Participant’s perspective under conditions with and without eye contact. Both models consented to the use of their portraits
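A minimal sketch of how a trial list matching the design just described could be generated. The field names and the within-block alternation of probe composition are illustrative assumptions, not the authors' E-prime implementation.

```python
# Minimal sketch of trial-list generation for the 2 (difficulty) x
# 2 (model gender) block design with randomized eye-contact trials.
import itertools
import random

random.seed(2)  # reproducible ordering for the sketch

# Four blocks: difficulty x model gender; shuffling here stands in for
# the counterbalancing of block order across participants.
blocks = list(itertools.product(["easy", "difficult"], ["male", "female"]))
random.shuffle(blocks)

trials = []
for difficulty, model in blocks:
    block = ([{"eye_contact": True} for _ in range(24)]
             + [{"eye_contact": False} for _ in range(24)])
    random.shuffle(block)  # eye contact randomized within the block
    for i, trial in enumerate(block):
        trial.update(difficulty=difficulty, model=model,
                     n_old=2 if i % 2 == 0 else 3,   # half 2-old/3-new,
                     n_new=3 if i % 2 == 0 else 2)   # half 3-old/2-new
    trials.extend(block)

assert len(trials) == 192   # 4 blocks x 48 trials
```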

Data analyses

Our analysis, conducted using SPSS 23.0 (https://www.ibm.com/spss) and JASP (version 0.11.1, JASP Team, 2019, jasp-stats.org), began by determining the average accuracy rates (ACC) for each participant across the varied conditions. The calculation of the mean reaction times (RTs) excluded incorrect responses (8.38%) and outliers (2.17%; defined as RTs more than three standard deviations from the mean).
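A minimal sketch of this RT preprocessing in pandas. The data frame layout and column names are assumptions, and the trimming is applied per participant here because the paper does not state the grouping used.

```python
# Minimal sketch of RT cleaning: drop incorrect responses, then drop
# RTs more than 3 SDs from the mean (computed per participant here;
# the grouping used in the paper is not specified).
import pandas as pd

def clean_rts(df: pd.DataFrame) -> pd.DataFrame:
    """df is assumed to have columns: participant, condition, rt, accuracy (0/1)."""
    correct = df[df["accuracy"] == 1]
    stats = correct.groupby("participant")["rt"].agg(["mean", "std"])
    merged = correct.join(stats, on="participant")
    keep = (merged["rt"] - merged["mean"]).abs() <= 3 * merged["std"]
    return merged.loc[keep, df.columns]
```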

We conducted a four-way repeated measures analysis of variance (ANOVA) with 2 (Task difficulty: easy, difficult) × 2 (Eye contact condition: with eye contact, without eye contact) × 2 (Gender of models: male, female) × 2 (Gender of participants: male, female) for RTs and ACC. The Greenhouse–Geisser correction to the degrees of freedom was utilized in all analyses when necessary. All follow-up pairwise comparisons were Bonferroni corrected. We also reported the Bayes factor (BF10), which was estimated by the JASP software to evaluate the magnitude of support for the alternative hypothesis over the null hypothesis (nondifference hypothesis) [37–39]. Specifically, Bayes factors range from 0 to ∞, with a BF10 > 1 indicating that the alternative hypothesis predicts the data to a stronger degree than the null hypothesis (specifically, for the alternative hypothesis, a BF10 of 1 to 3 means a weak degree, a BF10 of 3 to 10 means a moderate degree, and a BF10 of 10 or more means a strong degree), while a BF10 < 1 indicates evidence for the null hypothesis (specifically, for the null hypothesis, a BF10 of 0.33 to 1 means a weak degree, a BF10 of 0.1 to 0.33 means a moderate degree, and a BF10 of 0.1 or less means a strong degree) [40]. In our study, one null hypothesis indicated no disparity in accuracy between the with eye contact and without eye contact conditions, while the other null hypothesis suggested no notable distinction in accuracy between male and female participants. We employed the default settings of JASP, utilizing a Cauchy prior distribution (r = 0.707), to compute the BF10 values.
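The original analyses were run in SPSS and JASP; as an open-source illustration, the sketch below reproduces the same kind of follow-up test in Python with the pingouin package, whose paired t-test reports a default-prior Bayes factor (Cauchy, r = 0.707, matching the JASP default). The accuracy arrays are hypothetical, and the small helper maps BF10 onto the evidence categories quoted above.

```python
# Minimal sketch of a paired t-test with a default-prior Bayes factor,
# mirroring the JASP analysis (Cauchy prior, r = 0.707 is also
# pingouin's default). The accuracy data are hypothetical placeholders.
import numpy as np
import pingouin as pg

def bf_label(bf10: float) -> str:
    """Map BF10 onto the evidence categories used in the text."""
    if bf10 >= 10:      return "strong evidence for H1"
    if bf10 >= 3:       return "moderate evidence for H1"
    if bf10 > 1:        return "weak evidence for H1"
    if bf10 > 1 / 3:    return "weak evidence for H0"
    if bf10 > 1 / 10:   return "moderate evidence for H0"
    return "strong evidence for H0"

rng = np.random.default_rng(0)
acc_with = rng.normal(0.93, 0.04, 21)     # hypothetical per-participant ACC
acc_without = rng.normal(0.94, 0.03, 21)

res = pg.ttest(acc_with, acc_without, paired=True)
bf10 = float(res["BF10"].iloc[0])
print(res[["T", "dof", "p-val", "cohen-d"]])
print(f"BF10 = {bf10:.3f} ({bf_label(bf10)})")
```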

Results

Reaction times

The RTs (see Table 2) showed a significant main effect of task difficulty (see Fig. 2), F(1, 39) = 107.456, p < 0.001, ηp² = 0.734. This effect manifested in faster reaction times in easy tasks (848.54 ± 122.40 ms, Mean ± SD) compared to difficult tasks (941.09 ± 132.03 ms), thereby affirming the validity of our task difficulty manipulation. The other potential main effects and interaction effects did not reach statistical significance (see Supplementary information Table S2), ps > 0.05.

Table 2.

Reaction times (M ± SD, ms) for task difficulty, eye contact condition, gender of participants, and gender of models

Task difficulty  Eye contact condition  Female model                            Male model
                                        Female participants  Male participants  Female participants  Male participants
Easy             Without eye contact    863.41 ± 151.49      823.93 ± 95.14     883.11 ± 136.21      834.01 ± 130.28
                 With eye contact       867.86 ± 150.35      821.14 ± 119.28    867.28 ± 126.12      822.69 ± 121.88
Difficult        Without eye contact    971.79 ± 146.67      905.92 ± 141.70    970.26 ± 133.75      913.34 ± 119.93
                 With eye contact       974.80 ± 154.49      904.80 ± 128.61    980.36 ± 129.80      902.27 ± 130.06

Fig. 2.

Fig. 2

Mean reaction time (a) and accuracy (b) of easy and difficult tasks, *** p < 0.001

Accuracy

ACC (see Table 3) showed a significant main effect of task difficulty (see Fig. 2), F(1, 39) = 125.819, p < 0.001, ηp² = 0.763. The ACC in word recognition was higher in the easy tasks (0.94 ± 0.03, Mean ± SD) than in the difficult tasks (0.90 ± 0.03), thereby reinforcing the validity of the task difficulty manipulation.

Table 3.

The accuracy (M ± SD) for task difficulty, eye contact condition, gender of participants, and gender of models

Task difficulty  Eye contact condition  Female model                            Male model
                                        Female participants  Male participants  Female participants  Male participants
Easy             Without eye contact    0.94 ± 0.04          0.94 ± 0.03        0.94 ± 0.04          0.94 ± 0.04
                 With eye contact       0.93 ± 0.04          0.93 ± 0.04        0.93 ± 0.04          0.94 ± 0.04
Difficult        Without eye contact    0.89 ± 0.03          0.91 ± 0.04        0.88 ± 0.04          0.90 ± 0.05
                 With eye contact       0.91 ± 0.03          0.90 ± 0.05        0.90 ± 0.03          0.90 ± 0.04

Eye contact condition and task difficulty showed a significant interaction effect, F(1, 39) = 16.947, p < 0.001, ηp² = 0.303. Follow-up pairwise comparisons showed that in the easy tasks, the ACC was lower with eye contact (0.93 ± 0.04) than without eye contact (0.94 ± 0.03), t(40) = 2.530, p = 0.015, Cohen’s d = 0.395, BF10 = 2.791, whereas in the difficult tasks, the ACC was higher with eye contact (0.90 ± 0.03) than without eye contact (0.89 ± 0.04), t(40) = -2.267, p = 0.029, Cohen’s d = -0.354, BF10 = 1.656. The BF10 values (both between 1 and 3) indicated only weak evidence for differences in ACC with versus without eye contact in both the easy and the difficult tasks. The gender of the models and task difficulty also showed a significant interaction effect, F(1, 39) = 7.333, p = 0.010, ηp² = 0.158. In the difficult tasks, the ACC was higher for female models (0.90 ± 0.04) than for male models (0.89 ± 0.04), t(40) = 2.443, p = 0.019, Cohen’s d = 0.382, BF10 = 2.339, whereas in the easy tasks, there was no significant difference in the ACC between male models (0.94 ± 0.04) and female models (0.93 ± 0.03), t(40) = -1.024, p = 0.312, Cohen’s d = -0.160, BF10 = 0.275. The BF10 value, falling between 1 and 3, suggested weak support for the alternative hypothesis in the difficult tasks. Conversely, in the easy tasks, moderate evidence supported the null hypothesis over the alternative hypothesis, with the BF10 value falling between 0.1 and 0.33.

Notably, the interaction effect of the eye contact condition, task difficulty, and gender of participants was significant (see Fig. 3), F(1, 39) = 9.959, p = 0.003, ηp² = 0.203. Follow-up Bonferroni-corrected pairwise comparisons revealed that, for females, the ACC was lower with eye contact (0.93 ± 0.04) than without eye contact (0.94 ± 0.03) in the easy tasks, t(20) = 2.626, p = 0.016, Cohen’s d = 0.573, BF10 = 3.387, while the opposite effect was shown in the difficult tasks, with higher ACC with eye contact (0.90 ± 0.03) than without eye contact (0.88 ± 0.03), t(20) = -3.724, p = 0.001, Cohen’s d = -0.813, BF10 = 28.037. For males, no significant difference was detected between the ACC with and without eye contact for either task type (easy: t(19) = 1.007, p = 0.327, Cohen’s d = 0.225, BF10 = 0.363; difficult: t(19) = 0.245, p = 0.809, Cohen’s d = 0.055, BF10 = 0.239). The BF10 results indicated that, for females, there was moderate support (3 < BF10 < 10) for the alternative hypothesis in the easy tasks and strong support (BF10 > 10) in the difficult tasks, whereas, for males, there was weak evidence (0.33 < BF10 < 1) for the null hypothesis in the easy tasks and moderate evidence (0.1 < BF10 < 0.33) in the difficult tasks. Other main effects and interaction effects were not significant (see Supplementary information Table S3), ps > 0.05.

Fig. 3.

Fig. 3

The mean accuracy for task difficulty × eye contact condition × gender of participants, * p < 0.05, ** p < 0.01, ns: p > 0.05

Discussion

This study explored the impact of eye contact on memory performance in word memory tasks with different difficulty levels. The results showed that task difficulty can modulate the effect of eye contact on memory and that this modulation effect differs between male and female participants.

As an initial finding, this study showed that eye contact impeded word memory in an easy task, whereas it facilitated word memory in a difficult task. This is consistent with the results of some previous studies on the influence of eye contact on memory. For example, in an end-of-sentence word recall task, eye contact hindered memory performance [27], whereas in detailed memory tasks [2, 19, 20], digit–letter compound memory tasks [16, 17], and the highest-level memory span tasks [18], eye contact enhanced memory performance. Analysis of the difficulty levels of these previously investigated tasks revealed that the end-of-sentence word recall represents an immediate and concise memory demand, whereas memory for story details, mixed letter-number material, and similar aspects entails a delayed and complex memory requirement. However, the results of the present study are inconsistent with a previous study by Sherwood [21], who showed that eye contact enhances the recall of information, regardless of its difficulty.

One explanation for the differences between the present study and Sherwood’s may lie in how eye contact was controlled and how long encoding lasted. Sherwood’s study [21] involved a six-minute verbal presentation covering the disciplines of biology, chemistry, and psychology during the memory encoding process. Such prolonged encoding may engage a range of abilities, such as acquiring knowledge, understanding, and storing information in long-term memory. Consequently, the findings of the two studies cannot be directly compared.

We propose the following potential explanation for the modulation of eye contact effects on memory by task difficulty. From the perspective of the Contextual Theory of Eye Gaze Processing, the influence of eye contact on memory may be context dependent [29]. The diverse levels of task difficulty can influence an individual’s perception of eye contact, their motivational inclination (approach or avoidance), and their emotional reaction (positive or negative). Specifically, in the present study, it may be the case that completing difficult memory tasks demands a higher cognitive load and more attention resources, and individuals might allocate fewer attention resources to maintaining eye contact and instead prioritize focusing on the task at hand. In this context, eye contact can diminish daydreaming, decrease attention lapses, be interpreted as a form of reward [41], and consequently boost task performance. However, in easy tasks where cognitive activities are easily accomplished, eye contact can serve as a distraction that impairs task performance. This phenomenon may be explained by the participant’s diverted attention toward maintaining a consistent focus on the facial images of other individuals and a resulting increased rumination on social relationships and heightened feelings of surveillance, which consequently elicit negative emotions associated with distrust [42, 43]. This, in turn, negatively affects task performance [44, 45].

One noteworthy finding of our study is that the influence of task difficulty and eye contact on memory was modulated by the gender of the participants. Specifically, for female participants, the presence of eye contact significantly reduced word memory accuracy in easy tasks, while it significantly enhanced word memory accuracy in difficult tasks. By contrast, male participants did not show significant differences in accuracy, regardless of the presence of eye contact across different task difficulties. This demonstrates the intricacy of the process by which task difficulty modulates the effect of eye contact on memory in advanced cognitive processes. This complexity may be reflected in gender differences in cognitive processing strategies related to eye contact.

Previous studies have suggested that females may exhibit a greater tendency to utilize metacognitive, social, and affective strategies in the context of English language learning when compared to males [46, 47]. The utilization of these strategies could lead to inconsistencies in the application and interpretation of eye contact among individuals of different genders. Specifically, females may exhibit a higher tendency to use social strategies under challenging circumstances, such as seeking information from the model’s eyes. This behavior facilitates enhanced attentional focus and cognitive resource allocation, consequently resulting in improved memory performance. Conversely, in simple situations, metacognitive strategies are more prevalent among females, who are inclined to monitor their own performance and are therefore susceptible to distraction by others’ eye contact, impairing memory performance. Males, by contrast, may rely less on these strategies and thus make less use of eye information in their surroundings, which could explain why eye contact did not affect their memory performance at either difficulty level.

Additionally, the findings of the present study may indicate differences in how males and females process eye contact, with females being more sensitive to others’ eye contact, particularly in situations with differing levels of complexity. This finding is consistent with previous research, which has reported that females are more sensitive to eye contact in word-recognition tasks [30], gaze cueing paradigms [48, 49], and eye-contact tasks [50, 51], suggesting that females are more likely to pay attention to the eyes and interpret social signals related to others’ eye contact [32, 52, 53]. This interpretation may be linked to the difficulty setting of the memory task, with females tending to view eye contact as a sign of stress and disruption in easy situations and as a sign of encouragement and support in challenging situations. However, other studies have shown that males were more influenced by others’ eye contact when recalling stories [2, 20], which is inconsistent with the results for males in our study.

Lanthier [30] discovered that eye contact impeded word recall in males, whereas Otteson [20] observed that eye contact enhanced males’ retention of story details. Conversely, we found that eye contact had no influence on males’ memory. This discrepancy could be attributed to the possibility that males are less responsive to eye contact from individuals portrayed in a video setting. Previous studies that integrated interactivity into real-life environments rather than merely passive viewing of videos found that individuals interpret and respond differently to real interactions versus videos [54], indicating the complex interactions between eye contact, task difficulty, and participant gender.

The model’s gender was also an important factor in the design of this study; however, unfortunately, we did not find a four-way interaction between the model’s gender, eye contact, task difficulty, and participant gender, which is inconsistent with previous study findings [2]. Helminen [2] discovered that male participants exhibited enhanced memory performance in response to eye contact from male models, whereas their performance declined when faced with eye contact from female models. This difference between our study findings and Helminen’s may reflect differences in the interactions used, as the interaction between the model and participants in our study was based on videos and might be weaker than real interactions. In Helminen’s study, female models exhibited a higher frequency of gaze shifts than male models. Additionally, participants’ beliefs regarding whether models could see them influenced their eye contact with models in realistic scenes. Despite the models’ visual access to the participants, reduced eye contact and increased gaze shifts were observed when the participants believed they were being watched by the models [2]. In the present study, eye contact remained consistent regardless of the participants’ beliefs, potentially explaining why the gender of the model did not influence memory.

This study has some limitations. First, we used meaningful English expressions and words as a means of manipulating the task difficulty. Although we controlled for the participants’ English language proficiency, other factors, such as knowledge and experience, might still have affected their performance in memory tasks. Furthermore, individuals process their native language and second language in different ways [55]. Therefore, subsequent studies could use unfamiliar words and phrases, meaningless characters, or native language to further reduce the influence of individual knowledge and experience and determine the stability of the modulation of eye contact effects on memory by task difficulty. A second limitation is that this study did not investigate how individual characteristics influence the eye contact effect. Individuals with autism spectrum disorders [56] or social anxiety disorders [57] might exhibit traits of indifference or unwillingness to make eye contact. Future research could consider individual traits as variables and explore whether these traits modulate the impact of task difficulty on the eye contact effect. Moreover, to enhance our comprehension of how eye contact influences memory across different task complexities, it is crucial to examine how levels of arousal, cognitive load, and attentional resources are coordinated within this mechanism. Future research might also benefit from incorporating skin conductance measurements, EEG technology, and other techniques to further elucidate and validate the mechanisms underlying the current research findings. Lastly, this study only involved eye contact during the encoding phase. Future research could explore whether task difficulty modulates how eye contact influences memory across various stages, such as during the recall phase.

Conclusion

This study used a word memory task with eye gaze to explore how task difficulty modulates the impact of eye contact on word memory. Task difficulty was found to modulate the impact of eye contact on word memory, with different results for male and female participants. For females, eye contact hindered memory performance in easier tasks, whereas it aided memory performance in more challenging tasks. By contrast, males did not exhibit this task-related difference in the effects of eye contact on memory. This study provides new insights into the inconsistent findings regarding eye contact and memory, thereby contributing to a more comprehensive understanding of the relationship between nonverbal social cues and cognition.

Electronic supplementary material

Below is the link to the electronic supplementary material.

Supplementary Material 1 (22.2KB, docx)

Acknowledgements

We would like to thank Qiang Liu (Sichuan Normal University) who provided insight and expertise in statistical analysis.

Abbreviations

CET-4

The National College English Test-4

ACC

Accuracy rate

RTs

Reaction times

SD

Standard deviation

ANOVA

Repeated measures analysis of variance

BF10

The Bayes factor

Author contributions

Z.H. and Q.H. conceived and designed the experiments. X.F. and Q.H. performed the data acquisition and analyzed the data. Z.H., C.Y., X.F., and Q.H. interpreted the data and drafted the manuscript. All authors revised and approved the manuscript.

Funding

This study was supported by the Sichuan Province Science and Technology Education Joint Fund Project (grant number 2024NSFSC2092 to Zhonghua Hu), the Research Council of Finland (former Academy of Finland) Academy Research Fellow project (grant number 355369 to Chaoxiong Ye), and the Finnish Cultural Foundation (grant number 00231373 to Chaoxiong Ye).

Data availability

The data that support the findings of this study are available at https://data.mendeley.com/datasets/t4ssmf6b8h/1. The materials used in this study are available from the corresponding author (huzhonghua2000@163.com, Zhonghua Hu) on reasonable request.

Declarations

Ethics approval and consent to participate

This study complied with the Helsinki Declaration and was approved by the Ethics Committee of Sichuan Normal University (Approval No: SCNU-220406).

Consent for publication

Each participant in our study signed an informed consent form before the experiment. Each model signed a portrait rights license agreement giving us permission to use their videos as experimental material and to publish them.

Competing interests

The authors declare no competing interests.

Footnotes

Publisher’s note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Xinghe Feng and Qiqi Hu have contributed equally to this study.

Contributor Information

Chaoxiong Ye, Email: cxye1988@163.com.

Zhonghua Hu, Email: huzhonghua2000@163.com.

References

1. Emery NJ. The eyes have it: the neuroethology, function and evolution of social gaze. Neurosci Biobehav Rev. 2000;24(6):581–604.
2. Helminen TM, Pasanen TP, Hietanen JK. Learning under your gaze: the mediating role of affective arousal between perceived direct gaze and memory performance. Psychol Res. 2016;80:159–71.
3. Ristic J, Kingstone A. Taking control of reflexive social attention. Cognition. 2005;94(3):B55–65.
4. Hall J, Coats E, LeBeau L. Nonverbal behavior and the vertical dimension of social relations: a meta-analysis. Psychol Bull. 2005;131(6):898–924.
5. Wirth JH, Sacco DF, Hugenberg K, Williams KD. Eye gaze as relational evaluation: averted eye gaze leads to feelings of ostracism and relational devaluation. Pers Soc Psychol Bull. 2010;36(7):869–82.
6. Conty L, N’Diaye K, Tijus C, George N. When eye creates the contact! ERP evidence for early dissociation between direct and averted gaze motion processing. Neuropsychologia. 2007;45(13):3024–37.
7. Von Grünau M, Anston C. The detection of gaze direction: a stare-in-the-crowd effect. Perception. 1995;24(11):1297–313.
8. Lyyra P, Astikainen P, Hietanen JK. Look at them and they will notice you: distractor-independent attentional capture by direct gaze in change blindness. Vis Cogn. 2018;26(1):25–36.
9. Senju A, Hasegawa T. Direct gaze captures visuospatial attention. Vis Cogn. 2005;12(1):127–44.
10. Senju A, Johnson MH. The eye contact effect: mechanisms and development. Trends Cogn Sci. 2009;13(3):127–34.
11. Hood BM, Macrae CN, Cole-Davies V, Dias M. Eye remember you: the effects of gaze direction on face recognition in children and adults. Dev Sci. 2003;6(1):67–71.
12. Smith AD, Hood BM, Hector K. Eye remember you two: gaze direction modulates face recognition in a developmental study. Dev Sci. 2006;9(5):465–72.
13. Mason M, Hood B, Macrae CN. Look into my eyes: gaze direction and person memory. Memory. 2004;12(5):637–43.
14. Goodman LR, Phelan HL, Johnson SA. Sex differences for the recognition of direct versus averted gaze faces. Memory. 2012;20(3):199–209.
15. Lopis D, Conty L. Investigating eye contact effect on people’s name retrieval in normal aging and in Alzheimer’s disease. Front Psychol. 2019;10:1218.
16. Fry R, Smith GF. The effects of feedback and eye contact on performance of a digit-coding task. J Soc Psychol. 1975;96(1):145–6.
17. Kelley DH, Gorham J. Effects of immediacy on recall of information. Commun Educ. 1988;37(3):198–207.
18. Falck-Ytter T, Carlström C, Johansson M. Eye contact modulates cognitive processing differently in children with autism. Child Dev. 2015;86(1):37–47.
19. Fullwood C, Doherty-Sneddon G. Effect of gazing at the camera during a video link on recall. Appl Ergon. 2006;37(2):167–75.
20. Otteson JP, Otteson CR. Effect of teacher’s gaze on children’s story recall. Percept Mot Skills. 1980;50(1):35–42.
21. Sherwood JV. Facilitative effects of gaze upon learning. Percept Mot Skills. 1987;64(3 Suppl):1275–8.
22. Kleck RE, Nuessle W. Congruence between the indicative and communicative functions of eye contact in interpersonal relations. Br J Soc Clin Psychol. 1968;7(4):241–6.
23. Hemsley GD, Doob AN. The effect of looking behavior on perceptions of a communicator’s credibility. J Appl Soc Psychol. 1978;8(2):136–42.
24. Kleinke CL, Staneski RA, Berger DE. Evaluation of an interviewer as a function of interviewer gaze, reinforcement of subject gaze, and interviewer attractiveness. J Pers Soc Psychol. 1975;31(1):115.
25. Ekman P. About brows: emotional and conversational signals. Hum Ethol. 1979:163–202.
26. Whittaker S, O’Conaill B. The role of vision in face-to-face and mediated communication. 1997.
27. Nemeth D, Turcsik AB, Farkas G, Janacsek K. Social communication impairs working-memory performance. Appl Neuropsychol Adult. 2013;20(3):211–4.
28. Goldfarb LP, Plante TG, Brentar JT, DiGregorio M. Administering the Digit Span subtest of the WISC-III: should the examiner make eye contact or not? Assessment. 1995;2(4):313–8.
29. Hadders-Algra M. Human face and gaze perception is highly context specific and involves bottom-up and top-down neural processing. Neurosci Biobehav Rev. 2022;132:304–23.
30. Lanthier SN, Jarick M, Zhu MJ, Byun CS, Kingstone A. Socially communicative eye contact and gender affect memory. Front Psychol. 2019;10:1128.
31. Bailenson JN, Blascovich J, Beall AC, Loomis JM. Equilibrium theory revisited: mutual gaze and personal space in virtual environments. Presence: Teleoperators Virtual Environ. 2001;10(6):583–98.
32. Bayliss AP, Di Pellegrino G, Tipper SP. Sex differences in eye gaze and symbolic cueing of attention. Q J Exp Psychol A. 2005;58(4):631–50.
33. Vuilleumier P, George N, Lister V, Armony J, Driver J. Effects of perceived mutual gaze and gender on face processing and recognition memory. Vis Cogn. 2005;12(1):85–101.
34. Hubbard ASE, Burgoon JK. Nonverbal communication. In: An integrated approach to communication theory and research. Routledge; 2019. pp. 333–46.
35. Henley NM. Body politics revisited: what do we know today? In: Gender, power, and communication in human relationships. Routledge; 2012. pp. 27–61.
36. Faul F, Erdfelder E, Lang A-G, Buchner A. G*Power 3: a flexible statistical power analysis program for the social, behavioral, and biomedical sciences. Behav Res Methods. 2007;39(2):175–91.
37. Hu C-P, Kong X, Wagenmakers E-J, Ly A, Peng K. The Bayes factor and its implementation in JASP: a practical primer. Adv Psychol Sci. 2018;26(6):951–65.
38. Wagenmakers E-J, Love J, Marsman M, Jamil T, Ly A, Verhagen J, et al. Bayesian inference for psychology. Part II: example applications with JASP. Psychon Bull Rev. 2018;25:58–76.
39. Wagenmakers E-J, Marsman M, Jamil T, Ly A, Verhagen J, Love J, et al. Bayesian inference for psychology. Part I: theoretical advantages and practical ramifications. Psychon Bull Rev. 2018;25:35–57.
40. van Doorn J, Böhm U, Dablander F, Derks K, Draws T, et al. The JASP guidelines for conducting and reporting a Bayesian analysis. Psychon Bull Rev. 2021;28:813–26.
41. Kampe KK, Frith CD, Dolan RJ, Frith U. Reward value of attractiveness and gaze. Nature. 2001;413(6856):589.
42. Panagopoulos C, van der Linden S. The feeling of being watched: do eye cues elicit negative affect? North Am J Psychol. 2017;19(1).
43. Wiener G. The effect of distrust on some aspects of intelligence test behavior. J Consult Psychol. 1957;21(2):127.
44. Doherty-Sneddon G, Phelps FG. Gaze aversion: a response to cognitive or social difficulty? Mem Cognit. 2005;33:727–33.
45. Doherty-Sneddon G, Anderson A, O’Malley C, Langton S, Garrod S, Bruce V. Face-to-face and video-mediated communication: a comparison of dialogue structure and task performance. J Exp Psychol Appl. 1997;3(2):105.
46. Green JM, Oxford R. A closer look at learning strategies, L2 proficiency, and gender. TESOL Q. 1995;29(2):261–97.
47. Aslan O. The role of gender and language learning strategies in learning English. Middle East Technical University; 2009.
48. McCrackin SD, Itier RJ. Individual differences in the emotional modulation of gaze-cuing. Cogn Emot. 2019;33(4):768–800.
49. Hayward DA, Ristic J. Feature and motion-based gaze cuing is linked with reduced social competence. Sci Rep. 2017;7(1):44221.
50. Porter G, Hood BM, Troscianko T, Macrae CN. Females, but not males, show greater pupillary response to direct- than deviated-gaze faces. Perception. 2006;35(8):1129–36.
51. Leeb RT, Rejskind FG. Here’s looking at you, kid! A longitudinal study of perceived gender differences in mutual gaze behavior in young infants. Sex Roles. 2004;50:1–14.
52. Connellan J, Baron-Cohen S, Wheelwright S, Batki A, Ahluwalia J. Sex differences in human neonatal social perception. Infant Behav Dev. 2000;23(1):113–8.
53. Lutchmaya S, Baron-Cohen S, Raggatt P. Foetal testosterone and eye contact in 12-month-old human infants. Infant Behav Dev. 2002;25(3):327–35.
54. Noah JA, Zhang X, Dravida S, Ono Y, Naples A, McPartland JC, et al. Real-time eye-to-eye contact is associated with cross-brain neural coupling in angular gyrus. Front Hum Neurosci. 2020;14:19.
55. Melby-Lervåg M, Lervåg A. Reading comprehension and its underlying components in second-language learners: a meta-analysis of studies comparing first- and second-language learners. Psychol Bull. 2014;140(2):409.
56. Baron-Cohen S. Mindblindness: an essay on autism and theory of mind. MIT Press; 1997.
57. Weeks JW, Howell AN, Goldin PR. Gaze avoidance in social anxiety disorder. Depress Anxiety. 2013;30(8):749–56.


