Heliyon. 2024 Jun 14;10(12):e32984. doi: 10.1016/j.heliyon.2024.e32984

Congruent or conflicting? The interaction between emoji and textual sentence is not that simple!

Yuan-fu Dai a, Xiao-yan Gao a, Wen-wu Leng c, Chen Huang a, Wen-jing Yu b, Chang-hao Jiang c
PMCID: PMC11238005  PMID: 38994052

Abstract

As Japanese-born graphic symbols now used worldwide, emoji play an important role in computer-mediated communication. Despite their prevalent use, the interaction between emoji and textual sentences remains inadequately explored. Building on the emotional function of emoji, this study uses an indirect priming method to explore, through two progressive behavioral experiments, the emotional impact of emoji on subsequent text in computer-mediated communication. The results show that: (1) emoji positioned at the onset of a sentence induce an emotional priming effect; (2) processing is slowest when emoji and text are emotionally conflicting, while in non-conflicting conditions the type of emoji moderates the processing of combined sentences; (3) the emotional influence of emoji plays an auxiliary role, whereas the valence of the textual sentence plays the decisive role in emotional perception.

Keywords: Emoji, Emotional priming effect, Emotional cues, Congruent effect

1. Introduction

The word emoji derives from the Japanese e [picture] + moji [character]; emojis are graphic symbols with predefined names and codes [1]. Since their inception, emojis have rapidly proliferated across social networks, capturing global attention and transforming digital communication [2]. Their ability to encapsulate and convey non-verbal cues, akin to expressions, gestures, and intonations in traditional communication [3,4], has been pivotal in their ascent. In computer-mediated communication (CMC), emojis have emerged as a critical tool for expressing emotions, analogous to the role of facial expressions in face-to-face interactions [5]. Although emoji are widely used because of their emotional features, the nature of their interaction with textual sentences remains to be fully understood. This research aims to delve into the emotional functionality of emojis and, building on this foundation, investigate the dynamics and outcomes of their interaction with textual sentences across a spectrum of emotional valences.

1.1. The function of emoji

Earlier studies have shown that emoji can significantly influence how people perceive the other party in a conversation, set the emotional tone, reduce ambiguity, enhance contextual expression [6], and even alter semantics [7]. There are diverging views in social semiotics about the role of emojis: some see them as an independent language capable of conveying nuanced meanings through combinations [8], while others consider them a paralinguistic tool whose meanings are influenced by the surrounding text [9,10]. Despite the controversy surrounding their characterization, it is widely accepted that emoji are packed with emotional cues [6,11]. Furthermore, similar to facial expressions in face-to-face interactions, both facial emoji and some non-facial emoji can convey emotions in CMC [12,13].

Although emoji's semantic and emotional functions are crucial in CMC, assessing their effectiveness is challenging when the combination of emoji and text is treated as a unified entity, because such combinations are processed automatically and rapidly [14]. Fortunately, the priming effect from implicit memory offers an alternative approach [15,16]. A priming effect occurs when a specific prior encounter with an item changes the ability to identify or produce it [17]. It is assessed in experimental tasks that do not require conscious recall but allow the same or similar stimuli to be presented before the response task [18]. Participants show enhanced performance when such stimuli are present compared with when they are absent. For example, Fazio et al. [19] found that participants respond faster to words when the word's valence has been elicited by prior stimuli of similar valence.

Researchers have therefore employed the priming effect to explore the functions of emoji. For instance, Yang et al. [20] used emoji to emotionally prime words and found no significant priming effect in behavioral measures, although significant effects emerged in event-related potentials. The inconsistency between behavioral and brain data can be explained by another study on the effect of emoji on neutral narrative sentences: emoji modulate the N400 elicited by subsequent text and affect processing from the second word onward [21]. Thus, the short length of emotional words may be an important reason why the behavioral priming effect disappears. Accordingly, our study takes the receiver's perspective, replaces emotional words with emotional textual sentences, and selects emoji of two emotional valences (positive, negative) and textual sentences of three valences (positive, neutral, negative), leading to the following hypothesis.

Hypothesis 1

(H1): Emoji at the beginning of a sentence produce an emotional priming effect in both accuracy and response time.

1.2. The process of the interaction between emoji and textual sentence

In addressing how emoji and textual sentences interact, numerous researchers emphasize the alignment between verbal and non-verbal cues, noting that response times decrease when the emotional cues of the emoji match those of the text [22,23]. For example, Boutet et al. [24] added emojis of three emotional valences (positive, neutral, and negative) to the end of instant messages and paired messages and emojis congruently or incongruently. They found that participants' processing speed and comprehension of the messages increased in the congruent case. Later, Hand et al. [25] expanded this to five emoji types (no emoji, negative face, neutral face, positive face, object emoji). Despite the exciting findings on the congruent effect, the incongruent condition has not been explored sufficiently. One important condition, irony, has been overlooked. Research reveals that sentences combining a positive external expression with a negative internal emotion often convey irony [26], exemplified by "I'm pretty happy right now [Image 1]". Contrary to the congruent effect, conflicting emotional cues between the emoji and the text lead to slower processing [27].

Apart from the congruent and conflicting conditions, incongruent but non-conflicting conditions are commonly ignored. Research on the interaction between emoji and text in such non-conflicting, incongruent scenarios, especially with neutral narrative sentences, remains ambiguous. Although neutral narrative sentences with positive emoji have been found to convey positive emotions [21], this result was limited to the analysis of neutral narrative sentences and was not directly compared with the congruent and conflicting conditions. Thus, this paper argues that it is not reasonable to simply reduce neutral narrative sentences to incongruent conditions.

Therefore, this study aims to specifically characterize the process and outcome of emoji-sentence interactions under congruent, conflicting, and incongruent but non-conflicting conditions. To this end, our study again uses emoji of two emotional valences (positive, negative) and textual sentences of three valences (positive, neutral, negative) as experimental material, while the response material comprises combinations of emoji and text, and establishes the following hypotheses.

Hypothesis 2

(H2): Response times will be shorter for congruent combinations than for combinations containing neutral textual sentences, and longest for conflicting combinations.

Hypothesis 3

(H3): The valence scores of non-neutral textual sentences will be reversed by the addition of emoji of the opposite valence.

2. Material and methods

2.1. Recruitment and participants

The required sample size, estimated with G*Power 3.1 [28] for a repeated-measures F test with a power of 0.95 and a medium effect size of f = 0.25, was 28. Thirty participants (aged 21.30 ± 1.26 years; half men) were randomly recruited from Jinggangshan University within one week; all were required to use social software for more than 2 h per day. All participants were native Chinese speakers, right-handed, with normal or corrected-to-normal vision, no tobacco or alcohol addiction, no recent use of any psychoactive drug, and no anxiety or depression disorder. Before the experiment, participants read and signed an informed consent form. They received remuneration upon completing the study.
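For readers without G*Power, the sketch below approximates the within-factors repeated-measures power calculation with scipy and searches for the smallest sample size reaching 0.95 power. The number of measurements (m = 6, i.e., the 2 × 3 cells), the correlation among repeated measures (ρ = 0.5), and the nonsphericity correction (ε = 1) are our assumptions corresponding to common G*Power defaults; they are not reported in the paper.

```python
# Rough reimplementation of a "repeated measures, within factors" power analysis.
# Assumed inputs (not reported by the authors): m = 6 measurements, rho = 0.5, eps = 1.
from scipy import stats

def rm_power(n, f=0.25, m=6, rho=0.5, eps=1.0, alpha=0.05):
    """Approximate power of a within-subjects repeated-measures F test for n participants."""
    lam = f**2 * n * m / (1 - rho)          # noncentrality parameter
    df1 = (m - 1) * eps                     # numerator degrees of freedom
    df2 = (n - 1) * (m - 1) * eps           # denominator degrees of freedom
    f_crit = stats.f.ppf(1 - alpha, df1, df2)
    return 1 - stats.ncf.cdf(f_crit, df1, df2, lam)

# Smallest n reaching a power of 0.95 with a medium effect size f = 0.25.
n = 2
while rm_power(n) < 0.95:
    n += 1
print(n, round(rm_power(n), 3))   # under these assumptions, n should land near the reported 28
```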

2.2. Materials

2.2.1. Emojis

Considering the influence of culture and language [29], Japanese-style emoji, which are closest to Eastern culture [30], were collected from the emojiall.com website using Excel's web data export feature. The 20 highest-scoring emojis (10 positive and 10 negative), covering both facial and non-facial expressions, were selected by combining the degree of valence and the confidence level. A questionnaire was then developed following the procedure of Yang et al. [20]. It was scored on a seven-point Likert scale, with higher scores indicating a more positive emoji and lower scores a more negative one. The internal consistency coefficient of the questionnaire was 0.899. The emojis were uniformly resized to 50 × 50 pixels in Adobe Photoshop, preserving their original colors, and the six emojis finally selected (three positive, three negative) are displayed in Table 1.

Table 1.

Experimental materials: emojis.

Types | Emoji | Valence score (M ± SD)
Positive | Image 2, Image 3, Image 4 | 6.14 ± 0.97
Negative | Image 5, Image 6, Image 7 | 2.24 ± 1.25
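The internal consistency coefficient reported above (α = 0.899) is Cronbach's alpha; a minimal sketch of the standard formula is given below, where the example ratings matrix is hypothetical illustration data rather than the authors' questionnaire responses.

```python
import numpy as np

def cronbach_alpha(ratings):
    """Cronbach's alpha for an (n_respondents x n_items) ratings matrix."""
    ratings = np.asarray(ratings, dtype=float)
    k = ratings.shape[1]                              # number of items (emojis)
    item_var = ratings.var(axis=0, ddof=1).sum()      # sum of per-item variances
    total_var = ratings.sum(axis=1).var(ddof=1)       # variance of respondents' total scores
    return k / (k - 1) * (1 - item_var / total_var)

# Hypothetical 7-point ratings from four respondents on three emojis (illustration only).
example = [[6, 7, 6], [5, 6, 7], [7, 7, 6], [6, 5, 6]]
print(round(cronbach_alpha(example), 3))
```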

2.2.2. Sentences

Sentence stimuli were built from the PANAS scale revised by Qiu et al. [31] (nine positive words and nine negative words). Each word was inserted after the second word of a sentence to form a more authentic chat sentence, and nine neutral (narrative) sentences were added to form a sentence bank, for which a dedicated questionnaire was developed. A seven-point Likert scale assessed emotional valence (M1) and authenticity (M2), with Cronbach's α of 0.917 and 0.911, respectively. Scoring varied by sentence type: M1 + M2 for positive sentences, M2 - |4 - M1| for neutral sentences, and M2 - M1 for negative sentences, with higher scores indicating better quality. The nine sentences finally selected are detailed in Table 2.

Table 2.

Experimental materials: sentences (translated version; the original Chinese version is provided in Appendix A).

Types | Sentences | Valence (M ± SD) | Authenticity (M ± SD)
Positive | I was so thankful. / I was really happy. / I was super excited. | 5.86 ± 0.85 | 5.67 ± 0.89
Neutral | I was playing computer games. / I dropped off some documents. / I spent some time reading. | 4.71 ± 0.98 | 5.22 ± 1.17
Negative | I felt really sad. / I was pretty mad. / I got really scared. | 3.86 ± 1.83 | 5.35 ± 1.11
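The type-dependent scoring rule described above can be written as a small function; the sketch below follows that rule directly, and the example ratings are hypothetical.

```python
def sentence_quality(sentence_type, m1, m2):
    """Quality score from mean valence (m1) and authenticity (m2) on 7-point scales.
    Higher scores indicate a better candidate sentence."""
    if sentence_type == "positive":
        return m1 + m2              # more positive and more authentic is better
    if sentence_type == "neutral":
        return m2 - abs(4 - m1)     # penalize deviation from the neutral midpoint (4)
    if sentence_type == "negative":
        return m2 - m1              # more negative (lower m1) and more authentic is better
    raise ValueError("unknown sentence type")

# Hypothetical ratings for three candidate sentences (illustration only).
print(sentence_quality("positive", 5.9, 5.7))
print(sentence_quality("neutral", 4.7, 5.2))
print(sentence_quality("negative", 3.9, 5.4))
```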

In addition to the text-only sentences, combined emoji-sentence stimuli were created, with the emoji always positioned at the beginning of the sentence, as shown in Table 3. The combined sentences were otherwise identical to the emoji and textual sentence materials, yielding 6 × 9 = 54 combinations, which were rendered as images for presentation in the experiment software.

Table 3.

Experimental materials: combined sentences (part of the translated version; the full and original versions are provided in Appendix A).

Sentences/emojis | Positive emoji | Negative emoji
Positive | Image 8 I was really happy. | Image 9 I was really happy.
Neutral | Image 10 I dropped off some documents. | Image 11 I dropped off some documents.
Negative | Image 12 I felt really sad. | Image 13 I felt really sad.
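Crossing the six emojis with the nine sentences yields the 54 combined stimuli, each shown twice in Experiment 1; a sketch of that crossing is shown below, with placeholder identifiers standing in for the actual stimulus files, which are not listed in the text.

```python
from itertools import product

# Placeholder identifiers; the real stimuli are the six emojis in Table 1
# and the nine sentences in Table 2 (names here are hypothetical).
emojis = [f"pos_emoji_{i}" for i in range(1, 4)] + [f"neg_emoji_{i}" for i in range(1, 4)]
sentences = [f"{val}_sentence_{i}" for val in ("pos", "neu", "neg") for i in range(1, 4)]

combined = list(product(emojis, sentences))   # emoji always precedes the sentence
assert len(combined) == 6 * 9 == 54

# Each combination presented twice per participant -> 108 experimental trials.
trials = combined * 2
print(len(trials))
```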

2.3. Designs

This study consisted of two progressive behavioral experiments. Experiment 1 used a 2 (emoji type: positive, negative) × 3 (textual sentence type: positive, neutral, negative) within-subjects design; the within-subjects independent variables were emoji type and textual sentence type, and the dependent variables were accuracy and response time. Experiment 2 used the same 2 (emoji type: positive, negative) × 3 (textual sentence type: positive, neutral, negative) within-subjects design, with emoji type and textual sentence type as within-subjects independent variables and response time and emotional valence score as dependent variables.

2.4. Procedures

2.4.1. Experiment 1

The three positive and three negative emoji stimuli were combined with the three positive, three neutral, and three negative sentence stimuli, creating 54 unique experimental stimuli (6 × 9), each randomly presented twice, for a total of 108 trials. In addition, 18 practice trials using similar materials familiarized participants with the task. Trials were run in E-Prime 3 with participants seated 70 cm from the screen. A "+" fixation point appeared for 500 ms, followed by a 300 ms emoji display. Because a 500 ms stimulus onset asynchrony (SOA) facilitates semantic priming [32], a 200 ms blank screen followed the emoji (300 ms emoji + 200 ms blank = 500 ms SOA) before the sentence stimulus, which remained on screen until response. Participants pressed "a", "s", or "d" to judge the sentence as positive, neutral, or negative, respectively. Accuracy and response time were recorded; the trial sequence is depicted in Fig. 1.

Fig. 1. Flow chart of Experiment 1 (one trial).
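The authors implemented the task in E-Prime 3; purely to illustrate the reported timeline (500 ms fixation, 300 ms emoji, 200 ms blank, sentence until an "a"/"s"/"d" response), a rough PsychoPy-style sketch of one Experiment 1 trial follows. The file names, window settings, and key-to-valence mapping comments are assumptions, not the authors' code.

```python
from psychopy import visual, core, event

win = visual.Window(size=(1024, 768), color="white", units="pix")
clock = core.Clock()

def run_trial(emoji_path, sentence_path):
    """One Experiment 1 trial: fixation 500 ms, emoji 300 ms, blank 200 ms,
    then sentence until keypress ('a' = positive, 's' = neutral, 'd' = negative)."""
    fixation = visual.TextStim(win, text="+", color="black")
    fixation.draw(); win.flip(); core.wait(0.5)

    emoji = visual.ImageStim(win, image=emoji_path, size=(50, 50))
    emoji.draw(); win.flip(); core.wait(0.3)

    win.flip(); core.wait(0.2)      # blank screen; emoji-to-sentence SOA = 300 + 200 = 500 ms

    sentence = visual.ImageStim(win, image=sentence_path)
    sentence.draw(); win.flip(); clock.reset()
    key, rt = event.waitKeys(keyList=["a", "s", "d"], timeStamped=clock)[0]
    return key, rt

# Hypothetical stimulus files (illustration only).
print(run_trial("pos_emoji_1.png", "pos_sentence_1.png"))
win.close()
```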

2.4.2. Experiment 2

Experiment 2 mirrored Experiment 1 in setup and execution. Each trial began with a "+" displayed for 500 ms, followed by the emoji stimulus for 300 ms, a 200 ms blank screen, the textual sentence stimulus for 300 ms, and another 200 ms blank. Participants then judged the valence of the emoji-sentence combination they had just seen by pressing "a", "s", or "d", at which point response time was recorded. The keystroke determined the subsequent page and rating scale: "a" or "d" led to a 1-3 rating scale (higher numbers indicating stronger valence), while "s" was recorded as 0. Each trial concluded as shown in Fig. 2. Only response time and valence scores were collected; negative valence scores were converted to negative values, yielding a seven-point emotional valence scale.

Fig. 2. Flow chart of Experiment 2 (one trial).
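The rating key and the 1-3 scale collapse into a single signed seven-point valence score (-3 to +3, with 0 for "s"); a minimal sketch of that recoding is given below, assuming the same key mapping as Experiment 1 ("a" positive, "s" neutral, "d" negative), which the text does not restate explicitly.

```python
def signed_valence(key, rating=None):
    """Convert an Experiment 2 response to a signed valence score.
    Assumed mapping: 'a' (positive) keeps the 1-3 rating, 'd' (negative) negates it,
    and 's' (neutral) is recorded as 0, giving a seven-point scale from -3 to +3."""
    if key == "s":
        return 0
    if key == "a":
        return rating        # +1, +2, or +3
    if key == "d":
        return -rating       # -1, -2, or -3
    raise ValueError("unexpected key")

print(signed_valence("a", 3), signed_valence("s"), signed_valence("d", 2))  # 3 0 -2
```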

2.5. Data analysis

Response times more than 3 standard deviations from the mean were excluded before analysis. SPSS 26.0 was used for statistical analysis, with α set at 0.05 (two-tailed). Data from both experiments were tested for normality, found to be normally distributed, and were therefore suitable for parametric testing. Main effects and interactions in both experiments were estimated with two-way repeated-measures ANOVAs. Mauchly's test of sphericity was performed first [33]; the Greenhouse-Geisser or Huynh-Feldt correction was applied when the sphericity assumption was violated, and the Bonferroni method was used for post hoc comparisons and simple-effects analyses [34].
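A minimal Python sketch of this pipeline is shown below: response-time trimming at 3 SD (here applied per participant, since the text does not specify per-participant versus overall trimming) followed by a 2 × 3 repeated-measures ANOVA via statsmodels' AnovaRM. The file and column names are assumptions, and AnovaRM does not compute Mauchly's test or the Greenhouse-Geisser/Huynh-Feldt corrections, which the authors obtained from SPSS 26.0.

```python
import pandas as pd
from statsmodels.stats.anova import AnovaRM

# Assumed long-format columns: subject, emoji_type, sentence_type, rt (ms); file name is hypothetical.
df = pd.read_csv("experiment1_rt.csv")

# Exclude response times more than 3 SD from each participant's mean.
grp = df.groupby("subject")["rt"]
z = (df["rt"] - grp.transform("mean")) / grp.transform("std")
clean = df[z.abs() <= 3]

# Aggregate to one mean RT per subject and condition, then run the 2 x 3 within-subjects ANOVA.
cell_means = (clean.groupby(["subject", "emoji_type", "sentence_type"], as_index=False)["rt"]
                   .mean())
res = AnovaRM(cell_means, depvar="rt", subject="subject",
              within=["emoji_type", "sentence_type"]).fit()
print(res)
# Note: sphericity tests and epsilon corrections are not provided by AnovaRM.
```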

3. Results

3.1. Experiment 1

3.1.1. Accuracy

The main effect of textual sentence type was significant, F(1.26, 36.39) = 9.031, p = 0.003, ηp² = 0.237, correction value ε = 0.627; the interaction between emoji type and textual sentence type was also significant, F(1.41, 40.92) = 5.600, p = 0.014, ηp² = 0.162, correction value ε = 0.627. Mean accuracy and standard errors for each condition are shown in Fig. 3. Post hoc comparisons showed that, with positive emoji stimuli, accuracy was significantly higher for positive sentences than for negative (p = 0.004) and neutral (p = 0.037) sentences; with negative emoji stimuli, accuracy was significantly higher for negative sentences than for neutral sentences (p = 0.008); and for positive sentences, accuracy was significantly higher with positive emoji stimuli than with negative emoji stimuli (p = 0.006).

Fig. 3. Mean accuracy for positive, neutral, and negative sentences by emojis of different valence. Note: *p < 0.05, **p < 0.01, ***p < 0.001; same below.

3.1.2. Response time

The main effect of emoji type was significant, F(1, 29) = 6.266, p = 0.018, ηp² = 0.178: positive emoji stimuli were responded to significantly faster than negative emoji (p = 0.018), as shown in Fig. 4A. The main effect of textual sentence type was also significant, F(2, 58) = 3.283, p = 0.045, ηp² = 0.102, but post hoc comparisons showed no significant differences between any of the three sentence types. Mean response times for each condition are shown in Fig. 4B.

Fig. 4. Mean response time (ms) for emojis (A) and sentences (B) of different valence.

3.2. Experiment 2

3.2.1. Response time

Neither main effect was significant (p > 0.05). However, the interaction between emoji type and textual sentence type was significant, F(1.38, 39.93) = 8.826, p = 0.002, ηp² = 0.233, correction value ε = 0.688. Mean response times for each condition are shown in Fig. 5. Post hoc comparisons revealed that, for combined sentences containing positive or neutral textual sentences, response times were shorter when the sentence contained a positive emoji rather than a negative emoji (p = 0.011; p = 0.010). Conversely, for combined sentences containing negative textual sentences, response times were shorter with a negative emoji than with a positive emoji (p = 0.020). Among the sentences containing positive emoji, response times were shorter for combinations containing positive and neutral textual sentences than for those containing negative textual sentences (p = 0.018; p = 0.005). Among the sentences containing negative emoji, only the combinations containing negative textual sentences had shorter response times than those containing positive textual sentences (p = 0.042).

Fig. 5. Mean response time (ms) for positive, neutral, and negative sentences with emojis of different valence.

3.2.2. Valence score

The main effect of emoji type was significant, F(1, 29) = 14.970, p < 0.001, ηp² = 0.340; the main effect of textual sentence type was significant, F(1.29, 37.52) = 108.906, p < 0.001, ηp² = 0.790, correction value ε = 0.647; and the interaction between emoji type and textual sentence type was significant, F(1.48, 42.86) = 4.028, p = 0.036, ηp² = 0.122, correction value ε = 0.739. Mean valence scores for each condition are shown in Fig. 6. Post hoc comparisons showed that, after positive emoji stimulation, the valence of positive sentences was significantly higher than that of neutral and negative sentences, and neutral sentences scored higher than negative sentences (p < 0.001). After negative emoji stimulation, the same pattern held, with positive sentences scoring higher than neutral and negative sentences, and neutral sentences scoring higher than negative sentences (p < 0.001). For all three sentence types, valence scores were significantly higher following positive emoji stimulation than following negative emoji stimulation (p < 0.001; p < 0.001; p = 0.004).

Fig. 6. Mean valence scores for positive, neutral, and negative sentences by emojis of different valence. Valence scores below 0 are judged as negative, and scores above 0 as positive.

4. Discussion

4.1. The emotional priming effect of emoji

The accuracy results in Experiment 1 showed that, when judging the valence of emotionally valenced sentences, accuracy was higher when the preceding emoji had the same valence than when it had a different valence. This result is consistent with previous work showing a facilitation effect when emoji and sentence valence are congruent and a hindrance effect when they are incongruent [24]. The distinction is that their study presented emoji and message together, whereas ours presented them sequentially. Although our results differ from the behavioral results of Yang et al. [20], their supplementary EEG findings, including N1 and N400 amplitudes, indicated a congruence effect; this suggests that the effect exists but demands more sensitive detection methods. Compared with the emotional words used in their study, the textual sentences in the present study evidently required longer processing, so the priming effect could also be observed at the behavioral level.

In addition, the results showed that neutral sentences were less likely to be judged correctly, which we attribute to the added emoji stimuli. This suggests that emotional emoji can partially, or even fully, determine the emotional valence of neutral sentences [35], in line with findings from research on emotional words [36]. A convincing explanation is that neutral narrative sentences or words carry fewer emotional cues and therefore rely on emoji for supplementary information [37].

The response-time results indicate that people react faster to positive emojis than to negative ones and identify positive sentences faster than sentences of other valences, so H1 was accepted. It can therefore be assumed that people are more sensitive to positive emotions during perception, which is closely linked to the more frequent use of positive emoji in online communication [5,38].

4.2. The process of interaction between emoji and textual sentence

The response-time data in Experiment 2 showed that, regardless of the valence of the emoji and the textual sentence, response times were significantly shorter in the congruent condition than in the conflicting condition. This result is not surprising, as similar findings have been reported in both congruent [24] and conflicting [21] studies. In addition, Leathers et al. [39] make a similar argument about face-to-face communication, noting that individuals must perform extra processing when emotional cues conflict. Interestingly, however, compared with the other sentence types, response times were fastest for neutral sentences combined with positive emoji, whereas with negative emoji the response time for neutral sentences fell in the middle. This outcome challenges the initial categorization of such conditions as merely incongruent or non-conflicting, so H2 was only partly accepted.

These results indicate that emoji and textual sentences interact in complex, dynamic ways. One possible explanation is that the number and complexity of emotional cues influence processing speed, an idea mainly derived from Cognitive Load Theory [40]. In the combined sentences, neutral sentences with the emoji as the only emotional cue do not require cue integration, so they are processed faster. Compared with the congruent condition, the emotional cues in the conflicting condition are more complex and require longer processing [21]. However, this interpretation only accounts for combined sentences containing positive emoji; when combined sentences contain negative emoji, negative text is processed faster than neutral text. Although previous research suggests that congruent emotional cues may be processed even faster than single cues, that conclusion was based on stimuli from two different channels, visual and auditory [41]. Even if there were no channel difference, this argument would not explain the different effects of positive and negative emoji on subsequent text.

Evolutionary theory, however, can help explain the outcomes associated with negative emojis. Individuals tend to prioritize the processing of emotional information that bears on their survival [42], which implies that negative emotional cues are more likely to be noticed. Consequently, when sentences contain negative emojis, negative text is processed more rapidly than neutral text.

For combined sentences containing positive emoji, on the other hand, the positivity preference clearly influenced neutral narrative sentences [38]. Nevertheless, this idea merely describes the result and does not explain its underlying cause. It has been demonstrated that the scope of attention is influenced by emotional state: activating a positive emotional state broadens the scope of attention, whereas a negative emotional state narrows it [43]. This is why sentence processing initiated by positive emojis is consistently faster than that initiated by negative emojis in the non-conflicting condition.

The valence-score results showed that, regardless of the valence of the textual sentence, combinations with positive emoji always scored higher than combinations with negative emoji. Moreover, the valence of any combination, whether paired with a positive or a negative emoji, primarily aligned with the valence of the textual sentence. In other words, no reversal of the valence score occurred, and H3 was therefore rejected. This result seems somewhat counterintuitive, as the negative perception of sarcasm can clearly be strong. However, returning to fundamentals, the concept of emoji as a paralanguage is increasingly recognized [10]. This idea from social semiotics explains precisely why the emotional cues of emoji play a secondary role in CMC, while the emotional cues of textual sentences are dominant.

In summary, this study first explores the emotional priming effect of emoji and finds that emoji can emotionally influence subsequent text regardless of their valence. Second, the interaction between emoji and textual sentence is specified: processing is slowest when they conflict, and in non-conflicting conditions processing speed is moderated by the type of emoji. Finally, it is verified that emojis serve an emotion-assisting function, that is, they can affect the emotional valence of the text, enhancing, reinforcing, or weakening it, but they do not qualitatively change the sentence. This study has several limitations, including the lower temporal resolution of behavioral experiments compared with event-related potentials. In addition, the absence of predefined dialogue scenarios may increase random error, control over additional variables during administration remains imperfect, and a larger sample size is needed. Nonetheless, the study provides useful reference points for subsequent work.

5. Conclusion

First, the emotional priming effect of emoji exists. Second, processing is slowest when emoji and text are emotionally conflicting, while in non-conflicting conditions the type of emoji moderates the processing of combined sentences. Third, in emotion perception, the emotional function of emoji plays only a secondary role, and the emotion expressed by the textual content plays the decisive role. These results suggest that, in online communication, emoji can be used to modulate the emotional expression of a conversation, but the core emotion still comes from the textual content.

Ethics statement

This study has received full ethical approval from the committee of the Capital University of Physical Education and Sports [approval number 2023A021].

Data availability statement

All data and materials have been made publicly available at Mendeley Data (https://doi.org/10.17632/6xv8mkkz88.1).

CRediT authorship contribution statement

Yuan-fu Dai: Writing – review & editing, Writing – original draft, Visualization, Software, Resources, Project administration, Methodology, Investigation, Formal analysis, Data curation, Conceptualization. Xiao-yan Gao: Writing – review & editing, Project administration, Methodology, Investigation, Data curation, Conceptualization. Wen-wu Leng: Writing – review & editing, Software, Methodology, Investigation, Data curation. Chen Huang: Writing – review & editing, Validation, Methodology, Data curation. Wen-jing Yu: Writing – review & editing, Investigation, Data curation. Chang-hao Jiang: Writing – review & editing, Writing – original draft, Validation, Supervision, Project administration, Methodology, Funding acquisition, Conceptualization.

Declaration of competing interest

The authors declare that they have no known competing financial interests or personal relationships that could have appeared to influence the work reported in this paper.

Acknowledgement

This study acknowledges the support from the Beijing Municipal Social Science Foundation (19YTA001) and the Emerging Interdisciplinary Platform for Medicine and Engineering in Sports of the Capital University of Physical Education and Sports.


Appendix A. Supplementary data

The following is the Supplementary data to this article.

Multimedia component 1
mmc1.docx (84.4KB, docx)

References

1. Rodrigues D., Prada M., Gaspar R., Garrido M.V., Lopes D. Lisbon emoji and emoticon database (LEED): norms for emoji and emoticons in seven evaluative dimensions. Behav. Res. Methods. 2018;50(1):392–405. doi: 10.3758/s13428-017-0878-6.
2. Kejriwal M., Wang Q., Li H., Wang L. An empirical study of emoji usage on Twitter in linguistic and national contexts. Online Social Networks and Media. 2021;24. doi: 10.1016/j.osnem.2021.100149.
3. Archer D., Akert R.M. Words and everything else: verbal and nonverbal cues in social interpretation. J. Pers. Soc. Psychol. 1977;35(6):443–449. doi: 10.1037/0022-3514.35.6.443.
4. Aull B. A study of phatic emoji use in WhatsApp communication. Internet Pragmat. 2019;2(2):206–232. doi: 10.1075/ip.00029.aul.
5. Walther J.B., D'Addario K.P. The impacts of emoticons on message interpretation in computer-mediated communication. Soc. Sci. Comput. Rev. 2001;19(3):324–347. doi: 10.1177/089443930101900307.
6. Kaye L.K., Wall H.J., Malone S.A. "Turn that frown upside-down": a contextual account of emoticon usage on different virtual platforms. Comput. Hum. Behav. 2016;60:463–467. doi: 10.1016/j.chb.2016.02.088.
7. Sampietro A. Emoji and rapport management in Spanish WhatsApp chats. J. Pragmat. 2019;143:109–120. doi: 10.1016/j.pragma.2019.02.009.
8. López R.P., Cap F. Did you ever read about frogs drinking coffee? Investigating the compositionality of multi-emoji expressions. In: Proceedings of the 8th Workshop on Computational Approaches to Subjectivity, Sentiment and Social Media Analysis. Association for Computational Linguistics; Copenhagen, Denmark: 2017. pp. 113–117.
9. Parkwell C. Emoji as social semiotic resources for meaning-making in discourse: mapping the functions of the toilet emoji in Cher's tweets about Donald Trump. Discourse, Context & Media. 2019;30. doi: 10.1016/j.dcm.2019.100307.
10. Logi L., Zappavigna M. A social semiotic perspective on emoji: how emoji and language interact to make meaning in digital messages. New Media Soc. 2023;25(12):3222–3246. doi: 10.1177/14614448211032965.
11. Bai Q., Dan Q., Mu Z., Yang M. A systematic review of emoji: current research and future perspectives. Front. Psychol. 2019;10:2221. doi: 10.3389/fpsyg.2019.02221.
12. Jaeger S.R., Ares G. Dominant meanings of facial emoji: insights from Chinese consumers and comparison with meanings from internet resources. Food Qual. Prefer. 2017;62:275–283. doi: 10.1016/j.foodqual.2017.04.009.
13. Riordan M.A. The communicative role of non-face emojis: affect and disambiguation. Comput. Hum. Behav. 2017;76:75–86. doi: 10.1016/j.chb.2017.07.009.
14. Comesaña M., Soares A.P., Perea M., Piñeiro A.P., Fraga I., Pinheiro A. ERP correlates of masked affective priming with emoticons. Comput. Hum. Behav. 2013;29(3):588–595. doi: 10.1016/j.chb.2012.10.020.
15. Roediger H.L. Implicit memory: retention without remembering. Am. Psychol. 1990;45(9):1043–1056. doi: 10.1037/0003-066X.45.9.1043.
16. Weingarten E., Chen Q., Mcadams M., et al. From primed concepts to action: a meta-analysis of the behavioral effects of incidentally presented words. Psychol. Bull. 2016;142(5):472–497. doi: 10.1037/bul0000030.
17. Tulving E., Schacter D.L. Priming and human memory systems. Science. 1990;247(4940):301–306. doi: 10.1126/science.2296719.
18. Schacter D.L., Buckner R.L. Priming and the brain. Neuron. 1998;20(2):185–195. doi: 10.1016/S0896-6273(00)80448-1.
19. Fazio R.H., Sanbonmatsu D.M., Powell M.C., Kardes F.R. On the automatic activation of attitudes. J. Pers. Soc. Psychol. 1986;50:229–238. doi: 10.1037/0022-3514.50.2.229.
20. Yang J., Yang Y., Xiu L., Yu G. Effect of emoji prime on the understanding of emotional words - evidence from ERPs. Behav. Inf. Technol. 2022;41(6):1313–1322. doi: 10.1080/0144929X.2021.1874050.
21. Pfeifer V.A., Armstrong E.L., Lai V.T. Do all facial emojis communicate emotion? The impact of facial emojis on perceived sender emotion and text processing. Comput. Hum. Behav. 2022;126. doi: 10.1016/j.chb.2021.107016.
22. Nexø L.A., Strandell J. Testing, filtering, and insinuating: matching and attunement of emoji use patterns as non-verbal flirting in online dating. Poetics. 2020;83. doi: 10.1016/j.poetic.2020.101477.
23. Kaye L.K., Darker G.M., Rodriguez-Cuadrado S., Wall H.J., Malone S.A. The emoji spatial Stroop task: exploring the impact of vertical positioning of emoji on emotional processing. Comput. Hum. Behav. 2022;132. doi: 10.1016/j.chb.2022.107267.
24. Boutet I., Leblanc M., Chamberland J.A., Collin C.A. Emojis influence emotional communication, social attributions, and information processing. Comput. Hum. Behav. 2021;119. doi: 10.1016/j.chb.2021.106722.
25. Hand C.J., Burd K., Oliver A., Robus C.M. Interactions between text content and emoji types determine perceptions of both messages and senders. Computers in Human Behavior Reports. 2022;8. doi: 10.1016/j.chbr.2022.100242.
26. Chauhan D.S., Singh G.V., Arora A., Ekbal A., Bhattacharyya P. An emoji-aware multitask framework for multimodal sarcasm detection. Knowledge-Based Syst. 2022;257. doi: 10.1016/j.knosys.2022.109924.
27. Howman H.E., Filik R. The role of emoticons in sarcasm comprehension in younger and older adults: evidence from an eye-tracking experiment. Q. J. Exp. Psychol. 2020;73(11):1729–1744. doi: 10.1177/1747021820922804.
28. Faul F., Erdfelder E., Lang A., Buchner A. G*Power 3: a flexible statistical power analysis program for the social, behavioral, and biomedical sciences. Behav. Res. Methods. 2007;39(2):175–191. doi: 10.3758/BF03193146.
29. Barbieri F., Kruszewski G., Ronzano F., Saggion H. How cosmopolitan are emojis? Exploring emojis usage and meaning over different languages with distributional semantics. In: Proceedings of the 24th ACM International Conference on Multimedia. Association for Computing Machinery; Amsterdam, The Netherlands: 2016. pp. 531–535.
30. Yuki M., Maddux W.W., Masuda T. Are the windows to the soul the same in the East and West? Cultural differences in using the eyes and mouth as cues to recognize emotions in Japan and the United States. J. Exp. Soc. Psychol. 2007;43(2):303–311. doi: 10.1016/j.jesp.2006.02.004.
31. Qiu L., Zheng X., Wang Y.F. Revision of the Positive Affect and Negative Affect Scale. Chinese Journal of Applied Psychology. 2008;14(3):249–254.
32. Holderbaum C.S., Mansur L.L., de Salles J.F. Heterogeneity in semantic priming effect with a lexical decision task in patients after left hemisphere stroke. Dementia & Neuropsychologia. 2016;10(2):91–97. doi: 10.1590/S1980-5764-2016DN1002004.
33. Keselman H.J., Rogan J.C., Mendoza J.L., Breen L.J. Testing the validity conditions of repeated measures F tests. Psychol. Bull. 1980;87(3):479–481. doi: 10.1037/0033-2909.87.3.479.
34. Castañeda M.B., Levin J.R., Dunham R.B. Using planned comparisons in management research: a case for the Bonferroni procedure. J. Manag. 1993;19(3):707–724. doi: 10.1016/0149-2063(93)90012-C.
35. Ted Luor T., Wu L., Lu H., Tao Y. The effect of emoticons in simplex and complex task-oriented communication: an empirical study of instant messaging. Comput. Hum. Behav. 2010;26(5):889–895. doi: 10.1016/j.chb.2010.02.003.
36. Kousta S., Vinson D.P., Vigliocco G. Emotion words, regardless of polarity, have a processing advantage over neutral words. Cognition. 2009;112(3):473–481. doi: 10.1016/j.cognition.2009.06.007.
37. Daft R.L., Lengel R.H. Organizational information requirements, media richness and structural design. Manag. Sci. 1986;32(5):554–571. doi: 10.1287/mnsc.32.5.554.
38. Vidal L., Ares G., Jaeger S.R. Use of emoticon and emoji in tweets for food-related emotional expression. Food Qual. Prefer. 2016;49:119–128. doi: 10.1016/j.foodqual.2015.12.002.
39. Leathers D.G. The impact of multichannel message inconsistency on verbal and nonverbal decoding behaviors. Commun. Monogr. 1979;46(2):88–100. doi: 10.1080/03637757909375994.
40. Plass J.L., Kalyuga S. Four ways of considering emotion in cognitive load theory. Educ. Psychol. Rev. 2019;31(2):339–359. doi: 10.1007/s10648-019-09473-5.
41. Gerdes A.B.M., Wieser M.J., Alpers G.W. Emotional pictures and sounds: a review of multimodal interactions of emotion cues in multiple domains. Front. Psychol. 2014;5:1351. doi: 10.3389/fpsyg.2014.01351.
42. Cohen N., Henik A., Mor N. Can emotion modulate attention? Evidence for reciprocal links in the attentional network test. Exp. Psychol. 2011;58(3):171–179. doi: 10.1027/1618-3169/a000083.
43. Friedman R.S., Forster J. Implicit affective cues and attentional tuning: an integrative review. Psychol. Bull. 2010;136(5):875–893. doi: 10.1037/a0020495.
