Frontiers in Psychology. 2017 Jan 6; 7:2061. doi: 10.3389/fpsyg.2016.02061

An Integrated Review of Emoticons in Computer-Mediated Communication

Nerea Aldunate 1,*, Roberto González-Ibáñez 2
PMCID: PMC5216035  PMID: 28111564

Abstract

Facial expressions constitute a rich source of non-verbal cues in face-to-face communication. They provide interlocutors with resources to express and interpret verbal messages, which may affect their cognitive and emotional processing. In contrast, computer-mediated communication (CMC), particularly text-based communication, is limited to the use of symbols to convey a message, and facial expressions cannot be transmitted naturally. In this scenario, people use emoticons as paralinguistic cues to convey emotional meaning. Research has shown that emoticons contribute to a greater social presence as a result of the enrichment of text-based communication channels. Additionally, emoticons constitute a valuable resource for language comprehension by providing expressivity to text messages. The latter findings have been supported by studies in neuroscience showing that particular brain regions involved in emotional processing are also activated when people are exposed to emoticons. To reach an integrated understanding of the influence of emoticons in human communication on both socio-cognitive and neural levels, we review the literature on emoticons in three different areas. First, we present relevant literature on emoticons in CMC. Second, we examine the influence of emoticons on language comprehension. Finally, we review the incipient neuroscience research on this topic. This mini review reveals that, while there are plenty of studies on the influence of emoticons in communication from a social psychology perspective, little is known about the neurocognitive basis of their effects on communication dynamics.

Keywords: emoticons, emotional expressivity, computer-mediated communication, language comprehension, neuroscience

Introduction

Emotions are one of the most important information signals in social cognition (Pessoa, 2009). This information usually comes from facial expressions in face-to-face interactions. However, today people use alternative communication channels, such as text-based communication, to interact with others who are not physically present. In such cases, facial information is not always available. The lack of affective signals and pragmatic information in human communication has been associated with difficulty in understanding other people (Kiesler et al., 1984). In computer-mediated communication (CMC), particularly in text-based communication, face-related information has been partially substituted by emoticons, which enable interlocutors to express different emotional states by using text-based representations of facial expressions. Although emoticons are unnatural, iconic, and static representations of human facial expressions, they have become a popular resource for enriching text-based communication since their introduction in 1982 (Vincent and Fortunati, 2009). Beyond their uses and impacts, little is known about how emoticons are processed in CMC at the brain level.

In this mini review, we examine socio-cognitive and neuroscience evidence on the psychological effects of emoticons in CMC and on how they are processed in human cognition. The article is structured in four parts. First, we review relevant literature on the importance of facial expressions in communication. Second, we discuss the importance of emoticons in text-based communication, social presence, and language processing. Third, we review neuroscience research on emoticons. Finally, we conclude with a discussion of future research on this topic and its practical implications for the cognitive sciences.

The Importance of Facial Expressions in Communication

Natural interactions in human communication are full of contextual cues, such as gestures, prosody, and facial expressions, which are processed for common understanding. Some theories of human interaction hold that meaning comprehension requires an ecological context and the use of expressive non-verbal tools to enrich communication, as is the case in face-to-face contexts (Hörmann, 1981).

Facial expressions are considered one of the most important cues in human communication. Evidence suggests that, from birth, human beings prefer face-like stimuli over other types of stimuli (Goren et al., 1975; Johnson et al., 1991). The face is one of the most visible and complex sources of information about the emotional states of individuals. Faces are integrated into comprehension processes during social interactions and communication. In fact, most human beings have a specialized capacity to process faces through dedicated mechanisms (Ekman et al., 2013). For example, face perception involves a complex neural network that enables people to recognize aspects of others’ facial structure and to communicate socially through the perception of facial movements (Haxby et al., 2000, 2002).

In natural conditions, facial expressions and other contextual cues inform individuals about the affective states of the participants in the communication process. However, during the last decades, alternative communication media (e.g., desktop computers and smartphones) have proliferated in developed countries (Poushter et al., 2015). In particular, text-based CMC introduces a physical gap that implies a loss of contextual information (e.g., facial expressions). This gap poses a difficulty for CMC, especially for communication involving figurative language (Sarbaugh-Thompson and Feldman, 1998), and the resulting lack of nonverbal and contextual cues may have negative (Sproull and Kiesler, 1986; Okdie et al., 2011) or positive (Walther, 1995, 1996; Bargh et al., 2002) effects on human relationships.

The lack of face-to-face interaction, and therefore the absence of regulating feedback, affects interactions at different levels. It also causes people to behave differently in some CMC contexts. For example, it is known that when small groups interact face-to-face in a problem-solving discussion, they are more likely to reach agreement than when they communicate in a computerized conference (Hiltz et al., 1986). On the other hand, self-awareness has been reported to be lower in face-to-face communication than in CMC (Matheson and Zanna, 1988, 1990). These differences are attributed to difficulties in communicative coordination when the context provides neither informational feedback, nor cues for controlling the discussion, nor nonverbal involvement (Kiesler et al., 1984).

The Importance of Emoticons in Text-Based Communication

Two factors that influence the success of a communication channel are its “social presence” and its “richness.” On the one hand, social presence (Short et al., 1976; Gunawardena and Zittle, 1997) refers to the degree to which a channel can transmit relational information that increases the perception of human interaction through non-verbal cues such as facial expressions (Short et al., 1976). For instance, text-based CMC channels such as chat or email have less social presence than face-to-face communication because they lack social non-verbal cues for the interaction. On the other hand, media richness (Daft and Lengel, 1986) refers to how adequately a channel fits the communication context and goals so as to reduce uncertainty and equivocality. For example, managers may prefer face-to-face group meetings over email exchanges in order to avoid misinterpretation of messages when working on a task. In this scenario, face-to-face communication, which is equipped with a variety of resources such as facial expressions and prosody, is considered richer than text-based communication.

Despite its lower levels of social presence and richness, text-based communication often serves a social function; it therefore needs additional expressive tools to enrich and socialize interactions. One way text-based CMC users have made their conversations more dynamic, conveying greater social presence and richness, has been through the use of emoticons. Emoticons can be considered a humanized tool for emotional expressiveness that replaces cues naturally present in face-to-face and voice-to-voice communication. Emoticons add enjoyment and information to the interaction (Huang et al., 2008), and they also enhance the comprehension of messages (Walther and D’Addario, 2001), supporting a more accurate perception of emotions, attitudes, and intentions during text-based communication (Lo, 2008).

While there is abundant evidence on how emoticons are used in text-based CMC, little is known about how they are processed at the cognitive and brain levels. This knowledge could help to explain the effects of emoticons on text-based communication. In this sense, both cognitive science and neuroscience could contribute to addressing this question. On the one hand, research in cognitive science related to language processing constitutes a valuable framework for understanding how emoticons can influence the interpretation of messages. On the other hand, research in neuroscience provides further evidence on how the brain operates in CMC contexts, especially when interlocutors perceive affective information from schematic faces integrated with linguistic messages.

Influence of Emoticons on Language Comprehension

Facial expressions are naturally present in face-to-face communication; in mediated communication, however, and particularly in CMC, the communication of facial expressions may be limited by the richness of the underlying communication channel (Daft and Lengel, 1986). Emoticons can be defined as facial expression surrogates: symbolic, lexical, or graphical resources used in written communication to convey emotional expressions. In this sense, emoticons can serve as paralinguistic cues of an utterance, which in written communication can help writers and readers to elucidate meaning (Schuller and Batliner, 2013).
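To illustrate how emoticons can be treated computationally as paralinguistic markers, the following Python sketch extracts common typographic emoticons from a text message. It is a minimal example: the pattern, which covers only a handful of typographic forms, and the function name are our own illustrative assumptions rather than a procedure drawn from the cited studies.

```python
import re

# A small, non-exhaustive set of typographic emoticons treated as
# paralinguistic cues (illustrative only; studies typically use larger inventories).
EMOTICON_PATTERN = re.compile(r"""
    [:;=8]          # eyes
    [-^o*']?        # optional nose
    [)(\]\[DPpO3]   # mouth
    """, re.VERBOSE)

def extract_emoticons(message: str) -> list:
    """Return the typographic emoticons found in a text message."""
    return EMOTICON_PATTERN.findall(message)

# Example usage
print(extract_emoticons("Great job on the report :)"))  # [':)']
print(extract_emoticons("I love waiting in line :-("))  # [':-(']
```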

Emoticons have been described, in some cases, as useful resources in text-based mediated communication (Rivera et al., 1996). They have provided people with an alternative way to express their emotions, feelings, and mood changes to others in restricted communication channels such as text-based ones. Emoticons have been important in increasing social presence in text-based interactions, where communication tends to be cold without affective cues. For example, one study has shown that emoticons are used more often in socio-emotional contexts than in task-oriented ones (Derks et al., 2007b). Likewise, the same authors showed that emoticons are more common in interactions between friends than between strangers, and in positive contexts than in negative ones (Derks et al., 2008).

It has been stated that emoticons can play a central role in the interpretation of text messages in CMC (Walther and D’Addario, 2001). In this context, Derks et al. (2007a) and Lo (2008) found that text messages can be strengthened by using emoticons. Emoticons can also reverse the intended direction of a message, as is the case with sarcastic utterances. A related study by Filik et al. (2015) showed that emoticons are more expressive than punctuation marks in the disambiguation of sarcastic utterances. Along the same lines, González-Ibáñez et al. (2011) and Muresan et al. (2016) showed that emoticons in short messages, like those found on microblogging platforms (e.g., Twitter), are particularly helpful for humans and machines in classifying sarcastic, non-sarcastic, positive, and negative messages. For example, consider the following written sentence: “I love the customer service of this company.” Without context or additional verbal cues, it is likely to be interpreted literally. However, appending a traditional emoticon (i.e., “:)” or “:(”) to the end of the sentence clarifies its intended meaning (literal and sarcastic, respectively).
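To make this example concrete, the following Python sketch shows the kind of simple rule such classifiers can exploit: when the literal polarity of the words conflicts with the polarity signaled by a trailing emoticon, the message is flagged as potentially sarcastic. The word lists and function name are hypothetical illustrations and are not the actual features used by González-Ibáñez et al. (2011) or Muresan et al. (2016).

```python
# Toy rule: a trailing emoticon that contradicts the literal polarity of the
# words is treated as a cue for sarcasm. Word lists are illustrative only.
POSITIVE_WORDS = {"love", "great", "excellent", "wonderful"}
NEGATIVE_WORDS = {"hate", "terrible", "awful", "worst"}

def interpret(message: str) -> str:
    """Return a coarse reading of the message: literal or potentially sarcastic."""
    words = {w.strip(".,!?").lower() for w in message.split()}
    if words & POSITIVE_WORDS:
        literal = "positive"
    elif words & NEGATIVE_WORDS:
        literal = "negative"
    else:
        literal = "neutral"

    stripped = message.rstrip()
    if stripped.endswith(":)"):
        emoticon = "positive"
    elif stripped.endswith(":("):
        emoticon = "negative"
    else:
        return f"literal ({literal})"

    # A mismatch between verbal polarity and emoticon polarity suggests sarcasm.
    if literal != "neutral" and literal != emoticon:
        return f"potentially sarcastic (words {literal}, emoticon {emoticon})"
    return f"literal ({literal})"

# Example usage
print(interpret("I love the customer service of this company. :)"))  # literal (positive)
print(interpret("I love the customer service of this company. :("))  # potentially sarcastic
```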

Contrary to the above studies, results from an experiment conducted by Walther and D’Addario (2001) showed that the contribution of emoticons in interpreting messages is limited. However, the authors found consistency in the interpretation of negatively biased messages as a result of both negative emoticons and negative verbal content.

In addition to message interpretation, Thompson et al. (2016) found that emoticons have the potential to reduce the negative responses that are typically experienced when people are exposed to ironic texts. It is important to note that these findings were obtained in an artificial setting in which electrodermal activity (for arousal) and electromyography (for facial expressions) were used to measure participants’ reactions. Overall, the authors found that the presence of emoticons increased arousal levels, reduced frowning, and enhanced smiling.

The above articles illustrate the active role that emoticons can play in language comprehension, specifically in the interpretation of text messages. Although simple in form, emoticons have the potential to enrich verbal expressions by acting as substitutes for richer cues, such as facial expressions and prosody, which are naturally present in face-to-face interactions.

What Does Neuroscience Tell Us About Emoticons?

Recently, neuroscientists have begun to study neural processing during emoticon perception, using different techniques. One of these is functional Magnetic Resonance Imaging (fMRI), which provides information, with high spatial resolution, on the brain areas involved in a particular process. Another is electroencephalography (EEG), which provides high temporal resolution. Within EEG research, the event-related potential (ERP) technique provides information about the time course and characteristics of processing following a specific stimulus.

fMRI Studies

One question that has driven research is whether emoticons activate the same brain areas that faces do. One of the structures involved in face processing is the right fusiform gyrus, which participates in face recognition (Kanwisher and Yovel, 2006) and shows dysfunctional activation in patients with prosopagnosia, that is, the inability to recognize faces (Barton et al., 2002).

Initial fMRI studies indicated that isolated emoticons do not activate the same brain regions as human faces; instead, they activate areas involved in emotional discrimination, such as the right inferior frontal gyrus. Yuasa et al. (2006) showed that, in an emotional valence decision task, human faces activated the right fusiform gyrus, right inferior frontal gyrus, right middle frontal gyrus, and right inferior parietal lobe. Conversely, Japanese emoticons did not activate the right fusiform gyrus, but they did activate the right inferior frontal gyrus. These findings suggest that this kind of emoticon may be associated with emotional discrimination. In a subsequent study, Yuasa et al. (2011) investigated brain activation during exposure to emoticons presented after sentences. Results showed that emoticons are associated with nonverbal information processing and that the brain areas involved in verbal and nonverbal information processing are more active when sentences are followed by emoticons than when they are not. Specifically, they found that, although the right fusiform gyrus was not activated when sentences with emoticons were presented, the right inferior frontal gyrus was. This finding indicates that emotional discrimination based on emoticon perception is similar to that based on other non-verbal cues, such as facial expressions or speech prosody.

In another study, Shin et al. (2008) compared brain activity for pictures of real faces and real houses with their corresponding schematic icons in order to observe the neural basis of iconic symbol recognition, using emoticons as face icons. The results showed more cortical than subcortical activation for icons, involving the frontal and parietal cortex and the temporo-occipital junction in the ventral pathway, which is implicated in “know-what” processes related to object recognition. The authors suggest that this activation corresponds to neural correlates of visual concept formation, distinguishing it from the perception of concrete visual objects. Specifically, they observed bilateral activation of the fusiform gyrus during exposure to both emoticons and house icons. Such activation could be associated with the characteristics of icons, which share aspects of both words and objects. The authors also observed activation in the Visual Word Form Area, a specific region of the left fusiform gyrus that responds to the letters of words (Cohen et al., 2002) and participates in “integrating the shape elements into a meaningful whole” (Starrfelt and Gerlach, 2007; Shin et al., 2008, p. 303). Unlike emoticons, faces activated the inferior occipital cortex, which participates in processing face information at the individual level (Gauthier et al., 2000), and this activity decreased along the ventral pathway.

Recently, fusiform gyrus activity for emotional words and emoticons was examined in patients with autism spectrum disorder (ASD) (Han et al., 2014). These patients show social dysfunction associated with altered patterns of brain activation. For example, patients with ASD do not show activation in the fusiform gyrus when they see emotional facial expressions (Critchley et al., 2000; Pierce et al., 2001), nor in the right fusiform gyrus during the discrimination of facial expressions (Schultz et al., 2000). In the study by Han et al. (2014), patients with ASD and healthy participants were exposed to emotional words and emoticons in order to observe activation of the fusiform gyrus. The authors observed increased fusiform gyrus activity for emotional words in patients with ASD, which led them to hypothesize that this result could reflect a compensatory mechanism for controlling social information processing. Nevertheless, further analyses of brain activation during the presentation of emoticons revealed lower activation in the fusiform gyrus. This result suggests that, for patients with ASD, verbal descriptions are a more familiar route to emotional processing than facial expressions.

EEG Studies

Face processing has been studied with EEG, revealing the N170 ERP component associated with face perception. This component is a negative-going deflection over the occipito-temporal cortex, with a maximum peak around 170 ms after the presentation of a face, and it is related to the configural processing of faces (Bentin et al., 1996; Allison et al., 1999; McCarthy et al., 1999). It is known that this component appears even when faces are schematic (Sagiv and Bentin, 2001; Babiloni et al., 2010).
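To ground the ERP terminology used here and below, the following Python sketch shows, on simulated single-channel data, how an event-related potential is obtained by averaging EEG epochs time-locked to stimulus onset, and how a negative peak in the typical N170 latency range would be located. The sampling rate, epoch window, trial count, and amplitudes are arbitrary illustrative assumptions, not parameters taken from the cited studies.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated single-channel EEG: 60 trials sampled at 500 Hz, epoched from
# -100 ms to +400 ms around stimulus onset (all values are illustrative).
fs = 500                                  # sampling rate in Hz (assumption)
times = np.arange(-0.1, 0.4, 1 / fs)      # seconds relative to stimulus onset
n_trials = 60

# Each trial = background noise plus a negative deflection peaking near
# 170 ms, standing in for a face-evoked N170 (toy signal, in microvolts).
noise = rng.normal(0.0, 5.0, size=(n_trials, times.size))
n170 = -8.0 * np.exp(-((times - 0.17) ** 2) / (2 * 0.02 ** 2))
epochs = noise + n170

# Averaging across time-locked trials attenuates the noise and reveals the ERP.
erp = epochs.mean(axis=0)

# Locate the most negative peak within a 130-200 ms window (typical N170 range).
in_window = (times >= 0.13) & (times <= 0.20)
peak_idx = np.argmin(np.where(in_window, erp, np.inf))
print(f"Negative peak at {times[peak_idx] * 1000:.0f} ms, "
      f"amplitude {erp[peak_idx]:.1f} microvolts")
```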

Today, there are very few studies of emoticon processing using EEG recordings. Churches et al. (2014) obtained similar ERPs (N170) for typographic smiley emoticons “:-)” and faces, suggesting similarities between emoticon and face perception. A study by Comesaña et al. (2013) suggests that the processing of emoticons is more privileged than that of the words to which they refer. In particular, the authors showed a priming effect of masked emoticons on word processing, with modulations of the N2 and the late positive component (LPC), which was not observed for masked affective words. It is known that positive words with a high arousal level influence the LPC (Gibbons, 2009), reflecting the activation of motivational and attentional brain systems by emotional stimuli and the initial memory storage during the processing of affective information (Cuthbert et al., 2000). It is important to note that this component is related to a stimulus’s arousal and not to its valence (Hinojosa et al., 2009).

Concluding Thoughts

At present, human relations are rapidly evolving through the use of electronic media. This has entailed an increasing use of alternative cues to communicate emotional states in computer-mediated contexts. Emoticons, as one such alternative cue, can change people’s dispositions in comprehension processes or in interactions with others. Moreover, they can affect decisions, mood, or the perspective taken on a conversation.

The use of emoticons is important in CMC because they enable greater emotional expressiveness in the absence of context. Not only do emoticons add enjoyment to the interaction (e.g., Huang et al., 2008; Thompson et al., 2016), but they also facilitate the disambiguation of messages, enabling better comprehension (Lo, 2008); they are therefore an expressive resource. Although this is important, there is little evidence at the cognitive and brain levels on how emoticons are processed in human communication. To date, studies have focused on the processing of the structural characteristics of emoticons and their relationship with the processing of emotional information and human faces. However, further research toward understanding the processing of emoticons within the dynamics of text-based communication is required. This would enable an in-depth understanding of how affectivity unfolds in CMC interactions and how this affective dimension affects language understanding.

Additionally, it is important to note that not all emoticons have the same degree of humanization. For example, emojis are graphical emoticons that are more human-like than typographic ones. Research on the effects of different types of emoticons according to their structural characteristics (e.g., emoji, typographic, Japanese) may be useful for developing communication technology that favors social presence and the affectivity that characterizes communication in social interactions.

Author Contributions

All authors listed have made a substantial, direct and intellectual contribution to the work, and approved it for publication.

Conflict of Interest Statement

The authors declare that the research was conducted in the absence of any commercial or financial relationships that could be construed as a potential conflict of interest.

Footnotes

Funding. This study was funded by Inserción a la Academia 79140067 (PAI), Comisión Nacional de Investigación Científica y Tecnológica (CONICYT).

References

1. Allison T., Puce A., Spencer D. D., McCarthy G. (1999). Electrophysiological studies of human face perception. I: potentials generated in occipitotemporal cortex by face and non-face stimuli. Cereb. Cortex 9 415–430. 10.1093/cercor/9.5.415
2. Babiloni C., Vecchio F., Buffo P., Buttiglione M., Cibelli G., Rossini P. M. (2010). Cortical responses to consciousness of schematic emotional facial expressions: a high-resolution EEG study. Hum. Brain Mapp. 31 1556–1569. 10.1002/hbm.20958
3. Bargh J. A., McKenna K. Y., Fitzsimons G. M. (2002). Can you see the real me? Activation and expression of the “true self” on the Internet. J. Soc. Issues 58 33–48. 10.1111/1540-4560.00247
4. Barton J. J., Press D. Z., Keenan J. P., O’Connor M. (2002). Lesions of the fusiform face area impair perception of facial configuration in prosopagnosia. Neurology 58 71–78. 10.1212/WNL.58.1.71
5. Bentin S., Allison T., Puce A., Perez E., McCarthy G. (1996). Electrophysiological studies of face perception in humans. J. Cogn. Neurosci. 8 551–565. 10.1162/jocn.1996.8.6.551
6. Churches O., Nicholls M., Thiessen M., Kohler M., Keage H. (2014). Emoticons in mind: an event-related potential study. Soc. Neurosci. 9 196–202. 10.1080/17470919.2013.873737
7. Cohen L., Lehéricy S., Chochon F., Lemer C., Rivaud S., Dehaene S. (2002). Language-specific tuning of visual cortex? Functional properties of the Visual Word Form Area. Brain 125 1054–1069. 10.1093/brain/awf094
8. Comesaña M., Soares A. P., Perea M., Piñeiro A. P., Fraga I., Pinheiro A. (2013). ERP correlates of masked affective priming with emoticons. Comput. Hum. Behav. 29 588–595. 10.1016/j.chb.2012.10.020
9. Critchley H. D., Daly E. M., Bullmore E. T., Williams S. C., Van Amelsvoort T., Robertson D. M., et al. (2000). The functional neuroanatomy of social behaviour. Brain 123 2203–2212. 10.1093/brain/123.11.2203
10. Cuthbert B. N., Schupp H. T., Bradley M. M., Birbaumer N., Lang P. J. (2000). Brain potentials in affective picture processing: covariation with autonomic arousal and affective report. Biol. Psychol. 52 95–111. 10.1016/S0301-0511(99)00044-7
11. Daft R. L., Lengel R. H. (1986). Organizational information requirements, media richness and structural design. Manag. Sci. 32 554–571. 10.1287/mnsc.32.5.554
12. Derks D., Bos A. E., Von Grumbkow J. (2007a). Emoticons and online message interpretation. Soc. Sci. Comput. Rev. 26 379–388. 10.1177/0894439307311611
13. Derks D., Bos A. E., Von Grumbkow J. (2007b). Emoticons and social interaction on the Internet: the importance of social context. Comput. Hum. Behav. 23 842–849. 10.1016/j.chb.2004.11.013
14. Derks D., Bos A. E., Von Grumbkow J. (2008). Emoticons in computer-mediated communication: social motives and social context. Cyberpsychol. Behav. 11 99–101. 10.1089/cpb.2007.9926
15. Ekman P., Friesen W. V., Ellsworth P. (2013). Emotion in the Human Face: Guidelines for Research and an Integration of Findings. Atlanta, GA: Elsevier.
16. Filik R., Țurcan A., Thompson D., Harvey N., Davies H., Turner A. (2015). Sarcasm and emoticons: comprehension and emotional impact. Q. J. Exp. Psychol. 69 2130–2146. 10.1080/17470218.2015.1106566
17. Gauthier I., Tarr M. J., Moylan J., Skudlarski P., Gore J. C., Anderson A. W. (2000). The fusiform “face area” is part of a network that processes faces at the individual level. J. Cogn. Neurosci. 12 495–504. 10.1162/089892900562165
18. Gibbons H. (2009). Evaluative priming from subliminal emotional words: insights from event-related potentials and individual differences related to anxiety. Conscious. Cogn. 18 383–400. 10.1016/j.concog.2009.02.007
19. González-Ibáñez R., Muresan S., Wacholder N. (2011). Identifying sarcasm in Twitter: a closer look. Paper Presented at the 49th Annual Meeting of the Association for Computational Linguistics: Human Language Technologies (ACL-HLT 2011), Portland, OR.
20. Goren C. C., Sarty M., Wu P. Y. (1975). Visual following and pattern discrimination of face-like stimuli by newborn infants. Pediatrics 56 544–549.
21. Gunawardena C. N., Zittle F. J. (1997). Social presence as a predictor of satisfaction within a computer-mediated conferencing environment. Am. J. Distance Educ. 11 8–26. 10.1080/08923649709526970
22. Han D. H., Yoo H. J., Kim B. N., McMahon W., Renshaw P. F. (2014). Brain activity of adolescents with high functioning autism in response to emotional words and facial emoticons. PLoS ONE 9:e91214. 10.1371/journal.pone.0091214
23. Haxby J. V., Hoffman E. A., Gobbini M. I. (2000). The distributed human neural system for face perception. Trends Cogn. Sci. 4 223–233. 10.1016/S1364-6613(00)01482-0
24. Haxby J. V., Hoffman E. A., Gobbini M. I. (2002). Human neural systems for face recognition and social communication. Biol. Psychiatry 51 59–67. 10.1016/S0006-3223(01)01330-0
25. Hiltz S. R., Johnson K., Turoff M. (1986). Experiments in group decision making: communication process and outcome in face-to-face versus computerized conferences. Hum. Commun. Res. 13 225–252. 10.1111/j.1468-2958.1986.tb00104.x
26. Hinojosa J. A., Carretié L., Méndez-Bértolo C., Míguez A., Pozo M. A. (2009). Arousal contributions to affective priming: electrophysiological correlates. Emotion 9 164–171. 10.1037/a0014680
27. Hörmann H. (1981). To Mean – To Understand. New York, NY: Springer.
28. Huang A. H., Yen D. C., Zhang X. (2008). Exploring the potential effects of emoticons. Inform. Manag. 45 466–473. 10.1016/j.im.2008.07.001
29. Johnson M. H., Dziurawiec S., Ellis H., Morton J. (1991). Newborns’ preferential tracking of face-like stimuli and its subsequent decline. Cognition 40 1–19. 10.1016/0010-0277(91)90045-6
30. Kanwisher N., Yovel G. (2006). The fusiform face area: a cortical region specialized for the perception of faces. Philos. Trans. R. Soc. Lond. B Biol. Sci. 361 2109–2128. 10.1098/rstb.2006.1934
31. Kiesler S., Siegel J., McGuire T. W. (1984). Social psychological aspects of computer-mediated communication. Am. Psychol. 39 1123–1134. 10.1037/0003-066X.39.10.1123
32. Lo S. K. (2008). The nonverbal communication functions of emoticons in computer-mediated communication. Cyberpsychol. Behav. 11 595–597. 10.1089/cpb.2007.0132
33. Matheson K., Zanna M. P. (1988). The impact of computer-mediated communication on self-awareness. Comput. Hum. Behav. 4 221–233. 10.1016/0747-5632(88)90015-5
34. Matheson K., Zanna M. P. (1990). Computer-mediated communications: the focus is on me. Soc. Sci. Comput. Rev. 8 1–12. 10.1177/089443939000800102
35. McCarthy G., Puce A., Belger A., Allison T. (1999). Electrophysiological studies of human face perception. II: response properties of face-specific potentials generated in occipitotemporal cortex. Cereb. Cortex 9 431–444. 10.1093/cercor/9.5.431
36. Muresan S., González-Ibáñez R., Ghosh D., Wacholder N. (2016). Identification of nonliteral language in social media: a case study on sarcasm. J. Assoc. Inform. Sci. Technol. 67 2725–2737. 10.1002/asi.23624
37. Okdie B. M., Guadagno R. E., Bernieri F. J., Geers A. L., Mclarney-Vesotski A. R. (2011). Getting to know you: face-to-face versus online interactions. Comput. Hum. Behav. 27 153–159. 10.1016/j.chb.2010.07.017
38. Pessoa L. (2009). How do emotion and motivation direct executive control? Trends Cogn. Sci. 13 160–166. 10.1016/j.tics.2009.01.006
39. Pierce K., Müller R. A., Ambrose J., Allen G., Courchesne E. (2001). Face processing occurs outside the ‘fusiform face area’ in autism: evidence from functional MRI. Brain 124 2059–2073. 10.1093/brain/124.10.2059
40. Poushter J., Bell J., Oates R. (2015). Internet Seen as Positive Influence on Education but Negative on Morality in Emerging and Developing Nations. Washington, DC: Pew Research Center.
41. Rivera K., Cooke N. J., Bauhs J. A. (1996). The effects of emotional icons on remote communication. Paper Presented at the Conference Companion on Human Factors in Computing Systems, New York, NY.
42. Sagiv N., Bentin S. (2001). Structural encoding of human and schematic faces: holistic and part-based processes. J. Cogn. Neurosci. 13 937–951. 10.1162/089892901753165854
43. Sarbaugh-Thompson M., Feldman M. S. (1998). Electronic mail and organizational communication: does saying “hi” really matter? Organ. Sci. 9 685–698. 10.1287/orsc.9.6.685
44. Schuller B., Batliner A. (2013). Computational Paralinguistics: Emotion, Affect and Personality in Speech and Language Processing. Chichester: John Wiley & Sons.
45. Schultz R. T., Gauthier I., Klin A., Fulbright R. K., Anderson A. W., Volkmar F., et al. (2000). Abnormal ventral temporal cortical activity during face discrimination among individuals with autism and Asperger syndrome. Arch. Gen. Psychiatry 57 331–340. 10.1001/archpsyc.57.4.331
46. Shin Y. W., Kwon J. S., Kwon K. W., Gu B. M., Song I. C., Na D. G., et al. (2008). Objects and their icons in the brain: the neural correlates of visual concept formation. Neurosci. Lett. 436 300–304. 10.1016/j.neulet.2008.03.047
47. Short J., Williams E., Christie B. (1976). The Social Psychology of Telecommunications. London: John Wiley & Sons.
48. Sproull L., Kiesler S. (1986). Reducing social context cues: electronic mail in organizational communication. Manag. Sci. 32 1492–1512. 10.1287/mnsc.32.11.1492
49. Starrfelt R., Gerlach C. (2007). The visual what for area: words and pictures in the left fusiform gyrus. Neuroimage 35 334–342. 10.1016/j.neuroimage.2006.12.003
50. Thompson D., Mackenzie I. G., Leuthold H., Filik R. (2016). Emotional responses to irony and emoticons in written language: evidence from EDA and facial EMG. Psychophysiology 53 1054–1062. 10.1111/psyp.12642
51. Vincent J., Fortunati L. (2009). Electronic Emotion: The Mediation of Emotion via Information and Communication Technologies. Bern: Peter Lang.
52. Walther J. B. (1995). Relational aspects of computer-mediated communication: experimental observations over time. Organ. Sci. 6 186–203. 10.1287/orsc.6.2.186
53. Walther J. B. (1996). Computer-mediated communication: impersonal, interpersonal, and hyperpersonal interaction. Commun. Res. 23 3–43. 10.1177/009365096023001001
54. Walther J. B., D’Addario K. P. (2001). The impacts of emoticons on message interpretation in computer-mediated communication. Soc. Sci. Comput. Rev. 19 324–347. 10.1177/089443930101900307
55. Yuasa M., Saito K., Mukawa N. (2006). Emoticons convey emotions without cognition of faces: an fMRI study. Paper Presented at the Human Factors in Computing Systems Conference, Montréal, QC.
56. Yuasa M., Saito K., Mukawa N. (2011). Brain activity when reading sentences and emoticons: an fMRI study of verbal and nonverbal communication. Electron. Commun. Jpn. 94 17–24. 10.1002/ecj.10311
