2021 Feb 5;30(6):759–776. doi: 10.1177/0963662521990848

Interactions between emotional and cognitive engagement with science on YouTube

Ilana Dubovi 1, Iris Tabak 2
PMCID: PMC8314998  PMID: 33546572

Abstract

This study aimed to map and characterize public engagement with science on YouTube. A two-part study was conducted. First, we collected and quantitatively analyzed trending videos on YouTube to evaluate the magnitude of public interaction with science content. Then, we assessed actual media interactions with science-related YouTube trending videos, rather than relying on self-reports. We tested associations between the behavioral engagement of viewing, liking, disliking or commenting, and emotional and cognitive engagement. Our findings affirm that science content attracts high public interest and that emotional and cognitive engagement with science on social media are distinct, but interrelated. We show that regardless of the valence of emotional engagement, emotion is linked to greater behavioral engagement of posting comments and to greater cognitive engagement of argumentative deliberation. Therefore, our findings suggest that social media interactions, which tend to evoke emotional responses, are a promising means of advancing person-to-person engagement with science.

Keywords: argumentation, behavioral engagement, cognitive engagement, emotional engagement, public engagement in science, sentiment analysis, social media, YouTube


Social media is gaining momentum as both a context for public engagement with science, and a target for research on public engagement with science (Hargittai et al., 2018; Huber et al., 2019; Kahle et al., 2016; Metag, 2020). In fact, social media may offer alternatives to people who want to engage with science but have reservations concerning traditional or print media (Metag, 2020). However, little is known about the nature of public engagement with science through social media (Hargittai et al., 2018; Kahle et al., 2016; Metag, 2020). In addition, the term engagement connotes multiple meanings and there is a need for greater precision and conceptualization of this term (Hargittai et al., 2018; Ji et al., 2019; Jünger and Fähnrich, 2020; McCallie et al., 2009). In this article, we examine public engagement with science on YouTube, focusing on naturally occurring online behaviors, employing a more refined, multidimensional, conceptualization of engagement.

The term “engagement with science” is used to mark an active and reciprocal interaction between the public and scientific content through interaction with media, other members of the public or formal scientific entities, such as scientists or governmental research institutes (Hargittai et al., 2018; Khan, 2017; McCallie et al., 2009). Yet, “engagement” variously denotes a broad range of activities and interactions, from consumption, such as reading or viewing scientific information; to interaction, such as deliberation over the merit, applicability or moral implications of specific scientific findings; to public action, such as influencing policy (McCallie et al., 2009). Overall, the term “engagement” is underspecified and equivocal (Hargittai et al., 2018; Ji et al., 2019; Jünger and Fähnrich, 2020; McCallie et al., 2009). In this article, we synthesize literature from communication, psychology, and education to articulate a conceptualization of engagement on social media.

Less is known about public interaction with science in the context of social media than in the context of print or older media such as radio, television, and standard websites. In the context of social media, little is known, for example, about the object of interactions, the extent of interactions, the nature of interactions, and how interactions manifest on different social media platforms. While large-scale surveys are essential for identifying the public’s science information sources and attitudes toward different sources (Funk et al., 2017; National Science Board, 2018), they do not necessarily measure whether people skim or immerse themselves in these online sources, agree or disagree with the information, or what emotional responses these sources evoke in the real world. Yet, this knowledge is key for understanding the role of new media in supporting an agentive public. In this article, we apply a conceptual framework of engagement on social media to analyses of user interactions and to semantic analyses of comments on science-related YouTube channels, in order to gain such insight.

We focus on YouTube for a few reasons. First, as one of the top three platforms for social media and for broadcasting content (Alexa Internet Inc., 2020; Perrin and Anderson, 2019), it merits attention and study. Second, the myriad of content options and scale of use offer potential for studying a diverse audience with respect to public engagement with science. Finally, the post-video comment space presents a context where debate and deliberation can be observed and measured (Dubovi and Tabak, 2020), yet this comment space is understudied (Murthy and Sharma, 2019).

Our two research questions are (1) what is the proportion of engagement with science within top trending videos on YouTube? and (2) what characterizes behavioral, cognitive, and emotional engagement with scientific content?

1. Literature review

Public participation in science on social media

Investigating Internet and social media users’ comments on science-related sites and discussions can provide a window into the interests, attitudes, thoughts, and deliberations of a myriad of people who use these sites world-wide. For example, Kahle et al. (2016) found that visiting a site correlated with curiosity and with commenting; they further found that commenting corresponded with controversy and that arguments were associated with longer threads. This suggests that comments might reflect a deeper level, or a different form or quality, of engagement. Considerable research on user comments examines the lifespan and flow of ideas, tracking elements such as retweeting, or one comment taking up ideas from another comment (Bhattacharya et al., 2014; Hine, 2014; Salathé and Khandelwal, 2011).

Scientists and science communicators impose a frame (Jünger and Fähnrich, 2020; Nisbet and Mooney, 2007; Tabak, 2016) on the article or video that they publish and post. That is, the content that they present conveys not only facts, but also attitudes and positions, even if these are not expressed explicitly (Chong and Druckman, 2007; Entman, 1993). Therefore, one line of research examining the content of user comments studies whether the tone, position expressed or topic of user comments correspond with those of the focal news article or video. For example, Tian (2010) found that how a video is framed in its written description on YouTube altered the valence of the comments about that video (Tian and Yoo, 2020). Similarly, in a variety of videos on genetically modified food, different sources, such as advocacy groups or scientists, evinced different tones, such as positive, negative or neutral, and user comments similarly varied by source but generally reflected the tone of the video (Chi et al., 2018).

Unlike sentiment and tone, there was no clear pattern in the topics that users raised in their comments: some topics were similar to those raised in the video, and some were different (Kahle et al., 2016; Knowles and Wilkinson, 2015; Tian, 2010; Zhang, 2016). It is possible that similarity in tone and sentiment is a result of a process of self-selection, where people choose to view and comment on videos that correspond with their views. However, expressions of disagreement are sometimes associated with extensive comments (Dubovi and Tabak, 2020), suggesting that disagreement as well as agreement draw viewing and commenting. Therefore, the relationship between video content and the tone and sentiment of comments is complex, and comment analysis alone does not provide sufficient information to determine whether video content affects comment sentiment.

Increasingly, scholars have become interested in how people discuss science on social media and whether these discussions help to create a deeper understanding of how the public uses and constructs scientific knowledge. One consistent finding is that user comments include many references to personal experiences, and that these personal experiences are considered credible sources of evidence by the commentators as well as by other users (Kahle et al., 2016; Knowles and Wilkinson, 2015; Len-Ríos et al., 2014; Morphett et al., 2020; Zhang, 2016). People are unlikely to cite sources in support of their claims (Zhang, 2016). This does not necessarily mean that the public sees no need for empirical evidence, rather, it may be a function of perceiving social media’s communicative genre as informal conversation that does not call for providing evidentiary support for claims. In fact, online comments often question and critique scientific methods and interpretations (Len-Ríos et al., 2014; Orr and Baram-Tsabari, 2018). When social media users do cite sources, these sources vary considerably from personal experiences, to published research, to news articles, to Google searches, to the Bible or to quotes from well-known individuals (Zhang, 2016).

One characteristic that distinguishes public discussions on social media from formal scientific communications is in the expression of emotion in relation to scientific topics. Participants express fear (Orr and Baram-Tsabari, 2018), anger (Knowles and Wilkinson, 2015), and other emotions in their comments. Thus, their interactions with scientific content are both cognitive and emotional. Media consumption can have cognitive, affective, attitudinal, and behavioral effects, and these different effect-dimensions can interact (Maier et al., 2014). For example, affective reactions to science news media can influence judgments such as risk assessment (Maier et al., 2014). Therefore, more attention should be given to the ways in which these different dimensions play out in social media’s role in public engagement with science, because this may affect how people apply scientific knowledge, and it may also affect the trust people place in public scientific research.

The construct of engagement and how it manifests in social media

Engagement is a multidimensional construct that refers to an individual’s active involvement in an activity (Christenson et al., 2012). Operationalizations of engagement have been offered from a variety of theoretical and practical approaches (Christenson et al., 2012; D’Mello et al., 2017). Grounded in communication, psychological, and educational research, we synthesize literature on the construct of engagement, articulating three types of engagement: behavioral, emotional, and cognitive (D’Mello et al., 2017; Fredricks et al., 2004; Sinatra et al., 2015; Skinner, 2016).

Behavioral engagement on social media

Behavioral engagement is the outward manifestation of involvement. In educational contexts, it refers to students’ involvement in observable behavior directly related to the learning process, starting with attendance and continuing along the spectrum to attentiveness, compliance, effort expenditure, concentration, focus, persistence, hard work, and taking the initiative in academic tasks (Sinatra et al., 2015; Skinner and Belmont, 1993). Educational research shows a robust link between behavioral engagement and achievement (Ladd and Dinella, 2009; Luo et al., 2009; Wang and Eccles, 2012).

Online behavioral engagement on various social media platforms is typically expressed symbolically through actions such as liking, commenting, and sharing. On YouTube, it also includes uploading and sharing videos, viewing videos, and reading comments. Users may choose to remain passive by simply consuming content, or play an active role by participating in various interactions, and even repurpose content to fit their needs (Khan, 2017). Posting comments signifies a higher form of engagement. Comments can reveal and expose personal meaning and resources, which in turn can transform user–user interaction into dialogue (Kahle et al., 2016; Muntinga et al., 2011; Shao, 2009).

Emotional engagement on social media

A number of theoretical traditions specify different components of emotional engagement (Christenson et al., 2012; D’Mello et al., 2017). Theories of motivation, including self-determination theory, and the control-value theory of academic emotions (Deci and Ryan, 1985; Pekrun and Linnenbrink-Garcia, 2012), emphasize how positive and negative emotions that are generated by learning situations influence involvement in learning activities. Positive emotions include enthusiasm, interest, satisfaction, and enjoyment (Renninger and Bachrach, 2015). Disaffected emotional components include boredom, anxiety, and frustration (Skinner and Belmont, 1993). Since behavioral and emotional engagement in the classroom are tightly coupled, both have been found to be strong predictors of grades and achievement test scores (Connell et al., 1995).

On social media, emotional engagement is mostly inferred from comment text through computational techniques such as sentiment analysis (Mohammad, 2016) that typically identify the components and intensity of the emotional arousal discussed above (Huang and Grant, 2020). In these techniques, a lexicon maps a set of possible emotions (e.g. joy, anger, sadness) to collections of words that are directly (e.g. “delighted” as an index of joy) or indirectly (e.g. “shout” as an index of anger) related to those emotions, and instances of these words are counted in the text. Other approaches infer emotions through content analysis or other qualitative analyses that focus more closely on the content and meaning of texts in context, such as in tweets or in the context of a conversation thread (Gaspar et al., 2016). Sentiment analysis offers greater consistency and reliability because it is not dependent on case-by-case judgment and interpretation; it is also resource-lean and efficient. However, unlike qualitative approaches, it may overlook nuances of meaning that arise from the ways in which particular words are used in context.
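The lexicon-count approach described above can be sketched as follows. This is a minimal illustration with a hypothetical toy lexicon; real tools such as Syuzhet draw on much larger, validated lexicons (e.g. the NRC Emotion Lexicon).

```python
# Minimal sketch of lexicon-based sentiment analysis, as described above.
# The toy lexicon is illustrative only, not a validated emotion lexicon.
import re
from collections import Counter

# Hypothetical miniature emotion lexicon: word -> emotions it indexes.
TOY_LEXICON = {
    "delighted": {"joy"},          # directly related to joy
    "love": {"joy", "trust"},
    "shout": {"anger"},            # indirectly related to anger
    "terrified": {"fear"},
    "awful": {"disgust", "anger"},
}

def emotion_counts(comment: str) -> Counter:
    """Count lexicon hits per emotion in a single comment."""
    tokens = re.findall(r"[a-z']+", comment.lower())
    counts = Counter()
    for token in tokens:
        for emotion in TOY_LEXICON.get(token, ()):
            counts[emotion] += 1
    return counts

counts = emotion_counts("I love this video, I was delighted!")
```

As the paragraph notes, this kind of counting is reliable and cheap, but it scores words out of context, so negation or irony (e.g. “I don’t love this”) can be miscoded.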

Research that employs such sentiment analysis reveals associations between commenting patterns and sentiments. Thelwall et al. (2012) report that while user comments are typically mildly to moderately positive, 35% of comments contain some negativity. They explain that although negative sentiment was uncommon, it was more prevalent in comments on videos attracting many comments; conversely, positive sentiment was disproportionately common in videos attracting few comments. Thus, it seems that negativity can drive commenting, perhaps partly through long-running, acrimonious, comment-based discussions. Similarly, a large-scale study that used 756 popular queries to generate 67,290 videos with 6.1 million comments investigated the role of sentiment in the categories and ratings of comments (i.e. the extent to which YouTube users rate a comment as good or bad), finding that ratings were predominantly positive (Siersdorfer et al., 2010). This study also found that negative comments tended to be disliked and positive comments tended to be liked, similar to the relationships between behavioral and emotional engagement found in learning contexts.

Cognitive engagement on social media

Cognitive engagement is more difficult to define precisely. The “problem that plagues construct definition here is lack of agreement among scholars about how cognitive engagement should be operationalized” (Sinatra et al., 2015). A widely used definition of cognitive engagement is psychological investment, which refers to students’ mental orientation during learning activities, such as an orientation toward deeper understanding and growth, a preference for challenge, and use of self-regulatory or coping strategies. Students become psychologically invested when they expend cognitive effort in order to understand, go beyond the requirement of the activity, use flexible problem-solving, and choose challenging tasks.

User post-video comments on YouTube can reflect this approach to cognitive engagement because users go beyond the basic task requirements of viewing to posting. Of course, the comments themselves can be relatively flippant, such as “great,” “thanks for posting” or “very timely.” Comment contents can further reflect cognitive engagement in different ways. One form of such engagement, often measured through self-report surveys (Jensen, 2011; Oeldorf-Hirsch, 2018; Perse, 1990), is elaborating on the information provided in the original post. The asynchronous nature of post-video comments on YouTube and other digital media supports deeper engagement by giving participants more time to think, reflect, and seek information before they contribute to the discussion (Lucas et al., 2014). Another form of cognitive engagement on social media is argumentation (Asterhan and Hever, 2015; Dubovi and Tabak, 2020; Shapiro and Park, 2015, 2018; Tsovaltzi et al., 2015), where users respond to each other’s comments by further elaborating, challenging claims in the post or other comments, or providing evidence to support or refute claims. In this way, cognitive engagement is both an individual and a collaborative accomplishment.

2. Methods

Research design

We conducted a two-part study. Part 1 included collection and quantitative analysis of YouTube trending videos to evaluate the magnitude of public interaction with science content. Part 2 incorporated evaluation of YouTube trending videos that were associated with science and educational content for behavioral, emotional, and cognitive engagement using quantitative analyses.

Data collection and sample

Study part 1

Aiming to estimate the proportion of science content on YouTube, we created a usable scope of data that focused on trending videos. Trending videos were selected for our data set because this is the only approach the YouTube Data Application Programming Interface (API) allows for systematic collection of public standard feeds. Although this is a convenience sample, it is pertinent and informative, because it yields the videos that have the potential to become high-traffic videos on YouTube in a relatively short time. The YouTube Data API permits retrieval of only 200 new top trending videos a day, including feeds and statistics related to videos and users. To characterize the collected video content, we used a YouTube Data API feature to automatically associate video categories.1 This enabled us to learn about the landscape of daily spontaneous topics that people in different viewership communities are watching on YouTube, for example, views of a how-to topic that is relevant to everyday decisions, such as troubleshooting and problem-solving; or views of an educational topic that might be used for recording and disseminating course lectures (Kousha et al., 2012). Our data set was collected from March 2019 to July 2019 using a Python script (Jolly, 2018).
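A retrieval step of this kind might look like the sketch below, assuming the public `videos.list` endpoint of the YouTube Data API v3 with `chart=mostPopular`. The API key is a placeholder, and quota handling and pagination loops are omitted; the study’s actual collection script is the one cited as Jolly (2018).

```python
# Sketch of trending-video retrieval via the YouTube Data API v3
# `videos.list` endpoint with `chart=mostPopular`.
# The key value is a placeholder; pagination/quota handling are omitted.
import json
from urllib.parse import urlencode

API_URL = "https://www.googleapis.com/youtube/v3/videos"

def trending_request_url(api_key: str, page_token: str = "") -> str:
    """Build the request URL for one page of trending videos."""
    params = {
        "part": "snippet,statistics",  # titles, categories, engagement counts
        "chart": "mostPopular",
        "maxResults": 50,              # per-page cap imposed by the API
        "key": api_key,
    }
    if page_token:
        params["pageToken"] = page_token
    return f"{API_URL}?{urlencode(params)}"

def parse_videos(response_json: str) -> list:
    """Keep the fields used in the study: category and engagement counts."""
    items = json.loads(response_json).get("items", [])
    return [
        {
            "id": v["id"],
            "categoryId": v["snippet"]["categoryId"],
            "views": int(v["statistics"].get("viewCount", 0)),
            "comments": int(v["statistics"].get("commentCount", 0)),
        }
        for v in items
    ]

url = trending_request_url("API_KEY_PLACEHOLDER")
# A toy response in the API's JSON shape, for illustration:
sample = json.dumps({"items": [{"id": "abc",
                                "snippet": {"categoryId": "27"},
                                "statistics": {"viewCount": "100",
                                               "commentCount": "5"}}]})
videos = parse_videos(sample)
```

Running such a request once per day and appending the parsed records yields the kind of daily 200-video collection described above.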

Study part 2

To assess the level of behavioral engagement with science content, we counted the number of views, likes, dislikes, and comments for each trending video that was classified by YouTube’s API as Science and Education content. The data were collected at two different time points: the first time point was during part 1 of the study, as described earlier; the second time point was 1 year later, during May 2020.

To assess the level of emotional engagement and cognitive engagement, we focused on a random sample of 1000 comments from each video that was associated with science and educational content. Each comment was double-coded for emotional engagement using an automated sentiment analysis (see below), and for cognitive engagement using an automated classification of argumentative moves.

An automated sentiment analysis to pinpoint emotional engagement was conducted using natural language processing with the Syuzhet R package (Jockers, 2017). The Syuzhet package incorporates several lexicons, which enable both sentiment polarity evaluation (i.e. reporting positive or negative words) and emotional categorization into the following eight types: anger, anticipation, disgust, fear, joy, sadness, surprise, and trust. Syuzhet natural language processing has been applied to several sources of data and was found reliable (Ragupathy and Maguluri, 2018; Widyaningrum et al., 2019; Yoon et al., 2017). Grounded in the Interactive-Constructive-Active-Passive (ICAP) framework, which suggests that arguing a position is the highest level of cognitive engagement (Chi et al., 2018), we tracked elementary units of argumentative discourse. We developed an automated coding scheme of linguistic argumentation elementary units (i.e. agree, disagree, evidence, reason, fact; Mochales and Moens, 2011; Toulmin, 1958; Weinberger and Fischer, 2006). Then, we used nCoder, an online tool (Arastoopour Irgens et al., 2020; Cai et al., 2019), to automatically apply the coding scheme to 89,000 comments. Inter-rater reliability between a human rater (first author) and the automated algorithm was almost perfect, Cohen’s kappa = 0.96 (McHugh, 2012).
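A minimal sketch of such automated coding of argumentation moves, in the spirit of the regular-expression classifiers that nCoder supports, might look as follows. The patterns below are illustrative stand-ins, not the study’s validated coding scheme.

```python
# Sketch of regex-based coding of argumentation elementary units
# (agree, disagree, evidence, reason). Patterns are illustrative only,
# not the validated scheme applied with nCoder in the study.
import re

ARG_PATTERNS = {
    "agree": re.compile(r"\b(i agree|you're right|good point)\b", re.I),
    "disagree": re.compile(r"\b(i disagree|that's wrong|not true)\b", re.I),
    "evidence": re.compile(
        r"\b(studies show|according to|research (shows|suggests))\b", re.I),
    "reason": re.compile(r"\b(because|therefore|that's why)\b", re.I),
}

def code_comment(comment: str) -> set:
    """Return the set of argumentation moves detected in a comment."""
    return {move for move, pat in ARG_PATTERNS.items() if pat.search(comment)}

moves = code_comment("I disagree, because studies show the opposite.")
```

In practice such classifiers are iteratively tuned against human-coded samples until agreement statistics (here, Cohen’s kappa) reach an acceptable threshold.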

Analysis

Descriptive statistics were computed using means, standard deviations, frequencies, and percentages to describe the distribution of the top trending YouTube videos. In addition, a one-way analysis of variance (ANOVA) and repeated measures ANOVA were conducted to examine differences across different science and educational topics comparing two time points of data collection. To evaluate associations between engagement characteristics, we used a bivariate parametric correlation analysis. Data were analyzed using SPSS version 25.
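The tests named above can be reproduced outside SPSS, for example with `scipy.stats`. The sketch below runs a one-way ANOVA across hypothetical topic groups and a bivariate Pearson correlation on synthetic per-video counts, not the study’s data.

```python
# Sketch of the analysis pipeline on synthetic per-video counts:
# one-way ANOVA across topic groups, then a bivariate Pearson correlation.
# (The study used SPSS v25; scipy provides the equivalent tests.)
from scipy import stats

# Hypothetical comment counts for videos in three topic groups.
universe = [120, 95, 140, 110]
nature = [60, 55, 70, 65]
history = [20, 25, 30, 22]

f_stat, p_anova = stats.f_oneway(universe, nature, history)

# Bivariate correlation between, e.g., per-video views and comments.
views = [1000, 5000, 3000, 8000, 2000]
comments = [10, 60, 25, 90, 15]
r, p_corr = stats.pearsonr(views, comments)
```

With clearly separated group means the ANOVA p-value falls below .05, and the strongly monotone view/comment pairs yield a large positive r, mirroring the kinds of effects reported in the Findings.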

3. Findings

Study part 1

As we mentioned in the Data Collection section, we collected our trending video data over a 3-month period starting in March 2019, adding a new collection of 200 trending videos to our data set each day. In total, we collected 14,700 trending videos (for details see Supplemental material 1). Analysis revealed that videos with educational and science content were included among the top trending YouTube videos. The three leading categories (of the categories automatically generated by the API) with the highest numbers of videos were (1) entertainment (encompassing 30% of the top trending videos), (2) music (14.5%), and (3) sports (11.4%), which derived from 338, 248, and 148 channels, respectively (see Supplemental material 1). Science and education was the 11th highest category, deriving from 36 channels.

Videos with science and educational content derived from only 36 channels and encompass 3.4% (N = 503) of the top trending videos (Figure 1). Even though the science and education category contains a relatively small number of trending videos, the analyses show that these videos attracted a relatively high number of views on average, reaching the sixth highest category (more than the comedy and sports categories, Figure 1a), and also a high number of post-video comments on average, reaching the eighth highest category (Figure 1b).

Figure 1.

(a) Mean number of views per category and (b) mean number of post-video comments per category.

Study part 2

Behavioral engagement

To pinpoint characteristics of behavioral engagement with science content, we focused on 503 videos that were classified as science and education content. Of these, we removed videos that were selected as trending videos multiple times (duplicates), which left us with 89 videos. Subsequently, for the sample of 89 videos, we collected the following data at two different time points, a year apart: number of comments, views, likes, dislikes (for details see Table 1).

Table 1.

Summary data of top trending videos with focus on science and educational content at two different time points (89 videos).

| Topic | No. of videos (%) | Comments, Time 1 | Comments, Time 2 | Views, Time 1 | Views, Time 2 | Likes, Time 1 | Likes, Time 2 | Dislikes, Time 1 | Dislikes, Time 2 | Example |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| Human organism | 15 (17) | 11,095 (±16,382) | 19,171* (±26,028) | 1,696,285 (±1,444,401) | 4,718,483 (±4,725,446) | 87,394 (±104,025) | 176,016 (±214,982) | 2380 (±3031) | 4429 (±4468) | How fall asleep / vaccines side effects / human hair |
| Universe | 16 (18) | 6771 (±6811) | 8010 (±7243) | 1,826,481 (±1,755,865) | 3,158,983 (±2,689,998) | 62,001 (±65,661) | 89,231 (±93,564) | 1415 (±1745) | 2052 (±2015) | The quantum theory that connects the entire universe |
| Nature | 10 (11) | 5678 (±5376) | 7867 (±7452) | 2,141,678 (±3,657,271) | 4,050,883 (±6,084,642) | 66,722 (±107,129) | 101,104 (±149,392) | 1767 (±1873) | 2738 (±2375) | World’s lightest solid / liquid oxygen & diamond? |
| Making & How it works | 30 (34) | 8414 (±11,085) | 10,353 (±14,041) | 3,109,563 (±5,472,657) | 5,510,661 (±8,332,356) | 82,701 (±115,964) | 123,449 (±161,183) | 2771 (±3778) | 4011 (±5250) | Magnetic sand / magnesium fires |
| History | 9 (10) | 1647 (±667) | 1966 (±933) | 584,937 (±277,317) | 1,436,425 (±662,186) | 16,055 (±8208) | 26,259 (±8559) | 485 (±229) | 485 (±314) | U.S. Civil War |
| Other | 9 (10) | 3223 (±2234) | 4320 (±3316) | 1,704,515 (±2,767,927) | 2,989,407 (±4,778,036) | 29,563 (±19,221) | 46,810 (±35,609) | 1916 (±3702) | 2991 (±5804) | Language and accent / economics |

Numbers represent Mean ± SD.

*Significant difference (p < .05) between time points compared to differences in other video categories.

YouTube offers various interactions with content, ranging from simple exposure, such as viewing a video, to more proactive behaviors of commenting, liking, and disliking. A one-way ANOVA comparing the total number, within our full corpus of data, of each of these types of interactions, followed by Bonferroni post hoc tests, showed that viewing videos was the most common user experience compared to liking, commenting, and disliking (2,134,570 ± 3,665,698; 65,862 ± 93,105; 7054 ± 10,133; 2031 ± 2970, respectively; F(3,352) = 29.5, p < .0001).

Using one-way ANOVA, we compared counts of instances of behavioral engagement between videos covering different topics: human organism, universe, nature, making, history, and others (see Supplemental material 2). There were no significant differences between topics in these counts (F(15,332) = 0.81, p = .66). We further conducted repeated measures ANOVA to test whether there were differences, within each topic, in the increase between time points in the number of views, comments, likes, and dislikes. The only significant increase between time periods was the increase in comments on the topic human organism (health) (F(5,83) = 2.42, p < .05; for details see Supplemental material 3). All other differences between topics in increases between time points in views, comments, likes, and dislikes were not significant.

We tested for correlations between views, comments, likes, and dislikes within the science and educational videos using a bivariate parametric correlation analysis. There were moderate to strong correlations between the various engagement behaviors (Table 2). Namely, the number of comments posted correlated moderately to strongly with the number of views (r = .64, p < .01), likes (r = .88, p < .01), and dislikes (r = .75, p < .01; Table 2).

Table 2.

Bivariate intercorrelations between behavioral, emotional and cognitive engagement (N = 89 videos).

| Variable | 1 | 2 | 3 | 4 | 5 | 6 | 7 | 8 | 9 | 10 | 11 | 12 | 13 | 14 | 15 |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| Behavioral engagement: 1. Comments |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |
| 2. Views | 0.64*** |  |  |  |  |  |  |  |  |  |  |  |  |  |  |
| 3. Likes | 0.88*** | 0.71*** |  |  |  |  |  |  |  |  |  |  |  |  |  |
| 4. Dislikes | 0.75*** | 0.84*** | 0.76*** |  |  |  |  |  |  |  |  |  |  |  |  |
| Emotional engagement: 5. Neutral emotions | −0.17 | −0.01 | −0.16 | −0.08 |  |  |  |  |  |  |  |  |  |  |  |
| 6. Positive emotions | 0.13 | −0.09 | 0.11 | 0.01 | −0.86*** |  |  |  |  |  |  |  |  |  |  |
| 7. Negative emotions | 0.20 | −0.02 | 0.08 | 0.08 | −0.85*** | 0.75*** |  |  |  |  |  |  |  |  |  |
| 8. Anger | 0.07 | −0.09 | −0.04 | 0.04 | −0.69*** | 0.59*** | 0.88*** |  |  |  |  |  |  |  |  |
| 9. Disgust | 0.24* | −0.04 | 0.10 | 0.08 | −0.68*** | 0.67*** | 0.83*** | 0.78*** |  |  |  |  |  |  |  |
| 10. Fear | 0.17 | −0.09 | 0.04 | 0.05 | −0.71*** | 0.65*** | 0.89*** | 0.87*** | 0.76*** |  |  |  |  |  |  |
| 11. Sadness | 0.24* | −0.02 | 0.13 | 0.08 | −0.77*** | 0.73*** | 0.91*** | 0.76*** | 0.80*** | 0.79*** |  |  |  |  |  |
| 12. Anticipation | 0.10 | −0.09 | 0.07 | −0.03 | −0.77*** | 0.89*** | 0.74*** | 0.60*** | 0.60*** | 0.65*** | 0.74*** |  |  |  |  |
| 13. Joy | 0.21* | −0.05 | 0.17 | 0.05 | −0.80*** | 0.87*** | 0.66*** | 0.57*** | 0.60*** | 0.60*** | 0.62*** | 0.81*** |  |  |  |
| 14. Surprise | 0.10 | −0.03 | 0.08 | 0.05 | −0.79*** | 0.82*** | 0.81*** | 0.66*** | 0.67*** | 0.72*** | 0.78*** | 0.86*** | 0.74*** |  |  |
| 15. Trust | 0.15 | −0.08 | 0.11 | 0.02 | −0.83*** | 0.95*** | 0.80*** | 0.66*** | 0.70*** | 0.71*** | 0.77*** | 0.91*** | 0.85*** | 0.82*** |  |
| Cognitive engagement: 16. Argumentation | 0.22* | −0.05 | 0.29** | 0.04 | −0.71*** | 0.70*** | 0.78*** | 0.46*** | 0.50*** | 0.55*** | 0.70*** | 0.74*** | 0.55*** | 0.66*** | 0.81*** |

Data represent Pearson’s r.

*p < .05; **p < .01; ***p < .001.

Emotional engagement

Sentiment analysis was conducted on a sample of 1000 comments from each video (89 videos, 89,000 comments in total). Sentiment polarity evaluation shows that on average there were significantly fewer negative expressions (407 ± 205) than positive and neutral expressions (580 ± 304; 545 ± 99, respectively) per video (F(2,258) = 15.16, p < .001). Furthermore, we compared how often post-video comments express each of eight types of emotions: anger, anticipation, disgust, fear, joy, sadness, surprise, and trust. This one-way ANOVA shows a significant difference in how emotions are expressed in post-video comments (F(7,688) = 30.1, p < .001). Bonferroni post hoc analysis found that overall trust is the most commonly expressed emotion, that significantly more comments express joy than anger, and that anger and disgust are expressed half as often as trust and anticipation (Figure 2). There were no significant differences between science and educational topics in the level of emotional engagement expressed within their associated post-video comments (one-way ANOVA, F(35,648) = 1.22, p = .18).

Figure 2.

Sentiment analysis: Amount of emotional expressions per video (Mean ± SD; median; 10–90 percentile range).

Emotional engagement and behavioral engagement

Bivariate parametric correlation analysis demonstrates significant small to moderate correlations between number of posted comments and expressed feelings of sadness (r = 0.24, p < .05), disgust (r = 0.24, p < .05), and joy (r = 0.21, p < .05). That is, expression of emotions such as sadness, disgust, and joy were correlated with a higher tendency to post comments (Table 2). However, when analyzing increases in views, comments, likes, and dislikes between the two data collection time points, no significant correlations were found (Table 2).

Cognitive engagement

Analysis of cognitive engagement was conducted on the same sample of comments as the sentiment analysis (89 videos, 89,000 comments in total). Automated coding of linguistic argumentation with nCoder (see Methods) detected that at least 16% of comments included an argumentative expression. A one-way ANOVA comparing the number of argumentative expressions in post-video comments between videos of different science and education topics shows differences with a large effect size (F(5,81) = 5.97, p < .001, η2 = 0.27). Bonferroni post hoc analysis found that videos with making content incorporated significantly fewer argumentative expressions than all other topics: making M = 39.6 (±28.3); universe M = 117.9 (±71.9); human organism M = 84.1 (±71.3); nature M = 58.3 (±34.1); history M = 57.8 (±45.8); other M = 44.7 (±24.4).

Cognitive, emotional, and behavioral engagement

Bivariate parametric correlation analysis demonstrates significant positive correlations between cognitive, behavioral, and emotional engagement. The findings demonstrate a small but significant correlation between argumentation and behavioral engagement: the number of comments is correlated with the average number of argumentative expressions (r = 0.22, p < .05), and the number of likes is correlated with the average number of argumentative expressions (r = 0.29, p < .01). However, when relating the gains in views, comments, likes, and dislikes between the two data-collection time points to cognitive engagement, no significant correlations were found (r = −0.12, p = .262; r = 0.05, p = .660; r = −0.05, p = .646; r = −0.07, p = .535, respectively).

In addition, the incorporation of argumentative expressions is correlated with positive emotional expressions (r = 0.84, p < .001) as well as with negative emotional expressions (r = 0.77, p < .001). Neutral statements were negatively correlated with argumentation (r = −0.82, p < .001). The detailed emotional analysis specified that the expression of all eight emotions (anger, disgust, fear, sadness, anticipation, joy, surprise, and trust) is strongly correlated with argumentative discourse (Table 2).
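Emotion coding of this kind is lexicon-based: Syuzhet maps comment words onto the NRC emotion categories. A minimal Python analogue, with an invented toy lexicon standing in for the full NRC word list:

```python
# Toy stand-in for an NRC-style emotion lexicon; the real lexicon maps
# thousands of English words to these eight emotion categories.
LEXICON = {
    "amazing": {"joy", "surprise", "trust"},
    "awful": {"disgust", "sadness"},
    "scary": {"fear"},
    "wrong": {"anger"},
}

def emotion_counts(comment):
    """Count, per emotion category, how many lexicon words a comment contains."""
    counts = {}
    for word in comment.lower().split():
        for emotion in LEXICON.get(word.strip(".,!?"), ()):
            counts[emotion] = counts.get(emotion, 0) + 1
    return counts
```

Per-video emotion scores are then aggregates of these counts over all post-video comments.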

To further understand the impact of emotional and behavioral engagement on cognitive engagement, we performed a hierarchical multiple regression analysis (Table 3). Preliminary analyses were conducted to ensure that there were no violations of the assumptions of normality, linearity, and homoscedasticity. In addition, before conducting the regression analyses, variance inflation factor (VIF) values were computed to identify the presence of multicollinearity (Thompson et al., 2017). Multicollinearity was an issue for the emotional variables; thus, the anticipation variable was removed to ensure that tolerance was greater than 0.10 for all variables in the model and that VIFs were well below 6.0 (Stevens, 2001). In Model 1, characteristics of behavioral engagement did not make a significant contribution to the regression model, F(4, 82) = 1.86, p = .12. However, adding characteristics of emotional engagement in Model 2 explained an additional 70% of the variance in cognitive engagement. Trust, joy, sadness, and disgust had a positive effect on cognitive engagement. The model as a whole explained 78% of the variance in cognitive engagement, F(7, 75) = 35.47, p < .001.

Table 3.

Summary of hierarchical regression analysis of cognitive engagement (N = 89 videos).

Variable             Model 1            Model 2
                     β       t          β       t
Comments             0.35    1.48       0.25    1.83
Views                0.29    1.46       0.06    0.59
Likes                0.01    0.06      −0.12   −0.95
Dislikes             0.01    0.06      −0.03   −0.01
Anger                                   0.18    1.43
Disgust                                 0.22    2.17*
Fear                                    0.01    0.08
Sadness                                 0.32    2.71**
Joy                                     0.50    4.59***
Surprise                                0.08    0.64
Trust                                   1.17    8.88***
AIC                  702.449            591.342
R²                   0.08               0.78
F for change in R²   1.86               35.47***
ΔR²                  0.08               0.70

AIC: Akaike Information Criterion.

Coefficients in the table are standardized beta coefficients.

*p < .05; **p < .01; ***p < .001.
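The multicollinearity screen and the Model 1 → Model 2 comparison can be illustrated numerically. The NumPy sketch below uses synthetic data (the sample size matches the 89 videos, but all values, predictor columns, and effect sizes are invented; the published analysis was presumably run in a dedicated statistics package):

```python
import numpy as np

def r_squared(X, y):
    """R² of an ordinary-least-squares fit of y on X (intercept included)."""
    X1 = np.column_stack([np.ones(len(y)), X])
    beta, *_ = np.linalg.lstsq(X1, y, rcond=None)
    resid = y - X1 @ beta
    return 1.0 - (resid @ resid) / ((y - y.mean()) @ (y - y.mean()))

def vif(X):
    """Variance inflation factor of each predictor: 1 / (1 - R²_j), where R²_j
    comes from regressing column j on all remaining columns."""
    return [1.0 / (1.0 - r_squared(np.delete(X, j, axis=1), X[:, j]))
            for j in range(X.shape[1])]

rng = np.random.default_rng(7)
n = 89  # same sample size as the study; the data itself is synthetic
behavioral = rng.normal(size=(n, 4))  # views, comments, likes, dislikes
emotional = rng.normal(size=(n, 7))   # seven retained emotion scores
y = 0.2 * behavioral[:, 0] + emotional @ rng.normal(size=7) + rng.normal(size=n)

r2_model1 = r_squared(behavioral, y)                          # Model 1
r2_model2 = r_squared(np.hstack([behavioral, emotional]), y)  # Model 2
delta_r2 = r2_model2 - r2_model1                              # ΔR²
vifs = vif(np.hstack([behavioral, emotional]))
```

Because Model 1's predictors are nested in Model 2, ΔR² is non-negative by construction; the F test for the change in R² assesses whether that gain exceeds chance.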

4. Discussion

The goal of this study was to extend our understanding of public engagement with science on social media. In particular, we focused on the characteristics of social media that enable person-to-person engagement, and on analyses of actual media interactions rather than self-report survey data. Self-reports offer important insights, especially into the meaning that individuals ascribe to their interactions, but analyses of actual media interactions offer a way to more accurately assess the sequence, frequency, and forms of interaction, as well as the ability to test their associations with multiple factors.

We explored the magnitude of engagement with science relative to other content on YouTube, and then, we investigated in-depth characteristics of behavioral, emotional, and cognitive dimensions of engagement. Overall, our findings show positive correlations between the three types of engagement, and illuminate the role of emotions in public engagement with science on social media. Here, we elaborate on these findings.

To explore the magnitude of interactions with science on YouTube, we analyzed both the distribution of different types of trending videos and the proportion of content that might be relevant to engaging with science. Our analysis revealed that the most trending content that users search for on YouTube relates to entertainment and music. However, the top trending YouTube videos also include videos on educational and science topics (3.4% of top trending videos), which attract a relatively high number of views and post-video comments. This finding is consistent with the Science and Engineering Indicators report, which identifies the Internet and social media as a primary source of information about science and technology (National Science Board, 2018). Such information-seeking motives are among the most likely to predict participatory acts such as liking or disliking videos and commenting on videos (Khan, 2017). Investigating actual interactions, Thelwall et al. (2012) corroborate the interest in science and the propensity for participatory acts. They show that the most discussed topics, which received the highest number of replies on YouTube, were related to the science and technology category. Our findings extend this earlier research by identifying the expression of and interrelationships between behavioral, emotional, and cognitive engagement.

First, we explored behavioral engagement with science content. We found that, on average, each video with science content received 20 times more views than likes, and 300 times more views than comments. Indeed, studies show that consumption behavior, absent participatory acts such as likes or comments, makes up 90% of activity in many online communities (Edelmann, 2016; Nonnecke and Preece, 1999). Of course, we do not know whether this “viewing only” participation is accompanied by offline behaviors that would qualify as deeper engagement. It would be interesting to pursue future studies that investigate both online and offline behavior in order to gauge whether social media participatory acts reflect the full extent of engagement with the content posted on social media. Past research, based on self-reports of information elaboration, provides some insights (Jensen, 2011; Oeldorf-Hirsch, 2018; Perse, 1990), but direct observation and measurement of behavior would provide more veridical accounts.

Activity that is visible on social media, such as posting comments and participating in post-video discussions, reflects the rarer type of engagement: cognitive engagement. Our findings show that behavioral and cognitive engagement are related. Although correlational analyses cannot reveal causal or temporal directions, we believe that the greater prevalence of behavioral engagement suggests that behavioral engagement triggers cognitive engagement. In other words, viewers of science videos are more likely to engage in person-to-person argumentative deliberation if the video content has prompted behavioral engagement such as posting comments and liking the video.

Cognitive engagement, which may be most pertinent to public engagement with science, is also associated with emotional engagement. Consistent with earlier research analyzing user comments (Knowles and Wilkinson, 2015; Orr and Baram-Tsabari, 2018), our findings show that YouTube post-video comments offer an important space for emotional expressions, and that these are related to other forms of engagement with science on social media (Maier et al., 2014). Our findings revealed that, as in the case of behavioral engagement, the expression of both positive and negative emotions correlates with commenting behaviors. Negative emotions, such as disgust or sadness, and positive emotions, such as joy, were linked to a tendency to post more comments and, as such, to deeper cognitive involvement. Thus, emotional stimulation, regardless of valence, is associated with, and possibly triggers, both behavioral and cognitive engagement. This idea is further supported by the finding that neutral emotional expressions were negatively related to processes of argumentative elaboration.

More research is needed before we can make specific recommendations for science communication. However, some immediate implications from this study are that in composing videos, science communicators should prioritize emotional triggers alongside considerations concerning the content of the message, such as detail, accuracy, and clarity. The study further implies that science communicators have some flexibility in terms of the specific emotions that they target, because both positive and negative emotions were associated with engagement.

This study has several limitations that should be taken into consideration. First, the YouTube API does not make users’ demographic data available; therefore, future studies should look into variables such as gender and age to evaluate their effect on engagement level. For example, do gender gaps identified among YouTube video hosts (Amarasekara and Grant, 2018) also appear as differences in the forms or degree of engagement? In addition, the analysis did not capture the accuracy and quality of the comments; therefore, further studies should consider whether the content of the comments is accurate, and evaluate the effect of comment quality on engagement. This line of research can also be extended by incorporating qualitative analyses, and by comparing the different types of user engagement across social media platforms, such as between Facebook and YouTube, to understand the relationship between platform characteristics and forms of engagement. Furthermore, this study focuses on the role of emotional valence and emotional presence only; however, it has been shown that emotional strength has distinct effects on citizen engagement with social media (Ji et al., 2019). Thus, subsequent studies should analyze the role of emotional strength.

In addition to addressing these limitations, there are other future lines of research suggested by the current findings. One line of research could collect social media data over smaller intervals of time in order to examine interrelationships between changes in science content and changes in engagement. Our study highlights the role of emotion in social media interactions and its interplay with other forms of engagement. Therefore, studies that explore the effects of emotional engagement on other factors would extend our understanding of the role of social media in public involvement with science. For example, Huber et al. (2019) suggest that emotion could be a key variable in understanding dynamics of trust, and Reif et al. (2020) relate emotions to perceptions of scientific trustworthiness.

5. Conclusion

The ubiquity of social media and calls for increasing public engagement with science motivate this research. We aimed to clarify the construct of engagement, and to investigate how behavioral, emotional, and cognitive engagement with science on social media are distinct, but interrelated. Cognitive engagement on social media hinges on behavioral and emotional engagement. This relationship holds regardless of valence: both positive (e.g. joy) and negative (e.g. sadness) emotions are associated with, and may trigger, greater behavioral and cognitive engagement. Therefore, social media interactions, which tend to evoke emotional responses, offer promise in advancing person-to-person cognitive engagement with science.

Supplemental Material

sj-pdf-1-pus-10.1177_0963662521990848 – Supplemental material for Interactions between emotional and cognitive engagement with science on YouTube, by Ilana Dubovi and Iris Tabak, in Public Understanding of Science.

Author biographies

Ilana Dubovi is a learning sciences researcher in the Nursing Department at Tel Aviv University. Her work spans the fields of educational technology, lifelong learning, cognition and emotion in learning processes, and medical education.

Iris Tabak is a learning scientist in the Department of Education at Ben-Gurion University of the Negev, and a Fellow of the International Society of the Learning Sciences (ISLS). She examines social, material, cognitive, and affective aspects of complex reasoning, recently focusing on how non-scientists use networked resources to make evidence-based health decisions.

Footnotes

Author note: The authors thank Adi Kedem for her assistance in coding. Parts of this manuscript are based on an earlier paper presented at the 13th International Conference on Computer Supported Collaborative Learning (CSCL 2019, Lyon, France). The authors thank the International Society of the Learning Sciences (ISLS), which owns the copyright for this earlier paper (Dubovi & Tabak, 2019), for permission to re-use part of that work here.

Funding: The author(s) disclosed receipt of the following financial support for the research, authorship, and/or publication of this article: This work was supported, in part, by the Israel Planning and Budgeting Committee I-CORE Program and the Israel Science Foundation (ISF grant no. 1716/12) through the Learning in a Networked Society (LINKS) center, and by a Kreitman School for Advanced Research Studies Postdoctoral Fellowship to the first author.

Supplemental material: Supplemental material for this article is available online.

Contributor Information

Ilana Dubovi, Tel Aviv University, Israel.

Iris Tabak, Ben-Gurion University of the Negev, Israel.

References

Alexa Internet Inc. (2020) The top 500 sites on the web. Available at: https://www.alexa.com/topsites
Amarasekara I, Grant WJ. (2018) Exploring the YouTube science communication gender gap: A sentiment analysis. Public Understanding of Science 28: 68–84.
Arastoopour Irgens G, Dabholkar S, Bain C, Woods P, Hall K, Swanson H, et al. (2020) Modeling and measuring high school students’ computational thinking practices in science. Journal of Science Education and Technology 29: 137–161.
Asterhan CSC, Hever R. (2015) Learning from reading argumentive group discussions in Facebook: Rhetoric style matters (again). Computers in Human Behavior 53: 570–576.
Bhattacharya S, Srinivasan P, Polgreen P. (2014) Engagement with health agencies on Twitter. PLoS One 9: e112235.
Cai Z, Siebert-Evenstone A, Eagan B, Williamson Shaffer D, Hu X, Graesser AC. (2019) nCoder+: A semantic tool for improving recall of nCoder coding. Springer, pp. 41–54. Available at: https://www.researchgate.net/publication/336470284_nCoder_A_Semantic_Tool_for_Improving_Recall_of_nCoder_Coding
Chi MTH, Adams J, Bogusch EB, Bruchok C, Kang S, Lancaster M, et al. (2018) Translating the ICAP theory of cognitive engagement into practice. Cognitive Science 42: 1777–1832.
Chong D, Druckman JN. (2007) A theory of framing and opinion formation in competitive elite environments. Journal of Communication 57(1): 99–118.
Christenson SL, Reschly AL, Wylie C. (2012) Handbook of Research on Student Engagement. New York, NY: Springer.
Connell JP, Halpern-Felsher BL, Clifford E, Crichlow W, Usinger P. (1995) Hanging in there: Behavioral, psychological, and contextual factors affecting whether African American adolescents stay in high school. Journal of Adolescent Research 10: 41–63.
Deci EL, Ryan RM. (1985) Intrinsic Motivation and Self-Determination in Human Behavior. New York, NY: Plenum.
D’Mello S, Dieterle E, Duckworth A. (2017) Advanced, analytic, automated (AAA) measurement of engagement during learning. Educational Psychologist 52: 104–123.
Dubovi I, Tabak I. (2020) An empirical analysis of knowledge co-construction in YouTube comments. Computers & Education 156: 103939.
Edelmann N. (2016) What is lurking? A literature review of research on lurking. In: Riva G, Wiederhold BK, Cipresso P. (eds) The Psychology of Social Networking: Personal Experience in Online Communities. Berlin: De Gruyter, pp. 159–174.
Entman RM. (1993) Framing: Towards clarification of a fractured paradigm. Journal of Communication 43(4): 51–58.
Fredricks JA, Blumenfeld PC, Paris AH. (2004) School engagement: Potential of the concept, state of the evidence. Review of Educational Research 74: 59–109.
Funk C, Gottfried J, Mitchell A. (2017) Science News and Information Today. Pew Research Center.
Gaspar R, Pedro C, Panagiotopoulos P, Seibt B. (2016) Beyond positive or negative: Qualitative sentiment analysis of social media reactions to unexpected stressful events. Computers in Human Behavior 56: 179–191.
Hargittai E, Füchslin T, Schäfer MS. (2018) How do young adults engage with science and research on social media? Some preliminary findings and an agenda for future research. Social Media and Society 4: 205630511879772.
Hine C. (2014) Headlice eradication as everyday engagement with science: An analysis of online parenting discussions. Public Understanding of Science 23: 574–591.
Huang T, Grant WJ. (2020) A good story well told: Storytelling components that impact science video popularity on YouTube. Frontiers in Communication 5: 86–99.
Huber B, Barnidge M, Gil de Zúñiga H, Liu JH. (2019) Fostering public trust in science: The role of social media. Public Understanding of Science 28: 759–777.
Jensen JD. (2011) Knowledge acquisition following exposure to cancer news articles: A test of the cognitive mediation model. Journal of Communication 61: 514–534.
Ji YG, Chen ZF, Tao W, Li ZC. (2019) Functional and emotional traits of corporate social media message strategies: Behavioral insights from S&P 500 Facebook data. Public Relations Review 45: 88–103.
Jockers M. (2017) Syuzhet: Extracts sentiment and sentiment-derived plot arcs from text. R package version 1.04. Available at: https://cran.r-project.org/web/packages/syuzhet/
Jolly M. (2018) Trending YouTube video scraper. Available at: https://github.com/mitchelljy/Trending-YouTube-Scraper
Jünger J, Fähnrich B. (2020) Does really no one care? Analyzing the public engagement of communication scientists on Twitter. New Media & Society 22: 387–408.
Kahle K, Sharon AJ, Baram-Tsabari A. (2016) Footprints of fascination: Digital traces of public engagement with particle physics on CERN’s social media platforms. PLoS One 11: e0156409.
Khan ML. (2017) Social media engagement: What motivates user participation and consumption on YouTube? Computers in Human Behavior 66: 236–247.
Knowles R, Wilkinson C. (2015) The worries of weaning: Newspaper reporting of infant weaning and its impact on dialogue in online discussion forums. Journalism 18: 350–367.
Kousha K, Thelwall M, Abdoli M. (2012) The role of online videos in research communication: A content analysis of YouTube videos cited in academic publications. Journal of the American Society for Information Science and Technology 63: 1710–1727.
Ladd GW, Dinella LM. (2009) Continuity and change in early school engagement: Predictive of children’s achievement trajectories from first to eighth grade? Journal of Educational Psychology 101: 190–206.
Len-Ríos ME, Bhandari M, Medvedeva YS. (2014) Deliberation of the scientific evidence for breastfeeding: Online comments as social representations. Science Communication 36: 778–801.
Lucas M, Gunawardena C, Moreira A. (2014) Assessing social construction of knowledge online: A critique of the interaction analysis model. Computers in Human Behavior 30: 574–582.
Luo W, Hughes JN, Liew J, Kwok O. (2009) Classifying academically at-risk first graders into engagement types: Association with long-term achievement trajectories. Elementary School Journal 109: 380–405.
McCallie E, Bell L, Lohwater T, Falk JH, Lehr JL, Lewenstein BV, et al. (2009) Many Experts, Many Audiences: Public Engagement with Science and Informal Science Education. A CAISE Inquiry Group Report. Executive Summary. Washington, DC: Center for Advancement of Informal Science Education.
McHugh ML. (2012) Interrater reliability: The kappa statistic. Biochemia Medica 22: 276–282.
Maier M, Rothmund T, Retzbach A, Otto L, Besley JC. (2014) Informal learning through science media usage. Educational Psychologist 49: 86–103.
Metag J. (2020) What drives science media use? Predictors of media use for information about science and research in digital information environments. Public Understanding of Science 29(6): 561–578.
Mochales R, Moens MF. (2011) Argumentation mining. Artificial Intelligence and Law 19: 1–22.
Mohammad SM. (2016) Sentiment analysis: Detecting valence, emotions, and other affectual states from text. In: Meiselman HL. (ed.) Emotion Measurement. Cambridge: Woodhead Publishing, pp. 201–237.
Morphett K, Herron L, Gartner C. (2020) Protectors or puritans? Responses to media articles about the health effects of e-cigarettes. Addiction Research & Theory 28: 95–102.
Muntinga DG, Moorman M, Smit EG. (2011) Introducing COBRAs: Exploring motivations for brand-related social media use. International Journal of Advertising 30: 13–46.
Murthy D, Sharma S. (2019) Visualizing YouTube’s comment space: Online hostility as a networked phenomena. New Media & Society 21: 191–213.
National Science Board (2018) Science and Engineering Indicators 2018. Arlington, VA: National Science Foundation (NSB-2018-1).
Nisbet MC, Mooney C. (2007) Framing science. Science 316: 56.
Nonnecke B, Preece J. (1999) Shedding light on lurkers in online communities, pp. 123–128. Available at: https://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.94.4833&rep=rep1&type=pdf
Oeldorf-Hirsch A. (2018) The role of engagement in learning from active and incidental news exposure on social media. Mass Communication and Society 21: 225–247.
Orr D, Baram-Tsabari A. (2018) Science and politics in the polio vaccination debate on Facebook: A mixed-methods approach to public engagement in a science-based dialogue. Journal of Microbiology & Biology Education 19: 191134.
Pekrun R, Linnenbrink-Garcia L. (2012) Academic emotions and student engagement. In: Christenson SL, Reschly AL, Wylie C. (eds) Handbook of Research on Student Engagement. New York, NY: Springer, pp. 259–282.
Perrin A, Anderson M. (2019) Share of U.S. Adults Using Social Media, Including Facebook, is Mostly Unchanged Since 2018. Pew Research Center: Internet, Science & Technology. Available at: https://www.pewresearch.org/fact-tank/2019/04/10/share-of-u-s-adults-using-social-media-including-facebook-is-mostly-unchanged-since-2018/
Perse EM. (1990) Involvement with local television news: Cognitive and emotional dimensions. Human Communication Research 16: 556–581.
Ragupathy R, Maguluri LP. (2018) Comparative analysis of machine learning algorithms on IVR data. International Journal of Engineering and Technology 5: e4.
Reif A, Kneisel T, Schäfer M, Taddicken M. (2020) Why are scientific experts perceived as trustworthy? Emotional assessment within TV and YouTube videos. Media and Communication 8(1): 191–205.
Renninger KA, Bachrach JE. (2015) Studying triggers for interest and engagement using observational methods. Educational Psychologist 50: 58–69.
Salathé M, Khandelwal S. (2011) Assessing vaccination sentiments with online social media: Implications for infectious disease dynamics and control. PLoS Computational Biology 7: e1002199.
Shao G. (2009) Understanding the appeal of user-generated media: A uses and gratification perspective. Internet Research 19: 7–25.
Shapiro MA, Park HW. (2015) More than entertainment: YouTube and public responses to the science of global warming and climate change. Social Science Information 54(1): 115–145.
Shapiro MA, Park HW. (2018) Climate change and YouTube: Deliberation potential in post-video discussions. Environmental Communication 12: 115–131.
Siersdorfer S, Chelaru S, Nejdl W, Pedro JS. (2010) How useful are your comments? Analyzing and predicting YouTube comments and comment ratings, pp. 891–900. Available at: https://www.researchgate.net/publication/221023733_How_useful_are_your_comments_Analyzing_and_predicting_YouTube_comments_and_comment_ratings
Sinatra GM, Heddy BC, Lombardi D. (2015) The Challenges of Defining and Measuring Student Engagement in Science. New York, NY: Routledge.
Skinner E. (2016) Handbook of Motivation at School. New York, NY: Routledge.
Skinner E, Belmont MJ. (1993) Motivation in the classroom: Reciprocal effects of teacher behavior and student engagement across the school year. Journal of Educational Psychology 85: 571–581.
Stevens JP. (2001) Applied Multivariate Statistics for the Social Sciences. Hillsdale, NJ: Lawrence Erlbaum Associates.
Tabak I. (2016) Functional scientific literacy: Seeing the science within the words and across the web. In: Corno L, Anderman EM. (eds) Handbook of Educational Psychology, 3rd edn. London: Routledge, pp. 269–280.
Thelwall M, Sud P, Vis F. (2012) Commenting on YouTube videos: From Guatemalan rock to El Big Bang. Journal of the American Society for Information Science and Technology 63: 616–629.
Thompson CG, Kim RS, Aloe AM, Becker BJ. (2017) Extracting the variance inflation factor and other multicollinearity diagnostics from typical regression results. Basic and Applied Social Psychology 39(2): 81–90.
Tian Y. (2010) Organ donation on web 2.0: Content and audience analysis of organ donation videos on YouTube. Health Communication 25: 238–246.
Tian Y, Yoo J. (2020) Medical drama viewing and medical trust: A moderated mediation approach. Health Communication 35: 46–55.
Toulmin SE. (1958) The Uses of Argument: Updated Edition. Cambridge: Cambridge University Press.
Tsovaltzi D, Greenhow C, Asterhan C. (2015) When friends argue: Learning from and through social network site discussions. Computers in Human Behavior 53: 567–569.
Wang MT, Eccles JS. (2012) Adolescent behavioral, emotional, and cognitive engagement trajectories in school and their differential relations to educational success. Journal of Research on Adolescence 22: 31–39.
Weinberger A, Fischer F. (2006) A framework to analyze argumentative knowledge construction in computer-supported collaborative learning. Computers & Education 46: 71–95.
Widyaningrum P, Ruldeviyani Y, Dharayani R. (2019) Sentiment analysis to assess the community’s enthusiasm towards the development chatbot using an appraisal theory 161: 723–730.
Yoon S, Parsons FE, Sundquist KJ, Julian J. (2017) Comparison of different algorithms for sentiment analysis: Psychological stress notes. Studies in Health Technology and Informatics 245: 1292.
Zhang N. (2016) Public perceptions of genetically modified food on social media: A content analysis of YouTube comments on videos. Available at: https://scholarcommons.sc.edu/etd/3981



Articles from Public Understanding of Science (Bristol, England) are provided here courtesy of SAGE Publications
