Computers in Human Behavior. 2021 Mar 29;121:106796. doi: 10.1016/j.chb.2021.106796

Designing for fake news literacy training: A problem-based undergraduate online-course

Christian Scheibenzuber a, Sarah Hofer a,b, Nicolae Nistor a,c

Abstract

In the wake of the Covid-19 pandemic, most universities had to switch to “emergency online learning”. At the same time, academics were searching for means to combat the “infodemic”, a wave of misinformation rolling over the world, affecting social and political life, and undermining efforts to deal with the pandemic. In the framework of emergency online learning, we propose an educational sciences undergraduate online course addressing fake news illiteracy by giving students an insight into the form and effects of fake news, with a focus on framing. The course was built upon current fake news research and the problem-based learning approach. The research questions addressed students’ perceptions of critical design elements, their fake news credibility test performance, and their academic achievement. A total of N = 101 undergraduate students participated in the course. Among the various design elements, students found online communication and feedback the most appealing. For future course iterations, they suggested improvements to the task descriptions. Fake news credibility ratings decreased significantly (F(1, 36) = 62.64, p < .001, partial η2 = 0.64) and final course papers were on average good to very good, indicating strong academic achievement. The study suggests that problem-based online courses can be appropriate learning environments, even in the context of “emergency online learning”, and, furthermore, that they can serve as an instrument for combating fake news illiteracy.

Keywords: Problem-based learning, Fake news, Emergency online learning, Higher education, Self-regulated learning

1. Introduction

Education faced at least two major challenges in the year 2020: emergency online learning (Murphy, 2020) and an infodemic (Hua & Shaw, 2020), a wave of misinformation rolling over the world in the form of fake news. The restrictions of social contact aimed at limiting contagion during the Covid-19 pandemic called for emergency online learning, which is the focus of this special issue. In the past three decades, educational researchers have been experimenting with online learning (e.g., Nistor, 2003; Nistor & Neubauer, 2010), but it was not expected that it would be adopted on such a large scale in traditional universities. Due to the pandemic, much of the academic environment had to switch from face-to-face to online learning practically overnight, the challenge being that many educators and students alike were not fully prepared for online learning. In particular, although students might previously have acquired some cognitive scripts for online collaboration (Fischer et al., 2013), these may not have been enough to enable them to handle study programs that were completely online. Students had to self-regulate their learning processes (Greene & Azevedo, 2010), coordinate their activities in multiple online courses, and successfully apply learning strategies such as time management (Broadbent & Poon, 2018).

During the infodemic, Germany became the prime target in Europe for misinformation campaigns (EUvsDiSiNFO, 2021). In times of crisis, and particularly during the Covid-19 pandemic, fake news exacerbates the intrinsic danger of the ongoing crisis: On the one hand, various actors are fighting to convey views on how to deal with the crisis. On the other hand, fake news and the associated uncertainty, fear, and heightened need for information make the confused population less responsive to crisis management, thereby increasing risks (Hua & Shaw, 2020). Accepting and sharing fake news results in misconceptions and false beliefs about the world (Marsh & Stanley, 2020), i.e., fragmented and inaccurate conceptions at the individual level, or deficient comprehension of complex situations. This leads to negative consequences for society, politics, and the economy. With the ever-growing quantity of information available, and few reliable possibilities to accurately evaluate its truth value, we have arrived in a so-called post-truth era (Peters, 2017). This state of affairs is problematic for society as it hinders citizens from basing judgments and attitudes on valid information, and thus compromises the democratic decision-making process (Allcott & Gentzkow, 2017). Combating fake news is an important contemporary goal of research in fields like computer science, mass communication, psychology, and education, a goal which is even more important in times of crisis.

Against this background and in the context of emergency online learning, we propose an educational intervention designed as a problem-based online course to address susceptibility to fake news. The paper focuses on instructional design built upon current fake news research, resulting in a pilot course subjected to a first assessment of student perceptions, academic achievement, and fake news credibility performance. After this introduction, the paper presents a literature review covering the cognitive processing of fake news, fake news design, and interventions against fake news, as well as the instructional design of problem-based online courses. Subsequent sections present the research questions and methods, the results, and the discussion and conclusions.

2. Theoretical framework

2.1. Fake news

For the purpose of this study, we define fake news as “news articles that are intentionally and verifiably false, and could mislead readers” (Allcott & Gentzkow, 2017, p. 213). In addition, fake news “mimic[s] news media content in form but not in organizational process or intent” (Lazer et al., 2018, p. 1049). Fake news covers a variety of content from many aspects of life, among which conspiracy theories are a prominent recurring topic; they are particularly dangerous (Lewandowsky et al., 2017), as they can harm public discourse and interaction. In many cases, however, fake news is simply “bullshit” (Pennycook & Rand, 2020) in the sense defined by Harry G. Frankfurt, i.e., the authors do not care whether the contents of the communication are true as long as somebody reads and believes them (Frankfurt, 2009).

2.2. Cognitive processes associated with fake news

Currently, there are several approaches explaining why humans are so vulnerable to fake news. In the following, we attempt to provide a synthesis of these explanations from a cognitive perspective that can be subsequently used as a ground for educational interventions. Accordingly, we propose that the cognitive processing of fake news comprises four components: reception, information acceptance, cognitive integration, and sharing (see Table 1 ). To the best of our knowledge, this model integrates the most relevant fake news research findings published so far.

(1) Reception. At the onset, misinformation may captivate the attention of Internet users who may, for various reasons, be interested in the topic. Negativity bias makes humans focus more on negative information, and thus on the majority of fake news (Jaffé & Greifeneder, 2020; Park, 2015). Repeated statements are more easily processed and deemed more credible than completely novel ones, leading to the illusory truth effect (Fazio et al., 2015; Hasher et al., 1977). Together, these two categories of effects provide fake news with a simple entry point to recipients' cognition.

(2) Information acceptance. Once fake news has found its way into cognition, individuals evaluate the truth value of the information in order to decide whether to accept and integrate it into their knowledge network. This can be done analytically, e.g., by fact checking; it can be done intuitively, e.g., by stating that the information “feels true” (Schwarz & Jalbert, 2020); or the information can be accepted without any particular evaluation (Pennycook & Rand, 2019). Truth evaluation, especially when done intuitively, can be biased in several ways. Humans are not very good at identifying deception (DePaulo et al., 1997; Rubin & Conroy, 2012), i.e., we tend to believe others and perceive information we receive from them as reliable (truth bias – e.g., van Swol, 2014). Cultural identity can make certain concepts more fluent than others, and thus more truthlike (Oyserman & Dawson, 2020). Pennycook and Rand (2020) describe “bullshit receptivity” as a personality trait of people falling prey to fake news. Moreover, information that fits pre-existing knowledge and attitudes is more easily accepted and integrated, which promotes the confirmation bias: humans specifically look for, and accept, information that fits into their worldview and their previous attitudes and opinions (Nickerson, 1998). Cognitive dissonance (Festinger, 1957) is a prominent way of explaining the confirmation bias. In order to avoid or reduce cognitive dissonance, new and dissonant information may get dismissed as biased, untrustworthy, or simply false, whereas consonant information may be accepted and integrated into pre-existing knowledge (McGrath, 2017; Weeks et al., 2017).

(3) Cognitive integration. If integrated information or its semantic relationship to other concepts is missing or false, misconceptions develop (diSessa, 2018; Smith et al., 1994). Due to confirmation bias, these misconceptions can be consolidated by selective exposure to information, i.e., by actively searching for new pieces of information conforming to existing misconceptions, assimilating these, and thus building larger flawed knowledge structures (Weeks et al., 2017). This leads individuals to view their perspective of the world as the only valid one and to dismiss alternative information as irrational, ill-informed, or biased (Ross & Ward, 1996; Weeks et al., 2017), a cognitive state described as naïve realism (Cheek et al., 2020). Algorithms recommending customized content based on Internet users' previous history limit experiences in the digital world to spaces conforming to existing worldviews, the so-called filter bubbles (Pariser, 2016), an ideal place for naïve realism and confirmation bias.

(4) Sharing. Leaving the individual level and looking at fake news from a socio-technical perspective, misinformation is frequently shared among Internet users, above all on social media platforms. This is done for various reasons, such as (dis-)informing others and influencing their decisions (Oyserman & Dawson, 2020) or harming them (Maftei & Grigore, 2020), or it happens simply without reflection (Pennycook & Rand, 2019). Filter bubbles, at their extreme, can turn into echo chambers where a naïve reality is maintained among like-minded Internet users sharing the same information and confirming each other's beliefs (Nguyen, 2020).

Table 1.

Overview of fake news cognitive processing, supportive factors, intervention goals and approaches.

Cognitive processing | Supportive factors | Intervention goals | Intervention approaches
Reception | Interests; general perception features (“eye catchers”); negativity bias; illusory truth; emotional framing | Awareness of perception features and biases; scepsis towards emotional content | Inoculation
Information acceptance | Truth bias; cultural identity; bullshit receptivity; confirmation bias; value framing | Awareness of biases and cultural identity; reduction of intuitive truth evaluation; promotion of analytical truth evaluation; scepsis towards value framing | Inoculation; fact checking (information literacy, source evaluation, lateral reading)
Cognitive integration | Confirmation bias; selective exposure; filter bubbles; naïve realism; semantic framing | Awareness of biases and filter bubbles; prevention of the creation and perpetuation of misconceptions; correction of misconceptions and adaptation of existing beliefs; scepsis towards semantic framing | Inoculation; conceptual change
Sharing | Echo chambers; filter bubbles | Awareness of filter bubbles and echo chambers; restraint of recipients' willingness to share fake news | Inoculation (news credibility warnings); fact checking (source evaluation training)

A synthesis of the above-mentioned deficits that hinder critical thinking about fake news leads to what is termed fake news illiteracy in this paper. In contrast to media literacy (Potter, 2018), which would provide a preemptive safeguard against misinformation, fake news illiteracy fuels flawed news evaluation and misinformation processing, and results in change-resistant misconceptions.

2.3. Fake news design

Like most mass media content, fake news needs to be designed so that it can be more easily received, accepted, cognitively integrated, and shared. This is mainly done by adapting it to consumers' individual characteristics, taking – we assume – the cognitive processes addressed above into consideration. Designing for reception would thus exploit the negativity and truth biases and the illusory truth effect. The reception process is sustained by the sheer mass of fake news and by Internet user profiling, so that users can be targeted either as individuals or as groups inhabiting filter bubbles (Cadwalladr & Graham-Harrison, 2018; Vosoughi et al., 2018). Emotional and value framing (Oswald, 2019) can address news consumers' negativity bias, activate negative emotions such as anger, and thus captivate attention. Designing for acceptance can be based on the illusory truth effect and confirmation bias, and comprises bombarding individuals or filter bubble inhabitants with similar or consistent fake news stories from different sources. Consistency between fake news stories reduces cognitive dissonance and fosters the integration of misinformation (McGrath, 2017). Additionally, value framing can be done by appealing to users' existing beliefs (Oswald, 2019) or cultural identity (Oyserman & Dawson, 2020). Designing for cognitive integration may build upon illusory truth effects and perpetuate misconceptions by continuously addressing them, e.g., as Donald Trump and his followers did with their chant “lock her up”, referring to Trump's opponent Hillary Clinton (Erichsen et al., 2020). Furthermore, semantic framing (Allcott & Gentzkow, 2017; Oswald, 2019) can be used to implicitly address the theme in multiple contexts. As social network systems are among the preferred media for spreading fake news (Allcott & Gentzkow, 2017; Lazer et al., 2018), designing for sharing implies formatting the news to fit social network platforms. Again, user profiling (Cadwalladr & Graham-Harrison, 2018) and filter bubbles (Pariser, 2016) build a strong infrastructure that brings together like-minded Internet users, creating environments where fake news is more efficiently disseminated.

As framing appears to be a powerful mass communication method that we have mentioned above in several places, we would like to close the fake news design subsection with some further clarification. Framing theory (Scheufele, 1999) is based on Goffman's (1974) frame theory, Rumelhart's (1980) cognitive schemata, and Tuchman's (1978) reality construction in mass media. “To frame is to select some aspects of a perceived reality and make them more salient in a communicating text, in such a way as to promote a particular problem definition, causal interpretation, moral evaluation, and/or treatment recommendation for the item described” (Entman, 1993, p. 52). Oswald (2019) identifies three linguistic framing instruments most commonly used in political communication: value, emotional, and semantic framing. Value framing is the use of norms and values important to the targeted audience, which are activated regardless of whether they are relevant to the news content. Emotional framing means the intentional activation of certain emotions, such as anger, through corresponding language. Semantic framing associates news content with a certain connotation by using specific terms.

2.4. Counteracting fake news

The same four processes addressed above (see also Table 1) are relevant when counteracting fake news. At reception level, news recipients need to be guarded against emotional framing and the impact of negativity bias. Approaches based on inoculation theory provide one way of fostering such healthy scepticism towards emotional content. These approaches were initially developed during the Cold War as a means to counteract propaganda (McGuire, 1964; Papageorgis & McGuire, 1961). Inoculation in the context of fake news refers to building up a defense against persuasion or deception attempts by exposure to fake news in a controlled and safe environment. In this way, online news recipients are provided with an insight into fake news composition, techniques, and effects (Basol et al., 2020; Van der Linden et al., 2017).

At information acceptance level, recipients should become suspicious of value framing techniques, to prevent them from evaluating truth in a merely intuitive manner, or not evaluating it at all. Again, inoculation as described above may be a promising approach to this issue (Basol et al., 2020). Illusory truth effects could potentially be lessened by informing news consumers about them, and by exposing consumers to personal experience of value framing and commonly used strategies (Osborne, 2018; Revez & Corujo, 2021), for example by reverse engineering news stories (Osborne, 2018). Getting news recipients to rely less on their intuition and more on analytical information processing also appears promising (Roozenbeek & van der Linden, 2019). Inoculation can take place through serious games, such as the “Bad News” game developed by Roozenbeek and van der Linden (2019). Empirical results from a large and diverse sample of Internet users showed that inoculating players with information about fake news design has a small to moderate effect on the perceived reliability of news (Roozenbeek & van der Linden, 2019). Scheibenzuber and Nistor (2019) were able to replicate the positive effects on subjective learning success and motivation, but the causal comparison between the “Bad News” game and a text-based presentation of the same information did not yield any statistically significant effects.

Information literacy increases the likelihood of debunking fake news, although its positive effects appear rather limited (Jones-Jang et al., 2021). Nevertheless, librarian practice includes disseminating information about fake news and information literacy campaigns, particularly aimed at raising the awareness of the need to evaluate information, and source evaluation training (Osborne, 2018; Revez & Corujo, 2021). An example of carefully designed and evaluated source evaluation training is the lateral reading technique deployed by McGrew (2020) with high school students who could thus improve their ability to select reliable news sites.

Acceptance is not only relevant for the cognitive processing of online news; it can also be critical for interventions against fake news. At reception level, an intervention might not be framed in a way with which frequent fake news recipients are familiar. At acceptance level, the cognitive dissonance created by the intervention poses a significant threat to recipients’ acceptance of the intervention. This is even more the case at cognitive integration level, where we deal with change-resistant misconceptions. Finally, at sharing level, the social bubble inhabited by news consumers may disapprove of and even reject the person involved in an intervention against fake news. Consequently, interventions need to be not only functional in terms of reducing fake news illiteracy but also designed in a way that appeals to participants and keeps them engaged.

At cognitive integration level, the creation and perpetuation of misconceptions (Chi, 2013) needs to be addressed. Simply warning recipients of semantic framing may not be sufficient (Lazer et al., 2018), and incorporating new information to correct misconceptions in somebody's worldview seems to require higher cognitive abilities (De keersmaecker & Roets, 2017). “Detecting and escaping from echo chambers will require a radical restructuring of a member's relationship to their epistemic past” (Nguyen, 2020, p. 143). In other words, conceptual change, including belief revision, mental model transformation, and categorical shift (Chi, 2013), may be necessary and appropriate. Conceptual change has been extensively researched in recent decades, although it has hardly ever been related to fake news.

At sharing level, credibility indicators based on automated or human-made fact checking, and corresponding warning labels can reduce the fake news consumers’ sharing intent (Chung & Kim, 2020; Yaqub et al., 2020). As the perceived information quality predicts news sharing (Koohikamali & Sidorova, 2017), interventions fostering a more accurate evaluation of information sources may indirectly influence the fake news sharing behavior.

Altogether, this brief literature overview suggests that educational interventions aimed at information literacy – fake news literacy, as discussed in this paper – aim at self-evaluation of knowledge and skills (prominently including the various cognitive biases), knowledge construction and reorganization, and knowledge and skills transfer, in order to enable adequate cognitive processing of online news. Interventions are built on inoculation more often than fact checking. The number of intervention studies in the context of formal education is limited.

2.5. Problem-based learning and fake news literacy

As shown in the previous section, educational interventions geared towards fake news literacy aim at individuals’ self-evaluation of knowledge and skills, knowledge construction and reorganization, and transfer of knowledge and skills in order to enable adequate handling of online news. Problem-based learning appears to be a particularly promising way of targeting these intervention goals (see also Table 1) in the context of formal education in a semester-long course at the university. Based on a constructivist approach, problem-based learning embeds student learning into real-life problems that students try to solve collaboratively in groups (e.g., Barrows & Tamblyn, 1980; Hmelo-Silver, 2004). Problem-based learning environments have been shown to support the development of conceptual knowledge and skills in different domains (e.g., Dolmans et al., 2016; Ferreira & Trudel, 2012; Gijbels et al., 2005; Loyens et al., 2015; Şendağ & Ferhan Odabaşı, 2009; Yew et al., 2016). In line with the intervention goals listed above, we hence suggest a problem-based learning environment for constructing and reorganizing as well as flexibly applying (i.e., transferring) conceptual knowledge of characteristics and mechanisms of framing. This conceptual knowledge is key to understanding fake news processing (reception, acceptance, and integration) and practicing related skills. We see these related skills in a broader context, including the skill to perform a literature search or qualitative content analysis as well as critically reflecting on cognitive processes (self-evaluation). In the following section, we go through the problem-based learning process and highlight why the different steps in this process might be particularly advantageous to fake news literacy.

At the beginning of the problem-based learning cycle, students are confronted with a problem scenario that has to be analyzed in order to formulate and evaluate possible solutions. By design, students' existing knowledge is not sufficient for them to come up with a satisfactory solution to the problem at hand. The insight that a given problem cannot be solved by referring to one's pre-existing knowledge stimulates active knowledge construction and reorganization. Related prior knowledge is activated and motivation increases (Sinatra & Pintrich, 2003). Failing to solve the problem can be productive in this context by revealing the limits of the students' existing knowledge and hence initiating conceptual change (see Chinn & Brewer, 1993; Kapur, 2014; Sanchez et al., 2009). At the beginning of a course promoting fake news literacy, students can be instructed to think up answers to the question how and why fake news might trick news recipients before they receive any information on the topic.

After having identified knowledge deficits, students engage in self-regulated learning to acquire the knowledge necessary to address the problem (e.g., Broadbent & Poon, 2015). A course designed for fake news literacy will help students to develop conceptual knowledge about framing techniques, including emotional, value, and semantic framing, and their application in fake news. Students are hence instructed to describe different forms of framing based on the literature on fake news and framing, and to derive a coding system to detect these design elements in actual fake news articles, as sketched below. In the process, they practice their literature search skills as well as the research method of qualitative content analysis.
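To make the idea of such a coding system concrete, the following toy sketch applies a keyword-based frame-coding pass to an article text. The frame categories follow Oswald (2019), but the keyword lists, function names, and example text are invented placeholders; real qualitative content analysis relies on human interpretation rather than keyword matching.

```python
# Toy sketch of a frame-coding pass over a fake news article.
# Keyword lists and the example text are hypothetical placeholders.
from collections import Counter

coding_scheme = {
    "emotional_framing": ["shocking", "fear", "outrage", "disaster"],
    "value_framing": ["freedom", "security", "tradition", "justice"],
    "semantic_framing": ["wave", "flood", "invasion"],  # e.g., migration metaphors
}

def code_article(text):
    """Count how often the indicator terms for each frame occur in an article."""
    tokens = text.lower().split()
    counts = Counter()
    for frame, indicators in coding_scheme.items():
        counts[frame] = sum(tokens.count(term) for term in indicators)
    return counts

article = "A shocking wave of crime threatens the security and freedom of citizens"
print(code_article(article))
# Counter({'value_framing': 2, 'emotional_framing': 1, 'semantic_framing': 1})
```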

A next step in the problem-based learning process involves the application of the newly acquired knowledge to the problem and the evaluation of the solution. In a course to increase fake news literacy, this could be implemented by having the students apply their coding systems to a number of real fake news items. In the process, students might discover gaps in their own knowledge. The process supports the development of knowledge that can be activated and flexibly applied in a variety of fake news contexts (e.g., Gick & Holyoak, 1983; Goldwater & Schalk, 2016).

The next step of the problem-based learning cycle also promotes transfer. If the problem solution is evaluated positively, learners are asked to transfer their ideas to new situations, which further promotes their ability to conceptualize. In the context of a course on fake news literacy, the conceptual knowledge of characteristics and mechanisms of framing acquired during the previous steps can be used by the learners to design an intervention or training on the effects of framing. Likewise, the coding system used to detect and describe framing that was tested on a given corpus of fake news can be applied and adapted to a new corpus covering another topic. Problem-based learning promotes knowledge transfer and is therefore particularly relevant to fake news literacy.

If no problem solution is reached, however, students have to repeat the problem-based learning cycle – or parts of it – thereby practicing monitoring and critically reflecting their own thinking processes (e.g., Barrows & Tamblyn, 1980; Hmelo-Silver, 2004). Being able to reflect on one's own cognitive processes and biases can be seen as an integral part of fake news literacy.

To sum up, confronted with the problem of identifying and understanding the mechanisms behind fake news design (i.e. framing), students collaboratively engage with the conceptual content knowledge on framing and, at the same time, practice evaluation skills, such as literature search and qualitative content analysis as well as self-monitoring (e.g., Gallagher, 1997; Hmelo-Silver, 2004; Polanco et al., 2004).

2.6. Designing a problem-based course

The effectiveness of problem-based learning largely depends on the right balance between students’ prior content knowledge and problem solving, collaboration and self-regulated learning skills, on the one hand, and instructional support, on the other hand (e.g., Kalyuga et al., 2001; Roelle & Berthold, 2013). Different types of instructional support – or instructional design elements – can be considered to avoid overload and help students to learn despite the complexities of the problem-based learning environment. While instructional support can be provided both in an analogue and digital environment, online problem-based learning implies specific affordances, opportunities, and difficulties.

Authentic problem description. The problem scenario should be described in sufficient detail and based on authentic materials that can include interactive media elements (links to webpages, videos or images) to contextualize the problem. The problem should be broken up into separate parts that can be addressed one by one to help students structure the work process. In a problem-based learning curriculum, small groups of students may be confronted with weekly problems they are trying to solve supported by the teacher (Keppell et al., 2001).

Problem-solving resources. Particularly when students are not familiar with the learning domain, instructional guidance encompassing worked out examples, instructional videos, or question prompts can effectively scaffold the problem-solving process (e.g., Hmelo-Silver et al., 2007; Kim et al., 2018; Schmidt et al., 2007). Instructor feedback during the problem-solving process could be a helpful resource for students to calibrate their self-regulated learning activities in a given time (e.g., Mamun et al., 2020). Finally, access to all information, material, and resources necessary to address the problem should be guaranteed. In online settings, learning management systems can be considered a valuable resource to organize and streamline the problem-based learning cycle and, in particular, the provision of problem-solving resources. These systems structure the problem-based learning process (e.g., weekly updates, learning organized around problems or assignments), handle access to course materials, incorporate communication technologies (announcements, feedback, wikis, or discussion boards) and computer assisted learning modules, link to webpages, and permit embedding diverse media content (Petrovic & Kennedy, 2005; Tosun & Taşkesenligil, 2011).

Communication and collaboration resources. Deliberately stimulating interactive activities such as group discussions or peer review that allow social knowledge construction can increase student engagement (Olsen et al., 2020; Swan, 2002). Instructor questions or question prompts provide guidance and structure for group work and discussions (e.g., Garrison & Akyol, 2013; Lee et al., 2017). Discussion boards, etherpads, or wikis, which represent a platform for synchronous collaboration, facilitate collaborative learning and are promising resources in online problem-based learning environments (e.g., Duncan et al., 2013; Zheng et al., 2015).

While the presentation of authentic problems and the learning resources supply may profit from the digital format, social interactions, both among students and with faculty, can be considered one of the biggest challenges (e.g., Delen & Liew, 2016; Olsen et al., 2020; Tsai & Chiang, 2013). Potentially, computer-mediated communication can be as productive for collaborative learning as face-to-face communication; however, the former requires specific communication and collaboration skills or, in current terms of cognitive psychology, collaboration scripts (Fischer et al., 2013; Radkowitsch et al., 2020). A lack of such skills may increase the working time (Broadbent & Poon, 2015; Straus & McGrath, 1994; Valkenburg et al., 2016), and time resources can become scarce if all the courses on a student's study schedule are online. Consequently, the collaboration quality and the learning outcome can be affected.

3. Research questions

The literature review provided above outlines the cognitive processing of fake news and the corresponding interventions, concluding that a problem-based online course aimed at improving fake news literacy is necessary, and suggesting how it could be designed. We have developed an initial pilot course along these lines. Subsequently, in the empirical section of this study, we have addressed two basic areas of interest in order to obtain first indications as to how far the pilot course meets its objectives.

Firstly, we sought insight into how our students experienced the actual course design, particularly the online implementation of the authentic problem description, the problem-solving resources, and the communication and collaboration resources. This was expected to reveal major design flaws and suggest potential improvements for future course iterations, keeping in mind that participants with different study experience (e.g., freshmen vs. junior students) may have different learning needs and thus perceive different design elements as particularly helpful or unhelpful. Hence the first research question:

RQ 1: (a) Which instructional design elements did the participants find particularly appealing (or unappealing) and functional (or dysfunctional), and (b) why? (c) If unappealing or dysfunctional, how could these elements be improved?

Secondly, we looked at the learning outcome. Because problem-based learning is highly student-centered, its success has generally been shown to depend on learners' prior knowledge and cognitive skills, and thus on their study experience. Therefore, similarly to RQ 1, we differentiated our learning outcome research questions according to participants' study experience (e.g., freshmen vs. junior students). The learning outcome related to fake news literacy was students’ ability to apply conceptual knowledge and evaluation skills to detect fake news, assuming that junior students may perform better than freshmen. Hence the second research question:

RQ 2: What is students’ pre-post performance change in a fake news literacy test? What is the difference between freshmen and junior students?

The academic learning outcome was that participants learn in a self-regulated and collaborative manner, and synthesize the acquired knowledge in a final course paper. In analogy to the previous research question, the difference in performance between freshmen and junior students is also considered in the third research question:

RQ 3: What is students’ academic achievement as reflected in the final course paper? What is the difference between freshmen and junior students?

4. Research methods

4.1. Research design

The research questions were examined in the field, i.e., within the current emergency online learning situation. The first question was qualitative, the second and third were quantitative. RQ 1 was addressed in online breakout groups and additionally through content analysis of open questionnaire data and email communication with students. For organizational and ethical reasons, a classic evaluation design with treatment and reference groups was not possible. Therefore, RQ 2 was approached using a pre-post-test design including within- and between-subject analysis, and RQ 3 by descriptive statistics and quantitative group comparison.

4.2. Population and sample

The course was part of the study program at the Faculty for Psychology and Educational Sciences of a German state university with more than 50,000 students in 18 faculties. The participants, N = 101 in total, were undergraduates studying educational sciences either as a major or as a minor. The course was taken by both n 1 = 62 freshmen (51 female, 11 male) and n 2 = 39 junior students (36 female, 3 male). Freshmen took this course in the second semester of their undergraduate study and were thus less familiar with scientific problem-solving and collaborative and self-regulated learning, as compared to their junior counterparts, who took the course in their fifth semester. From their first year of study, the junior students were somewhat familiar with single online courses or course modules, however, not with an entire study program being carried out online.

To answer RQ 2, a repeated measures within-between ANOVA with two measurement points was conducted. At the end of the term, 97 participants had completed the course, of whom 38 provided valid and complete datasets in both pre- and post-test. A post-hoc power analysis indicated that a sample size of N = 38 is sufficient to detect a medium effect (f = 0.25) with an α error probability of 0.05 and power 1 − β = 0.85, greater than the acceptable minimum of 0.80 and deemed appropriate for a pilot study in the field (G*Power 3.1; Faul et al., 2009).
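The reported power value can be approximately reproduced without G*Power, for example with SciPy. The sketch below follows G*Power's conventions for repeated measures designs, including its default correlation of ρ = 0.5 among repeated measures, which is an assumption not stated in the text.

```python
# Hedged reconstruction of the post-hoc power analysis with SciPy.
from scipy.stats import f as f_dist, ncf

f_eff, alpha = 0.25, 0.05        # medium effect size f, alpha error probability
N, k, m, rho = 38, 2, 2, 0.5     # sample, groups, measurements, assumed correlation

lam = f_eff**2 * N * m / (1 - rho)     # noncentrality parameter (G*Power convention)
df1 = (m - 1) * (k - 1)                # numerator df (here: 1)
df2 = (N - k) * (m - 1)                # denominator df (here: 36)
f_crit = f_dist.ppf(1 - alpha, df1, df2)
power = ncf.sf(f_crit, df1, df2, lam)  # P(F > f_crit) under the noncentral F
print(f"power = {power:.2f}")          # ~0.85, close to the reported value
```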

4.3. Course description

4.3.1. Course goals

The course aimed at four learning goals. First, participants should acquire (in the case of freshmen) or consolidate (in the case of junior students) literature research skills by searching for literature and gaining an insight into fake news research with a focus on framing. Second, they should develop specific problem-solving skills by learning and applying qualitative content analysis methods; specifically, they should develop a coding system for framing and identify occurrences of different framing methods. They should not only become familiar with a new research method, but also gain insights into some of the strategies commonly found in fake news and thus decrease their fake news illiteracy. Third, they should acquire collaboration skills by practicing online collaboration in small groups and using the wiki tool provided by moodle to keep records of their progress. Fourth, they should critically reflect on their own research progress throughout the term and synthesize the results in an empirical research report submitted as a course paper. For the paper, the instructors suggested the title “Fake News Framing: Developing and Applying a Coding Scheme for Content Analysis”.

4.3.2. Instructional design

Overall, the online course was problem-based. With respect to the subject matter, the course relied on the inoculation approach (Basol et al., 2020; Van der Linden et al., 2017) with a focus on framing (Entman, 1993; Scheufele, 1999). The specific instructional design elements were shaped as follows.

Authentic problem description. Participants worked on the problem of online fake news, well known to students from the international political discourse and even more relevant in the context of the Covid-19 pandemic (Hua & Shaw, 2020). To increase the authenticity of the learning environment, only current fake news on migration from several German, Swiss, or Austrian alternative news sites (Vogel & Jiang, 2019) was used as learning material. Migration has been a pervasive topic since the European refugee crisis of the mid-2010s and was thus highly authentic. Authenticity was further addressed by a task later in the term, in which the students independently searched for fake news on Covid-19 and selected from it the material for the second content analysis.

Problem solving resources. One of the key resources in the course was current research literature on the general definitions and different types of fake news, and on their effects and form as described through the framing approach. Due to Covid-19 restrictions, on-campus libraries were not available to the students, and their entire literature research had to be conducted online. Additionally, a “crash course” on qualitative content analysis with further, more in-depth literature recommendations was compiled and provided by the instructors. It mainly contained a step-by-step tutorial, the bare minimum needed to get a grasp of the method.

Furthermore, students worked with current fake news articles from the publicly available German Fake News Corpus (Vogel & Jiang, 2019), or GermanFakeNC, a corpus of ~490 manually fact-checked fake news articles from German, Austrian, and Swiss alternative news sites. The instructors had selected 20 articles from this corpus with a thematic focus on migration to provide a coherent sample for the participants. Additional research material in the form of fake news articles on Covid-19, used in a second content analysis later in the course, was searched for by the participants themselves. Due to the difficulty of finding adequately fact-checked fake news, the minimum number of articles to be collected for analysis was set at 10, with a strong recommendation to search for more. Finally, participants received a set of guidelines for their course paper, including formalities such as the use of APA version 7 formatting, as well as a model structure to which students should adhere when writing their final assignment.

Communication and collaboration resources. The course management was based on moodle (v. 3.6), and freshmen and junior students were enrolled in two separate course instances built with the same data structure and contents. The courses opened with a short welcoming text, the course overview, and the schedule, and had a weekly structure consisting of task descriptions and associated learning resources. At the start of the course, only the welcoming text, the course overview, and the first task were visible. Throughout the term, subsequent sections were made visible and became available to students upon completion of each task, i.e., at the start of the next one. Each course section included a written task description, corresponding resources and, if necessary, assignment submission links for completed tasks.

The communication between instructors and students was mostly asynchronous and handled via moodle through weekly assignments as well as notifications that students received through their campus email every Monday morning. Alongside this regular and steady communication of tasks and new contents, participants communicated directly with instructors through email or the moodle messaging feature. Close to the end of the term, two synchronous meetings, one about the course paper, the other for more general questions, were held using the video conferencing tool zoom (v. 5.0.2, zoom.us). Upon assignment completion, students received feedback from the instructors via email. For communication between participants, the use of zoom was recommended at the beginning of the term. The breakout group discussions held within the last two term weeks also used zoom and its breakout sessions function to provide participants with separate spaces for group discussions.

4.3.3. Course schedule

Week 1 opened with a general introduction to the course featuring the organizational structure and the instruction to autonomously form work groups of three to five students using the moodle etherpad feature. In Week 2, students explored and started to use the moodle wiki feature. Furthermore, participants set, discussed and documented their individual learning goals in the wiki. Week 3 featured the “crash-course” on qualitative content analysis. From Week 4 to Week 8, students conducted the required literature review, collecting the search results in their groups’ wiki. Furthermore, the participants proposed a coding system to analyze the fake news from GermanFakeNC. This analysis began in Week 6 and ended with the submission of the coding system in Week 8. In Week 9 students wrote and handed in a short non-formal concept paper (max. 2 pages) on possible intervention methods to combat fake news illiteracy, as deduced from the results from the first content analysis. Weeks 10 to 13 featured the second content analysis, this time based on Covid-19 fake news found by the participants on the Internet, and on their coding system now adapted to the requirements of the new contents. The students began coding the new material in Week 12 alongside their work on the course paper, which was due in Week 15 (Fig. 1).

Fig. 1. Course overview.

4.4. Qualitative data collection

To address RQ 1, the appeal and functionality of our instructional design elements as seen by the students, we conducted discussions in breakout groups. This was inspired by the focus group method that comprises discussions in an informal setting with the goal of identifying participants’ personal experiences with the object of research (McLafferty, 2004; Morgan, 1997). The participants were invited to join one of two online discussions in zoom, for freshmen and for junior students respectively. In total, there were five breakout groups, three for the freshmen and two for the junior courses with four students in each group, all female. Prior to the sessions, the participants received an overview of the recommended discussion steps, as follows: (1) In a plenary brainstorming session, the participants collected points for the student feedback about the course design. (2) In separate breakout rooms, they formulated three feedback statements on the most important design elements of the course, rating them as positive or negative. If the rating was negative, improvements were suggested. (3) The participants returned to the main zoom room and each group presented their statements for a last discussion with the entire group in order to clarify uncertainties, if any, and finally the statements were submitted for the course evaluation. In addition, the participants occasionally provided feedback on the course design and suggested improvements via email during the term. The feedback was recorded and subjected to a thematic content analysis. In both the focused feedback and in email statements, subthemes subordinated to the design elements named above (authentic problem description, problem solving resources, and communication and collaboration resources) were identified, and the corresponding statements were summarized (Creswell, 2007).

4.5. Quantitative variables and measures

RQ 2 entails a single dependent variable, i.e., performance in a test on fake news credibility, measured by showing the participants a set of ten screenshots of online fake news, all taken from the GermanFakeNC (Vogel & Jiang, 2019). A completely different set of ten news article screenshots was used for the post-test questionnaire. All articles were taken from news sources the participants had not worked with during the course. For each screenshot, participants were asked to rate the credibility of the featured content on a seven-point Likert-scale from 1 = absolutely not credible to 7 = absolutely credible. The sum score was used for all calculations (ranging from 10 to 70), with a higher score implying a higher degree of fake news illiteracy. So far, there is no validated test for fake news credibility. However, the assessment method adopted here has been shown to work well in prior research on fake news (e.g., Roozenbeek & van der Linden, 2019). The reliability estimates yielded in our pretest (Cronbach's α = 0.83) and posttest (α = 0.87) indicate good internal consistency.
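For illustration, a minimal sketch of the sum score and the internal consistency estimate in Python, using randomly generated placeholder responses (the real item-level data are not published):

```python
# Sketch of the credibility sum score and Cronbach's alpha on placeholder data.
import numpy as np

rng = np.random.default_rng(0)
ratings = rng.integers(1, 8, size=(38, 10))   # hypothetical 1-7 Likert answers

sum_scores = ratings.sum(axis=1)              # possible range: 10 to 70

def cronbach_alpha(items):
    """alpha = k/(k-1) * (1 - sum of item variances / variance of the sum score)."""
    k = items.shape[1]
    item_variances = items.var(axis=0, ddof=1).sum()
    total_variance = items.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_variances / total_variance)

print(sum_scores.mean(), cronbach_alpha(ratings))
# On the real data, the paper reports alpha = .83 (pretest) and .87 (posttest).
```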

For RQ 3, we operationalized academic achievement as a single multidimensional dependent variable describing the quality of the course paper with the suggested title “Fake News Framing: Developing and Applying a Coding Scheme for Content Analysis.” The papers were recommended to be structured as shown in Table 2, and were developed step by step as described above in the course schedule. To develop the papers, the students had to apply the skills named above in the course goals, i.e., literature research, collaborative learning, and problem-solving skills. Paper grading was based on normative performance standards, taken from the syllabus, of what is expected from undergraduate students of educational sciences. Scores for each criterion were: 1 point = criterion met, 0.5 points = criterion partially met, 0 points = criterion not met. The total points awarded per chapter, divided by the maximum number of points and multiplied by the weights indicated in Table 2, resulted in scores for each chapter. These weights were chosen to reflect the importance of each chapter, focusing on the theoretical background and discussion, to see whether participants had a solid grasp of the research literature, and on the results, to gauge how well participants were able to depict their findings in an academic manner. The maximum total score was 100. To evaluate the objectivity of the rating system, the papers were coded independently by two instructors, who reached 100% agreement after brief discussions of less than 10 min each. The final grades were then calculated by transferring the total number of points to the German grading system (100–86 points = very good, 85–74 = good, 73–62 = satisfactory, 61–50 = sufficient, and <50 = insufficient).

Table 2.

Assessment criteria of the course papers.

Chapter | Criteria | Maximum points
1. Problem statement | Coherent and evidence-based problem statement | 10
2. Theoretical background | Coherent and logical presentation of the current state of fake news research with a focus on framing | 20
3. Methodology | Complete, adequate and accurate presentation of the research method (qualitative content analysis) | 5
 | Quality of the category systems for migration and Covid-19, respectively | 5
4. Results | Complete and accurate presentation of results | 20
5. Discussion | Summary of relevant results linked with research literature | 10
 | Credible and literature-based interpretation of results | 10
6. Implications | Stating logical consequences for research | 5
 | Stating logical consequences for practice | 5
7. Formal aspects | Use of adequate and recent literature | 5
 | Correct use of academic language | 3
 | Correct use of template and formatting (APA-7) | 2
Total | | 100
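As a worked illustration of the grading arithmetic described above, the following sketch averages the {0, 0.5, 1} criterion ratings per chapter, scales them by the chapter weights from Table 2, and maps the 100-point total to the German grade labels. The example ratings are invented, and treating all criteria within a chapter as equally weighted is one possible reading of the scheme.

```python
# Chapter weights summing to 100 (criteria pooled per chapter, from Table 2).
WEIGHTS = {"problem": 10, "theory": 20, "methodology": 10, "results": 20,
           "discussion": 20, "implications": 10, "formal": 10}

GRADE_CUTOFFS = [(86, "very good"), (74, "good"),
                 (62, "satisfactory"), (50, "sufficient")]

def total_score(ratings):
    """Average the {0, 0.5, 1} criterion ratings per chapter and apply weights."""
    return sum(sum(r) / len(r) * WEIGHTS[ch] for ch, r in ratings.items())

def grade(points):
    for cutoff, label in GRADE_CUTOFFS:
        if points >= cutoff:
            return label
    return "insufficient"

# Invented example paper with one rating per criterion:
paper = {"problem": [1], "theory": [1], "methodology": [1, 0.5], "results": [1],
         "discussion": [1, 0.5], "implications": [0.5, 1], "formal": [1, 1, 0.5]}
points = total_score(paper)
print(round(points, 1), grade(points))   # 88.3 very good
```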

4.6. Data collection and analysis

The course was offered during the summer term of 2020, from late April to early August, over 15 weeks. Data was collected during the first and last term weeks through online questionnaires in moodle using feedback forms. Each questionnaire had a lifecycle of one week: on Monday, the questionnaires were made visible and the participants were notified by email; reminders were sent on Wednesday and Friday; on the following Monday, the questionnaires were made invisible again. After the end of term, the course papers were submitted and graded by two instructors. All survey responses were downloaded for analysis and matched using pseudonymized user IDs. Data was analyzed with IBM SPSS Version 26, computing descriptive statistics, a repeated measures ANOVA, and Kruskal-Wallis tests.

5. Findings

5.1. RQ1: group discussion results

Online learning environment. All five groups found online learning helpful and less stressful than face-to-face learning because they did not have to travel to the university campus. For students who do not live nearby, travel time may be 2 h or more every day. In general, time management was easier, as gaps in course timetables were no longer an issue. They explicitly suggested continuing online teaching and learning after the pandemic. Nevertheless, two groups (one freshmen, one junior) also noted the importance of face-to-face meetings for motivation and focused learning, as well as the social components of student life.

Goal setting. One freshmen group misunderstood the goal setting, in that they had expected to become fake news experts who would recognize fake news at a single glance. Consequently, they were disappointed to see that the work of an entire semester “only” resulted in a differentiated insight into fake news framing. They suggested that the instructors explain the seminar goals at the beginning of the term in a synchronous online meeting, to clarify the expected learning outcome.

Course overview. One freshmen group had perceived the course overview provided by the instructors at the beginning of the term as too generic and felt uncertain about how to proceed with the upcoming course work. They suggested a more specific course overview with weekly task descriptions in advance for the entire term.

Generic task descriptions. All participants, freshmen and juniors alike, felt that they could not always understand the task descriptions immediately and sufficiently. Some of them needed to think them through carefully, while others would have preferred the instructors to explain the tasks in detail in an online meeting. Furthermore, one junior group reported that they had to revise certain task results, mostly the analysis codes, after instructor feedback because they had not correctly understood the initial task description. This additional workload could have been avoided by providing models of completed tasks as worked out examples for the more complex tasks, as suggested by the participants.

Content analysis task description. For the participants who studied educational sciences as a major, content analysis was not included in any lecture on empirical research methods. Therefore, conducting a content analysis was new to them, sometimes a challenge. They found the “crash course” too abstract for them to be able to properly apply this method. They suggested that the instructors give an introduction to content analysis in an online meeting and provide more practical and worked out examples. Moreover, they had not realized that the requirements were low, in the sense of a first insight into qualitative research. Correspondingly, they suggested that the instructors be clearer about this at the beginning of the course. In spite of these difficulties, the participants emphasized a feeling of success stemming from their mostly self-regulated learning activities.

Learning resources. Both junior student groups found the literature recommendations too broad, resulting in an increased effort to identify relevant information from large handbooks of qualitative research methods. They suggested that the instructors focus their recommendations on specific book sections. Regarding the analysis material, one junior student group felt disgusted by certain fake news, and wished the instructors had provided a more pronounced explicit content warning.

Instructor communication and feedback. Three groups (two junior and one freshmen) highly appreciated the communication with the instructors, in particular the instructor feedback regarding the assignments. They perceived communication as timely, clear, constructive, and friendly. They felt their questions were taken seriously, thus they felt relieved and supported. Nevertheless, the participants suggested scheduling obligatory group meetings to discuss participants’ questions.

Peer communication. One freshmen group had found the coordination and division of work within their work groups through online communication somewhat confusing and difficult due to the lack of physical presence. They suggested that the instructors schedule regular online meetings via zoom, at least at the beginning of the term. Complementary to the main zoom room, small group breakout sessions could simulate formal face-to-face meetings and support the coordination within groups.

Assessment. Two freshmen groups felt overwhelmed by the workload during the last four term weeks. This was because the university administration had changed the officially recommended assessment form during the term. As the social distancing and hygiene requirements did not allow large gatherings of students, the initially planned group presentations of results during the term with an additional multiple-choice exam at the end were replaced by a single course paper, a research report expected to be 25–30 pages long and written collaboratively in working groups. This was communicated in week 9, leading to a feeling of pressure to complete the assignment in the remaining 6 weeks of the term. For future online courses, everybody agreed that the assignments need to be clearly defined at the beginning of term. In addition, one freshmen group suggested a schedule with weekly assigned paper section submissions, which would distribute the workload throughout the term and thus reduce the workload at the end of term. However, the junior groups were fine with the actual schedule and would have felt under time pressure if they had had to work even more during the term writing assigned paper sections.

5.2. RQ 2: fake news credibility test performance

Although only N = 38 participants (n1 = 18 freshmen, n2 = 20 junior students) had responded to the fake news credibility test at both data points, the data were normally distributed (Shapiro-Wilk test, p = 0.64 in pretest and p = 0.21 in posttest). This requirement being met, a repeated measures ANOVA was conducted. Test performance improved significantly (F(1, 36) = 62.64, p < .001), i.e., credibility ratings decreased both for freshmen (M1pre = 32.61, SD = 7.00 vs. M1post = 21.67, SD = 8.73) and for junior students (M2pre = 34.10, SD = 6.72 vs. M2post = 22.15, SD = 7.24), a large effect with partial η2 = 0.64. The effect of study experience on test performance was not significant (p = 0.61).
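For readers without SPSS, the same mixed within-between analysis could be run, for example, with the open-source pingouin package. The sketch below assumes a long-format data file and column names of our own choosing; the file itself is hypothetical.

```python
# Sketch of the pre-post mixed ANOVA with pingouin instead of SPSS.
import pandas as pd
import pingouin as pg

# Hypothetical long-format file: one row per participant and measurement,
# with columns id, group ("freshman"/"junior"), time ("pre"/"post"), score.
df = pd.read_csv("credibility_long.csv")

print(pg.normality(df, dv="score", group="time"))   # Shapiro-Wilk per time point
aov = pg.mixed_anova(data=df, dv="score", within="time",
                     subject="id", between="group")
print(aov[["Source", "DF1", "DF2", "F", "p-unc", "np2"]])
# For the "time" effect the paper reports F(1, 36) = 62.64, p < .001, np2 = .64.
```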

5.3. RQ 3: course paper assessment

Over both student groups, the average score on the course paper was M = 85.98 (SD = 9.30), thus good to very good. Junior students received more points for their course papers (very good, M = 89.53, SD = 7.47) than freshmen (good, M = 83.59, SD = 9.70). Because the number of points awarded for the course papers was not normally distributed, a Kruskal-Wallis test was conducted to test this difference, resulting in H(1) = 7.54, p = 0.006. In more detail, in chapters 1, 2, and 4, as well as in the formal aspects, junior students scored significantly higher than freshmen, whereas in chapter 6 freshmen scored significantly higher than juniors. There was no significant difference in chapters 3 and 5. Detailed scores are provided in Table 3.

Table 3.

Comparison of course paper scores.

                           Freshmen (n = 58)   Juniors (n = 39)
Chapter                    M       SD          M       SD       Kruskal-Wallis test
1. Problem statement       8.62    2.25        9.49    1.54     H(1) = 4.24, p = 0.04
2. Theoretical background  15.60   3.87        17.56   3.60     H(1) = 6.56, p = 0.01
3. Methodology             8.94    0.92        8.89    1.10     H(1) = 0.00, p = 0.98
4. Results                 16.90   5.98        20.00   0.00     H(1) = 10.83, p = 0.001
5. Discussion              15.92   2.50        15.90   5.32     H(1) = 1.28, p = 0.26
6. Implications            8.45    1.54        7.69    1.33     H(1) = 6.52, p = 0.01
7. Formal aspects          9.17    1.14        10.00   0.00     H(1) = 22.27, p < 0.001
Total                      83.59   9.70        89.53   7.47     H(1) = 7.54, p = 0.006
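The between-cohort comparisons in Table 3 rely on the Kruskal-Wallis H test, which makes no normality assumption. A minimal sketch of one such comparison follows; the data file and column names are hypothetical, since the per-student chapter scores are not published with this article.

```python
# Minimal sketch of a Kruskal-Wallis comparison as in Table 3; the file
# course_papers.csv and its columns (cohort, total) are hypothetical.
import pandas as pd
from scipy import stats

papers = pd.read_csv("course_papers.csv")
freshmen = papers.loc[papers["cohort"] == "freshman", "total"]
juniors = papers.loc[papers["cohort"] == "junior", "total"]

# With two groups, the H statistic has one degree of freedom.
h, p = stats.kruskal(freshmen, juniors)
print(f"H(1) = {h:.2f}, p = {p:.3f}")
```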

6. Discussion

The first aim of this study was to design an online undergraduate course addressing fake news illiteracy. To achieve this, we started from a literature overview focused on the cognitive processing of fake news, the resulting fake news design, and interventions against fake news. All of these, interventions included, were framed by the cognitive processing components of fake news reception, information acceptance, cognitive integration, and sharing. The interventions against fake news were essentially inoculation and fact checking, the former mainly addressing the reception, acceptance, and sharing levels, the latter the sharing of fake news. Our intervention, the pilot online course, was built on inoculation, which was put into practice by giving students in-depth insight into fake news framing. More specifically, the course participants started from the problem of fake news illiteracy, reviewed communication research literature on framing, developed a framing-centered coding schema, and finally applied it to analyze fake news content. Thus, our online course was designed according to a problem-based approach.

The design aim of this study was complemented by a basic assessment of students' use of the pilot course and their learning outcomes. More specifically, we examined students’ perceptions of the learning environment (RQ 1) and recorded their suggestions for improvement. Further, the learning outcome was operationalized as the pre-post change in a fake news credibility test (RQ 2) and as the academic quality of the course papers submitted at the end of the term (RQ 3).

In spite of a relatively turbulent transition from face-to-face learning with a few online elements to the exclusively online setting, students' perceptions of the pilot course (RQ 1) were predominantly positive. They especially valued the communication and collaboration resources, as well as the problem-solving resources, due to which their collaborative problem-based learning was successful even in the completely remote setting. Given the value of social knowledge construction and its effects on student engagement (Olsen et al., 2020), it is promising to see that these benefits of an online course may also hold during the stressful periods of study in a pandemic. However, especially for freshmen, who are not yet fully familiar with the university environment and have not yet developed online collaboration skills (Fischer et al., 2013), regularly scheduled meetings, as requested by one breakout group, appear necessary to ensure productive learning. Instructor input and feedback were highly appreciated by the course participants, which is in line with prior research stressing the importance of guidance and structure, especially in cooperative learning situations (Garrison & Akyol, 2013; Lee et al., 2017). On a similar note, especially when dealing with new learning content, instructional guidance is a valuable resource for learners to organize and structure their self-regulated learning processes (Kim et al., 2018; Mamun et al., 2020). In remote teaching in general, and specifically in asynchronous learning environments, this may be of the utmost importance for students’ success. Additionally, despite promising results regarding tools that support collaborative learning (e.g., Zheng et al., 2015), there was a notable absence of student feedback on the collaboration tool provided, the wiki in Moodle. This may indicate that students did not see the need to use the tool to its fullest, or simply that they found it useful, but not so useful as to praise it specifically. Regarding the authentic problem description, the foundation of problem-based learning, we received the largest number of reported insecurities and misunderstandings concerning the weekly tasks at hand. Despite the splitting of tasks into bite-sized weekly pieces (Keppell et al., 2001), our participants asked for more detailed task descriptions and model examples in order to avoid confusion. This may be due to the asynchronous nature of our course, which somewhat limited the opportunities to ask comprehension questions on account of the added hurdle of having to email the instructors. More generally, participants attributed their course success to the high degree of freedom in self-regulated learning, which originated from overcoming difficulties self-directedly and with a relatively low amount of instructional support. This is in line with research showing positive relationships between self-regulated learning and academic achievement (e.g., Abar & Loken, 2010; Greene & Azevedo, 2010; Mega et al., 2014).

Regarding participants’ performance in the fake news credibility test (RQ 2), which reflected their conceptual knowledge gain regarding fake news and, more specifically, their understanding of framing, we saw substantial improvements at the end of the course compared to the beginning. Test performance improved significantly in both freshmen and juniors, with no significant effect of prior study experience. This is promising, as it suggests that even at the entry level of higher education, as for our freshmen, an intervention against fake news such as ours may significantly reduce fake news illiteracy at the reception and acceptance levels. This increase in test performance is in line with prior research showing that fake news illiteracy can be addressed by online interventions based on the inoculation approach (e.g., Roozenbeek & van der Linden, 2019). Participants seem to have developed cognitive skills at the reception and acceptance levels of fake news processing (Gijbels et al., 2005; Loyens et al., 2015; Walker & Leary, 2009) or, in more current terms, cognitive scripts (Fischer et al., 2013; Radkowitsch et al., 2020) that enabled them to recognize fake news more accurately by examining its framing.

In terms of academic achievement and the development of research skills (RQ 3), the course can be seen as a success, with participants passing with good to very good final grades on average. Junior students scored significantly higher than freshmen, especially in the presentation of results and in the formal aspects of the course paper. This difference could be explained by juniors' greater familiarity and experience with writing academic papers, which usually develop over the course of students’ university careers (Lea & Street, 1998). Notably, we did not find a corresponding difference between juniors and freshmen in the fake news credibility test.

6.1. Limitations

As this study focused on instructional design driven by previous theories and empirical findings, the associated empirical study had an exploratory character, and the validity of its findings is subject to some limitations. Our research design does not include control groups and therefore does not support causal conclusions. In other words, although substantial pre-post performance changes were found, we cannot claim that these were caused by the course or by its instructional design. Causal conclusions should be drawn in future research employing controlled group comparisons, ideally under laboratory conditions. In the long term, the course quality should be systematically improved through design-based research.

The setting and sample of our study are also limited. Conducting a field study, we examined a single set of courses in the context of an emergency online semester, with possible interdependencies with other courses, online lectures, or small research projects that we could not control for. Our course design was tested only on a narrow sample of educational sciences students, all of them willing to learn and very cooperative. It needs to be tested in different contexts to provide representative results for a larger, more heterogeneous set of students, and for people outside the academic world, who are probably more affected by fake news than university students. Additionally, the small sample size resulting from attrition during the study needs to be addressed in further research. A possible solution would be to use learning analytics methods such as log data analysis (e.g., Lerche & Kiel, 2018) to objectively assess students’ activity without requiring them to take part in an additional study during the semester.
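To illustrate the kind of log data analysis meant here, consider the following sketch. It assumes a generic CSV export of LMS events with hypothetical file and column names; an actual Moodle export would need to be mapped onto this structure.

```python
# Sketch of unobtrusive activity assessment from an LMS log export;
# moodle_log.csv and its columns (time, userid, action) are assumptions.
import pandas as pd

logs = pd.read_csv("moodle_log.csv", parse_dates=["time"])

# Count logged events per student and calendar week as a rough,
# survey-free indicator of course activity across the term.
weekly = (logs.assign(week=logs["time"].dt.isocalendar().week)
              .groupby(["userid", "week"])
              .size()
              .unstack(fill_value=0))
print(weekly)
```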

The basic data collection instruments may have introduced further limitations. The group discussions (RQ 1) were initiated by a few very generic questions that may not have captured important aspects of the learning process. Further research should be based on more precise questions specifically directed at the instructional design elements addressing the cognitive processing of fake news. The fake news credibility test (RQ 2) does not give any insight into the associated cognitive processes, i.e., analytic or intuitive (Schwarz & Jalbert, 2020), or into participants’ evaluation of information quality (e.g., Hahnel et al., 2020). Social desirability may also have inflated the results. Assessing the quality of course papers (RQ 3) carries the known advantages and disadvantages of grades as performance measures (Galla et al., 2019); hence, this may not be the ideal measure of academic achievement and the acquisition of research skills.

6.2. Consequences for educational practice

Reflecting upon the overarching theme of this special issue, higher education teaching and learning in times of crisis, a few conclusions emerge. First, as long as the pandemic is still raging and traditional teaching methods must be limited (Crawford et al., 2020; Murphy, 2020), our study suggests that a problem-based online course not only works even in an emergency online semester, but may also produce substantial learning.

In terms of design improvements, the breakout group discussions yielded several suggestions for future course iterations. In brief, task descriptions need to be as concise and clear as possible and should include worked examples. Further improvements include a more specific course overview featuring weekly task descriptions, and clearly communicated goals at the beginning of the term to set expectations properly. Finally, besides the well-received instructor communication and feedback, peer communication can be enhanced by adding regular, obligatory online plenary meetings. These improvements will be undertaken in the further development of this course concept.

6.3. Consequences for educational research

The theory-driven instructional design was the focus of this study, but the resulting research agenda may be the most important part of this discussion. As outlined in our literature review, fake news research has developed strongly in recent years, mainly positioned in computer science, communication, and social psychology. Educational research on fake news trails somewhat behind, but it is also developing.

Currently, two main research lines stand out, related to interventions based on inoculation and on fact checking. In both areas, comprehensive literature reviews and meta-analyses focusing on fake news reception, acceptance, cognitive integration, and sharing appear necessary and helpful. Inoculation research has a longer history and has therefore made more progress; our study was also centered on inoculation. As suggested in the limitations section, the development and validation of measurement instruments (including those based on learning analytics) that not only assess fake news credibility but also give insight into the underlying cognitive processing would be fruitful for both lines of research. Laboratory and experimental research on the effects of various intervention design components on fake news processing may largely complement the findings currently available. This may include main and interaction effects of inoculation combined with fact checking, of interventions with individual traits, and of interventions with various instructional designs. Fake news research should furthermore involve more diverse participants, including individuals with lower education degrees, higher exposure to fake news, and various political and religious orientations (Pennycook & Rand, 2020).

Research positioned at the cognitive integration level is particularly scarce. Interestingly, conceptual change has been extensively investigated over the last few decades (Amin & Levrini, 2018), yet there are hardly any studies on repairing misconceptions built around cognitively integrated fake news (e.g., Chi, 2013). For a start, this line of research could readily be pursued in academic settings.

Research on online learning, the overarching theme of this special issue, has been conducted at least since the early 1990s and has become increasingly specialized. Our study calls for research on self-regulated learning skills (Kirschner et al., 2006; Nistor, Dascălu, & Trăușan-Matu, 2020; Pedrotti & Nistor, 2019). In this context, an investigation of students' learning strategies (Broadbent & Poon, 2015; Pedrotti & Nistor, 2019) would be of interest. Furthermore, online communication and collaboration scripts are essential in group-based online courses (Valkenburg et al., 2016). A more in-depth look at how far these scripts are developed in different student groups could suggest ways of fostering these skills and improving students’ learning experience in future collaborative online learning environments.

Author contribution

Christian Scheibenzuber: Conceptualization, Data curation, Formal analysis, Investigation, Methodology, Resources, Writing – original draft, Writing – review & editing, Visualization. Sarah Hofer: Conceptualization, Writing – original draft, Writing – review & editing. Nicolae Nistor: Conceptualization, Supervision, Writing – original draft, Writing – review & editing.

Declaration of competing interest

The authors have no conflict of interest.

Acknowledgements

Sarah Isabelle Hofer was supported by a Jacobs Foundation Research Fellowship (number: 2018 128809), https://jacobsfoundation.org/.

References

1. Abar B., Loken E. Self-regulated learning and self-directed study in a pre-college sample. Learning and Individual Differences. 2010;20:25–29. doi: 10.1016/j.lindif.2009.09.002.
2. Allcott H., Gentzkow M. Social media and fake news in the 2016 election. The Journal of Economic Perspectives. 2017;31(2):211–236. doi: 10.1257/jep.31.2.211.
3. Amin T.G., Levrini O., editors. Converging perspectives on conceptual change: Mapping an emerging paradigm in the learning sciences. Routledge; 2018.
4. Barrows H.S., Tamblyn R.M. Problem-based learning: An approach to medical education. Springer; 1980.
5. Basol M., Roozenbeek J., van der Linden S. Good news about Bad News: Gamified inoculation boosts confidence and cognitive immunity against fake news. Journal of Cognition. 2020;3(1):2. doi: 10.5334/joc.91.
6. Broadbent J., Poon W.L. Self-regulated learning strategies & academic achievement in online higher education learning environments: A systematic review. The Internet and Higher Education. 2015;27:1–13. doi: 10.1016/j.iheduc.2015.04.007.
7. Cadwalladr C., Graham-Harrison E. Revealed: 50 million Facebook profiles harvested for Cambridge Analytica in major data breach. The Guardian. 2018, March 17. https://www.theguardian.com/news/2018/mar/17/cambridge-analytica-facebook-influence-us-election
8. Cheek N.N., Blackman S.F., Pronin E. Seeing the subjective as objective: People perceive the taste of those they disagree with as biased and wrong. Journal of Behavioral Decision Making. 2020 (in press).
9. Chi M.T.H. Two kinds and four sub-types of misconceived knowledge, ways to change it, and the learning outcomes. In: Vosniadou S., editor. International handbook of research on conceptual change. Routledge; 2013. pp. 49–70.
10. Chinn C.A., Brewer W.F. The role of anomalous data in knowledge acquisition: A theoretical framework and implications for science instruction. Review of Educational Research. 1993;63:1–49. doi: 10.3102/00346543063001001.
11. Chung M., Kim N. When I learn the news is false: How fact-checking information stems the spread of fake news via third-person perception. Human Communication Research. 2020;47(1):1–24. doi: 10.1093/hcr/hqaa010.
12. Crawford J., Butler-Henderson K., Rudolph J., Malkawi B., Glowatz M., Burton R., Magni P.A., Lam S. COVID-19: 20 countries' higher education intra-period digital pedagogy responses. Journal of Applied Learning & Teaching. 2020;3(1). doi: 10.37074/jalt.2020.3.1.7.
13. Creswell J.W. Qualitative inquiry & research design: Choosing among five approaches. Sage; 2007.
14. De keersmaecker J., Roets A. ‘Fake news’: Incorrect, but hard to correct. The role of cognitive ability on the impact of false information on social impressions. Intelligence. 2017;65:107–110. doi: 10.1016/j.intell.2017.10.005.
15. Delen E., Liew J. The use of interactive environments to promote self-regulation in online learning: A literature review. European Journal of Contemporary Education. 2016;15(1). doi: 10.13187/ejced.2016.15.24.
16. DePaulo B.M., Charlton K., Cooper H., Lindsay J.J., Muhlenbruck L. The accuracy-confidence correlation in the detection of deception. Personality and Social Psychology Review. 1997;1(4):346–357. doi: 10.1207/s15327957pspr0104_5.
17. diSessa A.A. Knowledge in pieces: An evolving framework for understanding knowing and learning. In: Amin T.G., Levrini O., editors. Converging perspectives on conceptual change: Mapping an emerging paradigm in the learning sciences. Routledge; 2018. pp. 9–16.
18. Dolmans D.H.J.M., Loyens S.M.M., Marcq H., Gijbels D. Deep and surface learning in problem-based learning: A review of the literature. Advances in Health Sciences Education: Theory and Practice. 2016;21(5):1087–1112. doi: 10.1007/s10459-015-9645-6.
19. Duncan M.J., Smith M., Cook K. Implementing online problem based learning (PBL) in postgraduates new to both online learning and PBL: An example from strength and conditioning. Journal of Hospitality, Leisure, Sports and Tourism Education. 2013;12(1):79–84. doi: 10.1016/j.jhlste.2012.11.004.
20. Entman R.M. Framing: Toward clarification of a fractured paradigm. Journal of Communication. 1993;43(4):51–58. doi: 10.1111/j.1460-2466.1993.tb01304.x.
21. Erichsen K., Schrock D., Dowd-Arrow B., Dignam P. Bitchifying Hillary: Trump supporters' vilification of Clinton during the 2016 presidential election. Social Currents. 2020;7(6):526–542. doi: 10.1177/2329496520941022.
22. EUvsDiSiNFO. Vilifying Germany, wooing Germany. 2021. Retrieved from https://euvsdisinfo.eu/villifying-germany-wooing-germany/
23. Faul F., Erdfelder E., Buchner A., Lang A.G. Statistical power analyses using G*Power 3.1: Tests for correlation and regression analyses. Behavior Research Methods. 2009;41(4):1149–1160. doi: 10.3758/BRM.41.4.1149.
24. Fazio L.K., Brashier N.M., Keith Payne B., Marsh E.J. Knowledge does not protect against illusory truth. Journal of Experimental Psychology: General. 2015;144(5):993–1002. doi: 10.1037/xge0000098.
25. Ferreira M.M., Trudel A.R. The impact of problem-based learning (PBL) on students' attitudes towards science, problem-solving skills, and sense of community in the classroom. Journal of Classroom Interaction. 2012;47(1):23–30.
26. Festinger L. A theory of cognitive dissonance. Stanford University Press; 1957.
27. Fischer F., Kollar I., Stegmann K., Wecker C. Toward a script theory of guidance in computer-supported collaborative learning. Educational Psychologist. 2013;48(1):56–66. doi: 10.1080/00461520.2012.748005.
28. Frankfurt H.G. On bullshit. Princeton University Press; 2009.
29. Gallagher S.A. Problem-based learning: Where did it come from, what does it do, and where is it going? Journal for the Education of the Gifted. 1997;20(4):332–362. doi: 10.1177/016235329702000402.
30. Galla B.M., Shulman E.P., Plummer B.D., Gardner M., Hutt S.J., Goyer J.P., Duckworth A.L. Why high school grades are better predictors of on-time college graduation than are admissions test scores: The roles of self-regulation and cognitive ability. American Educational Research Journal. 2019;56(6):2077–2115. doi: 10.3102/0002831219843292.
31. Garrison D.R., Akyol Z. The community of inquiry theoretical framework. In: Moore M.G., editor. Handbook of distance education. Routledge; 2013. pp. 104–119.
32. Gick M.L., Holyoak K.J. Schema induction and analogical transfer. Cognitive Psychology. 1983;15:1–38. doi: 10.1016/0010-0285(83)90002-6.
33. Gijbels D., Dochy F., van den Bossche P., Segers M. Effects of problem-based learning: A meta-analysis from the angle of assessment. Review of Educational Research. 2005;75(1):27–61. doi: 10.3102/00346543075001027.
34. Goffman E. Frame analysis: An essay on the organization of experience. Harvard University Press; 1974.
35. Goldwater M.B., Schalk L. Relational categories as a bridge between cognitive and educational research. Psychological Bulletin. 2016;142(7):729–775. doi: 10.1037/bul0000043.
36. Greene J.A., Azevedo R. The measurement of learners' self-regulated cognitive and metacognitive processes while using computer-based learning environments. Educational Psychologist. 2010;45:203–209. doi: 10.1080/00461520.2010.515935.
37. Hahnel C., Eichmann B., Goldhammer F. Evaluation of online information in university students: Development and scaling of the screening instrument EVON. Frontiers in Psychology. 2020;11. doi: 10.3389/fpsyg.2020.562128.
38. Hasher L., Goldstein D., Toppino T. Frequency and the conference of referential validity. Journal of Verbal Learning and Verbal Behavior. 1977;16(1):107–112. doi: 10.1016/S0022-5371(77)80012-1.
39. Hmelo-Silver C.E. Problem-based learning: What and how do students learn? Educational Psychology Review. 2004;16(3):235–266. doi: 10.1023/B:EDPR.0000034022.16470.f3.
40. Hmelo-Silver C.E., Duncan R.G., Chinn C.A. Scaffolding and achievement in problem-based and inquiry learning: A response to Kirschner, Sweller, and Clark (2006). Educational Psychologist. 2007;42(2):99–107. doi: 10.1080/00461520701263368.
41. Hua J., Shaw R. Corona virus (Covid-19) “infodemic” and emerging issues through a data lens: The case of China. International Journal of Environmental Research and Public Health. 2020;17(7):2309. doi: 10.3390/ijerph17072309.
42. Jaffé M.E., Greifeneder R. Can that be true or is it just fake news? New perspectives on the negativity bias in judgements of truth. In: Greifeneder R., Jaffé M.E., Newman E.J., Schwarz N., editors. The psychology of fake news: Accepting, sharing, and correcting misinformation. Routledge; 2020. pp. 111–126.
43. Jones-Jang S.M., Mortensen T., Liu J. Does media literacy help identification of fake news? Information literacy helps, but other literacies don't. American Behavioral Scientist. 2021;65(2):371–388. doi: 10.1177/0002764219869406.
44. Kalyuga S., Chandler P., Tuovinen J., Sweller J. When problem solving is superior to studying worked examples. Journal of Educational Psychology. 2001;93(3):579–588. doi: 10.1037/0022-0663.93.3.579.
45. Kapur M. Productive failure in learning math. Cognitive Science. 2014;38:1008–1022. doi: 10.1111/cogs.12107.
46. Keppell M.J., Kennedy G.E., Elliott K.A., Harris P.J. Transforming traditional curricula: Enhancing medical education through problem-based learning, multimedia and web-based resources. Interactive Multimedia Electronic Journal of Computer-Enhanced Learning. 2001;3(1):1–6.
47. Kim N.J., Belland B.R., Walker A.E. Effectiveness of computer-based scaffolding in the context of problem-based learning for STEM education: Bayesian meta-analysis. Educational Psychology Review. 2018;30(2):397–429. doi: 10.1007/s10648-017-9419-1.
48. Kirschner P.A., Sweller J., Clark R.E. Why minimal guidance during instruction does not work: An analysis of the failure of constructivist, discovery, problem-based, experiential, and inquiry-based teaching. Educational Psychologist. 2006;41(2):75–86. doi: 10.1207/s15326985ep4102_1.
49. Koohikamali M., Sidorova A. Information re-sharing on social network sites in the age of fake news. Informing Science: The International Journal of an Emerging Transdiscipline. 2017;20:215–235. Retrieved from http://www.informingscience.org/Publications/3871
50. Lazer D.M.J., Baum M.A., Benkler Y., Berinsky A.J., Greenhill K.M., Menczer F., Zittrain J.L. The science of fake news. Science. 2018;359(6380):1094–1096. doi: 10.1126/science.aao2998.
51. Lea M.R., Street B.V. Student writing in higher education: An academic literacies approach. Studies in Higher Education. 1998;23(2):157–172. doi: 10.1080/03075079812331380364.
52. Lee L., Lajoie S.P., Poitras E.G., Nkangu M., Doleck T. Co-regulation and knowledge construction in an online synchronous problem based learning setting. Education and Information Technologies. 2017;22(4):1623–1650. doi: 10.1007/s10639-016-9509-6.
53. Lerche T., Kiel E. Predicting student achievement in learning management systems by log data analysis. Computers in Human Behavior. 2018;89:367–372. doi: 10.1016/j.chb.2018.06.015.
54. Lewandowsky S., Ecker U.K.H., Cook J. Beyond misinformation: Understanding and coping with the “post-truth” era. Journal of Applied Research in Memory and Cognition. 2017;6(4):353–369. doi: 10.1016/j.jarmac.2017.07.008.
55. Loyens S.M., Jones S.H., Mikkers J., van Gog T. Problem-based learning as a facilitator of conceptual change. Learning and Instruction. 2015;38:34–42. doi: 10.1016/j.learninstruc.2015.03.002.
56. Maftei A., Grigore A.N. Online moral disengagement and fake news: Exploring the links with cyber-aggression, cyber-victimization, gender, and age. In: Boldea I., Sigmirean C., Buda D., editors. Paths of communication in postmodernity. Arhipelag XXI; 2020. pp. 287–291.
57. Mamun M.A.A., Lawrie G., Wright T. Instructional design of scaffolded online learning modules for self-directed and inquiry-based learning environments. Computers & Education. 2020;144. doi: 10.1016/j.compedu.2019.103695.
58. Marsh E.J., Stanley M.L. False beliefs: Byproducts of an adaptive knowledge base? In: Amin T.G., Levrini O., editors. Converging perspectives on conceptual change: Mapping an emerging paradigm in the learning sciences. Routledge; 2020. pp. 127–141.
59. McGrath A. Dealing with dissonance: A review of cognitive dissonance reduction. Social and Personality Psychology Compass. 2017;11(12). doi: 10.1111/spc3.12362.
60. McGrew S. Learning to evaluate: An intervention in civic online reasoning. Computers & Education. 2020;145. doi: 10.1016/j.compedu.2019.103711.
61. McGuire W.J. Inducing resistance to persuasion: Some contemporary approaches. Advances in Experimental Social Psychology. 1964;1:191–229. doi: 10.1016/S0065-2601(08)60052-0.
62. McLafferty I. Focus group interviews as a data collecting strategy. Journal of Advanced Nursing. 2004;48(2):187–194. doi: 10.1111/j.1365-2648.2004.03186.x.
63. Mega C., Ronconi L., De Beni R. What makes a good student? How emotions, self-regulated learning, and motivation contribute to academic achievement. Journal of Educational Psychology. 2014;106(1):121–131. doi: 10.1037/a0033546.
64. Morgan D.L. Focus groups as qualitative research. 2nd ed. Sage; 1997.
65. Murphy M.P.A. COVID-19 and emergency eLearning: Consequences of the securitization of higher education for post-pandemic pedagogy. Contemporary Security Policy. 2020;41(3):492–505. doi: 10.1080/13523260.2020.1761749.
66. Nguyen C.T. Echo chambers and epistemic bubbles. Episteme. 2020;17(2):141–161. doi: 10.1017/epi.2018.32.
67. Nickerson R.S. Confirmation bias: A ubiquitous phenomenon in many guises. Review of General Psychology. 1998;2(2):175–220. doi: 10.1037/1089-2680.2.2.175.
68. Nistor N. Problem-based virtual seminars: Concept and evaluation. In: Nistor N., English L., Wheeler S., editors. Towards the virtual university – International on-line learning perspectives. Information Age; 2003. pp. 175–186.
69. Nistor N., Dascălu M., Trăușan-Matu S. Joining informal learning in online knowledge communities and formal learning in higher education: Instructional design and evaluation of a blended-learning seminar with learning analytics support. International Journal of Interaction Design & Architecture(s). 2020;43:110–127. http://www.mifav.uniroma2.it/inevent/events/idea2010/doc/43_6.pdf
70. Nistor N., Neubauer K. From participation to dropout: Quantitative participation patterns in online university courses. Computers & Education. 2010;55(2):663–672. doi: 10.1016/j.compedu.2010.02.026.
71. Olsen J.K., Faucon L., Dillenbourg P. Transferring interactive activities in large lectures from face-to-face to online settings. Information and Learning Sciences. 2020;121(7/8):559–567. doi: 10.1108/ILS-04-2020-0109.
72. Osborne C.L. Programming to promote information literacy in the era of fake news. International Journal of Legal Information. 2018;46(2):101–109. doi: 10.1017/jli.2018.21.
73. Oswald M. Strategisches Framing: Eine Einführung [Strategic framing: An introduction]. Springer; 2019.
74. Oyserman D., Dawson A. Your fake news, our facts: Identity-based motivation shapes what we believe, share, and accept. In: Greifeneder R., Jaffé M.E., Newman E.J., Schwarz N., editors. The psychology of fake news: Accepting, sharing, and correcting misinformation. Routledge; 2020. pp. 71–86.
75. Papageorgis D., McGuire W.J. The generality of immunity to persuasion produced by pre-exposure to weakened counterarguments. Journal of Abnormal and Social Psychology. 1961;62(3):475–481. doi: 10.1037/h0048430.
76. Pariser E. The filter bubble: How the new personalized web is changing what we read and how we think. Penguin; 2016.
77. Park C.S. Applying “negativity bias” to Twitter: Negative news on Twitter, emotions, and political learning. Journal of Information Technology & Politics. 2015;12(4):342–359. doi: 10.1080/19331681.2015.1100225.
78. Pedrotti M., Nistor N. How students fail to self-regulate their online learning experience. In: Scheffel M., Broisin J., Pammer-Schindler V., Ioannou A., Schneider J., editors. Transforming learning with meaningful technologies: 14th European Conference on Technology-Enhanced Learning, EC-TEL 2019, Delft, The Netherlands, September 16–19, 2019, Proceedings. Springer; 2019. pp. 377–385.
79. Pennycook G., Rand D.G. Lazy, not biased: Susceptibility to partisan fake news is better explained by lack of reasoning than by motivated reasoning. Cognition. 2019;188:39–50. doi: 10.1016/j.cognition.2018.06.011.
80. Pennycook G., Rand D.G. Who falls for fake news? The roles of bullshit receptivity, overclaiming, familiarity, and analytic thinking. Journal of Personality. 2020;88(2):185–200. doi: 10.1111/jopy.12476.
81. Peters M.A. Education in a post-truth world. Educational Philosophy and Theory. 2017;49(6):563–566. doi: 10.1080/00131857.2016.1264114.
82. Petrovic T., Kennedy G. How often do students use a learning management system in an on-campus, problem-based learning curriculum? In: Proceedings of ASCILITE 2005 – The Australasian Society for Computers in Learning in Tertiary Education; 2005. pp. 535–538. http://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.92.229&rep=rep1&type=pdf
83. Polanco R., Calderon P., Delgado F. Effects of a problem-based learning program on engineering students' academic achievements in a Mexican university. Innovations in Education & Teaching International. 2004;41(2):145–155. doi: 10.1080/1470329042000208675.
84. Potter W.J. Media literacy. Sage; 2018.
85. Radkowitsch A., Vogel F., Fischer F. Good for learning, bad for motivation? A meta-analysis on the effects of computer-supported collaboration scripts. International Journal of Computer-Supported Collaborative Learning. 2020;15(1):5–47. doi: 10.1007/s11412-020-09316-4.
86. Revez J., Corujo L. Librarians against fake news: A systematic literature review of library practices (Jan. 2018–Sept. 2020). The Journal of Academic Librarianship. 2021;47(2):102304. doi: 10.1016/j.acalib.2020.102304.
87. Roelle J., Berthold K. The expertise reversal effect in prompting focused processing of instructional explanations. Instructional Science. 2013;41(4):635–656. doi: 10.1007/s11251-012-9247-0.
88. Roozenbeek J., van der Linden S. Fake news game confers psychological resistance against online misinformation. Palgrave Communications. 2019;5(1). doi: 10.1057/s41599-019-0279-9.
89. Ross L., Ward A. Naive realism in everyday life: Implications for social conflict and misunderstanding. In: Reed E.S., Turiel E., Brown T., editors. Values and knowledge (The Jean Piaget symposium series). Lawrence Erlbaum Associates; 1996. pp. 103–135.
90. Rubin V.L., Conroy N. Discerning truth from deception: Human judgments and automation efforts. First Monday. 2012;17(3). doi: 10.5210/fm.v17i3.3933.
91. Rumelhart D.E. Schemata: The building blocks of cognition. In: Spiro R.J., editor. Theoretical issues in reading comprehension. Lawrence Erlbaum Associates; 1980.
92. Sanchez E., Garcia-Rodicio H., Acuna S.R. Are instructional explanations more effective in the context of an impasse? Instructional Science. 2009;37:537–563. doi: 10.1007/s11251-008-9074-5.
93. Scheufele D.A. Framing as a theory of media effects. Journal of Communication. 1999;49(1):103–122. doi: 10.1111/j.1460-2466.1999.tb02784.x.
94. Schmidt H.G., Loyens S.M.M., van Gog T., Paas F. Problem-based learning is compatible with human cognitive architecture: Commentary on Kirschner, Sweller, and Clark (2006). Educational Psychologist. 2007;42(2):91–97. doi: 10.1080/00461520701263350.
95. Scheibenzuber C., Nistor N. Media literacy training against fake news in online media. In: Scheffel M., Broisin J., Pammer-Schindler V., Ioannou A., Schneider J., editors. Transforming learning with meaningful technologies: 14th European Conference on Technology-Enhanced Learning, EC-TEL 2019, Delft, The Netherlands, September 16–19, 2019, Proceedings. Springer; 2019. pp. 688–691.
96. Schwarz N., Jalbert M. When fake news feels true. In: Greifeneder R., Jaffé M.E., Newman E.J., Schwarz N., editors. The psychology of fake news: Accepting, sharing, and correcting misinformation. Routledge; 2020. pp. 71–86.
97. Şendağ S., Ferhan Odabaşı H. Effects of an online problem based learning course on content knowledge acquisition and critical thinking skills. Computers & Education. 2009;53(1):132–141. doi: 10.1016/j.compedu.2009.01.008.
98. Sinatra G.M., Pintrich P.R. Intentional conceptual change. Lawrence Erlbaum; 2003.
99. Smith J.P., III, diSessa A.A., Roschelle J. Misconceptions reconceived: A constructivist analysis of knowledge in transition. The Journal of the Learning Sciences. 1994;3(2):115–163. doi: 10.1207/s15327809jls0302_1.
100. Straus S.G., McGrath J.E. Does the medium matter? The interaction of task type and technology on group performance and member reactions. Journal of Applied Psychology. 1994;79(1):87–97. doi: 10.1037/0021-9010.79.1.87.
101. Swan K. Building learning communities in online courses: The importance of interaction. Education, Communication & Information. 2002;2(1):23–49. doi: 10.1080/1463631022000005016.
102. Tosun C., Taşkesenligil Y. Using the MOODLE learning management system in problem based learning method. International Online Journal of Educational Sciences. 2011;3(3). http://www.acarindex.com/dosyalar/makale/acarindex-1423904361.pdf
103. Tsai C.W., Chiang Y.C. Research trends in problem-based learning (PBL) research in e-learning and online education environments: A review of publications in SSCI-indexed journals from 2004 to 2012. British Journal of Educational Technology. 2013;44(6):185–190. doi: 10.1111/bjet.12038.
104. Tuchman G. Making news: A study in the construction of reality. Free Press; 1978.
105. Valkenburg P.M., Peter J., Walther J.B. Media effects: Theory and research. Annual Review of Psychology. 2016;67:315–338. doi: 10.1146/annurev-psych-122414-033608.
106. Van Swol L. Truth bias. In: Levine T., editor. Encyclopedia of deception. Vol. 1. Sage; 2014. pp. 904–906.
107. van der Linden S., Leiserowitz A., Rosenthal S., Maibach E. Inoculating the public against misinformation about climate change. Global Challenges. 2017;1(2). doi: 10.1002/gch2.201600008.
108. Vogel I., Jiang P. Fake news detection with the new German dataset “GermanFakeNC”. In: Doucet A., Isaac A., Golub K., Aalberg T., Jatowt A., editors. Digital libraries for open knowledge: TPDL 2019 (Lecture Notes in Computer Science, Vol. 11799). Springer; 2019.
109. Vosoughi S., Roy D., Aral S. The spread of true and false news online. Science. 2018;359(6380):1146–1151. doi: 10.1126/science.aap9559.
110. Walker A., Leary H. A problem-based learning meta analysis: Differences across problem types, implementation types, disciplines, and assessment levels. Interdisciplinary Journal of Problem-based Learning. 2009;3(1):12–43. doi: 10.7771/1541-5015.1061.
111. Weeks B.E., Lane D.S., Kim D.H., Lee S.S., Kwak N. Incidental exposure, selective exposure, and political information sharing: Integrating online exposure patterns and expression on social media. Journal of Computer-Mediated Communication. 2017;22(6):363–379. doi: 10.1111/jcc4.12199.
112. Yaqub W., Kakhidze O., Brockman M.L., Memon N., Patil S. Effects of credibility indicators on social media news sharing intent. In: CHI '20: Proceedings of the 2020 CHI Conference on Human Factors in Computing Systems; 2020. pp. 1–14.
113. Yew E.H.J., Goh K. Problem-based learning: An overview of its process and impact on learning. Health Professions Education. 2016;2(2):75–79. doi: 10.1016/j.hpe.2016.01.004.
114. Zheng B., Niiya M., Warschauer M. Wikis and collaborative learning in higher education. Technology, Pedagogy and Education. 2015;24(3):357–374. doi: 10.1080/1475939X.2014.948041.
