Computers & Education. 2020 Oct 7;163:104041. doi: 10.1016/j.compedu.2020.104041

Learning analytics dashboards for adaptive support in face-to-face collaborative argumentation

Jeongyun Han a, Kwan Hoon Kim b, Wonjong Rhee a, Young Hoan Cho b
PMCID: PMC7539901  PMID: 33046948

Abstract

Despite the potential of learning analytics for personalized learning, it has seldom been used to support collaborative learning, particularly in face-to-face (F2F) contexts. This study used learning analytics to develop a dashboard system that provides adaptive support for F2F collaborative argumentation (FCA). Two dashboards were developed, one for students and one for instructors, which enabled students to monitor their FCA process through adaptive feedback and helped the instructor provide adaptive support at the right time. The effectiveness of the dashboards was examined in a university class with 88 students (56 females, 32 males) for 4 weeks. The dashboards significantly improved the FCA process and outcomes, encouraging students to actively participate in FCA and create high-quality arguments. Students had a positive attitude toward the dashboard system and perceived it as useful and easy to use. These findings indicate the usefulness of learning analytics dashboards in improving collaborative learning through adaptive feedback and support. Suggestions are provided on how to design dashboards that use learning analytics for adaptive support in F2F learning contexts.

Keywords: Adaptive instruction, Collaborative learning, Argumentation, Learning analytics, Dashboard

Highlights

  • Dashboards were developed to support collaborative argumentation in face-to-face contexts.

  • Based on learning analytics, the dashboards provided timely feedback and adaptive support.

  • Dashboards improved collaborative argumentation process and group performance.

  • Dashboards positively influenced situational interest and perceived learning outcomes.

  • This study provides implications for developing a dashboard system to support face-to-face learning.

1. Introduction

Collaborative argumentation is a group-based activity wherein students work together to produce an integrated group opinion regarding a discussion topic (Noroozi, Weinberger, Biemans, Mulder, & Chizari, 2012; Nussbaum, 2008). Because the topic usually involves a complex, ill-structured problem without a single solution, group members are likely to hold different opinions in the initial phase of collaborative argumentation. As the activity progresses, initial opinions are modified, synthesized, or rejected through a group discussion in which members share different perspectives and negotiate the meanings of opinions and evidence (Cho & Jonassen, 2002; Noroozi et al., 2012). These activities are effective in improving students' understanding of a discussion topic and changing their naïve beliefs or misconceptions (Andriessen, Baker, & Suthers, 2003; Jonassen & Cho, 2011; Nussbaum, 2008). Collaborative argumentation benefits meaningful learning because students gain new insights and knowledge while comparing, evaluating, and synthesizing different opinions based on evidence and rationale. Furthermore, collaborative argumentation helps develop the competencies to argue and collaborate with others, which are essential for future society (Noroozi et al., 2012). In a constantly changing world, students are likely to meet unprecedented challenges (e.g., COVID-19) that require cooperation across multiple disciplines. Collaborative argumentation skills are essential for critically analyzing a complex problem and jointly generating creative solutions.

Although collaborative argumentation plays an important role in building knowledge and improving key competencies, students may have difficulty in jointly generating a group argument. They may lack the competencies to take others' perspectives, support their opinions with evidence, make counterarguments, and integrate different perspectives, which may interfere with learning from collaborative argumentation (Andriessen et al., 2003; Clark, D'Angelo, & Menekse, 2009; Nussbaum & Schraw, 2007). Furthermore, not all students may actively participate in collaborative argumentation because of low motivation or social presence (Kwon, Liu, & Johnson, 2014; Verdú & Sanuy, 2014). To enhance collaborative argumentation, previous studies investigated ways to provide instructional support and feedback on participation, interaction patterns, and arguments (Iandoli, Quinto, De Liddo, & Buckingham Shum, 2014; Jeong & Joung, 2007; Van Amelsvoort, Andriessen, & Kanselaar, 2007; Zheng, Xu, Li, & Su, 2018). In computer-supported collaborative argumentation, a variety of data can be automatically collected and visualized in order to encourage students to monitor and regulate their argumentation activities (Buckingham-Shum, 2003). Recently, a growing number of studies have focused on learning analytics, which collects, analyzes, and reports data about students and their contexts, in order to adaptively support collaborative learning. A learning analytics dashboard can help visualize data from collaborative argumentation and provide students with feedback at the right time (Duval, 2011; Few, 2013; Verbert et al., 2014).

However, previous studies have limitations in developing learning analytics dashboards for collaborative argumentation. Existing dashboards lack the function of adaptively supporting students or groups because they merely visualize data without considering the phases of collaborative argumentation and group performance levels (e.g., Van Amelsvoort et al., 2007; Van Leeuwen, Janssen, Erkens, & Brekelmans, 2014). In addition, previous studies mainly focused on online learning contexts, wherein students asynchronously interact with each other, rather than face-to-face (F2F) contexts. Although a growing number of classes use laptop computers and smart devices to support collaborative learning in F2F settings, research on dashboards that support F2F collaborative argumentation (FCA) is lacking (e.g., Iandoli et al., 2014; Zheng et al., 2018). Furthermore, previous studies did not pay enough attention to instructors' role in dashboard design. In FCA, students can receive adaptive support not only from their own dashboard but also from an instructor who monitors student activities through a dashboard. In large classes, instructors' dashboards help them identify groups that need instructional support and recognize those groups' challenges (Van Leeuwen et al., 2014).

To address these limitations, this research develops and evaluates a dashboard system that adaptively supports collaborative argumentation in F2F contexts. Dashboards were developed for both students and instructors so that students receive adaptive feedback and support on collaborative argumentation from an instructor as well as from their own dashboard. The effectiveness of the dashboard system is investigated in a large university class in terms of group process and achievement, individual learning, and students' perception of the dashboard system. The research questions are as follows:

  • What are the characteristics of the dashboard system that adaptively supports face-to-face collaborative argumentation?

  • What are the effects of the dashboard system on group process and achievement?

  • What are the effects of the dashboard system on individual learning?

  • What are students' perceptions of the dashboard system regarding usefulness, usability, and student attitudes?

2. Literature review

2.1. Students’ difficulties in collaborative argumentation

Collaborative argumentation is effective when all group members actively participate in sharing, evaluating, and integrating different opinions based on evidence and rationale. In collaborative argumentation, students and their groups may face several difficulties that hinder the effectiveness of this group activity. First, group members tend to face difficulties in presenting various opinions on the discussion topic. Their opinions can be similar to each other from the beginning of the group discussion (Clark et al., 2009), or become biased in accordance with the opinions of hard-liners who make strong assertions (Jeong & Joung, 2007). Diversity of opinions is beneficial for productive group discussion in collaborative argumentation. Group discussion can be broad and deep when group members hold diverse opinions, enabling them to gain more insight and find better solutions by exploring various ideas (Clark et al., 2009; Jonassen & Cho, 2011; Noroozi et al., 2012). In contrast, when a group deals only with biased opinions, it is difficult to reach a productive conclusion that promotes a comprehensive understanding of the discussion topic at the group level (i.e., beyond the individual level).

Second, students encounter challenges regarding passive participation and interaction in collaborative argumentation. They may be passive in or unfamiliar with presenting their opinions (Noroozi et al., 2012), and make less effort in a group task when compared to an individual one (Karau & Williams, 1993; Kerr & Bruun, 1983). However, in collaborative argumentation, students need to negotiate and persuade other members to integrate their diverse opinions into one group-level conclusion (Clark et al., 2009; Gu, Shao, Guo, & Lim, 2015; Noroozi et al., 2012). Thus, all group members should actively participate in the group process and interact with peers to understand different perspectives and share their knowledge on the topic (Cho & Lee, 2013; Weinberger, Stegmann, Fischer, & Mandl, 2007). Passive participation and interaction can decrease others’ motivation, and consequently hinder group performance.

Last, students have difficulty creating high-quality arguments that include the essential elements of argumentation (Jeong & Joung, 2007). Groups engaging in collaborative argumentation should produce an integrated conclusion (Jeong & Joung, 2007) and pay attention to argumentation elements such as claims, reasons, evidence, and counterarguments (Jonassen & Cho, 2011; Toulmin, 2003). However, students often have difficulty considering these elements in their arguments. For instance, they are unlikely to present counterarguments because they think their arguments become weak or less persuasive when mentioning perspectives that contrast with their own, even though rebutting such perspectives can make an argument more convincing (Andriessen et al., 2003; Brooks & Jeong, 2006; Toulmin, 2003).

To overcome these challenges, students need adaptive support corresponding to each challenge. In an F2F classroom setting, instructors provide students with adaptive feedback and support when they need help. However, in a large class where an instructor supports multiple groups of students, it is difficult to provide the right support to the right group at the right time (Magnisalis, Demetriadis, & Karakostas, 2011; Walker, Rummel, & Koedinger, 2011). To resolve this issue and foster the effectiveness of collaborative argumentation, educators should adopt learning analytics, which is helpful in monitoring many students simultaneously and providing adaptive support when needed.

2.2. Learning analytics dashboard for adaptive support

Learning analytics refers to the process of measuring, collecting, analyzing, and reporting data about students' learning in order to understand and improve it (Pardo, 2014). For this purpose, a generic set of data utilization methods, including prediction, visualization, and structure discovery algorithms, is used to uncover meaningful patterns in educational data (Siemens & Long, 2011). Learning analytics can be used to adapt and personalize students' learning with customized educational resources and support that reflect their current learning state (Soller, Martínez-Monés, Jermann, & Muehlenbrock, 2005). It can also help students achieve learning goals by giving them tools for monitoring their learning process and determining what actions produce better outcomes (Viberg, Hatakka, Bälter, & Mavroudi, 2018). Students' motivation, achievement, and confidence in learning can be enhanced when they receive timely information about their learning progress and suggestions on how to improve their learning (Siemens & Long, 2011).

A dashboard system using learning analytics can be effective in providing adaptive support to address the difficulties in FCA contexts. Typically, dashboards have a well-organized display consisting of visual elements such as graphs, charts, and alert mechanisms with color codes, which enable users to intuitively capture critical information required in decision-making processes (Few, 2013; Verbert et al., 2014). In educational contexts, dashboards are used to increase students' awareness about their learning and help them achieve their learning goals (Kim, Jo, & Park, 2016). By extracting and visualizing key aspects of learning from data, learning analytics dashboards help students efficiently understand not only their own learning status but also the status of other group members (Teasley, 2017; Verbert et al., 2014). This is beneficial in a group-based activity where group members need to be aware of each other's learning process for collaboration. By visualizing how much each group member contributes to the collaboration process, dashboards can increase students' awareness of group members' participation, facilitating equal and active participation in group activities (e.g., Janssen, Erkens, & Kirschner, 2011; Sun & Vassileva, 2006). Enhanced awareness of group processes also encourages students to regulate their learning, improving their performance and quality of collaboration (e.g., Iandoli et al., 2014; Kim et al., 2016). Furthermore, collaboration scripts added to the dashboard can help students interpret visual feedback and decide how to improve group collaboration (e.g., Dillenbourg & Jermann, 2007). The scripts can be adapted depending on the target group's learning status, and specific guidance can be provided on what activities are needed to improve current learning processes.

Learning analytics dashboards can also be used for instructors. By referring to the student information visualized in dashboards, instructors can monitor the overall learning status of the class. Dashboards can track students' learning progress and reveal the details or patterns therein, which instructors may not initially notice (Verbert et al., 2014). This helps them better understand students' learning, identify difficulties students face, and support them with appropriate instructional feedback (e.g., Mazza & Dimitrova, 2007; Maldonado, Kay, Yacef, & Schwendimann, 2012; Van Leeuwen et al., 2014, 2015). Instructor dashboards can be used to lower the effort required for managing the class and augment instructors’ roles in a learning context (Siemens & Long, 2011).

Despite the potential of learning analytics dashboards, previous studies have limitations in adaptively supporting FCA. In previous studies, dashboards focused on giving fixed information to students rather than adaptively supporting them. The prime function of most existing dashboards is to display organized information about learning status without instructional feedback and support to improve learning. Research is lacking on dashboards that automatically generate feedback depending on learning processes (Sedrakyan, Malmberg, Verbert, Järvelä, & Kirschner, 2020). In collaborative learning contexts, students need different levels of support depending on the learning phase (Brusilovsky & Peylo, 2003). However, most dashboards merely display predetermined information without adjusting support levels as the class progresses. This can have negative results or limit the effects of dashboards because of excessive feedback (Dillenbourg, 2002) and inappropriate timing (Rummel, Walker, & Aleven, 2016). Particularly in F2F learning, which occurs in a limited time, a dashboard should provide differentiated levels of learning support according to students’ learning progress (Roberts, Howell, & Seaman, 2017).

Second, previous studies seldom investigated how learning analytics dashboards supported collaborative learning in F2F contexts. Although previous studies showed the positive effect of a dashboard, most focused on online learning contexts, not F2F ones. For collaborative argumentation, F2F learning contexts are favorable because students can communicate with each other verbally or non-verbally such as through gestures or facial expressions. This feature can serve to intensify their interaction (Al Saifi, Dillon, & McQueen, 2016; Shu & Gu, 2018), which facilitates generating innovative ideas (Callister & Love, 2016; Rouhshad, Wigglesworth, & Storch, 2016). With the advancement of technology, various collaboration tools are available for F2F environments as well as online ones. The tools can augment the advantages of F2F collaborative learning and promote the integration of online with F2F learning environments for knowledge building (e.g., Angeli, Howard, Ma, Yang, & Kirschner, 2017; Yamaç, Öztürk, & Mutlu, 2020). Thus, more research is needed to develop dashboards for collaborative argumentation in F2F contexts and to investigate how to effectively use dashboards for FCA.

Finally, previous studies did not sufficiently consider instructors' role when developing learning analytics dashboards. In F2F contexts, instructors play a crucial role in monitoring students' learning and deciding on instructional support (Webb, 2009). By monitoring the learning process, instructors can adjust learning activities and timely provide support based on students' needs (Pozzi, Manca, Persico, & Sarti, 2007; Van Leeuwen et al., 2014). Particularly in large F2F classes, a dashboard can help instructors identify at-risk groups and recognize their challenges (Kinshuk, 2016). Using a learning analytics dashboard, instructors can efficiently carry out their complex roles in collaborative learning and manage their class by understanding each group's learning status.

To address these limitations, this study develops learning analytics dashboards for FCA and evaluates their effectiveness in a higher education context. The dashboards can improve collaborative argumentation by adaptively supporting students who have difficulty making, sharing, and integrating their arguments. FCA dashboards can contribute significantly to developing students' collaborative argumentation skills, which are essential for future society.

3. Methods

3.1. Participants

A total of 88 pre-service teachers (56 females, 32 males) participated in the evaluation phase of this study. They were assigned to 22 groups of 3 or 4 members. Gender, major, and prior experience with collaborative learning were considered to make each group heterogeneous. All participants carried out FCA activities as part of their coursework at a university in South Korea.

3.2. Development procedure

This study developed a dashboard system that provides adaptive support for FCA. The system consists of two types of learning analytics dashboards: one for students and one for instructors. It was developed in five phases.

First, this study reviewed the literature and student responses to FCA collected in previous university courses to identify needs pertaining to dashboards. From these resources, we identified and synthesized the difficulties students could face in FCA and key indicators for evaluating group processes.

Second, we reviewed the characteristics of previously developed dashboards, including their visualization methods (e.g., color-coding schemes) and adaptive support strategies (e.g., collaboration scripts). Based on these reviews, we elaborated strategies for providing visual feedback with collaboration scripts, avoiding excessive support, and adjusting the support level according to students' needs.

Third, we selected Trello (https://trello.com) as the collaboration software for FCA and explored how to collect, analyze, and utilize students' activity data. This free online software helps users organize and manage group tasks by sharing each member's progress. Trello served as a shared working space for group discussion in FCA and as a data collection tool for the dashboard system. Trello recorded every learning behavior as activity logs in real time, and the logs were collected into a database as student activity data by the web application programming interface (API) we developed. The web API also responded to requests from the two types of dashboards: each dashboard requested and received the information needed for its role through the web API, which analyzed and extracted that information from the student activity data. Fig. 1 shows the overall architecture of the FCA dashboard system.

Fourth, we developed dashboard prototypes by applying the adaptive support strategies elaborated in the second phase. Color-coding schemes were used for both the student and instructor dashboards to enhance usability. We also considered the FCA procedure and the characteristics of the university classroom (e.g., group position) in developing the prototypes.

Finally, usability tests were performed on the prototypes twice, with six participants in the first test and seven in the second. We developed the final version of the dashboards by improving the prototypes based on the test results.

Fig. 1. Architecture of the FCA dashboard system.
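The log-collection layer of this architecture can be sketched against Trello's public REST API (GET /1/boards/{id}/actions). The sketch below is an assumption-laden illustration rather than the study's actual web API: the credentials, board IDs, and action-type filter are placeholders.

```python
# Hypothetical sketch of the activity-log collection layer in Fig. 1,
# assuming Trello's public REST endpoint GET /1/boards/{id}/actions.
# API_KEY, API_TOKEN, and BOARD_IDS are placeholders, and the action-type
# filter is illustrative rather than the study's actual configuration.
import requests

API_KEY = "your-trello-api-key"
API_TOKEN = "your-trello-api-token"
BOARD_IDS = ["board-id-group-01"]  # one shared Trello board per student group


def fetch_board_actions(board_id, since=None):
    """Fetch recent activity logs (card creations, comments, updates) for a board."""
    response = requests.get(
        f"https://api.trello.com/1/boards/{board_id}/actions",
        params={
            "key": API_KEY,
            "token": API_TOKEN,
            "filter": "createCard,commentCard,updateCard",
            "since": since,  # only return actions newer than the last poll
            "limit": 1000,
        },
        timeout=10,
    )
    response.raise_for_status()
    return response.json()


# Poll every group's board and hand each action to the activity database.
for board_id in BOARD_IDS:
    for action in fetch_board_actions(board_id):
        print(action["type"], action["date"], action["memberCreator"]["fullName"])
```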

3.3. Evaluation procedure

Before collecting the data for evaluation, students completed a three-week pre-training period to become familiar with Trello, the collaboration software, and with FCA activities. They were asked to bring their own devices (e.g., laptops or tablet PCs) to set up and use the software throughout the training period. After the pre-training period, the effectiveness of the dashboard system was evaluated in two rounds of two weeks each. In the first round, participants carried out FCA using only the collaboration software without the dashboard system. In the second round, they used the dashboard system as well as the collaboration software. Participants conducted FCA for 110 minutes once a week, so FCA was carried out twice per round. Discussion topics varied across the four FCA classes: student-centered learning, individualized learning, learning with smart devices, and game-based learning. For each class, the following activities were performed in a university classroom.

  • Introducing an ill-structured problem: At the beginning of the class, the instructor introduced an ill-structured problem as a discussion topic (e.g., Do you agree with the use of smart devices in class?). Then, the instructor gave a short lecture to provide background knowledge of the topic in order to promote FCA, and gave students simple examples of arguments on both the Agree and Disagree positions. There was no difference between the first and second rounds in regard to introducing a problem.

  • Creating individual arguments: Every group member chose a position regarding the discussion topic and added a new card in the collaboration software to create their initial argument (Fig. 2-a and 2-b). Students were encouraged to consider and add six labels representing the essential argumentation elements (Fig. 2-c). In the second round of the evaluation procedure, but not in the first, students used the student dashboard during their learning process.

  • Sharing and revising arguments: Group members shared their initial arguments by reading each other's cards in the collaboration software, exchanged comments, and revised their writing in reference to the comments (Fig. 2-d). In the second round, adaptive support was provided through the student dashboard. Each group's current learning status was automatically assessed, and the collaboration scripts were delivered with color-coding.

  • Creating a group argument: Groups were asked to integrate their varied arguments into a single group argument, which was the final product of the group task (Fig. 2-e). In the second round, students were continuously supported through their dashboard because the script and color code were updated according to changes in their learning status. They could also ask the instructor to visit and help them using the Help button of the dashboard.

  • Reflecting on FCA: Every group argument was shared through another Trello board to which all students had access. The instructor helped students compare group arguments and reflect on their learning.

Fig. 2. Activity phases of FCA with the collaboration software.

3.4. Data collection and analysis

To evaluate the dashboard system, we investigated the improvement of group process, group achievement, and individual learning from Round 1 (FCA without dashboards) to Round 2 (FCA with dashboards), as well as students' perceptions of the dashboards. The group process was investigated by analyzing student activity data and conducting surveys. First, each group's opinion balance, comment count, network density, and counts of argumentation elements were extracted from the student activity data. Opinion balance is defined as the negative absolute difference between the number of cards on the Agree and Disagree positions in a group. For example, in Fig. 2-b, two cards are on the Agree position and one card is on the Disagree position, so the opinion balance is −|2 − 1| = −1. As a group develops more balanced opinions across the two positions, this variable becomes larger, approaching zero. The comment count is the total number of comments exchanged during class. By exchanging comments, group members become connected to each other in a network graph. Network density is based on these connections and refers to the proportion of existing connections over all possible connections among group members in a network of interactions (Wasserman & Faust, 1994). This metric is an important indicator of effective collaboration (Gu et al., 2015; Wasserman & Faust, 1994). As group members exchange comments with more peers, the density approaches one, indicating full connection among all peers. The use of argumentation elements consists of six sub-variables corresponding to the six argumentation elements: claims, reasons, evidence, counterarguments, theory, and originality. Each sub-variable was defined as the usage count of the corresponding element reported by students in the FCA process. Second, students' perceived group process was measured through surveys of participation, interaction, group regulation, and group conflict (see Table 1). The surveys included 12 items developed based on previous research (Dewiyanti, Brand-Gruwel, Jochems, & Broers, 2007; DiDonato, 2013; Michinov & Michinov, 2009). Each item was measured on a five-point Likert scale (strongly disagree: 1, strongly agree: 5). The internal consistency of each sub-category ranged from 0.677 to 0.913.
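As a concrete reading of these definitions, the following minimal sketch computes opinion balance and network density from card counts and comment links; the function names and data structures are ours, not the system's.

```python
from itertools import combinations


def opinion_balance(n_agree: int, n_disagree: int) -> int:
    """Negative absolute difference between Agree and Disagree card counts.
    Two Agree cards and one Disagree card give -|2 - 1| = -1; values closer
    to zero indicate more balanced group opinions."""
    return -abs(n_agree - n_disagree)


def network_density(members: list, comment_edges: set) -> float:
    """Share of realized comment links over all possible member pairs
    (Wasserman & Faust, 1994). A density of 1.0 means every member
    exchanged comments with every other member."""
    possible_pairs = len(list(combinations(members, 2)))
    return len(comment_edges) / possible_pairs if possible_pairs else 0.0


# Example: a 3-member group in which only two members exchanged comments.
members = ["A", "B", "C"]
edges = {frozenset({"A", "B"})}          # undirected comment links
print(opinion_balance(2, 1))             # -1
print(network_density(members, edges))   # 0.33...
```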

Table 1. Survey items for investigating group process, group achievement, and individual learning.

| Category | Sub-category | Item example | Number of items | Cronbach's alpha |
| --- | --- | --- | --- | --- |
| Group process | Participation | All group members actively expressed their own opinions. | 3 | .843–.913 |
| | Interaction | Group members exchanged questions that promoted each other's thoughts. | 3 | .677–.823 |
| | Group regulation | I knew what other group members were working on during our group task. | 3 | .679–.752 |
| | Group conflict† | A group member often had conflicting opinions with me. | 3 | .718–.888 |
| Group achievement | Perceived group performance | My group successfully completed the task. | 3 | .697–.853 |
| Individual learning | Situational interest | I enjoyed participating in collaborative argumentation activities. | 4 | .821–.869 |
| | Perceived learning outcomes | I have achieved the learning objectives through collaborative learning. | 4 | .771–.885 |

Note. †Reverse item. Survey items were translated from Korean to English.

Group achievement was measured by evaluating the quality of group arguments and surveying students' perceived group performance. The group argument was the final output of every week's FCA. Two researchers independently evaluated the quality of group arguments using a rubric modified from previous studies (Jonassen & Cho, 2011; Nussbaum & Schraw, 2007) comprising six categories: claims (Are claims expressed clearly and consistently?), reasons (Are suitable reasons provided?), evidence (Are objective and concrete pieces of evidence provided?), counterarguments (Is the opposite position's opinion considered and rebutted appropriately?), theory (Are pedagogical theories used to support arguments?), and originality (Are novel perspectives or unique claims presented in the argument?). Each category was rated from 0 to 2 points (low-quality: 0, medium-quality: 1, high-quality: 2), so group achievement scores ranged from 0 to 12 points. The inter-rater reliability (Cohen's Kappa) of the six categories ranged from 0.753 to 0.888, and all differences in rating were resolved through discussion between the researchers. In addition, perceived group performance was surveyed at the end of every class. The survey contained three items (the second category in Table 1) measured on a five-point Likert scale. The internal consistency of survey items ranged from 0.697 to 0.853.
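Inter-rater agreement of this kind is conventionally computed with Cohen's kappa; a minimal sketch with scikit-learn, using toy ratings for one rubric category (the values are illustrative, not the study's data), might look as follows.

```python
from sklearn.metrics import cohen_kappa_score

# Toy ratings for one rubric category across eight group arguments
# (0 = low, 1 = medium, 2 = high quality); values are illustrative only.
rater_1 = [2, 1, 2, 0, 1, 2, 1, 2]
rater_2 = [2, 1, 1, 0, 1, 2, 1, 2]

# Kappa measures agreement beyond chance; in the study, remaining
# disagreements were then resolved through discussion between raters.
print(f"Cohen's kappa: {cohen_kappa_score(rater_1, rater_2):.3f}")
```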

Individual learning was measured through surveys of situational interest and perceived learning outcomes at the end of every class. The surveys contained eight items based on previous research (Rotgans & Schmidt, 2011) and were measured on a five-point Likert scale. The internal consistency of each sub-category ranged from 0.771 to 0.885. Three participants did not complete the surveys on group process, group achievement, and individual learning. Thus, the survey data of 85 students were analyzed.

Finally, students’ perceptions of the dashboard system were investigated through a survey of usefulness, usability, and attitude after completing all FCA classes (see Table 2). The survey was based on the study of Park and Nam (2012). All items were measured on a five-point Likert scale, and the internal consistency of each survey category ranged from 0.811 to 0.945. Seven participants did not complete the survey; therefore, survey data from 81 participants were analyzed.

Table 2. Survey items for investigating perceptions of the dashboard system.

| Category | Item example | Number of items | Cronbach's alpha |
| --- | --- | --- | --- |
| Usefulness | The dashboard system helped us to monitor and improve our collaborative activities. | 4 | .945 |
| Usability | I was able to use the dashboard system without much effort. | 4 | .811 |
| Attitude | I would like to use the dashboard system for collaborative learning again. | 4 | .912 |

Note. Survey items were translated from Korean to English.

The data for group process, group achievement, and individual learning were collected every week and averaged for each round. Paired t-tests were conducted to compare Rounds 1 and 2. The analysis units were individuals for the survey data and groups for the activity data. Pearson's correlation analyses were performed on the survey data to investigate the relationships among group process, group achievement, and individual learning. Finally, descriptive statistics were computed to investigate students' perceptions of the dashboard system.
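These analyses can be reproduced with standard SciPy routines. The sketch below runs a paired t-test over per-group round averages and a Pearson correlation over per-student survey scores; all numbers are made-up placeholders, not the study's data.

```python
import numpy as np
from scipy import stats

# Placeholder per-group comment counts, averaged within each round
# (the study had 22 groups; six are shown here for illustration).
round1 = np.array([4.5, 6.0, 5.5, 3.0, 7.5, 5.0])
round2 = np.array([9.0, 10.5, 8.5, 9.5, 12.0, 9.0])

t_stat, p_value = stats.ttest_rel(round2, round1)  # paired t-test
print(f"t = {t_stat:.3f}, p = {p_value:.4f}")

# Placeholder per-student survey scores for two scales, e.g., situational
# interest vs. perceived learning outcomes.
interest = np.array([4.0, 4.5, 3.5, 4.2, 3.8])
outcomes = np.array([3.8, 4.4, 3.6, 4.0, 3.5])
r, p = stats.pearsonr(interest, outcomes)          # Pearson correlation
print(f"r = {r:.3f}, p = {p:.4f}")
```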

4. Findings

4.1. Characteristics of dashboards

4.1.1. Student dashboard

The main purpose of the student dashboard (Fig. 3) is to allow students to (1) monitor their current learning status, (2) receive adaptive feedback and support for FCA, and (3) ask for the instructor's help in class. To use this dashboard, every student group was provided with a dedicated tablet PC.

Fig. 3. Student dashboard for FCA.

Students could monitor their current learning status through the dashboard. Their learning status was analyzed and identified based on three key indicators: opinion distribution, participation and interaction, and the use of six argumentation elements. Each indicator was presented on a separate panel of the student dashboard (see Fig. 3).

The first panel is the opinion count. It shows the distribution of group members' opinions on a spectrum ranging from agreement to disagreement. The distribution was visualized as a bar chart, so students could easily identify whether their opinions were balanced or biased. An extra "other opinions" option allowed students to create opinions other than agreement or disagreement.

The second panel is participation and interaction, which summarizes the learning status of individual participation and group interaction by visualizing it as a network graph consisting of nodes (the green circles in the graph) and edges (the gray lines between nodes). The size of each node represents the amount of individual participation, and the width of an edge represents the amount of interaction between the connected members.

The third panel shows the argumentation elements. It shows the use of six labels representing the essential elements of written argumentation: claims, reasons, evidence, counterarguments, theory, and originality. We developed this set of essential elements considering the purposes and features of the learning context. Because we intended to encourage students to use pedagogical theories to solve problems within a discussion topic, the theory label was added to the set. Furthermore, the originality label was used to encourage students to create original insights based on materials from the Internet rather than merely copying them (Rowe, 2004). When students wrote their arguments, they were advised to consider these elements by clicking the label button corresponding to each element in the collaboration software. This clickstream was recorded and used to visualize the pattern of use of each label in a radar chart. Through this chart, students could easily monitor the number and types of elements they considered in their written argumentation.
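The participation-and-interaction panel can be approximated with a standard graph library. The sketch below, with illustrative counts, maps each member's participation to node size and each pair's comment exchanges to edge width; the study's dashboard was a web application, so this is an analogy rather than its implementation.

```python
import matplotlib.pyplot as plt
import networkx as nx

# Illustrative counts: cards/comments per member, comments exchanged per pair.
participation = {"A": 5, "B": 3, "C": 8}
interaction = {("A", "B"): 2, ("B", "C"): 4, ("A", "C"): 1}

G = nx.Graph()
G.add_nodes_from(participation)
for (u, v), weight in interaction.items():
    G.add_edge(u, v, weight=weight)

nx.draw(
    G,
    with_labels=True,
    node_color="green",
    # Node size scales with individual participation,
    # edge width with the amount of pairwise interaction.
    node_size=[200 * participation[n] for n in G.nodes],
    width=[1.5 * d["weight"] for _, _, d in G.edges(data=True)],
)
plt.show()
```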

The student dashboard has two support levels, one for the first half of the class and one for the second, and is designed to let students receive explicit support specific to their needs. In the latter part of the class, collaboration scripts began to be delivered to each dashboard panel with a traffic-light color-coding scheme (see Table 3). The contents of the scripts were organized to encourage students to perform the desired learning behavior by clearly describing their learning status. Color-coding was used to help students understand their overall learning status at a glance. The collaboration script and corresponding color code were continuously updated in real time during the class using rule-based algorithms that diagnose and evaluate the group's current learning status based on the aforementioned key indicators.

Table 3. Examples of collaboration scripts with color codes.

| Panel | Good | Fair | Poor |
| --- | --- | --- | --- |
| Opinion count | (script image) | (script image) | (script image) |
| Participation and interaction | (script image) | (script image) | (script image) |
| Argumentation elements | (script image) | (script image) | (script image) |
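The rule-based diagnosis behind these traffic-light codes can be sketched as threshold rules over the key indicators. The paper does not publish the exact rules, so the thresholds and script texts below are illustrative placeholders.

```python
# Illustrative rule-based diagnosis for two dashboard panels; thresholds
# and script texts are placeholders, not the study's actual rules.

def rate_opinion_panel(n_agree: int, n_disagree: int) -> str:
    """Color-code the opinion-count panel from the opinion balance."""
    balance = -abs(n_agree - n_disagree)
    if n_agree and n_disagree and balance >= -1:
        return "green"   # Good: both positions present and roughly balanced
    if n_agree and n_disagree:
        return "yellow"  # Fair: both positions present but skewed
    return "red"         # Poor: one position is empty


def rate_interaction_panel(density: float) -> str:
    """Color-code the participation-and-interaction panel from network density."""
    if density >= 0.8:
        return "green"
    if density >= 0.4:
        return "yellow"
    return "red"


# Abbreviated collaboration scripts attached to each color code.
SCRIPTS = {
    "green": "Good: keep building on each other's arguments.",
    "yellow": "Fair: some members or positions need more attention.",
    "red": "Poor: add opinions to the empty position or comment on peers' cards.",
}

status = rate_opinion_panel(n_agree=3, n_disagree=0)
print(status, "->", SCRIPTS[status])  # red -> Poor: add opinions ...
```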

Students also had the option of receiving an in-depth instructional guide by pressing the Details button (see Fig. 4), which opened a new window showing detailed, explicit instructions. At the first level, this guide highlighted issues in the group's current learning process and encouraged group members to take specific actions to address them. For example, if a group had no opinion on the Disagree position, the collaboration script of the first panel highlighted the empty position and suggested adding new opinions to balance the overall group opinion. If the group's learning behavior did not change after the first guide, the guide provided specific examples or hints to help students take the necessary actions. For example, if a group still had an empty position, keywords were provided to help students create a new opinion on it.

Fig. 4. Example of the popup window for the in-depth instructional guide.

In addition, students could request the instructor's direct help at any time using the Help button (the purple button on the student dashboard). When they felt that even the final level of the in-depth guide was not sufficient to determine the necessary actions, or when they had questions about the discussion topic, they used this button to ask the instructor for further guidance. The help request was also displayed on the instructor dashboard in real time, so the instructor could visit the group easily and quickly.

4.1.2. Instructor dashboard

An instructor dashboard was also developed (see Fig. 5) to enhance the role of instructors in FCA by helping them make decisions about instructional support and respond to students’ learning needs based on their pedagogical knowledge and expertise. The dashboard helps instructors (1) monitor the overall learning status of the whole class, and (2) efficiently identify the groups requesting help.

Fig. 5. Instructor dashboard for FCA.¹

Each student group's current learning status was summarized in a set of six blocks arranged in three rows and two columns. The three rows corresponded to the three panels on the student dashboard, labeled O (opinion count), PI (participation and interaction), and E (elements of argumentation). The two columns represent the early and latter parts of the class, indicated by the suffixes 1 and 2, respectively. These two columns functioned as a historical record so the instructor could trace each group's learning status as the class progressed. The sets of blocks were arranged to match each group's physical location in the classroom, so the instructor could intuitively identify where each group sat. Groups' help requests were highlighted with a purple background (see Groups 6 and 14 in Fig. 5). After helping a group, the instructor could remove its highlight by tapping the buttons on the right side of the dashboard.
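One plausible way to represent these blocks is a small record per group, keyed by indicator row and class phase; all field names below are our own illustration, not the system's data model.

```python
# Hypothetical status record backing one group's six blocks on the
# instructor dashboard: three indicator rows (O, PI, E) by two class
# phases (1 = early, 2 = latter). Field names are illustrative.
group_status = {
    "group_06": {
        "O1": "yellow", "O2": "green",    # opinion count
        "PI1": "red",   "PI2": "yellow",  # participation and interaction
        "E1": "yellow", "E2": "green",    # elements of argumentation
        "help_requested": True,           # shown with a purple highlight
        "classroom_position": (1, 2),     # where the group sits in the room
    },
}

# The dashboard lays the blocks out by classroom position and highlights
# any group whose help_requested flag is set.
for name, status in group_status.items():
    if status["help_requested"]:
        print(f"{name} at {status['classroom_position']} requested help")
```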

4.2. Effects of dashboards on group process and achievement

Table 4 summarizes the results of the paired t-tests on the group process. With the dashboard system, student groups had more balanced opinions (t(21) = 4.174, p < .001), added more comments (t(21) = 6.527, p < .001), and interacted with more peers (t(21) = 2.889, p < .01). Using the dashboards, they included more argumentation elements in their written arguments in Round 2 than in Round 1 (claim, t(21) = 3.792, p < .01; reason, t(21) = 4.469, p < .001; evidence, t(21) = 5.369, p < .001; counterargument, t(21) = 3.705, p < .01; theory, t(21) = 6.019, p < .001; originality, t(21) = 3.705, p < .01). All of these changes identified in the activity data analysis were statistically significant. Students' perceived group process also improved significantly with the dashboards: participation (t(84) = 3.352, p < .01), interaction (t(84) = 3.778, p < .001), and group regulation (t(84) = 7.868, p < .001). There was no significant difference in group conflict between Rounds 1 and 2.

Table 4. Results of paired t-tests on group process.

| Variable | Round 1 Mean | Round 1 SD | Round 2 Mean | Round 2 SD | t |
| --- | --- | --- | --- | --- | --- |
| Opinion balance | −1.886 | .755 | −.864 | .743 | 4.174∗∗∗ |
| Comment count | 5.182 | 2.621 | 9.818 | 2.575 | 6.527∗∗∗ |
| Network density | .716 | .342 | .950 | .147 | 2.889∗∗ |
| Argumentation elements: Claim | 3.864 | .889 | 4.659 | .793 | 3.792∗∗ |
| Argumentation elements: Reason | 2.818 | 1.129 | 4.114 | .899 | 4.469∗∗∗ |
| Argumentation elements: Evidence | 2.045 | 1.143 | 3.455 | .975 | 5.369∗∗∗ |
| Argumentation elements: Counterargument | 1.909 | .921 | 2.818 | .839 | 3.705∗∗ |
| Argumentation elements: Theory | 1.000 | .845 | 2.432 | .849 | 6.019∗∗∗ |
| Argumentation elements: Originality | 1.909 | .921 | 2.818 | .839 | 3.705∗∗ |
| Participation | 4.245 | .597 | 4.443 | .493 | 3.352∗∗ |
| Interaction | 4.233 | .533 | 4.455 | .457 | 3.778∗∗∗ |
| Group regulation | 3.653 | .631 | 4.169 | .575 | 7.868∗∗∗ |
| Group conflict† | 1.435 | .510 | 1.346 | .492 | −1.471 |

Note. Round 1 = without dashboards; Round 2 = with dashboards. †Reverse item. ∗∗p < .01, ∗∗∗p < .001.

Table 5 summarizes the results of the paired t-tests on group achievement. Using the dashboard system significantly improved group achievement: quality of group arguments (t(21) = 7.241, p < .001) and perceived group performance (t(84) = 4.593, p < .001).

Table 5. Results of paired t-tests on group achievement.

| Variable | Round 1 Mean | Round 1 SD | Round 2 Mean | Round 2 SD | t |
| --- | --- | --- | --- | --- | --- |
| Quality of group arguments | 7.932 | 1.365 | 10.136 | 1.136 | 7.241∗∗∗ |
| Perceived group performance | 3.939 | .494 | 4.155 | .524 | 4.593∗∗∗ |

Note. Round 1 = without dashboards; Round 2 = with dashboards. ∗∗∗p < .001.

4.3. Effects of dashboards on individual learning

Alongside the positive effects of the dashboards on group process and achievement, individual learning significantly improved from Round 1 to Round 2. Table 6 summarizes the results of the paired t-tests on individual learning. The dashboard system effectively improved situational interest (t(84) = 2.773, p < .01) and perceived learning outcomes (t(84) = 4.268, p < .001). This result indicates that the dashboards promoted not only group learning but also individual learning. Correlation analyses showed that the individual learning variables were significantly correlated with the group process and achievement variables (see Table 7). The dashboards supported the group activities of FCA, which might have facilitated individual learning. Interestingly, the individual learning variables were negatively correlated with group conflict in Round 1, but the negative correlation was not significant in Round 2. As such, the dashboards might reduce the negative influence of group conflict on individual learning.

Table 6. Results of paired t-tests on individual learning.

| Variable | Round 1 Mean | Round 1 SD | Round 2 Mean | Round 2 SD | t |
| --- | --- | --- | --- | --- | --- |
| Situational interest | 4.033 | .569 | 4.193 | .636 | 2.773∗∗ |
| Perceived learning outcomes | 3.835 | .584 | 4.057 | .633 | 4.268∗∗∗ |

Note. Round 1 = without dashboards; Round 2 = with dashboards. ∗∗p < .01, ∗∗∗p < .001.

Table 7. Results of correlation analyses between group and individual learning variables.

| Round | Variable | 1 | 2 | 3 | 4 | 5 | 6 | 7 |
| --- | --- | --- | --- | --- | --- | --- | --- | --- |
| Round 1 | 1 Participation | 1 | .431∗∗ | .400∗∗ | −.230∗ | .409∗∗ | .330∗∗ | .253∗ |
| | 2 Interaction | | 1 | .600∗∗ | −.264∗ | .606∗∗ | .433∗∗ | .517∗∗ |
| | 3 Group regulation | | | 1 | −.240∗ | .528∗∗ | .320∗∗ | .407∗∗ |
| | 4 Group conflict | | | | 1 | −.301∗∗ | −.260∗ | −.284∗∗ |
| | 5 Perceived group performance | | | | | 1 | .578∗∗ | .537∗∗ |
| | 6 Situational interest | | | | | | 1 | .696∗∗ |
| | 7 Perceived learning outcomes | | | | | | | 1 |
| Round 2 | 1 Participation | 1 | .855∗∗ | .517∗∗ | −.208 | .574∗∗ | .493∗∗ | .442∗∗ |
| | 2 Interaction | | 1 | .708∗∗ | −.260∗ | .637∗∗ | .509∗∗ | .457∗∗ |
| | 3 Group regulation | | | 1 | −.156 | .726∗∗ | .472∗∗ | .458∗∗ |
| | 4 Group conflict | | | | 1 | −.106 | −.116 | −.204 |
| | 5 Perceived group performance | | | | | 1 | .628∗∗ | .605∗∗ |
| | 6 Situational interest | | | | | | 1 | .754∗∗ |
| | 7 Perceived learning outcomes | | | | | | | 1 |

Note. 1–5: group learning variables; 6–7: individual learning variables. ∗p < .05, ∗∗p < .01.

4.4. Student perception of dashboards

Students’ perception of the dashboard system was positive overall. They perceived it as useful (M = 3.929, SD = 0.968) and easy to use (M = 4.034, SD = 0.718). They also demonstrated positive attitudes toward the dashboard system (M = 3.907, SD = 0.958). Students perceived the system as providing helpful feedback and support for FCA, not as unnecessary interference or intrusive surveillance.

5. Discussion

This study developed a dashboard system for adaptive support in FCA. The system included dashboards for students and instructors, making the collaborative argumentation process more visible using student activity data. The student dashboard provided appropriate instructional feedback and guidance to the right group according to their current learning status. The instructor dashboard helped instructors monitor and support students in FCA.

The dashboard system improved the group process. Student groups had more balanced opinions after using the system, which promotes deeper understanding and productive group discussion (Andriessen et al., 2003; Jonassen & Cho, 2011). Such an opinion distribution leads to better solutions for the complex problems in the discussion topic because various ideas are explored (Johnson & Johnson, 1996; Jonassen & Cho, 2011; Noroozi et al., 2012). The system also facilitated student participation and interaction during the group activity. With the dashboard system, the number of comments and the network density improved significantly, and nearly all group members interacted with each other. This indicates that students sufficiently shared their opinions during the group discussion and integrated all members' ideas into their final group opinion based on shared understanding (Malmberg, Järvelä, & Järvenoja, 2017; Noroozi et al., 2012; Stegmann, Wecker, Weinberger, & Fischer, 2012). Using the dashboard, students considered significantly more elements of argumentation. This implies that the argumentation elements in the third panel were used as a checklist, encouraging students to consider more of the elements required to form their arguments. By using the dashboard, student groups improved the structure of their arguments and generated more convincing opinions, benefiting learning outcomes. Students' perception of the group process was consistent with the results of the activity data analysis. Except for group conflict, students' perception of the group process was more positive in Round 2 than in Round 1.

Group achievement also improved significantly with the dashboard system. In Round 2, the quality of group arguments and students' perceived group performance increased to a statistically significant level. The improvement of group process might positively influence group achievement. Previous studies showed that collaborative learning processes significantly influence group performance (Chen & Chiu, 2016; Stegmann et al., 2012). In addition, the dashboard system was helpful for individual learning. Although the dashboards were not designed to directly support individual learning, the improvement of group activities might positively influence students’ situational interest and perceived learning outcomes. This is supported by the correlation analysis results in this study. Janssen, Kirschner, Erkens, Kirschner, and Paas (2010) emphasized investigating both individual learning effects and group performance in collaborative learning studies. Because the dashboard system helped all group members to equally participate in FCA, the system might be beneficial for both group achievements and individual learning outcomes. As a result, students showed positive attitudes toward the dashboard system, perceiving it as useful and easy to use.

The findings of this study provide implications for developing learning analytics dashboards for adaptive support in F2F learning. First, the information provided through a dashboard needs to be organized around theory-based key indicators of an effective learning process. Adaptive and personalized learning environments generally focus on the learning process (Kinshuk, 2016). Particularly in F2F classes, dashboards should focus on the learning process more than on learning outcomes because class sessions are usually not long enough to measure many learning outcomes. As learning environments develop, the learning process can be observed closely through students' activity data, which record the details of their learning behaviors (Greller & Drachsler, 2012). Selecting, from these records, the key indicators that influence the success or failure of learning is not a problem that can be addressed through trial and error. Here, educational theory can provide worthwhile guidance on how to utilize data to determine influential behaviors in the learning context (Jivet, Scheffel, Drachsler, & Specht, 2017). In this study, key indicators for effective collaborative argumentation were selected based on previous studies, and the dashboard system was designed and operated by identifying these indicators in students' activity data. As a result, we found not only positive changes in the learning process but also significant improvements in learning outcomes. These findings highlight the importance of utilizing students' activity data to support their learning process based on educational theory.

Second, this study suggests that the system should not provide too much instructional feedback or guidance during the learning process. Restraint encourages practices that improve students' regulation skills rather than their dependence on explicit instructional support (DiDonato, 2013; Kim et al., 2016; Malmberg et al., 2017). For this reason, early in the class, our system enabled students to monitor their current learning status, encouraging them to regulate their learning by themselves. Later in the class, more specific feedback and guidance were gradually provided. Specifically, the system provided the in-depth instructional guide, the most explicit level of support, only when students pressed the Details button. The Help button likewise allowed them to request the instructor's help according to their needs. As a result, students' perception of group regulation improved significantly when using the dashboard system. This result indicates that by gradually providing instructional support and allowing students to choose the level of support, the negative impact of excessive guidance can be minimized while improving students' group regulation (Dillenbourg, 2002; Soller, Martínez-Monés, Jermann, & Muehlenbrock, 2005). In F2F learning sessions held intensively over a short time, it is not easy for a system to detect the optimal timing of support. Thus, dashboards should include interactive functions that reflect students' preferences and their willingness to receive support.

Third, real-time feedback is essential for supporting synchronous learning activities in F2F learning. In dynamic activities such as collaborative learning, where multiple students actively interact, students need to understand the current learning status, not a delayed one (Dillenbourg, 1999; Khan & Pardo, 2016; Zurita & Nussbaum, 2004). The dashboard system in this study enabled group members to monitor their current learning status and provided feedback according to the conditions at the time. This increased the system's usefulness by enhancing the relevance of the feedback. Students responded positively to the questionnaire item about whether the dashboard was useful. Furthermore, during the evaluation phase, students showed strong interest in their dashboard, which displayed and updated their learning status in real time. Situational interest improved significantly with the use of the dashboard, which might positively influence learning outcomes (Rotgans & Schmidt, 2011).

Fourth, students should understand that the dashboard system is not for surveillance but for better learning support. In F2F learning environments, students and instructors share the same physical space, and the system can augment instructors' influence by allowing them to monitor and control student activities (Van Leeuwen, Janssen, Erkens, & Brekelmans, 2015). If instructors were to interfere with students' learning too frequently based on the information reported by the system, students would feel they were being watched. This may induce negative emotions in learning and hinder students' self-regulation skills (Cho, Kim, & Han, 2019). In collaborative learning dealing with ill-structured problems, instructors' direct instruction and intervention can negatively influence group participation and interaction (Cohen, 1994). To prevent such surveillance issues, we included a help request button on the student dashboard so that students could control instructors' direct feedback according to their needs and priorities. This might positively influence their perceptions of the system as a support for FCA.

Last, dashboard systems that support F2F learning should prevent excessive competition among students. When a system monitors the learning situation in real time, the classroom atmosphere can become competitive, which may undermine learning motivation and cause misbehaviors such as gaming the system (Baker, Corbett, Koedinger, & Wagner, 2004; Teasley, 2017). Although peer comparison, which drives competition among students, is used in learning analytics dashboard research, Jivet et al. (2017) cautioned that it should be used carefully because mixed educational effects have been reported. Therefore, we removed comparative elements from the student dashboard that could cause competition and instead set specific criteria for desirable learning behaviors, communicated through the traffic-light color-coding, for students to refer to. If student comparison is necessary for educational purposes, we suggest disclosing this information on the instructor dashboard, not the student dashboard. The instructor should then consider various student factors and contexts to ensure appropriate use of the information.

This study has limitations that future studies should address. First, this study adopted a one-group pretest-posttest research design. Because of the repeated measurements within individuals, we cannot rule out the possibility that the effects of the dashboards were influenced by other factors such as group members' increasing familiarity, differences in discussion topics, and learning effects of previous exposure to FCA activities. To confirm the impact of the dashboards, quasi-experimental studies with control groups need to be conducted. In addition, this study was conducted in a higher education context; thus, its results do not easily generalize to other learning contexts such as K-12 schools. Further research is necessary to determine whether the dashboard system improves students' learning process and outcomes in other contexts. Finally, this study has a technical limitation regarding implementing the dashboard system at scale. Because the system was developed for research purposes, several basic configurations for system operation were completed at the source-code level. Future studies are necessary to develop additional components that allow other educators to use the dashboard system for diverse purposes in their classes.

6. Conclusion

This study found that learning analytics dashboards are effective in facilitating collaborative argumentation in F2F learning contexts, which contributes to the research on learning analytics as well as the practice of collaborative argumentation. In this study, a dashboard system was developed to overcome the limitations of previous studies on adaptive support, F2F learning contexts, and the role of an instructor. The learning analytics dashboards were helpful because they provided visualized feedback and adaptive support on student activities. The dashboards included functions (e.g., the Help button) to support F2F interaction, and students were given adaptive support not only from the student dashboard but also from an instructor who monitored the collaborative argumentation process using the instructor dashboard. The dashboards can contribute to improving the quality of collaborative argumentation, which helps students to develop key competencies for future society.

This study shows that dashboards can be beneficial in F2F learning particularly when an instructor has difficulty in identifying students who need help in a large class. Further research should explore the potential of learning analytics in understanding students and optimizing learning in F2F contexts. This study provides useful implications for researchers and practitioners who intend to use learning analytics to improve collaborative learning in F2F contexts. Future research is needed to understand individual differences in responding to the feedback of learning analytics dashboards and explore types of adaptive support that can enhance deep learning. Finally, this study recommends developing learning analytics dashboards through applying the implications of this study to diverse contexts including online and blended learning.

Credit author statement

Jeongyun Han: Investigation, Software, Writing – Original Draft; Kwan Hoon Kim: Investigation, Methodology; Wonjong Rhee: Software, Supervision; Young Hoan Cho: Conceptualization, Investigation, Writing – Review & Editing.

Acknowledgments

Special thanks to Hyunghun Cho for technical advice on the development of the system. This work was supported by a National Research Foundation of Korea (NRF) grant funded by the Korean government (MSIT) (Grant No. NRF-2017R1E1A1A03070560).

Footnotes

1. The figure includes a part of the instructor dashboard that shows the activities of all groups at a glance.

