J Comput High Educ. 2022 Sep 9;35(1):40–68. doi: 10.1007/s12528-022-09338-x

Using chatbots to support student goal setting and social presence in fully online activities: learner engagement and perceptions

Khe Foon Hew 1, Weijiao Huang 1, Jiahui Du 1, Chengyuan Jia 2
PMCID: PMC9458299  PMID: 36101883

Abstract

Although fully online learning is now the ‘new normal’ in many parts of the world, its implementation is often beset by challenges such as a lack of student self-regulation and a sense of isolation. In this paper, we explored the use of chatbots to support student goal setting (Study 1) and social presence (Study 2) in online activities. In Study 1, participants in a fully online course were invited to complete a goal-setting activity with a goal-setting chatbot before attending class. The chatbot engaged participants with five questions developed from the SMART (specific, measurable, achievable, realistic, and timely) goal-setting framework. In Study 2, English-as-a-Foreign-Language (EFL) participants in a fully online course were asked to complete listening practices. The learning buddy chatbot was designed according to the social presence framework (interpersonal communication, open communication, cohesive communication) to guide students through the listening exercises. In both studies, we evaluated participants’ behavioral engagement by analyzing their conversation records with the chatbots, and we measured participants’ perceived usefulness and ease of use of the chatbots. We also gathered in-depth interview data concerning participants’ perceptions of interacting with the chatbots. Overall, we found positive learner experiences with both chatbots with regard to their perceived usefulness and perceived ease of use. We also provide suggestions for instructors who wish to apply chatbots in teaching and learning.

Keywords: Online learning, Chatbot, Goal-setting, Social presence, Higher education

Introduction

Although fully online learning has been in existence since the late 1980s, many educators have eschewed using it. Despite the advantages of fully online courses, such as convenience and cost benefits to students and economies of scale to institutions, fully online courses more often than not suffer from a lack of student engagement (Starr-Glass, 2020). Disengagement has been conceptualised as students not participating in, or withdrawing from, course activities (Chipchase et al., 2017), and dropout rates in online higher education remain a pressing and persistent problem (Xavier & Meneses, 2020). It is not surprising that, prior to the health crisis, many educators and students were not interested in adopting fully online courses in their curricula. As many as 91% of 11,141 faculty members from more than 130 U.S. institutions did not wish to teach in a fully online environment (Pomerantz & Brooks, 2017), and as many as 70% of more than 40,000 students across 118 American institutions preferred mostly or completely face-to-face learning (Gierdowski, 2019).

These reservations about fully online learning, however, changed when the pandemic struck, and fully online learning moved dramatically from the margins to the mainstream of education. Even when the current pandemic ends, there is a critical need for institutions to plan for possible future school closures due to new pandemics or other disasters. Fully online learning is no longer merely an option but a necessity for alleviating the disruptions caused by any future crisis. It is therefore important to explore ways to alleviate student disengagement in fully online learning.

Research on student disengagement in online courses has revealed several contributing factors, including a lack of student self-regulation (Barrot et al., 2021; Pedrotti & Nistor, 2019) and a sense of isolation (Barrot et al., 2021; Chametzky, 2021). In this study, we aimed to explore the innovative use of chatbots to enhance students’ online learning engagement in two exploratory case studies involving two fully online courses: an adult education course with 29 postgraduate students (Study 1) and an EFL listening course with 38 second-year undergraduate students (Study 2). Specifically, we investigated the effects of using a goal-setting chatbot to facilitate student goal setting (Study 1) and a learning buddy chatbot to project social presence (Study 2) in online activities. Since Study 1 and Study 2 addressed the same problem (i.e., students’ disengagement in online learning) using the same instructional tool (i.e., a chatbot), we report them together in this article.

We designed rule-based chatbots in the two case studies because we wanted to encourage replication by educators, rather than AI programmers. We argue that educators, not programmers, should take ownership of developing chatbots for their own classes. Rule-based chatbots recognize users’ queries based on a set of pre-defined rules, which makes this type of chatbot’s responses precise (Singh et al., 2019). Users can flexibly create new patterns to extend a rule-based chatbot’s knowledge base, or delete patterns to fix bugs; as more and more user data are stored in the knowledge base, a more intelligent rule-based chatbot is produced. To date, commercial industries have preferred rule-based chatbots because they achieve higher accuracy in responding to users’ inputs than self-learning chatbots, which learn from users’ inputs but are not yet well developed (Thorat & Jadhav, 2020).

We developed the goal-setting chatbot and the learning buddy chatbot using Google Dialogflow, a visual development tool that requires no prior computer programming knowledge. A useful chatbot system should always be available and provide instant responses in a natural way through conversational interfaces (Smutny & Schreiberova, 2020). Students in this study could easily access the two chatbots by clicking on the course page in the Moodle Learning Management System, without limitations of time, location, or device. After entering the chatbot interface, students could interact with the chatbot in the same way as messaging a friend.

This study offers the following contributions. First, we explored the use of chatbots in two fully online learning activities to foster students’ online engagement. As the COVID-19 pandemic persists around the world, many institutions continue to adopt fully online learning as their main teaching approach; this study provides practical knowledge and empirical evidence that widen teachers’ choice of appropriate instructional tools for enhancing students’ online engagement. Second, one novelty of this study is that the educational chatbots were designed on the basis of theoretical frameworks, rather than from a merely technological perspective. We employed two theoretical frameworks (i.e., the SMART goal-setting framework and the social presence framework) to drive the design of the chatbot–student conversations. Despite the popularity of chatbots in education, the lack of teaching and learning objectives in the design of educational chatbots remains a problem: chatbot development has so far been driven primarily by technology rather than by a clear pedagogical focus on supporting student outcomes (Wollny et al., 2021). In this study, we focused on promoting students’ online engagement and therefore employed chatbots as interactive instructional tools that help students set learning goals and facilitate social presence during their online learning. Third, we provide a thick description of the development of the two chatbots to encourage replication by other researchers and educators who have no computer programming experience.

The following research questions (RQ) guided this study:

Research question 1 What are the effects of the goal-setting chatbot and the learning buddy chatbot on students’ behavioral engagement in online learning?

Research question 2 What are students’ perceived usefulness and ease of use of the two chatbots?

Research question 3 What are students’ perceptions of the goal-setting process and the social presence supported by the chatbots?

Research question 4 What are students' suggestions for improving the design of educational chatbots?

Theoretical background

Self-regulated learning: importance of goal setting

During online learning activities, students are often given more autonomy over their learning process. The “when” and “where” of working on the online content is often up to the students themselves to decide, which significantly increases the level of self-regulation required of them (Hartley & Bendixen, 2001). According to Zimmerman (2013), self-regulation theory comprises three phases (forethought, performance, self-reflection), each representing a process that occurs before, during, or after the learning effort. The forethought phase is the stage where students prepare for learning; goal setting, a key element of this phase, refers to students deciding on the intended outcomes of a learning effort (Schwarzenberg & Navon, 2020). The performance phase refers to the use of strategies, such as self-monitoring, to keep track of one’s performance during learning. The self-reflection phase refers to processes that occur after learning, in which students reflect on their learning experience and outcomes; these self-reflections can in turn influence subsequent forethought processes (Zimmerman, 2013).

Self-regulated students are more likely to perform well in their learning (Lachmann & Kiefel, 2012). However, many online students lack self-regulation skills and are not ready to conduct autonomous learning (Wong et al., 2021), which can cause them to feel disengaged in online activities. Students who lack self-regulation skills tend to be unsuccessful in online learning (Yilmaz & Keser, 2017).

Recent investigations of self-regulation strategies in online learning settings have stressed the key role of goal setting (Pedrotti & Nistor, 2019). As Latham and Locke (1991) argued in a seminal paper, “goal setting facilitates self-regulation in that the goal defines for the person what constitutes an acceptable level of performance” (p. 234). A self-regulated learner first sets specific learning goals at the beginning of the learning experience. In the goal-setting process, learners define the outcomes of their learning and propose how they plan to reach their goals (Nussbaumer et al., 2011). Actions that fall short of the desired goal level result in a negative performance evaluation, while actions that attain or exceed the desired goal lead to a positive performance evaluation (Latham & Locke, 1991).

In this study we focused on the goal-setting process because clear goals are essential for student self-regulation and for successful course completion (Handoko et al., 2019). Many students with lower self-regulation skills tend to set ambitious and unrealistic goals, which may disappoint them after learning (Shih et al., 2010). To help students set goals, previous studies have employed a variety of means, such as prompts embedded in lecture videos (Moos & Bonde, 2016; van Alten et al., 2020; Wong et al., 2021). Students were required to think about a prompt (e.g., “What are your goals when learning from this homework video?”) and answer the question in order to continue the video (e.g., van Alten et al., 2020).

Although prompts displayed as questions can help students formulate their learning goals, some of these questions can be quite vague. For example, a question such as “What are your goals when learning from this video?” may not help students set goals that are measurable and achievable. In this study, we therefore chose the SMART goal framework (Doran, 1981), which contains five key elements of writing goals: specific, measurable, achievable or assignable, realistic, and time-related. The SMART method is commonly considered the standard for developing effective, measurable goals and is widely used for developing program goals and objectives (Bjerke & Renger, 2017). In this study, the goal-setting chatbot engaged participants at the start of the course with five questions developed from the SMART method (see the Study 1 section for more detail).

Sense of isolation: facilitating social presence

Besides lacking self-regulation skills, online students also feel a sense of isolation (Bączek et al., 2021; Chametzky, 2021) due to the lack of interaction among participants in the online space (Wut & Xu, 2021). For example, a student may pose a question online, or ask for help regarding a lesson activity, but fail to get a response from other people. Even though recent video-conferencing technologies such as Zoom enable online conversation that mimics real-time face-to-face conversation, 33% of 400 respondents stated that they were less willing to respond to questions during Zoom sessions than in traditional in-person lessons (Cavinato et al., 2021).

One way to alleviate isolation during online activities is to facilitate a sense of social presence, since a heightened level of social presence can increase the frequency of online interaction (Tu & McIsaac, 2002). Social presence can be defined as “the ability of participants to project themselves socially and emotionally, as real people” (Garrison et al., 1999, p. 94). Previous studies have suggested that social presence can reduce stress and the sense of loneliness (Whiteside et al., 2014) and enhance student satisfaction with a course (Richardson et al., 2017). It is important to note that social presence does not mean supporting engagement simply for social purposes (Garrison, 2011). Instead, social presence in an educational setting means creating a climate that supports open and cohesive communication, so that participants feel comfortable asking questions without fear of hurting somebody’s feelings or damaging a relationship (Garrison, 2011).

In the present study, we explored the use of a chatbot to facilitate a sense of social presence. According to social response theory (Nass & Moon, 2000), human–computer interactions are fundamentally social: humans have an innate tendency to perceive computers as social beings, even when they know that machines have no feelings or intentions. By interacting with an anthropomorphized machine, a user may perceive a sense of social presence (Adam et al., 2021).

Unlike a human being, a chatbot is available online 24/7 to participants who wish to converse with it. To design the chatbot, we referred to the specific categories of social presence described by Garrison (2011): interpersonal communication, open communication, and cohesive communication. Interpersonal communication refers to messages that create a sense of belonging with other people, such as affective expressions (e.g., emoticons, emojis). Open communication refers to messages that explicitly recognize other people by quoting from other participants’ messages, agreeing with them, or expressing appreciation. Cohesive communication aims to build a sense of community through phatic messages (commonly known as small talk, e.g., “Nice morning today!”), vocatives (e.g., addressing participants by name), salutations, and inclusive pronouns such as “we”. This study thus examined and tested the usefulness of the interpersonal, open, and cohesive strategies in the context of chatbot communication to induce social presence.

Overview of chatbots

A chatbot is a software tool that can interact with users by means of text or voice using natural language (Smutny & Schreiberova, 2020). The first chatbot, ELIZA, developed in 1966, simulated a psychotherapist’s conversation with humans, where users could communicate with ELIZA by entering text inputs, and the chatbot would respond in kind (Weizenbaum, 1966). Since then, voice inputs and responses have become possible modes of user-chatbot interaction (e.g., Apple Siri), although the majority of chatbots today still utilize text-based communication without physical or dynamic representations (Adam et al., 2021).

In recent years, the use of chatbots has become widespread in many sectors, such as retail customer service and internet banking (Følstad & Brandtzæg, 2017). Many organizations, for example, have deployed chatbots on social media platforms such as Facebook Messenger, WhatsApp, and WeChat to answer customers’ questions at any time of the day (Insider Intelligence, 2021). Insider Intelligence forecasts that consumer retail spending via chatbots will reach $142 billion by 2024, up from just $2.8 billion in 2019. Chatbots have also made inroads into education, where they have been used for student skill improvement, particularly in language learning such as vocabulary and grammar development (Wollny et al., 2021). For example, the Wordsworth chatbot provides users with quiz questions to test their vocabulary skills (Smutny & Schreiberova, 2020).

Figure 1 shows an instructional design framework for chatbot activities. Basically, a chatbot runs four working procedures during a human–chatbot conversation: (a) Question Analysis, where the chatbot uses natural language processing techniques to classify the user’s input by breaking a sentence into several parts, labelling each part, and then understanding the sentence; (b) Hypothesis Generation, where the chatbot searches for possible response content in knowledge bases that can be designed in advance for different topics; (c) Hypothesis Scoring, where the chatbot, using ranking algorithms and reasoning capabilities, scores the consistency between each hypothesis and the input; and (d) Ranking and Confidence Estimation, where the chatbot ranks the best-matched hypotheses as correct answers and applies machine learning to train classifiers on known accurate responses, so that the chatbot can process similar inputs from more users in the future.

Fig. 1. Instructional design of chatbot activities
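To make these four procedures concrete, the following minimal Python sketch implements a toy version of the pipeline. It is our own illustration rather than the architecture of any particular platform: the knowledge base, intent names, and keyword-overlap scoring are invented for demonstration, standing in for real NLP, retrieval, and ranking models.

```python
# Toy sketch of the four procedures: (a) analysis, (b) generation,
# (c) scoring, (d) ranking with a confidence threshold.
from dataclasses import dataclass

@dataclass
class Hypothesis:
    intent: str
    response: str
    score: float

# (b) A hand-built knowledge base: each intent maps keywords to a response.
KNOWLEDGE_BASE = {
    "define_experiential_learning": {
        "keywords": {"experiential", "learning", "definition", "explain"},
        "response": "Experiential learning is learning through reflection on doing.",
    },
    "assignment_deadline": {
        "keywords": {"assignment", "due", "deadline", "when"},
        "response": "Please check the Moodle course page for assignment deadlines.",
    },
}

def analyze_question(user_input: str) -> set[str]:
    # (a) Question Analysis: break the sentence into labelled parts.
    # Here we simply lowercase and tokenize; a real system would use NLP.
    return {token.strip("?.,!") for token in user_input.lower().split()}

def generate_and_score(tokens: set[str]) -> list[Hypothesis]:
    # (b) Hypothesis Generation and (c) Hypothesis Scoring: retrieve candidate
    # responses and score their consistency with the input (keyword overlap).
    return [
        Hypothesis(intent, data["response"],
                   len(tokens & data["keywords"]) / len(data["keywords"]))
        for intent, data in KNOWLEDGE_BASE.items()
    ]

def respond(user_input: str, threshold: float = 0.25) -> str:
    # (d) Ranking and Confidence Estimation: pick the best-scoring hypothesis,
    # falling back to a clarification prompt when confidence is too low.
    best = max(generate_and_score(analyze_question(user_input)),
               key=lambda h: h.score)
    return best.response if best.score >= threshold else "Sorry, could you rephrase that?"

print(respond("What is experiential learning?"))
print(respond("When is the assignment due?"))
```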

In general, chatbots can be developed either with programming languages such as Python or with a chatbot development platform (Nayyar, 2019). In the latter case, a developer can use a visual flow builder to create a chatbot without any coding. Examples of common chatbot visual flow builders include the IBM Watson Assistant system and Google Dialogflow, both AI-powered systems that use machine learning and natural language processing to facilitate the development of chatbots (Zuckerman, 2020).

In this study, we used the Google Dialogflow platform to build our chatbot activities. A typical chatbot dialog in Dialogflow consists of three components: intents, entities, and responses. Intents refer to users’ possible questions or responses. Entities are keywords or synonyms that help the chatbot recognize variations in a user’s wording. For example, when a user asks “What is experiential learning?” or “Can you explain the definition of experiential learning?”, both sentences are recognized as the same question asking for the definition of experiential learning, by resolving the intent definition and the entity experiential_learning. A response is the answer the chatbot provides.
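The sketch below illustrates how the two phrasings above can resolve to the same intent/entity pair. The dictionaries and substring matching are hypothetical stand-ins for what Dialogflow does internally, written only to clarify the roles of the three components.

```python
# Hypothetical illustration of intents, entities, and responses; the names
# mirror Dialogflow concepts but the matching logic is our own toy version.
ENTITIES = {  # entity value -> synonyms that signal it
    "experiential_learning": ["experiential learning", "learning by doing"],
}

INTENTS = {   # intent name -> trigger phrases
    "definition": ["what is", "explain the definition of", "define"],
}

RESPONSES = { # (intent, entity) -> the chatbot's answer
    ("definition", "experiential_learning"):
        "Experiential learning is the process of learning through experience.",
}

def match(user_input: str) -> str:
    text = user_input.lower()
    intent = next((name for name, triggers in INTENTS.items()
                   if any(t in text for t in triggers)), None)
    entity = next((name for name, synonyms in ENTITIES.items()
                   if any(s in text for s in synonyms)), None)
    return RESPONSES.get((intent, entity), "Sorry, I didn't get that.")

# Both phrasings hit the same (definition, experiential_learning) pair:
print(match("What is experiential learning?"))
print(match("Can you explain the definition of experiential learning?"))
```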

A developer can test the accuracy of a chatbot by inputting as many questions as possible, and the chatbot can be trained whenever its output is wrong: the developer manually classifies the confused inputs into the correct intents, and the chatbot knowledge base is updated and enriched accordingly. As more and more user data are stored in the knowledge base, a more intelligent chatbot is produced.
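A toy illustration of this retraining loop is sketched below, assuming a simple phrase-set knowledge base (the intent names and phrases are invented): a confused input is manually assigned to the correct intent, after which the chatbot recognizes the same wording.

```python
# Toy retraining loop: relabel a misrouted input to enrich the knowledge base.
knowledge_base = {
    "assignment_deadline": {"when is the assignment due"},
    "greeting": {"hello", "hi there"},
}

def classify(text):
    text = text.lower().strip("?!. ")
    for intent, phrases in knowledge_base.items():
        if text in phrases:
            return intent
    return None  # the chatbot is "confused" by this input

def correct_and_retrain(text, true_intent):
    # The developer manually labels the confused input; the knowledge base
    # is updated so the chatbot handles this phrasing in the future.
    knowledge_base[true_intent].add(text.lower().strip("?!. "))

query = "When should I hand in the essays?"
assert classify(query) is None                    # wrong output: no match
correct_and_retrain(query, "assignment_deadline")  # developer relabels it
assert classify(query) == "assignment_deadline"    # now recognized
print("Knowledge base enriched:", knowledge_base["assignment_deadline"])
```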

The design and development of two chatbots

Study 1: the goal-setting chatbot

To recall, we focused on the goal-setting process in this study because clear goals are essential for student self-regulation and successful course completion (Handoko et al., 2019). The goal-setting chatbot helped students set personal learning goals for the course they attended. Students were required to interact with the chatbot on the Moodle webpage before coming to the first class, and the chatbot engaged them with five goal-setting questions based on the SMART framework (Table 1). We used the Google Dialogflow platform to build the chatbot; according to Azran (2019), Dialogflow provides the easiest and quickest way to create a custom conversational bot.

Table 1. Goal-setting chatbot design based on the SMART framework

SMART rule | Question in chatbot | Choice example | Recommendation example
Specific | Why do you take this course? | I’ll teach adults | Good to know! Keep the adult learning strategies introduced in this course in mind and reflect them on your own experience
Measurable | May I know what grade level you want to get in this course? | Grade A, B, or C | Please refer to the recommendations and seek extra help whenever you feel unconfident
Attainable | Can you foresee any difficulties you may have while pursuing your learning goal in this course? | I might have difficulties in understanding or applying the adult learning strategies appropriately in my own study | Try to engage more in the pre- and in-class activities/cases. Those stories and scenarios will help you better understand the concepts and relate them to real-life situations
Relevant | Could you tell me what you want to gain most from this course? | Knowledge of adult learning strategies and how to apply them in my own study | Then I will advise you to look for more supplemental readings on the strategies that you find appropriate for your learning
Timely | May I know when you plan to complete all the six written assignments? | I might not be able to complete it in time | Then I would like you to learn about all the assignments and plan ahead. You will be able to complete them!

For each question, potential options were offered to suggest directions for students’ self-regulated learning goals and expectations of the course. Options can boost a chatbot’s engagement with a user because they reduce the potential for misunderstandings between the chatbot and the user (LiveChat, 2022). Additionally, providing students with options to respond generally yields a higher response rate than open-ended questions (Reja et al., 2003). The options listed by the chatbot were developed in consultation with the course teacher, who had in-depth knowledge of the general expectations and concerns of students from many years of teaching the course.

For example, before students attended the first lesson, the chatbot asked, “Could you tell me what you want to gain most from this course?”, followed by three options (Fig. 2) representing three possible learning goals. All students were given the opportunity to express their expectations by choosing among the options as their learning goals. The student in Fig. 2 replied “I may choose 2nd one”, meaning that the student chose option B. Based on each student’s answer to each question, the goal-setting chatbot provided recommendations suited to that student’s preference; in Fig. 2, the chatbot responded with “I see … I believe you can do better than this!” According to students’ responses in the interviews and open-ended surveys, none of the students mentioned feeling lost or overwhelmed; instead, many indicated that the activity helped them clarify their learning goals, and they hoped to receive more recommendations during the process. Together with the recommendations after each choice, students could form a better idea of how to achieve the particular goals they set for the course.

Fig. 2. The dialogue between the goal-setting chatbot and students

The instructor of this course participated in the design and development of the goal-setting chatbot. Aligned with the course learning outcomes, the instructor worked out the goal-setting questions and the corresponding self-regulated learning recommendations based on the SMART framework. For example, for “Measurable”, students were expected to set learning goals that could be measured by evidence, so the instructor drafted the question “May I know what grade level you want to get in this course?” and possible answers such as “Grade A”, “Grade B”, and “Grade C”. The chatbot developer then organized these data into intents, entities, and responses in the Dialogflow system. The intents held pre-defined examples of students’ possible inputs; for instance, an intent named “Grade A” contained all possible answers related to that keyword, such as “I would like to gain Grade A” and “I want to have an A”. The entities held synonyms and misspellings of the keywords for different intents; for example, we added synonyms such as “level A”, “A−”, and “A+” as entities for the “Grade A” intent. The recommendations for self-regulated learning strategies were fed into the response database.
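The sketch below shows, in simplified form, how such an intent could be assembled from training phrases, entity synonyms, and a stored recommendation. It is our own illustration rather than an actual Dialogflow export, and the naive substring matching stands in for Dialogflow’s natural language understanding; the recommendation text is likewise only an example.

```python
# Simplified "Grade A" intent: training phrases, entity synonyms, response.
TRAINING_PHRASES = {  # intent -> pre-defined examples of student inputs
    "Grade A": ["i would like to gain grade a", "i want to have an a"],
}

ENTITY_SYNONYMS = {   # entity value -> synonyms and common variants
    "Grade A": ["grade a", "level a", "a-", "a+"],
}

RESPONSES = {         # intent -> self-regulated learning recommendation
    "Grade A": ("Great! Please refer to the recommendations and seek extra "
                "help whenever you feel unconfident."),
}

def match_intent(student_input: str):
    text = student_input.lower()
    for intent in TRAINING_PHRASES:
        if text in TRAINING_PHRASES[intent]:                 # training phrase
            return intent
        if any(s in text for s in ENTITY_SYNONYMS[intent]):  # entity hit
            return intent
    return None

intent = match_intent("I want to have an A")
print(RESPONSES.get(intent, "Sorry, could you tell me your target grade?"))
```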

To minimise students’ off-topic replies, we labelled three potential options (A, B, or C) for each question posed. Figure 3 shows the flowchart for designing the goal-setting chatbot. The goal-setting chatbot was tested by the instructor and the developer before it was launched on Moodle (Fig. 4).

Fig. 3. The flowchart of the goal-setting chatbot design

Fig. 4. Goal-setting chatbot in the Moodle system

Study 2: the learning buddy chatbot

Out-of-class practice is essential for students’ improvement in foreign language learning because classroom time is insufficient (García Botero et al., 2019; Kennedy & Levy, 2009). However, students’ engagement in out-of-class practice is relatively low, because they need more opportunities to interact with the instructor and peers to alleviate the sense of isolation while learning (Jia & Hew, 2022). Social presence can be a solution, as it has been shown to reduce stress and the sense of loneliness (Whiteside et al., 2014).

In out-of-class online learning, students usually complete online learning activities (e.g., watching videos and reading materials) by themselves before having a synchronous online meeting with teachers and classmates (Jia et al., 2022). To facilitate a sense of social presence in out-of-class online learning, a social presence chatbot was created to act as a learning buddy that guided students through daily EFL listening practice. Students were provided with daily listening practices in the form of dictation exercises; a total of 10 sessions were administered, and each daily task was estimated to take 10 min. While students interacted with the chatbot on the Moodle course webpage, the chatbot guided them through the listening tasks, provided immediate feedback, and employed specific communication strategies to project social presence into its interaction with participants. We used Google Dialogflow to build the chatbot conversation. The chatbot was integrated into Moodle via embedded code and presented as an activity page; once students clicked that page, the learning buddy chatbot would pop up and start a conversation (Fig. 5).

Fig. 5. The dialogue between the learning buddy chatbot and a student in the Moodle system

The design of the learning buddy chatbot was underpinned by the specific communication strategies of social presence described by Garrison (2011) (Table 2). Interpersonal communication was conveyed with the help of emoticons, which can express respect and welcome to participants when physical and vocal cues are absent (Garrison, 2011). In terms of open communication, the learning buddy chatbot was able to reply to students’ inputs and guide students step by step through the tasks (continuing a thread).

Table 2. Learning buddy chatbot designed based on the social presence categories

Category | Indicator | Conversation example
Interpersonal communication | Affective expression by using emoticons | “Looking forward to learning with you during the following days!” [emoji]
Open communication | Continuing a thread | The learning buddy conversed with students step by step
Open communication | Quoting from others’ messages | “Here are the common mistakes that previous students encountered.”
Open communication | Expressing appreciation | “I really appreciate your hard work.”
Open communication | Expressing agreement | “You’re right!”
Cohesive communication | Using greetings, closures, vocatives | “Hi, <name>! Welcome to today’s practice.”
Cohesive communication | Using inclusive pronouns | “Let’s work hard together!”

Common mistakes reported by previous students were listed after students finished the tasks (quoting from others’ messages). EFL learners have long recognized listening as a challenging skill to acquire (Nushi & Orouji, 2020); worse still, they tend to blame themselves for their lack of listening ability (Cauldwell, 2018). Cauldwell (2018) pointed out that it is demoralizing for students to feel that “only me” has this problem while everyone else is doing well. Therefore, to reduce learners’ self-blame and feelings of inadequacy as listeners, it is important to share with students the mistakes and difficulties encountered by other students when practicing listening. The learning buddy chatbot also expressed appreciation throughout the learning process; for example, when students tried several times but still could not figure out the task, the chatbot acknowledged their hard work to protect their self-esteem, and once students gave correct answers, the chatbot promptly expressed agreement. To project cohesive communication, the learning buddy chatbot greeted students every day when they entered the system, addressed students by name during the conversation, and used inclusive pronouns (i.e., we, us, and our) to create a collaborative learning environment (Fig. 5).
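As a hedged sketch of how these strategies could be woven into the chatbot’s messages, the following Python fragment composes example utterances using the three categories from Table 2; the wording and thresholds are illustrative, not the study’s actual scripts.

```python
# Illustrative message templates mapped to Garrison's (2011) categories.
def greet(student_name: str) -> str:
    # Cohesive communication: a greeting plus a vocative (the student's name)
    # and an inclusive "let's"; the emoji adds interpersonal (affective) tone.
    return f"Hi, {student_name}! Welcome to today's practice. Let's work hard together! 🙂"

def feedback(correct: bool, attempts: int) -> str:
    # Open communication: agreement on success, appreciation after effort,
    # and quoting previous students' common mistakes to reduce self-blame.
    if correct:
        return "You're right! 🎉"
    if attempts >= 3:
        return ("I really appreciate your hard work. Here are the common "
                "mistakes that previous students encountered: ...")
    return "Not quite. Let's listen to that sentence once more."

print(greet("Mei"))
print(feedback(correct=False, attempts=3))
```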

Methods

Participants and context

Study 1 involved 29 postgraduate students (26 females, 3 males) enrolled in a fully online course at a large public university in Asia. Study 2 involved 38 second-year undergraduate students (29 females, 9 males) in a fully online EFL listening course, also at a public university in Asia. Ethical approval to conduct the two studies was granted by the university’s Institutional Review Board. We used purposeful sampling to select the courses in Study 1 and Study 2; purposeful sampling is widely used in qualitative research to identify information-rich cases related to the phenomenon of interest (Palinkas et al., 2015), which here was the use of chatbots in fully online classes. We chose these fully online classes because the instructors of the two courses were interested in using chatbots in their lessons, and the researchers worked closely with both instructors to develop the chatbots.

Data collection and analysis

To address the first research question, “what are the effects of the goal-setting chatbot and the learning buddy chatbot on students’ behavioral engagement in online learning”, we evaluated students’ behavioral engagement by measuring their conversation records with the two chatbots (utterance turns, session length, goal completion rate). An utterance turn is one back-and-forth exchange between a chatbot and a user (Yao, 2016). Session length is the amount of time that elapses between the moment a user starts to converse with a chatbot and the moment the conversation ends (Mead, 2019). The goal completion rate in this study is the proportion of cases in which the chatbot succeeded in helping students complete the learning tasks.
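The small sketch below shows how these three measures could be computed from conversation records; the log format (timestamped speaker-labelled messages plus a completion flag) is an assumption of ours, not the study’s actual data schema.

```python
# Computing utterance turns, session length, and goal completion rate
# from a hypothetical conversation log.
from datetime import datetime

session = {
    "messages": [
        ("2022-03-01 09:00:05", "bot",  "Hi! Ready for today's practice?"),
        ("2022-03-01 09:00:30", "user", "Yes, let's start."),
        ("2022-03-01 09:03:50", "bot",  "Well done, task complete!"),
    ],
    "goal_completed": True,
}

def utterance_turns(messages) -> int:
    # One turn = one back-and-forth exchange; here we count speaker changes.
    speakers = [speaker for _, speaker, _ in messages]
    return sum(1 for a, b in zip(speakers, speakers[1:]) if a != b)

def session_length_minutes(messages) -> float:
    fmt = "%Y-%m-%d %H:%M:%S"
    start = datetime.strptime(messages[0][0], fmt)
    end = datetime.strptime(messages[-1][0], fmt)
    return (end - start).total_seconds() / 60

def goal_completion_rate(sessions) -> float:
    return sum(s["goal_completed"] for s in sessions) / len(sessions)

print(utterance_turns(session["messages"]))                   # 2
print(round(session_length_minutes(session["messages"]), 2))  # 3.75
print(goal_completion_rate([session]))                        # 1.0
```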

To evaluate students’ perceived usefulness and ease of use of the chatbots, a five-point scale questionnaire, ranging from 1 (strongly disagree) to 5 (strongly agree), was used (adapted from Davis, 1989, p. 340). Perceived usefulness is “the degree to which a person believes that using a particular system would enhance his or her job performance” (Davis, 1989, p. 320). There were four items for perceived usefulness; the sample items were “Using the chatbot made it easier to complete my goal setting process” (Study 1) and “Using the chatbot made it easier to complete my daily listening practices” (Study 2). Perceived ease of use can be defined as “the degree to which a person believes that using a particular system would be free of effort” (Davis, 1989, p. 320). The perceived ease-of-use scale also included four items; a sample item was “I found it easy to use the chatbot to communicate.”
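The internal consistency figures reported below (Cronbach’s alpha) follow the standard formula alpha = k/(k-1) * (1 - sum of item variances / variance of total scores). The sketch below computes it for a 4-item, 5-point scale; the ratings are made up purely for illustration.

```python
# Cronbach's alpha for a k-item scale; item_scores holds one list of
# respondents' ratings per item.
def cronbach_alpha(item_scores: list[list[float]]) -> float:
    k = len(item_scores)
    n = len(item_scores[0])

    def variance(xs):
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

    item_var = sum(variance(item) for item in item_scores)
    totals = [sum(item[i] for item in item_scores) for i in range(n)]
    return k / (k - 1) * (1 - item_var / variance(totals))

# Four items rated by five hypothetical respondents on a 5-point scale:
scores = [
    [4, 5, 3, 4, 5],
    [4, 4, 3, 5, 5],
    [3, 5, 3, 4, 4],
    [4, 5, 4, 4, 5],
]
print(round(cronbach_alpha(scores), 3))  # ≈ 0.863 for these toy ratings
```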

To answer the third and fourth research questions, regarding students’ perceptions and suggestions for improving the chatbots, individual interviews and open-ended surveys were used, respectively. We conducted individual interviews to explore the role students perceived the chatbot to have played in assisting them to set learning goals (Study 1) and to complete the EFL listening practices (Study 2). Examples of the interview questions were “How did the chatbot help you to set your learning goals for this course?” (Study 1) and “In what aspects did the chatbot help to conduct social interaction with you?” (Study 2). Open-ended surveys were conducted to obtain students’ suggestions for improving the chatbots; a sample question was “Do you have any suggestions for the future design of the chatbot activity?”

Results

Study 1: the goal-setting chatbot

Behavioral engagement

Students (n = 29) set their learning goals with the assistance of the goal-setting chatbot before coming to the first class. The results of students’ behavioral engagement with the chatbot activity are shown in Table 3. Students interacted with the chatbot for an average of about 8 utterance turns (M = 7.97, SD = 1.35), and the student–chatbot conversations averaged 4 min (SD = 2.65). The conversation records revealed that almost half of the students (n = 14) completed the goal-setting activity within 7 turns, followed by 8 students who finished within 8 turns; two students took 10 turns and one took 13. After finishing the goal-setting task, some students continued to talk with the chatbot about other topics, asking, for example, “Can you speak more words?” and “When is the assignment due?” The goal completion rate was 100%.

Table 3. Utterance turns and duration for the goal-setting chatbot

Measure | N | Minimum | Maximum | Mean | SD
Utterance turns | 29 | 7.00 | 13.00 | 7.97 | 1.35
Session length (min) | 29 | 1.00 | 11.00 | 4.00 | 2.65

Perceived usefulness and ease of use

Sixteen students completed the questionnaire (Table 4). The usefulness scale showed a high level of internal consistency (Cronbach’s alpha = 0.859), and the Cronbach’s alpha for the perceived ease-of-use items was 0.742. The average mean of students’ perceived ease of use (M = 3.91, SD = 0.50) was slightly higher than that of the perceived usefulness of the goal-setting chatbot (M = 3.89, SD = 0.59). Students reported that the goal-setting chatbot was easy to communicate with and that, during the interaction, it behaved in expected ways to help them set personal goals.

Table 4. Perceived usefulness and ease of use of the goal-setting chatbot

Scale | Item | N | Mean (SD)
Usefulness | 1. Using the chatbot enabled me to set my learning goals | 16 | 3.94 (.77)
Usefulness | 2. Using the chatbot made it easier to complete my goal setting process | 16 | 4.00 (.73)
Usefulness | 3. The chatbot enhanced my effectiveness in setting my learning goals | 16 | 3.69 (.87)
Usefulness | 4. Overall, I found the chatbot was useful in my learning | 16 | 3.94 (.68)
Ease of use | 1. I found it easy to use the chatbot to communicate | 16 | 4.19 (.54)
Ease of use | 2. The chatbot often behaves in expected ways | 16 | 4.00 (.89)
Ease of use | 3. I found it easy to recover from errors encountered while using the chatbot | 16 | 3.19 (.66)
Ease of use | 4. Overall, I found the chatbot easy to use | 16 | 4.25 (.68)

Students’ perceptions of goal setting supported by chatbot

Altogether, 18 students voluntarily participated in our interviews, and they were asked to describe the ways in which they believed the chatbot had helped their goal-setting process. We employed a grounded approach to analyze the interview data, from which students’ responses were categorized into themes inductively. One researcher analyzed all the students’ data and generated three main themes. To ensure the reliability of the data analysis, a second researcher independently coded a random 25% of the interview data; the inter-coder agreement was 90%, and disagreements were resolved through discussion between the two researchers. Three main themes emerged from the study (Table 5).

a) Clarify the learning goals

Table 5. Summary of interview data

Theme | Number of students | Main ideas
Clarify the learning goals | 7 | Get an overview of the course; clarify and visualize the rough goal in mind; points out a clear direction for future learning
Techniques of setting goals | 3 | Learn more about how to use goal-setting strategies; applicable for future practice
Raise awareness | 6 | Become aware of setting goals before learning; inspired to think about expectations and plans

Students found the conversation with the chatbot useful in that it helped them clarify their learning goals. Their perceptions can be interpreted in two ways. First, some students were not sure what the course would be like when they first came; in this case, setting specific and relevant goals can be extremely difficult because students lack information about the learning process ahead. The guiding questions posed by the chatbot were displayed as multiple-choice questions, which gave students hints about the possible paths they might follow toward completing the learning activities. For example, the “Relevant” question asked students to indicate what they wanted to gain most from the course. Considering that students can barely describe what they want without knowing what the course could offer, we provided different options for students to choose from, so that they could perceive potential directions for their future learning in the course:

“I think it's an engaging experience. I enjoy doing it. The chatbot did well in supporting students and pointing the clear direction for their learning.” (Student O)

Second, some students indicated that they had rough goals in mind, but those goals were not clear enough to act on. In this case, the goal-setting chatbot enabled students to specify their plans and expectations by answering the five guiding questions based on the SMART framework, which allowed them to clearly identify and express their learning goals. More specifically, students could directly choose the options listed by the chatbot as their goals, or type their own answers. As mentioned earlier, students seldom expressed their learning goals clearly before attending the first lesson and communicating with teachers; the provided options could direct them toward setting specific learning goals.

“This activity helps me clarify my goals and plans. Sometimes it seems like I had a goal for my learning, but not very clear. So, the goal-setting chatbot activity helps me specify and makes it visible.” (Student H)

b) Techniques of setting goals

As mentioned in the previous sections, the guiding questions in the chatbot were designed based on the SMART framework (Doran, 1981). Instead of vaguely asking students to set goals at the beginning of the course, our guiding questions followed the SMART framework with the aim of guiding students through a principled way of setting effective goals. Additionally, students were provided with a link to read more about the SMART framework at the end of the conversation. This conceptual and practical interaction with the chatbot enriched students’ goal-setting experience; some students indicated that the goal-setting techniques they learned in this activity could shed light on both the current course and their future learning.

“It gives me the idea of how to use these strategies, for example, I need to set a goal before I start to learn, and I need to know what to do for each week, how to learn. And I think I can apply these strategies in my future learning.” (Student E)

c) Raise awareness

Self-regulated learning (SRL) is easily overlooked because it mainly concerns learning strategies rather than learning content. As a result, many students are not aware of the importance of adopting different SRL strategies in different learning phases. This chatbot activity aimed to highlight the importance of setting effective learning goals, a step that many students skip in their actual learning process. After completing the chatbot activity, many students indicated that they had become aware of setting goals and of aiming to achieve them:

“After doing this activity, I became aware of setting goals and monitoring my progress during the learning process.” (Student F)

Some students also mentioned that, since they had not thought about their learning goals for this course, the chatbot’s questions inspired them to think about their goals and plans for the course:

“The conversations and questions raised by the chatbot inspire me to think. For students like me who might not have any goals or expectations toward the course when I first come, it provides a good chance for me to think about what I really want.” (Student J)

Students’ suggestions to improve the chatbot

An open-ended survey was conducted to explore students’ suggestions for improving the goal-setting chatbot. We analyzed students’ responses using a grounded approach and identified four directions for revising the chatbot activity: (a) richer recommendations, (b) more intelligence, (c) more interactive functions, and (d) long-term use. First, a total of 7 students suggested that the chatbot could provide more “customized recommendation” (Student O). In the present study, we pre-defined three options (i.e., A, B, and C) for each goal-setting question to help students label their inputs, but “more options” (Student A and Student B) were expected. Second, students mentioned that the goal-setting chatbot could be more intelligent by “answering faster” (Student J) and “chatting like Siri” (Student C). Third, more interactive functions could be added to the conversation; Student K mentioned that “emojis in the sentences” could help enhance the interaction between students and the chatbot. Lastly, two students expressed a willingness to continue using the goal-setting chatbot for more sessions; Student E expected that the chatbot “can help students setting a timetable and remind students about each assignment they should do in Moodle”.

Study 2: the learning buddy chatbot

Behavioral engagement

Students (n = 38) completed listening practices with the learning buddy chatbot during their out-of-class learning. Table 6 shows the results of students’ behavioral engagement with the chatbot activity. During the 2-week intervention, students interacted with the chatbot for an average of 17 utterance turns per day (M = 17.06, SD = 6.58), and the average conversation lasted about 7 min (M = 6.84, SD = 6.08). The conversation records indicated that students who interacted with the chatbot for about 9 turns completed the one compulsory listening activity (i.e., a multiple-choice question), while students who reached 30 utterance turns went on to participate in the optional activities (i.e., fill-in-the-blank and whole-sentence dictation). The goal completion rate was 92%.

Table 6. Utterance turns and duration for the learning buddy chatbot

Measure | N | Minimum | Maximum | Mean | SD
Utterance turns | 38 | 9.00 | 30.00 | 17.06 | 6.58
Session length (min) | 38 | 1.00 | 34.00 | 6.84 | 6.08

Perceived usefulness and ease of use

All participants in Study 2 (n = 38) completed the questionnaire (Table 7). The Cronbach’s alphas for students’ perceived usefulness and ease of use of the learning buddy chatbot were 0.874 and 0.873, respectively. The average mean of students’ perceived usefulness (M = 4.15, SD = 0.70) was higher than that of perceived ease of use (M = 3.68, SD = 1.19). Students reported that the learning buddy chatbot enabled them to learn the EFL listening materials, helped them comprehend the listening practices, and was easy to communicate with. The second item of the perceived ease-of-use scale showed the lowest average mean (M = 3.15, SD = 1.11); the likely reason students reported was an unstable internet connection, which delayed the chatbot’s feedback. However, when technical errors occurred, students perceived that the chatbot could still recover easily.

Table 7. Perceived usefulness and ease of use of the learning buddy chatbot

Scale | Item | N | Mean (SD)
Usefulness | 1. Using the chatbot enabled me to learn daily listening materials | 38 | 4.21 (.73)
Usefulness | 2. Using the chatbot made it easier to complete my daily listening practices | 38 | 4.12 (.88)
Usefulness | 3. The chatbot enhanced my effectiveness in learning English listening | 38 | 4.00 (.95)
Usefulness | 4. Overall, I found the chatbot was useful in my English listening learning | 38 | 4.27 (.75)
Ease of use | 1. I found it easy to use the chatbot to communicate | 38 | 4.12 (1.10)
Ease of use | 2. The chatbot often behaves in expected ways | 38 | 3.15 (1.11)
Ease of use | 3. I found it easy to recover from errors encountered while using the chatbot | 38 | 3.59 (1.08)
Ease of use | 4. Overall, I found the chatbot easy to use | 38 | 3.88 (1.10)

Students’ perception of social presence supported by chatbot

We conducted individual interviews with the students to explore their perceptions of social presence while learning with the learning buddy chatbot in the EFL listening practices. Altogether, 11 students voluntarily participated in our interviews, and they were asked about their perceptions while interacting with the chatbot. We used Garrison’s (2011) social presence framework to construct our coding scheme, in which the indicators of social presence are grouped into three broad categories: interpersonal communication, open communication, and cohesive communication (see Table 2). After one researcher coded all the students’ answers, a second researcher randomly selected and coded 25% of the data to ensure the reliability of the analysis; interrater agreement was high at 96%. The results are presented in Table 8.

a) Interpersonal communication

Table 8. Students’ perception of social presence in the chatbot activity

Category | Main idea | Number of students
Interpersonal communication | Supported via the use of emojis | 9
Open communication | Fostered by presenting common challenges (quoting from others’ messages), continuing the conversation by providing feedback, and expressing appreciation, compliments, and encouragement | 7
Cohesive communication | Facilitated by referring to students by name | 2

The most commonly reported category of social presence among the participants was interpersonal communication (n = 9). All nine students liked the emojis sent by the chatbot because they helped foster a higher level of affective expression. According to the interviewed students, the use of emojis “made the conversation more vivid” (Student 3), increased the intimacy between students and the chatbot (e.g., Student 2), and gave students a feeling “as if they were learning with a real buddy” (Student 4). However, one student (Student 1) mentioned that this positive feeling about the emojis faded in the latter half of the intervention as the novelty wore off.

b) Open communication

Open communication was another commonly cited category of social presence (n = 7). Students reported that reading other students’ messages about learning difficulties relieved their stress. In the chatbot-supported listening practices, the chatbot presented anonymous learning journals quoting previous students’ messages, in which those students had recorded their difficulties and frustration while doing the practices. The participants in the study found that “reading others’ difficulties helped reduce their pressure, because they [I] learned to know they were [I am] not alone as a frustrating learner” (Student 5).

In addition, students indicated that they liked learning through conversation because the chatbot’s feedback guided them through the practices step by step. For example, Student 5 replied:

“I preferred learning with the chatbot to the conventional learning approach, because I enjoyed learning through conversation and interaction. It gave me timely feedback at every step of learning, rather than only receiving the feedback in the end conventionally”.

Words of appreciation, compliment, and encouragement from the chatbot were also reported as enjoyable and helpful. Student 3’s answer was typical:

“When the chatbot congratulated me on my performance, I felt a sense of accomplishment. It was not easy to persist in the practice, so I found this compliment helpful.”

c) Cohesive communication

Two students mentioned that the chatbot’s use of their names in the conversation increased their sense of intimacy with it, which contributed to building cohesive communication. For example, “referring my names made me feel closer to the bot” (Student 5) and made the interaction “more vivid and interesting” (Student 10).

Students’ suggestions to improve the chatbot

An open-ended survey was administered to the students to explore their suggestions for improving the learning buddy chatbot. We used a grounded approach to analyze students’ responses and identified four major themes concerning suggestions for chatbot improvement: (a) more fluency, (b) richer content, (c) less repetition, and (d) more intelligence, as shown in Table 9.

Table 9. Students’ suggestions for chatbot improvement in the EFL listening practices

Theme | Explanation | Number of students
More fluency | Expect a more fluent experience when interacting with the chatbot | 9
Richer content | Prefer more varied responses from the chatbot | 5
Less repetition | The chatbot’s guidance and responses were somewhat repetitive and wordy and could be more concise | 5
More intelligence | Hope the chatbot will be more intelligent in understanding answers and offering personalized feedback | 3

By and large, the most commonly mentioned theme was the need for a more fluent experience when interacting with the learning buddy chatbot (n = 9). For example, Student 28 reported that “sometimes the chatbot was lagging”, which affected his learning experience; the delayed interaction was possibly caused by the unstable internet connection and the loading of the videos the chatbot sent. Richer content was another frequently listed suggestion: five students reported that they “expected more various responses from the chatbot”. Because the students had done the tasks every day for 2 weeks, they “could already predict what the chatbot would respond” in the latter half of the practice, which made the interaction less fun as time went by, although they had found it interesting at the beginning. Five students also mentioned that the chatbot’s guidance and responses sometimes felt repetitive and wordy; they preferred more direct and concise interaction, especially once they got used to the practice. Finally, three students remarked that the learning buddy chatbot could be more intelligent by understanding their answers better and offering more personalized feedback.

Discussion

In this study, we developed and tested the innovative use of chatbots to support student goal setting (Study 1) and social presence (Study 2) in online activities. We chose goal setting because setting specific goals is essential for student self-regulation: goals define for the individual what constitutes an acceptable level of performance. We chose social presence because it can alleviate the sense of online isolation. In this section, we revisit the major findings of Study 1 and Study 2 in light of our research questions and discuss practical suggestions for using chatbots in teaching and learning.

Students’ behavioral engagement with the chatbots

Overall, the analyses of the student–chatbot conversation records revealed average session lengths of 4 min for the goal-setting chatbot and 7 min for the learning buddy chatbot. Although the ideal average session length differs with the context of the conversation, a longer session length generally indicates that a chatbot is better at creating an engaging conversation experience for the user (Phillips, 2018). At this moment, we are unable to state what the optimal average session length should be; however, from the instructor’s perspective, the observed session lengths are indicative of an engaging chatbot experience for the learners. Both chatbots also registered high goal completion rates (100% for the goal-setting chatbot and 92% for the learning buddy chatbot), which suggests that the two chatbots were successful in fulfilling the purposes for which they were created.

The average number of utterance turns for the learning buddy chatbot (M = 17.06) was higher than that for the goal-setting chatbot (M = 7.97). Students in Study 2 could complete the required daily EFL listening practices within about 9 turns, so the higher average indicates that many continued learning with the learning buddy chatbot to explore the optional activities after finishing the required practices. A likely reason is that students tend to engage in more interactions when the learning buddy chatbot guides them using various social presence indicators (e.g., affective expression, expressing appreciation and agreement). This finding suggests that social presence can be cultivated by a chatbot: by using the various social presence indicators, a chatbot can project its presence online to the other participants. According to social response theory, people are inclined to treat computers as social beings (Nass & Moon, 2000), particularly when chatbots exhibit human-like behaviors (Holzwarth et al., 2006), such as taking turns in conversation and showing social presence cues (e.g., using emojis to express emotions).

Students’ learning experience with the chatbots

Overall, we also found positive user experiences with both the goal-setting and learning buddy chatbots with regard to the chatbots’ perceived usefulness and perceived ease of use. Perceived usefulness and perceived ease of use have a direct impact on people’s intentions to use an information tool or system (He et al., 2018): if users feel that a chatbot enhances their learning and that using it is free of effort, they will be willing to use the tool.

One explanation for students’ high perceived usefulness of the chatbots is that we invited the teachers, as designers, to work out the chatbots’ contents based on the intended learning objectives. For example, the teacher in Study 1 designed the goal-setting questions with clear purposes: to let students express their expectations about the online course and to provide personal recommendations based on each student’s goals in an interactive way. To improve students’ perceived usefulness of chatbot activities, we highlight the need for pedagogical foci in educational chatbots (as noted by Wollny et al., 2021) and encourage teachers to participate in the chatbot design process.

Another reason why students perceived the chatbots as enhancing their learning is that they could learn with the chatbots and receive teacher-designed feedback without time or location limits. The “24/7” nature of chatbots provided immediate responses and guidance whenever students needed help, an instant availability that a human teacher with limited office hours could not offer. Moreover, based on the principles of multimedia use for learning (Mayer, 2017), individuals can learn better when the learning content is presented in the form of a conversation. When setting learning goals and accomplishing listening tasks, students interacted with the chatbots as if texting with friends, and this interaction can help engage students during online learning. Hence, we suggest that teachers consider using a chatbot as a virtual learning partner to interact with students when teachers are unavailable.

Students’ perceptions of goal-setting and social presence supported by chatbots

In Study 1, we examined how a chatbot can help students in their goal-setting process. First, seven students indicated that the activity helped them clarify their learning goals by pointing out a direction and making concrete the goals they had only roughly outlined in their minds. This finding shows that a chatbot can be effective in providing students with useful guidance, since previous research indicated that a lack of guidance is one of the major frustrations students encounter during self-regulated learning (SRL) (Wong et al., 2019). Second, students indicated that the goal-setting activity was effective in raising their awareness of setting goals and inspiring them to think about what they really want. This is important because by raising students’ awareness of using goal-setting strategies in their learning, students can be trained to become better self-regulated learners (Handoko et al., 2019).
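A scripted dialogue of this kind is straightforward to implement. The sketch below walks a student through one question per SMART dimension; the question wording is hypothetical, as the actual five questions used by the goal-setting chatbot are not reproduced here.

```python
# Hypothetical SMART goal-setting script (question wording is illustrative).
SMART_QUESTIONS = [
    ("specific",   "What exactly do you want to achieve in this course?"),
    ("measurable", "How will you know that you have achieved it?"),
    ("achievable", "What steps will you take to reach this goal?"),
    ("realistic",  "Is this goal realistic given your current workload? Why?"),
    ("timely",     "By when do you plan to accomplish it?"),
]

def run_goal_setting():
    """Ask the five SMART questions in turn and return the recorded goal."""
    goal = {}
    for dimension, question in SMART_QUESTIONS:
        goal[dimension] = input(f"Bot: {question}\nYou: ")
    print("Bot: Great! I have recorded your goal. Good luck!")
    return goal
```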

In Study 2, we examined how a chatbot can guide students through English listening exercises by providing immediate feedback and employing specific communication strategies to project social presence into its interactions with participants. Our findings suggest that a chatbot can be leveraged to induce perceptions of social presence in the online interactions between the chatbot and human participants. Specifically, when designing a chatbot’s messages, instructors can focus on projecting social presence by adopting Garrison’s interpersonal, open, and cohesive communication strategies. A chatbot, for example, can be programmed to express affective messages, such as using emojis when chatting with participants. The chatbot can also actively ask questions or offer comments to continue a dialogue rather than waiting for participants to initiate, and it can express agreement and show appreciation. Lastly, the chatbot should employ cohesive communication by addressing participants by name so that they feel recognized, and by using collective pronouns to foster a sense of interconnectedness between the chatbot and the participants.
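The sketch below illustrates how these three strategies might be combined in a single feedback message. The function name and message wording are our own illustrative assumptions, not the learning buddy chatbot’s actual templates.

```python
import random

def social_presence_reply(name: str, correct: bool) -> str:
    """Compose feedback that projects social presence: affective expression
    (emojis), open communication (appreciation, follow-up questions), and
    cohesive communication (addressing by name, collective pronouns)."""
    affective = random.choice(["😊", "👍", "🎉"])
    if correct:
        opener = f"Well done, {name}! {affective}"            # address by name
        follow_up = "Shall we try the next exercise together?"  # collective pronoun
    else:
        opener = f"Thanks for trying, {name}. {affective}"    # show appreciation
        follow_up = "Let's listen to that part once more together."
    return f"{opener} {follow_up}"

print(social_presence_reply("Amy", correct=True))
```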

Implications for future educational chatbot design

One of the aims of this study is to help inform the development of future chatbots. Based on the participants’ responses concerning what improvements could be made, several recommendations emerge. We synthesized their responses into two major themes: technological limitations and non-technological problems.

The most commonly reported technological limitation in the two studies was the chatbots’ limited artificial intelligence. In both chatbot activities, students mentioned that the chatbots could be more intelligent, for example, by understanding answers better and offering more personalized responses. This finding is consistent with previous research: a recent review of chatbot research revealed that limited intelligence was the most frequently reported challenge in using chatbots (Huang et al., 2022). Students in both studies reported the chatbots’ limited responses as a disadvantage. For example, students in Study 1 expected more recommendation tips than the three pre-defined options. The challenge was more obvious in Study 2, where participants interacted with the chatbot for longer periods. Specifically, they expected more varied responses from the chatbot, because its answers became easy to predict in the latter half of the practice, which made the interaction less interesting.

Although the two chatbots were programmed with many of the most commonly asked questions, it is impossible to anticipate every question a student may ask. As more student data are stored in the chatbot knowledge base, more intelligent chatbots can be built. Meanwhile, as a stopgap measure, we propose that educators design the chatbot to respond with a fallback message. A fallback message is a response in which the chatbot expresses uncertainty and suggests alternative solutions (Følstad & Taylor, 2020), such as “I am sorry, I do not understand. Please let me know if you wish to talk to a teaching assistant [link].” Although at first glance the use of fallback messages may put off users, results from a preliminary study showed no negative impact on the conversation outcome when chatbots express uncertainty and suggest likely alternatives (Følstad & Taylor, 2020).
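A minimal sketch of such a fallback handler is shown below, assuming a simple rule-based chatbot with a dictionary of scripted answers; the knowledge-base entries are hypothetical.

```python
def answer(user_message: str, knowledge_base: dict) -> str:
    """Return a scripted answer if the message matches a known question;
    otherwise fall back to an uncertainty message suggesting an alternative."""
    key = user_message.strip().lower()
    if key in knowledge_base:
        return knowledge_base[key]
    # Fallback: express uncertainty and offer an alternative route for help.
    return ("I am sorry, I do not understand. "
            "Please let me know if you wish to talk to a teaching assistant [link].")

kb = {"when is the deadline?": "The goal-setting task is due before the first class."}
print(answer("When is the deadline?", kb))   # scripted answer
print(answer("Can you grade my essay?", kb)) # fallback message
```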

In addition, the students in Study 2 considered the repetitive and wordy conversations another drawback, although they enjoyed the chatbot’s detailed guidance at the beginning of the study. For long-term chatbot use, it is therefore important to add variety to sustain interest in learning with the chatbot (Keller, 1987), either by simplifying or by varying the recurring sentences. For instance, basic introductions or guidance can fade out as students gradually get used to the activities, as sketched below.
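One simple way to implement such fading is to shorten the recurring guidance as the student’s session count grows. The thresholds and message wording below are illustrative assumptions, not the chatbot’s actual design.

```python
def guidance_message(session_count: int) -> str:
    """Fade out the detailed introduction as students get used to the task,
    keeping later prompts short to avoid repetitive, wordy turns."""
    if session_count <= 3:
        return ("Welcome back! Today's listening practice has three steps: "
                "(1) listen to the clip, (2) type what you hear, "
                "(3) check the feedback. Ready?")
    elif session_count <= 10:
        return "Welcome back! Same three steps as usual. Ready?"
    else:
        return "Ready for today's clip?"

for day in (1, 5, 12):
    print(f"Day {day}: {guidance_message(day)}")
```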

Limitations

There are several limitations of this study that warrant caution when interpreting the findings. Overall, the sample sizes in both Study 1 and Study 2 were relatively small. Furthermore, the majority of participants in both studies were female Asian students; other students do not fit this demographic profile, and their opinions concerning the use of chatbots are just as valid. For future research, we plan to implement the two chatbots with larger samples of participants in other courses. Using larger sample sizes and other courses can help us gain a more in-depth understanding of chatbot use. We also plan to measure the performance of the chatbots using other metrics, such as the chatbot fallback rate: the number of times a chatbot is unable to understand a user’s message and provide a relevant response (Phillips, 2018). The lower the fallback rate, the higher the user satisfaction; conversely, a high fallback rate means that more effort should be spent on training the chatbots. We also plan to implement a chatbot-supported recommender system to support other phases of student self-regulated learning in addition to goal setting. The chatbot will serve as a conversational agent that interacts with participants by asking questions and/or offering personalized suggestions concerning self-regulation-related skills to each student.

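For completeness, a minimal sketch of how the fallback rate could be computed from a reply log follows; the log and the fallback string are hypothetical examples.

```python
def fallback_rate(log: list[str], fallback_text: str) -> float:
    """Share of chatbot replies that were fallback messages; lower is better."""
    if not log:
        return 0.0
    return sum(reply == fallback_text for reply in log) / len(log)

FALLBACK = "I am sorry, I do not understand."
replies = [FALLBACK, "Nice work!", "Let's try the next one.", FALLBACK]
print(f"Fallback rate: {fallback_rate(replies, FALLBACK):.0%}")  # 50%
```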

Conclusion

The magnitude at which universities and schools around the world have embraced online learning in the past two years is truly unprecedented. Yet, despite the significant interest in online learning, the lack of student engagement in online activities remains a persistent problem. Research on student disengagement in online learning activities has revealed several contributing factors, notably the lack of student self-regulation and the sense of isolation. This study advances the online learning literature by illustrating the potential of chatbots to help fully online students set personal learning goals through five questions based on the SMART goal-setting framework, and to guide students through listening exercises by providing immediate feedback and employing specific communication strategies to project social presence into the interaction with participants. We evaluated students’ behavioral engagement, perceived usefulness and ease of use of the chatbots, and perceptions of learning with chatbots, and gathered suggestions for educational chatbot design. The findings indicated positive student learning experiences with both the goal-setting chatbot and the learning buddy chatbot, which suggests the effectiveness of employing theoretical frameworks in designing chatbots for teaching and learning. It is hoped that these findings and suggestions can help teachers and researchers design chatbot-supported learning activities and facilitate students’ online learning.

Biographies

Khe Foon Hew

is an Associate Professor of Information Technology in Education at the University of Hong Kong. His primary research interests focus on how technology can be used to support learning and engagement in both blended-learning and online-learning contexts.

Weijiao Huang

is a Ph.D. candidate in Information and Technology Studies at the University of Hong Kong. Her research interests include e-learning instructional design and innovative technology applications, such as chatbots in education.

Jiahui Du

is a Ph.D. candidate in Information and Technology Studies at the University of Hong Kong. Her current research interest is on using technologies to promote students’ self-regulated learning.

Chengyuan Jia

is an Assistant Professor at the College of Education, Zhejiang University. Her research interests lie in online and blended learning, and technology assisted language learning.

Funding

Funding was provided by the University of Hong Kong (Teaching Development Grant 2019).

Footnotes

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

References

  1. Adam M, Wessel M, Benlian A. AI-based chatbots in customer service and their effects on user compliance. Electronic Markets. 2021;31(2):427–445. doi: 10.1007/s12525-020-00414-7. [DOI] [Google Scholar]
  2. Azran, S. (2019). Google Dialogflow vs Microsoft LUIS vs IBM Watson Assistant – conversational AI comparison. https://medium.com/@samuel_22971/google-dialogflow-vs-microsoft-luis-vs-ibm-watson-assistant-conversational-ai-comparision-eb2d374f1413
  3. Bączek M, Zagańczyk-Bączek M, Szpringer M, Jaroszyński A, Wożakowska-Kapłon B. Students’ perception of online learning during the COVID-19 pandemic. Medicine. 2021 doi: 10.1097/MD.0000000000024821. [DOI] [PMC free article] [PubMed] [Google Scholar]
  4. Barrot JS, Llenares II, del Rosario LS. Students’ online learning challenges during the pandemic and how they cope with them: The case of the Philippines. Education and Information Technologies. 2021;26(6):7321–7338. doi: 10.1007/s10639-021-10589-x. [DOI] [PMC free article] [PubMed] [Google Scholar]
  5. Bjerke MB, Renger R. Being smart about writing SMART objectives. Evaluation and Program Planning. 2017;61:125–127. doi: 10.1016/j.evalprogplan.2016.12.009. [DOI] [PubMed] [Google Scholar]
  6. Cauldwell R. A syllabus for listening: Decoding. Speech in Action; 2018. [Google Scholar]
  7. Cavinato AG, Hunter RA, Ott LS, Robinson JK. Promoting student interaction, engagement, and success in an online environment. Analytical and Bioanalytical Chemistry. 2021;413:1513–1520. doi: 10.1007/s00216-021-03178-x. [DOI] [PubMed] [Google Scholar]
  8. Chametzky, B. (2021). Communication in online learning: Being meaningful and reducing isolation. In I. Management Association (Eds.), Research anthology on developing effective online learning courses (pp. 1184–1205). IGI Global. 10.4018/978-1-7998-8047-9.ch058
  9. Chipchase L, Davidson M, Blackstock F, Bye R, Clothier P, Klupp N, Nickson W, Turner D, Williams M. Conceptualising and measuring student disengagement in higher education: A synthesis of the literature. International Journal of Higher Education. 2017;6(2):31–42. doi: 10.5430/ijhe.v6n2p31. [DOI] [Google Scholar]
  10. Davis FD. Perceived usefulness, perceived ease of use, and user acceptance of information technology. MIS Quarterly. 1989;13(3):318–339. doi: 10.2307/249008. [DOI] [Google Scholar]
  11. Doran GT. There’s a SMART way to write management’s goals and objectives. Management Review. 1981;70(11):35–36. [Google Scholar]
  12. Følstad A, Brandtzæg PB. Chatbots and the New World of HCI. Interactions. 2017;24(4):38–42. doi: 10.1145/3085558. [DOI] [Google Scholar]
  13. Følstad A, Taylor C. Chatbot research and design. CONVERSATIONS 2019. Springer; 2020. Conversational repair in chatbots for customer service: The effect of expressing uncertainty and suggesting alternatives. [Google Scholar]
  14. García Botero G, Questier F, Zhu C. Self-directed language learning in a mobile-assisted, out-of-class context: Do students walk the talk? Computer Assisted Language Learning. 2019;32(1–2):71–97. doi: 10.1080/09588221.2018.1485707. [DOI] [Google Scholar]
  15. Garrison DR. E-learning in the 21st century: A framework for research and practice. Taylor & Francis; 2011. [Google Scholar]
  16. Garrison DR, Anderson T, Archer W. Critical inquiry in a text-based environment: Computer conferencing in higher education. The Internet and Higher Education. 1999;2(2):87–105. doi: 10.1016/s1096-7516(00)00016-6. [DOI] [Google Scholar]
  17. Gierdowski, D. C. (2019). ECAR study of undergraduate students and information technology. EDUCAUSE Center for Analysis and Research. https://library.educause.edu/resources/2019/10/2019-study-of-undergraduate-students-and-information-technology
  18. Handoko E, Gronseth SL, McNeil SG, Bonk CJ, Robin BR. Goal setting and MOOC completion: A study on the role of self-regulated learning in student performance in massive open online courses. The International Review of Research in Open and Distributed Learning. 2019 doi: 10.19173/irrodl.v20i4.4270. [DOI] [Google Scholar]
  19. Hartley K, Bendixen LD. Educational research in the internet age: Examining the role of individual characteristics. Educational Researcher. 2001;30(9):22–26. doi: 10.3102/0013189X030009022. [DOI] [Google Scholar]
  20. He Y, Chen Q, Kitkuakul S. Regulatory focus and technology acceptance: Perceived ease of use and usefulness as efficacy. Cogent Business & Management. 2018 doi: 10.1080/23311975.2018.1459006. [DOI] [Google Scholar]
  21. Holzwarth M, Janiszewski C, Neumann MM. The influence of avatars on online consumer shopping behavior. Journal of Marketing. 2006;70(4):19–36. doi: 10.1509/jmkg.70.4.019. [DOI] [Google Scholar]
  22. Huang W, Hew KF, Fryer LK. Chatbots for language learning—Are they really useful? A systematic review of chatbot-supported language learning. Journal of Computer Assisted Learning. 2022;38(1):237–257. doi: 10.1111/jcal.12610. [DOI] [Google Scholar]
  23. Insider Intelligence. (2021). Chatbot market in 2021: Stats, trends, and companies in the growing AI chatbot industry. Insider. Retrieved from https://www.businessinsider.com/chatbot-market-stats-trends
  24. Jia C, Hew KF. Supporting lower-level processes in EFL listening: The effect on learners’ listening proficiency of a dictation program supported by a mobile instant messaging app. Computer Assisted Language Learning. 2022;35(1–2):141–168. doi: 10.1080/09588221.2019.1671462. [DOI] [Google Scholar]
  25. Jia C, Hew KF, Bai S, Huang W. Adaptation of a conventional flipped course to an online flipped format during the Covid-19 pandemic: Student learning performance and engagement. Journal of Research on Technology in Education. 2022;54(2):281–301. doi: 10.1080/15391523.2020.1847220. [DOI] [Google Scholar]
  26. Keller JM. Development and use of the ARCS model of instructional design. Journal of Instructional Development. 1987;10(3):2–10. doi: 10.1007/BF02905780. [DOI] [Google Scholar]
  27. Kennedy C, Levy M. Sustainability and computer-assisted language learning: Factors for success in a context of change. Computer Assisted Language Learning. 2009;22(5):445–463. doi: 10.1080/09588220903345218. [DOI] [Google Scholar]
  28. Lachmann, P., & Kiefel, A. (2012). Recommending learning activities as strategy for enabling self-regulated learning. In 2012 IEEE 12th international conference on advanced learning technologies (pp. 704–705). IEEE. 10.1109/ICALT.2012.222
  29. Latham GP, Locke EA. Self-regulation through goal setting. Organizational Behavior and Human Decision Process. 1991;50(20):212–247. doi: 10.1016/0749-5978(91)90021-K. [DOI] [Google Scholar]
  30. LiveChat. (2022). Chatbot best practices. https://www.chatbot.com/chatbot-best-practices/
  31. Mayer RE. Using multimedia for e-learning. Journal of Computer Assisted Learning. 2017;33(5):403–423. doi: 10.1111/jcal.12197. [DOI] [Google Scholar]
  32. Mead, J. (2019). What session length can tell you about your chatbot performance. Inf. Commun. https://inform-comms.com/session-length-chatbot-performance
  33. Moos DC, Bonde C. Flipping the classroom: Embedding self-regulated learning prompts in videos. Technology, Knowledge and Learning. 2016;21(2):225–242. doi: 10.1007/s10758-015-9269-1. [DOI] [Google Scholar]
  34. Nass C, Moon Y. Machines and mindlessness: Social responses to computers. Journal of Social Issues. 2000;56(1):81–103. doi: 10.1111/0022-4537.00153. [DOI] [Google Scholar]
  35. Nayyar, D. A. (2019). Chatbots and the open source tools you can use to develop them. Open Source For You. Retrieved from https://opensourceforu.com/2019/01/chatbots-andthe-open-source-tools-you-can-use-to-develop-them/
  36. Nushi M, Orouji F. Investigating EFL teachers’ views on listening difficulties among their learners: The case of Iranian context. SAGE Open. 2020 doi: 10.1177/2158244020917393. [DOI] [Google Scholar]
  37. Nussbaumer, A., Albert, D., & Kirschenmann, U. (2011). Technology-mediated support for self-regulated learning in open responsive learning environments. In 2011 IEEE global engineering education conference (EDUCON) (pp. 421–427). IEEE. 10.1109/EDUCON.2011.5773171
  38. Palinkas LA, Horwitz SM, Green CA, Wisdom JP, Duan N, Hoagwood K. Purposeful sampling for qualitative data collection and analysis in mixed method implementation research. Administration and Policy in Mental Health and Mental Health Services Research. 2015;42(5):533–544. doi: 10.1007/s10488-013-0528-y. [DOI] [PMC free article] [PubMed] [Google Scholar]
  39. Pedrotti M., & Nistor N. (2019) How students fail to self-regulate their online learning experience. In M. Scheffel, J. Broisin, V. Pammer-Schindler, A. Ioannou, J. Schneider (Eds.), Transforming learning with meaningful technologies (Vol. 11722, pp. 377–385). EC-TEL 2019. Lecture Notes in Computer Science. Springer. 10.1007/978-3-030-29736-7_28
  40. Phillips, C. (2018). Chatbot analytics 101: The essential metrics you need to track. Chatbots Magazine. https://chatbotsmagazine.com/chatbot-analytics-101-e73ba7013f00
  41. Pomerantz. J., & Brooks, D. C. (2017). ECAR study of faculty and information technology. Research report. ECAR. https://library.educause.edu/-/media/files/library/2017/10/facultyitstudy2017.pdf
  42. Reja, U., Manfreda, K. J., Hlebec, V., & Vehovar, V. (2003). Open-ended vs. close-ended questions in web questionnaires. Developments in Applied Statistics. http://mrvar.fdv.uni-lj.si/pub/mz/mz19/reja.pdf
  43. Richardson JC, Maeda Y, Lv J, Caskurlu S. Social presence in relation to students’ satisfaction and learning in the online environment: A meta-analysis. Computers in Human Behavior. 2017;71:402–417. doi: 10.1016/j.chb.2017.02.001. [DOI] [Google Scholar]
  44. Schwarzenberg P, Navon J. Supporting goal setting in flipped classroom. Interactive Learning Environments. 2020;28(6):671–684. doi: 10.1080/10494820.2019.1707691. [DOI] [Google Scholar]
  45. Shih KP, Chen HC, Chang CY, Kao TC. The development and implementation of scaffolding-based self-regulated learning system for e/m-learning. Educational Technology & Society. 2010;13(1):80–93. [Google Scholar]
  46. Singh J, Joesph MH, Jabbar KBA. Rule-based chabot for student enquiries. Journal of Physics: Conference Series. 2019;1228(1):012060. [Google Scholar]
  47. Smutny P, Schreiberova P. Chatbots for learning: A review of educational chatbots for the Facebook messenger. Computers & Education. 2020;151:103862. doi: 10.1016/j.compedu.2020.103862. [DOI] [Google Scholar]
  48. Starr-Glass D. Encouraging engagement: Video-conference augmentation of online distance learning environments. On the Horizon. 2020;28(3):125–132. doi: 10.1108/oth-06-2020-0020. [DOI] [Google Scholar]
  49. Thorat, S. A., & Jadhav, V. (2020). A review on implementation issues of rule-based chatbot systems. In Proceedings of the international conference on innovative computing & communications (ICICC).
  50. Tu C-H, McIsaac M. The relationship of social presence and interaction in online classes. American Journal of Distance Education. 2002;16(3):131–150. doi: 10.1207/S15389286AJDE1603_2. [DOI] [Google Scholar]
  51. van Alten DC, Phielix C, Janssen J, Kester L. Self-regulated learning support in flipped learning videos enhances learning outcomes. Computers & Education. 2020;158:104000. doi: 10.1016/j.compedu.2020.104000. [DOI] [Google Scholar]
  52. Weizenbaum J. ELIZA—A computer program for the study of natural language communication between man and machine. Communications of the ACM. 1966;9(1):36–45. doi: 10.1145/365153.365168. [DOI] [Google Scholar]
  53. Whiteside, A., Dikkers, A., & Lewis, S. (2014). The power of social presence for learning. EDUCAUSE Review Online. Retrieved from https://er.educause.edu/articles/2014/5/the-power-of-social-presence-for-learning
  54. Wollny S, Schneider J, Di Mitri D, Weidlich J, Rittberger M, Drachsler H. Are We There Yet?—A systematic literature review on chatbots in education. Frontiers in Artificial Intelligence. 2021;4:654924. doi: 10.3389/frai.2021.654924. [DOI] [PMC free article] [PubMed] [Google Scholar]
  55. Wong J, Baars M, Davis D, Van Der Zee T, Houben GJ, Paas F. Supporting self-regulated learning in online learning environments and MOOCs: A systematic review. International Journal of Human-Computer Interaction. 2019;35(4–5):356–373. doi: 10.1080/10447318.2018.1543084. [DOI] [Google Scholar]
  56. Wong J, Baars M, He M, de Koning BB, Paas F. Facilitating goal setting and planning to enhance online self-regulation of learning. Computers in Human Behavior. 2021;124:106913. doi: 10.1016/j.chb.2021.106913. [DOI] [Google Scholar]
  57. Wut T-M, Xu J. Person-to-person interactions in online classroom settings under the impact of COVID-19: A social presence theory perspective. Asia Pacific Education Review. 2021;22:371–383. doi: 10.1007/s12564-021-09673-1. [DOI] [Google Scholar]
  58. Xavier, M., & Meneses, J. (2020). Dropout in online higher education: A scoping review from 2014 to 2018. Barcelona: eLearn Center, Universitat Oberta de Catalunya. 10.7238/uoc.dropout.factors.2020
  59. Yao, M. (2016). 5 metrics every chatbot developer needs to track. VentureBeat, The Machine. https://venturebeat.com/2016/10/04/5-metrics-every-chatbot-developer-needs-to-track/
  60. Yılmaz R, Keser H. The impact of interactive environment and metacognitive support on academic achievement and transactional distance in online learning. Journal of Educational Computing Research. 2017;55(1):95–122. doi: 10.1177/0735633116656453. [DOI] [Google Scholar]
  61. Zimmerman BJ. From cognitive modeling to self-regulation: A social cognitive career path. Educational Psychologist. 2013;48(3):135–147. doi: 10.1080/00461520.2013.794676. [DOI] [Google Scholar]
  62. Zuckerman, A. (2020). IBM Watson vs Dialogflow comparison 2021. https://comparecamp.com/ibm-watson-vs-dialogflow-comparison/
