Abstract
This quantitative study has three objectives: (1) to develop a predictive AI model to categorize the online learning behavior of Thai students who study through a Thai Massive Open Online Course (MOOC); (2) to categorize students' online behavior in a Thai MOOC; and (3) to evaluate the prediction accuracy of the developed predictive AI models. Data were collected from 8000 learners enrolled in the KMUTT015 course on the Thai MOOC platform. The k-means clustering algorithm classified learners enrolled on the Thai MOOC platform based on their online learning behaviors, and the decision tree algorithm was used to assess the accuracy of the AI model's predictive capability. The study finds that the predictive AI model successfully categorizes learners based on their learning behaviors and predicts their future behaviors in the online learning environment. The k-means clustering algorithm yields three groups of learners on the Thai MOOC platform: High Active Participants (HAP), Medium Active Participants (MAP), and Lurking Participants (LP). The findings also show high predictive accuracy rates for each behavioral group (HAP cluster = 0.98475, LP cluster = 0.967625, and MAP cluster = 0.955375), demonstrating the proficiency of the AI predictive model in forecasting learner behavior. The results of this study will benefit the design of online courses that respond to the needs of students with different online learning characteristics and help them achieve a high level of academic performance.
Keywords: AI predictive model, Online learning behavior, Thai MOOC, Online learning
Highlights
• A predictive AI model was developed to increase the efficiency of online learning in Thai MOOCs.
• The AI model predicted online learning behavior with high accuracy, grouping learners into HAP, MAP, and LP clusters using k-means clustering.
• This predictive AI model will help instructors tailor their online curricula to match diverse learners' characteristics and enhance their achievement.
1. Introduction
The emergence of online learning platforms has ushered in a transformative era in education, expanding opportunities for accessible and flexible learning experiences [1,2]. While these platforms have significantly broadened educational access, they have also introduced unique challenges, particularly in ensuring learner engagement, success, and retention. The fundamental concern plaguing online education is the substantial gap between the number of students enrolled in courses and the successful course completion rate, which generally hovers between 6 % and 13 % [3,4]. This has raised critical questions about the efficacy of online teaching methodologies and the need for personalized learning experiences tailored to diverse learner behaviors [[5], [6], [7]], influenced by factors such as individual aptitude, prior knowledge, and the learning environment [5,[8], [9], [10]]. Learners differ widely: some bring solid foundational knowledge while others grapple with course content, and they vary in time management skills and in their ability to navigate the online learning ecosystem [3,6,11].
Massive Open Online Courses (MOOCs) offer a model of online education characterized by unlimited enrollment and open web-based access. The pedagogical approach of these courses often centers on pre-recorded video lectures, readings, online assessments, and discussion forums [12]. Prominent MOOC providers include Coursera, Udacity, and EdX. The open-source Open EdX software has reached international prominence as a platform for MOOC development, powering systems like JMOOC, KMOOC, and XuetangX [[13], [14], [15]]. In Thailand, the Thai MOOC has been used as a platform for organizing online teaching in a national open system for the public since 2016 under the supervision of the Office of the Permanent Secretary, Ministry of Higher Education, Science, Research and Innovation, to expand educational opportunities for Thai people and raise the quality of online courses to international standards [1]. However, it was found that the average student completion rate compared to registration numbers was very low (12.6 %) since this platform allows students to enroll and plan their studies independently without supervision. Some students did not effectively manage their studies, while others registered for courses that did not match their needs. Furthermore, some students lacked the necessary background knowledge and could not understand the lesson, resulting in their dropping out [3].
Innovative solutions, such as predictive AI models, are needed to increase the efficiency of online learning through MOOCs [3]. An AI predictive model uses machine learning and deep learning to analyze historical and current data and make predictions about future outcomes [16]. These models leverage machine-learning techniques to analyze the vast learner interaction and behavior datasets in online learning environments. AI-driven solutions provide personalized learning experiences by identifying patterns, making data-driven predictions, and optimizing learner outcomes [4]. Today, many educators and academicians use AI technology to improve learning experiences by tailoring courses for students with different online learning behaviors. For instance, Ref. [17] employed AI technology to analyze teaching strategies and increase the efficiency of online courses; Ref. [18] used AI to predict the dropout behaviors of students who left school and to find timely solutions; and Ref. [19] used AI to classify learners' behaviors from MOOC data to improve teaching styles and design courses that meet student needs. This evidence indicates the AI model's capability to analyze students' learning behavior to improve online learning experiences.
Although the predictive AI model's ability to categorize and predict students' learning behavior has been acknowledged, AI has never been used to improve the efficiency of Thai MOOCs. Hence, this research aims to develop a predictive AI model and evaluate its accuracy. The insights produced by the model can support instructors in designing effective online courses and refining their teaching approaches, ultimately leading to improved rates of student success. The objectives are as follows: (1) develop a predictive AI model to categorize the online learning behavior of Thai students studying through a Thai MOOC, (2) categorize the online behavior of students enrolled in a Thai MOOC, and (3) evaluate the accuracy of the developed predictive AI models in forecasting learners' online behavior.
2. Related work
In the landscape of online learning and predictive AI models, we developed our research study based on prior research endeavors to improve understanding and enhance learner behavior prediction in online learning environments. This section reviews the contributions from the existing literature that inform and contextualize our study.
2.1. Grounded theories
Self-determined learning forms a critical aspect of online education, particularly in MOOCs. This refers to learners' ability to take ownership of their learning process, set their own goals, and regulate their learning activities [[20], [21], [22]]. Self-determined learning is essential in MOOCs as it empowers learners to take control of their own education, set meaningful goals, and engage in activities that align with their interests and needs [[22], [23], [24]]. Adaptive learning, a methodology that focuses on customizing the learning experience to meet the unique needs of each individual learner, is also important [25]. In MOOCs, adaptive learning technologies use data and algorithms to assess the strengths, weaknesses, and learning style of each learner, allowing educational content to be tailored and delivered to meet those specific requirements [[26], [27], [28]]. This personalized approach can enhance self-determined learning by providing content and activities that align with individual goals and interests. Additionally, adaptive learning enables learners to set achievable goals by providing an appropriate level of challenge and support.
2.2. Predictive AI models for online learning
As the popularity of online learning continues to rise, so does the need for innovative solutions to enhance learner engagement, success, and retention. Predictive AI models have emerged as a promising avenue to address these challenges [3]. These models leverage machine-learning techniques to analyze the large volumes of learner interaction and behavior data in online learning environments. AI-driven solutions provide personalized learning experiences by identifying patterns, making data-driven predictions, and optimizing learner outcomes [4].
Predictive models in online education are not new. The methodology of this study aligns with approaches used in other studies. For example, similar studies have used decision trees to develop predictive AI models [[29], [30], [31]], whereas others have utilized k-means clustering for student classification and for informing the training of predictive models [32,33]. Researchers have explored various methodologies to predict learner outcomes, including dropout rates, course completion, and academic performance [[8], [9], [10], [11],29,31,[34], [35], [36], [37]]. These studies have often relied on learning analytics methods to extract valuable insights from the data in online learning environments [[8], [9], [10], [11],29,31,[34], [35], [36], [37]]. Applying machine-learning algorithms, such as k-means clustering and decision trees, has proven effective in identifying behavioral patterns and making predictions [2,6,7,9,11,36].
2.3. Learner behavior classification
Based on online learning behaviors and participation rates in course activities, Ref. [38] classified learners into three types: lurking, moderately active, and memorably active. Such groupings highlight commonalities and differences among learners and mirror our effort in Phase 1 to categorize learners into High Active Participants (HAP), Medium Active Participants (MAP), and Lurking Participants (LP) using k-means clustering. This classification serves as the foundation for our predictive AI models.
One of the primary goals of predictive AI models in online learning environments is to enable personalized and adaptive learning experiences [5,10,38]. These models categorize learners into groups based on their behavior: the HAP, MAP, and LP groups [5,39]. Once categorized, learners receive tailored content and interventions that align with their needs and motivations [8,29,31,[34], [35], [36]], improving engagement, satisfaction, and learning outcomes.
While predictive AI models offer significant promise, they also raise ethical considerations, particularly regarding learner privacy and data security [1,8]. It is essential to ensure that AI-driven insights are used responsibly and transparently. Learner consent mechanisms and clear guidelines for data usage are critical components of ethical AI applications in education.
2.4. MOOC platforms
The Thai MOOC platform, derived from the Open Edx software, aligns with the broader MOOC trend. Research guides, such as the Edx Research Guide [40], have provided valuable insights into tracking student behavior in MOOCs. Our study borrows from these insights and adapts them to the specific context of the Thai MOOC platform [40,41].
Online courses hosted on the Thai MOOC platform predominantly use video to deliver content, constituting over 65 % of the total content [13]. In addition, they incorporate interactive elements such as quizzes (for review during lessons) and exams (for evaluation at the end of a lesson) [1,2,13]. Other content types, such as text, images, and discussion forums, may also be present. Consequently, the logs of online learning behavior data collected from learners are categorized into four primary types [1,13].
Common Event Type: This category entails fundamental behavioral information about learners’ interactions with online courses. Key data points include attendance records, academic performance checks, time spent studying, and success rates based on course registrations and completion. These metrics provide a foundational understanding of learner engagement and progress.
Navigational Event Type: Navigational events capture learners’ interactions with navigation buttons in online courses and web pages. These interactions include clicking forward or backward in a course module or exiting the online course learning screen. Analyzing these events offers insights into how learners navigate course content and their engagement patterns.
Video Interaction Events Type: Given the prominent use of video content, this category focuses on behavioral data related to learners’ interactions with video components. These interactions include video playback, skipping video content, adjusting playback speed, and completing video viewing. Understanding how learners engage with videos is crucial to providing insights into their multimedia learning preferences and behaviors.
Problem Interaction Events Type: This category entails learners' interactions with quizzes and exams throughout the learning period. Data collected include responses to quiz questions, exam completion, and any interactions related to assessments. Analyzing this data type sheds light on learners’ performance, comprehension, and assessment-related behaviors.
Categorizing online learning behavior data into these four types, as shown in Fig. 1, provides insights into how learners engage with online courses on the Thai MOOC platform. This multidimensional approach enables researchers to explore various facets of learner behavior, from foundational metrics to multimedia engagement, navigation patterns, and assessment performance. This rich dataset supports developing, evaluating, and enhancing predictive AI models for online education in the dynamic landscape of the Thai MOOC platform [40]. Fig. 1 thus summarizes the event categories in the Events in the Tracking Logs (ETL) of the Thai MOOC platform: common events, navigational events, video interaction events, and problem interaction events.
Fig. 1.
Events in the Tracking Logs (ETL) of the Thai MOOC platform.
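As an illustration of these four categories, the following minimal Python sketch (not the authors' code) maps raw tracking-log event types to the ETL categories described above; the event names are illustrative examples of Open edX-style events, and the exact variables used in the study may differ.

EVENT_CATEGORIES = {
    "navigational": {"seq_next", "seq_prev", "seq_goto", "page_close"},
    "video": {"play_video", "pause_video", "seek_video", "speed_change_video", "stop_video"},
    "problem": {"problem_check", "problem_show", "problem_save"},
}

def categorize_event(event_type: str) -> str:
    """Return the ETL category for a single tracking-log event type."""
    for category, names in EVENT_CATEGORIES.items():
        if event_type in names:
            return category
    # Enrollment, grade checks, page views, etc. fall back to the common category.
    return "common"

print(categorize_event("play_video"))     # video
print(categorize_event("problem_check"))  # problem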
3. Research methodology
This study employed a quantitative research method to obtain outcomes based on the research objectives. The research procedures are detailed below.
3.1. Data sources
This study's primary data source was the online learning behavior of 8000 learners enrolled in the KMUTT015 course on the Thai MOOC platform. The data analysis focused on the ETL data (Fig. 1), with a log file encompassing 4 categories and 14 variables, and the clustering process used the k-means algorithm. This course was selected because of its specific characteristics: a large dataset of learner interactions and a high achievement rate among participants.
Data collection was conducted over 11 months, from February 1, 2021, to December 30, 2021. This timeframe allowed for a comprehensive exploration of seasonal variations and long-term trends in online learning behavior. The primary data sources comprised the ETL maintained by the Thai MOOC platform. These logs adhered to the data variables and tracking mechanisms outlined in the Open Edx system manual [40]. This adherence to the documented data variables ensured the accuracy and consistency of the dataset.
3.2. Instruments
Data preprocessing played a pivotal role in the research process. ELK (Elasticsearch, Logstash, Kibana), a versatile and robust data analysis system, was installed on a dedicated server hosted at elk.thaimooc.org to efficiently organize and structure the raw data. ELK served as the core instrument for data preprocessing, facilitating the conversion of raw data into a format suitable for in-depth analysis.
This study used ELK to collect and filter ETL data from the Thai MOOC Platform server, filtering students’ behavioral data for targeted subjects and formatting the data into data frames for simplified k-means learner clustering.
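The ELK pipeline itself is not reproduced here, but the preprocessing step can be sketched as follows, assuming newline-delimited JSON tracking logs and illustrative field and course identifiers; this is a simplified stand-in for the actual configuration, not the authors' code.

import json
import pandas as pd

def build_behavior_frame(log_path: str, course_id: str) -> pd.DataFrame:
    """Aggregate per-learner event counts from a tracking-log export."""
    rows = []
    with open(log_path, encoding="utf-8") as fh:
        for line in fh:
            event = json.loads(line)
            context = event.get("context", {})
            if context.get("course_id") != course_id:
                continue  # keep only the target course (e.g., KMUTT015)
            rows.append({"user_id": context.get("user_id"),
                         "event_type": event.get("event_type")})
    df = pd.DataFrame(rows)
    # One row per learner, one column per event type: the feature matrix for clustering.
    return pd.crosstab(df["user_id"], df["event_type"])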
3.3. Processes
Following data preprocessing, learners were classified based on their online learning behavior. The k-means clustering algorithm, a robust unsupervised machine-learning technique, classified the data points into distinct clusters. This algorithm is broadly accepted for its effectiveness in classifying students based on their similarities [[42], [43], [44]]. We then utilized the classified data to develop a predictive AI model using a decision tree algorithm. Both k-means clustering and decision trees are commonly used in educational research because of their simplicity, accuracy, and interpretable results; therefore, this study applied these algorithms to the preprocessed data. The process involved three steps.
3.3.1. Develop a predictive AI model to categorize online learning behavior
This step created an AI model capable of predicting online learning behavior and classifying students into groups based on this behavior. The model derives its predictive ability from being trained on data from the 8000 categorized students, utilizing predictive modeling algorithms.
3.3.2. Classify learners based on online learning behaviors
This step created an AI model that autonomously categorizes learners into groups based on their online learning behaviors. This classification was performed without direct instructor involvement to enhance learner autonomy.
3.3.3. Assess the accuracy of the predictive AI model
At this stage, the decision tree algorithm was used to assess the AI predictive model's accuracy because it presents the findings graphically and performs precise calculations, making it suitable for predictive model development [44,45].
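A minimal sketch of these three steps, assuming a learners-by-variables feature matrix such as the one produced in the preprocessing step above, is shown below; hyperparameters and variable names are assumptions rather than the authors' exact settings.

from sklearn.cluster import KMeans
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier
from sklearn.metrics import accuracy_score

def cluster_train_evaluate(features):
    """features: learners x behavioral-variable matrix (e.g., 8000 x 14)."""
    # Steps 1-2: label each learner with a behavioral cluster (HAP, MAP, LP).
    labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(features)
    # Step 3: train a decision tree on the labels and assess held-out accuracy.
    X_train, X_test, y_train, y_test = train_test_split(
        features, labels, test_size=0.2, random_state=0, stratify=labels)
    tree = DecisionTreeClassifier(random_state=0).fit(X_train, y_train)
    return tree, accuracy_score(y_test, tree.predict(X_test))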
4. Research findings
This section describes the research findings.
4.1. Development of a predictive AI model
With reference to research objective 1, this study seeks to develop a predictive AI model to categorize the online learning behavior of Thai students studying with a Thai MOOC. The development phase of the predictive AI model yielded promising outcomes.
4.1.1. Behavioral classification capability
K-means clustering successfully grouped learners into three categories based on their online learning behavior: the HAP, MAP, and LP groups. Of the 8000 participants, 4191 students were grouped into Cluster 1, 2632 into Cluster 2, and 1177 into Cluster 3, as presented in Fig. 2. This classification was used to train the AI predictive model to classify learners based on their online learning behaviors.
Fig. 2.
Learners' online learning behavioral classification by AI predictive model using the k-means clustering technique.
Fig. 2 provides the results of k-means clustering, where students are classified into three groups on the basis of their online learning behavior. The figure also provides the number of students appearing in each cluster.
4.1.2. Predictive capabilities
The developed model accurately predicts learners' group membership from their learning behavior, and the results from the development of the predictive AI show a high prediction accuracy rate.
4.1.3. Application flexibility
This AI predictive model works automatically, adjusting the size of the training data to increase prediction accuracy. This is useful for both instructors and learners.
4.2. Categorizing the online behavior of students enrolled in a Thai MOOC
With reference to research objective 2, this study seeks to categorize the online behavior of students enrolled in a Thai MOOC. The details are as follows:
Before using the k-means algorithm to classify learners' online behaviors, the elbow technique was used to determine the k-value (the number of categorized groups). Although the plot covered up to eight clusters, the appropriate number of clusters corresponds to the point where the curve bends most sharply (the "elbow"), which occurred at three clusters. Therefore, a k-value of 3 is appropriate for dividing the students in this data frame, as illustrated in Fig. 3.
Fig. 3.
K-value (number of categorized groups).
Fig. 3 presents the results of the elbow technique used to determine the appropriate number of clusters for k-means classification in online learning behaviors.
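A minimal sketch of the elbow technique, assuming the same behavioral feature matrix as above, is given below; it plots the within-cluster sum of squares (inertia) for k = 1 to 8 so the bend of the curve can be read off.

import matplotlib.pyplot as plt
from sklearn.cluster import KMeans

def plot_elbow(features, max_k=8):
    """Plot inertia for k = 1..max_k; the sharpest bend suggests the k-value."""
    ks = range(1, max_k + 1)
    inertias = [KMeans(n_clusters=k, n_init=10, random_state=0).fit(features).inertia_
                for k in ks]
    plt.plot(ks, inertias, marker="o")
    plt.xlabel("Number of clusters (k)")
    plt.ylabel("Within-cluster sum of squares")
    plt.show()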
Learners were categorized according to their online learning characteristics by applying the k-means clustering algorithm to the ETL data of learners interacting with the Thai MOOC platform, with the number of clusters set to three based on the results of the elbow technique. The outcome is illustrated in the radar chart in Fig. 4, which shows the behavior of students interacting with the system in each cluster, as follows.
Cluster 1 (Medium Active Participants)
Fig. 4.
Radar chart of learners' behavior classified by the k-means clustering algorithm.
Learners in this group interacted with the Thai MOOC platform at a moderate level and passed the course evaluation criteria. The interaction of these students was moderate in almost all aspects, except accessing courses and checking academic results, which were the lowest of all three groups of students.
Cluster 2 (Lurking Participants)
The students in this group had minimal interaction with the Thai MOOC platform. The low number of clicks on sequential content, as shown in the radar chart, indicates that they rarely studied the content in the assigned order, or did not study it at all. Moreover, they never accessed a video content page. Despite their low interaction with the MOOC platform, the learners in this group had the highest academic success rate and the highest examination scores of the three groups.
Cluster 3 (Highly Active Participants)
Learners in this group interacted with the Thai MOOC platform at a high level in almost every aspect compared with learners in the other groups, especially in navigational and video interaction events. This shows that these students tried to study in the order the teachers had designed.
Fig. 4 presents a radar chart visualizing learners' behavior, classified into three clusters using k-means clustering. The chart offers a multidimensional view of how learners in each cluster interact with the online learning environment.
The classification of online learning behaviors exhibited by Thai MOOC students offers valuable insights for course developers and platform owners. By understanding distinct online learning behavioral clusters, course developers can tailor their courses to more effectively cater to students’ diverse needs. In addition, platform owners can utilize this information to establish effective guidelines for online course development.
4.3. Predictive AI Model's accuracy in predicting learners' future behaviors
With regard to research objective 3, this study seeks to evaluate the accuracy of the predictive AI models developed to forecast learners’ online behavior. The details are as follows:
Accuracy evaluation using the decision tree algorithm provides essential insights into the effectiveness of predictive AI models. Reported accuracy rates for decision tree algorithms typically range between 0.89 and 0.98 [29,46,47]. The model's overall accuracy rate was 0.953875, reflecting a high level of accuracy, as presented in Fig. 5.
Fig. 5.
Overall accuracy rate of the AI predictive model.
Fig. 5 presents the overall accuracy rate achieved by the AI predictive model using a decision tree algorithm.
The decision tree assessment revealed an accuracy rate of 0.98475 for the HAP cluster, followed by the LP cluster at 0.967625 and the MAP cluster at 0.955375. All values indicate high-accuracy prediction, as presented in Fig. 6.
Fig. 6.
Accuracy rate of the AI predictive model classified by student online behaviors.
Fig. 6 presents the accuracy rates achieved by the AI predictive model, using a decision tree algorithm, for three groups of learners, classified based on their online learning behavior.
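The paper does not spell out how the per-cluster rates were computed; one plausible reading, sketched below under that assumption, is a one-vs-rest accuracy for each cluster derived from the decision tree's confusion matrix.

from sklearn.metrics import confusion_matrix

def per_cluster_accuracy(y_true, y_pred, n_clusters=3):
    """One-vs-rest accuracy per cluster (an assumed reading of Fig. 6)."""
    cm = confusion_matrix(y_true, y_pred, labels=list(range(n_clusters)))
    total = cm.sum()
    rates = {}
    for c in range(n_clusters):
        tp = cm[c, c]
        fn = cm[c, :].sum() - tp
        fp = cm[:, c].sum() - tp
        tn = total - tp - fn - fp
        rates[c] = (tp + tn) / total  # binary accuracy for cluster c vs. the rest
    return rates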
These accuracy rates indicate that the AI predictive model is proficient in forecasting learners’ future behavior, paving the way for making personalized learning recommendations.
The predictive AI model's accuracy in predicting learners' future behaviors can enable teachers to anticipate how groups of students will interact with the online learning environment. This will help teachers adapt their teaching styles and update the content to better suit the needs of students with diverse characteristics.
5. Discussion
This research provides comprehensive insights into the AI-driven system and its ability to categorize learners’ online behavior and predict their future academic success. A primary contribution of this study is the successful development of predictive AI models for classifying learners based on their online behavior without the need for direct instructor involvement.
By implementing this AI predictive model, we found that learners who participated in Thai MOOCs can be classified into three groups: HAPs, MAPs, and LPs. This finding is consistent with the studies of [32,33,48], which analyzed and classified students in a Thai MOOC into three groups according to their online learning behavior: active learners, passive learners, and bystanders. However, our findings diverge from studies such as [49,50], which identified larger numbers of learner clusters. This discrepancy may reflect variations in data sources and clustering methodologies. Ref. [38] analyzed the online behavior of learners who participated in a Thai MOOC using data obtained from interviews and documents and divided learners into three groups: students in the lurking group participate in less than 50 % of the curriculum activities, those in the moderately active group participate in between 50 % and 65 % of course activities, and those in the memorably active group participate in more than 65 % of activities.
Using the decision tree algorithm to predict future learners' academic achievement, we achieved high accuracy rates for the HAP, MAP, and LP learner groups, underscoring the robustness of these systems. This finding is consistent with the study of [51], who used the decision tree algorithm to predict MOOC learner dropout with 96 % accuracy, allowing educators to make predictions early in the course management process. Ref. [52] analyzed the behavior and motivation of online learners using the decision tree algorithm and classified learner motivations in the MOOC online teaching system into three types (intrinsic, extrinsic, and motivation) with an accuracy of 75 %. Therefore, the AI predictive model can enhance learner success and retention in online learning environments. Our findings also indicate that AI-driven solutions play a pivotal role: by providing timely interventions and personalized support based on predictive insights, institutions can proactively address learners' needs and challenges, ultimately leading to improved learning outcomes.
This study's classification of the online behaviors of Thai MOOC students offers valuable insights for course developers and platform owners. By tailoring course designs and establishing development guidelines according to these behavioral clusters, they can better address student needs. The accurate predictive AI model further empowers teachers: by anticipating student interactions, teachers can adapt their teaching styles and content to better suit diverse learners. An important limitation of this study is that the data were collected from the KMUTT015 course only. To enhance the predictive accuracy of the AI model and potentially discover more nuanced behavioral clusters, future research should employ larger datasets encompassing a wider range of MOOC courses. Moreover, the information available in the ETL on the Thai MOOC platform is not comprehensive, owing to the limitations in course design mentioned previously. Access to data from other platforms could reveal a clearer picture of the diversity of behavioral clusters.
6. Recommendation
This study developed a successful predictive AI model for classifying learners based on their online learning behavior. The findings shed light on the potential of predictive AI models to revolutionize online education on the Thai MOOC platform by categorizing learners based on their learning characteristics and predicting their future academic achievement, automating this process without direct instructor involvement. Our predictive AI model empowers learners to navigate online courses at their own pace and style. Future research can apply this procedure to categorize and predict students' behavior. This study also offers valuable insights for the effective design of online courses on the Thai MOOC platform and other LMSs. Educational institutions, policymakers, and instructors can leverage the distinct needs of different learner groups to develop courses and guidelines that enhance the quality of online learning experiences.
Future research should explore alternative prediction algorithms or hybrid predictive models to optimize their prediction potential and adapt them to changing data trends. This approach may increase prediction efficiency.
7. Conclusions
This study explored the potential of predictive AI models in online learning environments, focusing on the Thai MOOC platform. Our research journey encompassed developing an AI predictive model, classifying online learners participating in Thai MOOCs, and evaluating the system's prediction accuracy.
In addressing our first and second objectives, our research demonstrated the successful development of predictive AI models capable of classifying learners into HAP, MAP, and LP groups. This achievement was pivotal in enhancing learner autonomy in online courses, aligning with contemporary educational paradigms emphasizing personalized learning experiences.
In addressing the third objective, we evaluated the accuracy of these AI models, which proved remarkably accurate across diverse learner groups. These findings substantiate the efficacy of predictive AI models in forecasting learner behavior and performance. Our findings underscore the potential of AI-driven solutions to enhance the quality of online education by offering tailored support to learners at varying levels of engagement.
Ethical approval statement
Ethical approval was obtained from the ethics committee at King Mongkut's University of Technology Thonburi (Ref No: KMUTT-IRB-COE2019-200).
Data availability statement
The dataset supporting the findings of this study is available from the corresponding author upon reasonable and justified request, and a compiled version of the published data can be accessed at [link].
Funding
This research did not receive any specific funding.
CRediT authorship contribution statement
Jira Chonraksuk: Writing – review & editing, Writing – original draft, Visualization, Validation, Supervision, Software, Resources, Project administration, Methodology, Investigation, Formal analysis, Data curation, Conceptualization. Surapon Boonlue: Supervision.
Declaration of competing interest
The authors declare that they have no known competing financial interests or personal relationships that could have appeared to influence the work reported in this paper.
Acknowledgments
The author would like to thank the executives and officials of the Thai Cyber University Project, Office of the Permanent Secretary, Ministry of Higher Education, Science, Research and Innovation, for allowing the researcher to use data on the online learning behavior (ETL) of students on the Thai MOOC platform and for providing advice on organizing online teaching in the MOOC format. Moreover, I extend my gratitude to the instructors of the course Digital Media Creation on Social Networks (KMUTT015), who developed a quality online course that has gained popularity among a large number of students, and to the students in the experimental subjects. This resulted in a wealth of online learning behavior data with which to study student behavior and develop a predictive model.
Contributor Information
Jira Chonraksuk, Email: jira.cho@mail.kmutt.ac.th.
Surapon Boonlue, Email: surapon.boo@kmutt.ac.th.
References
- 1.Nittayathammakul V., Rattanasak S., Wannapiroon P., Nilsook P., Arora R., Thararattanasuwan K. Imagineering MOOC instructional design model to enhance creative thinking and creative health media innovation. International Journal of Emerging Technologies in Learning (IJET) 2023;18:84–102. doi: 10.3991/ijet.v18i19.38129. [DOI] [Google Scholar]
- 2.Wang R., Cao J., Xu Y., Li Y. Learning engagement in massive open online courses: a systematic review. Front Educ (Lausanne) 2022;7 doi: 10.3389/feduc.2022.1074435. [DOI] [Google Scholar]
- 3.Reich J. 2014. MOOC Completion and Retention in the Context of Student Intent. [Google Scholar]
- 4.Onah D., Sinclair J. 2014. Dropout Rates of Massive Open Online Courses: Behavioural Patterns. [DOI] [Google Scholar]
- 5.Murtaza M., Ahmed Y., Shamsi J.A., Sherwani F., Usman M. AI-based personalized E-learning systems: issues, challenges, and solutions. IEEE Access. 2022;10 doi: 10.1109/ACCESS.2022.3193938. [DOI] [Google Scholar]
- 6.Xie S.T., Chen Q., Liu K.H., Kong Q.Z., Cao X.J. Learning behavior analysis using clustering and evolutionary error correcting output code algorithms in small private online courses. Sci. Program. 2021;2021 doi: 10.1155/2021/9977977. [DOI] [Google Scholar]
- 7.Cen W., Gao Z., Xu R., Wu B., Zheng L., Zhao W., et al. 2021 IEEE Intl Conf on Dependable, Autonomic and Secure Computing, Intl Conf on Pervasive Intelligence and Computing, Intl Conf on Cloud and Big Data Computing, Intl Conf on Cyber Science and Technology Congress (DASC/PiCom/CBDCom/CyberSciTech) 2021. Extraction method for constructive proposals based on online comments; pp. 884–889. [DOI] [Google Scholar]
- 8.Raleiras M., Nabizadeh A.H., Costa F.A. Automatic learning styles prediction: a survey of the State-of-the-Art (2006–2021) Journal of Computers in Education. 2022;9 doi: 10.1007/s40692-021-00215-7. [DOI] [Google Scholar]
- 9.Wu M., Zhao H., Yan X., Guo Y., Wang K. 15th International Conference on Computer Science and Education, ICCSE 2020. 2020. Student achievement analysis and prediction based on the whole learning process. [DOI] [Google Scholar]
- 10.Liang J., Hare R., Chang T., Xu F., Tang Y., Wang F.Y., et al. Student modeling and analysis in adaptive instructional systems. IEEE Access. 2022;10 doi: 10.1109/ACCESS.2022.3178744. [DOI] [Google Scholar]
- 11.Hussain M., Hussain S., Zhang W., Zhu W., Theodorou P., Abidi S.M.R. ACM International Conference Proceeding Series. 2018. Mining moodle data to detect the inactive and low-performance students during the moodle course. [DOI] [Google Scholar]
- 12.Baturay M.H. An overview of the world of MOOCs. Procedia Soc Behav Sci. 2015;174:427–433. doi: 10.1016/j.sbspro.2015.01.685. [DOI] [Google Scholar]
- 13.Yuanyuan Z. MOOC teaching model of basic education based on fuzzy decision tree algorithm. Comput. Intell. Neurosci. 2022;2022 doi: 10.1155/2022/3175028. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 14.Thammetar T., Boonlue S., Khlaisang J., Duangchinda V., Theeraroungchaisri A. vol. 2022. 2022. (Trend Report of Higher Education & E-Learning in ASEAN : THAILAND). [Google Scholar]
- 15.Dhawal Shah. By the Numbers: MOOCs During the Pandemic n.d. https://www.classcentral.com/report/mooc-stats-pandemic (accessed August 24, 2023).
- 16.Donthu S., Veeran L., Sai Lakshmi Y.P., Yadav B. The intersection of AI and consumer behavior: predictive models in modern marketing. Remmitance Review. 2023;8 [Google Scholar]
- 17.Lin Y.S., Lai Y.H. Analysis of AI precision education strategy for small private online courses. Front. Psychol. 2021;12 doi: 10.3389/fpsyg.2021.749629. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 18.Chi Z., Zhang S., Shi L. Analysis and prediction of MOOC learners' dropout behavior. Appl. Sci. 2023;13 doi: 10.3390/app13021068. [DOI] [Google Scholar]
- 19.Khalil M., Ebner M. Clustering patterns of engagement in Massive Open Online Courses (MOOCs): the use of learning analytics to reveal student categories. J. Comput. High Educ. 2017;29:114–132. doi: 10.1007/s12528-016-9126-9. [DOI] [Google Scholar]
- 20.Blaschke L.M. Heutagogy and lifelong learning: a review of heutagogical practice and self-determined learning. Int. Rev. Res. Open Dist. Learn. 2012;13:56–71. doi: 10.19173/irrodl.v13i1.1076. [DOI] [Google Scholar]
- 21.Blaschke L.M. Using social media to engage and develop the online learner in self-determined learning. Res. Learn. Technol. 2014;22 doi: 10.3402/rlt.v22.21635. [DOI] [Google Scholar]
- 22.Agonács N., Matos J.F. Learner agency in distance education settings: understanding language MOOC learners’ heutagogical attribute, Unleashing the Power of Learner. Agency. 2021:107–115. [Google Scholar]
- 23.Hendriks R.A., de Jong P.G.M., Admiraal W.F., Reinders M.E.J. Motivation for learning in campus-integrated MOOCs: self-determined students, grade hunters and teacher trusters. Computers and Education Open. 2024;6 doi: 10.1016/j.caeo.2023.100158. [DOI] [Google Scholar]
- 24.Agonács N., Matos J.F., Bartalesi-Graf D., O'Steen D.N. Are you ready? Self-determined learning readiness of language MOOC learners. Educ. Inf. Technol. 2020;25:1161–1179. doi: 10.1007/s10639-019-10017-1. [DOI] [Google Scholar]
- 25.Wang S., Christensen C., Cui W., Tong R., Yarnall L., Shear L., et al. When adaptive learning is effective learning: comparison of an adaptive learning system to teacher-led instruction. Interact. Learn. Environ. 2023;31:793–803. doi: 10.1080/10494820.2020.1808794. [DOI] [Google Scholar]
- 26.Gligorea I., Cioca M., Oancea R., Gorski A.T., Gorski H., Tudorache P. Adaptive learning using artificial intelligence in e-learning: a literature review. Educ. Sci. 2023;13 doi: 10.3390/educsci13121216. [DOI] [Google Scholar]
- 27.Li Y.H., Zhao B., Gan J.H. 10th International Conference on Computer Science and Education. 2015. Make adaptive learning of the MOOC: the CML model. ICCSE 2015. [DOI] [Google Scholar]
- 28.Rimbaud Y., McEwan T., Lawson A., Cairncross S. vol. 2015. February, 2015. Adaptive learning in computing for non-native speakers. (Proceedings - Frontiers in Education Conference, FIE). [DOI] [Google Scholar]
- 29.Hmedna B., El Mezouary A., Baz O. A predictive model for the identification of learning styles in MOOC environments. Cluster Comput. 2020;23:1303–1328. doi: 10.1007/s10586-019-02992-4. [DOI] [Google Scholar]
- 30.Xing W., Du D. Dropout prediction in MOOCs: using deep learning for personalized intervention. J. Educ. Comput. Res. 2019;57 doi: 10.1177/0735633118757015. [DOI] [Google Scholar]
- 31.Khor ET Features identification and classification of discussion threads in Coursera MOOC forums. Transforming Teaching and Learning in Higher Education: A Chronicle of Research and Development in a Singaporean Context. 2020 doi: 10.1007/978-981-15-4980-9_10. [DOI] [Google Scholar]
- 32.Tseng S.F., Tsao Y.W., Yu L.C., Chan C.L., Lai K.R. Who will pass? Analyzing learner behaviors in MOOCs. Res. Pract. Technol. Enhanc. Learn. (RPTEL) 2016;11 doi: 10.1186/s41039-016-0033-5. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 33.Dyulicheva Y.Y. Learning analytics in MOOCs as an instrument for measuring math anxiety. Voprosy Obrazovaniya/Educational Studies Moscow. 2021;2021:243–265. doi: 10.17323/1814-9545-2021-4-243-265. [DOI] [Google Scholar]
- 34.Nouri J., Saqr M., Fors U. ICSIT 2019 - 10th International Conference on Society and Information Technologies, Proceedings. 2019. Predicting performance of students in a flipped classroom using machine learning: towards automated data-driven formative feedback. [Google Scholar]
- 35.Fahd K., Miah S.J., Ahmed K. Predicting student performance in a blended learning environment using learning management system interaction data. Appl. Comput. Inform. 2021 doi: 10.1108/ACI-06-2021-0150. [DOI] [Google Scholar]
- 36.Macarini L.A.B., Cechinel C., Machado M.F.B., Ramos V.F.C., Munoz R. Predicting students success in blended learning-Evaluating different interactions inside learning management systems. Appl. Sci. 2019;9 doi: 10.3390/app9245523. [DOI] [Google Scholar]
- 37.Nespereira C.G., Elhariri E., El-Bendary N., Vilas A.F., Redondo R.P.D. vol. 407. 2016. Machine learning based classification approach for predicting students performance in blended learning. (Advances in Intelligent Systems and Computing). [DOI] [Google Scholar]
- 38.Bingöl I., Kursun E., Kayaduman H. Factors for success and course completion in massive open online courses through the lens of participant types. Open Prax. 2019;12:223. doi: 10.5944/openpraxis.12.2.1067. [DOI] [Google Scholar]
- 39.Liao P., Xu J., Gong S., Liu W., Yi Y. ICCSE 2021 - IEEE 16th International Conference on Computer Science and Education. 2021. Clustering analysis of learners' watching sequences on MOOC videos. [DOI] [Google Scholar]
- 40.EdX Research Guide Events in the tracking logs n.d. https://edx.readthedocs.io/projects/devdata/en/stable/internal_data_formats/tracking_logs.html
- 41.Yu H., Miao C., Leung C., White T.J. Towards AI-powered personalization in MOOC learning. NPJ Sci Learn. 2017;2 doi: 10.1038/s41539-017-0016-3. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 42.Sinaga K.P., Yang M.S. Unsupervised K-means clustering algorithm. IEEE Access. 2020;8:80716–80727. doi: 10.1109/ACCESS.2020.2988796. [DOI] [Google Scholar]
- 43.Sreedhar C., Kasiviswanath N., Chenna Reddy P. Clustering large datasets using K-means modified inter and intra clustering (KM-I2C) in Hadoop. J Big Data. 2017;4 doi: 10.1186/s40537-017-0087-2. [DOI] [Google Scholar]
- 44.Cui M. vol. 1. Clausius Scientific Press; 2020. pp. 5–8. (Introduction to the K-Means Clustering Algorithm Based on the Elbow Method). [Google Scholar]
- 45.Barros R.C., Basgalupp M.P., Freitas A.A., De Carvalho A.C.P.L.F. Evolutionary design of decision-tree algorithms tailored to microarray gene expression data sets. IEEE Trans. Evol. Comput. 2014;18 doi: 10.1109/TEVC.2013.2291813. [DOI] [Google Scholar]
- 46.Liang J., Li C., Zheng L. ICCSE 2016 - 11th International Conference on Computer Science and Education. Institute of Electrical and Electronics Engineers Inc.; 2016. Machine learning application in MOOCs: dropout prediction; pp. 52–57. [DOI] [Google Scholar]
- 47.Swai C.T., Mangowi S.E. Mining school teachers' MOOC training responses to infer their face-to-face teaching strategy preference. International Journal of Information and Learning Technology. 2022;39:82–94. doi: 10.1108/IJILT-07-2021-0102. [DOI] [Google Scholar]
- 48.Shrestha S., Pokharel M. Machine Learning algorithm in educational data. International Conference on Artificial Intelligence for Transforming Business and Society. 2019 doi: 10.1109/AITB48515.2019.8947443. AITB 2019. [DOI] [Google Scholar]
- 49.Lan M., Hou X., Qi X., Mattheos N. Self-regulated learning strategies in world's first MOOC in implant dentistry. Eur. J. Dent. Educ. 2019;23 doi: 10.1111/eje.12428. [DOI] [PubMed] [Google Scholar]
- 50.Li Y. Evaluation of learning efficiency of massive open online courses learners. International Journal of Emerging Technologies in Learning. 2022;17 doi: 10.3991/ijet.v17i17.33849. [DOI] [Google Scholar]
- 51.Panagiotakopoulos T., Kotsiantis S., Kostopoulos G., Iatrellis O., Kameas A. Early dropout prediction in moocs through supervised learning and hyperparameter optimization. Electronics (Switzerland) 2021;10 doi: 10.3390/electronics10141701. [DOI] [Google Scholar]
- 52.Al-Shabandar R., Hussain A.J., Liatsis P., Keight R. Analyzing learners behavior in MOOCs: an examination of performance and motivation using a data-driven approach. IEEE Access. 2018;6:73669–73685. doi: 10.1109/ACCESS.2018.2876755. [DOI] [Google Scholar]