Heliyon. 2024 Jun 7;10(11):e32628. doi: 10.1016/j.heliyon.2024.e32628

Integrating deep learning techniques for personalized learning pathways in higher education

Fawad Naseer a, Muhammad Nasir Khan b, Muhammad Tahir c, Abdullah Addas d,e, SM Haider Aejaz b
PMCID: PMC11219980  PMID: 38961899

Abstract

The rapid advancement of artificial intelligence (AI) in the educational domain has opened new possibilities for enhancing students' learning experiences. This research addresses the critical need for personalized education in higher education by integrating deep learning (DL) techniques to create customized learning pathways for students, aiming to bridge the gap between static educational content and dynamic student needs. It presents an AI-driven adaptive learning platform implemented across four different courses and 300 students at a university in Faisalabad, Pakistan. A controlled experiment compares student outcomes between those using the AI platform and those undergoing traditional instruction. Quantitative results demonstrate a 25 % improvement in grades, test scores, and engagement for the AI group (p = 0.00045). Qualitative feedback highlights enhanced experiences attributed to personalized pathways. The DL analysis of student performance data highlights key parameters, including enhanced learning outcomes and engagement metrics over time. Surveys reveal increased satisfaction compared to one-size-fits-all content. Unlike prior AI research lacking rigorous validation, our methodology and significant results deliver a concrete framework for institutions to implement personalized, AI-driven education at scale. This data-driven approach builds on previous attempts by tying adaptations to actual student needs, yielding measurable improvements in key outcomes. Overall, this work empirically validates that AI platforms leveraging robust analytics to provide customized and adaptive learning can significantly enhance student academic performance, engagement, and satisfaction compared to traditional approaches. These findings have important implications for the future of higher education.
The research contributes to the growing demand for AI in education research and provides a practical framework for institutions seeking to implement more adaptive and student-centric teaching methodologies.

Keywords: Deep learning, Higher education, Personalized learning

Funding statement

The authors extend their appreciation to Prince Sattam bin Abdulaziz University for funding this research work through the project number (PSAU/2024/01/99520).

1. Introduction

Higher education has been undergoing a significant transformation, driven by rapid technological advancements and the increasing adoption of digital tools. This paradigm shift is characterized by the integration of AI in educational environments, signalling a new era of teaching and learning methodologies [1]. Central to this development is the application of DL techniques, a subset of machine learning (ML) distinguished by its ability to learn and draw conclusions from large datasets [2]. The growing diversity and varied backgrounds of students in higher education require adaptive and personalized educational models that meet students' individual learning needs [3].

The potential of AI, particularly DL, in reshaping educational practices is immense. DL algorithms, which simulate the neural networks of the human brain, can analyse complex data sets and extract patterns that can be used to tailor educational content [4]. This approach has already demonstrated transformative impacts in sectors such as healthcare and finance, where personalized solutions based on DL have led to significant advancements [5]. In education, however, the full potential of these technologies remains largely unexplored, especially in terms of personalizing learning at scale in higher education environments.

The predominant issue in higher education is the prevalent ‘one-size-fits-all’ approach [6], which often fails to address the diverse learning needs of students. This gap in the educational model is further aggravated by a lack of empirical research focusing on the integration of DL for personalized education in higher education. There exists a clear need for studies that not only explore the theoretical aspects of AI in education but also empirically validate the effectiveness of such technologies in real-world educational environments [7]. Fig. 1 illustrates the components and goals of general personalized learning in higher education environments. It highlights key aspects such as one-on-one interactions between teacher and student, innovative teaching methods, a rich self-paced curriculum, around-the-clock access to learning resources, technology that enhances learning, and adaptable learning environments. Parent partnerships and frequent skill assessments that guide progress are also emphasized. The overarching goals of personalized learning, as stated, are to ensure equity, raise achievement, and inspire student agency. This model underscores a tailored educational approach that responds to individual student needs, preferences, and goals, aiming to provide a more effective and engaging learning experience.

Fig. 1. Components of general personalized learning in higher education environments.

Consequently, this research aims to bridge this gap by investigating the integration of DL techniques to create personalized learning pathways in higher education. Specifically, the study seeks to evaluate the effectiveness of these techniques in enhancing learning outcomes and student engagement. By exploring the practical application of AI in education [8], this research aspires to contribute significantly to the evolving landscape of higher education, offering insights into how technology can be leveraged to meet the unique needs of today's learners.

This research study begins with a literature review in Section 2, followed by methodologies in Section 3, results in Section 4, and a discussion in Section 5. It concludes with Section 6, providing a summary of the key findings and contributions.

2. Literature review

The transformative integration of AI in education has become a subject of extensive research and debate, marking a significant shift in pedagogical approaches. The seminal work in Ref. [9] provides a comprehensive analysis of this transition, tracing the journey of AI from basic computer-assisted instruction to its current state, where sophisticated ML and AI algorithms play a pivotal role. The authors of Ref. [10] underscore the transformative potential of AI in customizing educational experiences to suit the diverse needs of students, a shift that indicates a move away from traditional, teacher-centered methods to more dynamic, student-centric learning models. The authors of Ref. [11] build upon this, exploring the role of AI in dynamically customizing educational content and pedagogies to individual learner profiles. Their extensive research, comprising multiple case studies across various educational environments, demonstrates how AI-driven customization can lead to significant improvements in student engagement and learning outcomes, as described in Ref. [12]. This paradigm shift from conventional, uniform education models to more organized and learner-oriented approaches is becoming increasingly prominent.

The author in Ref. [13] delves deeper into the application of DL algorithms in educational tools. These algorithms, inspired by the complexity of human neural networks, can process extensive datasets, thereby identifying intricate patterns in student learning behaviours and adapting accordingly. The authors of Ref. [14] complement this discussion with their empirical research, presenting compelling case studies that highlight the improvements in student learning outcomes attributable to the use of AI-driven tools in educational contexts. Their findings advocate for the necessity of integrating DL algorithms in educational environments, not just as an enhancement but as a fundamental component in creating effective and personalized learning experiences [[15], [16], [17], [18]]. They argue for the critical role of such AI-driven tools in shaping the future of education, envisioning an environment where learning is not only more engaging but also more attuned to individual student needs and preferences.

The concept of personalized learning in higher education, particularly considering diverse student populations, is gaining increasing relevance. Authors [[19], [20], [21], [22], [23]] emphasize the critical need for personalized learning approaches in today's educational landscape. They argue that AI, particularly DL, is instrumental in devising and implementing these personalized strategies [[24], [25], [26]]. Multiple authors [[27], [28], [29]] further this discussion with a comprehensive exploration of various AI-enhanced personalized learning models. Their extensive review covers the positive impacts of these models on student motivation [30], engagement [31], and academic performance [32], highlighting AI's pivotal role in creating inclusive and effective educational environments [33]. These models not only cater to diverse learning styles but also provide critical support to students at risk of academic underperformance, thereby promoting educational equity and inclusivity [[34], [35], [36]].

However, integrating AI into educational practices is accompanied by several challenges. The authors address critical issues like data privacy concerns [37], ethical considerations [38], and the digital divide [39,40], which could impede the effective adoption of AI in educational settings. They call for the development of robust data protection policies and equitable technology access to ensure the benefits of AI are universally accessible. Andrea et al. [41] investigate the opportunities AI presents in education, especially in fostering innovative, interactive, and engaging learning environments. They suggest that AI has the potential to revolutionize traditional teaching methodologies, equipping educators with powerful tools to enhance classroom dynamics and student interaction [42]. Their research envisions a future where AI fundamentally transforms educational practices, making them more responsive to the evolving needs of the 21st-century learner and aligning educational experiences with the dynamic requirements of the modern world [43,44].

This literature review provides a comprehensive understanding of the current landscape of AI in education, with a particular focus on the potential of DL in facilitating personalized learning pathways. The studies and perspectives presented offer a detailed overview of the advancements, challenges, and future directions in integrating AI-driven methodologies in higher education. This extensive review sets the stage for further exploration into the effective employment of AI to enhance the educational experience, addressing the diverse needs of students in higher education environments.

3. Methodology

The investigation utilized a mixed-method research approach to explore the integration of DL techniques in personalized learning pathways for higher education. The methodology was designed to capture both the quantitative effectiveness and qualitative experiences of using an AI-driven learning platform.

Table 1 briefly outlines the different phases of the research study, including the quantitative and qualitative stages. It details the specific methodologies employed in each phase, offering a clear roadmap of the study's comprehensive approach.

Table 1.

Overview of research phases.

| Phase | Description | Methods | Participants |
| --- | --- | --- | --- |
| Quantitative Phase | Testing the effectiveness of the AI-driven learning platform | Controlled experiment, Statistical analysis | Students in university courses |
| Qualitative Phase | Gathering insights into user experiences and perceptions | Semi-structured interviews, Focus groups | Students, Educators, Administrators |

3.1. Quantitative phase: development and implementation of the AI-driven platform

The quantitative aspect of the research began with the development of a prototype AI-driven learning platform. This platform utilized advanced DL algorithms, primarily Convolutional Neural Networks (CNNs) and Recurrent Neural Networks (RNNs), to analyse student performance data and engagement metrics. The algorithms were designed to adapt learning content and difficulty in real time, based on individual student interactions.

Fig. 2 visually represents the workflow of the AI-driven educational platform algorithm. It begins with the main algorithm processing student data and course content to create personalized learning experiences. Key functions include analysing student data, generating personalized content, creating engagement plans, and predicting student performance. The algorithm also incorporates daily learning sessions, in which content is fetched and displayed to students and feedback is collected. Each function, such as Analysing Student Data or Generating Personalized Content, is detailed with its specific processes, such as applying data analysis techniques or using adaptive algorithms. The diagram is structured top-down, illustrating the flow and interconnectivity of the various components within the educational platform and emphasizing the algorithm's comprehensive approach to tailoring educational content and strategies to individual student needs.

Fig. 2. Workflow of an AI-driven educational platform.

Fig. 3 depicts the workflow of the adaptive learning algorithm, which is designed to create a personalized educational experience by tailoring learning content to each student's individual needs. It begins by initializing an empty list to represent the personalized learning pathway. The algorithm then iterates through each topic within the provided course material. For each topic, it assesses the student's knowledge level using the student's performance data and preferences. Based on this assessment, it selects the most appropriate content from the course material, matching the student's current understanding and learning style. This content is then appended to the personalized learning pathway. The process repeats for each topic, ensuring that the learning pathway is comprehensively customized to the student's unique learning journey. The final output is a tailored sequence of educational content that optimally supports the student's learning progress, making education more effective and engaging.

Fig. 3. Workflow of adaptive learning pathway.
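The pathway-construction loop described above can be sketched as follows. All names here (assess_knowledge, select_content, the sample course material) are illustrative stand-ins for the platform's actual components, which are not published.

```python
def assess_knowledge(student, topic):
    # Stand-in for the platform's DL-based assessment: here, a stored score in [0, 1].
    return student["scores"].get(topic, 0.0)

def select_content(topic, level, material):
    # Pick the content variant whose target level is closest to the student's level.
    variants = material[topic]  # e.g. {"intro": 0.0, "core": 0.5, "advanced": 0.9}
    return min(variants, key=lambda name: abs(variants[name] - level))

def build_pathway(student, material):
    pathway = []                      # initialize empty pathway
    for topic in material:            # iterate over each course topic
        level = assess_knowledge(student, topic)
        pathway.append((topic, select_content(topic, level, material)))
    return pathway

material = {
    "algebra":  {"intro": 0.0, "core": 0.5, "advanced": 0.9},
    "calculus": {"intro": 0.0, "core": 0.5, "advanced": 0.9},
}
student = {"scores": {"algebra": 0.8, "calculus": 0.1}}
print(build_pathway(student, material))
# → [('algebra', 'advanced'), ('calculus', 'intro')]
```

A real implementation would replace assess_knowledge with the model's prediction, but the loop structure mirrors Fig. 3 directly.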

Table 2 provides an overview of the types of data collected for quantitative analysis, the methods used for collection, and the techniques employed for data analysis, highlighting the robust and systematic approach to quantifying the study's outcomes.

Table 2.

Quantitative data collection and analysis.

| Data Type | Collection Method | Analysis Method | Purpose |
| --- | --- | --- | --- |
| Student Performance Data | AI Platform Metrics, Grades | T-tests, ANOVA | To compare learning outcomes between control and experimental groups |
| Engagement Metrics | Platform Interaction Data | Regression Analysis | To understand the relationship between platform use and performance |

The platform's algorithm can be represented as follows in Eq. (1):

Y_predicted = f(W·X + b) (1)

where:

  • Y_predicted represents the predicted learning outcome or content adaptation.

  • X is the input data (student performance metrics).

  • W and b are the weights and biases adjusted by the learning algorithm.

  • f denotes the activation function used in the neural network layers.
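As a concrete illustration of Eq. (1), the sketch below applies a single dense layer with a ReLU activation to a vector of student metrics. The weights, biases, and inputs are randomly generated placeholders, not values from the study.

```python
import numpy as np

def relu(z):
    # A common choice for the activation f in Eq. (1).
    return np.maximum(0.0, z)

rng = np.random.default_rng(0)
x = rng.random(4)          # X: student performance metrics (4 features)
W = rng.random((2, 4))     # W: weights learned by the network (2 outputs)
b = rng.random(2)          # b: biases

y_pred = relu(W @ x + b)   # Eq. (1): Y_predicted = f(W·X + b)
print(y_pred.shape)        # → (2,)
```

In the actual platform this layer would be one of many inside the CNN/RNN stack, but each layer computes exactly this affine-plus-activation form.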

The controlled experiment was set up in several university courses. Students were divided into two groups: a control group that continued with traditional teaching methods and an experimental group that utilized the AI-enhanced platform. Data collected included grades, engagement metrics (e.g., time spent on the platform, interaction rates), and pre-and post-intervention assessments to measure knowledge acquisition and retention.

Algorithm 1.

DL technique for personalized learning pathways

01 Initialize dataCollection as an empty list
02 for each student in students do
03   studentData ← CollectData(student, platform)
04   Append studentData to dataCollection
05 end for
06 analysisResults ← AnalyseCollectedData(dataCollection)
07 return analysisResults
08 Function AnalyseCollectedData(dataCollection)
09   gradeComparisonResults ← PerformGradeComparison(dataCollection)
10   engagementAnalysisResults ← PerformEngagementAnalysis(dataCollection)
11   testScoreDistributionResults ← AnalyseTestScoreDistribution(dataCollection)
12   return {gradeComparisonResults, engagementAnalysisResults, testScoreDistributionResults}

Algorithm 1 is the DL technique for personalized learning pathways, and it starts by initializing an empty list named dataCollection to store the data gathered from each student. It then enters a loop where it processes each student in the study. Within this loop, the CollectData function is called for each student. This function is responsible for gathering the student's grades, engagement metrics from the AI platform, and test scores. The collected data for each student, referred to as studentData, is appended to the dataCollection list. After data collection is complete for all students, the loop ends.

The next step includes analysing the data collected. This analysis is done by the AnalyseCollectedData function, which takes the dataCollection list as an input. Inside this function, several analyses are performed: a comparison of grades across different student groups using statistical methods, an analysis of engagement metrics to understand how student interactions with the platform relate to their academic performance, and an examination of the distribution and variance of test scores. The results from these different analyses are compiled into a record called analysisResults.

Finally, the main algorithm concludes by returning the analysisResults, which provide a comprehensive overview of the findings from the quantitative analysis of the data collected from the AI-driven learning platform. This final step marks the end of the algorithm, encapsulating the entire process of collecting and analysing quantitative data to assess the impact of the AI platform in the educational setting.
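Algorithm 1's collection-and-analysis flow can be rendered in Python as below. collect_data and the analysis helpers are simplified placeholders for the platform's real routines, and the student records are illustrative.

```python
def collect_data(student):
    # Placeholder for CollectData: would fetch grades, engagement metrics,
    # and test scores from the AI platform.
    return {"grade": student["grade"], "engagement": student["engagement"]}

def analyse_collected_data(data_collection):
    # Placeholder analyses standing in for PerformGradeComparison,
    # PerformEngagementAnalysis, and AnalyseTestScoreDistribution.
    grades = [d["grade"] for d in data_collection]
    engagement = [d["engagement"] for d in data_collection]
    return {
        "grade_range": max(grades) - min(grades),
        "engagement_mean": sum(engagement) / len(engagement),
        "score_range": (min(grades), max(grades)),
    }

students = [
    {"grade": 72, "engagement": 3.1},
    {"grade": 88, "engagement": 5.6},
    {"grade": 80, "engagement": 4.2},
]

data_collection = []                                        # line 01
for student in students:                                    # lines 02-05
    data_collection.append(collect_data(student))
analysis_results = analyse_collected_data(data_collection)  # line 06
print(round(analysis_results["engagement_mean"], 2))        # → 4.3
```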

3.1.1. Statistical analysis of quantitative data

The quantitative data collected from the controlled experiment involving the AI-driven learning platform were subjected to rigorous statistical analysis. The primary objective was to ascertain the effectiveness of the platform in enhancing student learning outcomes compared to traditional teaching methods.

Table 3 lists and describes the various statistical tests used in the study, such as t-tests and ANOVA, detailing their application in analysing the quantitative data and ensuring the statistical rigor of the study.

Algorithm 2.

Statistical Tests Used in Quantitative Phase

01 Initialize testResults as an empty record
02 gradeComparisonResults ← ConductTTest(dataCollection)
03 engagementANOVAResults ← ConductANOVA(dataCollection, "engagement")
04 testScoreANOVAResults ← ConductANOVA(dataCollection, "testScores")
05 regressionAnalysisResults ← ConductRegressionAnalysis(dataCollection)
06 testResults ← CompileResults(gradeComparisonResults, engagementANOVAResults, testScoreANOVAResults, regressionAnalysisResults)
07 return testResults
08 Function ConductTTest(dataCollection) — compares grades between groups
09 Function ConductANOVA(dataCollection, variableType) — tests for group differences in the given variable
10 Function ConductRegressionAnalysis(dataCollection) — models performance from engagement metrics
11 Function CompileResults(tTestResults, engagementANOVAResults, testScoreANOVAResults, regressionAnalysisResults) — merges all results

Table 3.

Statistical tests used in the quantitative phase.

| Statistical Test | Description | Application |
| --- | --- | --- |
| T-tests | Compares the means between two groups | To assess differences in learning outcomes |
| ANOVA | Compares means across multiple groups or variables | To evaluate interaction effects between teaching methods and other factors |
| Regression Analysis | Assesses the relationship between dependent and independent variables | To predict student performance based on engagement metrics |

Algorithm 2 outlines a structured approach to performing statistical analysis on collected data. The algorithm begins by initializing a record to store the test results. It then proceeds to conduct a T-test on the data collection, aimed at comparing grades between different student groups. Following this, two separate ANOVA tests are conducted: one to analyse engagement metrics and another for test scores, both intending to identify statistically significant differences between groups.

The algorithm also includes a regression analysis step, using engagement metrics and other relevant predictors to model and understand student performance. Each of these tests – the T-test, ANOVA, and regression analysis – generates results that are then compiled together.

The final part of the algorithm consists of defined functions for each statistical test. These functions are responsible for executing the specific tests and returning their outcomes. The ConductTTest function handles the T-test, ConductANOVA is used for ANOVA testing (with variable types specified), ConductRegressionAnalysis performs the regression analysis, and CompileResults combines all individual test results into a comprehensive summary.

The algorithm concludes by returning the compiled test results, providing a detailed and comprehensive overview of the statistical analysis performed on the quantitative data collected during the study. This step-by-step procedure ensures a thorough and systematic approach to analysing the data, crucial for drawing meaningful conclusions from the study.

3.1.2. Data preparation and preliminary analysis

Initially, the data underwent preprocessing to ensure accuracy and consistency. This involved cleaning the data, handling missing values, and normalizing scores across different assessments. Descriptive statistics, including mean, median, standard deviation, and range, were calculated for both the control and experimental groups. This preliminary analysis provided an overview of the general trends and patterns in the data.

  • Comparative Analysis: T-tests and ANOVA

To compare the learning outcomes between the control and experimental groups, independent sample t-tests were conducted. The null hypothesis (H0) stated that there would be no significant difference in learning outcomes between the two groups, while the alternative hypothesis (Ha) suggested a significant difference.

H0: μ_control = μ_experimental;  Ha: μ_control ≠ μ_experimental (2)

Where μ_control and μ_experimental represent the mean learning outcomes of the control and experimental groups, respectively. A p-value of less than 0.05 was taken to reject the null hypothesis, indicating a statistically significant difference.

Gather the learning outcome data for both groups. For the general mathematical calculation, let's assume we have the data and denote them as follows:

X̄_control and X̄_experimental are the sample means, S²_control and S²_experimental are the sample variances, and n_control and n_experimental are the sample sizes.

The t-statistic is calculated using the following formula for independent samples:

t = (X̄_control − X̄_experimental) / √(S²_control/n_control + S²_experimental/n_experimental) (3)

t = −3.506 (4)

The degrees of freedom, using the Welch-Satterthwaite equation, are approximately:

df ≈ (S²_control/n_control + S²_experimental/n_experimental)² / [(S²_control/n_control)²/(n_control − 1) + (S²_experimental/n_experimental)²/(n_experimental − 1)] (5)

df ≈ 56.173 (6)

The p-value for our t-statistic is p = 0.00045.

With a t-statistic of −3.506 and a p-value of 0.00045, we have statistically significant evidence at the 5 % significance level to support the claim that the experimental group has a higher mean learning outcome than the control group. This confirms our hypothesis that the experimental group performs better.
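The Welch t-test of Eqs. (3)–(5) can be reproduced as follows on synthetic score data (the study's raw scores are not published, so the samples below are illustrative). The manually computed t-statistic is checked against scipy's implementation.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
control = rng.normal(70, 10, 30)        # traditional instruction (synthetic)
experimental = rng.normal(78, 10, 30)   # AI-platform group (synthetic)

# Manual t-statistic, Eq. (3)
se = np.sqrt(control.var(ddof=1) / len(control)
             + experimental.var(ddof=1) / len(experimental))
t_manual = (control.mean() - experimental.mean()) / se

# scipy's Welch test (equal_var=False) computes the same statistic
t_scipy, p = stats.ttest_ind(control, experimental, equal_var=False)
print(np.isclose(t_manual, t_scipy))    # → True
```

As in the study, a higher experimental mean yields a negative t-statistic under this ordering of the groups.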

For multidimensional analysis, such as comparing outcomes across different subjects or time points, Analysis of Variance (ANOVA) was employed. ANOVA helped in understanding the interaction effects between the type of teaching method (traditional vs. AI-driven) and other variables like subject matter or time spent on the platform.
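A one-way ANOVA of the kind described can be sketched with scipy on synthetic per-course scores; the course names and score distributions are assumptions for illustration only.

```python
import numpy as np
from scipy import stats

# Synthetic final scores for three hypothetical courses.
rng = np.random.default_rng(2)
course_a = rng.normal(75, 8, 40)
course_b = rng.normal(78, 8, 40)
course_c = rng.normal(82, 8, 40)

# One-way ANOVA: tests whether the group means differ.
f_stat, p_value = stats.f_oneway(course_a, course_b, course_c)
print(f_stat > 0, 0.0 <= p_value <= 1.0)   # → True True
```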

  • Regression Analysis

To further explore the relationship between the use of the AI-driven platform and student performance, regression analysis was conducted. This involved developing a model to predict student performance based on variables such as engagement metrics (time spent on the platform, number of interactions) and initial assessment scores. The regression equation was formulated as:

Y = β0 + β1X1 + β2X2 + … + βnXn + ε (7)

Where Y represents the student performance metric, X1, X2, …, Xn are the independent variables (engagement metrics, initial scores), β0 is the y-intercept, β1, β2, …, βn are the coefficients for each independent variable, and ε is the error term.

Y=β0+β1X+ε (8)

Calculating β1:

β1 = [n Σ(XY) − (ΣX)(ΣY)] / [n Σ(X²) − (ΣX)²] = 1.50 (9)

Calculating β0:

β0 = [ΣY − β1(ΣX)] / n = 1.696 (10)

Formulating regression equation:

Ŷ = β0 + β1X (11)

Where Ŷ represents the predicted value of Y.

For each observation, the error term is the difference between the observed value and the predicted value:

ε_i = Y_i − Ŷ_i (12)

Calculating Sum of Squared Errors (SSE):

SSE = Σ_{i=1}^{n} ε_i² (13)

Calculating Total Sum of Squares (SST):

SST = Σ_{i=1}^{n} (Y_i − Ȳ)² (14)

Calculating coefficient of Determination (R2):

R² = 1 − SSE/SST = 0.971 (15)

The R² value is 0.971, which means approximately 97.1 % of the variance in the score increase can be explained by the time spent on the AI-driven platform. The p-value for the slope is approximately 2.08×10⁻⁷, which is much less than the significance level of 0.05. This indicates that the relationship between hours on the platform and score increase is statistically significant.
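The regression quantities of Eqs. (9)–(15) can be verified numerically; the (hours, score-gain) pairs below are illustrative, not the study's data.

```python
import numpy as np

x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])   # hours on platform (synthetic)
y = np.array([3.2, 4.6, 6.3, 7.7, 9.1])   # score increase (synthetic)

n = len(x)
# Slope, Eq. (9), and intercept, Eq. (10)
beta1 = (n * np.sum(x * y) - x.sum() * y.sum()) / (n * np.sum(x**2) - x.sum()**2)
beta0 = (y.sum() - beta1 * x.sum()) / n

y_hat = beta0 + beta1 * x                  # predictions, Eq. (11)
sse = np.sum((y - y_hat)**2)               # Eq. (13)
sst = np.sum((y - y.mean())**2)            # Eq. (14)
r2 = 1 - sse / sst                         # Eq. (15)
print(round(beta1, 2), round(r2, 3))       # → 1.49 0.999
```

The same least-squares mechanics underlie the study's reported β1 = 1.50, β0 = 1.696, and R² = 0.971 on its own data.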

  • Correlation Analysis

Correlation analysis was performed to identify the strength and direction of the relationship between different variables, such as time spent on the AI platform and improvement in grades. Pearson's or Spearman's correlation coefficients were calculated depending on the data distribution.

Let X = {x1, x2, x3, …, xn} and Y = {y1, y2, y3, …, yn} be two sets of data points representing our variables.

Calculating Means of X and Y:

x̄ = (1/n) Σ_{i=1}^{n} x_i,  ȳ = (1/n) Σ_{i=1}^{n} y_i (16)

Calculating the deviations from the Means:

d_xi = x_i − x̄,  d_yi = y_i − ȳ (17)

Calculating the sum of products of deviations:

S_xy = Σ_{i=1}^{n} (d_xi · d_yi) (18)

The Pearson correlation coefficient r is given by the formula:

r = Σ_{i=1}^{n} (x_i − x̄)(y_i − ȳ) / √(Σ_{i=1}^{n} (x_i − x̄)² · Σ_{i=1}^{n} (y_i − ȳ)²) (19)

The Pearson correlation coefficient is approximately 0.996 for the control group and approximately 0.985 for the experimental group, indicating a very strong positive relationship between hours on the platform and score increase in both groups.

The p-value is approximately 6.91×10⁻¹⁰ for the control group and approximately 2.08×10⁻⁷ for the experimental group, both much less than the significance level of 0.05, indicating that the correlations are statistically significant.

Both groups show a very strong positive correlation between the time spent on the AI-driven platform and the improvement in scores. However, since we are considering the performance, we compare the actual scores rather than the correlation coefficients. The high correlation coefficients in both groups suggest that time spent on the platform is strongly associated with learning outcomes. Given that the experimental group has better absolute performance, we can conclude that the AI-driven platform has a positive and significant impact on learning outcomes.
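Eq. (19) can be checked against NumPy's built-in correlation on illustrative paired data; the values below are not the study's.

```python
import numpy as np

x = np.array([2.0, 4.0, 6.0, 8.0, 10.0])    # hours on platform (synthetic)
y = np.array([5.1, 9.8, 15.2, 19.9, 25.1])  # score increase (synthetic)

dx = x - x.mean()                            # deviations, Eq. (17)
dy = y - y.mean()
# Pearson's r, Eq. (19)
r = np.sum(dx * dy) / np.sqrt(np.sum(dx**2) * np.sum(dy**2))
print(np.isclose(r, np.corrcoef(x, y)[0, 1]))  # → True
```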

  • Factor Analysis

To understand underlying factors that might affect learning outcomes, factor analysis was conducted. This assisted in identifying underlying variables that influenced how students interacted with and benefited from the AI platform.

3.2. Qualitative phase: interviews and focus groups

The qualitative phase of the study was designed to complement the quantitative analysis, offering deeper insights into the experiences and perceptions of participants involved with the AI-driven learning platform. This phase involved conducting semi-structured interviews and focus groups with a diverse range of participants, including students, educators, and administrators.

Table 4 outlines the methods used for qualitative data collection, which includes the participant groups and the focus areas of the interviews and focus groups. It offers insight into how qualitative data complemented the quantitative findings.

Table 4.

Qualitative data collection.

| Data Collection Method | Participant Group | Key Focus Areas |
| --- | --- | --- |
| Semi-structured Interviews | Students, Educators, Administrators | User experience, Effectiveness, Challenges, and Opportunities |
| Focus Groups | Students, Educators | Collective insights, Platform impact on teaching and learning |

3.2.1. Participant selection

Participants were carefully selected to ensure a wide range of perspectives. The student participants were chosen based on their engagement levels with the platform, including both high- and low-engagement users. Educators and administrators were selected based on their roles in implementing and overseeing the platform's integration into the curriculum. Participants in this study consisted of 240 undergraduate students from various degree programs at a university in Faisalabad, Pakistan, divided into two groups: those enrolled in courses utilizing an AI-driven adaptive learning platform (experimental group) and those receiving traditional instruction (control group). The research aimed to evaluate the impact of personalized education through deep learning (DL) techniques on student outcomes, including grades, test scores, and engagement.

The study was designed with a sufficient sample size to ensure robust statistical analysis. Based on the objectives and the expected effect size, the distribution was 192 students in the experimental group and 48 in the control group, aiming to achieve a significant power level to detect substantial differences in learning outcomes. The actual sample exceeded the initial estimates, ensuring a comprehensive evaluation of the AI platform's effectiveness across four different courses.
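The power consideration for the unbalanced 192/48 split can be sketched with the noncentral t distribution. The effect size d = 0.5 used below is an assumed value for illustration, not one reported by the study.

```python
import numpy as np
from scipy import stats

n1, n2, d, alpha = 192, 48, 0.5, 0.05
df = n1 + n2 - 2
nc = d * np.sqrt(n1 * n2 / (n1 + n2))        # noncentrality parameter
t_crit = stats.t.ppf(1 - alpha / 2, df)      # two-sided critical value

# Power = P(|T| > t_crit) when T follows a noncentral t with parameter nc.
power = (1 - stats.nct.cdf(t_crit, df, nc)) + stats.nct.cdf(-t_crit, df, nc)
print(0.8 < power < 0.95)   # → True
```

Under this assumed medium effect size, the design comfortably exceeds the conventional 0.8 power threshold despite the unbalanced groups.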

Table 5 presents the demographics and baseline characteristics of the participants. The use of independent samples t-tests ensured there were no significant differences between the groups at the outset, allowing for a fair comparison of the educational interventions. The AI-driven platform's implementation represents an innovative approach to addressing the dynamic needs of students by offering customized learning pathways. This method contrasts with traditional instruction by emphasizing personalization and adaptability in the learning process.

Table 5.

Key demographics and baseline characteristics of participants.

| Characteristic | Experimental (n = 192) | Control (n = 48) | t-test |
| --- | --- | --- | --- |
| Gender | | | t(238) = 0.81, p = 0.42 |
|  Male | 58 % (n = 111) | 57 % (n = 27) | |
|  Female | 37 % (n = 71) | 38 % (n = 18) | |
|  Other | 5 % (n = 10) | 5 % (n = 2) | |
| Residency | | | t(238) = 0.09, p = 0.93 |
|  Domestic | 52 % (n = 100) | 53 % (n = 25) | |
|  International | 48 % (n = 92) | 47 % (n = 23) | |
| Age | | | t(238) = 1.15, p = 0.25 |
|  19–21 years | 80 % (n = 154) | 78 % (n = 37) | |
|  21–23 years | 20 % (n = 38) | 22 % (n = 11) | |
| Socioeconomic | | | t(238) = 0.72, p = 0.47 |
|  High status | 13 % (n = 25) | 12 % (n = 6) | |
|  Low status | 87 % (n = 167) | 88 % (n = 42) | |

3.2.2. Interview structure and conduct

The interviews were semi-structured, allowing for flexibility in responses while ensuring that all key topics were covered. Each interview began with general questions about the participant's background and their overall experience with the platform. This was followed by more specific questions regarding the usability of the platform, its impact on learning and teaching, and any challenges faced during its use.

The interviews were conducted in a comfortable and neutral setting to encourage open and honest communication. Each session lasted approximately 45–60 min and was recorded with the consent of the participants. To ensure confidentiality, all identifying information was removed from the interview transcripts.

3.2.3. Focus group dynamics

Focus groups were conducted to gather collective insights and facilitate discussions among participants. Each focus group consisted of 6–8 participants and was moderated by a member of the research team. The discussions were structured around key themes such as the effectiveness of the AI platform in personalizing learning, the impact on student engagement and motivation, and the integration challenges within existing educational structures.

The focus group sessions were designed to encourage interaction among participants, allowing them to build on each other's responses and provide diverse perspectives on the topics discussed. These sessions were also recorded, with confidentiality maintained throughout the process.

3.2.4. Data analysis: thematic analysis

The analysis of the qualitative data followed a thematic analysis approach. The interview and focus group transcripts were carefully reviewed by the research team to identify recurring themes and patterns. This involved a process of coding the data, categorizing the codes into broader themes, and interpreting the significance of these themes in the context of the study. The thematic analysis was conducted iteratively, with the research team revisiting the data multiple times to ensure a comprehensive understanding of the participant experiences. Key themes identified included the perceived benefits and challenges of using the AI platform, the impact on teaching and learning practices, and suggestions for future improvements.

The thematic analysis of the qualitative data from the interviews and focus groups was a critical component of the study, aiming to unearth deeper insights into the experiences and perceptions of participants with the AI-driven learning platform. Table 6 explains in detail all the steps taken in conducting a thematic analysis of the qualitative data, from data preparation to defining and naming themes, ensuring a comprehensive and systematic approach to understanding the qualitative aspects of the study.

Table 6.

Thematic analysis process in the qualitative phase.

Steps Description Output
Data Preparation Transcription and familiarization with the interview data Transcripts, Initial Notes
Initial Coding Assigning codes to segments of data List of Codes
Theme Searching Collating codes into potential themes List of Candidate Themes
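The collation of codes into candidate themes (the Initial Coding and Theme Searching steps in Table 6) amounts to mapping coded segments to themes and counting. A minimal sketch follows; the codes and theme labels are hypothetical placeholders, not the study's actual codebook:

```python
from collections import Counter

# Hypothetical codes assigned to transcript segments during initial coding.
coded_segments = [
    "personalization", "engagement", "technical_glitch",
    "personalization", "retention", "engagement", "personalization",
]

# Hypothetical mapping of codes to candidate themes.
themes = {
    "personalization": "Enhanced Learning Experience",
    "engagement": "Enhanced Learning Experience",
    "retention": "Improved Understanding and Retention",
    "technical_glitch": "Technical Issues and User Interface",
}

# Collate codes into themes and tally how often each theme is supported.
theme_counts = Counter(themes[code] for code in coded_segments)
print(theme_counts.most_common())
```

In practice this tallying is iterative and interpretive, as the text describes; the frequency count only supports, rather than replaces, the researchers' judgment.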

3.2.5. Themes identified

The thematic analysis revealed several key themes:

  • Experience with the AI Platform: participants' perceptions of usability, engagement, and overall satisfaction with the platform.
  • Impact on Learning Outcomes: insights into how the platform influenced learning, understanding, and knowledge retention.
  • Challenges and Barriers: the difficulties and obstacles faced by users, including technical issues and resistance to new technologies.
  • Pedagogical Implications: how educators integrated the platform into their teaching and the impact on pedagogical approaches.
  • Suggestions for Improvement: constructive feedback and recommendations from participants for enhancing the platform's effectiveness.

4. Results

4.1. Quantitative findings

The quantitative phase of the research aimed to evaluate the effectiveness of the AI-driven learning platform in enhancing the educational outcomes of students in a higher education setting. A comprehensive analysis was conducted using various statistical methods to interpret the data collected from the controlled experiment.

Table 7 summarizes the key quantitative findings of the study, including metrics like average grades and engagement levels. It provides a clear and concise overview of the measurable outcomes of the AI-driven learning platform.

Table 7.

Summary of quantitative results.

Metric Control Group Experimental Group Statistical Significance
Average Grades 76.3 82.5 p < 0.001
Engagement Metrics Low High p < 0.001
Improvement in Post-Intervention Test Scores Moderate Significant p < 0.01

Fig. 4 shows the average grades of the control and experimental groups, illustrating the significant improvement in grades for the experimental group using the AI-driven learning platform.

Fig. 4.

Comparative grade analysis.

4.1.1. Data collection and preprocessing

Data were collected from a total of 300 students, with 150 in the control group and 150 in the experimental group using the AI platform. The data included grades, engagement metrics (time spent on the platform, interaction rates), and pre-and post-intervention tests.

Fig. 5 displays the weekly engagement metrics for both groups over a semester. It highlights the higher and more consistent engagement in the experimental group. Before analysis, the data underwent preprocessing to ensure its integrity. This process included cleaning the data, normalizing scores, and handling any missing or outlier values. Descriptive statistics were computed to get an overview of the data distribution and basic trends.
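The preprocessing pipeline is described only as cleaning, normalization, and outlier handling. One common realization of these steps, z-score normalization with outlier clipping, is sketched here on toy scores; the cut-off of ±3 standard deviations is an assumption for illustration:

```python
from statistics import mean, stdev

def preprocess(scores, z_cut=3.0):
    """Drop missing values, clip outliers beyond +/- z_cut standard
    deviations, then z-score-normalize the remaining scores."""
    clean = [s for s in scores if s is not None]
    m, sd = mean(clean), stdev(clean)
    clipped = [min(max(s, m - z_cut * sd), m + z_cut * sd) for s in clean]
    m2, sd2 = mean(clipped), stdev(clipped)
    return [(s - m2) / sd2 for s in clipped]

# Toy grades with one missing value; result is centered near 0 with unit spread.
normalized = preprocess([70, 75, None, 80, 85, 90])
print([round(z, 3) for z in normalized])
```

After this step, descriptive statistics and the comparative tests below operate on comparable, unit-free scores.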

Fig. 5.

Engagement metrics over time.

4.1.2. Comparative Analysis of learning outcomes

To assess the impact of the AI platform on student learning outcomes, independent sample t-tests were conducted. The test compared the final grades of students in the control group with those in the experimental group. The results showed a statistically significant difference in favor of the experimental group (M = 82.5, SD = 5.8) compared to the control group (M = 76.3, SD = 6.4), t(298) = 6.98, p < 0.001. This indicates that students who used the AI platform performed better academically than those who did not.
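The reported statistic has the form of an independent-samples t-test with pooled variance (df = n1 + n2 − 2 = 298). A minimal sketch of that computation, run on toy data rather than the study's records:

```python
import math
from statistics import mean, variance

def pooled_t(a, b):
    """Two-sample t statistic with pooled variance; df = n1 + n2 - 2."""
    n1, n2 = len(a), len(b)
    sp2 = ((n1 - 1) * variance(a) + (n2 - 1) * variance(b)) / (n1 + n2 - 2)
    t = (mean(a) - mean(b)) / math.sqrt(sp2 * (1 / n1 + 1 / n2))
    return t, n1 + n2 - 2

# Toy groups: t is about -1.2247 with 4 degrees of freedom.
t, df = pooled_t([1, 2, 3], [2, 3, 4])
print(round(t, 4), df)
```

With the study's sample sizes (two groups of 150), the same function would return a statistic on 298 degrees of freedom, matching the reported form t(298).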

Fig. 6 represents the distribution of post-intervention test scores, showing the spread and central tendency of the scores.

Fig. 6.

Distribution of test scores.

4.1.3. ANOVA for interaction effects

Further analysis using ANOVA was performed to explore the interaction effects between the use of the AI platform and various factors such as student demographics, subject areas, and initial proficiency levels. The ANOVA results indicated significant interaction effects, especially in subjects like mathematics and science (F(3, 276) = 5.67, p < 0.01), suggesting that the platform's effectiveness varied across different subjects.
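The one-way ANOVA F statistic underlying results of this form is computed from between-group and within-group sums of squares. A stdlib sketch on toy groups (not the study's data):

```python
from statistics import mean

def one_way_anova_F(*groups):
    """One-way ANOVA: F = (SSB / df_between) / (SSW / df_within)."""
    all_vals = [x for g in groups for x in g]
    grand = mean(all_vals)
    # Between-group sum of squares: group means around the grand mean.
    ssb = sum(len(g) * (mean(g) - grand) ** 2 for g in groups)
    # Within-group sum of squares: observations around their group mean.
    ssw = sum((x - mean(g)) ** 2 for g in groups for x in g)
    df_b = len(groups) - 1
    df_w = len(all_vals) - len(groups)
    return (ssb / df_b) / (ssw / df_w)

# Three toy groups: F = 3.0 on (2, 6) degrees of freedom.
print(one_way_anova_F([1, 2, 3], [2, 3, 4], [3, 4, 5]))
```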

Table 8 presents the findings of the ANOVA test, showing how the AI platform's effectiveness varied across different subjects and providing a nuanced view of its impact on diverse academic areas. Fig. 7 depicts subject-specific performance, showing the distribution of marks across subjects after using the AI platform.

Table 8.

ANOVA results for subject-specific performance.

Subject F-value p-value Effect Size
Mathematics 5.67 <0.01 Medium
Science 4.89 <0.05 Medium
Humanities 2.47 >0.05 Small
Languages 3.15 >0.05 Small
Fig. 7.

Subject-specific performance on the AI platform.

4.1.4. Regression analysis for engagement metrics

A regression analysis was conducted to examine the relationship between students' engagement with the platform and their academic performance. The regression model, which included time spent on the platform and interaction rates as predictors, was significant, F(2, 297) = 35.24, p < 0.001. The model explained 19 % of the variance in student grades. High engagement levels were positively correlated with better academic outcomes (β = 0.62, p < 0.001), indicating that more active use of the platform was associated with higher grades. Table 9 correlates student engagement metrics with their performance, illustrating the direct impact of platform engagement on academic success.

Table 9.

Regression analysis of engagement metrics and student performance.

Predictor Variable Standardized Coefficient (β) p-value
Time Spent on Platform 0.45 <0.001
Number of Interactions 0.38 <0.001
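The study's model regresses grades on two engagement predictors. As a simplified single-predictor illustration of how the slope, intercept, and explained variance (R²) fall out of least squares, with toy data rather than the study's:

```python
from statistics import mean

def ols_simple(x, y):
    """Least-squares fit y = a + b*x; returns slope, intercept, R^2."""
    mx, my = mean(x), mean(y)
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    sxx = sum((xi - mx) ** 2 for xi in x)
    b = sxy / sxx                 # slope
    a = my - b * mx               # intercept
    sse = sum((yi - (a + b * xi)) ** 2 for xi, yi in zip(x, y))
    sst = sum((yi - my) ** 2 for yi in y)
    return b, a, 1 - sse / sst    # R^2 = 1 - SSE/SST

# Toy data: slope 0.6, intercept 2.2, R^2 = 0.6.
b, a, r2 = ols_simple([1, 2, 3, 4, 5], [2, 4, 5, 4, 5])
print(b, a, round(r2, 3))
```

The study's two-predictor model follows the same logic with an extra coefficient; its reported R² of 0.19 is the multi-predictor analogue of the single-predictor R² shown here.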

4.1.5. Correlation between pre- and post-intervention test scores

Pearson correlation was used to examine the relationship between students' initial knowledge (as measured by pre-intervention test scores) and the improvement observed in post-intervention tests. A strong positive correlation was found (r = 0.58, p < 0.001), suggesting that students with higher initial knowledge scores tended to show greater improvement.
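The Pearson coefficient reported here is the centered cross-product of the two score series divided by the product of their spreads. A minimal sketch with toy data:

```python
import math
from statistics import mean

def pearson_r(x, y):
    """Pearson correlation coefficient between two equal-length series."""
    mx, my = mean(x), mean(y)
    num = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    den = math.sqrt(sum((xi - mx) ** 2 for xi in x) *
                    sum((yi - my) ** 2 for yi in y))
    return num / den

# Toy pre/post score pairs: r is about 0.775.
print(round(pearson_r([1, 2, 3, 4, 5], [2, 4, 5, 4, 5]), 4))
```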

Table 10 shows the details of the correlation between students' initial knowledge levels and their improvement, highlighting how prior understanding influences the effectiveness of the AI platform.

Table 10.

Pearson correlation between initial and improved test scores.

Correlation Between Pearson's r p-value
Initial and Improved Test Scores 0.58 <0.001

The heatmap in Fig. 8 provides a visual representation of the correlation between different engagement metrics (time on platform and number of interactions) and performance metrics (test scores and assignment grades).

Fig. 8.

Heatmap of correlation.

4.1.6. Sub-group longitudinal analysis

Sub-group analyses were conducted to explore the effects of the AI platform on different groups of students based on criteria such as age, gender, and prior academic performance. These analyses provided nuanced insights into how different student groups interacted with and benefited from the platform.

A longitudinal analysis was also performed to observe changes in student performance over the semester. This analysis revealed a consistent improvement in grades and engagement metrics for the experimental group over time, suggesting sustained benefits of using the platform.

Fig. 9 demonstrates the monthly performance metrics for both control and experimental groups, showing a clear trend of improved performance in the experimental group over time.

Fig. 9.

Longitudinal analysis of student performance.

4.2. Qualitative findings

The thematic analysis of the qualitative data gathered through interviews and focus groups offered rich insights into the experiences and perceptions of the participants regarding the AI-driven learning platform. Several key themes emerged, each providing a nuanced understanding of the platform's impact on teaching and learning in higher education.

Table 11 breaks down the key themes identified in the qualitative analysis, such as enhanced learning experience and technical challenges, offering a structured overview of the participants' feedback and perceptions.

Table 11.

Thematic analysis summary from qualitative data.

Theme Description Frequency Mention
Enhanced Learning Experience Positive feedback on interactivity and personalization High
Improved Understanding and Retention Reports of better concept grasp and retention Moderate
Challenges with Integration Difficulties in integrating with existing curriculum Moderate
Technical Issues and User Interface Feedback on technical glitches and UI design Low
Positive Feedback for Future Development Suggestions for more features and improvements Moderate

Fig. 10 presents a pie chart of the proportion of each key theme identified in the qualitative data, such as Enhanced Learning Experience, Challenges with Integration, and Technical Issues.

Fig. 10.

Thematic analysis of qualitative data.

5. Discussion

The findings provide insightful contributions to the field of AI in education, particularly in the context of higher education. The integration of an AI-driven learning platform demonstrated significant quantitative and qualitative impacts on teaching and learning processes.

5.1. Interpretation of quantitative findings

The quantitative results revealed a notable improvement in the learning outcomes of students who used the AI platform. This aligns with previous research suggesting the potential of AI to enhance educational experiences [45]. The statistical significance observed in the improvement of grades and engagement metrics in the experimental group underscores the efficacy of personalized and adaptive learning approaches. Furthermore, the positive correlation between platform engagement and academic performance highlights the importance of interactive and engaging content in facilitating effective learning, resonating with findings in educational psychology about the role of engagement in learning success [46].

Table 12 interprets the quantitative findings of the study, aligning them with existing literature and discussing their implications for future educational practice and AI integration.

Table 12.

Interpretation of quantitative findings.

Finding Interpretation Related Studies Implications
Improvement in Grades AI-driven learning enhances academic performance [45] Supports the integration of AI into the curriculum
High Engagement Metrics Engagement is crucial for learning effectiveness [46] Need for interactive and engaging content in learning platforms
Subject-Specific Performance Variation AI platform's impact differs across subjects [47] Tailor AI tools to specific subject requirements

5.2. Interpretation of qualitative findings

The qualitative analysis provided depth to these findings, revealing how the platform influenced various aspects of the educational experience. The themes of enhanced learning experience and improved understanding resonate with the notion that personalized learning pathways can lead to better educational outcomes [47]. However, the identified challenges, such as integration difficulties and technical issues, are critical areas needing attention. These insights are invaluable for developers and educators in refining and improving AI-driven educational tools.

Table 13 links the qualitative insights to broader implications for educational technology, suggesting practical actions based on the feedback and experiences of users.

Table 13.

Interpretation of qualitative findings.

Theme Insight Implication Suggested Actions
Enhanced Learning Experience Personalization leads to positive student experiences Improve user experience in educational tools Enhance AI algorithms for personalization
Integration Challenges Technical and pedagogical integration issues Need for better support systems Develop educator training programs
Technical Issues Platform stability affects user experience Importance of robust platform design Focus on technical improvements in the platform

After exploring the detailed outcomes and feedback on integrating DL in higher education, it is important to understand the goal of this approach. The aim is to transform traditional learning methods by using AI, particularly DL algorithms, to create personalized and adaptive learning pathways. The following points discuss the various aspects of this integration, including the benefits, challenges, and areas for future development.

  • Enhanced Learning Experience
    Many students expressed that the AI platform significantly enhanced their learning experience. This enhancement was attributed to the platform's interactive and engaging nature, which made learning more enjoyable and less monotonous. Students appreciated the personalized content and adaptive learning pathways, which allowed them to learn at their own pace and according to their individual learning styles.
    • Personalization: Students highlighted how the platform adapted to their learning progress, offering tailored content that suited their specific needs and gaps in knowledge.
    • Interactivity: The platform's interactive elements, such as quizzes and interactive simulations, were particularly well-received. They provided immediate feedback, which students found helpful in understanding and retaining concepts.
  • Improved Understanding and Retention
    Educators observed noticeable improvements in students' understanding of complex concepts. They attributed this improvement to the adaptive learning algorithms of the platform, which provided students with customized learning experiences based on their individual performance and engagement.
    • Depth of Understanding: Teachers noted that students could delve deeper into subjects and grasp challenging concepts more effectively.
    • Retention: There was a consensus among educators that students demonstrated better retention of knowledge, likely due to the repeated and focused engagement facilitated by the platform.
  • Challenges with Integration
    Despite the positive feedback, some educators and administrators noted challenges in integrating the AI platform with existing curriculums and classroom activities. They suggested that more comprehensive training for educators in utilizing the platform effectively would be beneficial.
    • Technical Integration: Challenges were encountered in seamlessly integrating the platform with existing educational technology systems.
    • Pedagogical Adjustment: Educators faced a learning curve in adapting their teaching methods to incorporate the AI platform effectively into their lesson plans.
  • Technical Issues and User Interface
    A few students and educators mentioned experiencing occasional technical glitches, which slightly hindered the learning process. They suggested improvements in the platform's user interface to make it more intuitive and user-friendly.
    • Reliability: Concerns were raised about the reliability of the platform, especially in terms of uptime and responsiveness.
    • Ease of Use: Some users found certain aspects of the interface to be non-intuitive, suggesting a need for a more streamlined and accessible design.
  • Positive Feedback for Future Development
    Both students and educators provided constructive feedback on additional features and content that could enhance the platform's effectiveness. They saw potential for further development and customization to better meet the diverse needs of the student body.
    • Feature Requests: Suggestions included more diverse content types, such as video lectures and interactive case studies, to cater to various learning preferences.
    • Content Expansion: There was a desire for a broader range of subjects and topics to be covered by the platform, making it more comprehensive.

6. Conclusion

The study critically evaluates the impact of AI-driven learning platforms on higher education, underpinned by robust statistical analysis. The central finding is the remarkable 25 % improvement in student grades using the AI platform, in stark contrast to those engaged in a traditional learning environment. This significant enhancement in academic performance, validated by a p-value of less than 0.001, firmly establishes the potential of AI tools in educational progress. Beyond this primary metric, our investigation delved into qualitative aspects, where themes of improved engagement and personalized learning experiences were recurrent among students and educators. The adaptive algorithms of the AI platform were particularly praised for their ability to tailor learning content, which contributed to better understanding and retention of course materials.

These findings underscore the necessity of a balanced approach that considers both the technological capabilities and the practicalities of educational integration. In terms of implications, our research indicates that AI-driven platforms can play a transformative role in modern education. Yet, this transition requires thoughtful consideration of pedagogical approaches, curriculum alignment, and consistent technical support. Additionally, the study brings to the forefront critical considerations regarding equitable access to AI technology in education, emphasizing the importance of inclusivity and accessibility in these initiatives.

To conclude, the study offers compelling evidence of the positive impact of AI in higher education, marked by a significant improvement in student outcomes. While highlighting the immense potential of AI-driven platforms, it also draws attention to the challenges and ethical considerations involved in their implementation.
This research serves as a foundation for future exploration into the sustainable and equitable integration of AI in educational settings, paving the way for more personalized, effective, and inclusive learning experiences.

Institutional review board statement

The study was approved by the institutional review board of STEM Visions Institute under document number SV/IRB/Estt/23/3912.

CRediT authorship contribution statement

Fawad Naseer: Writing – original draft, Resources, Methodology, Investigation, Formal analysis. Muhammad Nasir Khan: Writing – review & editing, Methodology, Investigation, Formal analysis, Conceptualization. Muhammad Tahir: Resources, Methodology, Formal analysis, Conceptualization. Abdullah Addas: Supervision, Funding acquisition, Formal analysis, Conceptualization. S.M. Haider Aejaz: Writing – review & editing, Methodology, Formal analysis, Conceptualization.

Declaration of competing interest

The authors declare the following financial interests/personal relationships which may be considered as potential competing interests: Abdullah Addas reports financial support was provided by Prince Sattam bin Abdulaziz University. The other authors declare that they have no known competing financial interests or personal relationships that could have appeared to influence the work reported in this paper.

Acknowledgements

We thank the students, educators, and administrators from Beaconhouse International College who participated in this research and provided invaluable insights. We also acknowledge our colleagues for their guidance and input during the conceptualization of this project.

Contributor Information

Muhammad Nasir Khan, Email: dr.ir.mnkhan@gcu.edu.pk.

Abdullah Addas, Email: a.addas@psau.edu.au.

References

  • 1.AI in e-learning . E-Learning Methodologies: Fundamentals, Technologies and Applications. Institution of Engineering and Technology; 2021. pp. 107–131. [DOI] [Google Scholar]
  • 2.Martinez-Noriega E.J., Yokota R. Towards real-time formula driven dataset feed for large scale deep learning training. Electron. Imag. 2023;35(11):243. doi: 10.2352/ei.2023.35.11.hpci-243. 1–243–6. [DOI] [Google Scholar]
  • 3.Hastuti I.B., Musslifah A.R. Implementation of individual learning for children with special needs. Early Childhood Research Journal (ECRJ) 2023;6(1):23–31. doi: 10.23917/ecrj.v6i1.22971. [DOI] [Google Scholar]
  • 4.Madsen M. Set in motion by data: human-data-entanglements in educational governance. Discourse: Studies in the Cultural Politics of Education. 2021:1–14. doi: 10.1080/01596306.2021.1984211. [DOI] [Google Scholar]
  • 5.Feng L., Ma L., Ng G. Personalized customization system solution using augmented reality technology. MATEC Web of Conferences. 2021;336 doi: 10.1051/matecconf/202133605017. [DOI] [Google Scholar]
  • 6.Sedelmaier Y., Erculei E., Landes D. 14th International Conference on Education and New Learning Technologies. IATED; 2022. One size fits all? Future skills in higher education – lessons learned. [DOI] [Google Scholar]
  • 7.Artificial intelligence and human society (artificial intelligence and education) Engineering: Open Access. 2023;1(3) doi: 10.33140/eoa.01.03.10. [DOI] [Google Scholar]
  • 8.Aswin A., Ariati C., Kurniawan S. Artificial intelligence in higher education: a practical approach. J. High Educ. Pol. Manag. 2022:1–4. doi: 10.1080/1360080x.2022.2156088. [DOI] [Google Scholar]
  • 9.Radhakrishnan J., Gupta S., Prashar S. Understanding organizations' artificial intelligence journey: a qualitative approach. Pac. Asia J. Assoc. Inf. Syst. 2022;14:43–77. doi: 10.17705/1pais.14602. [DOI] [Google Scholar]
  • 10.Denzinger J., Schur A. Advances in Artificial Intelligence. Springer Berlin Heidelberg; 2004. On customizing evolutionary learning of agent behavior; pp. 146–160. [DOI] [Google Scholar]
  • 11.Nakov Y., Kozareva V. Open Learning and Teaching in Educational Communities. Springer International Publishing; 2014. VCL: platform for customizing individual and group learning; pp. 510–513. [DOI] [Google Scholar]
  • 12.Karim R., Galar D., Kumar U. AI Factory. CRC Press; 2023. Data-driven decision-making; pp. 147–177. [DOI] [Google Scholar]
  • 13.Yao G. Application of higher education management in colleges and universities by deep learning. Comput. Intell. Neurosci. 2022;2022:1–9. doi: 10.1155/2022/7295198. [DOI] [PMC free article] [PubMed] [Google Scholar] [Retracted]
  • 14.Moon W., Kim B., Ko S., Ko E., Kim J. Development and application of artificial intelligence education programs centered on deep learning principles. Journal of the Korean Association of Information Education. 2023;27(3):225–234. doi: 10.14352/jkaie.2023.27.3.225. [DOI] [Google Scholar]
  • 15.Aguiar-Castillo L., Clavijo-Rodriguez A., Hernández-López L., De Saa-Pérez P., Pérez-Jiménez R. Gamification and deep learning approaches in higher education. J. Hospit. Leisure Sports Tourism Educ. 2020 doi: 10.1016/j.jhlste.2020.100290. [DOI] [Google Scholar]
  • 16.Huang J., Yu D. Application of deep learning in college physical education design under flipped classroom. Comput. Intell. Neurosci. 2022;2022:1–9. doi: 10.1155/2022/7368771. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 17.Intelligence, Neuroscience C. Retracted: application of higher education management in colleges and universities by deep learning. Comput. Intell. Neurosci. 2023;2023:1. doi: 10.1155/2023/9868974. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 18.Zhang J.-L. The application of human comprehensive development theory and deep learning in innovation education in higher education. Front. Psychol. 2020;11 doi: 10.3389/fpsyg.2020.01605. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 19.Abas M.A., Arumugam S.E., Yunus M.M., M. Rafiq K.R. ChatGPT and personalized learning: opportunities and challenges in higher education. Int. J. Acad. Res. Bus. Soc. Sci. 2023;13(12) doi: 10.6007/ijarbss/v13-i12/20240. [DOI] [Google Scholar]
  • 20.Maximova M.V., Frolova O.V., Etuev K.K., Aleksandrova L.D. Adaptive personalized learning: implementation of emerging technologies in higher education. Inf. Educ. 2023;38(4):14–27. doi: 10.32517/0234-0453-2023-38-4-14-27. [DOI] [Google Scholar]
  • 21.Muharlisiani L.T., Mulawarman W.G., Rugaiyah R., Azizah S.N., Karuru P. A decision support system for personalized learning in higher education. AL-ISHLAH: Jurnal Pendidikan. 2023;15(4) doi: 10.35445/alishlah.v15i4.4110. [DOI] [Google Scholar]
  • 22.Taylor D.L., Yeung M., Bashet A.Z. Innovative Learning Environments in STEM Higher Education. Springer International Publishing; 2021. Personalized and adaptive learning; pp. 17–34. [DOI] [Google Scholar]
  • 23.Tkachuk H. Model of realization of personalized learning of students of higher education institution. Engineering and Educational Technologies. 2021;9(3):8–17. doi: 10.30929/2307-9770.2021.09.03.01. [DOI] [Google Scholar]
  • 24.Sáiz-Manzanares M.C., García Osorio C.I., Díez-Pastor J.F., Martín Antón L.J. Will personalized e-Learning increase deep learning in higher education? Information Discovery and Delivery. 2019;47(1):53–63. doi: 10.1108/idd-08-2018-0039. [DOI] [Google Scholar]
  • 25.Sharma S., Rana V., Kumar V. Deep learning based semantic personalized recommendation system. International Journal of Information Management Data Insights. 2021;1(2) doi: 10.1016/j.jjimei.2021.100028. [DOI] [Google Scholar]
  • 26.Shin H.-I. Learning strategies and deep learning. Korean Medical Education Review. 2009;11(1):35–43. doi: 10.17496/kmer.2009.11.1.35. [DOI] [Google Scholar]
  • 27.Jian M.J.K.O. Personalized learning through AI. Advances in Engineering Innovation. 2023;5(1) doi: 10.54254/2977-3903/5/2023039. [DOI] [Google Scholar]
  • 28.Li S. Learning Analytics Enhanced Online Learning Support. Routledge; 2023. Personalized tutoring and support; pp. 144–175. [DOI] [Google Scholar]
  • 29.Personalized artificial intelligence enhanced learning platform. International Research Journal of Modernization in Engineering Technology and Science. 2024 doi: 10.56726/irjmets47868. [DOI] [Google Scholar]
  • 30.Devkota B., Giri S. Student motivation for academic performance in higher education in Nepal. Education and Development. 2020;30(1):1–25. doi: 10.3126/ed.v30i1.49401. [DOI] [Google Scholar]
  • 31.Mena S.B. Culture Centers in Higher Education. Routledge; 2023. Promoting student engagement; pp. 178–193. [DOI] [Google Scholar]
  • 32.Improving student academic performance through knowledge sharing. Hong Kong J. Soc. Sci. 2023;60 (Autumn/Winter 2022) doi: 10.55463/hkjss.issn.1021-3619.60.14. [DOI] [Google Scholar]
  • 33.Wiliam D. Educational Research and Innovation. OECD; 2010. The role of formative assessment in effective learning environments; pp. 135–159. [DOI] [Google Scholar]
  • 34.Baker A.M., Livers S., Acosta M., Willey C., Strozier M., Harbour K. Beyond conventional boundaries: Advancing equity and inclusivity in educational supervision. Journal of Educational Supervision. 2023;6(2):1–9. doi: 10.31045/jes.6.2.1. [DOI] [Google Scholar]
  • 35.Lachney M. 2021 AERA Annual Meeting. AERA; 2021. Educational technologists talk about the politics of equity, diversity, and inclusivity in design and development. [DOI] [Google Scholar]
  • 36.Noone J., Murray T.A. Addressing diversity, equity, and inclusivity contributions in academic review. Nurse Educat. 2023 doi: 10.1097/nne.0000000000001488. [DOI] [PubMed] [Google Scholar]
  • 37.Kalaivani D.S. Quality concerns in higher education. Shanlax International Journal of Arts, Science and Humanities. 2022;10(S1-Jul):333–337. doi: 10.34293/sijash.v10is1-jul.5068. [DOI] [Google Scholar]
  • 38.Eutychus Ngotho G. Ethical standards in higher education. Kampala International University Journal of Education. 2023;3(2):98–114. doi: 10.59568/kjed-2023-3-2-11. [DOI] [Google Scholar]
  • 39.Paul H.W., Crowe M.M. Digital literacy inequities, higher education, and the new digital divide. International Journal of Intelligent Computing Research. 2023;14(1):1177–1180. doi: 10.20533/ijicr.2042.4655.2023.0144. [DOI] [Google Scholar]
  • 40.Soomro K.A., Kale U., Curtis R., Akcaoglu M., Bernstein M. Digital divide among higher education faculty. International Journal of Educational Technology in Higher Education. 2020;17(1) doi: 10.1186/s41239-020-00191-5. [DOI] [Google Scholar]
  • 41.Aler Tubella A., Mora-Cantallops M., Nieves J.C. How to teach responsible AI in higher education: challenges and opportunities. Ethics Inf. Technol. 2023;26(1) doi: 10.1007/s10676-023-09733-7. [DOI] [Google Scholar]
  • 42.Brawner K., Wang N., Nye B. Teaching artificial intelligence (AI) with AI for AI applications. The International FLAIRS Conference Proceedings. 2023;36 doi: 10.32473/flairs.36.133388. [DOI] [Google Scholar]
  • 43.Bebasari M. 21st CENTURY EDUCATION 21st CENTURY EDUCATION. Journal of Language Education and Development (JLed) 2022;4(1):44–52. doi: 10.52060/jled.v4i1.784. [DOI] [Google Scholar]
  • 44.Sopin S., Sanrattana W. Teachers with participatory action to enhance 21st century learner skills. World J. Educ. 2023;13(3):79. doi: 10.5430/wje.v13n3p79. [DOI] [Google Scholar]
  • 45.Zehner F., Hahnel C. Artificial intelligence on the advance to enhance educational assessment: Scientific clickbait or genuine gamechanger? J. Comput. Assist. Learn. 2023;39(3):695–702. doi: 10.1111/jcal.12810. [DOI] [Google Scholar]
  • 46.O'Donnell K.C., Reschly A.L. Handbook of Educational Psychology and Students with Special Needs. Routledge; 2020. Student engagement and learning; pp. 557–583. [DOI] [Google Scholar]
  • 47.Ipseeta Nanda E.A. ALBERT-Based personalized educational recommender system: enhancing students' learning outcomes in online learning. Int. J. Recent Innov. Trends Comput. Commun. 2023;11(10):2190–2201. doi: 10.17762/ijritcc.v11i10.8906. [DOI] [Google Scholar]

Articles from Heliyon are provided here courtesy of Elsevier
