Scientific Reports. 2025 Jul 19;15:26246. doi: 10.1038/s41598-025-09405-0

The effectiveness of MOOCs in Technical Education: an Indian perspective

Priyanka Jarial 1, Himanshu Aggarwal 1, Bhim Sain Singla 2
PMCID: PMC12276326  PMID: 40683915

Abstract

The analysis of user engagement in online educational courses has drawn considerable attention from researchers due to the rise of Massive Open Online Courses (MOOCs). The effectiveness of MOOCs (EMOOCs) has been identified as a crucial factor by numerous researchers, leading to the development of various MOOC frameworks, models, theories, designs, and principles for evaluating online learning for both students and instructors. This study uses the IBM SPSS statistical package to identify the factors contributing to EMOOCs, examining eight direct and indirect determinants. Data were collected from technical courses offered at tertiary institutions in the Punjab region of Northern India, assuming that the students already possessed knowledge of MOOCs. The study revealed that EMOOCs are enhanced by concentrating on factors such as Course Content and Design, Performance Evaluation, Learner Feedback, Teacher-Student Interaction, and Technological Support. Discriminant validity checks (the HTMT ratio and the Fornell-Larcker criterion) validate the results. The study provides valuable insights into the development and implementation of MOOCs tailored to the specific needs of learners in Northern India, with a focus on Punjab, and identifies key characteristics that impact MOOC effectiveness, offering practical suggestions for educators and organizations seeking to enhance the learning experience in this domain.

Supplementary Information

The online version contains supplementary material available at 10.1038/s41598-025-09405-0.

Keywords: Confirmatory factor analysis, Reliability analysis, Structural equation modelling (SEM), Discriminant Validity, EMOOCs (effectiveness of MOOCs)

Subject terms: Psychology, Engineering

Introduction

MOOCs refer to Massive (huge), Open (free), Online (web-based) Courses (resources). Massive refers to the sharing of online repositories via the web, while Open refers to the e-resources and digital tools readily available on digital platforms1,2. Online refers to the exchange of knowledge and ideas among peers at a click, regardless of geographical location, time zone, and age. Courses refer to structured course e-content and curriculum design, tiered to users’ level of understanding (beginner, intermediate, advanced) to engage them on online platforms3. Unlike traditional methods, users find MOOCs to be a self-instructional tool offering readily available e-resources and tools4. MOOCs are therefore evolving as one of the most recent developments in online education5. They have become established as a reliable medium for sharing e-resources and a golden opportunity for students to enrol in top-notch tertiary educational institutions worldwide. This suggests that MOOCs are enhancing learners’ engagement, reliability, and willingness to rely on online media6. Owing to growing demand and increasing student involvement, educational institutions are working diligently to create more online programs, as these e-resources provide students with more possibilities for learning7. Thus, MOOCs are a widely acknowledged breakthrough in revolutionizing higher education teaching and learning methodology8.

Additionally, sophisticated platforms and integrated technologies such as Google Meet, Microsoft Forms, Zoom, and WebEx9 have made it easy to access online resources. Therefore, MOOCs have gained widespread recognition as a transformative innovation in the educational system8. Shah et al.10,11 found that learner enrollment surged by 30% worldwide in a single year following the introduction of e-resources, suggesting that reliance on the Internet has intensified considerably. It is further projected that in the coming years educational institutions will introduce an average of 2,000 new courses annually12. This trend has led to a growing interest among researchers in the utilization and performance of data tools. Thus, scholars have formulated numerous conclusions and hypotheses to analyze diverse teaching-learning methodologies7 and the factors influencing tertiary educational institutions to adopt MOOCs.

The Indian education system has also embraced the digital transformation of teaching and learning methodologies, replacing traditional approaches. To promote e-learning, the Indian Government, through the Ministry of Education (formerly the Ministry of Human Resource Development, MHRD), has taken the initiative to promote the online education system. It has introduced the SWAYAM (Study Webs of Active-Learning for Young Aspiring Minds) platform to achieve the three cardinal principles of education policy: access, equality, and quality. The main goals of the e-learning education policy are to make the most of internet resources and deliver the most excellent teaching and learning materials to every home. In higher education institutions, the teaching-learning system and students’ opinions have been profoundly impacted by this change in learning approaches13. This shift has prompted numerous academics and researchers to study the effects that various technological tools and resources have had on teachers and students.

India ranks second, after the USA, in embracing this digital transformation14. This suggests that the shift from the traditional educational system to online teaching has had a positive impact on the relationships between tutors and learners15. Recent studies also suggest that the online tools adopted by tutors positively impact users’ understanding of technical features and personality traits16. Specifically, the use of online tools identifies four major factors influencing the work of academicians: efficiency, expectations, experience, and external support17,18. It has been concluded that the most effective way to assess digital platforms (e-resources) is through the extended utilization of technology acceptance models.

Therefore, this paper incorporates the Structural Equation Modelling (SEM) methodology to robustly demonstrate the influence of the online medium on the users. Additionally, it unfolds the factors contributing to the effectiveness of MOOCs (EMOOCs) among the users by following this structure.

  1. Initially, a Comprehensive Literature Review will be conducted to explore the impact of MOOCs on learners and tutors.

  2. Building upon the existing literature, a Research Methodology is adopted to analyze the effectiveness of MOOCs (EMOOCs).

  3. The Research Questions (elaborated in Appendix 1) and depicted in Table 1 serve as the foundation for the study, followed by the presentation and testing of the results.

  4. Finally, Hypotheses are formulated and tested using Structural Equation Modelling to illustrate the factors that influence the effectiveness of MOOCs (EMOOCs).

Table 1.

Research questions.

Sr. no. Research questions (elaborated in Appendix 1)
1 How do the learner’s characteristics affect the e-learning platform?
2 How do the tutor’s characteristics affect the e-learning platform?
3 How does the teacher-student interaction affect the e-learning platform?
4 Are the course materials updated? How does their presentation contribute to the effectiveness of the e-learning platform?
5 Does Technological Support influence e-learning?
6 How is the Performance Evaluation of the courses carried out?
7 How does the course feedback support e-learning?
8 How much does an online resource affect the student’s evaluation?
9 How do MOOCs influence the measurement of their effectiveness in terms of achieving goals?

The study’s findings would benefit both learners and academicians in theoretical and practical applications within their respective areas of interest.

Literature review and hypothesis framework development

Online courses are recognized as one of the most convenient ways to provide learners with an affordable platform for knowledge acquisition. These courses serve as Open Educational Resources that can be adapted, acknowledged, and distributed freely19. MOOCs have played a transformative role in the education system. The Government’s introduction of MOOC guidelines has significantly accelerated the Open Educational Resources (OER) movement in India20. This transformation has enabled Indian students to access structured learning through schedules, assessments, and an interactive learning environment21.

India’s journey in e-learning began with the introduction of the Computer Literacy And Studies in Schools (CLASS) program in the early 1980s. The program was introduced in 2,598 schools and expanded to nearly 600,000 schools over almost three decades. The Indira Gandhi National Open University (IGNOU) began broadcasting educational programs on Doordarshan in 1991. Subsequently, Netvarsity was established by NIIT (National Institute of Information Technology) in 1996, followed by the launch of a virtual campus in 199922,23. These efforts resulted in programs such as the Gyandarshan education channel in 2000 and the launch of EDUSAT in 2004, the world’s first educational satellite, which expanded access to education1,13.

The Indian Institute of Management in Bangalore (IIM-B) collaborated with the Commonwealth of Learning (COL), laying the groundwork for future digital education policies in India. It has been highlighted that MOOCs are essential in a nation like India, where the majority of the population resides in rural regions and lacks access to quality education and skill development due to financial constraints, physical limitations, or transportation issues. MOOCs offer a valuable approach to addressing these issues and promoting universal access to education. With India being home to one of the world’s largest youth populations and a rapidly expanding internet base, MOOCs are increasingly viewed as a strategic tool for skill development and global competitiveness24,25. These platforms help bridge the gap between educational supply and demand, especially when university seats are limited26. Indian learners rank among the most active MOOC users globally, with significant participation on platforms like Coursera, edX, and Udacity. Studies revealed that 61% of Indian users come from metropolitan cities, with Mumbai and Bangalore (in the Southern region of India) alone accounting for 18% of the national user base27. The share of Indian learners on edX rose from 11% to 27% in a single year.

In comparison, 19% of Udacity’s nano-degree participants were Indian (Shah, 2016). The increasing reliance on MOOCs for career advancement and higher-education preparation via digital platforms like SWAYAM, mookit, NPTEL, and IITBombayX has become key to this educational surge. This e-learning model is supported by various Government-aided and non-aided educational institutions12,20.

Additionally, the CEO of edX has stated that the digital platform holds significant potential in India. Capable Indian students in various educational institutions face limited opportunities due to inadequate financial support from the Government9. However, the success rate of MOOCs in Indian higher education institutions has not been systematically evaluated. Utilizing MOOCs as an e-learning platform in India would be a practical strategy for implementing green computing in tertiary education. Thus, MOOCs are found to be a key element in aligning the Indian education system with global trends.

MOOCs have significantly transformed tertiary education by offering scalable and accessible learning solutions that address global skill gaps and align with industry demands. This alignment ensures that students acquire relevant skills that enhance employability and meet the evolving demands of diverse industries. MOOCs now play a crucial role in bridging the gap between academic programs and the evolving demands of the global labor market. They have grown into a global phenomenon, with over 120 million learners enrolled in more than 13,500 courses offered by over 900 universities worldwide28. This expansion reflects the increasing demand for, as well as the flexibility and accessibility of, educational options that transcend geographical and economic barriers. Table 2 illustrates the development phases of MOOCs in India and their global acceptance.

Table 2.

Growth paradigm of MOOCs in India.

Year Development phase of MOOCs in India
1980 Introduction to e-learning
1991 IGNOU aired educational programs in collaboration with National Television (Doordarshan)
1996 NIIT commercially stepped into the e-learning platform
1999 India collaborated with the UK to launch Bachelor’s Degree programs29
2000 The GOI & IGNOU collaborated and launched the Gyandarshan Educational Channel
2003 UGC collaborated with the COL
2004 ISRO launched the first educational satellite (EDUSAT)
2005-14 MOOCs (Massive Open Online Courses) have gained popularity
2014 Indians became the second-largest group after the USA in utilising MOOCs
2016 Indian users’ enrollment in courses on platforms like Udacity and edX increased by approximately 18–20%, accompanied by a 53% employability rate for users upon completing their bachelor’s programs
2017 MHRD introduced SWAYAM
2019 Indians became the most active users of MOOCs (Massive Open Online Courses)
2023-28 An expected 40% increase in the adoption of the MOOCs30

COL - Commonwealth of Learning; IGNOU - Indira Gandhi National Open University; GOI - Government of India; NIIT - National Institute of Information Technology; UGC - University Grants Commission; SWAYAM - Study Webs of Active-Learning for Young Aspiring Minds; MHRD - Ministry of Human Resource Development.

The proposed study framework adopts the models in31,32 as its base for understanding the factors affecting MOOCs. The base study involves system design, system delivery, and service quality. Additionally, factors such as Course Content Design and Presentation, Teacher-Student Interaction, Tutor Characteristics, Student Evaluation, Effectiveness of MOOCs (EMOOCs), Course Feedback, Participant Characteristics, and Performance Evaluation serve as the constructs of the study framework.

One of the key determinants of MOOCs effectiveness (EMOOCs) is the design of Course Content coupled with Technological Support, which serves as the foundation for Hypotheses H1 and H2. Prior studies, including those by Fianu33 and Yang34, have highlighted the critical role of content structure, presentation quality, and instructional design in enhancing learner engagement and satisfaction. Clear module organization, effective use of multimedia, and adherence to sound pedagogical principles contribute to improved comprehension and learner retention. In addition, Technological features—such as intuitive navigation, simulations, and interactive platforms—streamline access to learning materials and equip participants with essential digital competencies35,36. Merely providing internet connectivity is insufficient; the effectiveness of MOOCs (EMOOCs) also relies heavily on reliable Technical infrastructure and user-centric digital interfaces37,38, which collectively support the assimilation of complex concepts.

Closely tied to these elements are Teacher-Student Interactions and Tutor Characteristics, underpinning Hypotheses H3 through H6. Scholars like Subramanian39 and Moore40,41 have underscored the importance of instructor-learner engagement in fostering motivation and deep learning. Drawing on Vygotsky’s Zone of Proximal Development (1978) and its extensions by Cole4, guided interaction emerges as a central mechanism for enhancing learning outcomes. Learner commitment and academic accomplishment are strongly influenced by tutors’ traits, such as teaching methodology, attentiveness, and evaluation techniques42,43. Additionally, learner-specific characteristics, including communication abilities, preferred learning styles, and self-regulation, play a critical role in determining how learning occurs36,44. Studies by Hong et al.45 and Willis46 consistently affirm that quality interaction between students and instructors positively affects performance in blended and online learning environments.

Equally important is the role of student Assessment and Feedback, which informs content improvement and Learners’ Performance, forming the basis for Hypotheses H7 and H8. Real-time feedback allows educators to adapt course content and enables students to monitor their progress effectively47,48. The integration of advanced tools—such as Python-based text mining, natural language processing, and knowledge graph frameworks—has revolutionized the collection and interpretation of learner feedback49–51. These technologies address traditional scalability and personalization challenges, enabling tailored learning experiences52–54. MOOCs, in contrast to traditional educational institutions, offer continuous assessments through tests and interactive exercises55, enabling real-time performance monitoring5. Such dynamic evaluation techniques greatly improve learner outcomes and facilitate the broader integration of MOOCs into higher education systems, according to research by Chen et al. (2018)47,56. Feedback systems enhance the effectiveness of training by fostering a learner-centred ecosystem and enhancing the quality of course material57,58.

Lastly, the growing importance of blended and hybrid learning models—where Massive Open Online Courses (MOOCs) augment in-person instruction—is highlighted by contemporary educational trends. Based on Vygotsky’s developmental theory, research59–61 shows that hybrid approaches combine the best features of traditional and digital formats to achieve better academic results62. These models are particularly effective when supported by equitable assessment practices and opportunities for collaborative interaction63–65. Evidence suggests that such approaches foster deeper engagement, critical thinking, and knowledge construction while also enabling personalized learning trajectories66–68. These observations are consistent with the research’s conceptual framework, which views technology infrastructure, instructional assistance, and content quality as crucial components in assessing the efficacy of MOOCs (EMOOCs). As a result, the encouraging effects of online resources in blended learning settings support the necessity of conducting empirical research on the related hypotheses listed in Table 7.

Table 7.

Hypotheses formulated.

H1: The Course Content and Design have a positive effect on the EMOOCs
H2: The Technological Support has a positive effect on the EMOOCs
H3: The Teacher-Student Interaction has a positive effect on the EMOOCs
H4: The Participant’s Characteristics have a positive effect on the Teacher-Student Interaction
H5: The Student’s Evaluation has a positive effect on the Teacher-Student Interaction
H6: The Teacher’s Characteristics have a positive effect on the Teacher-Student Interaction
H7: The Performance Evaluation has a positive effect on the EMOOCs
H8: The Course Feedback has a positive effect on the Course Content and Design

Methods

Sampling design

The responses were gathered from students of technical courses studying at universities in the Northern Region of India; a random sampling method was employed to select the students. The link to the Google Form was sent to 600 selected students, assuming an 80% response rate, and they were requested to participate in the survey. Complete responses were gathered from 385 students and used for the statistical analysis and hypothesis testing. The sample size of 385 responses was considered satisfactory and representative, as it satisfies Bartlett’s sample size criteria for a very large population at a 95% confidence level and a 5% margin of error. The sample size is also adequate according to Nunnally (1979), who states that the sample size should be 8 to 10 times the number of statements included in the structural model. The study has 45 statements, so the minimum required sample size is 360. Thus, the sample of 385 students is representative of the study population.
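Both sample-size checks described above can be reproduced directly. The sketch below is illustrative Python, not the authors' code: it applies Cochran's formula for an effectively infinite population at a 95% confidence level and 5% margin of error, alongside the eight-responses-per-item rule of thumb.

```python
import math

def cochran_sample_size(z: float = 1.96, p: float = 0.5, e: float = 0.05) -> int:
    """Minimum sample size for a very large population.

    z: z-score for the confidence level (1.96 for 95%)
    p: assumed response proportion (0.5 is the most conservative)
    e: margin of error
    """
    return math.ceil(z**2 * p * (1 - p) / e**2)

def rule_of_thumb(n_items: int, multiplier: int = 8) -> int:
    """Nunnally-style rule: at least `multiplier` responses per scale item."""
    return n_items * multiplier

print(cochran_sample_size())  # 385, matching the study's target
print(rule_of_thumb(45))      # 360 for the 45-item instrument
```

The 385 complete responses therefore satisfy both criteria.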

Survey instrument design

The questionnaire was designed following an extensive literature review of established variable scales. The responses were collected through a well-designed and pre-tested questionnaire (elaborated in Appendix 1). Respondents from Northern India, specifically from Punjab, provided their responses on a 5-point Likert agreement scale, ranging from ‘strongly disagree’ (1) to ‘strongly agree’ (5), indicating their level of agreement with each instrument statement regarding their experience with a particular online platform. The response data were kept confidential so that no individual respondent can be identified from the study’s results. All data items were exported to the IBM Statistical Package for the Social Sciences (SPSS 26) software for analysis. A combination of Exploratory and Confirmatory Factor Analysis methods was employed to validate the dataset’s measurement results. Finally, the Structural Equation Model was formulated for hypothesis testing using SmartPLS 4 software.

Research procedures

Study design and participant selection criteria

This study focuses on learners who utilize online resources to enhance their skill set. It involved learners from graduate or post-graduate technical courses in the Northern Region, Punjab, India. Participants in the study had to be able to utilize online resources effectively, regardless of gender. All participants provided informed consent in the questionnaire. The criteria for excluding participants from the study dataset were incomplete or missing responses and inconsistent data. A total of 487 participants submitted responses; however, 79 entries were discarded during the data cleaning process. By applying these exclusion criteria, we retained 385 responses to ensure the validity of our results regarding the factors influencing the effectiveness of MOOCs (EMOOCs). The research methodology, purpose, and protocols were rigorously reviewed and subsequently approved by our department’s ethics committee. The methods employed in this study conformed to applicable guidelines and regulations. The Declaration of Helsinki’s ethical guidelines for medical research involving human participants were strictly followed in the study’s planning, implementation, and reporting. All personal identifiers were eliminated to protect the privacy of the participants, and confidentiality was upheld during the data processing stage.

Ethics approval

This study was approved by the Departmental DRB committee in the ANNUAL PROGRESS REPORT of the April 2024 session. All research involving human subjects adhered to the ethical standards of the institution and aligns with the 1964 Helsinki Declaration and its subsequent updates, or equivalent ethical standards. Consent was obtained from all participants before the questionnaire was administered, with the assurance that their identities would be kept confidential and their responses used solely for research purposes.

The questionnaire was designed using the sample items shown in Appendix 1 and was distributed via online media using Google Forms and email. This medium was considered reliable by previous researchers54,69, who have collected responses via paper and paperless means such as the Internet, newspapers, TV, and institutional invitations. Furthermore, the findings derived from the collected data were validated using various statistical approaches, including validity and reliability tests38,69. Following the same research paradigm, the survey questionnaire was distributed to technical educational institutions in the Northern Region of India, reaching students, faculty members, and research scholars who had prior knowledge of online platforms like ZOOM, Microsoft Teams, and Google Meet, through personal networks such as WhatsApp and email. Volunteers were also invited to contribute to the survey. The sampling methodology followed an approach similar to that adopted by Wang70,71 for the validation and hypothesis testing of the sampling dataset using Structural Equation Modelling.

Analysis and Results

Descriptive statistics results

Overall, 487 individual respondents made submissions. Nevertheless, only 408 respondents answered all of the questions. After responses with missing values were eliminated through the data cleaning process, 385 responses made up the final dataset that formed the basis of this investigation. According to a sampling study72, when the population size exceeds 50,000, a minimum sample size of 384 is considered valid and sufficient for analysis (at a 95% confidence level). The sample consisted of 228 females (59.22%) and 157 males (40.78%), with the majority (90.25%) pursuing a Bachelor’s degree or higher qualification. The survey instrument consisted of 9 dimensions (see Table 5) with a total of 45 items (see Appendix 1). A 5-point Likert scale was used, ranging from 1 (lowest) to 5 (highest). For reliability and validity testing, factor loadings and internal consistency (Cronbach’s alpha) were calculated21 using SPSS. Items with factor loadings above 0.500 were retained73. Finally, AVE (Average Variance Extracted) and CR (Composite Reliability) were also calculated for the covariates. The Fornell-Larcker method and the HTMT ratio were also employed to assess the discriminant validity of the covariates (Tables 3 and 4).
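The internal-consistency statistic used throughout Table 5 is Cronbach's alpha, α = k/(k−1) · (1 − Σσᵢ²/σ_T²), where k is the number of items, σᵢ² the variance of each item, and σ_T² the variance of the summed scale. For readers without SPSS, the following minimal Python sketch implements the formula on hypothetical Likert responses (not the study's data):

```python
from statistics import variance

def cronbach_alpha(items):
    """Cronbach's alpha for a scale.

    items: list of columns, each a list of one item's scores
    across the same respondents.
    """
    k = len(items)
    item_var_sum = sum(variance(col) for col in items)
    # Total score per respondent across all items
    totals = [sum(scores) for scores in zip(*items)]
    return k / (k - 1) * (1 - item_var_sum / variance(totals))

# Hypothetical 5-point Likert responses: 4 items x 6 respondents
demo = [
    [4, 5, 3, 4, 2, 5],
    [4, 4, 3, 5, 2, 5],
    [5, 5, 2, 4, 3, 4],
    [4, 5, 3, 4, 2, 4],
]
print(round(cronbach_alpha(demo), 3))
```

Values above 0.7, as reported for all nine constructs in Table 5, are conventionally taken as acceptable internal consistency.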

Table 5.

Reliability test.

S. no. Latent variables Number of items
(Items from Table 1)
Reliability (Cronbach’s α)
1 Participant’s Characteristics

4

(10,15,21,36)

0.936
2 Tutors Characteristics

6

(6,11,14,20,31,37)

0.897
3 Teacher-Student Interaction

4

(8,27,29,39)

0.917
4 Course Content Design & Presentation

9

(1,4,13,22,24,17,30,34,38)

0.878
5 Technological Support

4

(5,25,32,45)

0.893
6 Performance Evaluation

4

(28,33,40,43)

0.9
7 Course Feedback

4

(3,18,26,42)

0.897
8 Student Evaluation

5

(9,12,16,23,41)

0.904
9. Effectiveness of MOOCS (EMOOCs)

5

(2,7,19,35,44)

0.917

Table 3.

KMO and Bartlett’s test.

KMO and Bartlett’s test
Kaiser-Meyer-Olkin measure of sampling adequacy 0.955
Bartlett’s test of sphericity Approx. Chi-square 12948.320
Df 990
Sig. 0.000

Table 4.

EFA (Exploratory Factor Analysis):

Constructs
Course content and design Tutors characteristics Student evaluation EMOOCs Course feedback Participants characteristics Performance evaluation Teacher-student interaction Technological support Extracted communalities
CC9 0.761 0.726
CC7 0.748 0.676
CC1 0.739 0.691
CC2 0.719 0.635
CC5 0.717 0.671
CC6 0.701 0.695
CC3 0.687 0.645
CC4 0.673 0.647
CC8 0.659 0.697
TC3 0.795 0.745
TC6 0.784 0.691
TC4 0.748 0.668
TC2 0.737 0.655
TC1 0.719 0.682
TC5 0.709 0.662
SE4 0.796 0.749
SE3 0.763 0.746
SE1 0.760 0.713
SE5 0.751 0.733
SE2 0.728 0.684
Emooc*2 0.746 0.763
Emooc*3 0.720 0.785
Emooc*5 0.713 0.730
Emooc*1 0.705 0.736
Emooc*4 0.687 0.743
CF4 0.838 0.820
CF3 0.815 0.789
CF2 0.777 0.732
CF1 0.742 0.742
PC1 0.839 0.748
PC2 0.837 0.739
PC3 0.812 0.697
PC4 0.796 0.758
PE4 0.813 0.813
PE1 0.790 0.778
PE3 0.717 0.727
PE2 0.693 0.805
TSI2 0.731 0.792
TSI4 0.731 0.780
TSI1 0.716 0.782
TSI3 0.700 0.783
TS4 0.700 0.825
TS3 0.693 0.842
TS1 0.682 0.801
TS2 0.669 0.787
Eigen values (> 1) 18.020 2.822 2.540 2.226 2.054 1.947 1.427 1.049 1.019
% of variance 40.045 6.272 5.643 4.946 4.565 4.326 3.171 2.332 2.265

Extraction: Principal Component Analysis, Rotation Method: Varimax with Kaiser Normalisation, Rotation convergent in 7 iterations.

Structural Equation Model (SEM) Assessment

The Structural Equation Model (SEM) assessment approach enables simultaneous testing of all proposed Hypotheses (H1–H8, as shown in Table 7) within a unified framework. It facilitates the examination of both direct and indirect relationships among variables, allowing for a comprehensive assessment of the multiple factors that influence the effectiveness of MOOCs (EMOOCs). Accordingly, this section elaborates on the research questions presented in Table 1. It also discusses the results of the reliability analysis (Table 5), the Exploratory Factor Analysis (EFA, Table 4), the Confirmatory Factor Analysis (CFA, Table 6), and the Hypotheses developed (Table 7).

Table 6.

CFA (Confirmatory Factor Analysis) results.

Constructs Outer loadings Cronbach’s alpha (α) Composite reliability (CR) Average variance extracted (AVE) VIF
CC1 Course Content and Design 0.765 0.936 0.935 0.617 2.566
CC2 0.723 2.288
CC3 0.749 2.099
CC4 0.805 2.319
CC5 0.775 2.485
CC6 0.822 2.642
CC7 0.752 2.356
CC8 0.876 2.545
CC9 0.795 2.803
CF1 Course Feedback 0.958 0.897 0.896 0.684 2.341
CF2 0.767 2.298
CF3 0.789 2.861
CF4 0.779 3.148
Emooc*1 EMOOCs 0.823 0.917 0.917 0.688 2.536
Emooc*2 0.797 2.69
Emooc*3 0.867 3.051
Emooc*4 0.85 2.543
Emooc*5 0.806 2.438
PC1 Participants Characteristics 0.774 0.878 0.878 0.643 2.411
PC2 0.831 2.287
PC3 0.803 2.098
PC4 0.798 2.318
PE1 Performance Evaluation 0.818 0.893 0.893 0.678 2.224
PE2 0.957 2.78
PE3 0.795 2.303
PE4 0.704 2.503
SE1 Student Evaluation 0.788 0.9 0.9 0.643 2.341
SE2 0.804 2.221
SE3 0.812 2.413
SE4 0.727 2.472
SE5 0.872 2.442
TC1 Teacher Characteristics 0.794 0.897 0.897 0.592 2.063
TC2 0.74 2.067
TC3 0.801 2.602
TC4 0.698 2.037
TC5 0.832 2.094
TC6 0.744 2.153
TS1 Technological Support 0.84 0.921 0.921 0.746 2.859
TS2 0.878 2.728
TS3 0.864 3.537
TS4 0.871 3.232
TSI1 Teacher Student Interaction 0.842 0.904 0.904 0.701 2.583
TSI2 0.83 2.635
TSI3 0.867 2.611
TSI4 0.809 2.553

Exploratory Factor Analysis (EFA)

In order to identify the principal components or latent factors affecting the study, EFA (Exploratory Factor Analysis) was applied to the 45 items listed in Appendix 1. EFA identifies principal components or latent factors based on the correlations among pairs of items. It is the initial step in analyzing the latent structure among observed variables without applying any predetermined framework or constraints. During EFA, items that correlate highly with each other are grouped under one factor construct; these constructs are identified by analyzing their loadings74. In this study, EFA resulted in the formation of nine constructs, each with its corresponding items, as listed in Table 4: Course Content Design & Presentation, Technological Support, Teacher-Student Interaction, Tutors’ Characteristics, Student Evaluation, EMOOCs (Effectiveness of MOOCs), Course Feedback, Participants’ Characteristics, and Performance Evaluation. This measuring methodology validates the structure of the suggested hypothesis dimensions (H1 to H8) by confirming that each item aligns with its intended factor, which is crucial when applying it to new contexts, such as Indian MOOCs. The significant factor loadings for the extracted factors lie within the following ranges: Course Content & Design (0.659–0.761), Technological Support (0.669–0.700), Teacher-Student Interaction (0.700–0.731), Tutors’ Characteristics (0.709–0.795), Student Evaluation (0.728–0.796), EMOOCs (0.687–0.746), Course Feedback (0.742–0.838), Participants’ Characteristics (0.796–0.839), and Performance Evaluation (0.693–0.813). The extracted factors with eigenvalues > 1 account for 70% of the variance, while the discarded items with eigenvalues less than 1 contributed the remaining 30%. The communalities of the extracted items are greater than 0.5, indicating a good fit for PCA (Principal Component Analysis) in the study38,75,76.

Table 4 indicates the item numbers from the questionnaire used as the basis for the factor loadings. Before applying EFA to the selected items (refer to Appendix 1), two assumptions were examined: the adequacy of sampling (measured by the Kaiser-Meyer-Olkin (KMO) test) and the non-identity of the correlation matrix (tested using Bartlett’s test); the results appear in Table 3. The KMO statistic of 0.951 (well above the minimum required value of 0.6) indicates that the sample is adequate for factor analysis and, as scholars suggest, suitable for structural equation modelling. In Bartlett’s test, the p-value of the chi-square statistic (11289.358) is found to be 0.0, indicating that the EFA methods77 (Principal Component extraction with Varimax factor rotation) are appropriate for exploring the latent factors formed from the correlations among the items.
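The two adequacy checks can be sketched in a few lines of NumPy. This is a minimal illustration of the formulas behind the SPSS output, not the package's implementation; the toy 3-item correlation matrix and the sample size n = 400 are hypothetical.

```python
import numpy as np

def bartlett_sphericity(R, n):
    """Bartlett's test of sphericity for H0: the correlation matrix is
    an identity matrix. R: p x p correlation matrix, n: sample size."""
    p = R.shape[0]
    chi_sq = -(n - 1 - (2 * p + 5) / 6.0) * np.log(np.linalg.det(R))
    df = p * (p - 1) // 2
    return chi_sq, df

def kmo(R):
    """Overall Kaiser-Meyer-Olkin measure of sampling adequacy."""
    inv = np.linalg.inv(R)
    # Partial correlations derived from the inverse correlation matrix
    d = np.sqrt(np.outer(np.diag(inv), np.diag(inv)))
    partial = -inv / d
    off = ~np.eye(R.shape[0], dtype=bool)    # mask for off-diagonal cells
    r2 = (R[off] ** 2).sum()
    p2 = (partial[off] ** 2).sum()
    return r2 / (r2 + p2)

# Hypothetical 3-item correlation matrix and sample size (n = 400);
# the study's actual inputs are 45 items and its own respondent count.
R = np.array([[1.0, 0.6, 0.5],
              [0.6, 1.0, 0.4],
              [0.5, 0.4, 1.0]])
chi_sq, df = bartlett_sphericity(R, n=400)
print(round(kmo(R), 3), round(chi_sq, 1), df)
```

A KMO value above 0.6 and a significant Bartlett chi-square (p near 0) jointly license the EFA, which is exactly the pattern the study reports (KMO = 0.951, chi-square = 11289.358).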

Reliability and construct validity

The internal consistency of the measurement scale for the different factors influencing EMOOCs is analyzed by calculating the Average Variance Extracted (AVE), Composite Reliability (CR), and Cronbach’s alpha to test the validity and reliability of the data. The results of the reliability analysis are reported in Table 5, which confirms significant internal consistency: the Cronbach’s alpha values (Hair et al., 2006) for all the included factors exceed 0.7 (Course Content and Design = 0.878, Course Feedback = 0.897, EMOOCs = 0.917, Participants’ Characteristics = 0.936, Performance Evaluation = 0.900, Student Evaluation = 0.904, Teacher Characteristics = 0.893, Technological Support = 0.893, and Teacher-Student Interaction = 0.917), which is acceptable49. The construct validity of the measurement scale is examined using Confirmatory Factor Analysis (CFA) in Table 6. CFA confirms the reliability and validity (convergent and discriminant) of the constructs before the structural relationships among them are tested; it also tests whether the data fit a hypothesized measurement model based on the EFA results. Convergent validity is examined with the help of construct loadings (CL), composite reliability (CR), and average variance extracted (AVE), whereas discriminant validity is examined using the HTMT ratio and the Fornell-Larcker criteria. The results of the convergent validity analysis are reported in Table 6, and those of the discriminant validity analysis in Tables 9 and 10. The CLs of all the items for the different constructs are greater than 0.7. Researchers42,78–80 have adopted AVE > 0.400 and CR > 0.600 as significant.
Furthermore, the CR (Composite Reliability) and AVE (Average Variance Extracted) indicators for all the constructs in the measurement model are greater than 0.7 (refs. 81,82) and 0.5 (ref. 83), respectively (Course Content and Design: CR = 0.935, AVE = 0.617; Course Feedback: CR = 0.896, AVE = 0.684; EMOOCs: CR = 0.917, AVE = 0.688; Participants’ Characteristics: CR = 0.878, AVE = 0.643; Performance Evaluation: CR = 0.893, AVE = 0.678; Student Evaluation: CR = 0.900, AVE = 0.643; Teacher Characteristics: CR = 0.897, AVE = 0.592; Technological Support: CR = 0.921, AVE = 0.746; and Teacher-Student Interaction: CR = 0.904, AVE = 0.701). Thus, the measurement scale satisfies all the criteria for convergent validity.
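For readers who wish to reproduce these indicators, the three reliability statistics can be computed directly from item scores and standardized loadings. The sketch below uses hypothetical loadings for a four-item construct, chosen only to illustrate the standard formulas (CR = (Σλ)² / ((Σλ)² + Σ(1 − λ²)); AVE = mean(λ²)); it is not the study's SPSS output.

```python
import numpy as np

def cronbach_alpha(items):
    """items: respondents x items score matrix for a single construct."""
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)
    total_var = items.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_vars.sum() / total_var)

def composite_reliability(loadings):
    """CR = (sum of loadings)^2 / ((sum of loadings)^2 + sum(1 - loading^2))."""
    lam = np.asarray(loadings)
    return lam.sum() ** 2 / (lam.sum() ** 2 + (1 - lam ** 2).sum())

def ave(loadings):
    """Average Variance Extracted: mean of the squared loadings."""
    lam = np.asarray(loadings)
    return (lam ** 2).mean()

# Hypothetical standardized loadings for a four-item construct
lam = [0.78, 0.81, 0.84, 0.80]
print(round(composite_reliability(lam), 3), round(ave(lam), 3))
```

Against the thresholds cited above (CR > 0.7, AVE > 0.5), a construct with these loadings would pass both convergent validity checks.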

Table 8.

Hypothesis Testing using SEM.

| Hypothesis | Exogenous construct | Endogenous construct | Path coeff. | SE | T-stats | P-value | f² | R² (Q²) of endogenous construct |
|---|---|---|---|---|---|---|---|---|
| H1: CCD→EM | Course Content and Design | EMOOCs effectiveness | 0.264 | 0.050 | 5.218 | 0.000 | 0.081 | 61.5% (0.503) |
| H7: PE→EM | Performance Evaluation | EMOOCs effectiveness | 0.371 | 0.052 | 7.156 | 0.000 | 0.241 | 61.5% (0.503) |
| H3: TSI→EM | Teacher-Student Interaction | EMOOCs effectiveness | 0.150 | 0.057 | 2.677 | 0.008 | 0.025 | 61.5% (0.503) |
| H2: TS→EM | Technological Support | EMOOCs effectiveness | 0.148 | 0.060 | 2.447 | 0.015 | 0.022 | 61.5% (0.503) |
| H8: CF→CCD | Course Feedback | Course Content and Design | 0.520 | 0.049 | 10.646 | 0.000 | 0.381 | 27.3% (0.262) |
| H4: PC→TSI | Participants Characteristics | Teacher-Student Interaction | 0.211 | 0.047 | 4.447 | 0.000 | 0.077 | 45.9% (0.437) |
| H5: SE→TSI | Student Evaluation | Teacher-Student Interaction | 0.393 | 0.056 | 7.057 | 0.000 | 0.224 | 45.9% (0.437) |
| H6: TC→TSI | Teacher Characteristics | Teacher-Student Interaction | 0.271 | 0.048 | 5.607 | 0.000 | 0.108 | 45.9% (0.437) |
Table 9.

HTMT ratio for discriminant validity.

| Construct | CCD | CF | EM | PC | PE | SE | TC | TSI |
|---|---|---|---|---|---|---|---|---|
| Course Content and Design (CCD) | | | | | | | | |
| Course Feedback (CF) | 0.559 | | | | | | | |
| EMOOCs (EM) | 0.709 | 0.408 | | | | | | |
| Participants Characteristics (PC) | 0.361 | 0.286 | 0.333 | | | | | |
| Performance Evaluation (PE) | 0.553 | 0.476 | 0.720 | 0.268 | | | | |
| Student Evaluation (SE) | 0.558 | 0.487 | 0.518 | 0.334 | 0.517 | | | |
| Teacher Characteristics (TC) | 0.530 | 0.496 | 0.477 | 0.326 | 0.392 | 0.501 | | |
| Teacher-Student Interaction (TSI) | 0.723 | 0.435 | 0.685 | 0.457 | 0.555 | 0.639 | 0.564 | |
| Technological Support (TS) | 0.761 | 0.514 | 0.705 | 0.376 | 0.591 | 0.629 | 0.552 | 0.834 |
Table 10.

Fornell Larcker Criteria for discriminant validity.

| Construct | CCD | CF | EM | PC | PE | SE | TC | TSI | TS |
|---|---|---|---|---|---|---|---|---|---|
| Course Content and Design (CCD) | 0.786 | | | | | | | | |
| Course Feedback (CF) | 0.564 | 0.827 | | | | | | | |
| EMOOCs (EM) | 0.709 | 0.408 | 0.829 | | | | | | |
| Participants Characteristics (PC) | 0.359 | 0.287 | 0.333 | 0.802 | | | | | |
| Performance Evaluation (PE) | 0.556 | 0.473 | 0.724 | 0.268 | 0.823 | | | | |
| Student Evaluation (SE) | 0.559 | 0.488 | 0.519 | 0.336 | 0.515 | 0.802 | | | |
| Teacher Characteristics (TC) | 0.531 | 0.496 | 0.476 | 0.326 | 0.395 | 0.503 | 0.769 | | |
| Teacher-Student Interaction (TSI) | 0.723 | 0.436 | 0.685 | 0.456 | 0.552 | 0.640 | 0.565 | 0.837 | |
| Technological Support (TS) | 0.762 | 0.514 | 0.705 | 0.376 | 0.590 | 0.630 | 0.553 | 0.834 | 0.863 |

Diagonal entries are the square roots of each construct’s AVE.
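The Fornell-Larcker check in Table 10 amounts to verifying that each construct's square root of AVE (the diagonal entry) exceeds its correlations with every other construct. A minimal sketch of that comparison, using three of the reported constructs:

```python
import numpy as np

def fornell_larcker_ok(corr, sqrt_ave):
    """Discriminant validity holds when each construct's sqrt(AVE)
    exceeds its correlation with every other construct."""
    corr = np.asarray(corr)
    ok = True
    for i in range(corr.shape[0]):
        others = np.delete(corr[i], i)      # drop the diagonal entry
        ok = ok and sqrt_ave[i] > np.abs(others).max()
    return bool(ok)

# Three constructs taken from Table 10: Course Content and Design,
# EMOOCs, and Technological Support
corr = [[1.000, 0.709, 0.762],
        [0.709, 1.000, 0.705],
        [0.762, 0.705, 1.000]]
sqrt_ave = [0.786, 0.829, 0.863]
print(fornell_larcker_ok(corr, sqrt_ave))   # prints True
```

Running the same comparison over the full nine-construct matrix in Table 10 confirms discriminant validity for the measurement model.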

Common method bias

Biased responses lead to biased conclusions, so the responses collected in the study must be free from bias. Common method bias is assessed using Harman’s single-factor method76: EFA is re-run with the PCA extraction method under the restriction that only a single factor is extracted. The results indicate that the single extracted factor explains only 39.605% of the variance in the entire dataset, which is below the 50% cutoff. It is therefore concluded that the responses are free from the common method bias problem and that the conclusions drawn in the study are unbiased and can be generalized to the population.
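Harman's single-factor test reduces to asking what share of the total variance the first principal component captures; a share below 50% is taken to indicate that common method bias is not a serious concern. A minimal sketch on a hypothetical 3-item correlation matrix (the study applied this to all 45 items):

```python
import numpy as np

def first_factor_variance(R):
    """Share of total variance captured by the first principal component."""
    eigvals = np.linalg.eigvalsh(np.asarray(R))
    return eigvals.max() / eigvals.sum()

# Hypothetical 3-item correlation matrix, for illustration only
R = [[1.0, 0.15, 0.1],
     [0.15, 1.0, 0.2],
     [0.1, 0.2, 1.0]]
share = first_factor_variance(R)
print(round(share, 3), share < 0.5)   # below the 50% cutoff
```

In the study, the analogous computation on the full item set gave 39.605%, below the cutoff.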

Hypothesis Testing

The study tested its hypotheses using the Structural Equation Modelling (SEM) approach based on Partial Least Squares (PLS), employing Smart-PLS 4 software for the analysis. Table 7 presents the hypotheses developed for the study. Once the measurement model was confirmed, the SEM analysis depicted in Fig. 1 was carried out to assess the research hypotheses. Table 8 presents the results of the path analysis, including the standardized estimates of the loadings and the R² values of the indicator variables. The proposed model exhibited significant explanatory power, as indicated by R² values of 61.5%, 27.3%, and 45.9% for the endogenous variables, with Q² values of 0.503, 0.262, and 0.437, respectively.
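The f² values reported alongside the path coefficients in Table 8 are Cohen's effect sizes, computed from the change in R² when a predictor is removed from the model. The sketch below uses illustrative R² values, not figures taken from the study's SmartPLS output.

```python
# Cohen's f-squared effect size, as reported in the f² column of Table 8:
# f2 = (R2_included - R2_excluded) / (1 - R2_included)
def f_squared(r2_included, r2_excluded):
    return (r2_included - r2_excluded) / (1 - r2_included)

# Illustrative figures only: if omitting one predictor dropped an
# endogenous construct's R2 from 0.615 to 0.522, that predictor's
# effect size would be
print(round(f_squared(0.615, 0.522), 3))   # 0.242, a medium effect
```

By the usual conventions, f² values of roughly 0.02, 0.15, and 0.35 indicate small, medium, and large effects, which places the Table 8 values between small and medium-to-large.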

Fig. 1. Structural Equation Modelling (SEM) Framework.

The results of hypothesis testing supported H1, which posits that Course Content and Design significantly influence EMOOCs effectiveness (path coefficient = 0.264, t = 5.218**). In other words, by making the course more engaging and user-friendly, high-quality course design and content boost the efficacy of MOOCs. Likewise, Performance Evaluation (path coefficient = 0.371, t = 7.156**), Teacher-Student Interaction (path coefficient = 0.150, t = 2.677**), and Technological Support (path coefficient = 0.148, t = 2.447*) were found to influence EMOOCs effectiveness significantly, supporting H7, H3, and H2. The findings also validated H8, showing that Course Feedback has a large and positive impact on Course Content and Design (path coefficient = 0.520, t = 10.646**); user feedback is thus shown to play a crucial role in improving MOOC effectiveness. H4 and H6 posit that Participants’ Characteristics (path coefficient = 0.211, t = 4.447**) and Teacher Characteristics (path coefficient = 0.271, t = 5.607**) influence Teacher-Student Interaction, ultimately making the learning experience more convenient and understandable. Finally, the results support H5: Student Evaluation has a significant and positive influence on Teacher-Student Interaction (path coefficient = 0.393, t = 7.057**).

Conclusions and Discussion

The study utilized SEM (Structural Equation Modelling) methods to investigate the critical factors (Course Content and Design, Performance Evaluation, Teacher-Student Interaction, Technological Support, Course Feedback, Participants’ Characteristics, Student Evaluation, and Teacher Characteristics) affecting the effectiveness of MOOCs (EMOOCs). The Holsapple et al.31,84 model was chosen as the base model and extended with a broader set of pedagogical and technological factors, such as Performance Evaluation, Student Evaluation, Teacher-Student Interaction, Teacher Characteristics, and Participants’ Characteristics, for formulating the MOOC design. The study yielded the following findings: (a) Course Design and Content, Performance Evaluation, Teacher-Student Interaction, and Technological Support positively impact the effectiveness of MOOCs; (b) Course Design and Content are greatly influenced by the feedback received; and (c) Teacher-Student Interaction is influenced by Participants’ Characteristics, Student Evaluation, and Teacher Characteristics.

Factors affecting the effectiveness of MOOCs

Course Content and Design, Performance Evaluation, Teacher-Student Interaction, and Technological Support are identified as the direct determinants of the effectiveness of MOOCs (EMOOCs). Each factor is a crucial element in shaping the overall learning experience of users on digital platforms, and together they create an environment that supports learning by promoting ease of access, flexibility, and active participation. The study highlights how online learning platforms facilitate efficient knowledge sharing by bringing educators and students together in a shared digital environment; this seamless interaction reinforces the growing preference for MOOCs among users. Technological Support plays a crucial role in promoting the widespread use of these platforms by ensuring easy navigation, instant communication, and access to a variety of educational materials. In addition, well-crafted course content, thoughtful instructional design, and clear Performance Evaluation methods are key elements that attract learners. These characteristics make MOOCs a useful tool for both professional and personal development: they not only increase the platforms’ legitimacy as trustworthy learning environments but also facilitate the acquisition of useful skills, thereby enhancing the effectiveness of MOOCs (EMOOCs).

Course Feedback has a significant impact on how MOOCs are designed and delivered, which is one of the study’s main conclusions. Learner Feedback is a valuable tool that provides instructors and course developers with real-time insights for enhancing course structure, instructional strategies, and content presentation. This supports an earlier study75, which highlights the importance of adapting course materials to better meet learners’ expectations in order to boost motivation and academic achievement. The finding is also consistent with the research of Topali et al.85, who highlighted that the relevance and efficacy of MOOCs are greatly enhanced by incorporating structured feedback mechanisms into the learning design.

Peer assessment technologies have also been shown to facilitate deeper involvement and improved learning outcomes86. According to Braconnier and Liu (2024), students who actively participated in peer reviews performed better academically and provided valuable comments for improving the course. Through the iterative incorporation of learner input, course content comes to consistently meet user needs, preferences, and expectations. Because students are more likely to participate in courses that take their feedback into account, such responsiveness increases learner satisfaction and also helps to improve enrollment and retention rates. To further enhance the learning process, innovative feedback solutions, such as the e-FeeD4Mi platform, are being adopted; these tools make it easier to offer timely and tailored feedback.

The Impact of Interpersonal dynamics on Learner Engagement

Interpersonal dynamics, including interactions among students, teachers, and peers, significantly influence MOOC participation. Although MOOCs are flexible and scalable, the lack of traditional classroom settings may lead students to feel isolated, which can reduce their motivation and participation. To maintain high levels of engagement in online learning environments, a recent study emphasises the significance of developing interpersonal ties. The study also highlights the importance of effective communication, both between students and teachers and among peers, in enhancing the overall learning experience. For example, it has been demonstrated that opportunities for instructor feedback and active peer cooperation significantly increase students’ interest in the course and their retention of the material. This is especially crucial in MOOCs, where the lack of in-person connection might hinder the growth of a feeling of community. Receiving tailored feedback and participating in regular group activities and conversations contribute to a more engaged and encouraging atmosphere, which in turn raises retention rates and improves overall learner satisfaction. By incorporating these interpersonal components, MOOCs can transform the conventional online learning experience, making it more dynamic and captivating. Strengthening social presence through such dynamic interactions ultimately improves learning outcomes by encouraging students to stay engaged and develop a deeper emotional and intellectual connection to the course material.

The Impact of Teacher-Student Interaction in enhancing MOOCs effectiveness (EMOOCs)

Teacher-student interactions stand out as a crucial variable among the many elements that impact MOOC efficacy. According to recent studies, regular and meaningful engagement between teachers and students builds motivation and trust, underscoring the critical role that instructional communication plays in online learning. For example, a study by Miao et al.87 found that social presence and learning engagement in online contexts are significantly influenced by interactions between students and teachers. The association between these interactions and general learning engagement is mediated by social presence, highlighting the importance of interpersonal dynamics in enhancing student happiness and participation.

Additionally, research by Akram and Li93 examined the intricate relationships between academic motivation, online learning engagement, and teacher-student relationships. According to their findings, better levels of student engagement in online learning settings result from positive teacher-student connections that are mediated by either intrinsic or extrinsic academic incentives. These insights align with earlier studies33,70,72,88,89, which highlight the indispensable role of instructional communication in online education. Collectively, these findings underscore the strategic importance of fostering robust teacher-student interactions to enhance the effectiveness of MOOCs (EMOOCs) and promote sustained learner engagement.

Implications for the design and Implementation of MOOCs

The results of this study provide valuable insights for enhancing the design and delivery of Massive Open Online Courses (MOOCs). To fully realize their potential, course designers, academic institutions, and platform developers need to focus on creating adaptive content, implementing comprehensive Evaluation systems, and establishing inclusive communication models. Rather than being passive content delivery platforms, MOOCs are increasingly evolving into interactive, learner-centred environments. MOOCs can be powerful instruments for scalable, adaptable, and influential education if they are designed to incorporate continuous feedback, accommodate student diversity, and offer robust pedagogical support. This is especially important for universities that want to increase access and encourage programs for lifelong learning. Additionally, this study supports previous research90, which asserts that MOOCs —when based on sound instructional design and supported by robust digital infrastructure—have the potential to accelerate learning, especially in tertiary and continuing education contexts.

Theoretical and Practical Implications of the study

This study makes the following contributions. It offers significant insights into the design and implementation of MOOCs, particularly within the Indian educational landscape, and identifies four pivotal factors influencing the effectiveness of MOOCs (Massive Open Online Courses): Course Content and Design, Performance Evaluation, Teacher-Student Interaction, and Technological Support. The key findings underscore the necessity for well-structured and engaging Course Content and Design, since the course materials must cater to the diverse learning needs of the learners. The results also demonstrate that learner-centric design strategies improve MOOC comprehension and engagement. It is therefore proposed that Course Design and Content engage students and help break up the routine of traditional classroom instruction91. Digital learning aids help users understand concepts more effectively, enhancing their knowledge according to their preferences and interests with a single click92. This proves beneficial for learners in mapping their skill set to the fluctuating demands of the industrial and global markets.

Secondly, implementing robust assessment mechanisms for Performance Evaluation is found to be crucial for tracking learner progress and ensuring the attainment of learning outcomes; effective evaluation strategies are a significant contributing factor to course completion rates and learner satisfaction. The study also found that increased teacher-student contact improves students’ engagement and retention, which in turn motivates them. Our research supports that of Lee7, who found that students are reluctant to use traditional teaching and learning approaches because they lack motivation. Hence, MOOCs have facilitated Teacher-Student Interaction by analyzing individual Participants’ Characteristics. Finally, Technological Support is found to be a reliable and accessible pillar of MOOC effectiveness, as technological infrastructure plays a pivotal role in the seamless delivery of MOOCs over digital platforms. The study also revealed that integrating e-learning platforms and tools facilitates compelling learning experiences for learners.

This study is regarded as the first comprehensive analysis of MOOC use in Northern India, a region where the adoption of online education is rapidly expanding. For educational institutions seeking to expand their digital offerings and connect with a broader range of learners, the findings provide valuable insights. The results also highlight the importance of incorporating interactive components and ongoing feedback systems into course design to enhance student engagement and foster skill development and lifelong learning. By focusing on the social, psychological, and behavioral characteristics of Indian users, researchers can also expand their scope to investigate other factors influencing the transition from traditional chalk-and-board instruction to a blended learning mode. Lastly, the study’s results can be utilized to create a recommendation system for Indian educators, administrators, and legislators; such studies provide a more profound understanding of how to optimize MOOC design and implementation tactics. This study sheds statistically significant light on the variables influencing MOOC efficacy in the Punjab region of Northern India. The results highlight the significance of Technological Support, Teacher-Student Interaction, Performance Evaluation, and Course Design and Content in boosting MOOC efficacy. Although these findings are relevant to the local context, they may not be applicable in other cultural or geographic contexts, and it is recommended that further research investigate whether they can be applied to diverse geographical areas and educational settings.

Limitations

The study provides valuable insights into the factors that influence the effectiveness of MOOCs (EMOOCs) within the Indian educational context. However, several limitations should be considered when interpreting the findings:

a) Sample Diversity: The study primarily focused on participants from technical and post-secondary educational institutions. This focus may limit the generalizability of the findings to other educational levels, such as primary, secondary, and non-technical schooling; greater participant diversity could provide a more comprehensive understanding of MOOCs’ effectiveness in diverse educational settings.

b) Data Collection Methods: The exclusive use of quantitative data restricts the range of insights into the experiences and perspectives of learners. Qualitative methods, such as focus groups and interviews, offer nuanced perspectives that would be challenging to gather through quantitative measures alone; such a mixed-methods approach would yield a more comprehensive understanding of the factors influencing MOOC performance.

c) Demographic Data Limitations: Due to the limited demographic data collected, important factors such as socioeconomic status, location, and cultural background may have been overlooked, even though these factors may strongly affect MOOC participation and accessibility. Future research should pursue more thorough demographic profiling to detect and resolve disparities in MOOC effectiveness and accessibility.

d) Analytical Framework: Although Structural Equation Modelling (SEM) was used to validate the proposed model, the research did not investigate the intricate relationships between variables through additional latent factors. Future analyses that account for these factors may provide a more profound understanding of the dynamics influencing EMOOCs, allowing for more focused and successful interventions.

e) Technological and Infrastructural Challenges: The study may not have considered all the technological and infrastructural obstacles that students encounter, including poor internet access, a lack of digital devices, and disparities in digital literacy. These difficulties are especially relevant in the Indian context and can significantly affect students’ capacity to participate in MOOCs.

f) Language and Cultural Considerations: The study may have overlooked the impact of cultural and linguistic barriers on learners’ engagement with MOOCs. A wider range of students may find courses more accessible and relevant if they are offered in multiple languages and contain culturally relevant content.

g) Motivational Factors and Learner Engagement: A more thorough investigation of the motivational elements affecting student engagement and course completion rates would strengthen the study. Understanding the intrinsic and extrinsic motivators that encourage students to enroll in and finish MOOCs can inform strategies to increase engagement and lower dropout rates.

h) Assessment and Feedback Mechanisms: The success and satisfaction of learners in MOOCs depend heavily on how well the assessment and feedback systems work. The effectiveness and timeliness of the feedback provided to students, which can affect their learning experience and outcomes, may not have been thoroughly investigated.

Future studies should address these issues to improve the findings’ validity and relevance, ultimately helping to optimize MOOCs for a variety of learner demographics.

Electronic supplementary material

Below is the link to the electronic supplementary material.

Supplementary Material 1 (279.6KB, pdf)

Author contributions

All authors contributed to the conception and design of the study. Material preparation, data collection, and analysis were performed by Priyanka Jarial. Dr. Himanshu Aggarwal and Dr. Bhim Sain Singla contributed to the final version of the manuscript and supervised the study at every stage. All authors discussed the results and contributed to the final manuscript.

Data availability

The datasets generated and/or analyzed during the current study are available from the corresponding author on reasonable request. Point of contact: Priyanka Jarial (jarial.priyanka@gmail.com).

Declarations

Consent for publication

I give my consent for the publication of identifiable details, which can include a photograph(s) and/or videos and/or case history and/or details within the text (“Material”) to be published in the above Journal and Article.

Consent to participate

The consent was obtained before the questionnaire was administered to all participants in the study, as their identities would be kept confidential and their responses would be used solely for research purposes.

Ethics approval

The study adheres to the relevant ethical guidelines.

Informed consent

The privacy rights of human subjects have been observed, and informed consent was obtained for experimentation with human subjects while filling the questionnaire. This study was approved by the Departmental Research Board (DRB) committee in the ANNUAL PROGRESS REPORT of April 24.

Statement regarding research involving human participants and/or animals

All research involving human subjects adheres to the ethical standards of the institution and aligns with the 1964 Helsinki Declaration and its subsequent updates or equivalent ethical standards.

Competing interests

The authors declare no competing interests.

Footnotes

Publisher’s note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

References

  • 1.Consortium for Educational Communication. http://cec.nic.in/edusat/Pages/default.aspx (EDUSAT).
  • 2.Liyanagunawardena, T. R., Lundqvist, K., Mitchell, R., Warburton, S. & Williams, S. A. A MOOC taxonomy based on classification schemes of MOOCS. Eur. J. Open. Distance Elearning. 22(1). 10.2478/eurodl-2019-0006 (2019).
  • 3.Sokolik, M. 2 What constitutes an effective language MOOC? In Language MOOCS (eds Martín-Monje, E. & Bárcena, E.), 16–32 10.2478/9783110420067.2 (Sciendo Migration, 2014).
  • 4.Cole, A. W. & Timmerman, C. E. What do current college students think about MOOCS? MERLOT J. Online Learn. Teach.11(2), 188–201 (2015). [Google Scholar]
  • 5.Zhou, M. Chinese university students’ acceptance of MOOCS: A self-determination perspective. Comput. Educ.92(1), 194–203. 10.1016/j.compedu.2015.10.01 (2016). [Google Scholar]
  • 6.Bremer, C. New format for online courses: The open course future of learning. In Proceedings of eLearning Baltics eLBa 2012 Conference, 63–90. https://core.ac.uk/download/pdf/18325863.pdf (2012).
  • 7.Lee, J. The effects of knowledge sharing on individual creativity in higher education institutions. Socio-Technical View Administrative Sci.8(2), 21. 10.3390/admsci8020021 (2018). [Google Scholar]
  • 8.Selwyn, N., Bulfin, S. & Pangrazio, L. Massive open online change? Exploring the discursive construction of the ‘MOOC’ in newspapers. High. Educ. Q.69(2), 175–192. 10.1111/hequ.12061 (2015). [Google Scholar]
  • 9.Cochran, W. G. Sampling Techniques (Wiley, 1963).
  • 10.Shah, D. Year of MOOC-based degrees: A review of MOOC stats and trends in 2018. Cl. Cent.https://www.classcentral.com/report/MOOCS-stats-and-trends-2018/ (2018).
  • 11.Shah, D. MOOC Tracker – Never Miss a Course, Notification/Reminder service for MOOCS, Class Central. https://www.classcentral.com/report/mooc-tracker (2016).
  • 12.Aspiring Minds. National Employability Report 2016 - Aspiring Minds, Annual Report http://www.ijera.com (2016).
  • 13.Jayakumar, S. Educational paradigm shift: are we ready to adopt MOOC? Int. J. Emerg. Technol. Learn.9(4), 10.3991/ijet.v9i4.3756 (2014).
  • 14.Chauhan, J. An overview of MOOC in India international journal of computer trends and technology (IJCTT), ISSN: 2231–2803, 111–120 (2017).
  • 15.Kuhn, A. et al. Who gets lost? How digital academic reading impacts equal opportunity in higher education. New Media & Society. 10.1177/14614448211072306 (2022).
  • 16.Keskin, H. K., Bastug, M. & Atmaca, T. Factors Directing Students To Academic Digital Reading. EgitimveBilim, 41(188)MOOCS Impact in Higher Education Institution: A Pilot Study In Indian Context (2016).
  • 17.Habibi, A. et al. Drivers affecting Indonesian pre-service teachers’ intention to use m-learning: structural equation modeling at three universities. E-Learning Digit. Media. 10.1177/20427530221118775 (2022). [Google Scholar]
  • 18.Leong, L. W., Ibrahim, O., Dalvi-Esfahani, M., Shahbazi, H. & Nilashi, M. The moderating effect of experience on the intention to adopt mobile social network sites for pedagogical purposes: an extension of the technology acceptance model. Educ. Inform. Technol.23, 2477–2498. 10.1007/s10639-018-9726-2 (2018). [Google Scholar]
  • 19.UNESCO. Open Education Resources, https://en.unesco.org/themes/buildingknowledge-societies/oer (2020).
  • 20.Government of India. ICT initiatives of MHRD, Ministry of Human Resource Development, Government of India. Retrieved from https://mhrd.gov.in/ict-initiatives (2020).
  • 21.Hutcheson, G. The Multivariate Social Scientist: Introductory Statistics Using Generalized Models10.4135/9780857028075 (Sage publications Ltd., 1999).
  • 22.(IGNOU) Indira Gandhi National Open University Wikipedia:The Free Encyclopedia. http://en.wikipedia.org/wiki/. Indira_Gandhi_National_Open_University (Wikimedia Foundation, Inc., 2013).
  • 23.NIIT -. Learning for careers in Banking, Financial Services, Insurance, IT, Computers, Software, Hardware, Employability Training, School Solutions, College Solutions, Corporate Training, Executive Management. https://www.niit.com/pages/defaultindia.aspx.
  • 24.Chakravarty, R. & Jaspreet, K. MOOCS in India: yet to shine. Int. J. Inf. Stud. Lib.1(1), ISSN Number: 2456 – 1827. (2016).
  • 25.Fox Feed. Carolyn. Higher, Open Education for India. Opensource.com, http://opensource.com/education/13/8/higher-education-india-MOOCS (2013).
  • 26.Devgun, P. VOCED Plus, the International Tertiary Education and Research Database. Prospects for Success of MOOC in Higher Education in Indiahttp://www.irphouse.com/ijict.htm (International Research Publications House, 2013).
  • 27.Christensen, G. and Brandon Alcorn Can MOOCS Help Expand Access to Higher Education in India? Center for the Advanced Study of India (CASI), https://casi.sas.upenn.edu/iit/christensenalcorn (2014).
  • 28.Asoke et al. Int. J. Eng. Res. Appl. ISSN: 2248–9622, Vol. 4, Issue 7(Version 3), 156–163 https://www.ijera.com/pages/v4-no7.html (2014).
  • 29.Bralic, A. & Divjak, B. Integrating MOOCS in traditionally taught courses: achieving learning outcomes with blended learning. Int. J. Educ. Technol. High. Educ.15(3). (2018).
  • 30.MOOC Market size & share analysis - growth trends & forecasts (2023–2028). http://www.mordorintelligence.com/industry-reports/massive-open-online-course-mooc-market
  • 31. Holsapple, C. W. & Lee, P. A. Defining, assessing, and promoting e-learning success: an information systems perspective. Decis. Sci. J. Innov. Educ. 4(1), 67–85. 10.1111/j.1540-4609.2006.00102.x (2006).
  • 32. Wang, L. Socio-cultural learning theories and information literacy teaching activities in higher education. Am. Libr. Assoc. 47(2), 149–158 (2007).
  • 33. Fianu, E., Blewett, C., Ampong, G. & Ofori, K. Factors affecting MOOC usage by students in selected Ghanaian universities. Educ. Sci. 8(2), 70. 10.3390/educsci8020070 (2018).
  • 34. Yang, M., Shao, Z., Liu, Q. & Liu, C. Understanding the quality factors that influence the continuance intention of students toward participation in MOOCs. Educ. Technol. Res. Dev. 65(5), 1195–1214. 10.1007/s11423-017-9513-6 (2017).
  • 35. Ouellette, K. How peer review enables better learning in online courses. MIT Open Learning. https://openlearning.mit.edu/news/how-peer-review-enables-better-learning-online-courses (2024).
  • 36. Yang, A. C., Chen, I. Y., Flanagan, B. & Ogata, H. How students' self-assessment behavior affects their online learning performance. Comput. Educ. Artif. Intell. 3, 100058. 10.1016/j.caeai.2022 (2022).
  • 37. Bork, R. H. & Rucks-Ahidiana, Z. Role ambiguity in online courses: an analysis of student and instructor expectations. https://ccrc.tc.columbia.edu/publications/role-ambiguity-in-online-courses.html (Teachers College, Columbia University, 2013).
  • 38. Stapa, S. H. The roles of teachers and students in computer supported collaborative learning among distance learners. In Proceedings of the 5th International Conference in Open & Distance Learning, November 2009, Athens, Greece. 10.12681/icodl.48 (2009).
  • 39. Subramaniam, T., Suhaimi, N., Latif, A., Abu Kassim, Z. & Fadzil, M. MOOCs readiness: the scenario in Malaysia. Int. Rev. Res. Open Distrib. Learn. 20(3), 80–101. 10.19173/irrodl.v20i3.3913 (2019).
  • 40. Moore, M. G. Editorial: three types of interaction. Am. J. Distance Educ. 3(2), 1–7. 10.1080/08923648909526659 (1989).
  • 41. Moore, G. E., Warner, W. J. & Jones, D. W. W. Student-to-student interaction in distance education classes: what do graduate students want? J. Agric. Educ. 57(2), 1–13. https://eric.ed.gov/?id=EJ1122974 (2016).
  • 42. Gutiérrez-Rojas, I., Alario-Hoyos, C., Pérez-Sanagustín, M., Leony, D. & Delgado-Kloos, C. Scaffolding self-learning in MOOCs. In Proceedings of the European MOOC Stakeholder Summit 2014 Conference, 43–44 (2014).
  • 43. Raygan, A. & Moradkhani, S. Factors influencing technology integration in an EFL context: investigating EFL teachers' attitudes, TPACK level, and educational climate. Comput. Assist. Lang. Learn. 35(8), 1789–1810. 10.1080/09588221.2020.1839106 (2022).
  • 44. Hood, N., Littlejohn, A. & Milligan, C. Context counts: how learners' contexts influence learning in a MOOC. Comput. Educ. 91(1), 83–91. 10.1016/j.compedu.2015.10.019 (2015).
  • 45. Hong, J. C., Hwang, M. Y., Tsai, C. M., Liu, M. C. & Lee, Y. F. Exploring teachers' attitudes toward implementing new ICT educational policies. Interact. Learn. Environ. 30(10), 1823–1837. 10.1080/10494820.2020.1752740 (2022).
  • 46. Willis, J. E. III. MOOCs and Foucault's heterotopia: on community and self-efficacy. In Proceedings of the Sixth International Conference of MIT's Learning International Networks Consortium (LINC) (2013).
  • 47. Chen, B., Chang, Y. H., Ouyang, F. & Zhou, W. Fostering student engagement in online discussion through social learning analytics. Internet High. Educ. 37, 21–30. 10.1016/j.iheduc.2017.12.002 (2018).
  • 48. Tang, Y. M. et al. Comparative analysis of students' live online learning readiness during the coronavirus (COVID-19) pandemic in the higher education sector. Comput. Educ. 168, 104211 (2021).
  • 49. Deeva, G., Bogdanova, D., Serral, E., Snoeck, M. & De Weerdt, J. A review of automated feedback systems for learners: classification framework, challenges and opportunities. Comput. Educ. 162, 104094. 10.1016/j.compedu.2020.104094 (2021).
  • 50. Wise, A. F. & Vytasek, J. Learning analytics implementation design. In Handbook of Learning Analytics (eds Lang, C. et al.) 151–160 (Society for Learning Analytics Research, 2017).
  • 51. Zhang, S. et al. Understanding student teachers' collaborative problem solving: insights from an epistemic network analysis (ENA). Comput. Educ. 183, 104485. 10.1016/j.compedu.2022.104485 (2022).
  • 52. Ozbek, I. E. A. A classification of student skills and competencies in open and distance learning. Int. J. New Trends Educ. 6(3), 174–185 (2015).
  • 53. Palomo-Duarte, M., Dodero, J. M., Medina-Bulo, I., Rodríguez-Posada, E. J. & Ruiz-Rube, I. Assessment of collaborative learning experiences by graphical analysis of Wiki contributions. Interact. Learn. Environ. 22(4), 444–466. 10.1080/10494820.2012.680969 (2014).
  • 54. Stangier, U., Kananian, S. & Schuller, J. Perceived vulnerability to disease, knowledge about COVID-19 and changes in preventive behavior during lockdown in a German convenience sample. Curr. Psychol., 7362–7370. 10.1007/s12144-021-01456-6 (2021).
  • 55. Zhao, H. Factors influencing self-regulation in e-learning 2.0: confirmatory factor model. Can. J. Learn. Technol. 42(2), 2–22. 10.21432/T2C33K (2016).
  • 56. Chen, C. M. & Chen, F. Y. Enhancing digital reading performance with a collaborative reading annotation system. Comput. Educ. 77, 67–81. 10.1016/j.compedu.2014.04.010 (2014).
  • 57. Macdonald, J. Assessing online collaborative learning: process and product. Comput. Educ. 40(4), 377–391. 10.1016/S0360-1315(02)00168-9 (2003).
  • 58. Van Aalst, J. Assessment in collaborative learning. In The International Handbook of Collaborative Learning (ed. Hmelo-Silver, C. E.) 280–296 (Routledge, 2013).
  • 59. AbuSeileek, A. F. The effect of computer-assisted cooperative learning methods and group size on the EFL learners' achievement in communication skills. Comput. Educ. 58(1), 231–239. 10.1016/j.compedu.2011.07.011 (2012).
  • 60. Ebadijala, M. & Moradkhani, S. Impacts of computer-assisted collaborative writing, collaborative prewriting, and individual writing on EFL learners' performance and motivation. Comput. Assist. Lang. Learn. 10.1080/09588221.2023.2178463 (2023).
  • 61. Shabani, K., Khatib, M. & Ebadi, S. Vygotsky's zone of proximal development: instructional implications and teachers' professional development. Engl. Lang. Teach. 3(4), 237–248. 10.5539/elt.v3n4p237 (2010).
  • 62. Richardson, J. T. E. Academic attainment in students with dyslexia in distance education. Dyslexia 21(4), 323–337. 10.1002/dys.1502 (2015).
  • 63. Che, X., Luo, S., Wang, C. & Meinel, C. An attempt at MOOC localization for Chinese-speaking users. Int. J. Inf. Educ. Technol. 6(2), 90–96. 10.7763/ijiet.2016.v6.665 (2016).
  • 64. Lieberman, D. A., Bates, C. H. & So, J. Young children's learning with digital media. Comput. Sch. 26(4), 271–283. 10.1080/07380560903360194 (2009).
  • 65. Ramachandran, L., Gehringer, E. F. & Yadav, R. K. Automated assessment of the quality of peer reviews using natural language processing techniques. Int. J. Artif. Intell. Educ. 27(3), 534–581. 10.1007/s40593-016-0132-x (2017).
  • 66. Lei, C. & Chan, C. K. Developing metadiscourse through reflective assessment in knowledge building environments. Comput. Educ. 126, 153–169. 10.1016/j.compedu.2018.07.006 (2018).
  • 67. Liu, Z., Kong, X., Chen, H., Liu, S. & Yang, Z. MOOC-BERT: automatically identifying learner cognitive presence from MOOC discussion data. IEEE Trans. Learn. Technol. 10.1109/TLT.2023.3240715 (2023).
  • 68. Yang, J., Du, X., Hung, J. L. & Tu, C. H. Analyzing online discussion data for understanding the student's critical thinking. Data Technol. Appl. 56(2), 303–326. 10.1108/DTA-04-2021-0088 (2022).
  • 69. Raza, S. A., Qazi, W., Khan, K. A. & Salam, J. Social isolation and acceptance of the learning management system (LMS) in the time of COVID-19 pandemic: an expansion of the UTAUT model. J. Educ. Comput. Res. 59(2), 183–208. 10.1177/0735633120960421 (2020).
  • 70. Wang, Y., Yu, L. H. & Yu, Z. G. An extended CCtalk technology acceptance model in EFL education. Educ. Inf. Technol. 27(5), 6621–6640. 10.1007/s10639-022-10909-9 (2022).
  • 71. Yu, Z. G. & Yu, X. Z. An extended technology acceptance model of a mobile learning technology. Comput. Appl. Eng. Educ. 27(3), 721–732. 10.1002/cae.22111 (2019).
  • 72. Albelbisi, N. A. & Yusop, F. D. Systematic review of a nationwide MOOC initiative in Malaysian higher education system. Electron. J. e-Learn. 18(4), 288–299. 10.34190/EJEL.20.18.4.002 (2020).
  • 73. Pham, L., Limbu, Y. B., Bui, T. K., Nguyen, H. T. & Pham, H. T. Does e-learning service quality influence e-learning student satisfaction and loyalty? Evidence from Vietnam. Int. J. Educ. Technol. High. Educ. 16(1), 1–26. 10.1186/s41239-019-0136-3 (2019).
  • 74. Taherdoost, H., Sahibuddin, S. & Jalaliyoon, N. Exploratory factor analysis: concepts and theory. Adv. Appl. Pure Math. 27, 375–382 (2022).
  • 75. Mohan, M. M., Upadhyaya, P. & Pillai, K. R. Intention and barriers to use MOOCs: an investigation among the postgraduate students in India. Educ. Inf. Technol. ISSN: 1360-2357. 10.1007/s10639-020-10215-2.
  • 76. Podsakoff, P. M., MacKenzie, S. B. & Podsakoff, N. P. Sources of method bias in social science research and recommendations on how to control it. Annu. Rev. Psychol. 63, 539–569. 10.1146/annurev-psych-120710-100452 (2012).
  • 77. Strijbos, J. W. Assessment of (computer-supported) collaborative learning. IEEE Trans. Learn. Technol. 4(1), 59–73. 10.1109/TLT.2010 (2010).
  • 78. Haumin, L. & Madhusudhan, M. An Indian-based MOOC: an overview. Library Philosophy and Practice (e-journal). https://digitalcommons.unl.edu/libphilprac/2382 (2019).
  • 79. Hsu, H. T. & Lin, C. C. Extending the technology acceptance model of college learners' mobile-assisted language learning by incorporating psychological constructs. Br. J. Educ. Technol. 53(2), 286–306. 10.1111/bjet.13165 (2022).
  • 80. Su, B., Bonk, C. J., Magjuka, R. J., Liu, X. & Lee, S. H. The importance of interaction in web-based education: a program-level case study of online MBA courses. J. Interact. Online Learn. 4(1), 1–19 (2005).
  • 81. Hair, J. F. Jr. et al. Partial least squares structural equation modeling (PLS-SEM): an emerging tool in business research. Eur. Bus. Rev. 26, 106–121. 10.1108/EBR-10-2013-0128 (2014).
  • 82. Hair, J. F., Hult, G. T. M., Ringle, C. M. & Sarstedt, M. A Primer on Partial Least Squares Structural Equation Modeling (PLS-SEM), 2nd edn. (Sage Publications Inc., 2017).
  • 83. Hair, J., Black, W., Babin, B., Anderson, R. & Tatham, R. Multivariate Data Analysis, 6th edn. (Pearson Prentice Hall, 2006).
  • 84. Watulak, S. L. 'I'm not a computer person': negotiating participation in academic discourses. Br. J. Educ. Technol. 43(1), 109–118. 10.1111/j.1467-8535.2010.01162.x (2012).
  • 85. Topali, P. et al. Unveiling the role of learning design on feedback in MOOCs. ResearchGate (2024).
  • 86. Lin, Y. P. & Yu, Z. G. A bibliometric analysis of peer assessment in online language courses. Languages 8, 47. 10.3390/languages8010047 (2023).
  • 87. Miao, Y., Li, Y. & Zhang, H. Teacher-student interaction, student-student interaction, and social presence: their impacts on learning engagement in online learning environments. https://www.researchgate.net/publication/361883105_Teacher-Student_Interaction_Student-Student_Interaction_and_Social_Presence_Their_Impacts_on_Learning_Engagement_in_Online_Learning_Environments (2022).
  • 88. Albelbisi, N., Yusop, F. & Salleh, S. Mapping the factors influencing success of massive open online courses (MOOC) in higher education. Eurasia J. Math. Sci. Technol. Educ. 14(7), 2999–3012. 10.29333/ejmste/91286 (2018).
  • 89. Margaryan, A., Bianco, M. & Littlejohn, A. Instructional quality of massive open online courses (MOOCs). Comput. Educ. 80, 77–83. 10.1016/j.compedu.2014.08.005 (2015).
  • 90. El Said, G. R. Understanding how learners use massive open online courses and why they drop out. J. Educ. Comput. Res. 55(5), 724–752. 10.1177/0735633116681302 (2016).
  • 91. Shen, D., Cho, M. H., Tsai, C. L. & Marra, R. Unpacking online learning experiences: online learning self-efficacy and learning satisfaction. Internet High. Educ. 19(1), 10–17. 10.1016/j.iheduc.2013.04.001 (2013).
  • 92. Inie, N., Barkhuus, L. & Brabrand, C. Interacting with academic readings: a comparison of paper and laptop. Soc. Sci. Humanit. Open 4(1). 10.1016/j.ssaho.2021.10022 (2021).
  • 93. Akram, H. & Li, S. Understanding the role of teacher-student relationships in students' online learning engagement: mediating role of academic motivation. Percept. Mot. Skills 131(5), 1–24. 10.1177/00315125241248709 (2024).

Associated Data

This section collects any data citations, data availability statements, or supplementary materials included in this article.

Supplementary Materials

Supplementary Material 1 (279.6KB, pdf)

Data Availability Statement

The datasets generated and/or analyzed during the current study are available from the corresponding author on reasonable request. Point of contact: Priyanka Jarial, jarial.priyanka@gmail.com.


Articles from Scientific Reports are provided here courtesy of Nature Publishing Group