Education and Information Technologies. 2023 Feb 6:1–22. Online ahead of print. doi: 10.1007/s10639-023-11621-y

An information system success model for e-learning postadoption using the fuzzy analytic network process

Puong Koh Hii, Chin Fei Goh, Owee Kowang Tan, Rasli Amran, Choon Hee Ong
PMCID: PMC9900560  PMID: 36779193

Abstract

The underutilization of e-learning among university lecturers is an important issue that needs to be resolved. This study aimed to formulate an e-learning postadoption model for Malaysian universities. Data were collected using self-administered questionnaires from 36 e-learning experts who were lecturers in public and private universities in Malaysia. The data were then analyzed using the extent analysis method proposed by Chang (European Journal of Operational Research, 95(3), 649–655, 1996) to examine the weights and rankings of the factors and subfactors. This study showed that, for e-learning postadoption, the most important factor is institution service quality, followed by system quality, content quality, instructors' characteristics, and learners' characteristics. This study extends the information systems success model into the e-learning postadoption context. In particular, it offers insights concerning the dependencies among the factors in the model within the Malaysian university context. The findings are useful for the long-range strategic management of university administrators, and the model can be adopted as a reference to form a rating system for analyzing e-learning postadoption. University administrators can analyze the critical factors that increase e-learning's postadoption, leading to more efficient resource allocation and management of e-learning.

Keywords: E-learning, Fuzzy Set Theory, Analytic Network Process, Information System Success Model, Extent analysis method

Introduction

Integrating e-learning into teaching and learning is a current challenge in many universities. Despite large investments in information and communication technology (ICT) infrastructure in many universities, prior research has revealed that underutilization of e-learning is a major issue (Azhari & Ming, 2015; Lawrence & Tar, 2018; Yim et al., 2019). According to a recent survey, the completion percentage of online courses is extremely low compared to the number of students enrolled in these courses, implying a high dropout rate (Aldowah et al., 2020).

Some studies report that university lecturers may have low willingness to use an e-learning system in teaching and learning. This problem is commonly found among university lecturers with low competency in information technology, particularly those who lack confidence in the benefits of e-learning in teaching and learning (Azhari & Ming, 2015; Lawrence & Tar, 2018; Razak et al., 2020). Heavy teaching and research workloads also cause university lecturers to perceive using e-learning in teaching as another job burden (Moustakas & Robrade, 2022). Additionally, some university lecturers are resistant to e-learning because they are more comfortable with face-to-face instruction (Moustakas & Robrade, 2022).

Ineffective management is another hindrance to the successful implementation of e-learning within universities (Azhari & Ming, 2015; Embi, 2011; Razak et al., 2020). This is evidenced by various problems in infrastructure, incentives, technical support, and training, among others (Lawrence & Tar, 2018). The absence of effective management makes the postadoption period of the e-learning system difficult for teachers and students. For example, engaging university lecturers in e-learning training is a typical university management initiative. However, past studies have shown that improper training schedules that overlap with teaching work have caused some university lecturers to have only moderate motivation to participate in e-learning workshops (Azhari & Ming, 2015; Embi, 2011). Such management initiatives fail to enhance attendance and engagement in e-learning training among university lecturers. Therefore, effective management is of paramount importance to enhance and sustain the willingness of university lecturers to learn and use e-learning.

Previous researchers have adopted Delone and McLean's (2003) information system success model (D&M model) to examine the factors influencing e-learning's postadoption. The literature review shows that studies that use the D&M model to assess the factors influencing e-learning's postadoption can be divided into three research perspectives. The first research perspective focuses on the three fundamental factors of the D&M model, i.e., system quality, content quality, and service quality (Fitriastuti et al., 2019; Ghazal et al., 2017; Motaghian et al., 2013; Su et al., 2016). Another research perspective highlights learners' characteristics and instructors' characteristics (Bhuasiri et al., 2012; Xaymoungkhoun et al., 2012; Yassine et al., 2017). The final research perspective concentrates on the user interface and learning community (Choi & Jeong, 2019; Farid et al., 2018; Iryanti et al., 2016). Intriguingly, there are limited studies that examine the factors from the second and third research perspectives (Choi & Jeong, 2019; Farid et al., 2018; Yassine et al., 2017).

Furthermore, there is a lack of studies on the dependencies among the factors and subfactors in the e-learning postadoption literature that apply Multiple Criteria Decision Making (MCDM) methods. The literature review shows that the AHP is the most well-known MCDM technique for evaluating the factors and subfactors that increase e-learning's postadoption. The popularity of the AHP can be attributed to its convenience and simplicity; however, it ignores dependencies among the factors and subfactors (Mikhailov & Singh, 2003; Zare et al., 2016). Ignoring these dependencies leads to overestimated or underestimated weights of factors and subfactors. This shortcoming can be addressed by using the Analytic Network Process (ANP), which allows complex interrelationships among decision levels and factors (Saaty, 2005; Tseng et al., 2011).

In conclusion, despite the widespread implementation of e-learning during the COVID-19 pandemic, the postadoption of e-learning by lecturers and students is relatively low (Awang et al., 2018; Yim et al., 2019). Thus, this study attempts to shed light on the factors that increase e-learning's postadoption within universities in Malaysia.

Literature review

This study uses Delone and McLean's (2003) information system success model (D&M model) to confirm the factors and subfactors that increase e-learning's postadoption. In addition, Fuzzy Set Theory (FST) was used to deal with the vague and imprecise information obtained from the experts' judgments and opinions gathered in this study.

Information Systems Success Model (D&M Model)

The initial D&M model was proposed by DeLone and McLean (1992) to examine the post adoption of an information system. The D&M model consists of six dimensions: system quality, content quality, use, user satisfaction, personal influence, and organizational influence. The D&M model was later advanced by Delone and McLean (2003) to include service quality because theoretical evidence shows that service quality is a success factor that influences the use and user satisfaction of e-learning. Another new factor, i.e., net benefits, was also added into the D&M model as a result of merging personal influence and organizational influence. According to Delone and McLean (2003), system quality is viewed as the usability, performance, and technical characteristics of the system itself. Content quality relates to the quality of course contents in terms of accuracy, completeness, ease of understanding, consistency, relevance, and being up to date. Service quality corresponds to the support provided by the institutions to ensure the sustainability of the e-learning system. System use, user satisfaction, and net benefits evaluate the website's effectiveness. For instance, use is viewed as the effective use of a system, user satisfaction is the perceived level of agreeableness towards the entire system in terms of effectiveness and appropriateness, and net benefits are the perceived organizational and individual influence on task performance and efficiency (Delone & McLean, 2003).

As mentioned in the introduction, the review demonstrates that there is a critical knowledge gap concerning the components of the D&M model that influence e-learning after adoption. In this study, the research framework was first formulated through an intensive literature review, followed by pretesting by six e-learning experts (refer to Fig. 1). Figure 1 shows the research framework of this study, which consists of the goal, factors, subfactors, and dependencies. The solid arrows show the influence of lower-level elements on higher-level elements. For instance, five factors influence the goal, i.e., e-learning's postadoption. The dotted arrows indicate the dependencies among factors and subfactors. For instance, learners' characteristics are influenced by instructors' characteristics, institution service quality, system quality, and content quality.

Fig. 1 Research framework of this study
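For readers who want to work with the framework programmatically, the sketch below shows one possible way to encode the goal, factors, subfactors, and the dependency mentioned above as a plain Python data structure. The identifiers are illustrative assumptions and are not taken from the authors' implementation; only the dependency explicitly described in the text is listed.

```python
# Hypothetical encoding of the research framework in Fig. 1: goal, factors,
# subfactors, and one inner dependency described in the text. Names are illustrative.
framework = {
    "goal": "E-learning postadoption",
    "factors": {
        "learners_characteristics": [
            "computer_self_efficacy", "internet_self_efficacy", "attitude_toward_elearning",
        ],
        "instructors_characteristics": [
            "timely_response", "technology_control", "attitude_toward_students",
        ],
        "institution_service_quality": [
            "network_infrastructure", "availability_of_technical_support_staff",
        ],
        "system_quality": [
            "accessibility", "response", "reliability", "stability", "security",
        ],
        "content_quality": [
            "relevant", "accurate", "up_to_date", "complete", "consistent", "useful",
        ],
    },
    # Dotted arrows in Fig. 1 (example given in the text): learners' characteristics
    # are influenced by the other four factors.
    "dependencies": {
        "learners_characteristics": [
            "instructors_characteristics", "institution_service_quality",
            "system_quality", "content_quality",
        ],
    },
}
```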

Fuzzy Set Theory

Fuzzy Set Theory (FST) was introduced by Zadeh (1965) to deal with vague and imprecise information. FST is a mathematical theory that models the fuzziness of human cognitive processes by expressing vague data in natural-language linguistic terms (Chang et al., 2015). The word "fuzzy" means that things are vague and unclear. Fuzzy logic differs from Boolean logic, which reduces all values to either true or false. Figure 2 illustrates the differences between Boolean logic and fuzzy logic.

Fig. 2 Boolean logic and fuzzy logic

In Fig. 2, the following question was asked: "Is Thomas honest or not?" Under Boolean logic, there are only two possible answers: "yes" and "no". In contrast, under fuzzy logic, the answer is indicated by a value in the range from "0" to "1", where "1.0" means that Thomas is extremely honest, "0.75" indicates that Thomas is very honest, "0.25" implies that Thomas is somewhat honest, and "0.0" indicates that Thomas is extremely dishonest. Unlike Boolean logic, which is a gross oversimplification of real-world issues, fuzzy logic represents degrees of truth.

The foundation of FST is classical set theory, which can be understood in the context of set membership (see Fig. 3) (Chang et al., 2015; Onut et al., 2011). Classical set theory includes elements that satisfy precise properties of membership, while FST includes elements that satisfy imprecise properties of membership. That is, classical set theory only allows values of "1.0" (full membership) or "0.0" (full non-membership). In contrast, fuzzy set theory allows partial membership, e.g., a value of "0.8" indicates a strong but partial membership, and a value of "0.2" indicates a weak but partial membership.

Fig. 3 Membership functions of the classical set and fuzzy set

In short, fuzzy logic is not logic that is fuzzy but rather the logic used to describe fuzziness. Fuzzy logic turns a crisp value into a fuzzy value, enhancing the decision-making process (Yüksel & Dağdeviren, 2010), as decision-makers frequently find making fuzzy judgments easier than making fixed-value judgments.
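To make the contrast concrete, the following Python sketch compares a crisp (Boolean) membership function with a triangular fuzzy membership function; the particular functions and thresholds are illustrative assumptions rather than anything used in the study.

```python
def crisp_membership(x: float, threshold: float = 0.5) -> int:
    """Boolean logic: full membership (1) or full non-membership (0)."""
    return 1 if x >= threshold else 0

def triangular_membership(x: float, a: float, b: float, c: float) -> float:
    """Fuzzy logic: degree of membership in [0, 1] for a triangular fuzzy number (a, b, c)."""
    if x <= a or x >= c:
        return 0.0
    if x <= b:
        return (x - a) / (b - a)
    return (c - x) / (c - b)

# Illustrative example: "How honest is Thomas?" on a 0-1 scale.
print(crisp_membership(0.75))                        # Boolean: 1 (honest)
print(triangular_membership(0.75, 0.0, 1.0, 2.0))    # Fuzzy: 0.75 (very honest)
```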

Analytic Network Process (ANP)

The ANP is one of the MCDM methods; it was developed by Saaty in 1996 as a generalization of the Analytic Hierarchy Process (AHP) (Chang et al., 2015; Onut et al., 2011; Saaty, 2008). The ANP is utilized as a quantifying tool to evaluate qualitative and quantitative factors in many disciplines (Soma, 2003). Inputs can be obtained from actual measurements, such as weights, prices, and heights, or from subjective opinions, such as preferences, feelings, and satisfaction. The ANP can evaluate the weights of the factors and subfactors of specific issues.

The ANP method uses pairwise comparison questions (see Table 1). The pairwise comparison format enables respondents to compare only two elements in each question, which reduces the burden on respondents of judging how much one element dominates another (Saaty, 2008). Respondents are required to circle (or tick) only one value in each question. For instance, if a respondent perceives that F1 and F2 are equally important, the respondent circles the value "1". As another example, if a respondent perceives that F1 is extremely more important than F3 in the second question, the respondent circles the value "9" on the left side.

Table 1. Example of pairwise comparison questions

No.  Factor A   A is more important than B   Equal   B is more important than A   Factor B
1    F1         9 8 7 6 5 4 3 2              1       2 3 4 5 6 7 8 9              F2
2    F1         9 8 7 6 5 4 3 2              1       2 3 4 5 6 7 8 9              F3
3    F2         9 8 7 6 5 4 3 2              1       2 3 4 5 6 7 8 9              F3

The ANP calculates the consistency ratio (CR) to determine the respondents' consistency in answering the questionnaire (Saaty, 2005). As a rule of thumb, a consistency ratio (CR) of up to 10% is allowed because human judgments are not always consistent. If the consistency ratio is less than 10%, the transitivity issue of the ANP is minimized (Saaty, 1980). No transitivity problem denotes that there is consistency in the comparison of the (sub)factors, i.e., if A is more important than B and B is more important than C, then A must be more important than C.

A CR value exceeding 0.1 denotes that there is inconsistency among the respondents in answering the questionnaire (Saaty, 2005). This issue can be addressed by removing some respondents' data to reduce the CR. This step is repeated until the CRs of all the pairwise comparison matrices are smaller than 0.1.
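As an illustration of the consistency check described above, the Python sketch below computes the principal eigenvalue λmax, the consistency index CI = (λmax − n)/(n − 1), and CR = CI/RI for a crisp reciprocal pairwise comparison matrix. The random index (RI) values are the standard Saaty values; the example matrix is hypothetical, and the snippet is not the authors' code.

```python
import numpy as np

# Random consistency index (RI) values for matrix sizes 1..6 (standard Saaty values).
RI = {1: 0.0, 2: 0.0, 3: 0.58, 4: 0.90, 5: 1.12, 6: 1.24}

def consistency_ratio(matrix: np.ndarray) -> float:
    """Return CR = CI / RI for a reciprocal pairwise comparison matrix."""
    n = matrix.shape[0]
    lambda_max = max(np.linalg.eigvals(matrix).real)   # principal eigenvalue
    ci = (lambda_max - n) / (n - 1)                     # consistency index
    return ci / RI[n] if RI[n] > 0 else 0.0

# Illustrative 3x3 reciprocal matrix: A vs B = 2, A vs C = 4, B vs C = 2.
A = np.array([
    [1.00, 2.0, 4.0],
    [0.50, 1.0, 2.0],
    [0.25, 0.5, 1.0],
])
print(consistency_ratio(A) < 0.1)  # True: judgments are acceptably consistent
```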

Integration of Fuzzy Set Theory and Analytic Network Process

The nine-point scale proposed by Saaty (1990) is mainly used in crisp decision applications, and the scale is not suitable for qualitative factors that are difficult to measure numerically (Lin et al., 2015). Beauty, kindness, and happiness are examples of such qualitative factors. In many cases, there is uncertainty in the preference model of the respondents due to incomplete knowledge or the complexity of the decision-making environment (Mikhailov & Singh, 2003). Respondents may find it difficult to provide exact numerical values when they are uncertain about their level of preference in the decision environment. A nine-point scale is also an unbalanced scale of judgment if the respondents prefer to answer all the questions using a larger or smaller scale (Lin et al., 2015).

The above discussion has highlighted that the use of crisp pairwise comparisons is imprecise and insufficient for capturing experts' judgments and opinions. According to Yüksel and Dağdeviren (2010), this problem can be resolved by integrating FST with the ANP, namely, the Fuzzy Analytic Network Process (FANP), as the FANP uses a linguistic scale to transform crisp values into fuzzy values. In the FANP, respondents answer the questionnaire using a linguistic scale, which enhances the decision-making process (see Table 2). Therefore, respondents find it easier to give fuzzy judgments than fixed-value judgments (Chang et al., 2015).

Table 2. Fuzzy scale for importance

Linguistic scale Explanation Fuzzy scale Reciprocal fuzzy scale
Equally important (EI) Both elements are equally important. (1,1,1) (1,1,1)
Weakly more important (WI) An element is weakly more important than another element. (2,3,4) (1/4,1/3,1/2)
Strongly more important (SI) An element is strongly more important than another element. (4,5,6) (1/6,1/5,1/4)
Very strongly more important (VSI) An element is very strongly more important than another element. (6,7,8) (1/8,1/7,1/6)
Absolutely more important (AI) An element is absolutely more important than another element. (8,9,10) (1/10,1/9,1/8)
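For illustration, the linguistic scale in Table 2 can be represented as triangular fuzzy numbers (l, m, u) together with their reciprocals, as in the hypothetical Python sketch below; the identifiers are assumptions for the example only.

```python
# Triangular fuzzy numbers (l, m, u) for the linguistic scale in Table 2,
# with reciprocals (1/u, 1/m, 1/l). Identifiers are illustrative.
FUZZY_SCALE = {
    "EI":  (1, 1, 1),     # equally important
    "WI":  (2, 3, 4),     # weakly more important
    "SI":  (4, 5, 6),     # strongly more important
    "VSI": (6, 7, 8),     # very strongly more important
    "AI":  (8, 9, 10),    # absolutely more important
}

def reciprocal(tfn):
    """Reciprocal of a triangular fuzzy number (l, m, u) -> (1/u, 1/m, 1/l)."""
    l, m, u = tfn
    return (1.0 / u, 1.0 / m, 1.0 / l)

# Example: if factor A is "strongly more important" than B, then B vs A is (1/6, 1/5, 1/4).
print(reciprocal(FUZZY_SCALE["SI"]))
```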

Methodology

A cross-sectional survey was conducted to identify the factors and subfactors that influence the postadoption of e-learning in Malaysia. Pairwise comparison questions were used in the questionnaire so that the respondents could compare only two elements (factors or subfactors) in each question, which reduced the burden on respondents of judging how much one element dominates another (Saaty, 2008) (refer to Table 1). The scale of the pairwise comparison questions was as follows: "EI" denotes equally important, "WI" denotes weakly more important, "SI" denotes strongly more important, "VSI" denotes very strongly more important, and "AI" denotes absolutely more important (Kahraman et al., 2004; Kahraman et al., 2003) (refer to Table 2).

This study was conducted in Malaysian public and private universities. The university lecturers were given questionnaires in person and by email. A purposive sampling technique was applied to collect data from e-learning experts in Malaysia. Six filtering questions were included in the questionnaire to identify qualified e-learning experts. The filtering questions asked whether respondents actively use the e-learning system to conduct teaching and learning activities. These activities include uploading the course outline; uploading course content such as PowerPoint slides, video clips, audio, and images; conducting a forum or discussion; responding to learners' questions and comments; and conducting assessments such as quizzes and tests. Thirty-six e-learning experts participated in this study. The sample size was sufficient to run the FANP analysis, as in previous studies (Bathaei et al., 2019; Hemmati et al., 2018; Nilashi et al., 2016; Sadeghi & Larimian, 2018; Youneszadeh et al., 2017).

The extent analysis method by Chang (1996), which is one of the most popular analysis techniques in the FANP, was applied to perform the analysis in this study. Figure 4 presents the systematic procedure of the FANP.
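As a rough illustration, the Python sketch below follows the commonly cited steps of Chang's (1996) extent analysis: computing the fuzzy synthetic extent values, the degrees of possibility between them, and the normalized crisp weights. It is a minimal sketch based on those standard definitions, not the authors' actual analysis code.

```python
import numpy as np

def extent_analysis(fuzzy_matrix):
    """Chang's (1996) extent analysis on an n x n matrix of triangular
    fuzzy numbers (l, m, u). Returns normalized crisp weights."""
    M = np.array(fuzzy_matrix, dtype=float)   # shape (n, n, 3)
    n = M.shape[0]

    # Step 1: fuzzy synthetic extent S_i = (row sum) * (total sum)^-1
    row_sums = M.sum(axis=1)                  # (n, 3)
    total = M.sum(axis=(0, 1))                # (3,)
    S = np.empty_like(row_sums)
    S[:, 0] = row_sums[:, 0] / total[2]
    S[:, 1] = row_sums[:, 1] / total[1]
    S[:, 2] = row_sums[:, 2] / total[0]

    # Step 2: degree of possibility V(S_i >= S_j)
    def possibility(si, sj):
        l1, m1, u1 = sj
        l2, m2, u2 = si
        if m2 >= m1:
            return 1.0
        if l1 >= u2:
            return 0.0
        return (l1 - u2) / ((m2 - u2) - (m1 - l1))

    # Steps 3 and 4: d(A_i) = min_j V(S_i >= S_j), then normalize.
    d = np.array([min(possibility(S[i], S[j]) for j in range(n) if j != i)
                  for i in range(n)])
    return d / d.sum()

# Example: two factors where F1 is weakly more important (WI) than F2.
w = extent_analysis([
    [(1, 1, 1), (2, 3, 4)],
    [(1/4, 1/3, 1/2), (1, 1, 1)],
])
# w -> array([1., 0.]); extent analysis can assign zero weight to clearly dominated elements.
```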

Fig. 4 Systematic procedure of the Fuzzy Analytic Network Process

Findings

Table 3 shows the sample characteristics of the respondents. The final data from the 36 experts fulfill the definition of an expert used in this study. First, experts are academic staff with more than three years of e-learning experience. This expert definition is consistent with the prior e-learning-based Multicriteria Decision Making (MCDM) literature (Chen, 2009; Jeong & Yeo, 2014; Tseng et al., 2011). Second, experts have experience in using the six basic e-learning activities. Third, experts must be academic staff with e-learning awards, professors, associate professors, or academic staff holding a relevant administrative position. Thus, it is believed that the expert opinions collected from this group of respondents can provide precise and insightful results for this study.

Table 3. Demographic profile of the respondents

Sample characteristics Frequency Percentage (%)
Gender
 Male 17 47.22
 Female 19 52.78
Age
 31 ~ 40 years old 6 16.67
 41 ~ 50 years old 10 27.78
 51 ~ 60 years old 14 38.89
 61 years old and above 6 16.67
Working Place
 Universiti Islam Antarabangsa Malaysia 6 16.67
 Universiti Teknologi MARA 1 2.78
 Universiti Kebangsaan Malaysia 2 5.56
 Universiti Malaya 1 2.78
 Universiti Malaysia Pahang 2 5.56
 Universiti Malaysia Sabah 3 8.33
 Universiti Putra Malaysia 4 11.11
 Universiti Pendidikan Sultan Idris 3 8.33
 Universiti Tun Hussein Onn Malaysia 2 5.56
 Universiti Teknologi Malaysia 2 5.56
 Universiti Teknologi Petronas 1 2.78
 Universiti Utara Malaysia 5 13.89
 Multimedia University 2 5.56
 Sunway University 2 5.56
Years of involvement in e-learning
 3 ~ 7 years 11 30.56
 8 ~ 12 years 11 30.56
 More than 12 years 14 38.89
Activities Involved through E-learning
 Upload course outline 36 100.00
 Upload course content, such as PowerPoint slides, video clips, audio, images, and others 36 100.00
 Conduct a forum or a discussion 36 100.00
 Respond to learners' questions and comments 36 100.00
 Conduct an assessment, such as quizzes and tests 36 100.00
 Distribute assignments online 36 100.00
Position
 Deputy director managing e-learning development 3 8.33
 E-learning coordinator 4 11.11
 Committee member of an e-learning development team 4 11.11
 Director who received an e-learning award 2 5.56
 Deputy director who received an e-learning award 1 2.78
 Head of Department (active e-learning user) 4 11.11
 Professor (active e-learning user) 7 19.44
 Assoc. Prof. (active e-learning user) 11 30.56

The extent analysis method by Chang (1996) was used to analyze the data from the questionnaires answered by the e-learning experts, and the results are tabulated in Table 4. All the consistency ratio (CR) values are less than 0.1, indicating that the results are consistent and reliable (Saaty, 2005). Based on Table 4, the most important factor is institution service quality (0.390), followed by system quality (0.277), content quality (0.268), instructors' characteristics (0.050), and learners' characteristics (0.015).

Table 4. Ranking of the factors and subfactors

1. Learners' characteristics (global weight 0.015, rank 5)
   Computer self-efficacy: local weight 0.380, global weight 0.006, subfactor rank 16
   Internet self-efficacy: local weight 0.450, global weight 0.007, subfactor rank 15
   Attitude toward e-learning: local weight 0.170, global weight 0.003, subfactor rank 18
   Consistency test: λ = 3.008, CI = 0.004, RI = 0.580, CR = 0.007 < 0.1
2. Instructors' characteristics (global weight 0.050, rank 4)
   Timely response: local weight 0.112, global weight 0.006, subfactor rank 17
   Technology control: local weight 0.048, global weight 0.002, subfactor rank 19
   Attitude toward students: local weight 0.841, global weight 0.042, subfactor rank 9
   Consistency test: λ = 3.005, CI = 0.003, RI = 0.580, CR = 0.005 < 0.1
3. Institution service quality (global weight 0.390, rank 1)
   Network infrastructure: local weight 0.691, global weight 0.270, subfactor rank 1
   Availability of technical support staff: local weight 0.309, global weight 0.121, subfactor rank 2
   Consistency test: λ = n/a, CI = n/a, RI = 0.000, CR = 0.000 < 0.1
4. System quality (global weight 0.277, rank 2)
   System accessibility: local weight 0.233, global weight 0.064, subfactor rank 5
   System response: local weight 0.111, global weight 0.031, subfactor rank 12
   System reliability: local weight 0.249, global weight 0.069, subfactor rank 4
   System stability: local weight 0.210, global weight 0.058, subfactor rank 7
   System security: local weight 0.198, global weight 0.055, subfactor rank 8
   Consistency test: λ = 5.041, CI = 0.010, RI = 0.120, CR = 0.084 < 0.1
5. Content quality (global weight 0.268, rank 3)
   Relevant content: local weight 0.075, global weight 0.020, subfactor rank 13
   Accuracy: local weight 0.036, global weight 0.010, subfactor rank 14
   Up-to-date content: local weight 0.221, global weight 0.059, subfactor rank 6
   Complete content: local weight 0.132, global weight 0.035, subfactor rank 11
   Consistency: local weight 0.140, global weight 0.038, subfactor rank 10
   Useful content: local weight 0.396, global weight 0.106, subfactor rank 3
   Consistency test: λ = 6.024, CI = 0.005, RI = 1.240, CR = 0.004 < 0.1
Total: factor global weights 1.000; subfactor global weights 1.000

In terms of the local weights, internet self-efficacy (0.450) and computer self-efficacy (0.380) are the most critical subfactors among learners' characteristics. Among the subfactors of instructors' characteristics, attitude toward students (0.841) has the highest importance. Network infrastructure (0.691) is rated higher than the availability of technical support staff (0.309) among the institution service quality subfactors. System reliability (0.249) and system accessibility (0.233) are the most important subfactors of system quality. The most important subfactors of content quality are useful content (0.396) and up-to-date content (0.221).

Concerning the global weights, the five most essential subfactors are the network infrastructure (0.270), availability of technical support staff (0.121), useful content (0.106), system reliability (0.069), and system accessibility (0.064).
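As a side note, each subfactor's global weight in Table 4 appears to equal its factor's global weight multiplied by its local weight, up to rounding; a quick illustrative check in Python:

```python
# Global weight of a subfactor = factor's global weight x subfactor's local weight.
# Example from Table 4: network infrastructure under institution service quality.
factor_weight = 0.390   # institution service quality (global weight)
local_weight = 0.691    # network infrastructure (local weight)
print(round(factor_weight * local_weight, 3))  # 0.269, matching the 0.270 in Table 4 up to rounding
```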

Discussion

The findings of this study offer several important insights concerning the factors influencing the postadoption of e-learning in the Malaysian context.

First, this study shows that institution service quality (global weight = 0.390) is the most important factor influencing the postadoption of e-learning in the Malaysian context. This finding is inconsistent with prior studies that have suggested that institution service quality is the least important factor (Anggrainingsih et al., 2016; Bhuasiri et al., 2012; Fitriastuti et al., 2019; Xaymoungkhoun et al., 2012). Nevertheless, the results of this study are consistent with the studies of Anggrainingsih et al. (2018) and Lee (2008). In this study, university lecturers in Malaysian universities generally perceive that there is room for improvement in institution service quality. For example, the network infrastructure should provide stable and complete network coverage in universities. Additionally, strong technical support for university lecturers is required to ensure that e-learning-related issues are resolved promptly (Lee, 2008). Moreover, relevant and high-quality training should be provided to ensure that e-learning skills among university lecturers are up to date (Lee, 2008; Siritongthaworn et al., 2006). In short, strong institution service quality is the main pillar for enhancing the use of e-learning among university lecturers.

This study shows that the second most important factor is system quality (global weight = 0.277). This finding contradicts prior studies that suggest system quality is the least important (Anggrainingsih et al., 2018; Jie, 2010). Nonetheless, this result is consistent with the majority of prior empirical studies (Anggrainingsih et al., 2016; Fitriastuti et al., 2019; Garg & Jain, 2017; Lukhayu Pritalia et al., 2018; Tseng et al., 2011). University lecturers perceive that the system quality should be tailored to the needs of learners. At a minimum, users require a responsive, stable, and reliable e-learning platform (Fitriastuti et al., 2019; Tseng et al., 2011). A bad experience with the e-learning platform will lower learners' and instructors' interest and hamper their intention to use e-learning in the future (Garg & Jain, 2017; Lukhayu Pritalia et al., 2018).

Content quality (global weight = 0.268) is the third most important factor in this study. This finding contradicts previous studies suggesting that content quality is the least crucial factor in e-learning’s postadoption (Garg & Jain, 2017; Hwang et al., 2004; Lin et al., 2014). However, this finding is in line with prior studies that verify that content quality is an important factor influencing the postadoption of e-learning (Choi & Jeong, 2019; Iryanti et al., 2016; Jain et al., 2016; Jie, 2010; Su et al., 2016). The rationale is that university lecturers perceive that good content quality motivates learners to use e-learning for learning activities. Good quality course content should be relevant, accurate, up-to-date, complete, and useful (Choi & Jeong, 2019; Jie, 2010). Good course content also contains active and vivid multimedia designs, such as images and videos, to attract learners' attention (Chao & Chen, 2009; Lin et al., 2014). In short, content quality is a significant factor in ensuring that the learners perceive that it is worthwhile to use e-learning.

In this study, instructors' characteristics are the fourth most important factor (global weight = 0.050) influencing e-learning's postadoption. This finding is inconsistent with the study of Bhuasiri et al. (2012), which suggested that instructors' characteristics are the most significant factor. Instructors' characteristics, such as their responsiveness to learners' inquiries, technology control, and attitude toward students, are important for enhancing the use of e-learning among instructors. Nevertheless, the finding is largely congruent with e-learning postadoption studies suggesting that instructors' characteristics play a less significant role in e-learning postadoption (Anggrainingsih et al., 2016; Lukhayu Pritalia et al., 2018). One possible explanation is that the qualities of instructors are a result of the institution's service quality and system quality to which they are exposed (see the dependencies in Fig. 1). Training and workshops generally enhance instructors' technology control over e-learning. In addition, system quality determines instructors' intention to use e-learning. In other words, instructors' characteristics can be shaped by other factors, i.e., service quality and system quality; therefore, how instructors experience those factors matters more in Malaysian universities.

The results show that the least important factor influencing e-learning's postadoption (global weight = 0.015) is learners' characteristics. This finding contradicts previous findings, which conclude that learners' characteristics significantly influence the postadoption of e-learning (Bhuasiri et al., 2012; Mehregan et al., 2011; Xaymoungkhoun et al., 2012). These studies found that learners' characteristics, namely, computer self-efficacy, internet self-efficacy, and attitude toward e-learning, enhance learners' intentions to use an e-learning system. Nonetheless, the results are in line with the studies by Anggrainingsih et al. (2018) and Lukhayu Pritalia et al. (2018), which suggest that learners' characteristics are probably less significant factors for tech-savvy learners. These learners generally have great technology exposure and thus a high willingness to use e-learning.

Regarding learners' characteristics, internet self-efficacy (local weight = 0.450) and computer self-efficacy (local weight = 0.380) are the two most important subfactors. These results suggest that training can be offered to increase learners' computer and ICT skills, which in turn will develop a positive attitude toward e-learning (Anggrainingsih et al., 2018; Bhuasiri et al., 2012; Lukhayu Pritalia et al., 2018).

Among the subfactors of instructors' characteristics, attitude toward students (local weight = 0.841) is the most important. A positive attitude toward students means that an instructor is likely to prepare good course content and be responsive to learners' inquiries (Anggrainingsih et al., 2018; Bhuasiri et al., 2012; Lukhayu Pritalia et al., 2018).

Furthermore, network infrastructure (local weight = 0.691) is more important than the availability of technical support staff (local weight = 0.309) among the institution service quality subfactors. A sufficient network infrastructure implies wide network coverage and network access on campus. High-speed internet is also important for enhancing the teaching and learning productivity of e-learning users, especially during the COVID-19 pandemic (Begičević et al., 2007; Djajadikerta et al., 2021; Hunjak & Begičević, 2006).

System reliability (local weight = 0.249) and system accessibility (local weight = 0.233) are the two most important subfactors of system quality. Similarly, prior studies suggest that a reliable e-learning system can motivate both learners and instructors to use e-learning (Chen & Fu, 2010; Fitriastuti et al., 2019; Lai, 2010). This reliability is demonstrated through uploading materials and conducting assessments and discussions in the e-learning system (Chen & Fu, 2010; Fitriastuti et al., 2019; Lai, 2010). Availability and ease of access are important so as not to frustrate the instructors and learners who use e-learning.

The two most important subfactors in content quality are useful content (local weight = 0.396) and up-to-date content (local weight = 0.221), which means that the course materials uploaded in e-learning should be useful, up-to-date, and fit the current needs of learners (Garg & Jain, 2017; Munkhtsetseg et al., 2014; Sadi-Nezhad et al., 2010). Good content quality enhances learner-content interaction while poor content quality raises the dropout rates of e-learning courses (Bhuasiri et al., 2012).

In terms of the global weights, the five most important subfactors are network infrastructure (0.270), availability of technical support staff (0.121), useful content (0.106), system reliability (0.069), and system accessibility (0.064). In short, the ranking of subfactors helps researchers understand which subfactors should be emphasized within each factor.

Implication of study

This study offers several theoretical implications. First, this study advances Delone and McLean's (2003) Information Systems Success Model (D&M model) in the e-learning postadoption context by using the Fuzzy Analytic Network Process (FANP) method. This study is believed to be among the first few studies to modify the D&M model by including another four factors: learners' characteristics, instructors' characteristics, user interface, and learning community. For instance, some studies integrated learners' characteristics and instructors' characteristics into the D&M model (Ozkan & Koseler, 2009; Xaymoungkhoun et al., 2012; Yassine et al., 2017), while other studies included the user interface and learning community in the D&M model (Abdellatief et al., 2010; Choi & Jeong, 2019; Farid et al., 2018; Iryanti et al., 2016).

Overlaps among the subfactors under these factors were also resolved by categorizing them under the relevant factors. As a result, seven factors and thirty-nine subfactors were identified through an intensive literature review. However, in the third step of the Fuzzy Analytic Network Process, the e-learning experts agreed to remove the user interface and learning community from the research framework. The rationale was to ensure that the remaining factors and subfactors are relevant and applicable to the current e-learning environment in Malaysia. As a result, five factors and nineteen subfactors remained in this study.

Prior studies on e-learning in the Malaysian context tend to be fragmented. For example, Musa and Othman (2012) focus only on the technology and student perspectives concerning the success of e-learning. Another study by Masrom (2008) focuses only on institutional support and technological factors in determining e-learning success. Ramayah et al. (2010), on the other hand, focus only on system quality, information quality, and service quality in the postadoption of e-learning but overlook both students' and lecturers' characteristics. Hence, the advancement of the D&M model in this study bridges the gap and enriches the understanding of the e-learning postadoption issue in the Malaysian context.

Second, by using the FANP technique, this study extends the theoretical insights of the conceptual framework by discovering the dependencies that are crucially relevant to e-learning issues in the Malaysian context. The literature review shows that the majority of the extant studies adopted the Analytic Hierarchy Process (AHP) or Fuzzy Analytic Hierarchy Process (FAHP) method, which overlooks the dependencies and feedback among the elements. This research gap is filled by the identification of the dependencies in this study. The identification of dependencies is crucial because it influences the final weights, which better illustrate e-learning's postadoption. Thus, this study applies the Fuzzy Analytic Network Process (FANP) method to identify several significant dependencies in the D&M model.

Third, this study also reinforces the conceptual framework by analyzing high-quality expert opinions. In prior studies, the operational definition of e-learning experts was not specific enough. For example, the majority of studies viewed e-learning experts as academic staff who are familiar with and use e-learning. Some studies viewed experts as those who have used e-learning for a specific number of years, while others viewed experts as system developers, IT managers, or university management. Such definitions are not strong enough because academic staff who are familiar with and use e-learning might use it only minimally. Thus, this study filtered real e-learning experts by using filtering questions. It is believed that the expert opinions collected from this group of respondents provide precise and insightful results for this study.

In addition, this study also offers several educational implications. First, university administrators can utilize the results of this study to implement policies that enhance the postadoption of e-learning by lecturers and students. This study unveils the weights of the factors and subfactors that influence the postadoption of e-learning. Such results are useful input for university management in addressing the low postadoption rate of e-learning. Universities can then determine which criteria should be prioritized when implementing policies. Institution service support, such as the availability of network infrastructure and technical support, should be provided on campus. Moreover, system quality should be improved in terms of system response, reliability, accessibility, stability, and security. Lastly, the online course content should be up to date and useful to increase e-learning's postadoption.

Moreover, this study can improve existing awareness of e-learning's postadoption in the Malaysian context. The research model of this study is useful for university administrators in long-range strategic management. Specifically, the model can be adopted as a reference to form a rating system to scrutinize e-learning's postadoption. Such a rating system can be used to evaluate the e-learning postadoption of a specific university and to gauge the changing level of e-learning postadoption in universities over the years. Thus, the rating system allows more efficient resource allocation and management.

Conclusion and recommendations

In conclusion, the emergence of the COVID-19 pandemic has catalyzed the rapid growth and usage of e-learning in Malaysian universities. Nevertheless, researchers point out that students and teachers face certain obstacles in using e-learning, including poor internet connectivity, incomplete content development, insufficient knowledge of the use of ICT, and the absence of social interaction. Hence, this study extends the information systems success model into the e-learning postadoption context. In particular, this study offers insights concerning the dependencies among the factors in the model within the Malaysian university context. The findings are useful for the long-range strategic management of university administrators, and the model can be adopted as a reference to form a rating system to analyze e-learning postadoption.

A limitation of this research is that it focuses on Malaysian universities and may not be generalizable to e-learning postadoption in a broader context. First, future studies can extend the current research to primary and secondary schools, as they may have different perspectives. Second, future studies can extend the current research to non-university contexts. Third, the research can be extended to the mobile learning context to provide greater insight into e-learning-related issues. Finally, the proposed framework can be studied in other countries to compare the ranking of factors that influence the postadoption of e-learning. A qualitative study on management effectiveness is also recommended to better understand the phenomenon.

Data availability

The datasets generated during and/or analysed during the current study are available from the corresponding author on reasonable request.

Footnotes

Publisher’s note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Contributor Information

Puong Koh Hii, Email: HiiPK@ucsiuniversity.edu.my.

Chin Fei Goh, Email: gcfei@utm.my.

Owee Kowang Tan, Email: oktan@utm.my.

Rasli Amran, Email: amran.rasli@newinti.edu.my.

Choon Hee Ong, Email: o.choonhee@utm.my.

References

  1. Abdellatief, M., Sultan, A. M., Jabar, M. A., & Abdullah, R. (2010). Developing general view of quality models for e-learning from developers perspective. Proceedings of Knowledge Management 5th International Conference 2010, 143–149. https://repo.uum.edu.my/id/eprint/11112
  2. Aldowah H, Al-Samarraie H, Alzahrani AI, Alalwan N. Factors affecting student dropout in MOOCs: a cause and effect decision-making model. Journal of Computing in Higher Education. 2020;32(2):429–454. doi: 10.1007/s12528-019-09241-y. [DOI] [Google Scholar]
  3. Anggrainingsih, R., Nugroho, A. A., Suryani, E., & Wahyuningsih, D. (2016). Determining e learning critical success factor at Sebelas Maret University using Analytical Hierarchy Process (AHP). 2016 1st International Seminar on Application for Technology of Information and Communication (191–196). IEEE. 10.1109/ISEMANTIC.2016.7873836
  4. Anggrainingsih, R., Umam, M. Z., & Setiadi, H. (2018). Determining e-learning success factor in higher education based on user perspective using Fuzzy AHP. In MATEC web of conferences, 154, 03011. 10.1109/ISEMANTIC.2016.7873836
  5. Awang H, Aji ZM, Yaakob MFM, Osman WRS, Mukminin A, Habibi A. Teachers’ intention to continue using virtual learning environment (VLE): Malaysian context. JOTSE. 2018;8(4):439–452. doi: 10.3926/jotse.463. [DOI] [Google Scholar]
  6. Azhari FA, Ming LC. Review of e-learning practice at the tertiary education level in Malaysia [Article] Indian Journal of Pharmaceutical Education and Research. 2015;49(4):248–257. doi: 10.5530/ijper.49.4.2. [DOI] [Google Scholar]
  7. Bathaei A, Mardani A, Baležentis T, Awang SR, Streimikiene D, Fei GC, Zakuan N. Application of fuzzy Analytical Network Process (ANP) and VIKOR for the assessment of green agility critical success factors in dairy companies. Symmetry. 2019;11(2):250. doi: 10.3390/sym11020250. [DOI] [Google Scholar]
  8. Begičević, N., Divjak, B., & Hunjak, T. (2007). Development of AHP based model for decision making on e-learning implementation. Journal of Information and Organizational Sciences, 31(1), 13–24.
  9. Bhuasiri W, Xaymoungkhoun O, Zo H, Rho JJ, Ciganek AP. Critical success factors for e-learning in developing countries: A comparative analysis between ICT experts and faculty. Computers & Education. 2012;58(2):843–855. doi: 10.1016/j.compedu.2011.10.010. [DOI] [Google Scholar]
  10. Chang B, Kuo C, Wu CH, Tzeng GH. Using fuzzy analytic network process to assess the risks in enterprise resource planning system implementation. Applied Soft Computing. 2015;28:196–207. doi: 10.1016/j.asoc.2014.11.025. [DOI] [Google Scholar]
  11. Chang DY. Applications of the extent analysis method on fuzzy AHP. European Journal of Operational Research. 1996;95(3):649–655. doi: 10.1016/0377-2217(95)00300-2. [DOI] [Google Scholar]
  12. Chao RJ, Chen YH. Evaluation of the criteria and effectiveness of distance e-learning with consistent fuzzy preference relations. Expert Systems with Applications. 2009;36(7):10657–10662. doi: 10.1016/j.eswa.2009.02.047. [DOI] [Google Scholar]
  13. Chen, M., & Fu, Y. (2010). Comprehensive evaluation of teaching websites based on intelligence methods. 2010 2nd IEEE International Conference on Information Management and Engineering
  14. Chen SY. Identifying and prioritizing critical intellectual capital for e-learning companies. European Business Review. 2009;21(5):438–452. doi: 10.1108/09555340910986664. [DOI] [Google Scholar]
  15. Choi CR, Jeong HY. Quality evaluation for multimedia contents of e-learning systems using the ANP approach on high speed network. Multimedia Tools and Applications. 2019;78(20):28853–28875. doi: 10.1007/s11042-019-7351-8. [DOI] [Google Scholar]
  16. DeLone WH, McLean ER. Information systems success: The quest for the dependent variable. Information Systems Research. 1992;3(1):60–95. doi: 10.5267/j.uscm.2014.12.002. [DOI] [Google Scholar]
  17. Delone WH, McLean ER. The DeLone and McLean model of information systems success: a ten-year update. Journal of Management Information Systems. 2003;19(4):9–30. doi: 10.1080/07421222.2003.11045748. [DOI] [Google Scholar]
  18. Djajadikerta HG, Trireksani T, Ong T, Roni SM, Kazemian S, Zhang J, Noor AHM, Ismail S, Ahmad MAN, Azhar Z. Australian, Malaysian and Indonesian accounting academics' teaching experiences during the COVID-19 pandemic. Australasian Accounting Business and Finance Journal. 2021;15(2):103–113. doi: 10.14453/aabfj.v15i2.7. [DOI] [Google Scholar]
  19. Embi, M. A. (2011). e-Learning in Malaysian higher education institutions: Status, trends, & challenges. Department of Higher Education Ministry of Higher Education. http://www.ukm.my/ctlt/wp-content/uploads/2014/08/e-learning-in-Malaysian-Higher-Education-Institutions.pdf
  20. Farid, Ahmad R, Alam M, Akbar A, Chang V. A sustainable quality assessment model for the information delivery in e-learning systems. Information Discovery and Delivery. 2018;46(1):1–25. doi: 10.1108/IDD-11-2016-0047. [DOI] [Google Scholar]
  21. Fitriastuti, F., Rahmalisa, U., & Girsang, A. S. (2019). Multi-criteria decision making on succesfull of online learning using AHP and regression. Journal of Physics: Conference Series, 1175(1), 012071. 10.1088/1742-6596/1175/1/012071
  22. Garg R, Jain D. Prioritizing e-learning websites evaluation and selection criteria using fuzzy set theory. Management Science Letters. 2017;7(4):177–184. doi: 10.5267/j.msl.2017.1.002. [DOI] [Google Scholar]
  23. Ghazal, S., Aldowah, H., & Umar, I. (2017). Critical factors to learning management system acceptance and satisfaction in a blended learning environment. International Conference of Reliable Information and Communication Technology, 688–698. 10.1007/978-3-319-59427-9_71
  24. Hemmati N, Galankashi MR, Imani DM, Farughi H. Maintenance policy selection: a fuzzy-ANP approach. Journal of Manufacturing Technology Management. 2018 doi: 10.1108/JMTM-06-2017-0109. [DOI] [Google Scholar]
  25. Hunjak T, Begičević N. Prioritisation of e-learning forms based on pair-wise comparisons. Journal of Information and Organizational Sciences. 2006;30(1):47–61. [Google Scholar]
  26. Hwang GJ, Huang TC, Tseng JC. A group-decision approach for evaluating educational web sites. Computers & Education. 2004;42(1):65–86. doi: 10.1016/S0360-1315(03)00065-4. [DOI] [Google Scholar]
  27. Iryanti, E., Pandiya, R., & Ieee (2016). Evaluating the quality of e-learning using consistent fuzzy preference relations method. Proceedings of the 2016 6th International Conference on System Engineering and Technology, 61–66. 10.1109/FIT.2016.7857539
  28. Jain D, Garg R, Bansal A, Saini KK. Selection and ranking of e-learning websites using weighted distance-based approximation. Journal of Computers in Education. 2016;3(2):193–207. doi: 10.1007/s40692-016-0061-6. [DOI] [Google Scholar]
  29. Jeong HY, Yeo SS. The quality model for e-learning system with multimedia contents: a pairwise comparison approach. Multimedia Tools and Applications. 2014;73(2):887–900. doi: 10.1007/s11042-013-1445-5. [DOI] [Google Scholar]
  30. Jie, C. (2010). Evaluation and modeling of online course using fuzzy AHP. In International Conference on Computer and Information Application (ICCIA), 232–235. 10.1109/ICCIA.2010.6141579
  31. Kahraman C, Cebeci U, Ruan D. Multi-attribute comparison of catering service companies using fuzzy AHP: The case of Turkey. International Journal of Production Economics. 2004;87(2):171–184. doi: 10.1016/S0925-5273(03)00099-9. [DOI] [Google Scholar]
  32. Kahraman C, Cebeci U, Ulukan Z. Multi-criteria supplier selection using fuzzy AHP. Logistics information management. 2003 doi: 10.1108/09576050310503367. [DOI] [Google Scholar]
  33. Lai, H. F. (2010). Determining the sustainability of virtual learning communities in e-learning platform. Computer Science and Education (ICCSE), 2010 5th International Conference on.
  34. Lawrence JE, Tar UA. Factors that influence teachers’ adoption and integration of ICT in teaching/learning process. Educational Media International. 2018;55(1):79–105. doi: 10.1080/09523987.2018.1439712. [DOI] [Google Scholar]
  35. Lee YC. The role of perceived resources in online learning adoption. Computers & Education. 2008;50(4):1423–1438. doi: 10.1016/j.compedu.2007.01.001. [DOI] [Google Scholar]
  36. Lin C, Madu CN, Kuei C, Tsai HL, Wang K. Developing an assessment framework for managing sustainability programs: A Analytic Network Process approach. Expert Systems with Applications. 2015;42(5):2488–2501. doi: 10.1016/j.eswa.2014.09.025. [DOI] [Google Scholar]
  37. Lin TC, Ho HP, Chang CT. Evaluation model for applying an e-learning system in a course: An analytic hierarchy process-multi-choice goal programming approach. Journal of Educational Computing Research. 2014;50(1):135–157. doi: 10.2190/EC.50.1.g. [DOI] [Google Scholar]
  38. Lukhayu Pritalia, G., Djoko Budiyanto, S., Triani Dewi, L., & Kusrohmaniah, S. (2018). Critical factor of e-learning component using HELAM and AHP. MATEC Web of Conferences, 218, 03020. 10.1051/matecconf/201821803020
  39. Masrom M. Critical success in e-learning: an examination of technological and institutional support factors. International Journal of Cyber Society and Education. 2008;1(2):131–142. doi: 10.29430/IJCSE.200811.0131. [DOI] [Google Scholar]
  40. Mehregan, M. R., Jamporazmey, M., Hosseinzadeh, M., & Mehrafrouz, M. (2011). Proposing an approach for evaluating e-learning by integrating critical success factor and fuzzy AHP. In International Conference on Innovation, Management and Service, 14, 125–130. http://www.ipedr.com/vol14/23-ICIMS2011S00038.pdf
  41. Mikhailov L, Singh MG. Fuzzy analytic network process and its application to the development of decision support systems. IEEE Transactions on Systems Man and Cybernetics Part C (Applications and Reviews) 2003;33(1):33–41. doi: 10.1109/TSMCC.2003.809354. [DOI] [Google Scholar]
  42. Motaghian H, Hassanzadeh A, Moghadam DK. Factors affecting university instructors' adoption of web-based learning systems: Case study of Iran. Computers & Education. 2013;61:158–167. doi: 10.1016/j.compedu.2012.09.016. [DOI] [Google Scholar]
  43. Moustakas L, Robrade D. The challenges and realities of e-learning during Covid-19: The case of University Sport and Physical Education. Challenges. 2022;13(1):9. doi: 10.3390/challe13010009. [DOI] [Google Scholar]
  44. Munkhtsetseg, N., Garmaa, D., & Uyanga, S. (2014). Multi-criteria comparative evaluation of the e-learning systems: a case study. Ubi-Media Computing and Workshops (UMEDIA), 2014 7th International Conference on, 190–195. 10.1109/U-MEDIA.2014.47
  45. Musa MA, Othman MS. Critical success factor in e-learning: an examination of technology and student factors. International Journal of Advances in Engineering & Technology. 2012;3(2):140. [Google Scholar]
  46. Nilashi, M., Ahmadi, H., Ahani, A., Ravangard, R.,bin, & Ibrahim, O. (2016). Determining the importance of hospital information system adoption factors using fuzzy Analytic Network Process (ANP). Technological Forecasting and Social Change, 111, 244–264. 10.1016/j.techfore.2016.07.008
  47. Onut S, Tuzkaya UR, Torun E. Selecting container port via a fuzzy ANP-based approach: A case study in the Marmara Region, Turkey. Transport Policy. 2011;18(1):182–193. doi: 10.1016/j.tranpol.2010.08.001. [DOI] [Google Scholar]
  48. Ozkan S, Koseler R. Multi-dimensional students’ evaluation of e-learning systems in the higher education context: An empirical investigation. Computers & Education. 2009;53(4):1285–1296. doi: 10.1016/j.compedu.2009.06.011. [DOI] [Google Scholar]
  49. Ramayah T, Ahmad NH, Lo MC. The role of quality factors in intention to continue using an e-learning system in Malaysia. Procedia-Social and Behavioral Sciences. 2010;2(2):5422–5426. doi: 10.1016/j.sbspro.2010.03.885. [DOI] [Google Scholar]
  50. Razak, F. Z. A., Bakar, A. A., & Abdullah, W. S. W. (2020). The role of system quality and content quality in explaining e-learning continuance intention: An Evidence from Malaysian e-learning users [Conference Paper]. Journal of Physics: Conference Series, 1529, Article 052095. 10.1088/1742-6596/1529/5/052095
  51. Saaty, T. L. (1980). The analytical hierarchical process. J Wiley, New York. https://iraqjournals.com/article_35338_216b6ea12dfae49f35fa7cf33ee5a2e8.pdf
  52. Saaty TL. How to make a decision: The analytic hierarchy process. European Journal of Operational Research. 1990;48(1):9–26. doi: 10.1016/0377-2217(90)90057-I. [DOI] [PubMed] [Google Scholar]
  53. Saaty, T. L. (2005). Theory and applications of the analytic network process: decision making with benefits, opportunities, costs, and risks. http://sutlib2.sut.ac.th/sut_contents/H111204.pdf
  54. Saaty, T. L. (2008). The analytic network process. https://www.sid.ir/en/Journal/ViewPaper.aspx?ID=114912
  55. Sadeghi A, Larimian T. Sustainable electricity generation mix for Iran: A fuzzy analytic network process approach. Sustainable Energy Technologies and Assessments. 2018;28:30–42. doi: 10.1016/j.seta.2018.04.001. [DOI] [Google Scholar]
  56. Sadi-Nezhad, S., Etaati, L., & Makui, A. (2010). A fuzzy ANP model for evaluating e-learning platform. In International Conference on Industrial, Engineering and Other Applications of Applied Intelligent Systems, 254–263. 10.1007/978-3-642-13022-9_26
  57. Siritongthaworn S, Krairit D, Dimmitt NJ, Paul H. The study of e-learning technology implementation: A preliminary investigation of universities in Thailand. Education and Information Technologies. 2006;11(2):137–160. doi: 10.1007/s11134-006-7363-8. [DOI] [Google Scholar]
  58. Soma K. How to involve stakeholders in fisheries management—a country case study in Trinidad and Tobago. Marine Policy. 2003;27(1):47–58. doi: 10.1016/S0308-597X(02)00050-7. [DOI] [Google Scholar]
  59. Su CH, Tzeng GH, Hu SK. Cloud e-learning service strategies for improving e-learning innovation performance in a fuzzy environment by using a new hybrid fuzzy multiple attribute decision-making model [Article] Interactive Learning Environments. 2016;24(8):1812–1835. doi: 10.1080/10494820.2015.1057742. [DOI] [Google Scholar]
  60. Tseng ML, Lin RJ, Chen HP. Evaluating the effectiveness of e-learning system in uncertainty. Industrial Management & Data Systems. 2011;111(6):869–889. doi: 10.1108/02635571111144955. [DOI] [Google Scholar]
  61. Xaymoungkhoun O, Bhuasiri W, Rho JJ, Zo H, Kim MG. The critical success factors of e-learning in developing countries. Kasetsart Journal - Social Sciences. 2012;33(2):321–332. doi: 10.1016/j.compedu.2011.10.010. [DOI] [Google Scholar]
  62. Yassine, S., Khalifa, M., & Franck, P. (2017). Towards a multidimensional model to study a critical success factors affecting continuity and success in e-learning systems. Developments in eSystems Engineering (DeSE), 2017 10th International Conference on, 129–134. 10.1109/DeSE.2017.26
  63. Yim JSC, Moses P, Azalea A. Predicting teachers’ continuance in a virtual learning environment with psychological ownership and the TAM: a perspective from Malaysia. Educational Technology Research and Development. 2019;67(3):691–709. doi: 10.1007/s11423-019-09661-8. [DOI] [Google Scholar]
  64. Youneszadeh H, Ardeshir A, Sebt MH. Exploring critical success factors in urban housing projects using fuzzy Analytic Network Process. Civil Engineering Journal. 2017;3(11):1048–1067. doi: 10.28991/cej-030937. [DOI] [Google Scholar]
  65. Yüksel İ, Dağdeviren M. Using the fuzzy analytic network process (ANP) for Balanced Scorecard (BSC): A case study for a manufacturing firm. Expert Systems with Applications. 2010;37(2):1270–1278. doi: 10.1016/j.eswa.2009.06.002. [DOI] [Google Scholar]
  66. Zadeh LA. Fuzzy sets. Information and Control. 1965;8(3):338–353. doi: 10.1142/9789814261302_0021. [DOI] [Google Scholar]
  67. Zare M, Pahl C, Rahnama H, Nilashi M, Mardani A, Ibrahim O, Ahmadi H. Multi-criteria decision making approach in E-learning: A systematic review and classification [Review] Applied Soft Computing. 2016;45:108–128. doi: 10.1016/j.asoc.2016.04.020. [DOI] [Google Scholar]
