Learning Environments Research. 2022 Dec 6;26(2):515–538. doi: 10.1007/s10984-022-09443-9

Development and validation of the online learning process questionnaire (OLPQ) at home for primary-school children and their caregivers

Joseph Hin Yan Lam 1, Shelley Xiuli Tong 1
PMCID: PMC9734829  PMID: 36530464

Abstract

Despite the increasing use of virtual modalities in schools since the COVID-19 pandemic, no systematic tools exist to evaluate the process of online learning. We developed and validated an Online Learning Process Questionnaire (OLPQ) for assessing online at-home learning among 219 Hong Kong primary-school students and 474 caregivers. Exploratory and confirmatory factor analyses of caregivers’ data classified the 58-item OLPQ into 11 subscales: (1) learning aims, (2) environmental structuring, (3) learning environment, (4) time management, (5) engagement in learning activities, (6) persistence, (7) interaction between teachers and students, (8) interaction among students, (9) feedback from the interface, (10) application of learning, and (11) meaning of learning. Confirmatory factor analysis of students’ data cross-validated this 11-subscale framework, organized under three learning phases: preparatory, performance, and transfer. The OLPQ demonstrated excellent reliability and discriminant validity in both the caregiver (Cronbach’s alpha = 0.98) and student (alpha = 0.98) samples. These findings indicate that the OLPQ is a valid and reliable instrument for assessing the online at-home learning process among both students and their caregivers.

Keywords: At-home online learning, Caregivers and students, Scale development, School-age children, Virtual modalities

Introduction

Online learning has become an increasingly-popular model of instruction in primary schools around the world, especially during the COVID-19 pandemic. Compared with traditional classroom-based instruction, online learning advantageously transcends geographical barriers, allows for flexible learning speeds, and improves students’ technological skills (e.g., Allen & Seaman, 2014; Livingstone & Bober, 2004; Picciano & Seaman, 2007). However, because online learning requires students to be more self-motivated and self-sufficient, it places higher demands on students’ metacognitive and behavioral skills (Cho et al., 2010). Despite the increasing adoption of online learning and its advantages, no learning-specific tool exists to evaluate students’ experience and their perception of the online learning process. To address this issue, we developed and validated an Online Learning Process Questionnaire (OLPQ) to systematically assess various aspects of online learning and its underlying componential structure from both student and caregiver perspectives.

A conceptual framework for the OLPQ

The three-phase model of learning lays the theoretical foundation for understanding the online learning process. According to this model, the three phases are (1) the preparatory phase, referring to understanding what is being learned and connecting it to prior knowledge; (2) the performance phase, describing engagement in learning activities and interactions; and (3) the transfer phase, applying what one has learned to real-life situations (Anderson, 2008; Puustinen & Pulkkinen, 2001). All three phases require metacognitive skills, such as planning strategies, regulating the strategies used, and monitoring the learning process (Jansen et al., 2017). Moreover, some metacognitive skills are connected across the different phases of learning: for example, a plan made in the preparatory phase is executed and monitored in the performance phase and evaluated in the transfer phase. Figure 1 presents an overview of the theoretical components of the three-phase model of learning.

Figure 1. An Overview of the Theoretical Three-phase Online Learning Model and Its Scales

The preparatory phase usually involves teachers providing an outline of the learning activities before starting. Students then access their prior knowledge and skills to establish a basic understanding and become intellectually prepared for the upcoming learning activities (Anderson, 2008). However, the challenges encountered by students and their caregivers differ between online home learning and face-to-face instruction in two key respects. First, a classroom environment is, by default, set up for learning, whereas a home environment contains many distractions that can hinder concentration, especially for young students (Lau & Lee, 2020). Second, the paper-based format of learning is more familiar to students and their caregivers than the diverse tools used to complete online learning activities, such as real-time learning platforms, learning management systems, and e-mail clients, which take time to become accustomed to (Jansen et al., 2017; Lau & Lee, 2020).

The subsequent performance phase of learning involves students’ engagement in learning activities. Unlike face-to-face teaching, for which the timetable is pre-assigned by schools and teachers can motivate students more easily, online learning offers students greater flexibility. Therefore, self-regulatory skills, such as time management, persistence, and engagement, become more crucial (Jansen et al., 2017; Puustinen & Pulkkinen, 2001). Because online learning is more individually-based (Toven-Lindsey et al., 2015), greater interaction and assistance are necessary to enhance students’ motivation to learn and complete online learning activities (Nicpon et al., 2006). According to Moore (1989), three types of interaction are critical to online learning: the interaction between students and interfaces, the interaction between students and teachers, and the interaction among students. The first facilitates the learning of tasks and elicits immediate feedback, while the second promotes help-seeking and allows teachers to motivate students. The third permits students to share thoughts, provide feedback, and learn from each other, which prevents loneliness during online learning (Cho & Cho, 2017; Nicpon et al., 2006).

After completing learning activities, students proceed to the transfer phase of learning by retaining what they have learned and applying it to real life. During learning activities, information and skills are stored in students’ working memory and consolidated into long-term memory for application (Anderson, 2008). By applying the acquired knowledge and skills to real-life situations that possess personal meaning, students can engage with their learning experiences positively (Anderson, 2008). The transfer phase is thus crucial not only for present learning, but also for shaping future attitudes towards learning and how students adapt their learning-related behaviors in the preparatory and performance phases of subsequent learning experiences (Hsu et al., 2009).

A review of previous studies regarding the online learning process

To date, several instruments have been developed for evaluating different aspects of the online learning process among high-school and tertiary students. For example, the Self-Regulated Online Learning Questionnaire (SROLQ) assesses adults’ self-regulated behaviors and metacognitive activities in large-scale open online courses. The 36-item SROLQ consists of five subscales: metacognitive skills, environmental structuring, time management, help seeking, and persistence (Jansen et al., 2017). The Online Self-Regulation Questionnaire (OSRQ) focuses on three types of interactions experienced by university students, namely, the interaction between students and learning content, the interaction between teacher and students, and the interaction among students (Cho & Cho, 2017). The Distance Education Learning Environments Survey (DELES) targets students’ psychological and social perception of online learning environments in higher education and consists of 34 items covering instructor support, student interaction and collaboration, personal relevance, authentic learning, active learning, and student autonomy (Walker & Fraser, 2005). The Web-Based Learning Environment Inventory (WBLEI) measures post-secondary students’ perception of four features of the online learning environment: accessibility, interaction, responsiveness, and results (Chang & Fisher, 2003). The Business Statistics Computer Learning Environment Inventory (BSCLEI) assesses university students’ perceptions of the online learning environment in practical subjects (i.e., business statistics) with five subscales focusing on students’ readiness to apply the skills after learning interactions (Nguyen-Newby & Fraser, 2021).

One noticeable limitation of all of the aforementioned instruments is their inadequacy for comprehensively evaluating the three preparatory-performance-transfer phases of learning. Specifically, the scope of the existing tools is limited in terms of the phases of learning captured and the distinctions among them. For example, in the SROLQ, items from the preparatory and appraisal phases of learning are merged into the larger domain of metacognitive skills, without assessing the learning environment and other prerequisites (Jansen et al., 2017). This results in a lack of distinction between the preparation and transfer of learning.

Furthermore, very few questionnaires consider all three phases of learning, and the performance phase is generally given greater attention than the other phases. For example, the OSRQ emphasizes different types of online interaction during the performance phase of learning only (Cho & Cho, 2017). Similarly, the DELES targets the learning performance stage only (Walker & Fraser, 2005). Moreover, even when the performance phase of online learning is considered, its coverage is often incomplete or poorly defined. For example, the WBLEI (Chang & Fisher, 2003) focuses only on the interaction dimension of the performance phase in online learning, without evaluating students’ self-regulatory skills. Considering the low social presence resulting from the online medium, self-regulatory skills and autonomy are especially important for students to engage in learning activities, compared with traditional classroom learning (Jansen et al., 2017; Puustinen & Pulkkinen, 2001; Richardson et al., 2017). Thus, self-regulation should be a critical component of any evaluation tool for online learning.

With emphasis placed on performance, the transfer phase has often been neglected. For example, the BSCLEI prioritizes students’ preparedness for online learning (i.e., integration and technology adequacy) and students’ learning interaction (i.e., student cohesiveness, involvement, and task orientation) without considering the transfer of learning (Nguyen-Newby & Fraser, 2021). This neglects the agentic role of students in their own learning. To capture this role, real-life applications and the meaning of learning should be included.

Another significant limitation is that most of these instruments target secondary and college students, who have greater mastery of self-regulation skills and more sophisticated technological skills than primary-school students. Thus, these existing tools might not be directly applicable to young primary-school students. Given the increasing use of online learning as a mode of instruction in this age group, it is necessary to develop and validate an age-appropriate questionnaire to evaluate the process of online learning among primary-school students. Additionally, while students themselves are central to their learning, it is also important to consider those who assist in the process, such as caregivers. Existing measures have been validated only with student samples, not with caregivers. This is understandable, considering that the available measures were designed for secondary and college students using a self-report format. However, given the age of primary-school students, the inclusion of caregivers is important because they tend to be actively engaged in their children’s online learning activities, such as providing encouragement (Lau & Lee, 2020). Considering that caregivers are likely to facilitate their children’s online learning in the home environment and thus observe their children’s learning-related behavioral skills, such as time management, interaction during learning, and determination to learn (An & Lee, 2010; Lau & Lee, 2020), they can provide accurate evaluations of their children’s online learning process and difficulties. Therefore, in addition to the students themselves, caregivers are also ideal respondents.

Context and objectives of this study

The rise of online learning can be attributed to two main phenomena, namely, technological advances and the COVID-19 pandemic. With internet accessibility increasing among children and the convenient format of online learning, students are able to engage with education outside of the regular classroom environment and explore their own interests (Allen & Seaman, 2014; Livingstone & Bober, 2004). While this shift has been accelerated by school closures and social-distancing policies associated with the COVID-19 pandemic (Armitage & Nellums, 2020), the trend of online learning is likely to continue beyond the pandemic, given the development of various courses and contents in this flexible format, accompanied by the growing attention to students’ digital literacy (Dhawan, 2020). Therefore, assessing the process of online learning at home is relevant not only during the pandemic but also for the future of education.

In addressing this practical need, this study was designed to develop and validate the Online Learning Process Questionnaire (OLPQ), which assesses the online learning process at home and can be completed by both primary-school students and their caregivers. Two research questions were addressed. First, is the three-phase model of learning applicable to primary-school students for measuring their online learning process? Second, what is the underlying factor structure of the OLPQ? The development of this questionnaire involved questionnaire formulation and content validation. During the validation process, the factor structure was first identified and validated using the caregiver sample. Afterwards, it was cross-validated using a student sample. The validity and reliability of the questionnaire were evaluated for each sample.

Method

Questionnaire formulation

In line with the three-phase model of learning (Puustinen & Pulkkinen, 2001), several subscales were developed to assess the process of online learning under the preparatory phase, performance phase, and transfer phase. Specifically, the preparatory phase section comprises the four subscales of (1) Learning Goals, (2) Prerequisites, (3) Environmental Structuring, and (4) Learning Environment. The performance phase section includes the six subscales of (1) Time Management, (2) Engagement in Learning Activities, (3) Persistence, (4) Interaction between Teachers and Students, (5) Interaction among Students, and (6) Feedback from the Interface. The transfer phase consists of the three subscales of (1) Maintenance of Skills and Knowledge, (2) Application of Skills and Knowledge, and (3) Meaning of Learning. The items for each subscale were first constructed during the literature-review stage and further selected based on a systematic review of various measures of the online learning process, as well as the online learning situation in Hong Kong. Specifically, five items (Items 22, 23, and 24 from Environmental Structuring and Items 27 and 28 from Persistence) were adopted and modified from the SROLQ (Jansen et al., 2017). Similarly, two items (Items 7 and 9 from Environmental Structuring) were adopted and modified from the Online Self-regulated Learning Questionnaire (Barnard et al., 2009). Additionally, three items (Items 1 and 2 from Self-Regulation in Interaction between Students and Teachers and Item 9 from Self-Regulation in Interaction between Students and Students) were adopted and modified from the OSRQ (Cho & Cho, 2017). Moreover, one item (Item 4 from Self-efficacy to Handle Tools) was adopted and modified from the Self-Efficacy Questionnaire for Online Learning (Shen et al., 2013). Each subscale contained multiple items to minimize possible measurement error. The authors discussed and modified the items before the questionnaire underwent a content validation process.

Content validation

To establish content validity, that is, to evaluate whether the items are supported on theoretical grounds (Trochim & Donnelly, 2001), four education researchers and three experienced Hong Kong primary-school teachers were invited via email to comment on the definitions of the subscales and the questionnaire items in the first (English) draft of the questionnaire. Based on their comments, three items were revised for clarity and one new item was added to the subscale of Interaction between Teachers and Students. The revised questionnaire consisted of 67 items. Figure 1 shows the distribution of items across the subscales under the three phases of learning. The questionnaire was then translated into Chinese by a research assistant in Chinese language education, and a back translation was prepared by a research assistant in English language education from the research laboratory in our university. Because the questionnaire was expected to be used in Hong Kong primary schools, two Chinese language teachers and one English language teacher were invited to further assess the content validity and language conformity, as well as the suitability and appropriateness of the items for the context of online learning in Hong Kong.

Sample and procedures

A total of 474 caregivers of primary-school students and 219 primary-school students from Hong Kong participated in this study. Recruitment was conducted on a Facebook page managed by the authors’ laboratory, and most of the audience consisted of caregivers. Informed consent was obtained from all participants. The participants were then invited to complete the online questionnaire via Qualtrics, with the whole process taking approximately 15–20 min. The order of all questions was randomized, except for those in the demographics section. Ethical approval was granted for this study by the Human Research Ethics Committee at the University of Hong Kong. Table 1 shows the participants’ demographic information and online learning experience. As reported by both caregivers and students, most primary-school students (over 90%) carried out online learning at home.

Table 1. Frequency of demographic variables for participants

Variable Caregiver sample (N = 474) Student sample (N = 219)

Grade of student
 Grade 1 70 (14.8%) 28 (12.8%)
 Grade 2 69 (14.6%) 24 (11.0%)
 Grade 3 86 (18.1%) 38 (17.4%)
 Grade 4 76 (16.0%) 38 (17.4%)
 Grade 5 94 (19.8%) 55 (25.1%)
 Grade 6 79 (16.7%) 36 (16.4%)
Sex of student
 Female 181 (38.2%) 90 (41.1%)
 Male 293 (61.8%) 129 (58.9%)
Relationship with studenta
 Mother 36 (7.6%)
 Father 430 (90.7%)
 Others 8 (1.7%)
Online learning method(s) usedb
 Real-time online teaching 431 (90.9%) 197 (90.0%)
 Online learning platform 363 (76.6%) 162 (74.0%)
 Learning management system 337 (71.1%) 156 (71.2%)
 Learning video 321 (67.7%) 135 (61.6%)
 Online assessment 215 (45.4%) 106 (48.4%)
 E-mail 170 (35.9%) 87 (39.7%)
Place for conducting online learning activities
 Home 432 (91.1%) 197 (90.0%)
 School 32 (6.8%) 20 (9.1%)
 Library 3 (0.6%) 1 (0.5%)
 Others 7 (1.5%) 1 (0.5%)

Note. aThis question was included in the caregivers’ questionnaire only. bParticipants could choose more than one option.

Data analysis

Exploratory factor analysis (EFA) was first performed to identify the underlying factor structure of the proposed questionnaire. Promax rotation was chosen because of the factor intercorrelations and its suitability for factor correlations larger than .15 (Tabachnick & Fidell, 2013). Items were retained or eliminated based on the following criteria: (1) factor eigenvalues larger than 1 and item loadings larger than .4; (2) the removal of items with significant factor loadings on multiple factors; and (3) factors consisting of no fewer than three items (Hair et al., 2006; Straub, 1989). Next, confirmatory factor analysis (CFA) was conducted to examine the internal validity of the items (Joreskog & Sorbom, 2018). Six goodness-of-fit indices were used: chi-square, chi-square divided by degrees of freedom, comparative fit index (CFI), incremental fit index (IFI), standardized root mean square residual (SRMR), and root mean square error of approximation (RMSEA). Furthermore, we assessed the reliability and discriminant validity. Finally, CFA, reliability, and discriminant validity analyses were carried out with the student data to cross-validate the factor structure identified with the caregiver data.
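To make these steps concrete, the following is a minimal sketch of the EFA pipeline in Python using the factor_analyzer package; this is an illustrative stand-in, as the paper does not report the software used for the EFA, and the DataFrame name `items` (one column per questionnaire item) is hypothetical.

```python
# Minimal EFA sketch, assuming the 67 item responses are the columns
# of a pandas DataFrame `items` (hypothetical name).
import pandas as pd
from factor_analyzer import FactorAnalyzer
from factor_analyzer.factor_analyzer import (
    calculate_bartlett_sphericity,
    calculate_kmo,
)

def run_efa(items: pd.DataFrame, n_factors: int) -> pd.DataFrame:
    # Sampling adequacy: Bartlett's sphericity test and Kaiser-Meyer-Olkin.
    chi2, p = calculate_bartlett_sphericity(items)
    _, kmo = calculate_kmo(items)
    print(f"Bartlett chi2 = {chi2:.2f} (p = {p:.3g}), KMO = {kmo:.2f}")

    # Principal-component extraction with Promax (oblique) rotation,
    # matching the rotation chosen above for intercorrelated factors.
    fa = FactorAnalyzer(n_factors=n_factors, method="principal",
                        rotation="promax")
    fa.fit(items)

    # Blank out loadings below .40 so that items with no salient loading,
    # or with salient loadings on several factors, stand out for removal
    # (retention criteria 1 and 2 above).
    loadings = pd.DataFrame(fa.loadings_, index=items.columns)
    return loadings.where(loadings.abs() >= 0.40)
```

Criterion 3 (at least three items per factor) can then be checked by counting the surviving loadings in each column of the returned table.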

Results

Identifying the underlying factor structure of the Online Learning Process Questionnaire (OLPQ): Exploratory factor analysis with a Hong Kong Chinese caregiver sample

To determine factor loadings and their dimensions for the 67-item online questionnaire, EFA was performed on the caregiver data. Bartlett’s sphericity test yielded a significant result, χ2(2, 211) = 26466.38, p < .001, indicating that the intercorrelation matrix had sufficient common variance for the EFA analysis. The Kaiser-Meyer-Olkin value was .97, indicating that the factorable matrix had adequate sampling (Kaiser, 1958). By using eigenvalues larger than 1 and factor loadings greater than .4 as the criteria, 12 factors that explained 67.30% of the total variance were extracted. However, the 12th factor, which explained 1.30% of the variance and contained two questions, was removed because of multiple loadings on the factors. Seven further items were removed because of low factor loadings (smaller than .4). The final 11-factor structure, consisting of 58 items, explained 66.00% of the variance and demonstrated satisfactory fit to the data (Kline, 2014).
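Continuing the hypothetical sketch above, the eigenvalue and variance-explained figures reported here correspond to standard outputs of the fitted model:

```python
# Hypothetical continuation of the EFA sketch: eigenvalue criterion and
# variance explained, for a FactorAnalyzer fitted as shown earlier.
from factor_analyzer import FactorAnalyzer

fa = FactorAnalyzer(n_factors=11, method="principal", rotation="promax")
fa.fit(items)  # `items`: hypothetical DataFrame of the retained items

eigenvalues, _ = fa.get_eigenvalues()       # eigenvalue > 1 criterion
_, prop_var, cum_var = fa.get_factor_variance()
print((prop_var * 100).round(2))            # % of variance per factor
print((cum_var * 100).round(2))             # cumulative %; ~66% for 11 factors
```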

Table 2 summarizes the factor loadings of all items and their dimensions. Items from Learning Goals and Prerequisites in the preparatory phase were merged into one factor called Learning Aims. Items from Maintenance of Skills and Knowledge and Application of Skills and Knowledge in the transfer phase were merged into one factor called Application of Learning. For the remaining factors, the items matched their previously-assigned subscales, which showed that the convergent validity and dimensionality were acceptable. Figure 2 and Table 3 present the final 11-factor structure, the definitions of the subscales, and sample items, as well as the three subscales in the preparatory phase, six subscales in the performance phase, and two subscales in the transfer phase. The final items included in the questionnaire are presented in the Appendix. As shown in Table 2, factor loadings for Learning Aims, Environmental Structuring, and Learning Environment in the preparatory phase were between .41 and .80, between .65 and .84, and between .53 and .71, respectively. Factor loadings for the subscales in the performance phase were as follows: Time Management (.61–.87), Engagement in Learning Activities (.63–.90), Persistence (.51–.60), Interaction between Teachers and Students (.59–.89), Interaction among Students (.54–.91), and Feedback from the Interface (.76–.91). Factor loadings for Application of Learning and Meaning of Learning in the transfer phase were between .43 and .93 and between .47 and .81, respectively. No cross-loadings of the remaining items existed among the 11 factors, thus supporting the discriminant validity of the questionnaire.

Table 2. Factor loadings of rotated component matrix for caregiver sample (N = 474) and regression coefficients of items for caregiver and student samples (N = 219)

Item Factor loading (on its own factor) Regression coefficient: Caregivers Students
Factor order for the Eigenvalues, % of variance, and Cumulative % of variance rows: AL, ITS, LA, TM, ELA, IS, FI, ES, LE, ML, P
AL2 .93 .90 .89
AL3 .83 .85 .84
AL6 .82 .81 .83
AL1 .81 .88 .89
AL4 .81 .87 .83
AL5 .77 .81 .88
AL7 .56 .78 .79
AL8 .47 .71 .77
AL9 .43 .71 .80
ITS4 .89 .85 .86
ITS6 .87 .87 .85
ITS2 .86 .85 .88
ITS1 .80 .80 .87
ITS3 .69 .77 .81
ITS5 .59 .70 .80
LA2 .80 .85 .82
LA1 .78 .84 .80
LA4 .74 .86 .88
LA3 .74 .81 .85
LA5 .72 .86 .84
LA6 .49 .55 .70
LA7 .43 .57 .75
LA8 .41 .64 .75
TM2 .87 .81 .79
TM3 .85 .84 .86
TM4 .76 .83 .81
TM1 .70 .68 .82
TM5 .61 .70 .79
ELA3 .90 .82 .83
ELA1 .68 .87 .83
ELA2 .66 .84 .84
ELA4 .63 .75 .75
IS4 .91 .92 .90
IS1 .89 .87 .93
IS2 .88 .89 .94
IS3 .78 .80 .86
IS5 .54 .73 .87
FI1 .91 .89 .91
FI4 .86 .87 .87
FI2 .84 .87 .90
FI5 .82 .86 .91
FI3 .76 .88 .86
ES3 .84 .77 .90
ES4 .81 .85 .84
ES1 .77 .82 .85
ES2 .65 .63 .74
LE2 .71 .76 .81
LE4 .62 .76 .73
LE1 .61 .78 .74
LE3 .53 .57 .70
ML1 .81 .76 .79
ML2 .78 .87 .88
ML4 .68 .74 .85
ML3 .53 .78 .77
ML5 .47 .75 .83
P3 .60 .85 .74
P1 .53 .75 .77
P2 .51 .83 .87
Eigenvalues 29.43 3.66 3.19 2.62 2.47 1.99 1.81 1.66 1.58 1.37 1.06
% of variance 38.23 4.75 4.14 3.40 3.21 2.59 2.35 2.15 2.05 1.78 1.37
Cumulative % of variance 38.23 42.97 47.11 50.50 53.72 56.30 58.65 60.80 62.85 64.63 66.00

Note. Extraction method: principal component analysis; rotation method: Promax rotation with Kaiser normalization, rotation converged in 15 iterations. AL = application of learning, ITS = interaction between teachers and students, LA = learning aims, TM = time management, ELA = engagement in learning activities, IS = interaction among students, FI = feedback from the interface, ES = environmental structuring, LE = learning environment, ML = meaning of learning, P = persistence. The number under the component column indicates the item number. ps < .001 for all regression coefficients

Figure 2. An Overview of the Exploratory Three-phase Online Learning Model and Its Scales

Table 3. Each subscale’s description and its sample item for the Online Learning Process Questionnaire (OLPQ)

Learning phases and subscales Description Sample items
Learning preparatory phase
Learning aims Extent to which students are aware of the outline and knowledge required to complete online learning activities I am aware of the learning objective of online learning services/programs.
Environmental structuring Extent to which students can participate in online learning activities in a comfortable physical environment I am satisfied with the learning environment where I access online learning services.
Learning environment Extent to which students are at ease using the online learning platform The online learning platform(s) is(are) easy to use.
Learning performance phase
Time management Extent to which students can manage their time to complete online learning activities I can allocate sufficient time suitably for online learning activities.
Engagement in learning activities Extent to which students have attentive interest and participate actively in the online learning activities I am engaged in the online learning activities.
Persistence Extent to which students can overcome difficulties and continue to complete the online learning activities I can finish the required online learning activities even if I find the content challenging.
Interaction between teachers and students Extent to which students can interact with, seek help and receive feedback from teachers during the online learning process The interaction between me and my teachers is adequate during online learning activities.
Interaction among students Extent to which students can interact with and learn from their peers during online learning activities I can learn from my classmates during online learning activities.
Feedback from the interface Extent to which feedback from the learning platform is supportive and facilitates learning The feedback received from the learning platform(s) is helpful for me to learn.
Learning transfer phase
Application of learning Extent to which students can maintain the knowledge and skills learned, then apply them in real-life situations I feel satisfied when I can apply the skills and knowledge learned from online learning activities to other subjects.
Meaning of learning Extent to which students understand the aim of learning and facilitate their own academic learning process The skills and knowledge learned from online learning activities are meaningful to my personal life.

Verifying the 11-factor structure of the OLPQ: Confirmatory factor analysis with both caregiver and student samples

To verify the identified factor structure of the OLPQ, CFA was performed with the caregiver sample. As shown in Table 2, the significant regression coefficients ranged from .55 to .92, ps < .001. The model fit indices were acceptable, with χ2(1874, N = 474) = 4242.80 (p < .001), χ2/df = 2.26, SRMR = .051, CFI = .90, IFI = .91, and RMSEA = .05 (Hu & Bentler, 1999; Ullman & Bentler, 2013).

To cross-validate the dimensionality and relationships within the 58-item OLPQ, an 11-factor model was specified and tested using data from the student sample. As shown in Table 2, the regression coefficients of all items ranged from .70 to .94, ps < .001. The model fitted the data well, as indicated by all fit indices: χ2(1540, N = 219) = 2947.68 (p < .001), χ2/df = 1.91, SRMR = .05, CFI = .90, IFI = .89, and RMSEA = .07. This indicates that the 11-factor structure identified using data from the caregiver sample was stable and can be generalized to the student sample.
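As a concrete illustration of this CFA step, the following sketch uses the Python semopy package with lavaan-style model syntax; this is an assumption for illustration only, since the analyses cited above point to LISREL (Joreskog & Sorbom, 2018). Only two of the eleven factors are written out, and `items` is the hypothetical item-response DataFrame from the earlier sketches.

```python
# Minimal CFA sketch with semopy (illustrative stand-in for LISREL).
# Two of the eleven factors are spelled out; the remaining nine factors
# are defined in exactly the same way from their item columns.
import semopy

MODEL_DESC = """
LA =~ LA1 + LA2 + LA3 + LA4 + LA5 + LA6 + LA7 + LA8
P =~ P1 + P2 + P3
"""

model = semopy.Model(MODEL_DESC)
model.fit(items)                    # `items`: hypothetical DataFrame
print(semopy.calc_stats(model).T)   # chi2, df, CFI, RMSEA, among others
```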

Reliability and discriminant validity of OLPQ

To evaluate the internal consistency of all 58 items in the OLPQ for the caregiver sample, Cronbach’s alpha was calculated. The alpha coefficient for the 58-item questionnaire was .98, indicating excellent reliability and exceeding the minimum requirement of .70 (Hair et al., 2006). Table 4 shows the reliability coefficients of each subscale and their correlations. The reliability coefficients of the subscales, ranging from .81 to .95, can be classified as good to excellent (Cronbach, 1951). All of the correlations between the subscales (ranging from .02 to .67) and the mean correlations (ranging from .18 to .47) were lower than the reliability coefficients, suggesting that the subscales assessed distinct yet interrelated aspects of the online learning process.
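For reference, Cronbach’s alpha as reported here follows its standard definition and can be computed directly; `scale` is any DataFrame whose columns are the items of one scale (a hypothetical structure):

```python
# Cronbach's alpha from its standard definition: rows are respondents,
# columns are the items of one (sub)scale.
import pandas as pd

def cronbach_alpha(scale: pd.DataFrame) -> float:
    k = scale.shape[1]                              # number of items
    item_variances = scale.var(axis=0, ddof=1)      # per-item variances
    total_variance = scale.sum(axis=1).var(ddof=1)  # variance of sum scores
    return (k / (k - 1)) * (1 - item_variances.sum() / total_variance)

# e.g. cronbach_alpha(items) for the full 58-item OLPQ (reported ~.98),
# or cronbach_alpha(items[["P1", "P2", "P3"]]) for the Persistence scale.
```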

Table 4. Internal consistency reliability (Cronbach’s alpha coefficient) and discriminant validity (correlation with other scales) for caregiver sample (N = 474)

Scale Number of items Reliability Mean correlation Correlations
LA ES LE TM ELA P ITS IS FI AL ML
LA 8 .95 .45
ES 4 .85 .43 .65
LE 4 .81 .36 .38 .40
TM 5 .88 .37 .66 .59 .26
ELA 4 .89 .32 .40 .33 .32 .27
P 3 .85 .18 .13 .04 .34 .12 .18
ITS 6 .92 .44 .56 .53 .36 .46 .50 .20
IS 5 .92 .46 .55 .58 .50 .45 .46 .17 .59
FI 5 .94 .34 .38 .40 .32 .29 .23 .19 .43 .41
AL 9 .95 .47 .67 .64 .43 .59 .38 .25 .59 .60 .43
ML 5 .89 .18 .08 .16 .30 .02 .12 .18 .21 .24 .32 .16

Note. LA = learning aims, ES = environmental structuring, LE = learning environment, TM = time management, ELA = engagement in learning activities, P = persistence, ITS = interaction between teachers and students, IS = interaction among students, FI = feedback from the interface, AL = application of learning, ML = meaning of learning

The same psychometric analysis of the OLPQ was conducted using data from the student sample. Cronbach’s alpha coefficient of the 58-item questionnaire was .98, indicating excellent reliability. Table 5 shows the reliability coefficients of each subscale and their correlations for the student sample. The reliability coefficients of the subscales ranged from .83 to .95, which can be classified as good to excellent (Cronbach, 1951). All correlations between the subscales (ranging from .35 to .82) and the mean correlations (ranging from .49 to .69) were lower than the reliability coefficients, which indicates that the subscales assessed distinct yet interrelated aspects of the online learning process.
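The discriminant-validity criterion applied to both samples (every between-scale correlation lower than the scales’ reliability coefficients) can be sketched as follows; `subscales`, mapping each scale name to its item columns, is a hypothetical structure, and `cronbach_alpha` comes from the sketch above:

```python
# Discriminant-validity check: every between-scale correlation must fall
# below the reliability coefficients of both scales involved.
import itertools
import pandas as pd

def discriminant_validity_ok(items: pd.DataFrame,
                             subscales: dict[str, list[str]]) -> bool:
    # Scale scores as item means per respondent.
    scores = pd.DataFrame(
        {name: items[cols].mean(axis=1) for name, cols in subscales.items()}
    )
    corr = scores.corr()
    alpha = {name: cronbach_alpha(items[cols])
             for name, cols in subscales.items()}
    return all(
        corr.loc[a, b] < min(alpha[a], alpha[b])
        for a, b in itertools.combinations(subscales, 2)
    )
```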

Table 5. Internal consistency reliability (Cronbach’s alpha coefficient) and discriminant validity (correlation with other scales) for student sample (N = 219)

Scale Number of items Reliability Mean correlation Correlations
LA ES LE TM ELA P ITS IS FI AL ML
LA 8 .93 .62
ES 4 .83 .49 .46
LE 4 .83 .57 .56 .58
TM 5 .91 .56 .62 .46 .50
ELA 4 .89 .63 .66 .52 .64 .65
P 3 .84 .54 .48 .47 .62 .63 .63
ITS 6 .94 .62 .65 .52 .56 .52 .68 .55
IS 5 .95 .53 .62 .35 .48 .42 .52 .38 .72
FI 5 .95 .64 .67 .52 .57 .61 .66 .60 .73 .65
AL 9 .95 .69 .75 .58 .63 .65 .73 .61 .72 .62 .75
ML 5 .91 .60 .72 .46 .55 .55 .62 .47 .59 .56 .61 .82

Note. LA = learning aims, ES = environmental structuring, LE = learning environment, TM = time management, ELA = engagement in learning activities, P = persistence, ITS = interaction between teachers and students, IS = interaction among students, FI = feedback from the interface, AL = application of learning, ML = meaning of learning

Discussion

In this study, we developed and validated the OLPQ to assess the process of online learning at home from the perspectives of primary-school students and their caregivers. By assessing a large sample of primary-school students and their caregivers in Hong Kong, an 11-factor structure for the 58-item OLPQ was developed and verified. The final OLPQ consists of the subscales of (1) learning aims, (2) environmental structuring, (3) learning environment, (4) time management, (5) engagement in learning activities, (6) persistence, (7) interaction between teachers and students, (8) interaction among students, (9) feedback from the interface, (10) application of learning, and (11) meaning of learning under the preparatory, performance, and transfer phases. EFA conducted on the caregiver sample suggested an 11-factor structure that explained 66.00% of the variance, while CFA confirmed the factor structure for the caregiver sample and cross-validated it for the student sample. Discriminant validity was indicated by correlations between subscales that were lower than their internal consistencies for both the caregiver and student samples. The internal reliabilities of all 11 subscales of the OLPQ were acceptable to excellent for both samples.

In line with previous research (Anderson, 2008; Puustinen & Pulkkinen, 2001), the OLPQ clarifies that online learning involves the preparatory, performance, and transfer phases. The identification of these three phases of learning in the OLPQ not only supports the notion that different metacognitive skills and learning behaviors are required before, during, and after the learning task to achieve the learning outcomes (Puustinen & Pulkkinen, 2001), but also suggests that online learning is a continuous and dynamic process, so any effective evaluation tool should not be limited to the skills and outcomes of the performance phase (i.e., the learning activities).

More importantly, the development and validation of the OLPQ for primary-school students extends previous studies, which focused on online learning among college and postgraduate students who have better mastery of self-control and self-learning skills (e.g., Prior et al., 2016). Specifically, the OLPQ takes into consideration the characteristics of primary-school students, such as their relatively lower level of attention during online learning and their greater need for academic and behavioral support from teachers and caregivers (Lau & Lee, 2020). Furthermore, unlike most existing self-regulation questionnaires, which are primarily course-based (Jansen et al., 2017; Nguyen-Newby & Fraser, 2021), the OLPQ accounts for the higher prevalence of hybrid learning (i.e., the use of both face-to-face teaching and online learning) experienced by primary students by placing emphasis on their unique online learning context. Additionally, the OLPQ is the first to measure the online learning process from both the students’ and caregivers’ perspectives. As caregivers play an important role in the online learning of primary-school students, such as providing academic and technological support and monitoring the learning process (Lau & Lee, 2020), the validation of the OLPQ in both samples enables this assessment tool to be comprehensive in evaluating the online learning process among primary-school students.

Educational and theoretical implications

The development of the OLPQ has far-reaching implications. First, the OLPQ can be easily adopted by schools as a tool for teachers to better understand students’ learning processes and the potential challenges that students might encounter. This is critically important because most online learning takes place outside school and involves a lower social presence from teachers (Richardson et al., 2017). Because the OLPQ is structured around the three learning phases, teachers can focus on the processes with which they are most concerned. Specifically, previous research has reported that learning preparation, students’ interaction during online learning, and their engagement in learning activities are harder to observe during online learning (Appana, 2008). By utilizing the OLPQ, teaching staff and schools can be more involved in evaluating how teacher-designed online learning materials and resources are implemented at home. Additionally, based on findings from the OLPQ, teachers can obtain professional advice to enhance students’ online learning at home.

From a caregiver’s perspective, the OLPQ is an important platform for expression. Given the critical role of caregivers in supporting their young students’ learning at home and the increasing importance of their connection with schools, it is of great significance to include their genuine feedback on how online learning is implemented at home (Lau & Lee, 2020; Ma et al., 2016). By discussing findings from the OLPQ together, caregivers become more involved and can feel that their children’s development at school is prioritized by school staff (Chenhall et al., 2011; Choi, 2017). Taken together, one of the practical implications of the OLPQ is the inclusion of caregivers in the evaluation process, thereby enhancing school-caregiver relationships.

From a research perspective, educational researchers can utilize results of the OLPQ to develop programs that enhance the effectiveness of online learning at home. As the process of online learning is also related to students’ learning attitudes, motivation, and learning experience (Hsu et al., 2009), the OLPQ promotes educators’ comprehensive understanding of students’ learning and provides insight into how online learning can be modified. While the OLPQ was developed in the context of COVID-19, continued research on online learning is much needed so that it can become more effective for students (Lau & Lee, 2020; Lestari & Gunawan, 2020; Picciano & Seaman, 2007).

Limitations and future directions

Despite its significant theoretical and educational implications, three limitations of the OLPQ concern the sample, the lack of test-retest reliability, and other potential factors. Specifically, the OLPQ was validated only with Hong Kong primary-school students and their caregivers. Given the rise of online learning, the questionnaire would benefit from validation among secondary-school students and in other places. Also, given time constraints, no test-retest reliability was established for the OLPQ. Future research should extend this work by examining the measurement stability, test-retest correlation coefficients, and coefficient alpha of the OLPQ. Additionally, several other factors, such as age, socioeconomic status, online learning methods, special educational needs, and online learning attitudes, might affect the process of online learning (Lam & Tong, 2022). Future research could explore the roles of these factors by testing samples of different ages.

Conclusion

This study involved developing and validating the OLPQ for assessing the process of online learning at home for both primary-school students and their caregivers. The 58-item questionnaire consisted of 11 factors: (1) learning aims, (2) environmental structuring, (3) learning environment, (4) time management, (5) engagement in learning activities, (6) persistence, (7) interaction between teachers and students, (8) interaction among students, (9) feedback from the interface, (10) application of learning, and (11) meaning of learning under three phases of learning, namely, the preparatory phase, the performance phase, and the transfer phase. The 11 factors across the three phases of learning were validated for both the caregiver and student samples with good to excellent reliability. The OLPQ is the first instrument to evaluate the online learning process from both students’ and caregivers’ perspectives. Schools can adopt this tool and administer it to caregivers and students to gather information on how to adjust the implementation of online learning to enhance students’ learning experiences and effectiveness.

Acknowledgements

We thank Joanna Lee and Nicole Law for their assistance in translating the questionnaire. We also thank Justine Wai for English editing and proofreading.

Appendix

The Online Learning Process Questionnaire (OLPQ) in English and Chinese.

  • A)

    The Preparatory Phase of Learning.

    • I.
      Learning aims.
      1. I am aware of the learning objectives of online learning services/programs.
        我意識到網上學習的學習目標。
      2. I understand the learning objectives of online learning services/programs.
        我明白網上學習的學習目標。
      3. The learning objectives are appropriate for online learning services/programs.
        網上學習的學習目標是合適的。
      4. The learning objectives help me understand the content of online learning services/programs.
        學習目標幫助我理解網上學習的内容。
      5. I am confident that I will achieve the learning goals after engaging in online learning services.
        在進行網上學習後, 我有信心可以達到學習目標。
      6. The prerequisite questions or learning activities are provided in the online learning services.
        網上學習會提供預習問題或學習活動。
      7. The prerequisite questions or learning activities enhance the effectiveness of online learning.
        預習問題或學習活動能提高網上學習的效率。
      8. I will review the prerequisite skills and knowledge before I start online learning activities, if necessary.
        當我進行網上學習前, 如有需要, 我會預習相關技巧和知識。
    • II.
      Environmental structuring.
      1. I can find a place where I can concentrate when accessing online learning services.
        我找到一個可以專心進行網上學習的地方。
      2. I have a regular place for engaging in online learning services.
        我有固定的地方來進行網上學習。
      3. The place for online learning services has minimal distractions.
        在我進行網上學習的地方有著最少的干擾。
      4. I am satisfied with the learning environment where I access online learning services.
        我對進行網上學習時的學習環境感到滿意。
    • III.
      Learning environment.
      1. The online learning platform(s) is(are) easy to use.
        網上學習平台容易使用。
      2. The layout of the online learning platform(s) is(are) organized.
        網上學習平台的設計是有條理的。
      3. The online learning platform(s) has(have) minimally excessive sounds or graphical information.
        網上學習平台有著最少多餘的聲音或影像。
      4. I am capable of using different online learning platform(s).
        我能使用不同的網上學習平台。
  • B)

    The Performance Phase of Learning.

    • I.
      Time management.
      1. I know how much time I need to spend on online learning services.
        我知道我在網上學習需要花了多少時間。
      2. I can allocate enough time for online learning services without clashing with other activities.
        我可以分配足夠的時間給網上學習, 而且不會與其他活動的時間有衝突。
      3. I can allocate sufficient time suitably for online learning activities.
        我可以分配合適的時間用來進行網上學習。
      4. It is easy for me to schedule my time to access online learning services.
        分配時間進行網上學習, 對我而言是一件容易的事。
      5. I can complete online learning activities on time.
        我能夠準時完成網上學習的活動。
    • II.
      Engagement in learning activities.
      1. I am engaged in the online learning activities.
        我投入於網上學習。
      2. I feel a sense of participation while completing online learning activities.
        當我進行網上學習時, 我感受到“參與”的感覺。
      3. I feel that I can concentrate during online learning activities.
        我專注於進行網上學習。
      4. I am eager to learn during online learning activities.
        我渴望在進行網上學習時學習。
    • III.
      Persistence.
      1. I can finish the required online learning activities even if I do not like the content.
        即使我不喜歡其中的内容, 我也能完成網上學習。
      2. I can finish the required online learning activities even if I find the content challenging.
        即使我認爲内容具有挑戰性, 我也能完成網上學習。
      3. I find ways to force myself to complete the online learning activities.
        我會用不同方法去迫使自己完成網上學習。
    • IV.
      Interaction between teachers and students.
      1. I am able to interact with my teachers during online learning activities.
        在進行網上學習時, 我能與老師互動。
      2. The interaction between me and my teachers is adequate during online learning activities.
        在進行網上學習時, 我與老師有足夠的互動。
      3. The interaction between me and my teachers can facilitate online learning.
        我與老師的互動能促進網上學習。
      4. I can seek help from my teachers during online learning activities whenever I need it.
        在進行網上學習時, 我能尋求老師的幫助。
      5. I know how to seek help from my teachers during online learning activities.
        在進行網上學習時, 我知道怎樣可以尋求老師的幫助。
      6. My teachers can provide timely responses to my questions during online learning activities.
        在進行網上學習時, 老師能適時回應我的問題。
    • V.
      Interaction among students.
      1. I am able to interact with my classmates during online learning activities.
        在進行網上學習時, 我能與同學互動。
      2. The interaction between me and my classmates is adequate during online learning activities.
        在進行網上學習時, 我與同學有足夠的互動。
      3. I enjoy my interaction with classmates during online learning activities.
        在進行網上學習時, 我享受與同學的互動。
      4. It is easy to interact with my classmates during online learning activities.
        在進行網上學習時, 我能容易地與同學互動。
      5. I can learn from my classmates during online learning activities.
        在進行網上學習時, 我能夠從同學身上學習。
    • VI.
      Feedback from the interface.
      1. I am able to receive feedback from the learning platform(s) during online learning activities.
        在進行網上學習時, 我能夠收到學習平台給我的回饋意見。
      2. The feedback received from the learning platform(s) is easy to find during online learning activities.
        在進行網上學習時, 我能輕易找到學習平台給我的回饋意見。
      3. The feedback received from the learning platform(s) is adequate for me to continue to learn.
        學習平台給我的回饋意見, 足夠讓我繼續學習。
      4. The feedback received from the learning platform(s) is helpful for me to learn.
        學習平台給我的回饋意見, 對於我的學習是有幫助的。
      5. The feedback received from the learning platform(s) allows me to further study the topic.
        學習平台給我的回饋意見, 讓我繼續深入學習某個課題。
  • C)

    The Transfer Phase of Learning.

    • I.
      Application of learning.
      1. I can apply the skills and knowledge learned from online learning activities to other subjects.
        我能將從網上學習中所學習到的技巧和知識應用到其他科目。
      2. I can apply the skills and knowledge learned from online learning activities to daily life.
        我能將從網上學習中所學習到的技巧和知識應用到日常生活。
      3. It is easy to apply the skills and knowledge learned from online learning activities to other subjects.
        將我從網上學習所學習到的技巧和知識應用到其他科目是一件容易的事。
      4. It is easy to apply the skills and knowledge learned from online learning activities to daily life.
        將我從網上學習所學習到的技巧和知識應用到日常生活是一件容易的事。
      5. I feel satisfied when I can apply the skills and knowledge learned from online learning activities to other subjects.
        當我將從網上學習所學習到的技巧和知識應用到其他科目, 我會感到滿足。
      6. I feel satisfied when I can apply the skills and knowledge learned from online learning activities to daily life.
        當我將從網上學習所學習到的技巧和知識應用到日常生活, 我會感到滿足。
      7. I know what I have learned through online learning activities.
        我知道我從網上學習中學習到什麽。
      8. I can recall the skills and knowledge learned from online learning activities via verbal reminders.
        只要有口頭的提示, 我就能記起我從網上學習中所學習到的技巧和知識。
      9. I can use the skills and knowledge learned from online learning activities in assessments and examinations.
        我能夠將我從網上學習中所學習到的技巧和知識運用於評估和考試。
    • II.
      Meaning of learning.
      1. Learning online broadens my horizon.
        網上學習拓闊了我的視野。
      2. I feel more motivated to learn when it is done online.
        網上學習能推動我學習。
      3. Learning online makes me an independent learner.
        網上學習令我獨立學習。
      4. The skills and knowledge learned from online learning activities are meaningful to my personal life.
        我從網上學習中所學習到的技巧和知識, 對於我的個人生活是有幫助。
      5. I understand how I learn best when learning online.
        我明白怎樣才能在網上學習裏學習得最好。

Funding

This research was supported in part by the Funding Programme for Research Projects on Equal Opportunities 2020/21 (R-2020/21-111) granted by the Equal Opportunities Commission, and by the Research Fellow Scheme (RFS2021-7H05) and General Research Fund (17609518, 17620520) granted by the Research Grants Council, Hong Kong SAR, to Dr. Shelley Xiuli Tong.

Code availability

Not applicable.

Availability of data and material

Derived data supporting the findings of this study are available from the corresponding author upon request.

Declarations

Conflict of interest

The authors declare no conflict of interest.

Ethics approval

This study was granted ethics approval by the Human Research Ethics Committee, the University of Hong Kong.

Consent to participate

Informed online written consent was obtained from the parents for their own and their children’s participation in the study before testing began.

Consent for publication

The authors give their consent for this submitted manuscript to be published in the journal.

Footnotes

Publisher’s note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

References

  1. Allen, I. E., & Seaman, J. (2014). Grade change: Tracking online education in the United States. Babson Survey Research Group. https://eric.ed.gov/?id=ED602449
  2. An SK, Lee D. An integrated model of parental mediation: The effect of family communication on children’s perception of television reality and negative viewing effects. Asian Journal of Communication. 2010;20(4):389–403. doi: 10.1080/01292986.2010.496864
  3. Anderson, T. (2008). The theory and practice of online learning (2nd ed.). Athabasca University (AU) Press.
  4. Appana S. A review of benefits and limitations of online learning in the context of the student, the instructor and the tenured faculty. International Journal on E-learning. 2008;7(1):5–22.
  5. Armitage R, Nellums LB. Considering inequalities in the school closure response to COVID-19. The Lancet Global Health. 2020;8(5):e644. doi: 10.1016/S2214-109X(20)30116-9
  6. Barnard L, Lan WY, To YM, Paton VO, Lai SL. Measuring self-regulation in online and blended learning environments. The Internet and Higher Education. 2009;12(1):1–6. doi: 10.1016/j.iheduc.2008.10.005
  7. Chang V, Fisher DL. The validation and application of a new learning environment instrument for online learning in higher education. In: Khine MS, Fisher DL, editors. Technology-rich learning environments: A future perspective. Singapore: World Scientific Publishing; 2003. pp. 1–20.
  8. Chenhall, R. D., Holmes, C., Lea, T., Senior, K., & Wegner, A. (2011). Parent-school engagement: Exploring the concept of ‘invisible’ indigenous parents in three North Australian school communities. The Northern Institute, Charles Darwin University. https://ro.uow.edu.au/cgi/viewcontent.cgi?article=2457&context=sspapers
  9. Cho MH, Cho Y. Self-regulation in three types of online interaction: A scale development. Distance Education. 2017;38(1):70–83. doi: 10.1080/01587919.2017.1299563
  10. Cho MH, Demei S, Laffey J. Relationships between self-regulation and social experiences in asynchronous online learning environments. Journal of Interactive Learning Research. 2010;21(3):297–316.
  11. Choi JA. Why I’m not involved: Parental involvement from a parent’s perspective. Phi Delta Kappan. 2017;99(3):46–49. doi: 10.1177/0031721717739593
  12. Cronbach LJ. Coefficient alpha and the internal structure of tests. Psychometrika. 1951;16(3):297–334. doi: 10.1007/BF02310555
  13. Dhawan S. Online learning: A panacea in the time of COVID-19 crisis. Journal of Educational Technology Systems. 2020;49(1):5–22. doi: 10.1177/0047239520934018
  14. Hair, J. F., Black, W. C., Babin, B. J., Anderson, R. E., & Tatham, R. (2006). Multivariate data analysis. Pearson Prentice Hall.
  15. Hsu MK, Wang SW, Chiu KK. Computer attitude, statistics anxiety and self-efficacy on statistical software adoption behavior: An empirical study of online MBA learners. Computers in Human Behavior. 2009;25(2):412–420. doi: 10.1016/j.chb.2008.10.003
  16. Hu LT, Bentler PM. Cutoff criteria for fit indexes in covariance structure analysis: Conventional criteria versus new alternatives. Structural Equation Modeling: A Multidisciplinary Journal. 1999;6(1):1–55. doi: 10.1080/10705519909540118
  17. Jansen RS, Van Leeuwen A, Janssen J, Kester L, Kalz M. Validation of the self-regulated online learning questionnaire. Journal of Computing in Higher Education. 2017;29(1):6–27. doi: 10.1007/s12528-016-9125-x
  18. Joreskog, K., & Sorbom, D. (2018). LISREL 10.1. Scientific Software International.
  19. Kaiser HF. The varimax criterion for analytic rotation in factor analysis. Psychometrika. 1958;23(3):187–200. doi: 10.1007/BF02289233
  20. Kline, P. (2014). An easy guide to factor analysis. Routledge.
  21. Lam, J. H. Y., & Tong, S. X. (2022). Development and validation of the online learning attitude questionnaire (OLAQ) among primary school children and caregivers. Interactive Learning Environments, 1–15. doi: 10.1080/10494820.2022.2043911
  22. Lau EYH, Lee K. Parents’ views on young children’s distance learning and screen time during COVID-19 class suspension in Hong Kong. Early Education and Development. 2020;33:1–18. doi: 10.1080/10409289.2020.1843925
  23. Lestari PAS, Gunawan G. The impact of COVID-19 pandemic on learning implementation of primary and secondary school levels. Indonesian Journal of Elementary and Childhood Education. 2020;1(2):58–63.
  24. Livingstone, S., & Bober, M. (2004). UK children go online: Surveying the experiences of young people and their parents. LSE Research Centre. http://eprints.lse.ac.uk/395/
  25. Ma X, Shen J, Krenn HY, Hu S, Yuan J. A meta-analysis of the relationship between learning outcomes and parental involvement during early childhood education and early elementary education. Educational Psychology Review. 2016;28(4):771–801. doi: 10.1007/s10648-015-9351-1
  26. Moore MG. Editorial: Three types of interaction. American Journal of Distance Education. 1989;3:1–7. doi: 10.1080/08923648909526659
  27. Nguyen-Newby TH, Fraser BJ. Computer laboratory workshops as learning environments for university business statistics: Validation of questionnaires. Learning Environments Research. 2021;24:389–407. doi: 10.1007/s10984-020-09324-z
  28. Nicpon MF, Huser L, Blanks EH, Sollenberger S, Befort C, Kurpius SER. The relationship of loneliness and social support with college freshmen’s academic performance and persistence. Journal of College Student Retention: Research, Theory & Practice. 2006;8(3):345–358. doi: 10.2190/A465-356M-7652-783R
  29. Picciano, A. G., & Seaman, J. (2007). K-12 online learning: A survey of US school district administrators. Sloan Consortium.
  30. Prior DD, Mazanov J, Meacheam D, Heaslip G, Hanson J. Attitude, digital literacy and self-efficacy: Flow-on effects for online learning behavior. The Internet and Higher Education. 2016;29:91–97. doi: 10.1016/j.iheduc.2016.01.001
  31. Puustinen M, Pulkkinen L. Models of self-regulated learning: A review. Scandinavian Journal of Educational Research. 2001;45(3):269–286. doi: 10.1080/00313830120074206
  32. Richardson JC, Maeda Y, Lv J, Caskurlu S. Social presence in relation to students’ satisfaction and learning in the online environment: A meta-analysis. Computers in Human Behavior. 2017;71:402–417. doi: 10.1016/j.chb.2017.02.001
  33. Shen D, Cho MH, Tsai CL, Marra R. Unpacking online learning experiences: Online learning self-efficacy and learning satisfaction. The Internet and Higher Education. 2013;19:10–17. doi: 10.1016/j.iheduc.2013.04.001
  34. Straub DW. Validating instruments in MIS research. MIS Quarterly. 1989;13(2):147–169. doi: 10.2307/248922
  35. Tabachnick, B. G., & Fidell, L. S. (2013). Using multivariate statistics: Pearson new international edition. Pearson.
  36. Toven-Lindsey B, Rhoads RA, Lozano JB. Virtually unlimited classrooms: Pedagogical practices in massive open online courses. The Internet and Higher Education. 2015;24:1–12. doi: 10.1016/j.iheduc.2014.07.001
  37. Trochim, W. M., & Donnelly, J. P. (2001). Research methods knowledge base (2nd ed.). Atomic Dog Publishing.
  38. Ullman, J. B., & Bentler, P. M. (2013). Structural equation modeling. In J. A. Schinka, W. F. Velicer, & I. B. Weiner (Eds.), Handbook of psychology: Research methods in psychology (pp. 661–690). John Wiley & Sons.
  39. Walker SL, Fraser BJ. Development and validation of an instrument for assessing distance education learning environments in higher education: The Distance Education Learning Environments Survey (DELES). Learning Environments Research. 2005;8(3):289–308. doi: 10.1007/s10984-005-1568-3
