Abstract
Online learning has been experienced more acutely and affectively by students during the COVID‐19 pandemic. A hierarchical regression analysis found that age, program type, and the online learning environment each significantly and uniquely contributed to counselor trainees’ satisfaction with online learning. These results shed light on counselor training, particularly in the context of the pandemic.
Keywords: counselor trainees, COVID‐19 pandemic, hierarchical regression, online learning, satisfaction
INTRODUCTION
The expansion and embedding of digital technologies in education are not new or unique, but they are being experienced more acutely and affectively by students and educators across the world due to COVID‐19 (Williamson et al., 2020). After the spring break of 2020, many in‐person counseling programs, like others in higher education, suspended their face‐to‐face classes and adopted an emergency remote learning mode (Ghazi‐Saidi et al., 2020). This swift transition unfolded in tandem with the steady proliferation of online counseling programs (those offering 50% or more of the curriculum online), a trend that began well before the pandemic (Li & Su, 2021). The confluence of these two change processes—the forced, abrupt transition to remote learning and the incremental growth of online counseling programs—delineated the new landscape of counselor training (Harrison, 2021).
Counseling students are active participants in online learning, but how they perceive and experience this modality has been understudied (Li & Su, 2021). The prevalence of computer‐mediated learning due to COVID‐19 has made it all the more imperative to extend this line of inquiry. Despite a recent uptick in the counseling literature on online learning, journal articles have been mainly conceptual in nature (e.g., Harrison, 2021; Wasik et al., 2019), studied through a qualitative lens (e.g., Bender & Werries, 2021; Holmes et al., 2020), or explored from counselor educators’ perspective (e.g., Hinkley et al., 2021). Yet the broader education literature suggests that students’ perception of online learning plays a pivotal role in their learning outcomes, including motivation and persistence (Kauffman, 2015). To bridge this gap, a hierarchical regression analysis was conducted to examine the extent to which three layers of factors (age, program type, online learning environment) uniquely account for counselor trainees’ satisfaction with online learning, in the hope of informing curricular, programmatic, and institutional practices that may promote students’ online learning.
It is worth noting that scholars have not reached consensus on a unified definition of online learning, distance learning, or e‐learning (J. L. Moore et al., 2011). In counseling, the term online denotes a wide range of computer‐mediated instruction modes (e.g., remote, distance, technology, hybrid, cyber, and internet; Li & Su, 2021). To ensure clarity and consistency, online in this study broadly refers to a modality where “educators and students communicate with each other synchronously (i.e., at the same time) or asynchronously (i.e., at different times) via telecommunications technology in formats ranging from a small online component in teaching and learning to an entire program offered online” (Li & Su, 2021, p. 317). To this end, counseling students from online/distance programs, as well as those from in‐person programs who experienced remote learning due to the pandemic, were eligible to participate in this study.
Students’ satisfaction with online learning
Students’ satisfaction influences their persistence, retention, motivation, and success; satisfaction‐related studies have thus long been part of the higher education research landscape (Richardson et al., 2017). The same holds for online education, where students’ satisfaction is often a significant predictor of their learning outcomes (e.g., Eom et al., 2006). Nevertheless, counselor trainees’ satisfaction with online learning has been relatively understudied, a natural consequence of the overall paucity of literature on online teaching and learning in counselor education (Barrio Minton & Hightower, 2020; La Guardia, 2021; Li & Su, 2021). It has also not been uncommon to see the construct of satisfaction examined with self‐constructed measures (e.g., Furlonger & Gencic, 2014). While one could argue that satisfaction can be a process indicator, it can also be an outcome measure, consistent with most counseling and supervision literature (e.g., Furlonger & Gencic, 2014; Li et al., 2021). In this study, student satisfaction is treated as an affective indicator of student enjoyment of online learning (Walker & Fraser, 2005) and is measured by eight validated items in Walker's (2020) attitudinal scale of enjoyment.
Kauffman's (2015) review suggested that online learning is comparable to face‐to‐face learning in terms of learning effectiveness, but the two differ in student satisfaction, with lower ratings for online courses. Such a pattern of comparable learning outcomes has been consistently identified in the training of counselors (Furlonger & Gencic, 2014). The broader education literature has also evidenced learner, instructor, and course characteristics as related to students’ perceived satisfaction in online learning contexts, such as self‐motivation, learning styles, instructor knowledge/facilitation, instructor feedback, interaction/dialogue, and course design/structure (e.g., Eom et al., 2006; Eom & Ashill, 2016). However, these studies targeted both undergraduate and graduate students across areas of study at large, which may render limited implications for understanding students’ satisfaction with online learning in the counseling field, which has its own unique characteristics (e.g., the “high‐touch” nature of counseling), particularly as complicated by COVID‐19.
Student characteristics and program type
Age, accessibility, cost, location, and time were once, and may still be, hurdles to adults seeking further education (Renfro‐Michel et al., 2010). But many graduate students nowadays are nontraditional adult learners who may work full‐time while completing their degrees (Renfro‐Michel et al., 2010). Online learning has commonly been the educational mode of choice for working adults in the 25–50 age range (M. G. Moore & Kearsley, 2005). Online counseling programs can provide more flexibility for working adults and caregivers who may want to study counseling but are constrained by complex responsibilities, schedules, and mobility‐related issues such as disabilities and travel difficulties (Snow & Coker, 2020). In addition, one of the four main assumptions of andragogy (Knowles, 1973) suggests that as people grow and mature, their self‐concept moves from one of total dependency to one of increasing self‐directedness. Reasonably, the accessibility, flexibility, and self‐directedness of online learning can enhance adult learners’ satisfaction if they deem such a modality a good fit; but if students are forced to adopt this learning mode, their satisfaction may suffer.
COVID‐19 has caused an unprecedented disruption in higher education. In March 2020, many brick‐and‐mortar higher education institutions were forced to adopt remote teaching and learning (Ghazi‐Saidi et al., 2020). This unexpected, sudden transition incurred confusion, chaos, and frustration for students, educators, and academic administrations (Ghazi‐Saidi et al., 2020). What made the transition even more challenging was that continuing studies online was not a choice (Ghazi‐Saidi et al., 2020). While the education accessibility that online learning affords amid the pandemic is widely acknowledged, it is not uncommon for students to undergo hardships when they perceive the online modality as a mismatch for them (Li et al., under review). This abrupt change has been especially difficult for applied education programs (e.g., counseling, psychology, social work) due to the lack of easily accessible infrastructure to ensure all courses can be taught and standards met online in a secure and confidential manner (Christian et al., 2021). But whether and to what extent this sudden change contributes to counselor trainees’ satisfaction with online learning warrants empirical investigation.
The online learning environment
There are myriad ways to describe and define the scope and characteristics of an online learning environment, which can refer to a course, a program, a learning object, a learning management system (e.g., Canvas, Blackboard), a type of learning based on its design methodology (e.g., instructor‐led, self‐paced; J. L. Moore et al., 2011), or, broadly, all elements associated with online learning. In counseling, the term online learning environment often refers to the online learning modality in general, without an operational definition. Nonetheless, some elements stand out as pivotal to an online learning environment, such as community building (Wasik et al., 2019), virtual classroom safety (Chen et al., 2020), and a higher level of structure and effective classroom management skills (Bender & Werries, 2021), which are co‐constructed and maintained by students and instructors and may affect students’ satisfaction with online learning. In this study, the online learning environment is confined to its psychosocial aspect—the students and instructors who constitute the major participants in this environment. It is measured by six factors: instructor support, student interaction and collaboration, personal relevance, authentic learning, active learning, and student autonomy (Walker & Fraser, 2005). These factors have been found to be significantly associated with students’ satisfaction with online learning (e.g., Walker & Fraser, 2005).
Purpose of the study
Despite the rapid proliferation of online programs and computer‐mediated learning in counseling, research targeting students’ experiences with and perceptions of online learning has lagged significantly. To fill this gap, a hierarchical regression analysis was performed to examine three layers of factors (age, program type, online learning environment) that may relate to counselor trainees’ satisfaction with online learning, particularly amid the pandemic. The aim is to assist counselor educators and clinical supervisors in designing and teaching quality online courses and in guiding students to select appropriate formats of counselor education based on their attributes and preferred learning styles.
METHOD
Participants
The study drew a large national sample of 397 counselor trainees across the United States. Of the 397 participants, 342 (86.15%) self‐identified as female, 47 (11.84%) as male, six (1.51%) indicated other (e.g., nonbinary, transgender), and two (0.50%) preferred not to answer. In this sample, 278 (70.03%) reported themselves as White, 36 (9.07%) as Hispanic or Latino/a, 28 (7.05%) as Black or African American, 22 (5.54%) as Asian, and 20 (5.04%) as two or more races. The average age was 31.04 years (SD = 9.52). The majority (n = 237, 59.70%) were in the 20–29 age range, 86 (21.66%) were 30–39, and 74 (18.64%) were 40 or older. Most counselor trainees (n = 347, 87.41%) were at the master's level, and 46 (11.59%) were doctoral students. Of the 397, 200 (50.38%) reported their instruction mode prior to COVID‐19 as in‐person, 180 (45.34%) as online, and 17 (4.28%) as other.
Procedure
Upon receiving university institutional review board approval, the author adopted various means to recruit participants: (a) e‐mailing the Council for Accreditation of Counseling and Related Educational Programs (CACREP) liaisons of more than 800 counseling programs for recruitment distribution and (b) posting recruitment announcements via professional networks, including the COUNSGRADS Listserv, ACA Connect, CESNET‐L, and DIVERSEGRAD‐L. To be eligible, participants had to be (a) at least 18 years of age; (b) currently enrolled in a counseling program in the United States; and (c) experienced with online learning, whether from an online program or from remote learning due to COVID‐19. Over the 2 months of active recruitment in Fall 2020, the Qualtrics study link was accessed 457 times. Participants who did not complete the Distance Education Learning Environments Survey (DELES; Walker, 2020), which includes the core variables examined, were excluded, leaving a total of 397 valid surveys. The estimated response rate was less than 1%, given a total of 53,000 student enrollments in 2018 (CACREP Annual Report, 2019), a figure that does not include enrollment in non‐CACREP‐accredited programs.
Measures
Demographic questionnaire
A self‐designed demographic questionnaire was used to collect basic, non‐identifiable information about participants to describe the sample composition. It included gender, race/ethnicity, age, training level, and program type (the instruction mode prior to COVID‐19).
The Distance Education Learning Environments Survey
The DELES (Walker, 2020) was used to study the psychosocial learning environment in post‐secondary distance education. While it was normed on an international sample (e.g., the United States, Australia, Canada, New Zealand) at both graduate and undergraduate levels, a further breakdown of the demographic composition would provide more psychometric information about this instrument as applied to multicultural populations. The DELES comprises seven scales, with the first six measuring the distance education learning environment and the last gauging students’ affective trait (enjoyment of, or satisfaction with, online learning). Participants responded to each item on a five‐point Likert scale (1 = never, 2 = seldom, 3 = sometimes, 4 = often, 5 = always). The instructor support scale has eight items (score range 8–40); a sample item is “In this class, if I have an inquiry, the instructor finds time to respond.” The student interaction and collaboration scale includes six items (score range 6–30); a sample item is “In this class, I work with others.” The personal relevance scale has seven items (score range 7–35); a sample item is “In this class, I can relate what I learn to my life outside of university.” The authentic learning scale includes five items (score range 5–25); a sample item is “In this class, I study real cases related to the class.” The active learning scale has three items (score range 3–15); a sample item is “In this class, I explore my own strategies for learning.” The student autonomy scale includes five items (score range 5–25); a sample item is “In this class, I make decisions about my learning.” The enjoyment scale has eight items (score range 8–40); a sample item is “Distance education is stimulating.”
Since the six scales measure distinct constructs (Walker & Fraser, 2005), the scoring key (Walker, 2020) directs that each participant's DELES scale scores be calculated and interpreted separately rather than combined into a composite DELES score. Accordingly, in this study, participants’ responses to the six psychosocial scales were used as independent variables (grouped under the online learning environment), and their responses to the affect scale served as the dependent variable. The Cronbach's alphas for the six psychosocial scales in the norming sample were 0.87, 0.94, 0.92, 0.89, 0.75, and 0.79, respectively; the corresponding alphas in the current study were 0.91, 0.91, 0.93, 0.89, 0.75, and 0.83. The Cronbach's alpha of the affect scale was 0.95 for both the original sample (Walker & Fraser, 2005) and the current study. Walker and Fraser (2005) investigated construct validity using principal component factor analysis with varimax rotation and Kaiser normalization; their Kaiser–Meyer–Olkin measure of sampling adequacy was 0.91, supporting the appropriateness of factor analysis for the sample data (Walker, 2020). In this study, the means of the six psychosocial scales and the affect scale were 4.12 (SD = 0.77), 3.69 (SD = 0.94), 3.96 (SD = 0.82), 3.99 (SD = 0.81), 3.95 (SD = 0.71), 4.17 (SD = 0.65), and 2.83 (SD = 1.01), respectively.
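Internal-consistency coefficients of this kind can be reproduced directly from raw item responses. A minimal sketch of the standard Cronbach's alpha formula, using a made-up 4-item response matrix rather than the study's data:

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """Cronbach's alpha for an (n_respondents, n_items) response matrix."""
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)      # variance of each item
    total_var = items.sum(axis=1).var(ddof=1)  # variance of total scores
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

# Hypothetical responses from six students to a 4-item, 5-point scale
responses = np.array([
    [4, 5, 4, 4],
    [3, 3, 3, 4],
    [5, 5, 4, 5],
    [2, 2, 3, 2],
    [4, 4, 5, 4],
    [3, 4, 3, 3],
])
print(round(cronbach_alpha(responses), 2))  # high alpha: items move together
```

Each of the seven DELES scales would be passed through such a function separately, consistent with the per-scale scoring described above.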
Data analysis
The assumptions of linearity and homoscedasticity were checked by inspecting the residual plot, which showed no systematic relationship between the errors in the model and what the model predicts (Field, 2017). The independence assumption was considered met because participants completed the survey independently, so the errors in the model were not related to each other. The assumption of normality was assessed by performing Kolmogorov–Smirnov and Shapiro–Wilk tests (Field, 2017). A total of 24 missing values (0.1%) were scattered across different DELES items; these were replaced through regression imputation in SPSS Amos 28 (Arbuckle, 2021).
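These diagnostics can also be scripted outside SPSS. A sketch using simulated residuals and fitted values (scipy's `shapiro` and `kstest` stand in for the SPSS procedures; the data here are simulated, not the study's):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
residuals = rng.normal(0.0, 1.0, size=397)  # simulated model errors
fitted = rng.uniform(1.0, 5.0, size=397)    # simulated predicted values

# Normality checks: Shapiro-Wilk and Kolmogorov-Smirnov against a normal
# with estimated parameters (strictly, the Lilliefors correction applies
# when parameters are estimated from the same data)
sw_stat, sw_p = stats.shapiro(residuals)
ks_stat, ks_p = stats.kstest(
    residuals, "norm", args=(residuals.mean(), residuals.std(ddof=1))
)

# Rough homoscedasticity check: the spread of the errors should not
# change with the predictions, so |residuals| and fitted values should
# be essentially uncorrelated
r, r_p = stats.pearsonr(np.abs(residuals), fitted)

print(f"Shapiro-Wilk p = {sw_p:.3f}, KS p = {ks_p:.3f}, |resid| vs fitted r = {r:.3f}")
```

Non-significant test results and a near-zero correlation would support the normality and homoscedasticity assumptions, respectively.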
Hierarchical regression was deemed an appropriate analytical method given its “ability to examine the significance in the incremental increases in R² when more than one predictor is of interest or a set of predictors that share some relevant commonality are of interest (blocks)” (Petrocelli, 2003, p. 19). Specifically, it helps determine the degree to which variables entered later in the analysis (e.g., program type, online learning environment) account for variance in the criterion (counselor trainees’ satisfaction) over and above that accounted for by variables entered earlier (e.g., age; Petrocelli, 2003). An a priori power analysis (G*Power 3.1.9.7) indicated that 114 would be an adequate sample size for this study with nine predictors, assuming the desired power of 0.80, a Type I error rate of 0.05, and a medium effect size (f² = 0.15; Cohen, 1988). The current sample size of 397 well exceeded this threshold.
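This a priori computation can be reproduced from the noncentral F distribution, the same machinery G*Power uses for the R²-deviation-from-zero test. A sketch, assuming the G*Power noncentrality convention λ = f²·N; the minimum N it finds should be close to the reported 114:

```python
from scipy import stats

def power_f2(n: int, predictors: int, f2: float = 0.15, alpha: float = 0.05) -> float:
    """Power of the F test that R^2 = 0 in multiple regression."""
    u = predictors          # numerator degrees of freedom
    v = n - predictors - 1  # denominator degrees of freedom
    lam = f2 * n            # noncentrality parameter (G*Power convention)
    f_crit = stats.f.ppf(1 - alpha, u, v)
    return 1 - stats.ncf.cdf(f_crit, u, v, lam)

# Smallest total N reaching 0.80 power with nine predictors
n = 11  # smallest n with positive denominator df
while power_f2(n, predictors=9) < 0.80:
    n += 1
print(n, round(power_f2(n, predictors=9), 3))
```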
Given the popularity of online learning among adult learners (M. G. Moore & Kearsley, 2005; Renfro‐Michel et al., 2010; Snow & Coker, 2020), the unprecedented impact of COVID‐19 on the learning modality (Christian et al., 2021; Ghazi‐Saidi et al., 2020), and the critical role that the online learning environment plays in students’ perceptions of online learning (Walker & Fraser, 2005), three research questions were formulated. RQ1: Does counselor trainees’ age account for a significant amount of variance in their self‐reported satisfaction with online learning? RQ2: Does program type (the primary instruction mode prior to COVID‐19) account for a significant amount of variance in satisfaction over and above that accounted for by age? RQ3: Does the online learning environment (a total of six variables) account for a significant amount of variance in satisfaction over and above that accounted for by age and program type?
It was hypothesized that each layer of factors would uniquely and significantly explain variance in counselor trainees’ satisfaction with online learning. To test these hypotheses, a three‐step hierarchical regression analysis was conducted, with counselor trainees’ satisfaction as the dependent variable. Age was entered at Step 1 as a demographic variable. Since participants in the current sample did not differ significantly in satisfaction by gender, race or ethnicity, or training level (master's vs. doctoral), these demographic variables were excluded from the regression model. Program type (instruction mode prior to COVID‐19) was entered at Step 2 because this mode was determined by each institution and was not flexible to change. Given the categorical nature of this variable (in‐person, hybrid, and online), it was dummy coded prior to entry into the model. The six online learning environment variables were entered at Step 3 because they are more subject to change than age and program type. At every step except the first (which contained only one predictor), the variance inflation factor (VIF) was computed to detect whether a predictor had a strong linear relationship with the other predictors (Field, 2017); a largest VIF greater than 10 indicates a serious problem (Bowerman & O'Connell, 1990; Myers, 1990).
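The three-step procedure can be sketched with plain NumPy: fit each nested model, track the R² increment, test that increment with an F-change statistic, and screen the predictors with VIFs. The data below are synthetic and the coefficients arbitrary; this only illustrates the mechanics, not the study's results:

```python
import numpy as np

def r_squared(X, y):
    """R^2 of an ordinary least squares fit with an intercept."""
    X1 = np.column_stack([np.ones(len(y)), X])
    beta, *_ = np.linalg.lstsq(X1, y, rcond=None)
    resid = y - X1 @ beta
    return 1 - resid @ resid / ((y - y.mean()) @ (y - y.mean()))

def f_change(r2_full, r2_reduced, n, p_full, k_added):
    """F test for the R^2 increment when k_added predictors enter."""
    return ((r2_full - r2_reduced) / k_added) / ((1 - r2_full) / (n - p_full - 1))

def vif(X):
    """Variance inflation factor for each column of X."""
    return np.array([
        1 / (1 - r_squared(np.delete(X, j, axis=1), X[:, j]))
        for j in range(X.shape[1])
    ])

rng = np.random.default_rng(1)
n = 397
age = rng.normal(31, 9.5, n)
program = rng.choice([0, 1, 2], n)   # in-person / hybrid / online
hybrid = (program == 1).astype(float)  # dummy coding, in-person as reference
online = (program == 2).astype(float)
env = rng.normal(4, 0.8, (n, 6))     # six environment scale scores
y = 0.02 * age + 0.6 * online + env @ np.full(6, 0.15) + rng.normal(0, 0.8, n)

X1 = age[:, None]                          # Step 1: age
X2 = np.column_stack([age, hybrid, online])  # Step 2: + program dummies
X3 = np.column_stack([X2, env])              # Step 3: + environment scales
r2 = [r_squared(X, y) for X in (X1, X2, X3)]
print([round(v, 3) for v in r2])
print(round(f_change(r2[2], r2[1], n, X3.shape[1], 6), 2))
print(np.round(vif(X3), 2))
```

Because the simulated predictors are nearly independent, the VIFs land close to 1, well below the cutoff of 10 noted above.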
RESULTS
The results of this study supported all three hypotheses. To answer RQ1, counselor trainees’ age contributed significantly to the regression model, F(1, 395) = 24.152, p < 0.001, and explained 5.8% of the variance in their self‐reported satisfaction with online learning (see Table 1). The effect size was small to medium (f² = 0.06; Cohen, 1988). To answer RQ2, introducing program type, or the instruction mode prior to COVID‐19, explained an additional 9.6% of the variance in satisfaction after controlling for the effect of age, and this change in R² was statistically significant, F(2, 393) = 22.254, p < 0.001. The effect size attributable to this addition was small to medium (f² = 0.11; Cohen, 1988). The VIFs for the three predictors ranged from 1.010 to 1.048, well below the threshold of 10 that signals a multicollinearity concern (Bowerman & O'Connell, 1990). Compared to participants from in‐person (face‐to‐face) programs, those from hybrid programs reported a 0.445 higher average satisfaction while controlling for age, although this difference was not statistically significant (p = 0.060). Consistently, participants from online programs reported a 0.640 higher average satisfaction than those in the face‐to‐face mode while controlling for age, and this difference was statistically significant (p < 0.001).
TABLE 1.
Hierarchical regression results for counselor trainees’ satisfaction with online learning
| Variable | B | 95% CI LL | 95% CI UL | SE B | β | R² | ΔR² |
|---|---|---|---|---|---|---|---|
| Step 1 | | | | | | 0.058*** | 0.058*** |
| Constant | 2.039*** | 1.708 | 2.370 | 0.168 | | | |
| Age | 0.025*** | 0.015 | 0.036 | 0.005 | 0.240*** | | |
| Step 2 | | | | | | 0.153*** | 0.096*** |
| Constant | 1.829*** | 1.509 | 2.150 | 0.163 | | | |
| Age | 0.022*** | 0.013 | 0.032 | 0.005 | 0.210*** | | |
| Hybrid | 0.445 | −0.019 | 0.910 | 0.236 | 0.089 | | |
| Online | 0.640*** | 0.451 | 0.830 | 0.096 | 0.316*** | | |
| Step 3 | | | | | | 0.378*** | 0.224*** |
| Constant | −1.430*** | −2.072 | −0.789 | 0.326 | | | |
| Age | 0.020*** | 0.011 | 0.029 | 0.004 | 0.189*** | | |
| Hybrid | 0.272 | −0.134 | 0.678 | 0.206 | 0.055 | | |
| Online | 0.489*** | 0.321 | 0.657 | 0.086 | 0.241*** | | |
| Instructor support | 0.248*** | 0.116 | 0.379 | 0.067 | 0.189*** | | |
| Student interaction | 0.130** | 0.025 | 0.235 | 0.053 | 0.121** | | |
| Personal relevance | 0.109 | −0.035 | 0.253 | 0.073 | 0.088 | | |
| Authentic learning | 0.101 | −0.039 | 0.240 | 0.071 | 0.080 | | |
| Active learning | −0.054 | −0.206 | 0.098 | 0.077 | −0.038 | | |
| Student autonomy | 0.309*** | 0.152 | 0.465 | 0.080 | 0.198*** | | |
Note: N = 397.
Abbreviations: CI, confidence interval; LL, lower limit; UL, upper limit.
*p < 0.05; **p < 0.01; ***p < 0.001.
To answer RQ3, adding the six online learning environment variables (instructor support, student interaction and collaboration, personal relevance, authentic learning, active learning, and student autonomy) to the regression model explained an additional 22.4% of the variance in satisfaction after controlling for the effects of age and program type, and this change in R² was statistically significant, F(6, 387) = 23.249, p < 0.001. The effect size attributable to this addition was large (f² = 0.36; Cohen, 1988). The VIFs for these predictors ranged from 1.045 to 2.202, indicating no multicollinearity concern. Despite statistically significant (p < 0.001) positive bivariate correlations between satisfaction and all six variables (ranging from 0.301 to 0.407), the partial correlations between satisfaction and each of the six factors, controlling for the other five, indicated that only student autonomy (r = 0.227, p < 0.001), instructor support (r = 0.161, p = 0.001), and student interaction and collaboration (r = 0.144, p = 0.004) remained statistically significant. Notably, with all variables included in Step 3 of the regression model, the statistically significant predictors of satisfaction were age, online program status, instructor support, student interaction and collaboration, and student autonomy. Collectively, the nine variables accounted for 37.8% (adjusted R² = 36.3%) of the variance in satisfaction.
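The effect sizes reported for each step follow directly from the table's R² values via Cohen's formula for an increment, f² = ΔR² / (1 − R² of the fuller model). As a quick check against the reported 0.06, 0.11, and 0.36:

```python
def cohens_f2(delta_r2: float, r2_full: float) -> float:
    """Cohen's f^2 for an R^2 increment, given the fuller model's R^2."""
    return delta_r2 / (1 - r2_full)

# (delta R^2, cumulative R^2) for Steps 1-3, taken from Table 1
steps = [(0.058, 0.058), (0.096, 0.153), (0.224, 0.378)]
print([round(cohens_f2(d, r2), 2) for d, r2 in steps])  # → [0.06, 0.11, 0.36]
```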
DISCUSSION
Results of this study showed that older counselor trainees were more likely to report higher levels of satisfaction with online learning. This pattern may be accounted for by a constellation of factors, including the accessibility, flexibility, and convenience that online learning affords (Bender & Werries, 2021; Renfro‐Michel et al., 2010; Snow & Coker, 2020), one's increasing self‐directedness in learning as one matures (Knowles, 1973), and the drastic changes brought about by technology that one lives through over time (Li et al., under review). In particular, online learning has gained growing popularity among working adults in the 25–50 age range (M. G. Moore & Kearsley, 2005). The age distribution in the current sample (age 20–24, n = 118; age 25–50, n = 256; age 51–70, n = 23) reflected such a trend. Consistently, counselor trainees in the online or other (mostly hybrid) group (n = 197) showed a slightly higher average age (M = 31.99, SD = 10.42) than their in‐person program counterparts (n = 200; M = 30.10, SD = 8.48).
Aside from age, program type uniquely and significantly contributed to the variance in counselor trainees’ satisfaction with online learning, which may be intertwined with the abrupt learning modality transition for in‐person programs amid the pandemic. Specifically, students from online programs exhibited the highest level of satisfaction, those from in‐person programs reported the lowest, and students from hybrid programs sat in between. This finding seems inconsistent with past research, in which students’ satisfaction was often rank ordered as in‐person, hybrid, and online, from high to low. For example, Furlonger and Gencic (2014) found that on‐campus (36 h of face‐to‐face instruction per subject) counseling students reported significantly higher course satisfaction (M = 24.26, SD = 5.23) than offshore (20 h per subject; M = 21.00, SD = 5.42) and off‐campus (15 h in total for only one subject, primarily web‐based; M = 19.46, SD = 6.21) students. But satisfaction in that study was assessed under students’ self‐selected learning modality, unlike the current study, in which students from in‐person programs were forced to attend classes online due to COVID‐19 (Ghazi‐Saidi et al., 2020). Because these students enrolled in face‐to‐face programs in the first place, they probably preferred in‐person to online learning. When forced to take classes online to continue their education, they likely perceived this modality as a mismatch, particularly when learning was complicated by the stressful global health crisis.
Regardless of the online instruction mode (e.g., synchronous, asynchronous, or a combination of both), students and instructors constitute the two main bodies in any online learning environment. Unsurprisingly, the student‐ and instructor‐focused online learning environment measured in this study uniquely and significantly accounted for the largest part (ΔR² = 22.4%) of the variance in counselor trainees’ satisfaction after controlling for age and program type, consistent with the broader education literature (e.g., Eom et al., 2006; Eom & Ashill, 2016). Of the six online learning environment variables, only student autonomy, instructor support, and student interaction and collaboration were significant predictors, consistent with the results of the partial correlation analyses between satisfaction and each of the six variables. Notably, student autonomy was the strongest predictor. These findings differ from Walker and Fraser's (2005) study, in which all variables but active learning were significant predictors of satisfaction and personal relevance appeared to be the strongest. One possible contributor to this difference may be sample composition: Walker and Fraser's (2005) international sample comprised 72.6% graduate and 27.5% undergraduate students, whereas the current sample consisted entirely of counseling graduate students in the United States. As students grow and mature, their self‐concept evolves from dependency to self‐directedness (Knowles, 1973), which is also corroborated by counselor trainees’ autonomy growth in professional identity development (Dollarhide et al., 2013) and in supervision (Li et al., 2018; Stoltenberg & McNeill, 2010).
A recent study (Brown et al., 2022) of 151 Australian undergraduate occupational therapy students who completed their studies online during 2020 due to COVID‐19 documented a similar amount of variance (R² = 0.253, adjusted R² = 0.222, p = 0.001) in student satisfaction explained by the six DELES learning environment variables. Slightly different from the present study, Brown et al.’s (2022) research suggested that instructor support was the only significant predictor, whereas the current study identified two additional predictors (student autonomy, student interaction and collaboration). This discrepancy could be partially attributed to sample differences, as the current study also included students from online programs. In spite of this variation, both studies signified instructor support as a key factor affecting students’ satisfaction with online learning. This echoed one of the findings in Bender and Werries's (2021) study, in which participants endorsed the need for competent faculty, particularly efficient use of technology that sets effective online faculty apart from face‐to‐face faculty in counselor education and supervision.
A third significant learning environment factor was student interaction and collaboration, which specifically referred to interactions and collaborations among students (Walker, 2020). While using a different measure, Muzammil et al.’s (2020) study of online learning students in economics in Indonesia suggested that interaction among students, together with interaction with content and interaction with tutors, strengthened their engagement, which ultimately led to student satisfaction. Their finding revealed another layer of complexity for the seemingly intuitive, direct relationship between student interaction and satisfaction.
While not a significant predictor in the current study, personal relevance was the strongest predictor of student satisfaction in Walker and Fraser's (2005) sample. This may again be due to participants’ different levels of training and fields of study across the two studies. Walker and Fraser (2005) targeted higher education students in general, with both undergraduate and graduate students included, whereas the present study exclusively recruited counseling students at the graduate level. Although making personal connections to course material is frequently experienced and even encouraged during counselor training, it may be experienced more extensively and intensively in other fields of study, particularly at the undergraduate level, where students are exposed to introductory courses in a range of subjects before committing to a major (Kowarski, 2022).
Authentic learning was another nonsignificant predictor in the present study. While instructors may introduce real‐life examples and experiences to counselor trainees in class, practicum and internship are the main phases in which students are exposed to and work around real‐life scenarios. Participants in this study who had not yet reached this culminating training stage may not have found this group of questions relatable.
Finally, active learning was not a significant predictor of student satisfaction in either the current study or Walker and Fraser's (2005) original study. This may be because items in the active learning scale described independent actions, in contrast to the student interaction and collaboration that uniquely and significantly contributed to student satisfaction in both studies. As a logical consequence, active learning, with its focus on independent learning and problem solving, would not significantly affect student satisfaction after controlling for the other five factors.
Implications for counselor education and clinical supervision
Students’ perception of online learning significantly predicts their learning outcomes (Eom et al., 2006) and plays a pivotal role in their motivation and persistence (Kauffman, 2015). The three sets of variables examined in this study uniquely and significantly explained the variance in counselor trainees’ satisfaction, which sheds light on strategies that counselor educators and clinical supervisors can adopt, as well as resources they can leverage, to enhance students’ satisfaction with online learning. These implications are especially timely given that computer‐mediated learning has become pervasive in counselor education, particularly during COVID‐19, and will continue to be the trend (Coker et al., 2021).
Although the online learning environment, which encompasses both student and instructor roles, accounted for the largest variance in student satisfaction in the current study, counselor educators can, and should, take the initiative to enhance students’ learning satisfaction and mobilize changes in the desired direction. Particularly in light of the three significant DELES (Walker, 2020) predictors, counselor educators can encourage student autonomy in online learning (e.g., a flipped classroom where students have considerable autonomy in studying the information‐based content), provide adequate instructor support (e.g., making themselves available to students through virtual office hours), and facilitate student interaction and collaboration (e.g., creating a discussion board that encourages engaging, scholarly discourse by setting clear expectations that each student not only posts their own writing but also comments on peers’ posts). Wasik et al. (2019) offered creative approaches that counselor educators can adopt when designing courses, creating a classroom community, and engaging with students in an online educational environment.
In view of students’ varied satisfaction with online learning across program types, counselor educators and clinical supervisors, particularly those from in‐person programs, should be attentive to students’ needs given the unexpected, abrupt transition. With many in‐person programs moving back to their original instruction modality, even if not completely, instructors should be mindful of the new needs that may arise during the backward transition from remote to in‐person learning, both for students who encountered the double transition and for those who only experienced the latter. While COVID‐19 has posed collective challenges to teaching and learning by and large, it was especially difficult for clinical or skill‐related courses (Christian et al., 2021): experiential learning could be constrained by the interactional modalities available, practicum or internship sites could be hard to locate given limited options, and students had to stay informed of the new clinical requirements stipulated by CACREP and accumulate clinical hours to meet graduation requirements in time (Li et al., under review). Clinical supervision itself was further complicated by the pandemic as well. Supervisors not only had to gatekeep supervisees’ clinical competency and safeguard the welfare of supervisees’ clients (Bernard & Goodyear, 2019) but also needed to be sensitive to supervisees’ needs related to the global health crisis, even as supervisors themselves were likely navigating a myriad of complications due to the pandemic. Together, all these factors could affect counselor trainees’ perception of and satisfaction with online learning. Counselor educators and clinical supervisors may also infuse self‐care and wellness into CACREP curricula (Harrichand et al., 2021).
Limitations and future research
This study is not exempt from limitations that may be addressed in future research. First, while more than a third (37.8%) of the variance in satisfaction was explained by the three layers of factors examined in the study, other variables suggested by the existing literature may be considered, such as course size (Kauffman, 2015), course design (Eom & Ashill, 2016), and course type (e.g., content‐focused vs. experiential; Christian et al., 2021). Subtler differences within both student and instructor groups (e.g., the number of online courses students have taken; students who experienced online learning only through online programs vs. students who have taken online courses in primarily face‐to‐face programs; instructors’ previous training/experience in online teaching) could also confound study results. Second, counselor trainees’ satisfaction in this study was only examined in the affective domain (Walker, 2020). Other domains (e.g., cognitive), together with students’ learning outcomes, may be included in future research.
Third, the six learning environment variables were entered into the last model as an intact group, which provided limited information about the potentially complex relationships among the six variables, and between each of them and student satisfaction, such as the identified indirect relationship between student interaction and satisfaction through student engagement (Muzammil et al., 2020). Fourth, this study was conducted amid COVID‐19, when counselor trainees’ online learning was intertwined with the global health crisis. A qualitative design may be adopted to help illuminate what and how counselor trainees experienced online learning, particularly as related to the pandemic. An additional group of variables embedded within the larger context (e.g., pandemic‐related factors) may be incorporated as a follow‐up to further explicate factors that relate to students’ online learning, such as personal health conditions, financial situations, and access to computers and the internet. Finally, although this was a national sample, the estimated response rate of less than 1% was low. Researchers in future studies may be more creative and strategic in recruiting participants.
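For readers less familiar with the block-entry logic discussed above, the following is a minimal sketch of a hierarchical regression on synthetic data. All variable names, block contents, and effect sizes here are hypothetical stand-ins, not the study's data; the sketch only shows how R² and ΔR² are obtained as each block (age, program type, the six DELES scales) is entered cumulatively.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200

# Hypothetical, synthetic stand-ins for the study's three predictor blocks:
# age; program type (0 = in-person, 1 = online); six DELES scale scores.
age = rng.normal(30, 8, n)
program = rng.integers(0, 2, n).astype(float)
deles = rng.normal(0.0, 1.0, (n, 6))
satisfaction = (0.02 * age + 0.4 * program
                + deles @ np.array([0.3, 0.4, 0.0, 0.0, 0.3, 0.0])
                + rng.normal(0.0, 1.0, n))

def r_squared(X, y):
    """R-squared of an OLS fit with an intercept, via least squares."""
    X1 = np.column_stack([np.ones(len(y)), X])
    beta, *_ = np.linalg.lstsq(X1, y, rcond=None)
    resid = y - X1 @ beta
    return 1.0 - (resid @ resid) / ((y - y.mean()) @ (y - y.mean()))

# Enter the predictor blocks cumulatively, as hierarchical regression does:
# each step's model contains all predictors from the previous steps.
steps = [age[:, None],
         np.column_stack([age, program]),
         np.column_stack([age, program, deles])]
r2 = [r_squared(X, satisfaction) for X in steps]
delta_r2 = [r2[0]] + [r2[i] - r2[i - 1] for i in (1, 2)]
for i, (r, d) in enumerate(zip(r2, delta_r2), 1):
    print(f"Step {i}: R2 = {r:.3f}, delta R2 = {d:.3f}")
```

Because the models are nested, each step's R² can only stay equal or grow; the ΔR² at each step is the unique variance attributable to that block over and above the blocks already entered, which is what the significance tests in the study's hierarchical analysis evaluate.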
CONCLUSION
The confluence of burgeoning online counseling programs and the expanding application of computer‐mediated technology to counselor education and counseling, particularly during the pandemic, necessitates the study of online teaching and learning in counselor education. This study revealed that age, program type, and online learning environment significantly and uniquely contributed to counselor trainees’ satisfaction with online learning during COVID‐19. These findings may assist counselor educators and clinical supervisors in offering quality online courses and in guiding students to select formats of counselor education appropriate to their attributes and preferred learning styles. On a final note, counselor trainees’ online learning in this study was contextualized in a historical event (the pandemic) that has permeated all aspects of people's lives, higher education among them. This study may be replicated at later times to disentangle the impact of COVID‐19 on students’ perceptions of online learning.
Li, D. (2022). Predictors of counselor trainees’ satisfaction with online learning during COVID‐19. Counselor Education and Supervision, 61, 379–390. 10.1002/ceas.12254
Footnotes
Note: Reproduction by special permission of the Publisher, Mind Garden, Inc., www.mindgarden.com from the Distance Education Learning Environments Survey by Scott L. Walker. Copyright © 2020 by Scott L. Walker. Further reproduction is prohibited without the Publisher's written consent.
REFERENCES
- Arbuckle, J. L. (2021). IBM SPSS Amos 28 user's guide. Amos Development Corporation, SPSS Inc.
- Barrio Minton, C. A., & Hightower, J. M. (2020). Counselor education and supervision: 2018 annual review. Counselor Education and Supervision, 59(1), 2–15. 10.1002/ceas.12162
- Bender, S., & Werries, J. (2021). Doctoral‐level CES students’ lived experiences pursuing courses in an online learning environment. The Journal of Counselor Preparation and Supervision, 14(2), 1–32. https://westcollections.wcsu.edu/handle/20.500.12945/1964
- Bernard, J. M., & Goodyear, R. K. (2019). Fundamentals of clinical supervision (6th ed.). Pearson.
- Bowerman, B. L., & O'Connell, R. T. (1990). Linear statistical models: An applied approach. Brooks/Cole.
- Brown, T., Robinson, L., Gledhill, K., Yu, M. L., Isbel, S., & Greber, C. (2022). Reliability and validity evidence of two distance education learning environments scales. American Journal of Distance Education. Advance online publication. 10.1080/08923647.2022.2065147
- Chen, S. Y., Wathen, C., & Speciale, M. (2020). Online clinical training in the virtual remote environment: Challenges, opportunities, and solutions. The Professional Counselor, 10(1), 78–91. https://files.eric.ed.gov/fulltext/EJ1250982.pdf
- Christian, D. D., McCarty, D. L., & Brown, C. L. (2021). Experiential education during the COVID‐19 pandemic: A reflective process. Journal of Constructivist Psychology, 34(3), 264–277. 10.1080/10720537.2020.1813666
- Cohen, J. (1988). Statistical power analysis for the behavioral sciences (2nd ed.). Erlbaum.
- Coker, K., Snow, W., & Hinkle, S. (2021). The past, present and future of online counselor education. Journal of Technology in Counselor Education and Supervision, 1(1), 39–41. 10.22371/tces/0006
- Council for Accreditation of Counseling and Related Educational Programs. (2019). CACREP annual report 2018. http://www.cacrep.org/wp‐content/uploads/2019/05/CACREP‐2018‐Annual‐Report.pdf
- Dollarhide, C. T., Gibson, D. M., & Moss, J. M. (2013). Professional identity development of counselor education doctoral students. Counselor Education and Supervision, 52(2), 137–150. 10.1002/j.1556-6978.2013.00034.x
- Eom, S. B., & Ashill, N. (2016). The determinants of students’ perceived learning outcomes and satisfaction in university online education: An update. Decision Sciences Journal of Innovative Education, 14(2), 185–215. 10.1111/dsji.12097
- Eom, S. B., Wen, H. J., & Ashill, N. (2006). The determinants of students’ perceived learning outcomes and satisfaction in university online education: An empirical investigation. Decision Sciences Journal of Innovative Education, 4(2), 215–235. 10.1111/j.1540-4609.2006.00114.x
- Field, A. (2017). Discovering statistics using IBM SPSS statistics (5th ed.). Sage.
- Furlonger, B., & Gencic, E. (2014). Comparing satisfaction, life‐stress, coping and academic performance of counselling students in on‐campus and distance education learning environments. Journal of Psychologists and Counsellors in Schools, 24(1), 76–89. 10.1017/jgc.2014.2
- Ghazi‐Saidi, L., Criffield, A., Kracl, C. L., McKelvey, M., Obasi, S. N., & Vu, P. (2020). Moving from face‐to‐face to remote instruction in a higher education institution during a pandemic: Multiple case studies. International Journal of Technology in Education and Science, 4(4), 370–383. https://eric.ed.gov/?id=EJ1271208
- Harrichand, J. J., Litam, S. D. A., & Ausloos, C. D. (2021). Infusing self‐care and wellness into CACREP curricula: Pedagogical recommendations for counselor educators and counselors during COVID‐19. International Journal for the Advancement of Counselling, 43, 372–385. 10.1007/s10447-021-09423-3
- Harrison, K. L. (2021). A call to action: Online learning and distance education in the training of couple and family therapists. Journal of Marital and Family Therapy, 47(2), 408–423. 10.1111/jmft.12512
- Hinkley, P., Shaler, L., Chamberlin, B., Peters, C., Davis, J., & Kuhnley, A. (2021). Effective strategies for promoting faculty and student success in online counselor education programs. International Journal of Online Graduate Education, 4(1), 1–9. 10.5281/zenodo.4420131
- Holmes, C. M., Reid, C., Hawley, C., & Wagner, C. (2020). Social presence in online counselor education. The Journal of Counselor Preparation and Supervision, 13(4), 1–30.
- Kauffman, H. (2015). A review of predictive factors of student success in and satisfaction with online learning. Research in Learning Technology, 23, 1–13. 10.3402/rlt.v23.26507
- Knowles, M. S. (1973). The adult learner: A neglected species. Gulf Publishing Company.
- Kowarski, L. (2022, April 8). The many ways grad school differs from college. U.S. News & World Report.
- La Guardia, A. C. (2021). Counselor education and supervision: 2019 annual review. Counselor Education and Supervision, 60(1), 2–21. 10.1002/ceas.12192
- Li, D., & Su, Y.‐W. (2021). Online teaching and learning in counselor education: A 23‐year content analysis. Counselor Education and Supervision, 60(4), 316–330. 10.1002/ceas.12219
- Li, D., Duys, D. K., & Liu, Y. (2021). Working alliance as a mediator between supervisory styles and supervisee satisfaction. Teaching and Supervision in Counseling, 3(3), 5. 10.7290/tsc030305
- Li, D., Liu, Y., & Lee, I. (2018). Supervising Asian international counseling students: Using the integrative developmental model (IDM). Journal of International Students, 8(2), 1129–1151. 10.32674/jis.v8i2.137
- Li, D., Liu, Y., & Werts, R. C. (under review). Counselor trainees’ lived experiences of online learning during COVID‐19.
- Moore, J. L., Dickson‐Deane, C., & Galyen, K. (2011). E‐learning, online learning, and distance learning environments: Are they the same? The Internet and Higher Education, 14(2), 129–135. 10.1016/j.iheduc.2010.10.001
- Moore, M. G., & Kearsley, G. (2005). Distance education: A systems view. Wadsworth.
- Muzammil, M., Sutawijaya, A., & Harsasi, M. (2020). Investigating student satisfaction in online learning: The role of student interaction and engagement in distance learning university. Turkish Online Journal of Distance Education, 21(Special Issue‐IODL), 88–96. 10.17718/tojde.770928
- Myers, R. (1990). Classical and modern regression with applications (2nd ed.). Duxbury.
- Petrocelli, J. V. (2003). Hierarchical multiple regression in counseling research: Common problems and possible remedies. Measurement and Evaluation in Counseling and Development, 36(1), 9–22. 10.1080/07481756.2003.12069076
- Renfro‐Michel, E. L., O'Halloran, K. C., & Delaney, M. E. (2010). Using technology to enhance adult learning in the counselor education classroom. Adultspan Journal, 9(1), 14–25. 10.1002/j.2161-0029.2010.tb00068.x
- Richardson, J. C., Maeda, Y., Lv, J., & Caskurlu, S. (2017). Social presence in relation to students’ satisfaction and learning in the online environment: A meta‐analysis. Computers in Human Behavior, 71, 402–417. 10.1016/j.chb.2017.02.001
- Snow, W. H., & Coker, J. K. (2020). Distance counselor education: Past, present, future. The Professional Counselor, 10(1), 40–56. https://files.eric.ed.gov/fulltext/EJ1251002.pdf
- Stoltenberg, C. D., & McNeill, B. W. (2010). IDM supervision: An integrative developmental model for supervising counselors and therapists (3rd ed.). Routledge.
- Walker, S. L. (2020). The Distance Education Learning Environments Survey (DELES). Mind Garden, Inc.
- Walker, S. L., & Fraser, B. J. (2005). Development and validation of an instrument for assessing distance education learning environments in higher education: The Distance Education Learning Environments Survey (DELES). Learning Environments Research, 8(3), 289–308. 10.1007/s10984-005-1568-3
- Wasik, S. Z., Barrow, J. C., Royal, C., Brooks, R. M., Scott‐Dames, L. S., Corry, L., & Bird, C. (2019). Online counselor education: Creative approaches and best practices in online learning environments. Research on Education and Psychology, 3(1), 43–52.
- Williamson, B., Eynon, R., & Potter, J. (2020). Pandemic politics, pedagogies and practices: Digital technologies and distance education during the coronavirus emergency. Learning, Media and Technology, 45(2), 107–114. 10.1080/17439884.2020.1761641
