Journal of Microbiology & Biology Education. 2021 Dec 15;22(3):e00193-21. doi: 10.1128/jmbe.00193-21

Modifying the ASPECT Survey to Support the Validity of Student Perception Data from Different Active Learning Environments

Nicole Naibert, Erin E. Shortlidge, Jack Barbera
PMCID: PMC8672914  PMID: 34970392

ABSTRACT

Measuring students’ perceptions of active learning activities may provide valuable insight into their engagement and subsequent performance outcomes. A recently published measure, the Assessing Student Perspective of Engagement In-Class Tool (ASPECT), was developed to assess student perceptions of various active learning environments. As such, we sought to use this measure in our courses to assess students’ perceptions of different active learning environments. Initial results analyzed with confirmatory factor analysis (CFA) indicated that the ASPECT did not function as expected in our active learning environments. Therefore, before administration within an introductory biology course that incorporated two types of active learning strategies, additional items were created and the wording of some original items was modified to better align with the structure of each strategy, thereby producing two modified ASPECT (mASPECT) versions. Evidence of response process validity of the data collected was gathered using cognitive interviews with students, while internal structure validity evidence was assessed through exploratory factor analysis (EFA). When data were collected after a “deliberative democracy” (DD) activity, 17 items were found to contribute to 3 factors related to ‘personal effort’, ‘value of the environment’, and ‘instructor contribution’. However, data collected after a “clicker” day resulted in 21 items that contributed to 4 factors, 3 of which were similar to those from the DD activity and a fourth of which was related to ‘social influence’. Overall, these results suggested that the same measure may not function identically when used within different types of active learning environments, even with the same population, and highlight the need to collect data validity evidence when adopting and/or adapting measures.

KEYWORDS: structural validity, factor analysis, active learning, undergraduate, engagement

INTRODUCTION

The continued shift in undergraduate science courses from instructor-centered classrooms to student-centered learning has been influenced in part by national reports aimed at improving higher education within the science, technology, engineering, and mathematics (STEM) fields (1, 2). Many studies have found that including active learning strategies in the classroom positively impacts student outcomes (e.g., higher exam grades, lower withdrawal rates) (3, 4). However, while including these strategies may increase student performance outcomes, the extent of these benefits may vary across student populations (5), and it cannot be assumed that every student in the classroom engages in or benefits from an active learning environment to the same extent (6). Because active learning strategies are inherently student-centered, it is up to the student to decide to interact with and “buy in” to the activity and learning environment (7). Student buy-in, along with other perceptions such as trust in the instructor and students’ growth mindset, has been shown to influence student engagement and course outcomes (7, 8). Thus, measuring students’ perceptions of the active learning environment could provide valuable information about how students engage with and benefit from different active learning environments.

Measuring student perceptions of active learning environments

Because multiple types of active learning strategies are implemented in our classrooms at Portland State University (PSU), we were interested in measuring students’ perceptions of these various environments. Although individual student perceptions can be gathered through qualitative methods (e.g., Shortlidge et al. (9)), quantitative methods, such as a self-report survey, can be used to easily and efficiently collect perceptions from every student in the class. Recently, the Assessing Student Perspective of Engagement In-Class Tool (ASPECT) was developed by Wiggins et al. (10) to measure students’ perceptions of their cognitive and affective engagement in different active learning environments incorporated in a large-format introductory biology classroom. Their results suggested that student perceptions of the value of the activity and the instructor contribution differed based on the activity type (i.e., students perceived there to be less value and less instructor contribution during worksheet activity days compared to clicker-question activity days) and demographic group. No significant differences were detected in students’ perceptions of their personal effort across different activity types.

Active learning environments can vary even between classes that implement the same active learning strategy. Therefore, evidence of validity and reliability of data generated by an instrument should be gathered before interpreting any results in a different environment and/or with a different population (11). The types and amount of validity evidence collected for a measure depend on the goals of the project as well as what types of validity evidence had previously been assessed. Collecting evidence of the internal structure validity of a previously published measure provides evidence that the constructs are being measured in a similar way in the different learning environment (11). Additionally, gathering evidence of response process validity can provide confidence that students are interpreting the items correctly in the new environment (11, 12), especially if modifications are made to the original measure.

We evaluated the ASPECT in our learning environments through two experimental phases. Phase I focused on gathering evidence of internal structure validity for data collected with the original ASPECT measure in our learning environments. The initial results from Phase I led to modifications of the ASPECT (results and details from Phase I are included in Appendix 1 in the supplemental material). Here, we focused on Phase II, where the modified ASPECT (mASPECT) was used to measure student perceptions of two different types of active learning strategies. Because the mASPECT included additional items, evidence of both internal structure and response process validity was gathered. An overview of Phases I and II, ASPECT versions, and types of validity evidence collected is shown in Fig. 1.

FIG 1. Overview of the active learning strategies, survey versions, and validity evidence collected during Phases I and II. Details and results from Phase I are included in Appendix 1.

We hypothesized that data collected with the mASPECT during Phase II would show evidence of factors similar to those found with the original ASPECT (i.e., personal effort, the value of the activity, and instructor contribution) as well as an additional group-related factor. To this end, evidence of response process and internal structure validity and reliability of the data collected with the mASPECT in two active learning environments was gathered, and the resulting survey structures and scale scores were evaluated in both environments. This work sought to answer two research questions: (i) what modifications could be made to the ASPECT to obtain sufficient evidence of internal structure validity of the collected data, and (ii) what factor structure best represented the modified ASPECT (mASPECT) data from our active learning environments? Answering these questions would provide support for the student perception data from our course and could serve as a model for others seeking to use the ASPECT or mASPECT when evaluating their active learning environments.

METHODS

Course information and active learning environments

This study was completed in a third-term introductory biology course at PSU with a week-1 enrollment of 266 students. Demographic information of students who consented to participate in this study is provided in Table S1 in Appendix 1. Two types of active learning strategies were assessed within the same class: (i) deliberative democracy (DD) modules and (ii) classroom response systems (clickers). DD is a small-group active learning strategy that includes a multiday deliberation exercise where students are introduced to a real-world problem that correlates with their course content and, through reading, deliberation, and research, they are asked to come to a consensus on a policy recommendation (13–15). In this study, the DD activity consisted of a 2-day module where students gathered information on their own between DD activity days and brought the information back to class to inform group discussion and consensus making. Students were assigned readings, quizzes, and group worksheets to build a consensus statement. Students worked in the same randomly assigned groups of 3 to 5 on DD activity days, and the professor, graduate teaching assistant (TA), and multiple undergraduate learning assistants (LAs) (∼15) facilitated the group work. The TA functioned in an instructor role during DD activity days and trained the LAs in each DD activity. The second active learning strategy investigated was clicker days. These were ‘normal’ lecture days where students were regularly encouraged to ‘think-pair-share’ with other students nearby in response to clicker prompts given by the professor. Although no undergraduate LAs were in class during the clicker days, the graduate TA was present.

All data collection within this study was approved by the Institutional Review Board (IRB no. 196410-18) at Portland State University, and appropriate consent was gathered from instructors and students as required by the IRB.

Survey items

The surveys administered in both environments consisted of a modified ASPECT (mASPECT) survey based on the original ASPECT (10). Two versions of the mASPECT were created: one for a DD activity day (mASPECT-DD) and one for a clicker day (mASPECT-C). The modifications to the surveys included minor wording changes to the 19 original ASPECT items (10) as well as the creation of new items based on the structure of the active learning environments and the results from Phase I (details provided in Appendix 2). The mASPECT-DD version contained 35 items and the mASPECT-C version contained 31 items. Both versions included the 19 original ASPECT items in a slightly modified form (Items 1 to 19; Table 1), 8 new items related to personal effort (Items 20 to 27; Table 2), and 4 new items related to group function (Items 28 to 31; Table 2). The four-item difference between mASPECT-DD and mASPECT-C versions was due to the addition of ‘LA-worded’ items (Items 13B, 14B, 15B, and 16B; Table 1) that paralleled the ‘professor/TA’ items. Because the LAs were not present during the clicker day, the items did not apply to that environment. All survey items were administered on a 6-point Likert-type scale from strongly disagree (1) to strongly agree (6).

TABLE 1.

Original ASPECT Items (Wiggins et al. (10)) (Items 1 to 19) and modifications for mASPECT-DD and mASPECT-C versionsa

Factor ASPECT wording Item mASPECT-DD wording mASPECT-C wording
PE I was focused during today's group activity. 1 I was focused during today's class. I was focused during today's class.
I worked hard during today's group activity. 2 I worked hard during today's class. I worked hard during today's class.
I made a valuable contribution to my group today. 3 I made valuable contributions when working with other students during today's class. I made valuable contributions when having discussions with other students during today's class.
VGA Explaining the material to my group improved my understanding of it. 4 Explaining the material to my group members improved my understanding of it. Explaining the material to other students improved my understanding of it.
Having the material explained to me by my group members improved my understanding of the material. 5 Having the material explained to me by my group members improved my understanding of it. Having the material explained to me by other students improved my understanding of it.
Group discussion during the activity contributed to my understanding of the course material. 6 Working with other students during today's class contributed to my understanding of the material. Discussion with other students during today's class contributed to my understanding of the material.
Overall, the other members of my group made valuable contributions during the group activity. 7 The students I worked with made valuable contributions during today's class. The students I had discussions with made valuable contributions during today's class.
I had fun during today's group activity. 8 I had fun during today's class. I had fun during today's class.
I would prefer to take a class that includes this [topic] group activity over one that does not include this [topic] activity. 9 I would prefer to take a class that included today's activity over one that does not include it. I would prefer to take a class that included today's clicker questions over one that does not include them.
I am confident in my understanding of the material presented during today's group activity. 10 I am confident in my understanding of the material presented during today's class. I am confident in my understanding of the material presented during today's class.
The group activity increased my understanding of the course material. 11 Today's class increased my understanding of the material. Today's class increased my understanding of the material.
The group activity stimulated my interest in the course material. 12 Today's class stimulated my interest in the course material. Today's class stimulated my interest in the course material.
IC The instructor's enthusiasm made me more interested in the group activity. 13A The professor/teaching assistant's enthusiasm made me more interested in today's class. The professor/teaching assistant's enthusiasm made me more interested in today's class.
13B The learning assistant's enthusiasm made me more interested in today's class. NA
The instructor put a good deal of effort into my learning for today's class. 14A The professor/teaching assistant put a good deal of effort into my learning for today's class. The professor/teaching assistant put a good deal of effort into my learning for today's class.
14B The learning assistant put a good deal of effort into my learning for today's class. NA
The instructor seemed prepared for the group activity. 15A The professor/teaching assistant seemed prepared for today's class. The professor/teaching assistant seemed prepared for today's class.
15B The learning assistant seemed prepared for today's class. NA
The instructor and TAs were available to answer questions during the group activity. 16A The professor/teaching assistant was available to answer questions during today's class. The professor/teaching assistant was available to answer questions during today's class.
16B The learning assistant was available to answer questions during today's class. NA
NA I felt comfortable with my group. 17 I felt comfortable working with other students during today's class. I felt comfortable having discussions with other students during today's class.
I knew what I was expected to accomplish during the group activity. 18 I knew what I was expected to accomplish during today's class. I knew what I was expected to accomplish during today's class.
One group member dominated the discussion during today's group activity. 19 One of the students I worked with dominated discussion during today's class. One of the students I had discussions with dominated discussion during today's class.
a The original ASPECT factors of personal effort (PE), the value of group activity (VGA), and instructor contribution (IC) are included. Wording differences between mASPECT-DD and mASPECT-C are underlined.

TABLE 2.

New survey items (Items 20 to 31) created for mASPECT-DD and mASPECT-C related to personal effort and group functiona

Item mASPECT-DD wording mASPECT-C wording
20b I completed the prework for today's class. I completed the prework for today's class.
21b I did not make much of an effort during today’s class. I did not make much of an effort during today's class.
22b I guessed or made stuff up so that I could finish today's activity. I guessed or made stuff up so that I could finish today's activity.
23b I skipped or guessed on the hard parts of today's activity. I skipped or guessed on the hard parts of today's activity.
24b I found it difficult to maintain my concentration during today's class. I found it difficult to maintain my concentration during today's class.
25b I tried to relate today's class to prior material from the course. I tried to relate today's class to prior material from the course.
26b I was not very engaged in today's class. I was not very engaged in today's class.
27b I was fully engaged in today's class. I was fully engaged in today's class.
28c The students I worked with were focused during today's class. The students I had discussions with were focused during today's class.
29c The students I worked with worked hard during today's class. The students I had discussions with worked hard during today's class.
30c The students I worked with had fun during today's class. The students I had discussions with had fun during today's class.
31c Each student I worked with made an equal contribution during today's class. Each student I had discussions with made an equal contribution during today's class.
a Wording differences between mASPECT-DD and mASPECT-C are underlined.
b Personal effort-related items.
c Group-related items.

Quantitative methods

Quantitative survey data were collected after both the final day of a DD activity and a clicker day. Students were notified of the surveys during in-class announcements as well as through an announcement posted on the course’s learning management site with a link to the Qualtrics survey. Students were given 48 h following completion of the in-class activity to access and complete the survey. Students who accessed the survey were given a nominal amount of extra credit regardless of consent or completion. Before analysis, the responses were cleaned by removing (i) students who did not consent, (ii) any duplicate submissions by the same student, (iii) incomplete responses, and/or (iv) responses that failed the ‘check items’. One check item asked the students to select a specific response (i.e., somewhat agree). Students who did not respond to this check item correctly were assumed to have responded to the survey randomly without reading the items, and their responses were removed. Additionally, a topic-based check item asked students to select the topic covered during the day of the activity. Students who responded with the incorrect class topic were assumed to have not attended class and were also removed from the data set. Because some items contained statements about interactions with others, the surveys also included an item asking students whether they worked with a group or discussed with other students during class that day. Only students who indicated that they worked or discussed with other students were included in the final data set. Overall, 183 responses were collected for the DD activity day and 215 for the clicker day, corresponding to 69% and 81% response rates, respectively, based on the week-1 course enrollment of 266 students. After data cleaning, 149 student responses remained for the DD activity day and 136 for the clicker day. Item descriptive statistics are provided in Appendix 3.
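To make the cleaning procedure concrete, the sketch below shows how these steps might be scripted in R. This is not the authors’ code: the raw Qualtrics export is assumed to be a data frame `raw`, and all column names (`consent`, `student_id`, `attention_check`, `topic_check`, `discussed_with_others`) are hypothetical.

```r
# Hedged sketch of the data-cleaning steps described above; all column names are assumptions.
item_cols <- grep("^item", names(raw), value = TRUE)    # mASPECT item columns

clean <- subset(raw,
                consent == "Yes" &                       # (i) consented to participate
                attention_check == "Somewhat agree" &    # check item: select a specific response
                topic_check == "Correct topic" &         # topic check: attended class that day
                discussed_with_others == "Yes")          # worked/discussed with other students

clean <- clean[!duplicated(clean$student_id), ]          # (ii) drop duplicate submissions
clean <- clean[complete.cases(clean[, item_cols]), ]     # (iii) drop incomplete responses
```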

To gather evidence of internal structure validity, the survey data were analyzed using exploratory factor analysis (EFA), as EFA does not require an a priori structure to be specified. This allowed for the factor structure of both mASPECT versions to be explored. The number of factors used for the EFAs was selected based on results from both the Kaiser criterion and the scree test (16). These analyses were completed using the ‘stats’ package in R (version 3.6.2) and EFAs were completed with the ‘psych’ package (version 1.9.12.31) in R. All EFAs used principal axis factoring with a promax (oblique) rotation, as an oblique rotation method allows for correlation between the factors. Negatively worded items were reverse coded before EFAs were completed. The data were analyzed using an iterative process consisting of an EFA, removal of items that did not meet certain criteria, and then a subsequent EFA with the remaining items (17). Items were removed at each step if they had factor loadings of less than 0.4, cross-loaded on two or more factors, or loaded on factors that contained less than three items. For exploratory purposes, items with cross-loadings between 0.3 and 0.4 were flagged but not immediately removed. This process was repeated until all remaining items met the criteria and produced well-formed factors. All items included in the final EFAs had loadings of less than 0.35 on the nonprimary factors.
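To illustrate the iterative EFA procedure described above (not the authors’ exact script), the following R sketch uses the ‘psych’ package with principal axis factoring and a promax rotation; `clean` and `item_cols` are the assumed objects from the cleaning sketch, and the negatively worded item names are assumptions.

```r
# Hedged sketch of one EFA iteration on the cleaned responses.
library(psych)

responses <- clean[, item_cols]
neg_items <- c("item21", "item22", "item23", "item24", "item26")  # assumed negatively worded items
responses[neg_items] <- 7 - responses[neg_items]                  # reverse code on the 6-point scale

# Number of factors: Kaiser criterion (eigenvalues > 1) and scree test
eigenvalues <- eigen(cor(responses))$values
n_factors   <- sum(eigenvalues > 1)
scree(responses, factors = TRUE, pc = FALSE)

# Principal axis factoring with an oblique (promax) rotation
efa <- fa(responses, nfactors = n_factors, fm = "pa", rotate = "promax")
print(efa$loadings, cutoff = 0.40)

# Items loading below 0.4, cross-loading on two or more factors, or loading on a factor
# with fewer than three items would be dropped and the EFA rerun on the remaining items.
```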

Reliability evidence for the data collected was evaluated using the final factor structure found through EFA. Because EFA allows all items to load onto each factor, individual factor models were not evaluated; therefore, it is unknown whether the final factor models contained equal item loadings (i.e., a tau-equivalent model). Thus, the decision was made to estimate the single-administration reliability of each factor using omega instead of alpha, as the criteria for omega do not require equal item loadings or errors (18). Although there are no formal cutoffs for good single-administration reliability, values above 0.7 are generally considered acceptable.
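As an example, the omega estimate for a single retained factor could be computed with the ‘psych’ package roughly as follows; the item column names are hypothetical and continue the sketch above.

```r
# Hedged sketch: omega total as single-administration reliability for one retained factor.
library(psych)

pe_items <- c("item01", "item02", "item21", "item24", "item26", "item27")  # assumed 'personal effort' items
pe_omega <- omega(responses[, pe_items], nfactors = 1, plot = FALSE)
pe_omega$omega.tot   # values above ~0.7 are generally considered acceptable
```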

Student interviews

Because modifications were made to the original items and new items were also included in the mASPECT surveys, evidence of response process validity was gathered from students using cognitive interviews (19). At the end of the associated quantitative surveys, students were given the option to include their email address to indicate that they were interested in participating in a short in-person interview about the survey. After the survey closed, emails were sent to randomly selected students, and interviews were scheduled. Response process interview data were gathered separately for both types of active learning environments (i.e., DD activity day and clicker day). Four students participated in on-campus interviews about the survey items related to the DD activity day (mASPECT-DD), and eight students participated in interviews about the items related to the clicker day (mASPECT-C); all interviews were audio-recorded. During each interview, students were directed to read each item aloud, state which response they selected, and then explain their reasoning for choosing that response. When needed, students were asked follow-up questions to gain more details about their understanding of the items and/or their response reasoning. All students who participated in an interview were compensated with a $20 gift card.

The audio recording of each interview was initially analyzed by two researchers individually. Student responses to each item were recorded as either aligning with the intent of the item or flagged for possible confusion or irrelevance to the active learning environment. The two researchers then discussed the responses for each item and came to a consensus on which items seemed unclear to the students or were not relevant to the type of active learning environment. The student responses to these items were then provided to a third researcher, who similarly analyzed the items for clarity and relevance. The items that all three researchers agreed were unclear or irrelevant to the type of environment based on the student interviews were removed before quantitative analysis, and these interviews also provided insights into items that were not found to contribute to the final factor structure.

RESULTS

Evaluation of mASPECT-DD data

Through the interview results (n = 4), Item 10, “I am confident in my understanding of the material presented during today’s class,” was found to be irrelevant to this type of activity. When students were asked to explain their response to this item, they referred to the out-of-class assignment of finding articles to bring in rather than their confidence in what was learned during the activity itself. Additionally, Item 22, “I guessed or made stuff up so that I could finish today’s activity,” and Item 23, “I skipped or guessed on the hard parts of today’s activity,” were also found to be irrelevant to the students based on the structure of the DD activity, which required students to work together toward finding a solution to a ‘real-world’ problem that was intentionally nuanced, with no ‘right answers’. Thus, students said that there was no reason to guess and that there were no ‘hard parts’ to the activity. Two more items were also removed based on student interviews. Item 25, “I tried to relate today’s class to prior material from the course,” was removed because students were unable to consistently justify their responses, and Item 30, “The students I worked with had fun during today’s class,” was removed because students indicated they were unsure how to gauge how much fun other students had. These five items (Items 10, 22, 23, 25, and 30) were removed before quantitative analysis through EFA.

An iterative EFA process was used to determine which items created well-formed factors. A summary of the entire process, including the items that were removed at each step, is displayed in Fig. 2 (details provided in Appendix 4). The final EFA for the mASPECT-DD version consisted of 17 items, which were found to load onto three factors related to ‘personal effort’ (PE; 6 items), ‘value of environment’ (VE; 5 items), and ‘classroom support (instructors and LAs)’ (CS; 6 items) (Table 3). The descriptions given to these factors were based on their relation to the original ASPECT factors (10) and observed similarities of the included items. These three factors were found to explain 18% (PE), 22% (VE), and 16% (CS) of the variance in responses, for a total of 56%. The single-administration reliability coefficient, omega, was calculated for each of the three factors and found to be 0.85 (PE), 0.84 (VE), and 0.90 (CS), which suggested good reliability for each.

FIG 2. Summary of the analysis process for the mASPECT-DD survey. The final factors were ‘personal effort’ (PE), ‘value of environment’ (VE), and ‘classroom support’ (CS).

TABLE 3.

Factor loadings for the final 3-factor EFA structure for the mASPECT-DD survey given during a DD activity (n = 149)a

Survey item
Personal effort Value of environment Classroom support
1 I was focused during today's class. 0.69 0.14 0.07
2 I worked hard during today's class. 0.56 0.05 0.06
21 I did not make much of an effort during today's class. (rev) 0.97 0.32 0.03
24 I found it difficult to maintain my concentration during today's class. (rev) 0.52 0.20 −0.09
26 I was not very engaged in today's class. (rev) 0.71 0.02 −0.02
27 I was fully engaged in today's class. 0.67 0.15 −0.02
6 Working with other students during today's class contributed to my understanding of the material. 0.21 0.42 0.13
8 I had fun during today's class. 0.04 0.70 0.06
9 I would prefer to take a class that included today's activity over one that does not include it. −0.14 0.97 −0.17
11 Today's class increased my understanding of the material. −0.01 0.69 −0.03
12 Today's class stimulated my interest in the course material. 0.05 0.69 0.08
14A The Professor/Teaching Assistant put a good deal of effort into my learning for today's class. −0.17 0.20 0.73
14B The Learning Assistant put a good deal of effort into my learning for today's class. 0.09 0.07 0.68
15A The Professor/Teaching Assistant seemed prepared for today's class. 0.11 −0.07 0.73
15B The Learning Assistant seemed prepared for today's class. 0.06 −0.14 0.88
16A The Professor/Teaching Assistant was available to answer questions during today's class. −0.08 0.02 0.75
16B The Learning Assistant was available to answer questions during today's class. −0.04 −0.13 0.91
a Item loadings above 0.4 are bolded. Items that were reverse coded are marked with (rev).

Average scale scores were calculated using the final mASPECT-DD factor structure (Table 4). Because EFAs allow items to load on all factors, weighted means could not be calculated and, as such, the values presented assume each item contributed equally to the factor.
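A minimal sketch of this equally weighted scale-score calculation (one factor shown) follows; it reuses the hypothetical item names from the sketches above and assumes reverse-coded items have already been recoded.

```r
# Hedged sketch: unweighted average scale score for the 'personal effort' factor.
pe_items  <- c("item01", "item02", "item21", "item24", "item26", "item27")  # assumed column names
pe_scores <- rowMeans(responses[, pe_items])
round(c(mean = mean(pe_scores), sd = sd(pe_scores)), 2)  # reported in Table 4 as 4.64 (0.79)
```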

TABLE 4.

Average scale scores for mASPECT-DD factors (n = 149)a

Factor (items): Avg scale score (SD)
Personal effort (Items 1–2, 21, 24, 26–27): 4.64 (0.79)
Value of environment (Items 6, 8–9, 11–12): 4.26 (0.92)
Classroom support (Items 14A–16B): 5.15 (0.71)
a All item responses were collected on a six-point Likert-type scale from 1 (strongly disagree) to 6 (strongly agree).

Evaluation of mASPECT-C data

Data collected with the items administered during the clicker day were also analyzed using student interviews and EFAs. Response process interviews (n = 8) about the mASPECT-C items led to the removal of three items. Item 18, “I knew what I was expected to accomplish during today’s class” and Item 20, “I completed the prework for today’s class” were removed as students mentioned that these items did not relate to clicker days because their only expectation during class was to understand the material and there was no required “prework” to complete before attending the class that day. Additionally, Item 30, “The students I had discussions with had fun during today’s class” was removed as students indicated they were unsure of how to respond to this statement.

The remaining items were quantitatively analyzed with an iterative EFA process. A summary of the entire process, including the items that were removed at each step, is displayed in Fig. 3 (details provided in Appendix 4). The final EFA for the clicker day mASPECT-C items was found to contain 21 items with four factors related to ‘personal effort’ (PE; 5 items), ‘social influence’ (SI; 8 items), ‘value of environment’ (VE; 4 items), and ‘classroom support (instructors only)’ (CS; 4 items) (Table 5). The three factors of ‘personal effort’, ‘value of environment’, and ‘classroom support’ were similar to the factors found with mASPECT-DD and thus were named accordingly. The fourth factor was named ‘social influence’, as the included items appeared to be related to working with other students. The four factors were found to explain 17% (PE), 19% (SI), 13% (VE), and 7% (CS) of the variance, for a total of 55%. Omega was calculated for each of the four factors and found to be 0.85 (PE), 0.89 (SI), 0.81 (VE), and 0.81 (CS), which indicated good single-administration reliability.

FIG 3. Summary of the analysis process for the mASPECT-C survey. The final factors were ‘personal effort’ (PE), ‘social influence’ (SI), ‘value of environment’ (VE), and ‘classroom support’ (CS).

TABLE 5.

Factor loadings for the final EFA structure for the mASPECT-C (n = 136)a

Survey item
Personal effort Social influence Value of environment Classroom support
1 I was focused during today's class. 0.61 0.04 0.19 0.14
21 I did not make much of an effort during today's class. (rev) 0.67 0.14 −0.24 0.10
24 I found it difficult to maintain my concentration during today's class. (rev) 0.74 −0.13 0.32 −0.20
26 I was not very engaged in today's class. (rev) 0.78 −0.03 0.06 −0.06
27 I was fully engaged in today's class. 0.52 0.13 0.25 0.04
3 I made valuable contributions when having discussions with other students during today's class. −0.09 0.63 0.23 −0.08
4 Explaining the material to other students improved my understanding of it. −0.05 0.52 0.15 0.12
5 Having the material explained to me by other students improved my understanding of it. 0.04 0.76 −0.27 0.10
6 Discussion with other students during today's class contributed to my understanding of the material. 0.09 0.80 −0.15 0.04
7 The students I had discussions with made valuable contributions during today's class. −0.01 0.94 −0.08 −0.15
17 I felt comfortable having discussions with other students during today's class. −0.08 0.62 0.27 −0.16
28 The students I had discussions with were focused during today's class. 0.14 0.54 0.14 −0.01
29 The students I had discussions with worked hard during today's class. 0.05 0.75 −0.15 0.08
8 I had fun during today's class. 0.03 0.18 0.58 0.04
10 I am confident in my understanding of the material presented during today's class. 0.04 −0.11 0.73 −0.09
11 Today's class increased my understanding of the material. 0.05 −0.02 0.65 0.15
12 Today's class stimulated my interest in the course material. 0.10 −0.13 0.66 0.22
13 The Professor/Teaching Assistant's enthusiasm made me more interested in today's class. −0.12 0.09 0.28 0.60
14 The Professor/Teaching Assistant put a good deal of effort into my learning for today's class. −0.15 0.01 0.21 0.61
15 The Professor/Teaching Assistant seemed prepared for today's class. 0.12 −0.17 −0.11 0.86
16 The Professor/Teaching Assistant was available to answer questions during today's class. 0.00 0.08 −0.03 0.66
a Item loadings above 0.4 are bolded. Items that were reverse coded are marked with (rev).

Average scale scores were calculated for the mASPECT-C version using the final factor structure (Table 6). The values presented assume each item contributed equally to the factor, as EFAs allow all items to load on each factor.

TABLE 6.

Average scale scores for mASPECT-C factors (n = 136)a

Factor (items): Avg scale score (SD)
Personal effort (Items 1, 21, 24, 26–27): 4.64 (0.87)
Social influence (Items 3–7, 17, 28–29): 4.94 (0.70)
Value of environment (Items 8, 10–12): 4.75 (0.74)
Classroom support (Items 13–16): 5.34 (0.61)
a All item responses were collected on a six-point Likert-type scale from 1 (strongly disagree) to 6 (strongly agree).

DISCUSSION

Interview and EFA results provided evidence of response process and structural validity for the data collected with both mASPECT versions and resulted in well-formed factor structures.

Comparisons among the factor structures of mASPECT-DD and mASPECT-C

Although similarly worded items were used in both mASPECT versions, different factor structures were discovered for the two environments. A 3-factor solution was found to describe the DD activity day (mASPECT-DD) data, while a 4-factor solution described the clicker day (mASPECT-C) data (Table 7). The data from both active learning environments included factors related to ‘personal effort’, ‘value of environment’, and ‘classroom support’; however, these factors included different items in the different environments. Thus, although they could be considered similar constructs, they were not found to be identical. Additionally, a fourth factor related to ‘social influence’ was discovered for data collected in the clicker day environment with mASPECT-C. This factor was not found for data collected for the DD activity (mASPECT-DD), nor was it an original ASPECT factor (see Appendix 5). This result suggests that students’ perceptions of the clicker day environment included a social component, which may not have been an important factor in the DD activity environment. However, as open-ended student interviews asking about their general perceptions of the active learning environments were not conducted during this study, we cannot say that students did not find social influence to contribute to their perceptions of the DD activity, only that none of the included items were found to measure this perception.

TABLE 7.

Comparison of the final factor structures found for mASPECT-DD and mASPECT-C

Factors mASPECT-DD Item no. mASPECT-C Factors
Personal effort I was focused during today's class. 1 I was focused during today's class. Personal effort
I worked hard during today's class. 2 Removed
I did not make much of an effort during today's class. 21 I did not make much of an effort during today's class.
I found it difficult to maintain my concentration during today's class. 24 I found it difficult to maintain my concentration during today's class.
I was not very engaged in today's class. 26 I was not very engaged in today's class.
I was fully engaged in today's class. 27 I was fully engaged in today's class.
Removed 3 I made valuable contributions when having discussions with other students during today's class. Social influence
Value of environment Removed 4 Explaining the material to other students improved my understanding of it.
Removed 5 Having the material explained to me by other students improved my understanding of it.
Working with other students during today's class contributed to my understanding of the material. 6 Discussion with other students during today's class contributed to my understanding of the material.
Removed 7 The students I had discussions with made valuable contributions during today's class.
Removed 17 I felt comfortable having discussions with other students during today's class.
Removed 28 The students I had discussions with were focused during today's class.
Removed 29 The students I had discussions with worked hard during today's class.
I had fun during today's class. 8 I had fun during today's class. Value of environment
I would prefer to take a class that included today's activity over one that does not include it. 9 Removed
Removed 10 I am confident in my understanding of the material presented during today's class.
Today's class increased my understanding of the material. 11 Today's class increased my understanding of the material.
Today's class stimulated my interest in the course material. 12 Today's class stimulated my interest in the course material.
Classroom support (Instructors and LA) Removed 13A The Professor/Teaching Assistant's enthusiasm made me more interested in today's class. Classroom support (Instructors only)
Removed 13B not applicable
The Professor/Teaching Assistant put a good deal of effort into my learning for today's class. 14A The Professor/Teaching Assistant put a good deal of effort into my learning for today's class.
The Learning Assistant put a good deal of effort into my learning for today's class. 14B not applicable
The Professor/Teaching Assistant seemed prepared for today's class. 15A The Professor/Teaching Assistant seemed prepared for today's class.
The Learning Assistant seemed prepared for today's class. 15B not applicable
The Professor/Teaching Assistant was available to answer questions during today's class. 16A The Professor/Teaching Assistant was available to answer questions during today's class.
The Learning Assistant was available to answer questions during today's class. 16B not applicable

Student perceptions of the environments

Although the factor names for ‘personal effort’, ‘value of environment’, and ‘classroom support’ for mASPECT-DD and mASPECT-C are identical, because the factors contain different items, the final scale scores cannot be compared to each other. However, independently considering the scale scores from each environment can still provide insight into how students viewed the environments. For example, based on the average scale scores it appeared that students positively recognized the classroom support that was present during both the DD activity (Table 4) and the clicker day (Table 6). They also perceived their personal effort and the value of the environment to be fairly high for both types of environments, as all of the averaged scale scores were above 4 (i.e., somewhat agree). Within the clicker day environment, it appeared that students also perceived the social influence positively. These results suggest that the students thought fairly highly of both the DD activity and the clicker day learning environments, as measured by these factors.

Limitations

The relatively low rates of usable survey responses (∼50% of enrolled students after cleaning) were a limitation of this study. However, these percentages only represent the students who consented to be part of the study and do not include the students who accessed the surveys for extra credit only. Overall, 69 to 81% of enrolled students accessed the surveys; however, as these surveys were given as part of a research study, students could not be required to complete them. Additionally, student interviews only captured the perceptions of a small subset of the classroom population who self-selected to participate.

While the scale scores indicate that students generally perceived both environments positively, these results should be interpreted cautiously. Even with the well-formed factor structures found for both surveys, the amount of variance explained by each factor only ranged from 7 to 22%. This indicates that there could have been additional factors contributing to students’ perceptions of the environment that were not measured with this survey. Additionally, although the general descriptions given to the factors aligned with the original ASPECT factor descriptions and appear to describe the items that contributed to each factor, neither the original study (10) nor this study evaluated test content validity (11, 12) in relation to theoretical definitions of the different constructs. As such, these factors cannot be said to measure theoretically defined constructs of personal effort, the value of the environment, classroom support, or social influence.

Implications for research

Collecting data with the mASPECT may provide insight into students’ perceptions of in-class active learning environments, which could be an important contributor to the variation in student performance outcomes found in these environments. There are several opportunities for comparisons of students’ perceptions of personal effort, the value of the environment, classroom support, and social influence and how those might change based on the type of environment. However, as evidenced by the differences in factor structures between mASPECT-DD and mASPECT-C, these measures should not be used to directly compare results from different active learning environments unless evidence of validity has been gathered in each environment for data collected with the same version of the survey. Therefore, we encourage users of the mASPECT or ASPECT to continue to collect evidence of response process validity to ensure that the items on both measures make sense to students and are relevant for a given type of active learning environment. Although this could take the form of student interviews, a larger amount of response process data could alternatively be collected using open-ended written survey responses. Because active learning strategies can take many forms, response process data could be used to determine what students find important in different types of active learning environments and to ensure that these or related items are worded to appropriately capture those perceptions. Additionally, as Wiggins et al. (10) noted, an important potential use of the data collected by these scales is to better understand whether there are equitable outcomes and experiences across student and/or demographic groups in the same classroom. However, evidence of measurement invariance between different groups would first have to be evaluated (20).

Finally, although the mASPECT versions provide information on students’ perceptions of these active learning activities, the measures were not developed to directly align with theoretical definitions of student engagement. The ASPECT was developed as a measure of students’ perceived cognitive and emotional engagement during in-class active learning activities; however, the original authors note that the psychometric properties of the ASPECT were not evaluated with respect to the theoretical definition of engagement (10). To assess the extent to which the ASPECT or mASPECT measures represent engagement, evidence of test content validity aligned with a theoretical definition of engagement would have to be gathered and evaluated (11, 12). Alternatively, future studies could administer both a measure of engagement and the mASPECT or ASPECT to evaluate the overlap between constructs.

Implications for teaching

Instructors who want to learn more about how students’ perceptions differ across active learning environments could use the mASPECT measure to gather feedback about different active learning strategies. For example, the mASPECT could be used to gather predata and postdata that inform the instructor whether group-level dynamics improved after a certain strategy was implemented or adapted. As evidenced by the differences in factor structures between mASPECT-DD and mASPECT-C, the scale scores (i.e., item averages within a scale) from these measures should not be used to directly compare results from different active learning environments unless evidence of validity has been gathered in each environment for data collected with the same version of the survey. However, even if scale scores cannot be compared, instructors may still wish to use one or more of the individual mASPECT survey items as formative feedback for environments that are similar to the ones described in this study. For example, if an instructor implements a group-work-focused activity similar to DD or includes clicker questions in their course, they could collect feedback about students’ perceptions using common items from the mASPECT-DD and mASPECT-C, which could inform changes or modifications to the environment or facilitation of the activity.

Although the mASPECT versions provide information on students’ perceptions of these active learning activities, the measures were based on the original ASPECT items and thus do not directly align with theoretical definitions of student engagement (10). Therefore, if an instructor’s goal is to measure student engagement in the classroom, other measures may be better suited. For example, observational protocols have been developed to evaluate student engagement during class, such as the Behavioral Engagement Related to Instruction protocol (21) and the ICAP framework (22). Additionally, survey measures have been developed to assess different dimensions of student engagement in higher education STEM classrooms (23) and laboratories (24).

ACKNOWLEDGMENTS

We thank the students who participated in the surveys and the instructors for allowing data collection in their courses. We thank Eric Earle for assistance with the interview coding.

E.E.S. acknowledges partial support by an award to PSU under the Howard Hughes Medical Institute Science Education Program (award 52008105).

Footnotes

Supplemental material is available online only.

SUPPLEMENTAL FILE 1
Supplemental material (jmbe00193-21_Supplemental_File.pdf, PDF, 0.2 MB).

REFERENCES

1. President’s Council of Advisors on Science and Technology (PCAST). 2012. Engage to excel: producing one million additional college graduates with degrees in science, technology, engineering, and mathematics. Washington, DC.
2. National Research Council. 2012. Discipline-based education research: understanding and improving learning in undergraduate science and engineering. The National Academies Press, Washington, DC.
3. Freeman S, Eddy SL, McDonough M, Smith MK, Okoroafor N, Jordt H, Wenderoth MP. 2014. Active learning increases student performance in science, engineering, and mathematics. Proc Natl Acad Sci U S A 111:8410–8415. doi: 10.1073/pnas.1319030111.
4. Rahman T, Lewis SE. 2020. Evaluating the evidence base for evidence-based instructional practices in chemistry through meta-analysis. J Res Sci Teach 57:765–793. doi: 10.1002/tea.21610.
5. Eddy SL, Hogan KA. 2014. Getting under the hood: how and for whom does increasing course structure work? CBE Life Sci Educ 13:453–468. doi: 10.1187/cbe.14-03-0050.
6. Wiltbank LB, Williams KR, Marciniak L, Momsen JL. 2019. Contrasting cases: students’ experiences in an active-learning biology classroom. CBE Life Sci Educ 18:ar33. doi: 10.1187/cbe.19-01-0006.
7. Cavanagh AJ, Aragon OR, Chen X, Couch A, Durham F, Bobrownicki A, Hanauer DI, Graham MJ. 2016. Student buy-in to active learning in a college science course. CBE Life Sci Educ 15. doi: 10.1187/cbe.16-07-0212.
8. Cavanagh AJ, Chen X, Bathgate M, Frederick J, Hanauer DI, Graham MJ. 2018. Trust, growth mindset, and student commitment to active learning in a college science course. CBE Life Sci Educ 17. doi: 10.1187/cbe.17-06-0107.
9. Shortlidge EE, Rain-Griffith L, Shelby C, Shusterman GP, Barbera J. 2019. Despite similar perceptions and attitudes, postbaccalaureate students outperform in introductory biology and chemistry courses. CBE Life Sci Educ 18:ar3. doi: 10.1187/cbe.17-12-0289.
10. Wiggins BL, Eddy SL, Wener-Fligner L, Freisem K, Grunspan DZ, Theobald EJ, Timbrook J, Crowe AJ. 2017. ASPECT: a survey to assess student perspective of engagement in an active-learning classroom. CBE Life Sci Educ 16. doi: 10.1187/cbe.16-08-0244.
11. Knekta E, Runyon C, Eddy S. 2019. One size doesn’t fit all: using factor analysis to gather validity evidence when using surveys in your research. CBE Life Sci Educ 18:rm1. doi: 10.1187/cbe.18-04-0064.
12. Arjoon JA, Xu X, Lewis JE. 2013. Understanding the state of the art for measurement in chemistry education research: examining the psychometric evidence. J Chem Educ 90:536–545. doi: 10.1021/ed3002013.
13. Weasel LH, Finkel L. 2016. Deliberative pedagogy in a nonmajors biology course: active learning that promotes student engagement with science policy and research. J Coll Sci Teach 45. doi: 10.2505/4/jcst16_045_04_38.
14. Komperda R, Barbera J, Shortlidge EE, Shusterman GP. 2018. Connecting chemistry to community with deliberative democracy, p 81–98. In Citizens first! Democracy, social responsibility and chemistry. ACS Publications.
15. Rain-Griffith L, Sheghewi S, Shusterman GP, Barbera J, Shortlidge EE. 2020. Deliberative democracy: investigating the longitudinal impacts of democratic activities in introductory biology courses. Am Biol Teach 82:453–462. doi: 10.1525/abt.2020.82.7.453.
16. Brown TA. 2015. Confirmatory factor analysis for applied research. Guilford Publications.
17. Hancock GR, Mueller RO, Stapleton LM. 2010. Factor analysis: exploratory and confirmatory. In The reviewer’s guide to quantitative methods in the social sciences. Routledge.
18. Komperda R, Pentecost TC, Barbera J. 2018. Moving beyond Alpha: a primer on alternative sources of single-administration reliability evidence for quantitative chemistry education research. J Chem Educ 95:1477–1491. doi: 10.1021/acs.jchemed.8b00220.
19. Willis GB. 2005. Cognitive interviewing. SAGE Publications, Inc., Thousand Oaks, CA. doi: 10.4135/9781412983655.
20. Rocabado GA, Komperda R, Lewis JE, Barbera J. 2020. Addressing diversity and inclusion through group comparisons: a primer on measurement invariance testing. Chem Educ Res Pract 21:969–988. doi: 10.1039/D0RP00025F.
21. Lane ES, Harris SE. 2015. A new tool for measuring student behavioral engagement in large university classes. J Coll Sci Teach 44:83–91.
22. Chi MTH, Wylie R. 2014. The ICAP framework: linking cognitive engagement to active learning outcomes. Educ Psychol 49:219–243. doi: 10.1080/00461520.2014.965823.
23. Skinner E, Saxton E, Currie C, Shusterman G. 2017. A motivational account of the undergraduate experience in science: brief measures of students’ self-system appraisals, engagement in coursework, and identity as a scientist. Int J Sci Educ 39:2433–2459. doi: 10.1080/09500693.2017.1387946.
24. Smith KC, Alonso V. 2020. Measuring student engagement in the undergraduate general chemistry laboratory. Chem Educ Res Pract 21:399–411. doi: 10.1039/C8RP00167G.
