Abstract

Student mindset beliefs about the malleability of intelligence have been linked to student outcomes. However, recent meta-analyses have shown mixed findings on how student mindset impacts outcomes depending on the environment and context, such as the mindset that the instructor projects in the classroom. The current work utilizes Social Cognitive Theory to elucidate the relationships among student perceptions of faculty mindset, affective factors (belonging, self-efficacy, and utility value), and behavioral factors (course grade) using a Diversity, Equity, and Inclusion (DEI) lens within the chemistry context at a demographically diverse institution. Structural Equation Modeling (SEM) path analysis revealed that student perceptions of instructor mindset did not directly predict chemistry course grades. However, a significant indirect effect, mediated by students’ sense of academic misfit, was detected: the more students perceived instructors to endorse a fixed mindset, the more academic misfit they reported in their courses, which in turn led to lower chemistry grades. ACT math scores (indicators of prior preparation) unsurprisingly had significant direct and indirect effects on chemistry course grades. Additionally, multigroup moderation analysis revealed that the regression pathways did not differ by race, gender, or age group. While this work highlights the benefit of instructors promoting a healthy learning environment that projects a growth mindset to students, such efforts must be coupled with institutional support that helps build the foundational knowledge needed to prepare students for the rigor of chemistry courses and increase the chance of success for all students.
Keywords: First-Year Undergraduate/General, Second-Year Undergraduate, Chemical Education Research, Collaborative/Cooperative Learning, Student-Centered Learning, Minorities in Chemistry
The National Academies of Sciences, Engineering, and Medicine have called for improving science education for all individuals, regardless of whether they become scientists.1 Doing so helps foster and promote a healthy democracy and a workforce that increasingly relies on the skills developed in science, technology, engineering, and mathematics (STEM) courses.1 They emphasize the need for STEM educators who know the content and teach it in engaging and meaningful ways by adopting practices that “invite” students in rather than “weeding them out”. The instructor’s psychology2 has the potential to impact all students and has been tied to shaping and perpetuating educational inequality when it is perceived to be negative.3 Hence, this work examines how student perceptions of instructor psychology in chemistry courses influence students’ affective factors (attitudes and emotions) and how those factors impact performance in gateway chemistry courses, while assessing whether those pathways differ by demographic group. This work was conducted at a diverse [61% women; 59% nontraditional (over the age of 25); and 37% Historically Underserved Groups (HUGs: Black, Hispanic/Latina/o/e, and Multiple/Other Race)], moderately selective institution, supplementing similar studies conducted at selective, predominantly white institutions4,5 within the chemistry context. Other institutions with similar characteristics may apply the insights gained from this study to tailor professional development and student support services to better serve their student populations.
Social Cognitive Theory (SCT) was used as the theoretical framework to better inform this inquiry into the learning environment. SCT posits that student learning depends on the interplay between environmental, affective, and behavioral factors that reciprocally influence one another (Figure 1).6,7
Figure 1.
Social Cognitive Theory (SCT): Student learning depends on the interplay between environmental, affective, and behavioral factors that reciprocally influence one another.6,7
While humans are self-regulating organisms who have control over their lives, human agency is heavily entangled with and affected by what is modeled within their social structure.7 In the classroom, students look to the teacher to model learning techniques, which shapes student attitudes about the subject and their subsequent behaviors. Because student affective factors have been shown to predict course outcomes,8,9 this work examines how the classroom environment shapes those attitudes and assesses whether those pathways differ based on demographic factors (race, gender, age). The classroom environment in this work is defined as student perceptions of the faculty mindset: whether students perceive the faculty member to believe that all students can develop their intelligence (growth mindset) or that intelligence is an innate characteristic that cannot be developed (fixed mindset).10 SCT identifies self-efficacy, utility value, and social comparisons with peers (a dimension of belonging) as central components of student emotional affect; hence, they will be explored in this work.6 Lastly, the third component of the triad, behavior, is defined in this work as the collective set of actions (attending class, completing assignments, and taking exams/quizzes) that result in a course grade.
Variables
Self-identified Mindset
Mindset is a set of beliefs that human intelligence is either fixed (an innate characteristic that cannot be taught or developed) or malleable (can be enhanced through practice and diligence).11 Student mindset beliefs are correlated with learning and achievement, especially in STEM courses, where challenges are expected. Students with fixed mindsets tend to be performance-oriented (focused on the grade), avoid challenges, and show lower persistence.10 Students with a growth mindset are more mastery-oriented (focused on increasing competence), seek challenges, and show more persistence.10,11 Student mindset beliefs about their abilities have been shown to be just as influential in predicting student success as systemic inequities outside their control. In a national study conducted in Chile, high school students with a growth mindset showed academic achievement comparable to that of peers with higher socioeconomic status, suggesting that mindset beliefs may counteract external circumstances on the path toward academic success.12 Another study, conducted with US high school students, found that female students performed better in mathematics courses than their male counterparts when they endorsed a growth mindset.13
Fortunately, mindset beliefs are malleable and can be reshaped. Exposure to a growth mindset intervention propels adolescents toward a more positive trajectory of achieving better grades and reverses downward trajectories during sensitive academic transitional periods such as middle school.14 A national US study on adolescents found that a short 1 h growth mindset intervention increased grades for lower-performing students and increased enrollment in subsequent advanced mathematics courses.15 At the undergraduate level, African American students exposed to a growth mindset intervention showed increased academic engagement and performance compared to students not exposed to these interventions.16 In chemistry, such psychosocial interventions were found to eliminate racial achievement gaps in first-year courses after controlling for prior academic preparation.17
However, recent meta-analyses have shown heterogeneity in the effectiveness of mindset interventions depending on whom the interventions target, how they are delivered, and the mechanisms by which they work.18−21 Although growth mindset interventions are promising for improving student affect and persistence in STEM fields, their outcomes are sensitive to classroom cues. Student perceptions of the instructor’s beliefs about intelligence may be responsible for swinging the pendulum in either direction.22 Thus, understanding the instructor mindset (self-identified by faculty and perceived by students) as a facet of the learning environment is essential to consider before investing time and resources in student mindset interventions.
Instructor Mindset (Self-identified and Perceived)
The instructor is a significant part of the mindset context for students, since they are the most powerful person in the classroom.23 Student perceptions of faculty mindset impact students’ own mindsets, which in turn alter their outcomes.24 For example, in a national study of US high school students, mindset interventions improved math grades only when students were in a supportive learning environment where the instructor endorsed a growth mindset.25 In addition, students who perceived the instructor to have more of a fixed mindset experienced greater psychological vulnerability, which led to greater attrition in STEM courses.4 The positive effects of interventions that encourage adaptability and perseverance persist only when they are nurtured by an environment that permits those psychological affordances, which has been referred to as the “seed” (student traits) and “soil” (context) model.2 In line with SCT, cues from the instructor can provide evidence to confirm or reject newly discovered or pre-existing mindset beliefs.
Furthermore, instructor psychology about the nature of ability has also been linked to perpetuating the equity gap (disparities in outcomes based on demographic factors) in STEM education.3 Although both men and women experience a lower sense of belonging and more stereotype threat when they perceive the instructor to have a fixed mindset, these environmental cues negatively impact only women’s performance in STEM.5 Faculty who self-identified as endorsing a fixed mindset doubled the equity gap in STEM courses and increased negative affect.26 Hence, not only are all students negatively impacted when they perceive that faculty endorse a fixed mindset, but this perception can also exacerbate equity gaps for students from marginalized groups. Previous work has primarily been conducted within selective institutions whose student populations are less diverse than the general population.5,26 This work seeks to add to the literature by identifying how student perceptions of faculty mindset influence student outcomes within a moderately selective institution with a diverse student population. Findings can be extrapolated to similar institutions that serve an increasingly diverse student body. Student perception of faculty mindset was chosen (over faculty self-identified mindset) because these perceptions are students’ reality (even though they may be biased). In addition, determining how student perceptions of the instructor influence student affective factors (sense of belonging, utility value, self-efficacy) and course grades may uncover more specific approaches to increasing student success.
Sense of Belonging: Social and Academic Fit Scale (SAFS)
In addition to safety and security, a sense of belonging is a basic human need that precedes gaining esteem.27 In college, belonging is feeling “cared about, accepted, respected, and valued by, and important to the campus community.”
High attrition rates in STEM majors are related to a lower sense of belonging, which can be context-specific.28 Classroom-level belonging in STEM courses (as opposed to campus-level belonging) has been shown to be more closely related to behavioral and emotional engagement and is correlated with student success and achievement.29 Course-level belonging in general chemistry predicts attrition and performance for first-year undergraduates after controlling for demographics, student preparation, and participation in Supplemental Instruction (a form of peer mentoring).9 Additionally, the intersectionality of race and gender further compounds the decrease in students’ sense of belonging, with female students from HUGs being the least likely to feel they belong in STEM majors.29,30 When students perceive that the instructor endorses a fixed mindset (the learning environment), this triggers stereotype threat and a lower sense of belonging (affect).5 Because “belonging” is a broad term, this work focuses particularly on social and academic fit adapted to the chemistry classroom level.8 A sense of belonging can be social (fitting in with peers: “right now I feel like other students in this chemistry course accept me”) or academic (pertaining to content mastery relative to peers: “right now, I feel that I have less ability than others in this chemistry class”).8,31
A supportive learning environment has been shown to be associated with an increased sense of belonging, which has been found to be a precursor to student self-efficacy.32 Thus, based on these findings and grounded in SCT, self-efficacy is examined as another mediating affective variable.
Self-efficacy
While mindset pertains to the underlying beliefs and attitudes one holds about ability and the nature of intelligence or talent, self-efficacy refers to one’s belief in one’s ability to achieve specific goals. That is, self-efficacy is an individual’s confidence in performing a specific task despite external obstacles.3 A core part of SCT, self-efficacy predicts students’ effort and persistence in complex tasks (e.g., succeeding in chemistry courses). When students perform poorly on assignments, higher self-efficacy may push them to study harder to perform better on the next assignment, whereas lower self-efficacy may lead them to attribute the failure to an inability to learn the material, resulting in disengagement and poor performance.33 STEM self-efficacy correlates with student STEM grades even when controlling for prior preparation.34 Although “confidence gaps” (low self-efficacy despite high performance) have been identified among women and minorities in STEM,35 mixed results suggest that the learning environment and context should be examined more thoroughly to fill current gaps in the literature.34 For example, instructor mindset has been shown to predict students’ self-efficacy in one study,24 and self-efficacy has been shown to influence student STEM outcomes in another.34 However, we are unaware of work that has connected perceptions of faculty mindset to STEM course outcomes using self-efficacy as a mediating variable within the same study population. This work provides a holistic view of how self-efficacy impacts student success in chemistry courses as a function of the learning environment. Lastly, SCT identifies task utility as an essential aspect of student motivation, which is explored next.
Utility Value
Expectancy Value Theory has identified four domains of task value: (1) finding joy in learning the subject (intrinsic value), (2) perceiving the task as relevant to achieving future goals (utility value), (3) perceiving the task as enhancing the sense of self (attainment value), and (4) perceptions of the cost of engaging in the task (cost value).36 Utility value may be the most relevant of the four domains in this context, as students are primarily nonchemistry majors taking the course as a means to a future goal (e.g., professional/graduate school). Utility value interventions in chemistry courses improved student achievement and raised the emotional satisfaction of students who started the course with lower attitudes.37 Hence, students who perceive their instructors to have a fixed mindset may not see as much value in the course, especially if they are not chemistry majors.
Performance (Course Grade)
In this work, student behavior (the third component of the SCT triad) is defined as the collective set of actions (attending class, completing assignments, and exams/quizzes) that result in a course grade. This is a limited definition of behavior because course grades depend on other factors, besides student behavior in the classroom, such as life circumstances, mental and physical well-being, and prior preparation. For the purpose of this study, the definition is operationalized to focus on the course grades after controlling for prior preparation and will be used as the primary outcome variable.
ACT Math Score
ACT math scores are consistent grade predictors for many STEM courses, including chemistry,38 and were used as a control variable for prior preparation. This allows for modeling the impact of the exogenous variable in question (perceptions of instructor mindset) on the outcome variable (grades) irrespective of prior preparation. If a student did not have an ACT math score, their SAT, Accuplacer, or Accuplacer Next Generation scores were converted to ACT math scores using concordance tables.39
Purpose of This Study
This study aims to address literature gaps and recommendations from recent meta-analyses on the impact of mindset context on student outcomes by
(1) extending research beyond selective institutions to a diverse setting within the chemistry context;
(2) using factor analyses to assess the validity of the data in the current setting within an institution with a diverse student population;
(3) using Structural Equation Modeling (SEM) to assess multiple variables under a unified theoretical framework (SCT); and
(4) testing for demographic differences in pathways via measurement invariance and multigroup moderation analysis to tailor approaches that increase success for all students.
Using SEM, the proposed path model was tested based on previous work in SCT32,40 to address the research questions below.
(1) How do student perceptions of faculty mindset influence their course grade, as mediated by affective factors (sense of belonging, self-efficacy, utility value), while controlling for students’ own mindset and prior preparation (ACT math score)?
(2) How are these pathways moderated by demographic factors (race, gender, age)?
Methods
Research Setting
Student data was collected from a chemistry program at a public, moderately selective metropolitan research institution in the southern United States (∼7,000 undergraduate students). The Institutional Review Board (IRB) reviewed and approved this protocol.
Five chemistry instructors teaching six gateway chemistry courses (first- and second-year courses determining continuation in a major or entry into other programs) were recruited to participate in the study. All five instructors consented to the survey: three were men from non-HUGs, and two were women from HUGs. Two of the five instructors are investigators in this research project (first and last authors). To minimize conflict of interest and biases in grading, only an undergraduate research assistant had access to student data, which became accessible to the researchers and instructors of record after grades had been submitted and the grade appeal period had passed.
Instructors consented to allow researchers to collect student survey data (consent and questionnaires) for the fall and spring semesters during the COVID-19 pandemic. The courses surveyed ranged from remedial chemistry courses for students with no chemistry background to Organic Chemistry 2 (details in SI).
Students with participating instructors were recruited to answer questionnaires (N = 372; fall = 190, spring = 182). Duplicate responses were removed, yielding a final sample of 361 student responses distributed among six courses, with response rates ranging from 55% to 91% and from 27 to 119 participants per instructor (details in SI).
Student Demographics
Consenting students were asked to identify their demographic factors including race, gender, and age. The multiselect race/ethnicity categories consisted of African American/Black, American Indian/Alaska Native, Asian, Hispanic/Latina/o/e, Native Hawaiian/Pacific Islander, White, other, and prefer not to answer. Although the authors recognize that Hispanic refers to an ethnicity of individuals who trace their origins to Spanish-speaking countries and Latina/o/e as a geographic identifier of individuals from Latin America, the intention was to provide an inclusive term that could capture the experience of a group of people who have been underserved in US society. For example, a student with Brazilian heritage may identify as Latina/o/e but not Hispanic, whereas a student with Spanish heritage may identify as Hispanic but not Latina/o/e. However, both students were categorized as HUGs due to disparities in the representation of this group of students in STEM fields in the US.1
Students who identified as White and/or Asian were categorized as non-HUGs.41,42 Students who identified as any of the remaining categories (or at least one of those categories for multiple races) were categorized as HUGs. The gender categories included men, women, nonbinary/third gender, other, and “prefer not to answer.” Because fewer than five students identified as nonbinary/third gender or other, they were removed from the gender analysis to protect student identities. Lastly, students were asked to identify their age, condensed as traditional (18–22) and nontraditional (over 22). Students who preferred not to answer were treated as missing data for any categories in which they did not provide an answer. Thus, the final sample population consisted of 36.7% students from HUGs, 65.7% women, and 32.5% nontraditional-age students.
Instruments
The student questionnaire measured five latent variables (33 items) adapted from previous work, probing student mindset (SSM, 2 items),4 perception of instructor mindset (PIM, 6 items),4 self-efficacy (SSE, 8 items),43 perceptions of course utility value (CUV, 5 items),44 and the Social and Academic Fit Scale (SAFS, 12 items),8 along with self-identified demographic factors (race/ethnicity, gender, age). The text of some items was modified to reflect the chemistry context. For example, the social fit item, “Right now, I feel like the students in the computer science department are a lot like me,” was modified to “Right now, I feel like the students in this chemistry class are a lot like me.” The PIM items were used in their original form to focus students’ attention on the chemistry instructor, the focal point of interest for the study: “The Professor in this class seems to believe that students can learn new things, but they can’t really change their basic intelligence”5 (see SI for the full questionnaire).
The research team administered the questionnaires during the first or last ten minutes of the instructors’ class time in the 12th or 13th week of a 14-week semester. The research team distributed or projected a QR code for students to scan on their mobile devices, which took them directly to the Google Form containing the questionnaire. Students were asked to rate how much they agreed with each statement on a Likert scale (1 = strongly disagree, 2 = disagree, 3 = somewhat disagree, 4 = somewhat agree, 5 = agree, 6 = strongly agree; “prefer not to answer” was also available).
Method of Analysis
To answer the research questions above, a four-step approach to building the SEM path analysis was utilized, summarized in Figure 3.45
Figure 3.
Method of Analysis: The study sample (N = 361) was randomly divided in half. Step (1): An EFA was conducted on the first half of the sample (N = 179) to determine the number of latent variables. Step (2): A CFA was conducted on the other half of the sample (N = 182) to confirm the structure determined by the EFA analysis, followed by measurement invariance testing to assess if factors have similar meanings for different groups (race, gender, age). Step (3): An SEM path analysis was built with all 361 participants to establish relationships among the latent variables and test for direct and indirect effects. Step (4): Multigroup moderation testing was conducted to determine if the pathways differ based on demographic group (race, gender, age group).
Because some instrument scales were modified for the chemistry context and applied to a new setting, the sample was randomly divided into two groups (using a random number generator with “1” being EFA and “2” being CFA) to conduct a factor analysis. Step (1): An exploratory factor analysis (EFA) was conducted on one set (N = 179) to determine the number of latent variables using oblique factor rotation, since the factors are expected to be related. Step (2): The factor structure uncovered in the EFA was then tested with the other half of the data set (N = 182) using confirmatory factor analysis (CFA). Measurement invariance testing was conducted to assess whether the latent variables mean similar things for different demographic groups. If this does not hold, claims about differences among demographic groups are unreliable, because students may interpret the questionnaire differently. This was followed by Step (3): building a Structural Equation Model (SEM) pathway with all 361 students that specified how the latent variables are related (based on SCT and previous work;33 Figure 2) and testing for direct and indirect effects. Lastly, in Step (4), a multigroup moderation analysis was conducted by sequentially fixing and freeing parameters across a series of nested models to determine whether the pathways differ based on demographic group.
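The random split into EFA and CFA subsamples can be sketched as follows. This is an illustrative simplification: the seed and the ID scheme are hypothetical, and only the per-response random draw (1 = EFA, 2 = CFA) mirrors the procedure described above.

```python
import random

random.seed(2021)  # hypothetical seed; the original seed is not reported
responses = list(range(361))  # placeholder IDs for the 361 student responses

# Assign each response independently to the EFA (1) or CFA (2) subsample,
# which yields an approximately (not exactly) even split, as in the study.
subsample = {r: random.choice([1, 2]) for r in responses}
efa_set = [r for r, g in subsample.items() if g == 1]
cfa_set = [r for r, g in subsample.items() if g == 2]
print(len(efa_set), len(cfa_set))
```

Because each response is assigned independently, the two halves need not be exactly equal, consistent with the reported 179/182 split.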
Figure 2.
Proposed path model built upon SCT and previous work:32,40 SSM was covaried with PIM (no directionality is assumed) to isolate the effect of PIM on mediating and outcome variables. ACT math scores were used to control for prior preparation on course grades.
Results and Discussion
Steps 1 and 2: Exploratory and Confirmatory Factor Analysis (EFA/CFA)
An Exploratory Factor Analysis (EFA) and a Confirmatory Factor Analysis (CFA) were conducted to determine the factor structure and validity of the data set (details in the SI). The EFA was conducted on a randomly selected half of the data set (N = 179) using oblique factor rotations. This resulted in a 5-factor model that describes 72% of the variance, with all eigenvalues above one. All standardized factor loadings were above the 0.60 threshold of acceptability, with the lowest communality value being 0.48. Eight of the twelve SAFS items were removed because they loaded on the self-efficacy scale. Hence, the four remaining items, all negatively worded, were identified as students’ sense of academic misfit with peers (AMF), to remain consistent with the original source that measured academic fit.8 The negative wording, such as “Right now, I feel like other students understand more than I do about what’s going on in this chemistry course,” may explain why these items loaded on a separate factor. However, qualitative response process interviews should be conducted for a more extensive validation of the inferences made from the instrument. Lastly, one item was removed from the PIM factor due to poor loading, leaving five items on that factor.
A subsequent confirmatory factor analysis (CFA) was conducted using the 5-factor model generated from the EFA on the other half of the data set (N = 182), using full information maximum likelihood (FIML) for missing data and the robust maximum likelihood (MLR) estimator (the data were not normally distributed; details in SI). Three of the four model fit indices did not meet the thresholds of acceptability (CFI/TLI > 0.95, RMSEA < 0.06, and SRMR < 0.06): robust CFI/TLI = 0.948/0.940, robust RMSEA = 0.064 (90% CI: 0.053–0.075), SRMR = 0.051. Because the robust CFI/TLI and RMSEA were close to the thresholds of acceptability, and for the sake of maximizing sample size, the entire sample (N = 361) was used to test the model, which resulted in excellent model fit (robust CFI/TLI = 0.968/0.963, robust RMSEA = 0.051 (90% CI: 0.043–0.059), SRMR = 0.045), with all four indices meeting the thresholds of acceptability. Configural, metric, and scalar invariance held for all demographic groups (race, gender, age), indicating that students from different demographic groups interpreted the questions similarly (details in SI). Lastly, reliability measures (Cronbach’s alpha) for each factor indicated high internal consistency among items, ranging from 0.83 to 0.96.
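For readers unfamiliar with the reliability measure reported above, Cronbach’s alpha compares the sum of the item variances to the variance of the summed scale. A minimal sketch of the computation (the toy data are invented for illustration, not drawn from the study):

```python
import numpy as np

def cronbach_alpha(items):
    """Cronbach's alpha for a (respondents x items) matrix of Likert scores."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]                          # number of items in the scale
    item_vars = items.var(axis=0, ddof=1)       # per-item sample variances
    total_var = items.sum(axis=1).var(ddof=1)   # variance of the summed scale
    return k / (k - 1) * (1 - item_vars.sum() / total_var)

# Toy check: three respondents answering two perfectly consistent items
print(cronbach_alpha([[2, 2], [4, 4], [6, 6]]))  # → 1.0
```

Perfectly consistent items give alpha = 1; values of 0.83–0.96, as reported here, indicate that items within each factor covary strongly.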
Overview of Data
The PIM and SSM items were reverse coded so that the highest value (6) is associated with a growth mindset and the lowest value (1) with a fixed mindset (e.g., 5 → 2, 1 → 6). High values for SSE and CUV indicate greater self-efficacy and course utility value. Original scoring was retained for AMF, so a higher value (6) means more academic misfit (negative) and a lower value (1) means less academic misfit (positive).
Table 1 summarizes descriptive statistics for the variables of interest in the final data set, providing an initial picture of the data and the reasoning behind the statistical techniques employed in the analysis.
Table 1. Descriptive Statistics of Variables.
| Variable | N | Median (IQR)a | Skew | Kurtosis | SWN Testb |
|---|---|---|---|---|---|
| PIM | 348 | 5.00 (1.5) | –1.14 | 0.96 | 0.88**** |
| CUV | 346 | 5.00 (1.6) | –1.18 | 1.24 | 0.88**** |
| SSE | 346 | 4.12 (1.6) | –0.34 | –0.30 | 0.97**** |
| AMF | 345 | 3.00 (1.8) | 0.28 | –0.60 | 0.98**** |
| SSM | 345 | 5.00 (2.0) | –0.90 | 0.05 | 0.87**** |
| Course grade | 329 | 76.0 (17.1) | –0.67 | 0.34 | 0.97**** |
| ACT math | 198 | 23 (7.0) | –0.02 | –0.60 | 0.97**** |
aThe median and IQR reported here (with missing data removed) do not account for the errors associated with latent variables and are meant as descriptive statistics to be read for general trends rather than for drawing conclusions.
bShapiro-Wilk Normality (SWN) Test; ****, p < 0.001 indicates a non-normal distribution.
Generally, students perceived the faculty to have more of a growth mindset (median = 5.00, IQR = 1.5) and perceived their course to have high utility value (median = 5.00, IQR = 1.6). Students also self-reported endorsing more of a growth mindset themselves (median = 5.00, IQR = 2.0) and leaned positive in their self-efficacy (median = 4.12, IQR = 1.6) and academic misfit (median = 3.00, IQR = 1.8; a lower score indicates a more positive outcome, i.e., less academic misfit). All variables showed non-normal distributions according to the Shapiro-Wilk Normality (SWN) test and data visualization techniques (quantile-quantile (QQ) plots and density plots). There were no indications of multicollinearity (details in SI). Additionally, no interaction effect was detected between course type and instructor, suggesting that instructors were generally perceived to have a similar mindset regardless of which course they taught (details in SI).
Step 3: Structural Equation Model (SEM) Path Analysis and Mediation Analysis
Once validity (EFA/CFA, measurement invariance testing) and reliability (Cronbach’s alpha) were established for the data set, the SEM path model was built to answer the first research question: How do student perceptions of faculty mindset influence their chemistry grade as mediated by affective factors (academic misfit, self-efficacy, utility value) while controlling for students’ own mindset and prior preparation (ACT math score)?
The SEM Path Model generated based on the proposed model in Figure 2 did not show a significant effect of perceived instructor mindset on course grades (β = 0.02, p = 0.70, details in SI). Regarding mediating variables, only academic misfit significantly impacted course grades (β = −0.50, p < 0.001). In contrast, self-efficacy and course utility value were poor predictors of course grade (β = 0.08 (p = 0.36), β = −0.001 (p = 0.99), respectively; see SI for detailed model).
Hence, a more parsimonious model was generated where self-efficacy and utility value were removed (perceived instructor mindset was retained because it is the primary variable of interest). This resulted in poor model fit (robust CFI/TLI = 0.94/0.92, robust RMSEA = 0.080 (0.065–0.095), SRMR = 0.10), but modification indices suggested setting ACT math scores as a predictor of academic misfit would improve model fit. Since this was also theoretically justified (it makes sense that prior preparation may influence student sense of academic misfit), the final model (Figure 4) did indeed show a better model fit, meeting all thresholds of acceptability (Robust CFI/TLI = 0.96/0.95, Robust RMSEA = 0.058 (0.045–0.070), SRMR = 0.074).
Figure 4.
SEM Path Model (N = 356, 5 of the previously discussed 361 cases could not be used due to excessive missing data): The dashed arrows represent insignificant pathways, and the solid lines represent significant pathways. SSM was covaried with PIM to distill the impact of PIM on student outcomes. Course grade and academic misfit were controlled for prior preparation (ACT math score) to isolate the effect of variables in the model on student outcomes.
Student perceptions of their instructors’ mindset did not directly predict the distal outcome of course grades in chemistry (β = 0.05, p = 0.31) but did significantly predict the more proximal outcome of academic misfit (β = −0.32, p < 0.0001). ACT math scores, however, significantly predicted both academic misfit (β = −0.39, p < 0.0001) and course grade (β = 0.32, p < 0.0001), indicating that students’ prior preparation played a substantial and direct role in their affect and performance in chemistry courses.
While PIM → AMF (β = −0.32, p < 0.0001) and AMF → CG (β = −0.47, p < 0.0001) were found to be significant, it does not necessarily follow that the indirect path PIM → AMF → CG is significant. Direct effects, such as PIM to CG, are assessed by estimating their coefficients directly from the model. Indirect effects, such as PIM → AMF → CG, involve an intermediary variable (AMF) and are tested by examining the product of the regression coefficients involved. Bootstrapping involves resampling the data with replacement to generate numerous samples (e.g., 10,000 iterations) from which direct and indirect effects are estimated. This process provides more accurate confidence intervals for assessing the significance of the effects in the SEM model (Table 2).
Table 2. Direct and Indirect Effects of PIM on Course Gradeᵃ

| Pathway | Estimateᵇ | Boot SEᶜ | Boot LLCIᵈ | Boot ULCIᵉ |
|---|---|---|---|---|
| Direct: PIM → CG | 0.54 | 0.64 | –0.71 | 1.82 |
| **Indirect Effect: PIM → AMF → CG** | **1.88** | **0.57** | **0.86** | **3.09** |
| **Total Indirect/Direct Effects** | **2.42** | **0.75** | **0.95** | **3.92** |

ᵃ Bolded pathways represent significant effects.
ᵇ Direct and indirect pathways: bootstrap sample size = 10,000; unstandardized estimates.
ᶜ Boot SE = bootstrapped standard error.
ᵈ Boot LLCI = lower limit of the bootstrapped 95% confidence interval.
ᵉ Boot ULCI = upper limit of the bootstrapped 95% confidence interval.
While findings indicated no direct effect of student perceptions of instructor mindset on course grades, significant indirect effects were found. The more students perceived the instructor to have a fixed mindset, the more academic misfit they reported in the course and the lower their course grades (Table 2). The total direct/indirect effect was also significant, indicating that student perceptions of instructor mindset impact grades through the combination of direct and indirect pathways. Hence, the evidence supports that students’ perception of the learning environment influences their affect, which in turn influences their performance in chemistry courses. It then becomes prudent to assess whether this holds for all demographic groups, especially if the aim is to determine how to serve those who have been marginalized by societal inequities.
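The percentile-bootstrap logic behind Table 2 can be illustrated with ordinary regressions on synthetic data. This is only a sketch: the paper's estimates came from the full SEM model, and the variables and effect sizes below are hypothetical:

```python
import numpy as np

def boot_indirect(x, m, y, n_boot=10_000, seed=0):
    """Percentile-bootstrap CI for the indirect effect x -> m -> y (a*b)."""
    rng = np.random.default_rng(seed)
    n, est = len(x), []
    for _ in range(n_boot):
        idx = rng.integers(0, n, n)                  # resample with replacement
        xb, mb, yb = x[idx], m[idx], y[idx]
        a = np.polyfit(xb, mb, 1)[0]                 # path x -> m
        X = np.column_stack([np.ones(n), xb, mb])
        b = np.linalg.lstsq(X, yb, rcond=None)[0][2] # path m -> y, controlling x
        est.append(a * b)
    lo, hi = np.percentile(est, [2.5, 97.5])
    return float(np.mean(est)), float(lo), float(hi)

# Synthetic mediated data: x acts on y only through m
# (e.g., fixed-mindset perception raises misfit; misfit lowers grade)
rng = np.random.default_rng(2)
x = rng.normal(0, 1, 250)
m = -0.4 * x + rng.normal(0, 1, 250)
y = -0.5 * m + rng.normal(0, 1, 250)
ab, lo, hi = boot_indirect(x, m, y, n_boot=2000)
print(f"indirect = {ab:.2f}, 95% CI [{lo:.2f}, {hi:.2f}]")
# A CI excluding zero indicates a significant indirect effect
```

When the bootstrapped confidence interval excludes zero, as for the indirect and total effects in Table 2, the mediated effect is deemed significant.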
Step 4: Multigroup Modeling Moderation Analysis Based on Student Demographics
Using Multigroup Modeling, a moderation analysis was conducted to answer research question #2: How are these pathways moderated by demographic factors (race, gender, and age)?
Moderation analyses generally test whether the relationship between two variables (e.g., PIM and CG) depends on a third variable (e.g., race). A multigroup moderation model applies the same principle to the entire SEM model to assess whether all regression coefficients are the same or different across groups. Although it is possible to fit a separate model for each subgroup (e.g., HUGs and non-HUGs), this would not allow an easy comparison of which pathways differ, an important aspect of the research question. Thus, multigroup moderation is conducted through a series of nested models in which the regressions and intercepts are constrained to be equal across groups (e.g., HUGs and non-HUGs) and compared to the unconstrained model, in which all parameters are freely estimated.43
The chi-squared difference test was used to assess whether there was a significant difference between the free and the constrained model for each demographic category. The chi-square difference test is preferred over the Wald test (a statistical test used to assess the significance of individual parameters in a regression or structural equation model) for latent parameter testing due to its resilience to model identification issues.46 While sample size remains a concern, particularly with larger data sets (>500 observations), the risk of statistical overestimation is minimal given the sample size for this study (N = 356).47 Results suggested that the regression coefficients did not differ based on race (Δχ² = 10.53, df = 5, p = 0.062), gender (Δχ² = 2.11, df = 5, p = 0.83), or age (Δχ² = 9.38, df = 5, p = 0.095). It is therefore valid to interpret the regression coefficients as equal across race, gender, and age groups and to aggregate the data in a single model.
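Given a reported Δχ² and its degrees of freedom, the corresponding p-value follows directly from the chi-square survival function. A minimal check using the race comparison reported above:

```python
from scipy.stats import chi2

def chi2_diff_p(delta_chisq: float, delta_df: int) -> float:
    """p-value for a chi-square difference (likelihood-ratio) test
    between nested constrained and unconstrained models."""
    return chi2.sf(delta_chisq, delta_df)

# Race comparison reported in the text: delta chi-square = 10.53, df = 5
p_race = chi2_diff_p(10.53, 5)
print(round(p_race, 3))  # approximately 0.062, matching the reported value
```

A p-value above 0.05 means the equality constraints do not significantly worsen fit, so the pathways can be treated as equivalent across groups.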
Conclusions and Implications
SCT posits that the learning environment reciprocally impacts student affect (emotions and feelings) and behavior (performance). Previous work has shown that students’ perceptions of the instructor mindset (the learning environment) trigger stereotype threat that undermines women’s performance in STEM, and that faculty who endorse fixed mindsets have double the racial achievement gaps in their STEM courses.5,26 While these studies are insightful, they were conducted at predominantly White, selective institutions within the general STEM context. This work supplements the literature by investigating the role of affective factors (academic misfit, self-efficacy, and utility value) as mediating variables between the learning environment (student perceptions of instructors’ mindset) and performance (course grade) while moderating for demographic factors. Specifically, this study homes in on the chemistry context within a diverse institution while using advanced statistical techniques to disentangle affective factors and their contributions to the overall narrative of student success, especially for students from marginalized groups.
Findings revealed a possible confounding of self-efficacy and student sense of social and academic fit in the sample population. Statements such as “Right now, I feel I belong in chemistry class” or “Right now, I feel like I fit in well in this chemistry class” may not necessarily be measures of belonging (feeling valued, heard, and represented in a space). The EFA/CFA analysis from this study revealed that such items are more correlated with student self-efficacy (feeling they can do well in chemistry), suggesting that self-efficacy may be a dimension of belonging in this setting. This is consistent with recent work that highlights situational value as a dimension of belonging associated with the feelings students attribute to the learning environment based on their classroom experiences.48 In particular, students may feel confident in their abilities to learn chemistry (self-efficacy, “I am good at chemistry; hence, I belong in the course”).48 This may explain why some of the items on the SAFS overlap with the self-efficacy scale, indicating that self-efficacy may fall under the broader umbrella of belonging in the form of a situational value specific to the context. Although EFA/CFA analyses are one step toward this validation process, student response process interviews may contribute to a more nuanced understanding of how students interpret these items. This is important when policy/program decisions are based on these conclusions, especially concerning students from HUGs and other marginalized populations in STEM. Thus, instruments that better measure the dimensions of a sense of belonging48 should be developed and tested so more robust conclusions can be drawn about how belonging influences student outcomes. For this study, only four items from the SAFS questionnaire were utilized, and they are referred to as “academic misfit” to remain consistent with the original source.8
Second, there were no direct effects between student perception of the instructor’s mindset and course grade when controlling for students’ mindset and prior preparation in the SEM path models. Although this contradicts the findings in prior work (which also controlled for prior preparation),4,5 this study was completed in a different setting and may be more applicable to less selective institutions with a diverse student population, where students experience many more challenges outside the classroom that may impact their course grades. Even though there was no direct effect of perceived instructor mindset on the more distal outcome (course grade), a significant indirect effect was mediated by students’ report of academic misfit in their chemistry courses (Table 2). In other words, when students perceived the instructor as having a fixed mindset, this increased their reported sense of academic misfit, which was associated with lower course grades. Hence, professional development programs that shift faculty toward a more growth mindset and support faculty adoption of techniques to reflect their endorsement of a growth mindset may improve student outcomes.
Interestingly, when students perceived the instructor to have a growth mindset, they reported greater utility value for the course, but greater utility value did not necessarily lead to higher course grades. Hence, student perceptions of instructors’ mindset are related to the value students see in their chemistry courses (for their careers and lives), but that value does not translate into better grades in this context. However, it is unclear from this study whether mindset is the driving force in the observed relationship. Perhaps faculty who project a growth mindset also use more student-centered teaching practices (e.g., project-based instruction) that make their courses more relevant to students. Future work should focus on parsing the impact of teaching practices on student perceptions of their instructor's mindset and the downstream impact on student outcomes.
Student perception of the instructor mindset neither predicted their self-efficacy nor did self-efficacy predict their grades in the SEM path model, which contradicts previous findings. While self-efficacy has been linked to student success, a stronger sense of self-efficacy may not translate into better performance if students lack proper prior preparation. For example, students’ ACT math scores were significant predictors of their course grades. If a student has high perceived self-efficacy but limited preparation, overcoming foundational learning gaps perpetuated by systemic inequities may be more difficult. While instructors can support students by promoting a healthy learning environment, this must be coupled with foundational support from the institution, such as summer bridge programs, recitation programs, and/or Learning Assistant programs that help students build the foundational skills and knowledge needed for the rigor of chemistry courses.49 Through these mechanisms, there is an enrichment of the “soil” that supports students’ affective development as well as their content knowledge, more effectively nourishing the “seed”.2
Lastly, the regression pathways in the simplified model, where nonsignificant pathways were removed (CUV and SSE, Figure 4), did not differ based on race, gender, or age. Thus, no moderation effect was observed for this data set based on demographic factors. Although this is contrary to previous work,5,26 this work was done in a novel context. The lack of demographic differences suggests that students’ reported experience at a moderately selective institution with a diverse student population is similar across race, gender, and age (for the variables measured in this study). This may be due to greater parity of representation in chemistry courses in this setting, where students readily see others who look like them, which is not typically the case at predominantly White, traditional institutions. At such diverse institutions, foundational cognitive barriers (student preparation) may become the primary and direct barriers to student performance in chemistry, with affective barriers taking on a secondary and indirect role in student success. Thus, institutions with similarly diverse student bodies (in demographics and prior preparation, representing a substantial proportion of US colleges) may be well positioned to create support systems (remediation, summer bridge programs, peer tutoring support) that aid the instructor in providing a healthy learning environment for their students.
Limitations and Future Directions
Faculty and student participation in these studies was voluntary, possibly leading to selection bias. Student questionnaires were collected at the end of the semester, after the drop date had passed, compounding this bias: students who dropped or disengaged from the course are not represented in the study sample, limiting the interpretation of these models.
While these SEM models were grounded in prior literature that has established a causal connection between perceptions of instructor mindset, student affect, and performance in STEM,4 this field study acknowledges that other factors may have influenced student perceptions of their instructors’ mindset. Previous studies manipulated student perceptions of the instructor mindset within a laboratory setting, whereas this study was conducted in chemistry courses in their native environment. The stress students experience toward the end of the semester could have influenced their perceptions of the learning environment and their affective factors. In most cases, students also have an idea of their course grade toward the end of the semester, which could have influenced their perceptions. These data were also collected during the COVID-19 Pandemic, during which students experienced various disruptions in their education and home lives that may have impacted all variables in this study.
Because of the small instructor sample size (N = 5), there may not have been enough variability in the classroom environment to detect significant relationships between student perceptions of the instructor mindset and course outcomes. The instructor’s identity (race, gender, etc.), personality, and teaching practices could impact student perceptions. Future work should focus on collecting data from a larger, more diverse instructor population with more diverse teaching practices. Expanding the sample populations will also shed some light on whether the course (organic chemistry vs general chemistry) plays a role in shaping student perceptions of instructor mindset and how those perceptions influence student outcomes in different chemistry levels.
In addition to quantitative models, qualitative measures (interviews, reflections, etc.) can better bring out the richness of student narratives, especially regarding the intersectionality of multiple identities and experiences. For example, what is the underlying mechanism of belonging for students? Is it driven by feelings that pertain to the student identity (i.e., demographics) in chemistry, or is it driven by self-efficacy in chemistry or a combination of both? Simply answering statements, such as “I belong in chemistry class”, does not provide enough depth on what that means for students to belong. Hence, valid and reliable instruments need to be developed to assess the construct of belonging quantitatively, while qualitative studies can reveal more depth about the student experience, which can help increase the success for students from marginalized populations.
Although researchers are attempting to uncover additional pathways that support students that the system has historically underserved, the community continues to bin students based on the color of their skin and other demographic characteristics, which may stem from a deficit model of thinking. Even though well-intentioned, this continues to assume that people of a particular demographic group have generalizable traits, when they can be quite different. Binning students (i.e., HUGs and non-HUGs) also overlooks commonalities across demographics, such as life circumstances (jobs, caretaking obligation, cultural perspectives, etc.) that may be assets or barriers to student success depending on the context. Better methods need to be developed to allow us to focus on students’ experiences with adversity, regardless of their demographic profile, and improve ways that the educational system can help them overcome those obstacles. This may uncover hidden dimensions overlooked in SEM predictive models (and other quantitative models) that lead to better interventions supporting student success. While the instructors/institutions have limited control over other factors (home life, work) determining student success, understanding students’ life experiences holistically can lead to more tailored approaches for student success initiatives.
Lastly, quantitative models require multiple judgment calls that, despite best efforts, may introduce bias. For example, using college entrance exams such as the ACT to predict college grades is biased due to deep-rooted systemic inequities. In this work, ACT math scores were not used as the primary measure of student success; they were used as a control variable while examining other nonacademic factors. Although this does not eliminate bias, it does allow for a more holistic view of student achievement. Second, although theoretically justified, using modification indices to improve model fit may result in overfitting the data, thus limiting the interpretability of conclusions drawn from the SEM path models presented.
Acknowledgments
The authors thank the Promoting Active Learning & Mentoring (PALM) Network (RCN-UBE NSF Grant #1624200), the National Science Foundation (IUSE Grant #2142611: Upholding Active Learning Reform in STEM (UALRS) Project), and UA of Little Rock’s Graduate School and Donaghey College of STEM Deans’ Office for funding this work. This work would also not have been possible without the UA of Little Rock School of Physical Sciences, Chemistry Program faculty members, and the students enrolled in their courses, who have our deepest gratitude. In addition, thanks to Ibraheem Abbood and Khristina Huff, the undergraduate researchers who assisted in collecting data for this project when there was a conflict of schedules for PIs, and Stephanie Feola (UALRS postdoc researcher) for their guidance and counsel on this project.
Supporting Information Available
The Supporting Information is available at https://pubs.acs.org/doi/10.1021/acs.jchemed.3c00971.
The authors declare no competing financial interest.
References
- National Academies of Sciences, Engineering, and Medicine. Call to Action for Science Education: Building Opportunity for the Future; 2021; 76 pages; https://www.nap.edu/catalog/26152/call-to-action-for-science-education-building-opportunity-for-the.
- Walton G. M.; Yeager D. S. Seed and Soil: Psychological Affordances in Contexts Help to Explain Where Wise Interventions Succeed or Fail. Curr. Dir. Psychol. Sci. 2020, 29 (3), 219–226. 10.1177/0963721420904453.
- Turetsky K. M.; Sinclair S.; Starck J. G.; Shelton J. N. Beyond Students: How Teacher Psychology Shapes Educational Inequality. Trends Cogn. Sci. 2021, 25 (8), 697–709. 10.1016/j.tics.2021.04.006.
- Muenks K.; Canning E. A.; LaCosse J.; Green D. J.; Zirkel S.; Garcia J. A.; Murphy M. C. Does My Professor Think My Ability Can Change? Students’ Perceptions of Their STEM Professors’ Mindset Beliefs Predict Their Psychological Vulnerability, Engagement, and Performance in Class. J. Exp. Psychol. Gen. 2020, 149 (11), 2119–2144. 10.1037/xge0000763.
- Canning E. A.; Ozier E.; Williams H. E.; AlRasheed R.; Murphy M. C. Professors Who Signal a Fixed Mindset About Ability Undermine Women’s Performance in STEM. Soc. Psychol. Personal. Sci. 2021, 927. 10.1177/19485506211030398.
- Ryan R. M., Ed. The Oxford Handbook of Human Motivation; Oxford Library of Psychology; Oxford University Press: New York, 2012.
- Ewen R. B. An Introduction to Theories of Personality, 7th ed.; Psychology Press: New York, 2014; 10.4324/9781315793177.
- Walton G. M.; Cohen G. L. A Question of Belonging: Race, Social Fit, and Achievement. J. Pers. Soc. Psychol. 2007, 92 (1), 82–96. 10.1037/0022-3514.92.1.82.
- Fink A.; Frey R. F.; Solomon E. D. Belonging in General Chemistry Predicts First-Year Undergraduates’ Performance and Attrition. Chem. Educ. Res. Pract. 2020, 21 (4), 1042–1062. 10.1039/D0RP00053A.
- Dweck C. S.; Leggett E. L. A Social-Cognitive Approach to Motivation and Personality. Psychol. Rev. 1988, 95 (2), 256–273. 10.1037/0033-295X.95.2.256.
- Yeager D. S.; Dweck C. S. Mindsets That Promote Resilience: When Students Believe That Personal Characteristics Can Be Developed. Educ. Psychol. 2012, 47 (4), 302–314. 10.1080/00461520.2012.722805.
- Claro S.; Paunesku D.; Dweck C. S. Growth Mindset Tempers the Effects of Poverty on Academic Achievement. Proc. Natl. Acad. Sci. U. S. A. 2016, 113 (31), 8664–8668. 10.1073/pnas.1608207113.
- Degol J. L.; Wang M.-T.; Zhang Y.; Allerton J. Do Growth Mindsets in Math Benefit Females? Identifying Pathways between Gender, Mindset, and Motivation. J. Youth Adolesc. 2018, 47 (5), 976–990. 10.1007/s10964-017-0739-8.
- Blackwell L. S.; Trzesniewski K. H.; Dweck C. S. Implicit Theories of Intelligence Predict Achievement Across an Adolescent Transition: A Longitudinal Study and an Intervention. Child Dev. 2007, 78 (1), 246–263. 10.1111/j.1467-8624.2007.00995.x.
- Yeager D. S.; Hanselman P.; Walton G. M.; Murray J. S.; Crosnoe R.; Muller C.; Tipton E.; Schneider B.; Hulleman C. S.; Hinojosa C. P.; Paunesku D.; Romero C.; Flint K.; Roberts A.; Trott J.; Iachan R.; Buontempo J.; Yang S. M.; Carvalho C. M.; Hahn P. R.; Gopalan M.; Mhatre P.; Ferguson R.; Duckworth A. L.; Dweck C. S. A National Experiment Reveals Where a Growth Mindset Improves Achievement. Nature 2019, 573 (7774), 364–369. 10.1038/s41586-019-1466-y.
- Aronson J.; Fried C. B.; Good C. Reducing the Effects of Stereotype Threat on African American College Students by Shaping Theories of Intelligence. J. Exp. Soc. Psychol. 2002, 38 (2), 113–125. 10.1006/jesp.2001.1491.
- Fink A.; Cahill M. J.; McDaniel M. A.; Hoffman A.; Frey R. F. Improving General Chemistry Performance through a Growth Mindset Intervention: Selective Effects on Underrepresented Minorities. Chem. Educ. Res. Pract. 2018, 19 (3), 783–806. 10.1039/C7RP00244K.
- Burnette J. L.; Billingsley J.; Banks G. C.; Knouse L. E.; Hoyt C. L.; Pollack J. M.; Simon S. A Systematic Review and Meta-Analysis of Growth Mindset Interventions: For Whom, How, and Why Might Such Interventions Work? Psychol. Bull. 2023, 149 (3–4), 174–205. 10.1037/bul0000368.
- Tipton E.; Bryan C.; Murray J.; McDaniel M. A.; Schneider B.; Yeager D. S. Why Meta-Analyses of Growth Mindset and Other Interventions Should Follow Best Practices for Examining Heterogeneity: Commentary on Macnamara and Burgoyne (2023) and Burnette et al. (2023). Psychol. Bull. 2023, 149 (3–4), 229–241. 10.1037/bul0000384.
- Sisk V. F.; Burgoyne A. P.; Sun J.; Butler J. L.; Macnamara B. N. To What Extent and Under Which Circumstances Are Growth Mind-Sets Important to Academic Achievement? Two Meta-Analyses. Psychol. Sci. 2018, 29 (4), 549–571. 10.1177/0956797617739704.
- Macnamara B. N.; Burgoyne A. P. Do Growth Mindset Interventions Impact Students’ Academic Achievement? A Systematic Review and Meta-Analysis With Recommendations for Best Practices. Psychol. Bull. 2023, 149 (3–4), 133–173. 10.1037/bul0000352.
- Canning E. A.; Limeri L. B. Theoretical and Methodological Directions in Mindset Intervention Research. Soc. Personal. Psychol. Compass 2023, 17 (6), e12758. 10.1111/spc3.12758.
- Muenks K.; Yan V. X.; Telang N. K. Who Is Part of the “Mindset Context”? The Unique Roles of Perceived Professor and Peer Mindsets in Undergraduate Engineering Students’ Motivation and Belonging. Front. Educ. 2021, 6, 1. 10.3389/feduc.2021.633570.
- Lytle A.; Shin J. E. L. Self and Professors’ Incremental Beliefs as Predictors of STEM Engagement Among Undergraduate Students. Int. J. Sci. Math. Educ. 2023, 21, 1013. 10.1007/s10763-022-10272-8.
- Yeager D. S.; Carroll J. M.; Buontempo J.; Cimpian A.; Woody S.; Crosnoe R.; Muller C.; Murray J.; Mhatre P.; Kersting N.; Hulleman C.; Kudym M.; Murphy M.; Duckworth A. L.; Walton G. M.; Dweck C. S. Teacher Mindsets Help Explain Where a Growth-Mindset Intervention Does and Doesn’t Work. Psychol. Sci. 2022, 33 (1), 18–32. 10.1177/09567976211028984.
- Canning E. A.; Muenks K.; Green D. J.; Murphy M. C. STEM Faculty Who Believe Ability Is Fixed Have Larger Racial Achievement Gaps and Inspire Less Student Motivation in Their Classes. Sci. Adv. 2019, 5 (2), eaau4734. 10.1126/sciadv.aau4734.
- Strayhorn T. L. College Students’ Sense of Belonging, 2nd ed.; Routledge, 2018; 10.4324/9781315297293.
- Wilson D.; Jones D.; Bocell F.; Crawford J.; Kim M. J.; Veilleux N.; Floyd-Smith T.; Bates R.; Plett M. Belonging and Academic Engagement Among Undergraduate STEM Students: A Multi-Institutional Study. Res. High. Educ. 2015, 56 (7), 750–776. 10.1007/s11162-015-9367-x.
- Dortch D.; Patel C. Black Undergraduate Women and Their Sense of Belonging in STEM at Predominantly White Institutions. NASPA J. Women High. Educ. 2017, 10 (2), 202–215. 10.1080/19407882.2017.1331854.
- Rainey K.; Dancy M.; Mickelson R.; Stearns E.; Moller S. Race and Gender Differences in How Sense of Belonging Influences Decisions to Major in STEM. Int. J. STEM Educ. 2018, 5 (1), 10. 10.1186/s40594-018-0115-6.
- Walton G. M.; Cohen G. L. A Brief Social-Belonging Intervention Improves Academic and Health Outcomes of Minority Students. Science 2011, 331 (6023), 1447–1451. 10.1126/science.1198364.
- Zumbrunn S.; McKim C.; Buhs E.; Hawley L. R. Support, Belonging, Motivation, and Engagement in the College Classroom: A Mixed Method Study. Instr. Sci. 2014, 42 (5), 661–684. 10.1007/s11251-014-9310-0.
- Zimmerman B. J. Self-Efficacy: An Essential Motive to Learn. Contemp. Educ. Psychol. 2000, 25 (1), 82–91. 10.1006/ceps.1999.1016.
- Rittmayer A. D.; Beier M. E. Overview: Self-Efficacy in STEM; 2009; http://aweonline.org/arp_selfefficacy_overview_122208.pdf.
- Wilson D.; Bates R.; Scott E. P.; Painter S. M.; Shaffer J. Differences in Self-Efficacy among Women and Minorities in STEM. J. Women Minor. Sci. Eng. 2015, 21 (1), 27. 10.1615/JWomenMinorScienEng.2014005111.
- Wigfield A.; Eccles J. S. Expectancy–Value Theory of Achievement Motivation. Contemp. Educ. Psychol. 2000, 25 (1), 68–81. 10.1006/ceps.1999.1015.
- Wang Y.; Rocabado G. A.; Lewis J. E.; Lewis S. E. Prompts to Promote Success: Evaluating Utility Value and Growth Mindset Interventions on General Chemistry Students’ Attitude and Academic Performance. J. Chem. Educ. 2021, 98 (5), 1476–1488. 10.1021/acs.jchemed.0c01497.
- Frey R. F.; Cahill M. J.; McDaniel M. A. Students’ Concept-Building Approaches: A Novel Predictor of Success in Chemistry Courses. J. Chem. Educ. 2017, 94 (9), 1185–1194. 10.1021/acs.jchemed.7b00059.
- Home - ACCUPLACER | College Board; https://accuplacer.collegeboard.org/ (accessed 2023-03-22).
- Conner M.; Norman P. Predicting and Changing Health Behaviour: Research and Practice with Social Cognition Models; McGraw-Hill Education: UK, 2015.
- Shortlidge E. E.; Rain-Griffith L.; Shelby C.; Shusterman G. P.; Barbera J. Despite Similar Perceptions and Attitudes, Postbaccalaureate Students Outperform in Introductory Biology and Chemistry Courses. CBE—Life Sci. Educ. 2019, 18 (1), ar3. 10.1187/cbe.17-12-0289.
- Kattoum R. N.; Abbood I.; Huff K. E.; Baillie M. T. Perceived Barriers to Equitable Participation in the Learning Assistant Program. J. Chem. Educ. 2023, 100 (7), 2495–2503. 10.1021/acs.jchemed.2c00635.
- Chen G.; Gully S. M.; Eden D. Validation of a New General Self-Efficacy Scale. Organ. Res. Methods 2001, 4 (1), 62–83. 10.1177/109442810141004.
- Hulleman C. S.; Godes O.; Hendricks B. L.; Harackiewicz J. M. Enhancing Interest and Performance with a Utility Value Intervention. J. Educ. Psychol. 2010, 102 (4), 880–895. 10.1037/a0019506.
- Schumacker R. E.; Lomax R. G. A Beginner’s Guide to Structural Equation Modeling, 3rd ed.; Routledge: New York, 2010.
- Gonzalez R.; Griffin D. Testing Parameters in Structural Equation Modeling: Every “One” Matters. Psychol. Methods 2001, 6 (3), 258–269. 10.1037/1082-989X.6.3.258.
- Little T. D. Longitudinal Structural Equation Modeling; Guilford Press, 2013.
- Young J. D.; Demirdöğen B.; Lewis S. E. Students’ Sense of Belonging in Introductory Chemistry: Identifying Four Dimensions of Belonging via Grounded Theory. Int. J. Sci. Math. Educ. 2023, 10.1007/s10763-023-10433-3.
- Barrasso A. P.; Spilios K. E. A Scoping Review of Literature Assessing the Impact of the Learning Assistant Model. Int. J. STEM Educ. 2021, 8, 12. 10.1186/s40594-020-00267-8.