PLOS One. 2023 Aug 17;18(8):e0289464. doi: 10.1371/journal.pone.0289464

Iterating toward change: Improving student-centered teaching through the STEM faculty institute (STEMFI)

Jeffrey Shipley 1, Rebecca L Sansom 2, Haley Mickelsen 1, Jennifer B Nielson 2, R Steven Turley 3, Richard E West 4, Geoffrey Wright 5, Bryn St Clair 6, Jamie L Jensen 1,*
Editor: Ayse Hilal Bati
PMCID: PMC10434963  PMID: 37590212

Abstract

One of the primary reasons students leave STEM majors is the poor quality of instruction. Teaching practices can be improved through professional development programs; however, several barriers exist. Creating lasting change by overcoming these barriers is the primary objective of the STEM Faculty Institute (STEMFI). STEMFI was designed according to the framework established by Ajzen’s Theory of Planned Behavior. To evaluate its effectiveness, the Classroom Observation Protocol for Undergraduate STEM (COPUS) tool was used before and after an intensive year-long faculty development program and analyzed using copusprofiles.org, a tool that classifies each COPUS report into one of three instructional styles: didactic, interactive lecture, and student-centered. We report the success of our program in changing faculty teaching behaviors and categorize participants into types of reformers. Thematically coded post-participation interviews then offer clues about the characteristics of each type of reformer. Our results demonstrate that faculty can significantly improve the student-centeredness of their teaching practices in a relatively short time. We also discuss the implications of faculty attitudes for future professional development efforts.

Introduction

Economic forecasts suggest that the demand for Science, Technology, Engineering, and Mathematics (STEM) majors is likely to increase by 5–20%, justifying a need to increase retention of STEM majors [1]. Poor STEM teaching is a major contributing factor to attrition from STEM majors [2]. STEM classes are frequently taught didactically through lecture [3], which can cause students to disengage or struggle to learn while in class [4]. In contrast, we suggest students should be engaged in active and inquiry-based approaches, which include collaborative learning and student-centered teaching (SCT) strategies. Active learning may include dialoguing, group work, guided inquiry, or the use of personal response systems, among others (see Freeman et al. [5] for a meta-analysis). We specifically focused this faculty development on student-centered teaching strategies, which we define as those that encourage students to be engaged in the learning process instead of sitting passively in class. The use of SCT strategies has resulted in several academic benefits, including improved critical thinking skills, greater involvement of students in the learning process, and the personalization of large lectures [6]. Additionally, these strategies can improve student grades and achievement [5] and reduce the high attrition rates in STEM courses.

Faculty development workshops have emerged as a primary vehicle by which administrators and leaders in STEM education attempt to improve teaching [e.g., 7–9]. Such workshops often run for several hours a day over multiple days, within or across disciplines [10], and address a variety of topics, including the importance of active learning for improving student understanding, engagement, and experience. Faculty participate in teaching workshops for a variety of reasons, including discontent with their teaching practices, student participation, or student experience [11].

Some faculty development programs have proven to be effective at enhancing faculty knowledge, professional competence, and student performance [4, 12]. However, such development programs frequently do not cause lasting changes to teaching strategies or student engagement. Several factors have been proposed as barriers to lasting change: (a) lack of awareness of the evidence that supports the use of SCT techniques [13], (b) reluctance by faculty to buy into the published literature since they frequently did not learn through SCT techniques themselves [13], and (c) inadequate follow-up after workshop participation to support implementation [14].

Past research has clarified some of the barriers to and drivers of instructional change [15, 16]. Baker et al. [7] suggest the need for an aligned framework of faculty support that includes institutional and department-level factors as well as individual instructor characteristics. A culturally appropriate context, particularly discipline-specific application of teaching scholarship, is also desired [16], as is continuity in training within departmental cohorts [10]. Furthermore, empirical evidence beyond self-reported qualitative data will elucidate the impact of faculty development activities on student learning [17].

To address these barriers, the STEM Faculty Institute (STEMFI) program was created as a year-long faculty development program with a dual purpose to support lasting faculty change to SCT and to better understand what drives faculty to make that change.

Theoretical rationale

We used the Theory of Planned Behavior as a theoretical framework to study the causal mechanisms involved in promoting lasting changes in STEM instruction. Originally proposed as a way to think about changes in public health behaviors [18], it addresses three different factors—attitude toward the behavior, subjective norms, and perceived behavioral control—that influence the intention of an individual to behave a certain way (see Fig 1). When an individual develops a favorable attitude toward a behavior, believes the behavior is expected by others, and perceives the behavior as possible, the person’s intention to perform the behavior increases [19]. We believe that these same principles apply to changes in teaching behaviors, and thus we sought to design our faculty institute to address these factors.

Fig 1. The theory of planned behavior.

Fig 1

The first factor, attitude toward the behavior, is an individual’s overall perception of the behavior, often evaluated on the potential benefits or drawbacks to both the individual performing the behavior and others. In the context of STEMFI, our research sought to understand what attitudes faculty members had regarding the usefulness of SCT strategies and their effectiveness in helping students learn and stay engaged in STEM classes. Our program sought to support this attitudinal change by introducing faculty participants to the evidence supporting the use of SCT and structuring the workshop so that participants could experience the strategies for themselves and change their perceptions of the utility of SCT.

The second factor, subjective norms, “consist of a person’s beliefs about whether significant others think he or she should engage in the behavior” and can hold sway over an individual’s intention to carry out a behavior [19, p. 585]. We learned about participants’ social experiences within their departments and colleges through pre-participation interviews. We structured the STEMFI program to promote positive subjective norms by providing opportunities to interact as a cohort and with a supportive mentor. At monthly cohort meetings, participants presented strategies they had employed and heard from others in the group about their chosen strategies. Together, they examined what worked and what did not work, and were encouraged in their quest to create a student-centered classroom—thus improving their subjective norms.

The final factor, perceived behavioral control, is a person’s belief that they are capable of performing a behavior in their current situation; it is influenced by both their self-efficacy and external factors [19]. In our pre-interviews with faculty, we asked about the specific challenges they face or anticipate facing while implementing SCT. During the STEMFI workshop, participants received training on the use of SCT practices to support self-efficacy. We did not attempt to change external factors, like classroom setup, but we did try to help participants see how those challenges could be overcome.

To incorporate each of these factors, the STEMFI program was designed as a year-long program that began with a week-long summer workshop where participants experienced SCT strategies, learned about the evidence from discipline-based education research [20] to support the use of those strategies, and built a social network with colleagues. We aimed to answer the question, can we promote reform of teaching to more SCT through the lens of the Theory of Planned Behavior? By comparing teaching observations from the two semesters following the week-long summer workshop to the pre-workshop data and interviewing faculty at the end of the program, we were able to directly measure changes in faculty teaching and attitudes and therefore understand the effectiveness of the STEMFI program in facilitating those changes.

Materials and methods

Ethics statement

All participants provided consent to participate in the research study. Permission was obtained from the primary author’s Institutional Review Board, approval number X17244.

Participants

The STEMFI program was established with a National Science Foundation grant at a large, private doctoral-granting institution in the western United States. Approximately 35,000 students attend the institution, and 12,000 are enrolled in a STEM degree program. Approximately 51% of the student body is female, 77% single, 81% Caucasian, 9% Hispanic or Latino, and 1% Black.

The STEMFI program was run in three year-long cohorts of 15 faculty each over the course of four years (a gap year occurred due to COVID). Faculty came from the three STEM colleges on campus for a total of 45: 19 from Life Sciences, 13 from Physical and Mathematical Sciences, and 13 from Engineering and Technology. The colleges were nearly evenly represented in each cohort. Eighteen participants were Assistant Professors (pre-tenure), 21 were Associate Professors (tenured), and six were Full Professors (the highest rank, obtained post-tenure). Thirty-five males and 10 females participated, which at this institution is an overrepresentation of females compared to the faculty at large. All faculty participants volunteered and were compensated with a small stipend ($600) to their research accounts. Each received approval from their respective department chair and dean.

STEMFI program

The STEMFI program lasted three to four semesters (over two years) and consisted of three phases: pre-, during, and post-workshop. Pre-workshop observations using the Classroom Observation Protocol for Undergraduate STEM (COPUS; Smith et al. [21]) were performed for all participants during the first year. A goal of four class-length observations was set for each faculty participant (although some received only three). The observations were performed on random days and, as often as possible, were of the same course the participant planned to reform during STEMFI. We also conducted pre-workshop interviews in order to better tailor the workshop experience. Participants made no reforms during the first year.

Phase 2 consisted of faculty participation in a one-time summer workshop lasting one full week (9 am–5 pm), where we worked to improve their attitudes, subjective norms, and perceived behavioral control specifically for SCT through an active learning experience, collaboration with colleagues, and focused instruction on strategies. Several student-centered strategies were introduced to encourage instructors to facilitate a more active classroom. While we recognize that active learning is not always student-centered, and our COPUS instrument focused on active learning, the workshop specifically focused on student-centered strategies. Participants were required to complete one fully reformed lesson plan that specifically included SCT and were encouraged to tackle a second during the week. Each participant was assigned a peer mentor. In the first cohort, the workshop facilitators, along with a few other faculty chosen for their excellent teaching records, served as mentors. In the second and third cohorts, we chose participants from the previous cohort who had demonstrated significant reform. We tried to pair mentors and participants who were in the same discipline (so that they understood the disciplinary nuances) but outside of the same department (so that mentors had no influence on rank and status decisions). During the workshop, participants also met with their peer mentors and made plans for implementation. The participant experience and workshop design are described in detail in West et al. [22].

In phase three, we followed faculty participants for two semesters after the summer workshop (one full academic year). During the first semester they taught following the summer workshop, participants were encouraged to add at least three new SCT techniques to their teaching. Some faculty chose simpler strategies such as student response systems (clickers) or think-pair-share [e.g., 23], while others chose more involved strategies like the 5E learning cycle [24], Process-oriented guided inquiry learning (POGIL) [25], or Decision-based Learning [26]. Participants also met regularly with their mentors to practice new strategies, discuss previous efforts, and set goals—actions that can be helpful in supporting lasting change [22]. At least three (with a goal of four) of these classes were also observed using COPUS to measure their post-teaching behaviors. In addition to the one-on-one guidance from an individual mentor, participants also received social support from colleagues in the cohort at monthly cohort meetings for a full academic year where they shared what they had done and brainstormed ways to improve or apply the strategy to a different course [22].

Observation instrument for quantitative analysis

The COPUS tool (Smith et al. [21]) was used to gather quantitative data. All data are available at https://scholarsarchive.byu.edu/data/49. COPUS is a data collection tool that records student and instructor behaviors every two minutes during a given class to assess active interaction. Codes include more teacher-centered behaviors, such as lecturing by the instructor and listening by the students; more interactive strategies, like student questions and instructor answers; and more fully student-centered strategies, like group work, clicker questions, and the instructor moving through the room and guiding students. It does not, however, assess specific student-centered techniques. For a fuller description of the instrument and codes, see Smith et al. [21]. By documenting the activity of the students and teacher with a variety of codes, the observer can measure the level of student engagement and infer the degree to which the classroom is active. Undergraduate and graduate students were trained to use COPUS following the training protocol established in Smith et al. [21]. Participants were observed in person three to four times before the workshop and four times after (the switch from three to four occurred between cohorts 2 and 3, informed by the intervening publication of Stains et al. [3]). Pre-observations were scheduled at random to capture typical class periods. Participants were aware of the observation, but the observer was usually a student who blended in with the class. Post-observations were made in the first semester participants taught following their participation and were selected by the participant in order to showcase the new techniques they planned to use. Thus, the post-observations were not random; they represent what participants felt was most representative of what they had learned and chosen to implement based on their STEMFI experience.
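As a concrete illustration of this interval-based coding scheme, the sketch below tallies hypothetical two-minute observation windows into per-code fractions of class time. The code labels used here ("Lec", "AnQ", "CG", "MG") are illustrative stand-ins, not the full published COPUS code set, and this is a simplified sketch rather than the analysis pipeline the authors used.

```python
from collections import Counter

def code_fractions(intervals):
    """intervals: one set of observed behavior codes per 2-minute window.

    Returns the fraction of windows in which each code appeared at
    least once (multiple codes may co-occur within a window).
    """
    n = len(intervals)
    counts = Counter(code for window in intervals for code in window)
    return {code: counts[code] / n for code in counts}

# A hypothetical 10-minute class segment (five 2-minute windows):
# mostly lecture ("Lec"), with some answering of questions ("AnQ"),
# group work ("CG"), and instructor moving/guiding ("MG").
segment = [{"Lec"}, {"Lec", "AnQ"}, {"CG"}, {"CG", "MG"}, {"Lec"}]
fracs = code_fractions(segment)  # e.g., "Lec" appears in 3 of 5 windows
```

Profiles like those on copusprofiles.org are derived from fractions of this kind, e.g., a class with lecturing in 80% or more of windows would fall in the didactic cluster.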

Of our 45 participants, four were unable to complete the program: one due to a COVID class cancelation, two due to unexpected leaves, and one due to his course being entirely online and inaccessible. An additional three participants had significant COVID-related shifts in their course structures, such that post-data were collected on hybrid or online courses; their data were still obtainable and were included in the analysis. These instructors provided recordings of the classes they conducted in a hybrid or online fashion, and we used COPUS to analyze behaviors. Certainly, the online conditions hampered some active learning strategies, likely causing measures of reform to be lower than they would have been in person.

Classification of participant classroom behavior

After a nationwide study and latent class analysis of more than 2,500 classroom observations, Stains et al. [3] created an online tool to classify instructor practice as didactic, interactive lecture, or student-centered at copusprofiles.org. As Stains et al. [3] describe, the didactic profile “depicts classrooms in which 80% or more of class time consists of lecturing”; an interactive classroom “represents instructors who supplement lecture with more student-centered strategies such as ‘Other group activities’ and ‘Clicker questions with group work’”; and student-centered instructors “incorporate student-centered strategies into large portions of their classes” (p. 1469). We used this tool to classify each COPUS observation for each participant. Participants were classified based on the majority (two or more) of their observations. For example, if a participant had four pre-observations labeled as didactic, didactic, didactic, and interactive, they were labeled as “didactic”.
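The majority rule above can be sketched as a small function. How ties (e.g., two didactic and two interactive observations) were resolved is not specified in the text, so this sketch simply declines to classify in that case; that tie-handling is our assumption, not the authors' procedure.

```python
from collections import Counter

def classify_participant(profiles):
    """Assign the profile given to a majority (two or more) of a
    participant's COPUS observations; return None when no single
    profile holds a majority (tie handling assumed, not specified)."""
    (top, n_top), *rest = Counter(profiles).most_common()
    if rest and rest[0][1] == n_top:
        return None  # tie between two profiles
    return top if n_top >= 2 else None

# The example from the text: three didactic + one interactive -> "didactic"
pre = ["didactic", "didactic", "didactic", "interactive"]
label = classify_participant(pre)
```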

Participants who moved from didactic to interactive were labeled as “Beginning Reformers”; those who moved from didactic all the way to student-centered were labeled as “Dramatic Reformers”; those who were already interactive and moved to student-centered were labeled as “Advanced Reformers”; those who were already using interactive strategies and remained interactive (although with broadened strategies) were labeled as “Interactive Reformers”; likewise those who were already using student-centered strategies and continued being student-centered (with broadened strategy use by trying new strategies that they learned in the workshop) were labeled “Student-Centered Reformers”; and lastly, those who started with and chose to continue with only didactic strategies, even after the intervention, were labeled as “Didactic Non-Reformers” (see Fig 2).

Fig 2. Types of reformers based on movement between groups.

Fig 2

The “N” indicates the number of participants in each category.
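The six reformer categories defined above amount to a lookup on the (pre, post) pair of classroom profiles. The sketch below restates that mapping; profile strings follow the copusprofiles.org categories, and the handling of pairs outside the six named categories (e.g., a regression from student-centered to didactic) is our assumption, as the text does not describe any.

```python
# Mapping from (pre-workshop profile, post-workshop profile) to the
# reformer types defined in the text.
REFORMER_TYPES = {
    ("didactic", "interactive"): "Beginning Reformer",
    ("didactic", "student-centered"): "Dramatic Reformer",
    ("interactive", "student-centered"): "Advanced Reformer",
    ("interactive", "interactive"): "Interactive Reformer",
    ("student-centered", "student-centered"): "Student-Centered Reformer",
    ("didactic", "didactic"): "Didactic Non-Reformer",
}

def reformer_type(pre_profile, post_profile):
    # Pairs outside the six named categories are not described in the
    # text; "Unclassified" is a placeholder of our own.
    return REFORMER_TYPES.get((pre_profile, post_profile), "Unclassified")
```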

Interview protocol for qualitative analysis

At the end of their STEMFI experience, we interviewed participants. The interview protocol, included in full in the Supplementary Materials, addressed some programmatic evaluation pieces (e.g., Which STEMFI activities were most/least helpful?) and the three factors of the Theory of Planned Behavior (e.g., for Attitudes: Have your attitudes about SCT changed over the course of your participation? For Subjective Norms: How have your students responded to the changes you’ve made? For Perceived Behavioral Control: Has your confidence to use SCT changed?). Full transcripts are available at https://scholarsarchive.byu.edu/data/49.

Interview transcripts were read and thematically coded by JS, HM, and JLJ following the protocol outlined by Strauss and Corbin [27]. After the first reading of the interviews, readers compiled lists of themes, backed with quotations, that emerged from the interviews. These themes were discussed and combined into four main themes that described issues relevant to participants’ decisions to reform their teaching. Within these themes, we made binary categories into which we put each participant. Each interview was then coded into these emergent theme categories.

In cases where the two independent researchers did not agree on the appropriate categorization for a participant’s interview, a third researcher also read and interpreted the transcript. The three researchers then met to discuss the textual evidence that supported their ratings and continued the discussion until consensus was reached.

We then used an explanatory mixed methods design [28] to merge the findings from the quantitative and qualitative data, using the Theory of Planned Behavior to organize and provide context for our findings.

Results

Quantitative results

Of the 41 participants with complete data, we classified 35 as “reformers” because they reformed their teaching in some significant way by successfully implementing more student-centered practices. Thus, 85% of our participants improved their teaching, suggesting that the STEMFI program was effective in changing faculty behavior.

Of these 35 reformers, 11 were classified as beginning reformers who moved from didactic instruction to interactive lecture (two Assistant and nine Associate professors), four as dramatic reformers who moved from didactic instruction to student-centered strategies (three Associate and one Full professor), five as advanced reformers who moved from interactive lectures to student-centered strategies (one Assistant, three Associate, and one Full professors), three as interactive reformers who already used interactive lecture techniques and simply incorporated more or different interactive lecture techniques (one Assistant, one Associate, and one Full professors), and 11 as student-centered reformers who were already using student-centered strategies and simply added new and different strategies to their repertoire (nine Assistant and two Associate professors). Only six faculty participants failed to move beyond didactic strategies (four Assistant and two Full professors) (see Fig 2).

Qualitative results

Through inductive thematic analysis of post-experience interviews, four themes emerged that seemed to be influential in participants’ decisions to make changes to their teaching practices: (1) attitude toward SCT, (2) student responses to SCT, (3) participant motivation, and (4) challenges. We then created dichotomous categories within each theme. Through the lens of the Theory of Planned Behavior, we triangulated these categories with COPUS data to make loose hypotheses about the motivations of each reformer. However, we found that the themes were not robustly tied to specific reformer types in an exclusive way, so they can only hint at potential differences between reformers.

Attitudes toward student-centered teaching

Attitudes toward SCT were categorized as either fully reformed or in transition. Fully reformed individuals displayed attitudes that indicated buy-in to the idea of SCT being beneficial and more effective than the traditional lecture-style approach. For example, one participant commented,

I have a lot more confidence in knowing that this is a good way to use class time and…seeing them all working, trying to figure out what the answer is… it gives me a lot of confidence that [SCT] really is a worthwhile thing.

Most participants in the program displayed fully reformed attitudes, especially among those who were primarily didactic in their approach to teaching (i.e., beginning and dramatic reformers). We also see this attitude among those who were already well-versed in student-centered strategies (i.e., student-centered reformers). In the framework of the Theory of Planned Behavior, these findings are associated with favorable attitudes toward SCT consistent with a self-directed choice to participate in the workshop.

In contrast, some participants, classified as in transition, made comments that seem to indicate they had some reservations about SCT, while maintaining an overall positive attitude. For example, one participant commented that SCT

was really interesting and the students were very involved, but I always think about, well, what do you do on the days where it’s not as interesting? ‘Cause, there’s some hard days when the topic’s just not going to entice the students quite as much and so… I don’t know,

indicating that he was not sure that SCT would fit for all content. Another participant, while talking about creating a SCT activity for a particular lesson, stated,

A whole day is that one [SCT activity], and I’m not prepared to say that we can afford to do that, or if it would be better but we won’t have time. That’s something that I don’t know the best answer for,

indicating that she appeared to be unsure whether the time spent doing SCT was worth the benefits. We saw these attitudes mainly among those who implemented some reformed strategies prior to the workshop, but who were not fully engaged in SCT (i.e., advanced reformers), and among those who chose to make no changes to their teaching (i.e., non-reformers).

Student responses to student-centered teaching

Student response to the implementation of SCT techniques in the classroom was categorized as positive or negative. Positive feedback would indicate that the students enjoyed the new teaching style, or saw the benefit in it. For example, one participant commented,

There was a positive impact from those newly implemented activities in my classes. Student ratings went up, and so I had the highest rating I ever had in that class. I have been teaching that class for five years now…and it was the highest rating I ever got.

Another participant noticed a positive change in student behaviors, “For the next three or four class periods, [the students] were more willing to ask questions [and be] more openly engaged.” In the framework of the Theory of Planned Behavior, positive feedback from students contributed to favorable subjective norms, where student evaluations of teaching and student feedback are viewed as extremely important indicators of the social context that faculty members experience. Those who began as didactic lecturers seemed to experience only positive feedback as they took their first steps into SCT (i.e., beginning and dramatic reformers). Others had mixed feedback.

Negative feedback would indicate that students were resistant to or expressed their dissatisfaction with the activities and could be seen as a significant challenge to participants. For example, one participant noted that “after four days of doing it, I asked them, ‘Did you like this lesson model?’ No. [They] did not like it.” Another participant noted a lack of student buy-in to the activities saying,

I felt like ‘Oh, I’m coming in with these ideas and I have more student engagement things than I’ve had before, and this should be really cool,’ and they just didn’t seem to buy it or buy into it, it just… I don’t know.

Most of the participants who were attempting to implement more advanced SCT strategies experienced at least some negative feedback (i.e., advanced and student-centered reformers); however, they demonstrated overwhelming positivity and a desire to continue using reformed strategies. This was especially true among student-centered reformers, who used negative feedback as a motivator to be even more engaging, more open, and more welcoming, as seen in this comment:

I think one of the biggest challenges was the student engagement, or lack thereof, and I have not figured out how to overcome that…. Last semester was just really rough… there were lots of times where I was like, “Ugh, I have to go to this class”… I don’t know how you make it… more engaging, make it more open, make it more welcoming, I’m not really sure.

This participant’s commitment to remain student-centered despite the struggles with negative student reactions stands out as a characteristic of a student-centered reformer. In contrast, non-reformers also experienced negative student feedback, which appeared to stifle their desire to change.

Participant motivation

Participant motivation was divided into intrinsic and extrinsic. All of our participants had an intrinsic drive to participate in STEMFI. Intrinsic motivation was characterized by a genuine desire to become a better teacher. When asked why they signed up for STEMFI, one participant commented,

I always want to improve my classes, and I want to become a better teacher, and so by signing up for [STEMFI], I can go from just having that as an ideal to actually trying to put a plan into action.

Another said, “I’m always trying to come up with new… active learning experiences, activities to do with the students, and so I thought [STEMFI] would be really cool.” In the framework of the Theory of Planned Behavior, this motivation significantly contributed to their overall attitudes toward teaching reform.

Occasionally, a faculty participant would express some extrinsic motivation characterized by participation in the program for reasons external to an innate desire to be a better teacher. For example, one faculty member commented that she signed up because

you [meaning the STEMFI team] invited me of course. [A colleague] told me I should do it, and I was free that spring… plus the NSF has the broader impacts part in our grants, so I was interested to see if I could tie something into that.

Another faculty member joined after being prompted by a Department Chair in preparation for a rank and status decision. This extrinsic motivation was particularly salient among non-reformers.

Challenges

Beyond the challenge of negative student feedback, additional challenges faced by participants were categorized as logistical or philosophical. Logistical challenges related to time constraints, classroom architecture, or the desire to cover course content. For example, one participant commented that “time management’s the hardest thing, for sure.” Another faculty member said,

The timing is really hard because we have such a variety of students in the classroom… the people who don’t have a physics background take a really long time to do stuff, and the people who do have a physics background take a very short time to do things and then you kind of have this… challenging situation.

Another commented on the classroom structure, saying, “every desk is full, and they have all their stuff on the floor and I… have to walk across the front, but there’s no way for me to get to the sides.” In the framework of the Theory of Planned Behavior, these challenges were representative of participants’ perceived behavioral control. All reformers expressed logistical challenges. Not surprisingly, the number of logistical challenges mentioned by participants was somewhat related to the level of change they made. For example, beginning reformers, who made only small changes, reported very few challenges. Dramatic reformers, on the other hand, reported significantly more challenges to their perceived behavioral control, particularly with regard to time in class and not knowing how to make SCT part of a fast-moving course.

Philosophical challenges, on the other hand, relate more to issues with SCT approaches, administrative pressure, department climate, or a lack of confidence. One faculty participant expressed a conflict between their own beliefs and SCT ideas saying,

I’m kind of cautious in trying to introduce those things because I really do feel like as soon as you do those activities, you’re giving up a significant amount of control over the time of the class—significant amount of control over what the students are “supposed to get” based on what I lecture. When I lecture you hear every word, and it goes into your brain and then it’s there. That’s not true, but that’s the intention when I lecture. That’s what I intend to have happen and then what I expect the students to have and I can justify moving onto the next topic because we already did the old thing, and that’s false, but as soon as I do the opposite, which is, “We’re going to let everyone float a little bit,” then I lose that semblance of control and I feel… less at ease about doing that.

Another faculty participant expressed concern over not getting good student ratings for his/her rank and status portfolio:

I’m like, “If I make these changes to the way I do exams and students hate it, then they’re going to ding me on it and is that going to affect my ability to be promoted here?” And that’s kind of not the way you should be thinking. You shouldn’t be worrying about some other external pressure, right? So, that’s what I mean when I think there’s those kinds of external conflicts that are imposed that… maybe aren’t ideal.

In response to department climate, one participant said, “I do not believe that the department encourages experimentation, exploration, and engagement on these type[s] of things [meaning SCT].” Several expressed a lack of confidence in using SCT techniques saying, “I value the time in class, I think it’s sort of precious, and I get really nervous about trying to do new things because I don’t want to fall flat, and I’ve seen a lot of lectures where you try to do fancy stuff and it’s like you tried hard and it didn’t work.” Or, “I’m not good at it yet. So, it’ll be just practicing and refining the technique of making…them discussing more than just me.” These philosophical challenges were seen primarily among non-reformers and, surprisingly, among those who were well established as student-centered practitioners (i.e., student-centered reformers). However, student-centered reformers were set apart by how they perceived that challenge. Many participants in other profile groups indicated that using SCT meant they would not have time to cover all the material in their class, and viewed that as a flaw of SCT. In contrast, student-centered reformers were more likely to acknowledge their lack of confidence or skill at facilitating SCT, and to look for creative solutions to the problem rather than rejecting SCT outright. This philosophical challenge arose from introspection, recognizing ways in which they could potentially improve as instructors, and indicated their tendency to be more concerned with how well students were learning. Ultimately, student-centered reformers had such strong positive attitudes about SCT that they were able to overcome significant challenges related both to subjective norms and to perceived behavioral control.

Discussion

The STEM Faculty Institute was successful at creating change in at least some of the instructional behaviors of STEM faculty. By analyzing the post-interviews of these participants, we were able to characterize some of the attitudes of, and challenges faced by, participants and loosely relate them to reforming attitudes. While COPUS data could clearly differentiate levels of reform, the attitudes discovered in the interviews were not able to clearly differentiate between reformer types. However, the identified themes can offer insight into how attitudes, subjective norms, and perceived behavioral controls influence the desire to reform.

From beginning reformers, we learn several lessons. Beginning reformers chose to implement strategies that were simpler and less time-consuming, such as using clicker questions or think-pair-share activities, and perhaps that choice explains why they did not encounter significant challenges or barriers to the adoption of those strategies. This result may provide insight for future faculty development programs: encouraging smaller changes to teaching behaviors makes participants less likely to encounter difficult challenges or barriers to implementation. Although this approach did not result in dramatic changes to teaching practice, the changes were measurable and favorably received by students. Additionally, beginning (and dramatic) reformers–those who came into the workshop with a didactic teaching style–showed fully reformed attitudes, suggesting that the workshop was inspiring and motivating to them, significantly influencing their attitudes, which likely motivated their willingness to try new strategies. Having a sufficiently interactive and enthusiastic workshop style may contribute to success.

Dramatic reformers, those who started as traditionally didactic instructors and implemented fully student-centered strategies, also provide valuable insight about effective professional development. One of the main themes in this group is that they experienced significant challenges. Because they were implementing dramatic changes to their curriculum, it is not surprising that their challenges were substantial. This can inform future faculty development programs by reminding us that those making dramatic changes are likely to require more scaffolding and support.

Advanced reformers are an interesting group. These are participants who were already using some reformed strategies, such as interactive lectures, and who attempted to implement additional SCT strategies into their repertoire. One thing to note is that they expressed mixed attitudes toward SCT. It is possible that negative student feedback and their lack of expertise in SCT led them to question the true effectiveness of their reforming efforts. With more practice, it may be possible to shore up their attitudes about the importance and effectiveness of SCT. This can inform future faculty development efforts by reminding us that continued scaffolding and encouragement are likely important features of a successful reforming experience. Although advanced reformers reported experiencing many logistical challenges, they were not deterred from implementing SCT. In fact, it may be because of their increased effort to implement SCT that such logistical challenges became apparent. Thus, when they encountered such challenges, they were motivated to overcome them. That dramatic and advanced reformers experienced similar logistical challenges may inform future faculty development programs. If a shift to SCT is intended to give students more control over their learning experience, departments may need to reevaluate their expectations for content coverage in favor of deeper learning of fewer topics.

Student-centered reformers were those who already had demonstrable experience implementing SCT strategies coming into this experience. We found that this group had fully reformed attitudes. Having extensive experience with these SCT strategies seems to have solidified their attitudes about the effectiveness of student-centered teaching despite any negative student feedback or significant challenges. Thus, in this group, focusing on convincing them that SCT is better than didactic strategies is likely not an effective use of time. Rather, further encouragement of their efforts is warranted. Student-centered reformers persevere in the face of negative student feedback, using it as a motivator for increased change, rather than as a stumbling block. They do not typically resort to lecturing even when SCT is challenging. Future professional development programs should understand that in the case of both advanced and student-centered reformers, faculty had to deal with mixed feedback from students. Their chosen degree of student-centeredness may depend on how committed they are to the philosophy of SCT.

Non-reformers displayed many of the same attitudes as those who chose to make measurable changes. First, non-reformers expressed a transitional view of SCT. In ways we do not yet fully understand, we were unable to significantly shift their attitudes and therefore their intentions to change their behaviors. This may have to do with their expressions of extrinsic motivation for participation (e.g., their Department Chair encouraged their participation, or they were participating in order to bolster the Broader Impacts section of NSF grants). Although professional development designers have little control over incoming attitudes, this stresses the importance of instilling positive attitudes about the benefits of SCT during the workshop. Additionally, while non-reformers did experience some positive feedback from students, they also experienced negative feedback. Because we did not quantify the amount or severity of student feedback, it is impossible to predict the exact effect of negative feedback on perceived behavioral control. However, we can hypothesize that perhaps the negative feedback among non-reformers was more significant and impactful than among other reformer groups. We stress the importance of continued follow-up and effective mentoring to avoid these negative impacts.

It is worth considering certain limitations to our data. A small sample size (n = 41), due to limited resources, provides us with only preliminary conclusions. However, in the future, with the aid of additional resources, we hope to have larger cohorts of STEM faculty to better understand the effectiveness of the STEMFI program. Additionally, it is important to acknowledge that our study was conducted at a large, private university in the western US, which may limit the application of our conclusions about overcoming barriers to faculty change to other learning environments. Another potential limitation is that our sample consisted mostly of volunteers who may have already had an increased interest in reforming their instruction, although a few faculty participants were strongly encouraged by the administration to participate because of low teaching evaluations. Certainly, a study where faculty were all compelled to participate could give less biased data about the program’s effectiveness. Lastly, because our post observations were not chosen at random, but were chosen by participants to showcase the strategies they had learned and chosen to implement, the measure of change may be an overestimate of fully reformed teaching. In other words, the features observed after reform may not have been representative of the entire course or of lasting change. However, our goal was to motivate participants to use any SCT strategies, so we felt it was still representative of their use of strategies.

Based on the improvements in the teaching of the majority of our participants, we assert that the Theory of Planned Behavior was an effective framework for producing change in faculty behavior. Part of the STEMFI workshop directly addressed the effectiveness of SCT; we believe this played a role in shifting our participants’ attitudes toward SCT and enabled them to internalize the belief that SCT was beneficial to student learning and development. Through monthly cohort meetings and the week-long summer workshop, we supported faculty in cultivating positive subjective norms, meaning that they had regular interaction with other faculty who were trying to make the same difficult transition to more SCT. Regular meetings between mentors and participants were essential to effecting lasting teacher change. Faculty who receive regular mentoring report noticeable benefit to their teaching [29]. We recommend that additional studies employ a mentoring program to enable participants to implement the SCT strategies they have acquired.

Other studies on faculty development programs cite numerous barriers that impede faculty development. Satisfaction with current methods of instruction, such as traditional lecture without student involvement, is one such barrier [30]. Another faculty development program, called the Summer Teaching Institute, found that one and two years after the program, 98% of alumni said they were “still experimenting to improve their teaching”; however, a “lack of respect of colleagues in the department” was a major barrier to additional success in implementing SCT [9]. STEMFI sought to address this concern by emphasizing the mentor-participant meetings and monthly cohort meetings. Our results indicate that regular meetings with experienced mentors and other faculty engaged in implementing SCT were instrumental in aiding the majority of our STEMFI participants in their reform. However, we have not collected longitudinal data to test whether the change is lasting. Future studies of this methodology are needed to assess the persistence of such change. Our preliminary study suggests that by providing instruction aimed at changing the participants’ attitudes toward SCT in the summer workshop, improving their subjective norms through mentoring and regular cohort meetings, and helping faculty develop a positive view of their perceived behavioral control, we have built upon the efforts of previous faculty development programs to create sustainable and lasting change.

Conclusions

Based on our data, we assert that teaching practices are malleable. As a professor develops a more positive attitude toward SCT, interacts with other faculty striving to do the same, and develops the intrinsic belief that he or she has the ability to implement such changes in the classroom, intention is refined, and behaviors are changed. As more faculty continue to develop an understanding of SCT and its benefits to students, we anticipate that they will overcome barriers and implement SCT strategies in their classrooms.

Supporting information

S1 File

(DOCX)

Acknowledgments

We thank the undergraduate researchers who performed COPUS evaluations. We are grateful for those who helped organize and execute the summer workshop and the STEMFI participants for allowing us to observe their teaching.

Data Availability

Data are available from the Brigham Young University Institutional Scholars archive (via https://scholarsarchive.byu.edu/data/49).

Funding Statement

RLS, DMW, BES, REW, and JEJ were supported under grant DUE-1712056 from the US National Science Foundation (www.nsf.gov). Any opinions, findings, and conclusions or recommendations expressed in this material are those of the author(s) and do not necessarily reflect the views of the National Science Foundation. The funders had no role in study design, data collection and analysis, decision to publish, or preparation of the manuscript.

References

  • 1. Xue Y, Larson RC. STEM crisis or STEM surplus? Yes and yes. Mon Labor Rev. 2015;138:1–15.
  • 2. Seymour E, Hunter AB. Talking about leaving revisited: persistence, relocation, and loss in undergraduate STEM education. Cham, Switzerland: Springer Nature; 2019.
  • 3. Stains M, Harshman J, Barker MK, Chasteen SV, Cole R, DeChenne-Peters SE, et al. Anatomy of STEM teaching in North American universities. Science. 2018;359:1468–1470. doi: 10.1126/science.aap8892
  • 4. Ebert-May D, Derting TL, Hodder J, Momsen JL, Long TM, Jardeleza SE. What we say is not what we do: effective evaluation of faculty professional development programs. BioScience. 2011;61:550–558.
  • 5. Freeman S, Eddy SL, McDonough M, Smith MK, Okoroafor N, Jordt H, et al. Active learning increases student performance in science, engineering, and mathematics. Proc Natl Acad Sci U S A. 2014;111:8410–8415. doi: 10.1073/pnas.1319030111
  • 6. Panitz T, Panitz P. Encouraging the use of collaborative learning in higher education. In: Forest JJF, editor. University teaching. London: Routledge; 2018. pp. 161–202.
  • 7. Baker VL, Pifer MJ, Lunsford LG. Faculty development in liberal arts colleges: a look at divisional trends, preferences, and needs. High Educ Res Dev. 2018;37:1336–1351.
  • 8. Henderson C. Promoting instructional change in new faculty: an evaluation of the physics and astronomy new faculty workshop. Am J Phys. 2008;76:179–187.
  • 9. Pfund C, Miller S, Brenner K, Bruns P, Chang A, Ebert-May D, et al. Professional development. Summer institute to improve university science teaching. Science. 2009;324:470–471. doi: 10.1126/science.1170015
  • 10. Stigmar M. Faculty development through an educational action programme. High Educ Res Dev. 2008;27:107–120.
  • 11. Bilal Guraya SY, Chen S. The impact and effectiveness of faculty development program in fostering the faculty’s knowledge, skills, and professional competence: a systematic review and meta-analysis. Saudi J Biol Sci. 2019;26:688–697.
  • 12. Ödalen J, Brommesson D, Erlingsson GÓ, Schaffer JK, Fogelgren M. Teaching university teachers to become better teachers: the effects of pedagogical training courses at six Swedish universities. High Educ Res Dev. 2019;38:339–353.
  • 13. Handelsman J, Ebert-May D, Beichner R, Bruns P, Chang A, DeHaan R, et al. Education. Scientific teaching. Science. 2004;304:521–522. doi: 10.1126/science.1096022
  • 14. Kang HS, Cha J, Ha BW. What should we consider in teachers’ professional development impact studies? Based on the conceptual framework of Desimone. Creat Educ. 2013;4:11–18.
  • 15. Austin AE. Promoting evidence-based change in undergraduate science education. 2011 March 1 [cited 2023 July 17]. Available from: https://sites.nationalacademies.org/cs/groups/dbassesite/documents/webpage/dbasse_072578.pdf.
  • 16. Henderson C, Beach A, Finkelstein N. Facilitating change in undergraduate STEM instructional practices: an analytic review of the literature. J Res Sci Teach. 2011;48:952–984.
  • 17. Phuong TT, Cole SC, Zarestky J. A systematic literature review of faculty development for teacher educators. High Educ Res Dev. 2018;37:373–389.
  • 18. Ajzen I. The theory of planned behavior. Organ Behav Hum Decis Process. 1991;50:179–211.
  • 19. Smelser NJ, Baltes PB. International encyclopedia of the social & behavioral sciences. Amsterdam: Elsevier; 2001.
  • 20. Laksov KB, Elmberger A, Liljedahl M, Björck E. Shifting to team-based faculty development: a programme designed to facilitate change in medical education. High Educ Res Dev. 2022;41:269–283.
  • 21. Smith MK, Jones FH, Gilbert SL, Wieman CE. The classroom observation protocol for undergraduate STEM (COPUS): a new instrument to characterize university STEM classroom practices. CBE Life Sci Educ. 2013;12:618–627. doi: 10.1187/cbe.13-08-0154
  • 22. West RE, Jensen JL, Johnson M, Nelsen J, Sansom R, Turley S, et al. STEM faculty institute: an intensive interdisciplinary effort to improve STEM faculty adoption of evidence-based instructional practices. J Coll Sci Teach. 2020;51:79–87.
  • 23. Bamiro AO. Effects of guided discovery and think-pair-share strategies on secondary school students’ achievement in chemistry. SAGE Open. 2015;5:2158244014564754.
  • 24. Bybee R. An instructional model for science education: developing biological literacy. Colorado Springs: Biological Sciences Curriculum Studies; 1993.
  • 25. POGIL. Process oriented guided inquiry learning. 2020 [cited 2023 July 17]. Available from: https://pogil.org/.
  • 26. Sansom RL, Suh E, Plummer KJ. Decision-based learning: “If I just knew which equation to use, I know I could solve this problem!”. J Chem Educ. 2019;96:445–454.
  • 27. Strauss AL, Corbin J. Basics of qualitative research: grounded theory procedures and techniques. Thousand Oaks, CA: Sage; 1998.
  • 28. Creswell JW, Plano-Clark VL. Designing and conducting mixed-methods research. Thousand Oaks, CA: Sage; 2017.
  • 29. Huling L. Teacher mentoring as professional development. Washington, DC: ERIC Digest; 2001.
  • 30. Henderson C, Dancy MH. Barriers to the use of research-based instructional strategies: the influence of both individual and situational characteristics. Phys Rev Spec Top-Phys Educ Res. 2007;3:020102.

Decision Letter 0

Ayse Hilal Bati

20 Feb 2023

PONE-D-22-33194: Iterating toward change: improving student-centered teaching through the STEM faculty institute (STEMFI). PLOS ONE

Dear Dr. Jensen,

Thank you for submitting your manuscript to PLOS ONE. After careful consideration, we feel that it has merit but does not fully meet PLOS ONE’s publication criteria as it currently stands. Therefore, we invite you to submit a revised version of the manuscript that addresses the points raised during the review process.

==============================

I expect you to submit your article for evaluation after reviewing and editing in terms of the issues highlighted by the reviewers.

==============================

Please submit your revised manuscript by Apr 06 2023 11:59PM. If you will need more time than this to complete your revisions, please reply to this message or contact the journal office at plosone@plos.org. When you're ready to submit your revision, log on to https://www.editorialmanager.com/pone/ and select the 'Submissions Needing Revision' folder to locate your manuscript file.

Please include the following items when submitting your revised manuscript:

  • A rebuttal letter that responds to each point raised by the academic editor and reviewer(s). You should upload this letter as a separate file labeled 'Response to Reviewers'.

  • A marked-up copy of your manuscript that highlights changes made to the original version. You should upload this as a separate file labeled 'Revised Manuscript with Track Changes'.

  • An unmarked version of your revised paper without tracked changes. You should upload this as a separate file labeled 'Manuscript'.

If you would like to make changes to your financial disclosure, please include your updated statement in your cover letter. Guidelines for resubmitting your figure files are available below the reviewer comments at the end of this letter.

If applicable, we recommend that you deposit your laboratory protocols in protocols.io to enhance the reproducibility of your results. Protocols.io assigns your protocol its own identifier (DOI) so that it can be cited independently in the future. For instructions see: https://journals.plos.org/plosone/s/submission-guidelines#loc-laboratory-protocols. Additionally, PLOS ONE offers an option for publishing peer-reviewed Lab Protocol articles, which describe protocols hosted on protocols.io. Read more information on sharing protocols at https://plos.org/protocols?utm_medium=editorial-email&utm_source=authorletters&utm_campaign=protocols.

We look forward to receiving your revised manuscript.

Kind regards,

Ayse Hilal Bati, Associate Professor

Academic Editor

PLOS ONE

Journal Requirements:

When submitting your revision, we need you to address these additional requirements.

1. Please ensure that your manuscript meets PLOS ONE's style requirements, including those for file naming. The PLOS ONE style templates can be found at

https://journals.plos.org/plosone/s/file?id=wjVg/PLOSOne_formatting_sample_main_body.pdf and

https://journals.plos.org/plosone/s/file?id=ba62/PLOSOne_formatting_sample_title_authors_affiliations.pdf

2. Please amend your current ethics statement to address the following concerns:

a) Did participants provide their written or verbal informed consent to participate in this study?

b) If consent was verbal, please explain i) why written consent was not obtained, ii) how you documented participant consent, and iii) whether the ethics committees/IRB approved this consent procedure.

3. Please include your full ethics statement in the ‘Methods’ section of your manuscript file. In your statement, please include the full name of the IRB or ethics committee who approved or waived your study, as well as whether or not you obtained informed written or verbal consent. If consent was waived for your study, please include this information in your statement as well.

4. We note that you have stated that you will provide repository information for your data at acceptance. Should your manuscript be accepted for publication, we will hold it until you provide the relevant accession numbers or DOIs necessary to access your data. If you wish to make changes to your Data Availability statement, please describe these changes in your cover letter and we will update your Data Availability statement to reflect the information you provide.

Additional Editor Comments :

Dear authors,

I expect you to submit your article for evaluation after reviewing and editing in terms of the issues highlighted by the reviewers.


Reviewers' comments:

Reviewer's Responses to Questions

Comments to the Author

1. Is the manuscript technically sound, and do the data support the conclusions?

The manuscript must describe a technically sound piece of scientific research with data that supports the conclusions. Experiments must have been conducted rigorously, with appropriate controls, replication, and sample sizes. The conclusions must be drawn appropriately based on the data presented.

Reviewer #1: Partly

Reviewer #2: Yes

Reviewer #3: Yes

**********

2. Has the statistical analysis been performed appropriately and rigorously?

Reviewer #1: N/A

Reviewer #2: Yes

Reviewer #3: Yes

**********

3. Have the authors made all data underlying the findings in their manuscript fully available?

The PLOS Data policy requires authors to make all data underlying the findings described in their manuscript fully available without restriction, with rare exception (please refer to the Data Availability Statement in the manuscript PDF file). The data should be provided as part of the manuscript or its supporting information, or deposited to a public repository. For example, in addition to summary statistics, the data points behind means, medians and variance measures should be available. If there are restrictions on publicly sharing data—e.g. participant privacy or use of data from a third party—those must be specified.

Reviewer #1: Yes

Reviewer #2: Yes

Reviewer #3: Yes

**********

4. Is the manuscript presented in an intelligible fashion and written in standard English?

PLOS ONE does not copyedit accepted manuscripts, so the language in submitted articles must be clear, correct, and unambiguous. Any typographical or grammatical errors should be corrected at revision, so please note any specific errors here.

Reviewer #1: Yes

Reviewer #2: Yes

Reviewer #3: Yes

**********

5. Review Comments to the Author

Please use the space provided to explain your answers to the questions above. You may also include additional comments for the author, including concerns about dual publication, research ethics, or publication ethics. (Please upload your review as an attachment if it exceeds 20,000 characters)

Reviewer #1: I have attached a word document with this information. I have two main concerns with this paper. 1. The classes selected to observe with COPUS at the end of the program were not randomly selected but rather the participants selected the courses to be observed. This introduces considerable bias into the study design and undermines the results presented. 2. The results from the qualitative analysis did not further differentiate the participants in each of the categories the researchers presented, and at times different arguments were presented for the same findings.

Reviewer #2: Overall, this is a much needed study in the field. However, there are some issues with the qualitative data that need to be addressed. The most major issue is that the qualitative analysis needs to be revised to make the themes and the process of identifying them much clearer. For example: how did the themes emerge? How prevalent are these themes? How were the themes found (i.e., using inductive or deductive approaches)? Commentary and explanation of the provided quotes are also necessary in order to explain how the quotes relate to the themes. Additionally, the triangulation between the COPUS and interview data needs to be explained in significantly more detail than it is now. Some visuals may be useful to represent the qualitative data.

Here is some line feedback I have as well:

Line 53: Often workshops are one-off professional development opportunities. You seem to be describing a course redesign institute here. Being clear about what you mean by workshops will be important.

Line 85: How framework is connected to teaching and learning is important to include.

Line 96-98: Additional explanation of how your description of the program is related to the attitude toward the behavior is important here.

Reviewer #3: The authors of this manuscript used Ajzen’s Theory of Planned Behavior to design the STEM Faculty Institute (STEMFI) and categorize the types of reformers who completed this professional development program. Pre and post surveys, interviews and classroom observations were used to create descriptive profiles of participants who changed their instructional styles following STEMFI. Based on the improvements of the majority of the participants, the researchers asserted that the Theory of Planned Behavior was an effective framework for producing change in faculty behavior.

Overall, I think that this manuscript was well-written with a sound rationale and sophisticated design, and analysis. I have no request for modifications. I felt inspired while reading this manuscript and I think that this content will be a substantial contribution to the study of faculty professional development programs.

**********

6. PLOS authors have the option to publish the peer review history of their article (what does this mean?). If published, this will include your full peer review and any attached files.

If you choose “no”, your identity will remain anonymous but your review may still be made public.

Do you want your identity to be public for this peer review? For information about this choice, including consent withdrawal, please see our Privacy Policy.

Reviewer #1: No

Reviewer #2: Yes: Ashley Nicole Harlow

Reviewer #3: No

**********

[NOTE: If reviewer comments were submitted as an attachment file, they will be attached to this email and accessible via the submission site. Please log into your account, locate the manuscript record, and check for the action link "View Attachments". If this link does not appear, there are no attachment files.]

While revising your submission, please upload your figure files to the Preflight Analysis and Conversion Engine (PACE) digital diagnostic tool, https://pacev2.apexcovantage.com/. PACE helps ensure that figures meet PLOS requirements. To use PACE, you must first register as a user. Registration is free. Then, login and navigate to the UPLOAD tab, where you will find detailed instructions on how to use the tool. If you encounter any issues or have any questions when using PACE, please email PLOS at figures@plos.org. Please note that Supporting Information files do not need this step.

Attachment

Submitted filename: Review PONE.docx

PLoS One. 2023 Aug 17;18(8):e0289464. doi: 10.1371/journal.pone.0289464.r002

Author response to Decision Letter 0


4 Apr 2023

Response to Reviewers:

Reviewer #1:

I have attached a word document with this information. I have two main concerns with this paper. 1. The classes selected to observe with COPUS at the end of the program were not randomly selected but rather the participants selected the courses to be observed. This introduces considerable bias into the study design and undermines the results presented. 2. The results from the qualitative analysis did not further differentiate the participants in each of the categories the researchers presented, and at times different arguments were presented for the same findings.

Major Concerns

#1. Including COPUS as an objective empirical assessment of participants’ actual teaching practices in the classroom had the potential of providing a rigorous assessment of the program. However, the fact that the researchers allowed the STEMFI participants to select the classes to be COPUSed at the end of the program, rather than randomly selecting classes as they did in the first semester, severely undermines the rigor of the results. By allowing the faculty to select the post-classes to COPUS, they introduced considerable bias into their results. Given that this study is now concluded, I am not sure how to resolve this issue.

Thank you for this feedback. We agree that allowing participants to choose the class post-workshop would be biased if we were trying to show that the instructor was truly reformed in all aspects of their teaching. However, we were just trying to show that instructors could take what they had learned and successfully implement it in their courses (hence, we wanted to see those particular courses). Given this distinction, however, we agree that our language throughout was perhaps not so clear on this point. We have gone through the paper and cleaned that up. We have also made a note of this in our limitations section.

#2. Combining both a quantitative and qualitative analysis was a nice idea. However, I did not see enough differentiation among the qualitative results for each categorization of faculty post-involvement in the program to be helpful. It felt like all reform groups said the same thing, yet the authors interpreted the results differently and went beyond the scope of the coding rubric they established and presented in the paper.

This is an interesting perspective and it made us re-think how we described our categories. We agree that it was difficult to differentiate among the reformer types with the qualitative data and we decided that perhaps that was not the point. We have gone in and rewritten the qualitative section to be more about the four themes that emerged in conjunction with the TPB framework, with suggestions for trends that match the profiles. But, we emphasize that it is not always a direct relationship between them (i.e., there is a lot of overlap). We have also included descriptions of our didactic non-reformers.

# 3. Under the Explanatory results-Descriptive profiles section of the paper you left out two key categories of participants: Interactive Reformers and Didactic Non-reformers. Neither of these groups changed over the course of the program. In many ways your qualitative data from these two groups would be as important or more important than the data from the groups who did change.

See above - we have included additional descriptions of non-reformers.

Minor Concerns

Introduction

On line 43 the authors introduce the term student-centered teaching (SCT). Could they please explain why they are using this term rather than active learning or evidence-based teaching practices? Using multiple terms for the same teaching methods can introduce confusion to the professional development field of study.

We prefer the term “Student-Centered Teaching” to emphasize the constructivist approach that we were teaching. Active learning seems too broad (i.e., teaching can be active and not constructivist). “Evidence-based practices” is also just not as descriptive. We have made sure we consistently use that term throughout.

Line 60. As the authors only cite one study [4], it would be best if this sentence began with the qualifier “some”. Also, it is not clear what you mean by “including those with pedagogical training”. What does “those” refer to?

Thank you for catching this. We have changed this sentence.

Ln 63-67. A key citation that is missing from this section is “Henderson, C., Beach, A., & Finkelstein, N. (2011). Facilitating change in undergraduate STEM instructional practices: an analytic review of the literature. Journal of Research in Science Teaching, 48(8), 952–984. https://doi.org/10.1002/tea.20439”. Henderson et al.’s review of 191 articles describing professional development very succinctly concludes that there are two common practices that do not work, and it identifies four practices of successful programs. STEMFI includes all of these.

Thank you for the excellent paper and suggestion. We have added this.

Ln 70. An additional reference by Ann Austin would be most appropriate here as well.

Austin, A. E. (2011). Promoting evidence-based change in undergraduate science education. In Fourth committee meeting on status, contributions, and future directions of discipline-based education research.

Another useful citation. Thank you.

There is a good deal of repeated material in the Introduction and Methods sections of the paper. I realize that the authors are trying to align different components of STEMFI with their theoretical framework, but it gets confusing. For instance, the information on Ln 77-81 is repeated in Methods and really does not fit in the introduction.

We have removed that paragraph.

This issue arises again in Ln 115-128. Most of this content is repeated in the Methods section starting at Ln 149. Ln 125-128 particularly is methods rather than introduction.

Possibly the authors could move and consolidate the description “STEMFI Program” to the Introduction and address how you collected COPUS data and coded interview data in Methods.

We have removed redundancy in the Introduction and preferred to keep all methodologies in the methods.

Methods

Please present aggregate information on the gender and ethnicity of the participants.

We have included gender. Ethnicity data was not collected.

Please state how the four random class sessions were selected. Was COPUS conducted in person or by watching a video of the class? If in person, was the faculty member aware that this class was being observed? For the post-COPUS observation, was this done at the end of the second term that they taught the same course or at the end of the first term they taught the course?

We have further clarified this in the methods.

How long was the summer workshop, and were faculty financially compensated for attendance?

We have added this to the methods.

Please indicate who served as peer mentors and how mentor pairings were determined.

We have added this, as well.

One major concern is your post-COPUS observation method. You state that you allowed the faculty to select the post-course to be observed. In the paper you gave your explanation for this (Ln 185-187). However, I see this as a major limitation of your study. It is possible that the faculty member only used the new teaching method on those 3-4 days and used didactic teaching methods on the other 40 days of teaching. Therefore, did they really change, and was your categorization of them (Fig 2) correct? The post-COPUS score would indicate how well they implemented the selected type of evidence-based teaching they used that day but may not reflect their actual teaching method. You need to put many more qualifiers on this part of your results.

Thank you. We have commented on this above and we have made additions to our paper to indicate this potential limitation.

The categories you created to classify participants are very clever, and Figure 2 is a nice representation of that categorization method.

Thank you!

Interview Protocol

You state that your interview protocol was designed to align with the Theory of Planned Behavior (TPB) and address attitudes, student response, and confidence. However, you then created four themes, but there were only three attributes of the TPB. Two of your themes, 1) attitude and 2) student responses, align directly with TPB, but motivation and challenges do not. This was confusing. Could you explain how the TPB informed the last two themes of motivation and challenges? You may have to do this in the results section.

Thank you for asking for this clarification. In our rewrite of our qualitative data results section, we have better clarified how these fit. It’s important to note that the interviews were only semi-structured and the data were analyzed using an emergent themes approach. Thus, two themes that naturally emerged went beyond TPB. We have included this in our methods.

The sentences from 224 to 227 do not fit under Interview Protocol but rather should be a stand-alone paragraph indicating how you will triangulate your quantitative and qualitative data.

Done

Results

Quantitative

Please add the academic rank of the participants in each category. You could add three rows under your existing row at the bottom of Fig 2 and indicate the number of faculty who are Assist., Assoc., and Full Professors. This would help the reader see if newer faculty are more likely to change than established faculty.

We felt that adding it to the figure was just a little too messy. But we have included that information in the text of the results.

Qualitative results

The sentence on Ln 249 starting with “Attitudes…” should be the beginning of a new paragraph, as you are now presenting results on your first theme of attitudes.

Okay.

The quotes you presented nicely support the binary categories you established.

Thank you!

Explanatory Results- Descriptive profiles

This section was challenging to follow. I found myself creating a chart with the four themes in the first column and each subsequent column being one of your faculty categories, i.e., Beginning Reformer, Dramatic Reformer, etc., and putting the binary classification in each appropriate cell. I strongly suggest you create such a table.

Also, rather than just presenting the results from the coding of the interviews, there is a good deal of interpretation of these results. Keeping the interpretation of results for the discussion section could be helpful to the reader and allow more comparing and contrasting between faculty categories.

I am also concerned that the results for this section do not add further clarity to your results. When I look at the results for motivation all four faculty categories show intrinsic motivation and all four show logistical challenges. On Ln 430, you indicate that student centered reformers perceived the challenge differently but what method did you use to code perception? The authors may be reading too much into faculty answers.

When you presented results for Advanced Reformer as well as Student-centered Reformer, you changed the order of presentation of the four themes. Please keep the same order as the others, i.e. attitude, student response, motivation and then challenges. This section has high cognitive load for the reader and keeping the same order of presentation of themes within each category would help decrease the load.

I was very much looking forward to your qualitative triangulation of the Interactive Reformer and Didactic Non-reformer profiles. What did you find in your interviews with them that could shed light on faculty who do not change? These are really the major challenge for future professional development programs. It is necessary that you provide the explanatory results for these two profiles.

We agree with you completely. We have significantly changed this section to try to make it clearer and more useful for interpretation. We have also included descriptions for non-reformers.

Discussion

As I mentioned earlier, the interpretation of result found in the results section would fit better in the discussion section.

Agreed. We have tried to move our interpretations all to the discussion.

I did not see data presented to support your claim in Ln 455 “By addressing the attitudes….”

We have modified this sentence to be more clear.

Ln. 471. The sentence starting with “our results…” needs a qualifier, as not all faculty changed categories by the end of the program. I would suggest “SCT was instrumental in aiding 80% (or the majority) of STEMFI participants.”

We have modified this sentence.

Ln 473. I am not clear how STEMFI program specifically addressed the participants attitudes.

We have clarified the claim in this paragraph.

Reviewer #2: Overall, this is a much-needed study in the field. However, there are some issues with the qualitative data that need to be addressed. The most major issue is that the qualitative data needs to be revised to make the themes and the process of finding them much clearer. For example: how did the themes emerge? How prevalent are these themes? How were the themes found (i.e., using inductive or deductive approaches)? Commentary and explanation of the provided quotes are also necessary in order to explain how the quotes are related to the theme. Additionally, the triangulation between the COPUS and interview data needs to be significantly more explained than it is now. Some visuals may be useful to represent the qualitative data.

Thank you for the feedback. We have added a little bit more detail to the methods explaining the thematic analysis. We did not, however, quantify themes, as that was not the focus of the analysis. We do agree that the profiles portion of our quantitative analysis (i.e., the triangulation between COPUS “reformer types” and interview data) was not as clear as we had hoped (as another reviewer pointed out). We have opted to focus on the four themes and their relation to the Theory of Planned Behavior. And we have explained in our discussion how the qualitative data was not able to differentiate “reformer types” the way the quantitative data could.

Here is some line feedback I have as well:

Line 53: Often workshops are one-off professional development opportunities. You seem to be describing a course redesign institute here. Being clear about what you mean by workshops will be important.

This was a good point. We have added additional detail to clearly show that this program is much more involved than a one-off PD experience.

Line 85: How framework is connected to teaching and learning is important to include.

Great point. We have added additional detail to that paragraph.

Line 96-98: Additional explanation of how your description of the program is related to the attitude toward the behavior is important here.

Thank you. We have clarified.

Reviewer #3: The authors of this manuscript used Ajzen’s Theory of Planned Behavior to design the STEM Faculty Institute (STEMFI) and categorize the types of reformers who completed this professional development program. Pre and post surveys, interviews and classroom observations were used to create descriptive profiles of participants who changed their instructional styles following STEMFI. Based on the improvements of the majority of the participants, the researchers asserted that the Theory of Planned Behavior was an effective framework for producing change in faculty behavior.

Overall, I think that this manuscript was well-written with a sound rationale and sophisticated design, and analysis. I have no request for modifications. I felt inspired while reading this manuscript and I think that this content will be a substantial contribution to the study of faculty professional development programs.

Thank you so much!

Attachment

Submitted filename: Response to Reviewers.docx

Decision Letter 1

Ayse Hilal Bati

22 May 2023

PONE-D-22-33194R1

Iterating toward change: Improving student-centered teaching through the STEM faculty institute (STEMFI)

PLOS ONE

Dear Dr. Jensen,

Thank you for submitting your manuscript to PLOS ONE. After careful consideration, we feel that it has merit but does not fully meet PLOS ONE’s publication criteria as it currently stands. Therefore, we invite you to submit a revised version of the manuscript that addresses the points raised during the review process.

==============================

Dear Author/s,

Your article should be edited in line with reviewer suggestions. It will be evaluated by me after this arrangement.

==============================

Please submit your revised manuscript by Jul 06 2023 11:59PM. If you will need more time than this to complete your revisions, please reply to this message or contact the journal office at plosone@plos.org. When you're ready to submit your revision, log on to https://www.editorialmanager.com/pone/ and select the 'Submissions Needing Revision' folder to locate your manuscript file.

Please include the following items when submitting your revised manuscript:

  • A rebuttal letter that responds to each point raised by the academic editor and reviewer(s). You should upload this letter as a separate file labeled 'Response to Reviewers'.

  • A marked-up copy of your manuscript that highlights changes made to the original version. You should upload this as a separate file labeled 'Revised Manuscript with Track Changes'.

  • An unmarked version of your revised paper without tracked changes. You should upload this as a separate file labeled 'Manuscript'.

If you would like to make changes to your financial disclosure, please include your updated statement in your cover letter. Guidelines for resubmitting your figure files are available below the reviewer comments at the end of this letter.

If applicable, we recommend that you deposit your laboratory protocols in protocols.io to enhance the reproducibility of your results. Protocols.io assigns your protocol its own identifier (DOI) so that it can be cited independently in the future. For instructions see: https://journals.plos.org/plosone/s/submission-guidelines#loc-laboratory-protocols. Additionally, PLOS ONE offers an option for publishing peer-reviewed Lab Protocol articles, which describe protocols hosted on protocols.io. Read more information on sharing protocols at https://plos.org/protocols?utm_medium=editorial-email&utm_source=authorletters&utm_campaign=protocols.

We look forward to receiving your revised manuscript.

Kind regards,

Ayse Hilal Bati, Professor

Academic Editor

PLOS ONE

Journal Requirements:

Please review your reference list to ensure that it is complete and correct. If you have cited papers that have been retracted, please include the rationale for doing so in the manuscript text, or remove these references and replace them with relevant current references. Any changes to the reference list should be mentioned in the rebuttal letter that accompanies your revised manuscript. If you need to cite a retracted article, indicate the article’s retracted status in the References list and also include a citation and full reference for the retraction notice.

Additional Editor Comments:

Dear Author/s

Your article should be edited in line with reviewer suggestions. It will be evaluated by me after this arrangement.


Reviewers' comments:

Reviewer's Responses to Questions

Comments to the Author

1. If the authors have adequately addressed your comments raised in a previous round of review and you feel that this manuscript is now acceptable for publication, you may indicate that here to bypass the “Comments to the Author” section, enter your conflict of interest statement in the “Confidential to Editor” section, and submit your "Accept" recommendation.

Reviewer #1: All comments have been addressed

Reviewer #2: (No Response)

Reviewer #3: All comments have been addressed

Reviewer #4: All comments have been addressed

Reviewer #5: (No Response)

Reviewer #6: All comments have been addressed

Reviewer #7: (No Response)

Reviewer #8: (No Response)

**********

2. Is the manuscript technically sound, and do the data support the conclusions?

The manuscript must describe a technically sound piece of scientific research with data that supports the conclusions. Experiments must have been conducted rigorously, with appropriate controls, replication, and sample sizes. The conclusions must be drawn appropriately based on the data presented.

Reviewer #1: Yes

Reviewer #2: Partly

Reviewer #3: Yes

Reviewer #4: Yes

Reviewer #5: Partly

Reviewer #6: Yes

Reviewer #7: Partly

Reviewer #8: Partly

**********

3. Has the statistical analysis been performed appropriately and rigorously?

Reviewer #1: Yes

Reviewer #2: Yes

Reviewer #3: Yes

Reviewer #4: Yes

Reviewer #5: N/A

Reviewer #6: Yes

Reviewer #7: N/A

Reviewer #8: N/A

**********

4. Have the authors made all data underlying the findings in their manuscript fully available?

The PLOS Data policy requires authors to make all data underlying the findings described in their manuscript fully available without restriction, with rare exception (please refer to the Data Availability Statement in the manuscript PDF file). The data should be provided as part of the manuscript or its supporting information, or deposited to a public repository. For example, in addition to summary statistics, the data points behind means, medians and variance measures should be available. If there are restrictions on publicly sharing data—e.g. participant privacy or use of data from a third party—those must be specified.

Reviewer #1: Yes

Reviewer #2: Yes

Reviewer #3: Yes

Reviewer #4: No

Reviewer #5: (No Response)

Reviewer #6: Yes

Reviewer #7: Yes

Reviewer #8: No

**********

5. Is the manuscript presented in an intelligible fashion and written in standard English?

PLOS ONE does not copyedit accepted manuscripts, so the language in submitted articles must be clear, correct, and unambiguous. Any typographical or grammatical errors should be corrected at revision, so please note any specific errors here.

Reviewer #1: Yes

Reviewer #2: Yes

Reviewer #3: Yes

Reviewer #4: Yes

Reviewer #5: Yes

Reviewer #6: Yes

Reviewer #7: No

Reviewer #8: Yes

**********

6. Review Comments to the Author

Please use the space provided to explain your answers to the questions above. You may also include additional comments for the author, including concerns about dual publication, research ethics, or publication ethics. (Please upload your review as an attachment if it exceeds 20,000 characters)

Reviewer #1: May 6, 2023

Re-Review PLoS One PONE-D-22-33194

The authors have done a very nice job of revising their manuscript based on my initial comments. I find the manuscript is now easier to read and presents a stronger case for their findings. I have only a few minor changes, and I do not need to re-review them.

Unfortunately, the manuscript did not come with line numbers, so I will only be able to use page numbers to indicate areas for change.

Page 7

You start off this section indicating that the program consisted of three phases: pre-, during, and post-workshop. You indicate the pre-workshop content, and in paragraph 2 on page 8 you identify Phase 3. Where is Phase 2? It would be nice if paragraph one on page 8 began with the phrase, “Phase 2 involved faculty participation in a one-time workshop.”

On the other hand, you could just drop the use of the term “phases.”

Page 9 top

“observed using COPUS to measure their post teaching behaviors. Participants chose which classes would be observed.”

Please remove the sentence “Participants chose which classes would be observed” as you more fully explain this in the next section about Observations. This much detail feels out of place here.

Page 9 bottom

“Post-observations were made in the first semester they taught following their participation and were requested by the participant in order to showcase the new techniques they were planning to use.”

This is a nice sentence that addresses the non-random selection of post-workshop courses to COPUS.

The word “selected” would be a better word than “requested”

Page 10

“likewise those who were already using student-centered strategies and continued being student-centered were labeled “Student-Centered Reformers”.

I found it odd that a faculty member who started as student-centered and remained student-centered would be titled a “reformer,” as in fact they did not reform; they just kept doing what they were already doing. If, however, they “broadened their strategies,” as was stated for the interactive-to-interactive faculty, then I could accept labeling them as reformers. Please add more text here to show how they can be considered “reformers.”

This will also impact your statement on page 12 that states 35 of 41 participants changed their teaching enough to be “reformers”. If faculty remained in their category (Inter to Inter, or SC to SC) and did not add new strategies, they should not be termed reformers.

Page 13-top of page.

“so they can only hint at potential differences between reforming attitudes.”

I would rewrite as “so they can only hint at potential differences between reformers.”

Page 13

You italicized the term in transition but did not italicize the dichotomous pair “Fully Reformed”. As you do not italicize the dichotomous pairs under the other themes, please remove the italics from in transition.

Page 21

Student-centered reformers were those who already had demonstrable experience implementing SCT strategies coming into this experience. Their attitudes can teach us several things. First, we see fully reformed attitudes among this group

The middle sentence, “Their attitudes…,” needs a rewrite. I suggest the following:

Student-centered reformers were those who already had demonstrable experience implementing SCT strategies coming into this experience. We found that this group has fully reformed attitudes.

Page 21- bottom

Non-reformers displayed many of the same attitudes as those who chose to make measurable changes. However, we can learn a few things from them that can help inform future efforts.

The second sentence felt a bit too casual with your reader. I would just remove it.

Page 24 Conclusions

“According to the established literature, students who learn through a more student-centered approach have a greater overall retention of information [e.g., 31].”

I have read citation 31 Smith et al. 2014. They conducted a survey of teaching practices across their university using COPUS and had faculty self-report the practices they use. They noted a high degree of alignment between COPUS results and self-reports. I did not see any mention of “greater overall retention of information”. Please remove this citation.

You may need to remove this whole last sentence because I am unaware of any established literature that has shown greater overall retention of information. There are many citations that support increased exam scores but none that I know of that deal with retention.

Reviewer #2: Overall, my biggest feedback is what Reviewer 1 previously mentioned about the post-observation not being at random and only occurring a single time, when the pre-observation occurred multiple times. This is problematic, as the instructor could be performative in their methods and not actually adopting SCT strategies. Additionally, only doing the post-observation once is a similar issue that makes the data less generalizable as to whether the class was a one-off or whether the instructor has really adopted SCT strategies. Overall, I think to help remediate this, the categorization of faculty needs to be written differently, as you can’t confirm that they are now student-centered instructors, but rather that they can use the techniques efficiently. This needs to be made clear in the manuscript.

Reviewer #3: I appreciate the edits made to further clarify the results and methods sections of this manuscript. The faculty professional development community will greatly benefit from the review of this study and program.

Reviewer #4: This submission is interesting, well written, and on an important topic. The authors discuss the theoretical concepts leading to their intervention, which adds to the depth of the submission. The intervention and its evaluation are also multi-layered, and the authors have already made some helpful changes to assist the reader in understanding their results, e.g., Figure 2. I would like to request some minor changes. 1. Please describe acronyms fully when first introduced. The paper assumes prior knowledge, e.g., STEM. COPUS is described, but only on page 7, when the acronym was introduced much earlier. 2. The link provided to the data repository didn't list the study (https://scholarsarchive.byu.edu/) - please provide full details about where the data is located. 3. The authors discuss the concept in the introduction that "development programs frequently do not cause lasting changes to teaching strategies or student engagement" - in the discussion, please address how you plan to overcome this issue. I wouldn't classify the current duration of observation as 'long-term'.

Reviewer #5: This study evaluated the effectiveness of STEMFI on student-centered teaching. Both classroom observations with COPUS and post-interviews were used for evaluation. The results showed that faculty shifted toward student-centered teaching after the STEMFI program. The manuscript is well-written, and the findings are interesting. However, some important clarifications about the results are needed. 1. COPUS captures teacher and student behaviors (e.g., lecturing, asking questions, group work), but it does not capture specific student-centered strategies (e.g., think-pair-share, 5E learning cycle). A faculty shift toward student-centered teaching could be because they spent more time on group work, which does not necessarily mean they used more student-centered strategies. Also, the characteristics of each COPUS profile (in terms of the COPUS codes) need to be discussed to help readers interpret the results, especially those who are not familiar with COPUS. Please clarify, for each type of reformer, what student-centered strategies they tried after participation, or whether they used the same strategies as before but spent more class time on them. 2. I am concerned about the reliability of the observation data. How were graduate and undergraduate students trained to use COPUS? Have you investigated the inter-rater reliability?

More detailed comments below:

1. In my opinion, COPUS codes and the characteristics of each COPUS profile should be discussed in more detail in order to help readers interpret the results. COPUS codes are somewhat general and don’t capture specific student-centered strategies. I think the authors need to be more careful in describing how faculty reformed their teaching. Did they spend less time lecturing but more time on group work, or did they implement some new student-centered strategies, such as the 5E learning cycle and POGIL? Also, I think you need to give more details about the strategies introduced in STEMFI. For example, what is POGIL, if someone doesn’t already know?

2. It is unclear to me how graduate and undergraduate students were trained for using COPUS. How many students did the observations? How were they assigned for observations for different faculty? What is the inter-rater reliability?

3. On page 12, the authors said some post-data was collected on hybrid or online courses. How did it affect the classroom observations? I imagine it is hard to do COPUS with online courses. Also, how did it affect the COPUS profile? Were online courses more likely to be didactic?

4. Thematic analysis. How many researchers coded the transcripts? Did you always have two researchers do independent coding first?

5. Please give more details about the triangulation of COPUS and interview results. Please say more about the explanatory mixed methods design, and how you “merge” the findings.

6. Did the faculty need to apply to get into the program? If so, how many applications each year? Also, I think those who applied are the ones more interested in SCT, and may have more positive attitudes toward SCT. Did you do pre-interview? Any chance those participants already had positive attitudes toward SCT before the program? Can you please comment on how this can affect your findings?

7. You have student responses and challenges as two separate themes. The literature has shown that students’ negative responses are one barrier to reformed teaching. Can you please comment on why you decided not to include this as a challenge? On page 18, there is a quote of student negative feedback as a perceived challenge.

8. The development of the interview protocol was informed by TPB. Am I understanding correctly that the students were considered “significant others” and students’ responses gave subjective norms? If you could be more explicit on how each of the three factors informed the interview questions, that would be great.

9. You included faculty ranking when reporting the results. Did you see any patterns of faculty at different rankings?

Thank you for your work! Looking forward to your responses.

Reviewer #6: An excellent addition to the literature on faculty teaching development. The authors have responded well to the comments from initial review and made updates that improve the quality and readability of their manuscript. I have no further feedback to offer, and look forward to seeing this manuscript published!

Reviewer #7: There are minor revisions (e.g., use of terminology, condensing the Discussion). Careful edit of the text is necessary.

Reviewer #8: Summary:

This study attempts to examine the impacts of a faculty professional development program, STEMFI, on instructional practices (COPUS data) and on instructors’ attitudes, perceptions of students’ responses, and confidence (interview data). The study appears to be well aligned with the selected theoretical framework, the Theory of Planned Behavior, and there are some clear visual representations in Figures 1 and 2. Unfortunately, there are quite a few major revisions that need to be addressed before this manuscript is considered any further: (1) the focus is on examining student-centered teaching (SCT), when I think it should be on active learning; (2) the educational problem(s) and/or research question(s) are absent (or at least unclear); (3) COPUS data are presented in a qualitative manner, but the authors claim that these are quantitative results; and (4) the mixed methods research design is unclear. I suggest that the authors rework the introduction and methods with the suggestions below in mind.

Major issues:

Freeman et al. (2014) meta-analysis focuses on the impacts of active learning on student performance outcomes, not student-centered teaching (SCT). I would be careful to not conflate active learning and SCT. I would suggest that the terminology is changed from SCT to active learning since not all active learning is necessarily student-centered.

Relatedly, is there a reason that you decided to write your own definition of SCT (or what I am suggesting is active learning in my point above)? Why did you not want to use a definition from the literature? I suggest that you review this CBE LSE paper on active learning for potential literature-based definitions: Driessen, E. P., Knight, J. K., Smith, M. K., & Ballen, C. J. (2020). Demystifying the meaning of active learning in postsecondary biology education. CBE—Life Sciences Education, 19(4), ar52.

The educational problem(s) and/or research question(s) are not clearly stated in the abstract nor at the end of the introduction. I am not sure what is being examined in this study, so it will be hard for me to determine if the methods are appropriate and well-aligned with the educational problem and/or research question.

While Figure 2 is clear for understanding how groups of instructors shifted from one COPUS profile to another, I don’t think that Figure 2 is sufficient for the reader to be able to understand the quantitative nature of the COPUS results. Right now, the results are described in a qualitative manner, which is okay; however, the authors claim that the results are quantitative. I would suggest that you give an example of the instructor and student behaviors occurring in each of those six types of reformer classrooms. See Shi et al. (2023) CBE LSE (https://doi.org/10.1187/cbe.22-03-0047) for an example of figures.

Also, the mixed methods research design is unclear to me. Right now, it appears that COPUS data were collected, and interview data were collected, but how one dataset might inform the other needs to be further explored by the authors to answer the research question(s).

Minor issues:

Introduction:

Page 3

Please add a reference to the last statement on this page about the primary objective of these workshops. In addition, I think that “knowledge” is missing from the list of primary outcomes of faculty professional development programs.

Methods:

Page 7

Participants: Did you collect any other demographic data from the instructors? Or only discipline and gender (in the binary)? Also, could you please describe the student population being served by the instructors being studied? And finally, please describe the process for recruiting these instructors.

Page 9

How did you ensure that the students were trained to code COPUS in a reliable manner? Did you calculate inter-rater reliability after this training? Please provide more details on how you tested for coder reliability.

Page 10

I was not able to access the supplemental information containing the full interview protocol. Also, it would be helpful if the supplemental file information (e.g., Supplemental File S1) was noted in the text.

Page 11

Who are the several researchers that thematically coded the interview responses? I think it is important to be transparent about which specific researchers did this work to build trustworthiness in the data.

How exactly did you connect the quantitative and qualitative data using exploratory mixed methods design? It is unclear to me which COPUS and which interview data were used for these analyses. It is not until the results that I read that themes were not tied to specific reformer types. It is important that you bring up these details in the methods, not just results.

Results:

Page 11

I would move this participant info to the methods section. It’s good to be transparent about the loss of participants due to COVID, but I don’t think it is a great way to start your results section.

Why did you decide to add the ranking of the instructors to the quantitative results? Are you interested in examining how instructor rankings impacted use of student-centered teaching practices? Again, I am not sure if these results are aligned to the research questions and methods as I did not clearly see research questions earlier on.

References:

Check first author spelling for this citation: Creswell, J. W., & Plano Clark, V. L. (2007). Designing and conducting mixed methods research. Thousand Oaks, CA: Sage.

**********

7. PLOS authors have the option to publish the peer review history of their article (what does this mean?). If published, this will include your full peer review and any attached files.

If you choose “no”, your identity will remain anonymous but your review may still be made public.

Do you want your identity to be public for this peer review? For information about this choice, including consent withdrawal, please see our Privacy Policy.

Reviewer #1: No

Reviewer #2: No

Reviewer #3: No

Reviewer #4: Yes: Richard G McGee

Reviewer #5: No

Reviewer #6: No

Reviewer #7: No

Reviewer #8: No

**********

[NOTE: If reviewer comments were submitted as an attachment file, they will be attached to this email and accessible via the submission site. Please log into your account, locate the manuscript record, and check for the action link "View Attachments". If this link does not appear, there are no attachment files.]

While revising your submission, please upload your figure files to the Preflight Analysis and Conversion Engine (PACE) digital diagnostic tool, https://pacev2.apexcovantage.com/. PACE helps ensure that figures meet PLOS requirements. To use PACE, you must first register as a user. Registration is free. Then, login and navigate to the UPLOAD tab, where you will find detailed instructions on how to use the tool. If you encounter any issues or have any questions when using PACE, please email PLOS at figures@plos.org. Please note that Supporting Information files do not need this step.

Attachment

Submitted filename: PLOS ONE FEEDBACK.pdf

Attachment

Submitted filename: ReReview PLoS One PONE-D-22-33194.docx

Attachment

Submitted filename: Review PONE 5-7-23v2docx.docx

PLoS One. 2023 Aug 17;18(8):e0289464. doi: 10.1371/journal.pone.0289464.r004

Author response to Decision Letter 1


30 May 2023

Response to Reviewers

Reviewer #1: May 6, 2023

Re-Review PLoS One PONE-D-22-33194

The authors have done a very nice job of revising their manuscript based on my initial comments. I find the manuscript is now easier to read and presents a stronger case for their findings. I have only a few minor changes and I do not need to rereview these minor changes.

Unfortunately, the manuscript did not come with line numbers, so I will only be able to use page numbers to indicate areas for change.

Page 7

You start off this section indicating that the program consisted of three phases: pre-, during, and post-workshop. You indicate the pre-workshop content, and in paragraph 2 on page 8 you identify Phase 3. Where is Phase 2? It would be nice if paragraph one on page 8 began with the phrase, “Phase 2 involved faculty participation in a one-time workshop.”

On the other hand, you could just drop the use of the term “phases.”

Good suggestion. We have added this phrase for clarification.

Page 9 top

“observed using COPUS to measure their post teaching behaviors. Participants chose which classes would be observed.”

Please remove the sentence “Participants chose which classes would be observed” as you more fully explain this in the next section about Observations. This much detail feels out of place here.

Good suggestion. We have removed this sentence for clarification.

Page 9 bottom

“Post-observations were made in the first semester they taught following their participation and were requested by the participant in order to showcase the new techniques they were planning to use.”

This is a nice sentence that addresses the non-random selection of post-workshop courses to COPUS.

The word “selected” would be a better word than “requested”

Thank you for this suggestion. We have adjusted the wording.

Page 10

“likewise those who were already using student-centered strategies and continued

being student-centered were labeled “Student-Centered Reformers”.

I found it odd that a faculty member who started as student-centered and remained student-centered would be titled a “reformer,” as in fact they did not reform; they just kept doing what they were already doing. If, however, they “broadened their strategies,” as was stated for the interactive-to-interactive faculty, then I could accept labeling them as reformers. Please add more text here to show how they can be considered “reformers.”

This will also impact your statement on page 12 that 35 of 41 participants changed their teaching enough to be “reformers.” Faculty who remained in their category (Inter to Inter, or SC to SC) and did not add new strategies should not be termed reformers.

We have added more text to page 10 and 12 to clarify the description of classification.

Page 13-top of page.

“so they can only hint at potential differences between reforming attitudes.”

I would rewrite as “so they can only hint at potential differences between reformers.”

Thank you, we adjusted this wording.

Page 13

You italicized the term in transition but did not italicize the dichotomous pair “Fully Reformed.” As you do not italicize the dichotomous pairs under the other themes, please remove the italics from in transition.

Adjusted.

Page 21

Student-centered reformers were those who already had demonstrable experience implementing SCT strategies coming into this experience. Their attitudes can teach us several

things. First, we see fully reformed attitudes among this group

The middle sentence, Their attitudes.. needs a rewrite. I suggest the following

Student-centered reformers were those who already had demonstrable experience implementing SCT strategies coming into this experience. We found that this group has fully reformed attitudes.

Great suggestion. We have implemented the edit.

Page 21- bottom

Non-reformers displayed many of the same attitudes as those who chose to make

measurable changes. However, we can learn a few things from them that can help inform future efforts.

The second sentence felt a bit too casual with your reader. I would just remove it.

We have removed colloquial wording.

Page 24 Conclusions

“According to the established literature, students who learn through a more student-centered

approach have a greater overall retention of information [e.g., 31].”

I have read citation 31 Smith et al. 2014. They conducted a survey of teaching practices across their university using COPUS and had faculty self-report the practices they use. They noted a high degree of alignment between COPUS results and self-reports. I did not see any mention of “greater overall retention of information”. Please remove this citation.

You may need to remove this whole last sentence because I am unaware of any established literature that has shown greater overall retention of information. There are many citations that support increased exam scores but none that I know of that deal with retention.

This is an excellent point. We have removed the sentence entirely.

Reviewer #2: Overall, my biggest feedback is what Reviewer 1 previously mentioned about the post-observation not being random and occurring only a single time, whereas the pre-observation occurred multiple times. This is problematic, as the instructor could be performative in their methods and not actually adopt SCT strategies. Additionally, doing the post-observation only once makes the data less generalizable as to whether the class was a one-off or whether the instructor has really adopted SCT strategies. To help remediate this, the categorization of faculty needs to be written differently: you cannot confirm that they are now student-centered instructors, only that they can use the techniques efficiently. This needs to be made clear in the manuscript.

Throughout the manuscript we have softened the claims: rather than stating that the participants became student-centered instructors, we now state that they can use the techniques more efficiently.

Reviewer #3: I appreciate the edits made to further clarify the results and methods sections of this manuscript. The faculty professional development community will greatly benefit from the review of this study and program.

Thank you for your time in reviewing and encouragement. We are excited to share our results.

Reviewer #4: This submission is interesting, well written, and on an important topic. The authors discuss the theoretical concepts leading to their intervention, which adds to the depth of the submission. The intervention and its evaluation are also multi-layered, and the authors have already made some helpful changes to assist the reader in understanding their results, e.g., Figure 2. I would like to request some minor changes.

1. Please describe acronyms fully when first introduced. The paper assumes prior knowledge e.g. STEM. COPUS is described but only on Page 7 when the acronym was introduced much earlier.

We have clarified the acronyms. Thank you for pointing out this edit.

2. The link provided to the data repository didn't list the study https://scholarsarchive.byu.edu/ - please provide full details about where the data is located.

Thank you for the reminder. We were waiting to post it, until the manuscript was accepted. We have fixed the URL to access the data that is now posted: https://scholarsarchive.byu.edu/data/49

3. The authors discuss the concept in the introduction that "development programs frequently do not cause lasting changes to teaching strategies or student engagement" - in the discussion please address how you plan to overcome this issue. I wouldn't classify the current duration of observation as 'long-term'.

In the discussion we have qualified our perspectives on lasting change based on the evidence from this study. Thank you for helping us connect these ideas in our paper.

Reviewer #5: This study evaluated the effectiveness of STEMFI on student-centered teaching. Both classroom observations with COPUS and post-interviews were used for evaluation. The results showed that faculty shifted toward student-centered teaching after the STEMFI program. The manuscript is well-written, and the findings are interesting. However, some important clarifications about the results are needed. 1. COPUS captures teacher and student behaviors (e.g., lecturing, asking questions, group work), but it does not capture specific student-centered strategies (e.g., think-pair-share, 5E learning cycle). Faculty shifting toward student-centered teaching could be because they spent more time on group work; it does not necessarily mean they used more student-centered strategies. Also, the characteristics of each COPUS profile (in terms of the COPUS codes) need to be discussed to help readers interpret the results, especially those who are not familiar with COPUS. Please clarify, for each type of reformer, what student-centered strategies they tried after participation, or whether they used the same strategies as before but spent more class time on them. 2. I am concerned about the reliability of the observation data. How were graduate and undergraduate students trained to use COPUS? Have you investigated the inter-rater reliability?

We have responded to each of the detailed comments below.

More detailed comments below:

1. In my opinion, COPUS codes and the characteristics of each COPUS profile should be discussed in more detail in order to help readers interpret the results. COPUS codes are somewhat general, which doesn’t capture specific student-centered strategies. I think the authors need to be more carefully describing how faculty reformed their teaching. Did they spend less time lecturing but more time on group work, or did they implement some new student-centered strategies, such as 5E learning cycle and POGIL? Also, I think you also need to give more details about the strategies introduced in STEMFI. For example, what is POGIL if someone doesn’t already know.

The COPUS measurement allows us to classify active learning strategies into a few categories. Codes include more teacher-centered approaches, such as lecturing by the instructor and listening by the students; more interactive strategies, like student questions and instructor answers; and more fully student-centered strategies, like group work, clicker questions, and the instructor moving around the room and guiding students. It does not, however, assess specific active techniques. For more description of the instrument and codes, see Smith et al. 2018. We have clarified the manuscript to include this description. Thank you for pointing out this edit. Additionally, we have included brief details of the workshop but referred readers to our other publication where the workshop is described in more detail.

We have added information and an additional citation about the specific strategies introduced in STEMFI.

2. It is unclear to me how graduate and undergraduate students were trained to use COPUS. How many students did the observations? How were they assigned to observe different faculty? What is the inter-rater reliability?

We used the training protocol outlined in Smith et al., which is a common training protocol for the COPUS. This is already stated in the manuscript: “Undergraduate and graduate students were trained to use the COPUS using the training protocol established in Smith et al. [21].”

3. On page 12, the authors said some post-data was collected on hybrid or online courses. How did it affect the classroom observations? I imagine it is hard to do COPUS with online courses. Also, how did it affect the COPUS profile? Were online courses more likely to be didactic?

We have added a clarifying statement that addresses the data collected on hybrid courses and how it affected the classroom observations.

4. Thematic analysis. How many researchers coded the transcripts? Did you always have two researchers do independent coding first?

Please see the description that was already revised in response to previous reviewers under the heading “Interview protocol for qualitative analysis.”

5. Please give more details about the triangulation of COPUS and interview results. Please say more about the explanatory mixed methods design, and how you “merge” the findings.

Previous reviewers suggested that the triangulation was not clear. We significantly re-wrote the qualitative portion, responding to reviewers as such:

“This is an interesting perspective and it made us re-think how we described our categories. We agree that it was difficult to differentiate among the reformer types with the qualitative data and we decided that perhaps that was not the point. We have gone in and rewritten the qualitative section to be more about the four themes that emerged in conjunction with the TPB framework, with suggestions for trends that match the profiles. But, we emphasize that it is not always a direct relationship between them (i.e., there is a lot of overlap). We have also included descriptions of our didactic non-reformers.”

We also added to our manuscript, the following: “Through the lens of the Theory of Planned Behavior, we triangulated these categories with COPUS data to make loose hypotheses about the motivations of each reformer. However, we found that the themes were not robustly tied to specific reformer types in an exclusive way, so they can only hint at potential differences between reformers.”

Without more details about what you’d like to see, it is difficult to know what additional information is needed.

6. Did the faculty need to apply to get into the program? If so, how many applications were there each year? Also, I think those who applied are the ones more interested in SCT and may have more positive attitudes toward SCT. Did you do a pre-interview? Is there any chance those participants already had positive attitudes toward SCT before the program? Can you please comment on how this could affect your findings?

Thank you for this suggestion. We have included a clarification and future research direction in our limitation section about our sample that may have already had an increased interest in reforming their instruction.

7. You have student responses and challenges as two separate themes. The literature has shown that students’ negative responses are one barrier to reformed teaching. Can you please comment on why you decided not to include this as a challenge? On page 18, there is a quote of negative student feedback as a perceived challenge.

We recognize that negative student responses are a challenge (we added an additional sentence to clarify), but we separated this as a different category because there were also positive student responses. This is an important idea, however, and we see it as a future avenue for specific and directed research. We have added a clarifying statement in the manuscript.

8. The development of the interview protocol was informed by TPB. Am I understanding correctly that the students were considered “significant others” and that students’ responses gave subjective norms? If you could be more explicit about how each of the three factors informed the interview questions, that would be great.

Thank you for this clarifying point. Yes, you are correct, students were considered the “significant others” that were providing feedback. But, we also considered peers and administration in our interviews. We have added details under the interview protocol for qualitative analysis to demonstrate how we developed the interview questions based on TPB.

9. You included faculty ranking when reporting the results. Did you see any patterns of faculty at different rankings?

In this iteration we didn’t have sufficient participation at each rank to see any meaningful patterns. Additionally, rank data were added at the request of a previous reviewer. We agree that this would be an important idea to flesh out in future research.

Thank you for your work! Looking forward to your responses.

Reviewer #6: An excellent addition to the literature on faculty teaching development. The authors have responded well to the comments from initial review and made updates that improve the quality and readability of their manuscript. I have no further feedback to offer, and look forward to seeing this manuscript published!

Thank you for the feedback.

Reviewer #7: There are minor revisions (e.g., use of terminology, condensing the Discussion). Careful edit of the text is necessary.

We are unable to address the concerns of this reviewer since no details were offered. We have read through the manuscript to look for any grammatical errors and fixed any we found.

Reviewer #8: Summary:

This study attempts to examine the impacts of a faculty professional development program, STEMFI, on instructional practices (COPUS data) and on instructors’ attitudes, perceptions of students’ responses, and confidence (interview data). The study appears to be well aligned with the selected theoretical framework, the Theory of Planned Behavior, and there are some clear visual representations in Figures 1 and 2. Unfortunately, there are quite a few major revisions that need to be addressed before this manuscript is considered any further: (1) the focus is on examining student-centered teaching (SCT), when I think it should be on active learning; (2) the educational problem(s) and/or research question(s) are absent (or at least unclear); (3) COPUS data are presented in a qualitative manner, but the authors claim that these are quantitative results; and (4) the mixed methods research design is unclear. I suggest that the authors rework the introduction and methods with the suggestions below in mind.

Major issues:

Freeman et al. (2014) meta-analysis focuses on the impacts of active learning on student performance outcomes, not student-centered teaching (SCT). I would be careful to not conflate active learning and SCT. I would suggest that the terminology is changed from SCT to active learning since not all active learning is necessarily student-centered.

We have clarified in the manuscript that STEMFI focuses on student-centered strategies, and we have modified the manuscript to distinguish places where we refer to active learning generally from places where we are specifically focused on SCT. We do not want to rename SCT throughout because our STEMFI program was specifically focused on SCT.

Relatedly, is there a reason that you decided to write your own definition of SCT (or what I am suggesting is active learning in my point above)? Why did you not want to use a definition from the literature? I suggest that you review this CBE LSE paper on active learning for potential literature-based definitions: Driessen, E. P., Knight, J. K., Smith, M. K., & Ballen, C. J. (2020). Demystifying the meaning of active learning in postsecondary biology education. CBE—Life Sciences Education, 19(4), ar52.

Reviewer #1 in the original response to reviews brought this up and we decided that student-centered teaching was the better term to use. See our response: We prefer the term “Student-Centered Teaching” to emphasize the constructivist approach that we were teaching. Active learning seems too broad (i.e., teaching can be active and not constructivist). Evidence-based practices is also just not as descriptive. We have made sure we consistently use that term throughout.

The educational problem(s) and/or research question(s) are not clearly stated in the abstract nor at the end of the introduction. I am not sure what is being examined in this study, so it will be hard for me to determine if the methods are appropriate and well-aligned with the educational problem and/or research question.

We have clarified our research question at the end of the introduction.

While Figure 2 is clear for understanding how groups of instructors shifted from one COPUS profile to another, I don’t think that Figure 2 is sufficient for the reader to be able to understand the quantitative nature of the COPUS results. Right now, the results are described in a qualitative manner, which is okay; however, the authors claim that the results are quantitative. I would suggest that you give an example of the instructor and student behaviors occurring in each of those six types of reformer classrooms. See Shi et al. (2023) CBE LSE (https://doi.org/10.1187/cbe.22-03-0047) for an example of figures.

We have cited Stains et al. in our methodology of classification and have further described our classification of reformers based on the instructor and student behavior data. Additionally, we feel that Figure 2 is representative, as noted in earlier comments from Reviewer #1: “The categories you created to classify participants are very clever and the figure 2 is a nice representation of that categorization method.”

Also, the mixed methods research design is unclear to me. Right now, it appears that COPUS data were collected, and interview data were collected, but how one dataset might inform the other needs to be further explored by the authors to answer the research question(s).

Please see our comment above.

Minor issues:

Introduction:

Page 3

Please add a reference to the last statement on this page about the primary objective of these workshops. In addition, I think that “knowledge” is missing from the list of primary outcomes of faculty professional development programs.

We removed this sentence. It seemed redundant. We agree that knowledge should be a desired outcome, but we did not find studies that measured this.

Methods:

Page 7

Participants: Did you collect any other demographic data from the instructors? Or only discipline and gender (in the binary)? Also, could you please describe the student population being served by the instructors being studied? And finally, please describe the process for recruiting these instructors.

No, we did not collect any additional demographics on these instructors. We also did not collect demographics from their students. However, their students would presumably be representative of the student body at the host institution, details of which we have now included under “Participants” in the Methods section.

Page 9

How did you ensure that the students were trained to code COPUS in a reliable manner? Did you calculate inter-rater reliability after this training? Please provide more details on how you tested for coder reliability.

(Please see our comment above: we used the COPUS training program outlined by the authors of the instrument. We did not systematically collect IRR, but we followed the protocol until we reached agreement.)

Page 10

I was not able to access the supplemental information containing the full interview protocol. Also, it would be helpful if the supplemental file information (e.g., Supplemental File S1) was noted in the text.

Thank you for noting this missing file. We will include it in the final submission, and we have indicated it in the text.

Page 11

Who are the several researchers that thematically coded the interview responses? I think it is important to be transparent about which specific researchers did this work to build trustworthiness in the data.

We have changed the wording in the manuscript to indicate which researchers thematically coded the interview responses.

How exactly did you connect the quantitative and qualitative data using exploratory mixed methods design? It is unclear to me which COPUS and which interview data were used for these analyses. It is not until the results that I read that themes were not tied to specific reformer types. It is important that you bring up these details in the methods, not just results.

This is an excellent point. We have duplicated the statement from the results in the methods section.

Results:

Page 11

I would move this participant info to the methods section. It’s good to be transparent about the loss of participants due to COVID, but I don’t think it is a great way to start your results section.

We have adjusted the participant info in the results section.

Why did you decide to add the ranking of the instructors to the quantitative results? Are you interested in examining how instructor rankings impacted use of student-centered teaching practices? Again, I am not sure if these results are aligned to the research questions and methods as I did not clearly see research questions earlier on.

We added this only in response to a previous reviewer's request. We were not interested in analyzing reform by rank, as we did not have enough participants in each category to identify meaningful patterns.

References:

Check first author spelling for this citation: Creswell, J. W., & Plano Clark, V. L. (2007). Designing and conducting mixed methods research. Thousand Oaks, CA: Sage.

We fixed it! Thank you for catching this!

Attachment

Submitted filename: Response to Reviewers.docx

Decision Letter 2

Ayse Hilal Bati

3 Jul 2023

PONE-D-22-33194R2

Iterating toward change: improving student-centered teaching through the STEM faculty institute (STEMFI)

PLOS ONE

Dear Dr. Jensen,

Thank you for submitting your manuscript to PLOS ONE. After careful consideration, we feel that it has merit but does not fully meet PLOS ONE’s publication criteria as it currently stands. Therefore, we invite you to submit a revised version of the manuscript that addresses the points raised during the review process.

 The revised form of your article requires minimal corrections as noted in the reviewer's comments. With these arrangements, which can be completed in a short time, the article will be ready for publication.

Please submit your revised manuscript by Aug 17 2023 11:59PM. If you will need more time than this to complete your revisions, please reply to this message or contact the journal office at plosone@plos.org. When you're ready to submit your revision, log on to https://www.editorialmanager.com/pone/ and select the 'Submissions Needing Revision' folder to locate your manuscript file.

Please include the following items when submitting your revised manuscript:

  • A rebuttal letter that responds to each point raised by the academic editor and reviewer(s). You should upload this letter as a separate file labeled 'Response to Reviewers'.

  • A marked-up copy of your manuscript that highlights changes made to the original version. You should upload this as a separate file labeled 'Revised Manuscript with Track Changes'.

  • An unmarked version of your revised paper without tracked changes. You should upload this as a separate file labeled 'Manuscript'.

If you would like to make changes to your financial disclosure, please include your updated statement in your cover letter. Guidelines for resubmitting your figure files are available below the reviewer comments at the end of this letter.

If applicable, we recommend that you deposit your laboratory protocols in protocols.io to enhance the reproducibility of your results. Protocols.io assigns your protocol its own identifier (DOI) so that it can be cited independently in the future. For instructions see: https://journals.plos.org/plosone/s/submission-guidelines#loc-laboratory-protocols. Additionally, PLOS ONE offers an option for publishing peer-reviewed Lab Protocol articles, which describe protocols hosted on protocols.io. Read more information on sharing protocols at https://plos.org/protocols?utm_medium=editorial-email&utm_source=authorletters&utm_campaign=protocols.

We look forward to receiving your revised manuscript.

Kind regards,

Ayse Hilal Bati, Professor

Academic Editor

PLOS ONE

Journal Requirements:

Please review your reference list to ensure that it is complete and correct. If you have cited papers that have been retracted, please include the rationale for doing so in the manuscript text, or remove these references and replace them with relevant current references. Any changes to the reference list should be mentioned in the rebuttal letter that accompanies your revised manuscript. If you need to cite a retracted article, indicate the article’s retracted status in the References list and also include a citation and full reference for the retraction notice.

Additional Editor Comments:

Dear Author/s,

The revised form of your article requires minimal corrections as noted in the reviewer's comments. With these arrangements, which can be completed in a short time, the article will be ready for publication. Thank you.

[NOTE: If reviewer comments were submitted as an attachment file, they will be attached to this email and accessible via the submission site. Please log into your account, locate the manuscript record, and check for the action link "View Attachments". If this link does not appear, there are no attachment files.]

While revising your submission, please upload your figure files to the Preflight Analysis and Conversion Engine (PACE) digital diagnostic tool, https://pacev2.apexcovantage.com/. PACE helps ensure that figures meet PLOS requirements. To use PACE, you must first register as a user. Registration is free. Then, login and navigate to the UPLOAD tab, where you will find detailed instructions on how to use the tool. If you encounter any issues or have any questions when using PACE, please email PLOS at figures@plos.org. Please note that Supporting Information files do not need this step.

PLoS One. 2023 Aug 17;18(8):e0289464. doi: 10.1371/journal.pone.0289464.r006

Author response to Decision Letter 2


17 Jul 2023

There were no additional comments from reviewers.

Decision Letter 3

Ayse Hilal Bati

20 Jul 2023

Iterating toward change: improving student-centered teaching through the STEM faculty institute (STEMFI)

PONE-D-22-33194R3

Dear Dr. Jensen,

We’re pleased to inform you that your manuscript has been judged scientifically suitable for publication and will be formally accepted for publication once it meets all outstanding technical requirements.

Within one week, you’ll receive an e-mail detailing the required amendments. When these have been addressed, you’ll receive a formal acceptance letter and your manuscript will be scheduled for publication.

An invoice for payment will follow shortly after the formal acceptance. To ensure an efficient process, please log into Editorial Manager at http://www.editorialmanager.com/pone/, click the 'Update My Information' link at the top of the page, and double check that your user information is up-to-date. If you have any billing related questions, please contact our Author Billing department directly at authorbilling@plos.org.

If your institution or institutions have a press office, please notify them about your upcoming paper to help maximize its impact. If they'll be preparing press materials, please inform our press team as soon as possible, no later than 48 hours after receiving the formal acceptance. Your manuscript will remain under strict press embargo until 2 pm Eastern Time on the date of publication. For more information, please contact onepress@plos.org.

Kind regards,

Ayse Hilal Bati, Professor

Academic Editor

PLOS ONE

Additional Editor Comments (optional):

Dear Author/s,

I am very glad that I had the opportunity to evaluate your article. You have made significant progress by improving your article according to the referees' recommendations. I think the article is suitable for publication in PLOS ONE. Thank you.

Acceptance letter

Ayse Hilal Bati

9 Aug 2023

PONE-D-22-33194R3

Iterating toward change: improving student-centered teaching through the STEM faculty institute (STEMFI)

Dear Dr. Jensen:

I'm pleased to inform you that your manuscript has been deemed suitable for publication in PLOS ONE. Congratulations! Your manuscript is now with our production department.

If your institution or institutions have a press office, please let them know about your upcoming paper now to help maximize its impact. If they'll be preparing press materials, please inform our press team within the next 48 hours. Your manuscript will remain under strict press embargo until 2 pm Eastern Time on the date of publication. For more information please contact onepress@plos.org.

If we can help with anything else, please email us at plosone@plos.org.

Thank you for submitting your work to PLOS ONE and supporting open access.

Kind regards,

PLOS ONE Editorial Office Staff

on behalf of

Dr. Ayse Hilal Bati

Academic Editor

PLOS ONE

Associated Data

    This section collects any data citations, data availability statements, or supplementary materials included in this article.

    Supplementary Materials

    S1 File

    (DOCX)

    Attachment

    Submitted filename: Review PONE.docx

    Attachment

    Submitted filename: Response to Reviewers.docx

    Attachment

    Submitted filename: PLOS ONE FEEDBACK.pdf

    Attachment

    Submitted filename: ReReview PLoS One PONE-D-22-33194.docx

    Attachment

    Submitted filename: Review PONE 5-7-23v2docx.docx

    Attachment

    Submitted filename: Response to Reviewers.docx

    Data Availability Statement

    Data are available from the Brigham Young University Institutional Scholars archive (via https://scholarsarchive.byu.edu/data/49).

