Abstract
Graduate students experience high levels of demand in their degree programs, which often results in difficulty maintaining academic performance and managing distress. The present study examined the effectiveness of a 6-week values clarification and committed action training program derived from acceptance and commitment therapy (ACT) for increasing the academic performance and psychological flexibility of graduate students in a behavior analysis and therapy program. A Values intervention group was compared to a Study Tips active treatment control group on measures of academic performance, psychological flexibility, values-driven behavior, and stress. The results suggest that the Values group demonstrated statistically significant improvements in academic performance (t(32) = 1.902, p < .05), psychological flexibility (t(32) = 1.895, p < .05), and ratings of the importance of education-related values (t(32) = 2.013, p < .05) relative to the control group, and nonsignificant improvements in reports of consistency with education-related values (t(32) = 0.7204, p > .05) and perceived stress (t(32) = 1.521, p > .05). The Values group also demonstrated a higher social validity score than the control group following the intervention (t(32) = 2.449, p < .05).
Keywords: Values clarification, Randomized controlled trial, Graduate students, Psychological flexibility
In 2015, there were 2.9 million students enrolled in graduate degree programs in the USA, and within the next 10 years, that number is expected to increase by as much as 12% (U.S. Department of Education, National Center for Education Statistics, 2017). While the number of students enrolling in graduate studies is increasing, attrition is a common problem among this population, with completion rates varying across areas of study. For example, attrition in master’s and doctorate level psychology-related programs can be as high as 10% of students (Michalski, Cope, & Fowler, 2016). Graduate students’ failure to complete their degree programs may be related to the high demands they experience, which can result in various academic and psychological problems.
Graduate degree programs are known for their rigor and high level of demand, which can result in a number of challenges for students completing these programs. Students may encounter challenging coursework, additional departmental and program requirements, and pressure to achieve high levels of research and clinical productivity in order to secure employment following graduation. Research with graduate students has revealed high levels of stress and mental health problems among this group. A study conducted by Hyun, Quinn, Madon, and Lustig (2006) at a large US university concluded that the graduate student participants experienced high levels of mental health need, including stress-related problems that had significantly impacted their academic performance or emotional well-being, feelings of depression, and financial stress and hardship; yet only approximately one-third of these students had sought counseling or other mental health services. Many factors contribute to graduate student distress, such as academic performance, quantity of school work, finances, assistantships, and poor work/school-life balance (Oswalt & Riddock, 2007; El-Ghoroury, Galper, Sawaqdah, & Bufka, 2012). Several coping strategies are commonly used by graduate students, including maladaptive or unhealthy behaviors such as drinking alcohol, smoking cigarettes, and taking prescription medication, as well as adaptive methods such as friend and family support, “vegging out,” exercise, and yoga (Oswalt & Riddock, 2007). A correlational study found that psychology graduate students throughout the USA who utilized the self-care practices of sleep hygiene, social support, emotion regulation strategies, and mindful acceptance reported lower levels of perceived stress than students who did not (Myers Virtue, Sweeney, Wesley, & Fingerhut, 2012).
Although these techniques may be effective for alleviating stress, another survey-based study inclusive of psychology graduate student participants in the USA found that the most frequently reported barriers to utilizing wellness resources were the lack of time and money required to access those services or engage in self-care activities (El-Ghoroury et al., 2012). Given these well-documented stress and wellness-related needs, there is an apparent need for interventions designed to decrease distress and increase academic performance for this group. To date, however, far more research has evaluated treatments for undergraduate populations.
Examinations of various treatments for undergraduate college students are prevalent in the literature, targeting behavior change related to stress, alcohol and marijuana use, academic performance, health, procrastination, and other achievement and wellness-related areas (for example, Lee, Neighbors, Kilmer, & Larimer, 2010; Murphey et al., 2001; Perrin et al., 2011; Walton & Cohen, 2011). The development and evaluation of interventions designed specifically for graduate students has received less empirical attention despite the well-documented psychological and academic issues experienced by this population. This may be because graduate students represent a smaller proportion of the student population, because they are often folded into general university student samples, or because they more frequently serve on research teams than as research participants. Our review of the literature in this area revealed one study that evaluated the effects of a reciprocal peer tutoring intervention on graduate students’ achievement, test anxiety, and self-efficacy; it found no significant differences between the treatment and control groups on any measure, although participants reported that they believed the tutoring was an effective technique despite these outcomes (Griffin & Griffin, 1997). The disparity between the empirical reports of graduate student distress and the interventions designed to alleviate these problems creates an area of research in which behavior analytic therapeutic interventions may have utility.
Various behavior analytic approaches, for example, contingency management and behavioral skills training, may be effective for changing stress- and performance-related behaviors in graduate students. However, interventions derived from clinical behavior analysis such as values clarification and committed action, which are core components of acceptance and commitment therapy (ACT), have recently begun to receive empirical attention in the context of students in higher education. ACT incorporates mindfulness, acceptance, and behavior change techniques to reduce the influence of psychological distress that results from ineffective rule-governed behavior and may have a negative impact on the individual’s behavior in various life domains. These techniques include conditioned motivating operations that evoke adaptive behavior, discrimination of verbal and overt behaviors that increase access to reinforcement, and the use of various self-management strategies to monitor progress (Hayes, Strosahl, & Wilson, 2012). The overarching goal of ACT is to increase psychological flexibility, which refers to an individual’s ability to engage in adaptive, flexible behavior appropriate to the current context, increasing contact with reinforcement without unnecessary verbal regulation (Hayes, Strosahl, Bunting, Twohig, & Wilson, 2004). ACT-based interventions have been evaluated in the context of college students in several different analyses that have produced favorable outcomes, such as an acceptance-based behavioral therapy program for first-year college students (Danitz & Orsillo, 2014), a mindfulness and yoga intervention for college athletes (Goodman, Kashdan, Mallard, & Schumann, 2014), an ACT training to improve academic outcomes for students from low-income families at risk for leaving college before obtaining a degree (Sandoz, Kellum, & Wilson, 2017), and a mindfulness program designed specifically for graduate students (Cohen & Miller, 2009).
ACT interventions traditionally utilize six core processes that collectively increase an individual’s ability to engage in these flexible behavioral repertoires, two of which, values and committed action, may have particular utility for graduate students.
Values, from the perspective of clinical application, are considered “chosen concepts linked with patterns of action that provide a sense of meaning and that can coordinate our behavior over long time frames,” which are intangible and can never be fully attained (Dahl, 2015, p. 43). Committed actions refer to those patterns of action that are developed within the values framework (Dahl, 2015, p. 44). ACT creates a distinction between values, overarching concepts that direct behavior, and goals, specific objectives that require certain behaviors and can be successfully fulfilled or completed; goals can be considered an intermediary between one’s values and committed actions (Fitzpatrick et al., 2016). Within the ACT therapeutic environment, individuals engage in exercises designed to clarify the common qualities or characteristics of behaviors and events that are meaningful or important to them (i.e., their values), first identifying behaviors that are consistent with these values and then engaging in those behaviors to contact reinforcers in their natural environment (Dahl, 2015). While this conceptualization of values and committed action is particularly useful for the implementation of therapeutic exercises, a behavior analytic interpretation of these concepts provides greater utility in the design and evaluation of these techniques.
A behavior analytic interpretation from the perspective of relational frame theory (RFT), the theoretical foundation of ACT, provides an explanation that incorporates the verbal processes involved in values-based behavior. As stated, part of the therapeutic goal of ACT is to decrease an individual’s ineffective rule-governed behavior and increase adaptive, flexible responding. RFT denotes three possible forms of rule-governed behavior: pliance, which is behavior under the control of socially mediated reinforcement based on the coordination of a stated rule and an individual’s behavior (for example, obeying a parent’s rules); tracking, which is behavior under the control of the coordination between a rule and its function in the environment (for example, following the rule “wear a coat in the winter” because the environmental contingencies of otherwise being cold support it); and augmenting, which is behavior that is the result of verbal behavior that changes the degree to which events act as reinforcers or punishers (for example, stopping to purchase coffee on the way to work after hearing an ad that says, “There’s no better way to start your day than with a hot, comforting cup of joe,” even though the coffee was equally available before and after the ad) (Hayes, Barnes-Holmes, & Roche, 2001). The values and committed action components of ACT are designed to establish effective verbal regulation of behavior, increasing tracking and decreasing pliance. Values can be conceptualized as motivative augmental rules, which increase the strength of the effect of previously established reinforcers (Plumb, Stewart, Dahl, & Lundgren, 2009). For example, “being healthy” is identified as an individual’s value because she identifies a number of reinforcing behaviors and events that relate to health, such as exercise and eating well.
For this individual, eating a salad for lunch may result in reinforcement from social praise and physiological effects such as an increase in energy after a meal. Labeling her lunchtime choice as “living my value of being healthy” temporarily increases the reinforcing effects of selecting the salad, resulting in more overall reinforcement and a higher probability of choosing to eat a healthy lunch again in the future. Goals and associated committed actions are hierarchically related to values, meaning that the individual responds categorically to networks of values, goals, and actions that result in similar reinforcing effects (Plumb et al., 2009). For example, the same person who states the value “being healthy” may create goals within that category, such as “exercising regularly” and “eating healthy meals and snacks,” and identify specific behaviors for achieving these goals, such as “attending a spin class on Tuesdays” and “preparing salads for lunch three times per week.” These values, goals, and actions are related in a hierarchy, and the reinforcing functions of the actions are transferred to the goals and strengthened by coherence with the stated values.
Two studies exemplify the potential role of values and committed action interventions for students in higher education settings. An investigation by Chase, Houmanfar, Hayes, Ward, Plumb Vilardaga, and Follette (2013) evaluated the differing effects of a goal setting intervention including and excluding ACT-based values training, compared to a control group, on the academic performance of college students, and found that goal setting alone and the control condition produced similar outcomes, whereas the combination of values training and goal setting significantly improved performance over the course of two semesters. Another study by Gagnon, Dionne, and Pychyl (2016) examined the relationship among psychological flexibility measures, committed action, and procrastination reported by university students, and concluded that the degree of committed action was the most significant predictor of student procrastination, suggesting that committed action interventions may be particularly useful in this area. Additional research is needed to determine the efficacy of values clarification and committed action interventions on the academic achievement and psychological well-being of graduate students.
The present study sought to examine the effectiveness of a 6-week values clarification and committed action training program to increase academic performance and psychological flexibility of graduate students in a behavior analysis and therapy program. The study compared measures of academic performance, psychological flexibility, values-driven behavior, and stress for two groups, an intervention group that completed values and committed action activities and a control group that completed a set of activities designed to teach study skills.
Method
Participants and Setting
Thirty-four graduate students (30 female, 4 male) between the ages of 22 and 40 (M = 27.65, SD = 5.48) who were enrolled in a Behavior Analysis and Therapy program participated in the study. Fifteen participants were enrolled in an on-campus master’s degree program. Nineteen participants were enrolled in a distance education master’s degree program or in an online Behavior Analysis Certification Board (BACB) approved course sequence to become eligible for the Board Certified Behavior Analyst (BCBA) certification exam. All participants were enrolled in a basic behavior analysis course in which both on- and off-campus students viewed the same lectures and completed the same assignments and assessments; the course was taught by the same professor and integrated via group assignments. Students in the course who had an average above 90% on quizzes at the beginning of the study were not included as participants but were allowed to complete the activities to receive the extra credit points available to all participants. The on-campus and off-campus sections each had a different teaching assistant throughout the course of the semester. The professor and teaching assistants were researchers on this study; however, the professor was unaware of which students participated, and each teaching assistant was blind to which students in his or her own section were involved. The teaching assistant for the on-campus section was aware only of the off-campus students who were participating and was responsible for all communication and data collection with the off-campus group; likewise, the teaching assistant for the off-campus section was aware only of the on-campus participants and managed all communication and data collection for the on-campus group. On-campus participants completed assessments in a classroom at a Midwestern university, and off-campus participants completed assessments on an online learning platform.
All participants completed intervention components on their personal computers at preferred locations.
Dependent Measures
Academic Performance Measure
Participants’ course performance was utilized in two ways in this study. Prior to group randomization, each participant’s four most recent quiz scores, completed before the onset of the current study, were used to obtain a mean quiz score, which was used to match participants in terms of class grades. Percent scores obtained on the participants’ midterm and final exams were used as the pre- and post-measures of academic performance. Both the midterm and final exams included 50 multiple-choice questions and 5 short-essay written-response questions. Each multiple-choice question was worth one point and each written-response question was worth ten points. Exams were scored by the professor and teaching assistants for the course; the on-campus and off-campus sections were each scored by a different teaching assistant.
Self-Report Measures
Participants completed three self-report measures as indices of psychological flexibility, stress, and degree of valued living twice over the course of the study. The Acceptance and Action Questionnaire-II (AAQ-II; Bond et al., 2011), a 7-item Likert scale questionnaire, provides a measure of participants’ psychological flexibility, with higher scores denoting greater inflexibility. The Perceived Stress Scale (PSS; Cohen, Kamarck, & Mermelstein, 1983), a 10-item Likert scale questionnaire, provides a measure of the degree to which individuals report the perception of stress in their lives, with higher scores denoting greater levels of stress. The Valued Living Questionnaire (VLQ; Wilson & Groom, 2002), a two-part questionnaire with ten Likert scale questions in each part, provides a measure of an individual’s degree of valued living in various life domains (e.g., family and education/training) in terms of the importance of values and the consistency of values-related behavior.
Social Validity Measure
Participants completed a 6-item social validity questionnaire, developed by the researchers for the current study, at the conclusion of the study in order to determine their satisfaction with the weekly intervention they received. Four items required the participants to rate the following statements on a 5-point Likert scale from “Strongly Disagree” (1) to “Strongly Agree” (5): (1) The weekly activities were helpful in encouraging better study habits over the course of the semester; (2) The weekly activities reduced my stress over the course of the semester; (3) The weekly activities helped me improve my academic performance this semester; and (4) The weekly activities helped me improve my performance in areas of my life other than academic performance this semester. Ratings of these four items were summed to create a “Social Validity” total score. Two items required the participants to respond to open-ended questions: (5) What are some things that were positive about completing the weekly activities? and (6) What are some things that were negative about completing the weekly activities? The questionnaires were created as a form in a text document, which was e-mailed to participants following their completion of the final week of intervention.
Data Analysis
Academic Performance Measure
A change score on the exams was calculated for each participant by subtracting the pre-test score from the post-test score. Then, an independent samples t test was conducted on participants’ change scores. Finally, a Cohen’s d effect size analysis was completed comparing the two groups.
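The change score, t test, and effect size computations described above can be sketched as follows. This is a minimal illustration using only the Python standard library; the data values are hypothetical, and the pooled-variance formulas are the standard textbook ones, not taken from the authors' analysis scripts.

```python
import math

def change_scores(pre, post):
    """Post-test minus pre-test for each participant."""
    return [po - pr for pr, po in zip(pre, post)]

def _pooled_sd(a, b):
    """Pooled sample standard deviation of two independent groups."""
    na, nb = len(a), len(b)
    ma, mb = sum(a) / na, sum(b) / nb
    va = sum((x - ma) ** 2 for x in a) / (na - 1)
    vb = sum((x - mb) ** 2 for x in b) / (nb - 1)
    return math.sqrt(((na - 1) * va + (nb - 1) * vb) / (na + nb - 2))

def pooled_t(a, b):
    """Independent samples t statistic with pooled variance."""
    sp = _pooled_sd(a, b)
    return (sum(a) / len(a) - sum(b) / len(b)) / (sp * math.sqrt(1 / len(a) + 1 / len(b)))

def cohens_d(a, b):
    """Standardized mean difference using the pooled standard deviation."""
    return (sum(a) / len(a) - sum(b) / len(b)) / _pooled_sd(a, b)

# Hypothetical exam scores for two small groups (not the study's data)
treatment = change_scores(pre=[70, 75, 80, 85], post=[82, 84, 90, 95])
control = change_scores(pre=[72, 74, 81, 86], post=[74, 75, 83, 88])
t_stat = pooled_t(treatment, control)
effect_size = cohens_d(treatment, control)
```

The p value would then be obtained from the t distribution with n1 + n2 − 2 degrees of freedom (32 in the present study), for example via `scipy.stats` in a full analysis.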
Self-Report Measures
For each of the self-report measures (AAQ-II, VLQ, and PSS), change scores were calculated by subtracting the pre-test score from the post-test score for each participant. The VLQ was divided into its two component parts, the Importance and Consistency ratings. For each component, the mean rating across the ten items was calculated, and the rating for the educational domain was subtracted from it to determine the relative effect on reports of educational values for each participant. Once change scores were calculated, an independent samples t test was conducted for each self-report measure. Finally, a Cohen’s d effect size analysis was completed comparing the two groups on each self-report measure.
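The VLQ domain computation can be sketched as below. The function names, the illustrative ratings, and the sign convention (education rating minus the ten-domain mean, so that positive change scores indicate education gaining in rated importance relative to the other domains) are assumptions made for illustration, not details taken from the authors' analysis.

```python
def relative_domain_score(ratings, domain_index):
    """One VLQ domain's rating relative to the mean across all ten domains.

    Sign convention (an assumption here): positive values mean the domain
    is rated above the participant's own ten-domain mean.
    """
    mean_rating = sum(ratings) / len(ratings)
    return ratings[domain_index] - mean_rating

def domain_change_score(pre_ratings, post_ratings, domain_index):
    """Pre-to-post change in a domain's relative rating for one participant."""
    return (relative_domain_score(post_ratings, domain_index)
            - relative_domain_score(pre_ratings, domain_index))

# Hypothetical importance ratings (1-10 scale) for the ten VLQ domains,
# with education/training as the last domain.
pre = [6, 5, 7, 4, 6, 5, 6, 5, 4, 6]
post = [6, 5, 7, 4, 6, 5, 6, 5, 4, 9]
education_change = domain_change_score(pre, post, domain_index=9)
```

Computing the domain score relative to each participant's own mean controls for participants who simply rate every domain higher or lower overall.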
Social Validity Measure
Participants’ responses to the first four questions on the social validity measure were summed to provide a total score for each participant. Then, an independent samples t test was conducted on the two groups’ total scores. Finally, a Cohen’s d effect size analysis was completed comparing the two groups.
Experimental Design
A pretest-posttest between-groups experimental design using matched randomization with one intervention group and one active treatment control group was used. All participants were asked to review and provide consent for the experiment prior to participation. Participants were assigned to one of two groups (the Values intervention group or the Study Tips control group). First, participants were separated by class registration into the off-campus and on-campus sections of the course. Then, participants within each section were matched in pairs using the following method. The mean of each participant’s scores on the four class quizzes completed before the onset of the study was calculated. Based on these mean scores, participants within each section were placed in pairs, and each member of a pair was assigned to either the intervention or control group using a random group generator found online. The study included 17 participants assigned to the Values intervention group and 17 assigned to the Study Tips control group. The age range for the control group was 22–38 years (M = 27.65, SD = 5.41), and the age range for the intervention group was 22–40 years (M = 27.65, SD = 5.72). The pre-test mean quiz score was 83.01% (SD = 11.75) for the control group and 82.09% (SD = 14.48) for the intervention group.
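The matching-and-randomization step can be sketched as follows. The participant identifiers, scores, and the seeded random number generator (standing in for the online group generator the authors used) are illustrative assumptions, not the study's actual procedure or data.

```python
import random

def matched_random_assignment(participants, seed=None):
    """Pair participants by mean prior quiz score, then randomly assign
    one member of each pair to the Values group and the other to the
    Study Tips control group.

    `participants` is a list of (identifier, mean_quiz_score) tuples.
    """
    rng = random.Random(seed)
    # Rank by score so adjacent participants have the closest scores.
    ranked = sorted(participants, key=lambda p: p[1], reverse=True)
    values_group, control_group = [], []
    # Walk down the ranked list two at a time; each adjacent pair is matched.
    # (With an odd count, the lowest-scoring participant is left unassigned.)
    for i in range(0, len(ranked) - 1, 2):
        pair = list(ranked[i:i + 2])
        rng.shuffle(pair)  # random coin flip within the matched pair
        values_group.append(pair[0])
        control_group.append(pair[1])
    return values_group, control_group
```

Matching within course section, as the authors describe, would simply mean calling this function once per section and pooling the resulting groups.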
Procedure
Participants in both groups completed the pre-test self-report measures prior to the first week of intervention implementation, and pre-test academic performance scores were obtained based on the participants’ performance on the course midterm exam, which was completed the week prior to the pre-test self-report measures.
Control Group
Participants in the control group completed a series of activities focused on developing study skills. Each week, the participants completed one worksheet, which was e-mailed to them on the first day of the week, completed within a form on a text document, and e-mailed back to the researchers before the first day of the following week. Each worksheet included one study tip, information to read, a reflection question, and a question regarding the application of the skill; an example of one worksheet is provided in Fig. 1. All worksheets followed a similar format. Study skills, which have an empirical basis for improving the academic performance of university students (e.g., Lipsky & Ender, 1990), were retrieved from a website that provides information to support graduate students’ academic performance (www.testpreview.com). This website was utilized because it was created for and is accessible to graduate students. The study tip topics included in the study, in order of intervention weeks, were as follows: (1) taking good lecture notes, (2) practicing active listening, (3) taking useful notes when reading, (4) improving your concentration, (5) making the most of your notes, and (6) improving your memory skills. Participants in this group did not have access to the values and committed action intervention materials (Table 1).
Fig. 1. Example study tips intervention worksheet
Table 1.
Participant demographic information
| Group | Sex | Age (years) | Pre-study quiz score (%) |
|---|---|---|---|
| Control | Female | 22 | 86 |
| Control | Female | 23 | 81 |
| Control | Male | 37 | 58 |
| Control | Female | 26 | 57 |
| Control | Male | 22 | 68 |
| Control | Male | 38 | 78 |
| Control | Female | 22 | 69 |
| Control | Female | 29 | 76 |
| Control | Female | 23 | 84 |
| Control | Female | 25 | 90 |
| Control | Female | 24 | 82 |
| Control | Female | 30 | 82 |
| Control | Female | 26 | 83 |
| Control | Female | 30 | 81 |
| Control | Female | 26 | 87 |
| Control | Female | 30 | 88 |
| Control | Female | 37 | 88 |
| Treatment | Female | 24 | 90 |
| Treatment | Female | 29 | 87 |
| Treatment | Male | 32 | 87 |
| Treatment | Female | 22 | 36 |
| Treatment | Female | 22 | 58 |
| Treatment | Female | 23 | 61 |
| Treatment | Female | 23 | 78 |
| Treatment | Female | 22 | 90 |
| Treatment | Female | 25 | 87 |
| Treatment | Female | 30 | 79 |
| Treatment | Female | 36 | 87 |
| Treatment | Female | 23 | 90 |
| Treatment | Female | 31 | 71 |
| Treatment | Female | 36 | 84 |
| Treatment | Female | 40 | 82 |
| Treatment | Female | 28 | 88 |
| Treatment | Female | 24 | 86 |
Intervention Group
The Values intervention group completed a series of activities focused on values clarification and commitment to those values. Each week, the participants completed one worksheet, which was e-mailed to them on the first day of the week, completed, and returned in the same manner as in the control group. Each worksheet included information to read related to values and/or committed action in one or more areas of the participants’ lives, including their education, and an activity to complete related to the topic of the worksheet; an example of one worksheet is provided in Fig. 2. All worksheets followed a similar format; some were completed and never revisited, while others required that participants refer back to activities completed in previous weeks. A description of the content presented in each weekly worksheet follows, and a summary of the activities presented each week is provided in Table 2. Week 1 introduced the concept of values using the “Bull’s Eye” activity created by Lundgren, Luoma, Dahl, and Strosahl (2012). Week 2 introduced the concept of committed action, barriers to progress, and goal setting. Week 3 encouraged participants to focus specifically on their values related to their graduate education. Week 4 discussed the challenges of remaining committed to advancing in educational pursuits and prompted participants to identify specific committed actions. Week 5 discussed the participants’ sense of self in relation to their values, drawing distinctions between the content of their experiences and the perspective of themselves as the context in which those experiences occur, as well as between descriptions of their behaviors and evaluations of those behaviors. Week 6 served as a summary activity, reviewing the plans created in previous weeks, establishing the importance of remaining committed to valued behavior, and prompting the participants to make a plan for valued behavior following the conclusion of the study.
Participants in this group were not provided access to the study tips intervention materials.
Fig. 2. Example values intervention worksheet
Table 2.
Weekly values clarification and committed action activities
| Week | Topic | Activity |
|---|---|---|
| 1 | Introduction to Values | Complete values “Bull’s Eye” exercise, identify work/education, leisure, personal growth/health, and relationships values, and rate each value on a scale of 1–7. |
| 2 | Introduction to Committed Action | Select 2 values, and create a brief action plan, including a long-term goal for each value. Describe 2 short-term goals for each value to complete in the following week. List barriers to these goals and committed actions that can be made to achieve the short-term goals. |
| 3 | Education-Related Values | First, rate performance on the committed action plan from the previous week on a scale from 1 to 5. Then, identify 5 events from their lifetime that have “led” to graduate school. Using those events as a guide, identify 3 education-related values. |
| 4 | Education-Related Committed Actions | First, identify one or more education-related values from the previous week. Then, select 6 committed actions related to school work that are measurable and can be completed within the next month to move toward that value. |
| 5 | Education-Related Self | Write down a definition of self in terms of an education-related value, and list 3 evaluations based on that definition. Describe how these evaluations limit values-related behavior. Then, consider “self” if these evaluations are not necessarily true. Finally, state committed actions from previous activity to focus on for next week. |
| 6 | Conclusion | Re-visit the plan from Week 4. Identify 3–5 successful behaviors in moving toward the specified value, as well as 3–5 behaviors to improve. Then, select which remaining committed actions can continue as is, and which need to be revised. Describe how it feels to live a values-driven life in graduate school. |
Following completion of the final week of intervention, participants in both groups completed the post-test self-report measures and the social validity questionnaire, and post-test academic performance scores were obtained based on the participants’ performance on the course final exam, which was completed 1 week following the post-test self-report measures.
Results
We sought to evaluate the effectiveness of a 6-week values and committed action training program on academic performance, psychological flexibility, educational values, and perceived stress in graduate students in a behavior analysis degree program. We additionally evaluated differences on a social validity questionnaire delivered to both groups at the conclusion of the study. The results are summarized in Figs. 3 and 4. Figure 3 shows a scatter-dot plot of participant change scores from the midterm examination to the final examination across the two groups. Visual analysis of these results suggests that there was a greater increase in grades achieved by the treatment group (M = 11.9, SD = 16.0) relative to the control group (M = 3.2, SD = 10.0), and that improvements in grades were relatively consistent across the 17 participants who received the intervention. A Cohen’s d effect size analysis was conducted to determine the degree of the observed difference; using Cohen’s (1988) conventions for interpreting effect size, the results supported a medium-to-large effect of the treatment relative to the control group (Cohen’s d = 0.65). In interpreting the mean change scores, there may be an element of social significance in that the treatment group showed an average increase of just over 10 percentage points, which in many grading metrics is a full letter grade. Finally, an independent samples t test was conducted to determine whether group differences were statistically significant. The results suggested that there was a significant difference in changes in academic performance across the two groups (t(32) = 1.902, p = 0.0331).
Fig. 3. Unpaired independent samples t test for exam score change scores (a), PSS change scores (b), and social validity (c). The individual data points represent individual participant scores, the columns represent mean scores, and the thin bars represent error. Circular data points indicate control participants and square data points indicate treatment participants
Fig. 4.

Unpaired independent samples t test for AAQ-II (a), VLQ-Education importance (b), and VLQ-Education consistency (c) change scores. The individual data points represent individual participant scores, the columns represent mean scores, and the thin bars represent error. Circular data points indicate control participants and square data points indicate treatment participants
Figure 4 contains three graphs showing changes in the AAQ-II (left), VLQ—Education importance (middle), and VLQ—Education consistency (right). The AAQ-II provides a composite measure of psychological flexibility, where negative change scores suggest an improvement in flexibility (i.e., a decrease in psychological inflexibility). Visual analysis of the results shows greater AAQ-II decreases for the treatment group (M = −3.5, SD = 6.7) compared to a slight mean increase in the control group (M = 0.6, SD = 6.0). Only 5 participants in the treatment group failed to show a decrease in AAQ-II scores, compared to 11 in the control group. In the control group, more than half of the participants showed increases in psychological inflexibility, further suggesting that the treatment may have served as a protective factor. Results of a Cohen’s d effect size analysis again supported a medium-to-large effect of the treatment (Cohen’s d = 0.65), and the difference between groups on the AAQ-II was statistically significant (t(32) = 1.895, p = 0.0336). The VLQ provides a measure of the importance of, and consistent action toward, common domains of valued living. For educational values, relative to other valued domains, we see a greater mean increase in both importance and consistency for the treatment group (importance: M = 0.6, SD = 0.9; consistency: M = 0.6, SD = 2.2) compared to the control group (importance: M = −0.3, SD = 1.6; consistency: M = −0.04, SD = 2.8). Only three participants in the treatment group showed a decrease in educational importance and five showed a decrease in educational consistency; in the control group, ten showed a decrease in educational importance and eight showed a decrease in educational consistency. Interpreted at the individual level, the results suggest that the treatment may have had a greater influence on the participants’ reports of the importance of educational values.
Cohen’s d analyses suggested a medium-to-large effect on importance (Cohen’s d = 0.69) and a small effect on consistency (Cohen’s d = 0.24). Consistent with the above interpretations, we found a statistically significant difference between the groups in the importance of education values (t(32) = 2.013, p = 0.0263), but failed to find a significant difference in consistency (t(32) = 0.7204, p = 0.2382). The nonsignificant result for consistency of education values, despite a change in the expected direction, may be a result of the limited sample size, as significance reflects the likelihood of the results occurring by chance, not the amount of change that occurred.
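The sample-size point can be made concrete with a back-of-envelope power calculation. This is an illustrative sketch, not part of the original analysis; it uses the standard normal approximation for a one-tailed two-sample test at alpha = .05 with 80% power (z quantiles of 1.645 and 0.842):

```python
def n_per_group(d, z_alpha=1.645, z_beta=0.842):
    # Approximate participants needed per group to detect effect size d
    # (normal approximation to the two-sample t test)
    return 2 * ((z_alpha + z_beta) / d) ** 2

n_importance = n_per_group(0.69)   # roughly 26 per group
n_consistency = n_per_group(0.24)  # roughly 215 per group
```

Under these assumptions, the importance effect (d = 0.69) is detectable with group sizes in the mid-20s, whereas the consistency effect (d = 0.24) would require roughly ten times as many participants per group, consistent with the interpretation that the nonsignificant consistency result reflects limited power rather than no change.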
Figure 3b shows a scatter-dot plot of participant change scores on the PSS, where negative values indicate a decrease in perceived stress. Visual analysis suggests that, again, the treatment group demonstrated a greater average decrease in stress (M = −3.1, SD = 6.7) relative to the control group (M = −0.03, SD = 2.8). Individual differences between the two groups were minimal, with ten participants in the treatment group showing overall decreases in stress compared to eight in the control group; however, a medium effect size was observed (Cohen’s d = 0.52). The results failed to suggest that the differences were statistically significant (t(32) = 1.521, p = 0.1382), so the effect size should be interpreted with caution, as the group difference could alternatively be explained by chance alone. Figure 3c summarizes participant scores on the social validity questionnaire, presented as the sum of participant responses to the four validity questions, producing a maximum possible score of 20. Results suggest higher mean social validity scores for participants in the treatment group (M = 13.9, SD = 3.1) compared to the control group (M = 11.2, SD = 3.3). A Cohen’s d analysis suggested that the treatment had a large effect (Cohen’s d = 0.84), and the difference was statistically significant (t(32) = 2.449, p = 0.01). Visual analysis shows that a substantial portion of the control group’s scores fell more than 1 standard deviation below the treatment group mean. Analyzing the results at the individual level, we see the largest differences in participant responses to questions 3 and 4, suggesting that participants in the treatment group felt more strongly that the treatment helped them improve their academic performance (question 3) and their performance in life outside of the academic setting (question 4).
Discussion
The present study sought to examine the effectiveness of a 6-week values clarification and committed action training program to increase the academic performance and psychological flexibility of graduate students in a behavior analysis and therapy program, and the results indicated significant improvements in student test performance and psychological flexibility. Taken together, the results of the present study extend prior research on values clarification and committed action approaches to behavior therapy (Plumb et al., 2009; Vowles & McCracken, 2008) to the context of graduate course instruction, an area in which there has been limited prior research. Results suggested statistically significant improvements in student academic achievement, as measured by test performance, and in psychological flexibility, as measured by the AAQ-II. Individual results further suggest that these changes were consistent across participants in the treatment group. Improving academic performance is the primary goal of academic instruction, and several studies have demonstrated ways to deliver course content that improve course performance, including two notable methods developed by behavior analysts: interteaching (Boyce & Hineline, 2002; Saville, Zinn, Neef, Norman, & Ferreri, 2006) and equivalence-based instruction (Fienup & Critchfield, 2011; Walker, Rehfeldt, & Ninness, 2010). Our results differ considerably from this line of research in that no modifications were made to the delivery of course content; rather, the intervention occurred outside of the classroom setting and without reference to course material. Instead, the intervention targeted verbal behavior in the form of values clarification and goal setting in committed action, and students who received it performed significantly better than those in the study tips active treatment control group.
Corresponding changes in psychological flexibility are consistent with prior research (e.g., Bohlmeijer, Fledderus, Rokx, & Pieterse, 2011), and a parallel can be drawn between the behaviors that comprise flexibility and those that support strong academic performance. For example, if a student can accept the increased workload and the decreased time available for social activities, and if the same student can engage in committed actions, such as studying and attending classes, then the student is more likely to achieve successful academic outcomes.
Results on the VLQ, a measure of valuing, and the PSS, a measure of stress, were less consistent than the observed changes in academic performance and psychological flexibility. Although participants in the treatment group had greater increases in the self-identified importance of education, the same increases were not observed in the consistency between educational behavior and educational values. In interpreting this outcome, recall that students in the treatment group improved more than students in the control group from the midterm examination to the final examination. An assumption could therefore be made that the treatment group engaged in more educational behavior (e.g., studying) than the control group, so lower ratings of consistency with educational values are not likely a reflection of the amount of action committed to that value. Instead, if the importance of the value increased, then participants may have observed a greater inconsistency between their current behavior, albeit improved since the onset of the study, and the optimal state of their behavior given the heightened importance of education. This interpretation is speculative but may be useful for future research in this area. The results also failed to indicate a significant difference between groups in perceived stress. This finding may have been the result of our relatively small sample size, given that mean levels of stress decreased for the treatment group but remained essentially unchanged for the control group. Alternatively, this finding could have been an artifact of conducting the post-test following exams at the end of the semester, when decreases in student stress may be more likely given a significant decrease in workload.
Finally, we observed significant differences in participant self-reports on the social validity questionnaire, which sought to quantify the participants’ satisfaction with the weekly interventions, suggesting that participants in the treatment group felt better prepared for academic success as well as success outside of the academic setting. Evaluating social validity is essential to ensuring that behavior analytic interventions are well received by consumers of our science (Carr, Austin, Britton, Kellum, & Bailey, 1999; Wolf, 1978), in this case the students who stand to benefit from this technology as well as their instructors and the universities they attend. Although an intervention may be effective, if it is not acceptable to those for whom it is designed, it is less likely to be used outside of the research setting. Additionally, although contingency management in the form of assigning a high-value graded item may be effective in increasing the likelihood of students engaging in academic behavior, such as cramming before a big test, this approach could also increase stress and therefore fail to achieve the social validity obtained in the current study. Adding an intervention similar to the values intervention in the current study to an existing course is also relatively unobtrusive, as changes do not need to be made to the syllabus or instruction format. Effective interventions that require little by way of time and financial resources may be preferable to the students as well as the educators who utilize them in their instruction.
The results of the study have several implications that extend research in this area. Graduate students report high levels of stress (Hyun et al., 2006) that can impact academic performance, and as many as 10% of graduate students drop out before completing their degrees (Michalski et al., 2016). By embedding values clarification and committed action into graduate course instruction, instructors may be able to improve the psychological flexibility of their students, potentially reducing the impact of stress on academic behavior, as more flexible behavior may occasion more reinforcement without unnecessary verbal regulation (Hayes et al., 2004). Our results did not show a significant difference in stress reduction across the two groups, but despite similar levels of stress, the treatment group showed greater psychological flexibility and achieved better academic outcomes. This finding is consistent with prior research showing that ACT-based approaches may produce minimal changes in mediating symptoms (e.g., stress, anxiety) but large and maintained changes in the terminal valued response (e.g., attending class, studying) (Hayes, Luoma, Bond, Masuda, & Lillis, 2006; Powers, Vörding, & Emmelkamp, 2009). Introducing ACT-based procedures into existing coursework may also provide an avenue for preventative mental health treatment. Stigmatization of seeking mental health treatment has been implicated in the disproportionate number of students who experience mental health problems, such as anxiety or depression, but do not seek treatment (Eisenberg, Downs, Golberstein, & Zivin, 2009). By preventatively embedding technologies that promote psychological flexibility, there may be a reduction in the number of students who require more individualized therapy.
Corresponding to this implication, the activities may also provide an avenue for instructors to identify students who are at risk of experiencing mental illness or of dropping out of graduate studies altogether, allowing instructors to intervene and recommend more substantive therapeutic support. Finally, research in behavior analysis on RFT and ACT has surged in the past two decades, and the results reported here suggest that there may be utility in incorporating these advances to improve the quality of training that we provide future practitioners and researchers in our field. Low Behavior Analyst Certification Board (BACB) pass rates may result from poor academic performance; therefore, technologies that improve academic performance may increase the number of practitioners who both complete the coursework required to sit for the exam and grasp the concepts sufficiently to pass it. Helping professions additionally have notoriously high rates of burnout (Grant & Kinman, 2014), and improving psychological flexibility could aid in this regard as well.
There are several limitations of the current study that should be addressed in future research. First, we used a small sample, reducing the external validity of the results. It should be noted, however, that obtaining statistically significant findings with smaller samples requires greater consistency in group differences, increasing the internal validity of the results reported here. This limitation could account for the lack of statistical significance in evaluating changes in stress and in consistency of educational valued action, where a larger sample with similar differences in responding may have yielded statistical significance. Second, all participants were enrolled in the same course. Because the course was a behavior analysis course, and ACT was developed from core behavior analytic assumptions, correspondence between the treatment and the course material could have increased the probability that successful outcomes were achieved. Although this limitation is trivial if the intervention is adopted by a behavior analysis course instructor, the degree to which the same results would be achieved in a non-behavior-analytic course is currently unknown. Third, there may have been a contamination effect, given that students in both groups were randomly sorted within the same classes. No measures were taken to ensure that participants from the treatment group did not provide materials to the control group, or vice versa. Future studies may utilize technology in a way that does not allow participants to download or keep intervention materials, ensuring that they cannot share them with one another. Anecdotally, a student from the treatment group expressed concern that students from the control group were asking for the activities that she had been given; the student reported that she did not provide the materials. Fourth, interrater agreement was not evaluated as a measure of reliability of the midterm or final exam grades.
The graders were blind to the group assignment of the participants and rubrics were used to grade short and long answer questions, but evaluating agreement would increase confidence in the obtained results. Fifth, an analysis comparing the performance of on- and off-campus students was not completed due to the limited sample sizes; future research may examine differences in performance between these two groups. Finally, follow-up measures were not conducted to determine whether the treatment led to later increases in academic performance. Prior ACT research has consistently demonstrated maintenance of treatment results, which has been elusive in other therapeutic approaches (e.g., cognitive behavior therapy; Hayes et al., 2006); therefore, future research that includes such follow-up measures would be useful in determining whether the same effect holds for this intervention.
Beyond addressing the above limitations, future research may extend the results reported here by evaluating the effect of treatment dosage on improvements in psychological flexibility and academic performance, utilizing interventions that vary in the time required per activity as well as in the duration of the intervention over time. Whereas “more is better” is a typical outcome in therapy research, this may not be true in an academic context, as a substantial amount of time spent completing activities could decrease the time allocated to the committed actions, such as studying, that are required for academic success. Future research may also evaluate the effectiveness of values clarification and committed action delivered in different ways. Whereas home-based activities allowed for tight experimental control in that each student received the same activity, other modes of delivery, such as conducting group-based values clarification for 15 min at the beginning of each class period, could achieve different outcomes.
In summary, the values clarification and committed action training in the present study was effective at improving academic performance and psychological flexibility relative to an active treatment control group. In addition, participants in the treatment group reported greater importance of educational values and supported the social validity of this ACT-based technology. The results extend prior research to application with graduate students pursuing a degree in behavior analysis, with implications for how instructors prepare future practitioners and scientists in our field.
Compliance with Ethical Standards
Ethical Approval
All procedures performed in this study involving human participants were in accordance with the ethical standards of the institutional and/or national research committee and with the 1964 Helsinki Declaration and its later amendments or comparable ethical standards.
Informed Consent
Informed consent was obtained for participation in this research.
References
- Bohlmeijer ET, Fledderus M, Rokx TAJJ, Pieterse ME. Efficacy of an early intervention based on acceptance and commitment therapy for adults with depressive symptomatology: evaluation in a randomized controlled trial. Behaviour Research and Therapy. 2011;49:62–67. doi: 10.1016/j.brat.2010.10.003.
- Bond FW, Hayes SC, Baer RA, Carpenter KM, Guenole N, Orcutt HK, Waltz T, Zettle RD. Preliminary psychometric properties of the Acceptance and Action Questionnaire–II: a revised measure of psychological inflexibility and experiential avoidance. Behavior Therapy. 2011;42(4):676–688. doi: 10.1016/j.beth.2011.03.007.
- Boyce TE, Hineline PN. Interteaching: a strategy for enhancing the user-friendliness of behavioral arrangements in the college classroom. The Behavior Analyst. 2002;25:215–226. doi: 10.1007/BF03392059.
- Carr JE, Austin JL, Britton LN, Kellum KK, Bailey JS. An assessment of social validity trends in applied behavior analysis. Behavioral Interventions. 1999;14:223–231. doi: 10.1002/(SICI)1099-078X(199910/12)14:4<223::AID-BIN37>3.0.CO;2-Y.
- Chase JA, Houmanfar R, Hayes SC, Ward TA, Plumb Vilardaga J, Follette V. Values are not just goals: online ACT-based values training adds to goal setting in improving undergraduate college student performance. Journal of Contextual Behavioral Science. 2013;2:79–84. doi: 10.1016/j.jcbs.2013.08.002.
- Cohen J. Statistical power analysis for the behavioral sciences. 2nd ed. Hillsdale, NJ: Erlbaum; 1988.
- Cohen JS, Miller LJ. Interpersonal mindfulness training for well-being: a pilot study with psychology graduate students. Teachers College Record. 2009;111(12):2760–2774.
- Cohen S, Kamarck T, Mermelstein R. A global measure of perceived stress. Journal of Health and Social Behavior. 1983;24:386–396. doi: 10.2307/2136404.
- Dahl J. Valuing in ACT. Current Opinion in Psychology. 2015;2:43–46. doi: 10.1016/j.copsyc.2015.03.001.
- Danitz SB, Orsillo SM. The mindful way through the semester: an investigation of the effectiveness of an acceptance-based behavioral therapy program on psychological wellness in first-year students. Behavior Modification. 2014;38(4):549–566. doi: 10.1177/0145445513520218.
- Eisenberg D, Downs MF, Golberstein E, Zivin K. Stigma and help seeking for mental health among college students. Medical Care Research and Review. 2009;66(5):522–541. doi: 10.1177/1077558709335173.
- El-Ghoroury NH, Galper DI, Sawaqdah A, Bufka LF. Stress, coping, and barriers to wellness among psychology graduate students. Training and Education in Professional Psychology. 2012;6(2):122–134. doi: 10.1037/a0028768.
- Fienup DM, Critchfield TS. Transportability of equivalence-based programmed instruction: efficacy and efficiency in a college classroom. Journal of Applied Behavior Analysis. 2011;44:435–450. doi: 10.1901/jaba.2011.44-435.
- Fitzpatrick M, Henson A, Grumet R, Poolokasingham G, Foa C, Comeau T, Prendergast C. Challenge, focus, inspiration, and support: processes of values clarification and congruence. Journal of Contextual Behavioral Science. 2016;5:7–15. doi: 10.1016/j.jcbs.2016.02.001.
- Gagnon J, Dionne F, Pychyl TA. Committed action: an initial study on its association to procrastination in academic settings. Journal of Contextual Behavioral Science. 2016;5:97–102. doi: 10.1016/j.jcbs.2016.04.002.
- Goodman FR, Kashdan TB, Mallard TT, Schumann M. A brief mindfulness and yoga intervention with an entire NCAA Division I athletic team: an initial investigation. Psychology of Consciousness: Theory, Research, and Practice. 2014;1(4):339–356.
- Grant L, Kinman G. Emotional resilience in the helping professions and how it can be enhanced. Health and Social Care Education. 2014;3:23–34. doi: 10.11120/hsce.2014.00040.
- Griffin BW, Griffin MM. The effects of reciprocal peer tutoring on graduate students’ achievement, test anxiety, and academic self-efficacy. The Journal of Experimental Education. 1997;65(3):197–209. doi: 10.1080/00220973.1997.9943454.
- Hayes SC, Barnes-Holmes D, Roche B, editors. Relational frame theory: a post-Skinnerian account of human language and cognition. New York, NY: Springer Science & Business Media; 2001.
- Hayes SC, Strosahl KD, Bunting K, Twohig M, Wilson KG. What is acceptance and commitment therapy? In: Hayes SC, Strosahl KD, editors. A practical guide to acceptance and commitment therapy. New York: Springer-Verlag; 2004. pp. 3–29.
- Hayes SC, Luoma JB, Bond FW, Masuda A, Lillis J. Acceptance and commitment therapy: model, processes and outcomes. Behaviour Research and Therapy. 2006;44:1–25. doi: 10.1016/j.brat.2005.06.006.
- Hayes SC, Strosahl KD, Wilson KG. Acceptance and commitment therapy: the process and practice of mindful change. 2nd ed. New York, NY: The Guilford Press; 2012.
- Hyun JK, Quinn BC, Madon T, Lustig S. Graduate student mental health: needs assessment and utilization of counseling services. Journal of College Student Development. 2006;47(3):246–266. doi: 10.1353/csd.2006.0030.
- Lee CM, Neighbors C, Kilmer JR, Larimer ME. A brief, web-based personalized feedback selective intervention for college student marijuana use: a randomized clinical trial. Psychology of Addictive Behaviors. 2010;24(2):256–273. doi: 10.1037/a0018859.
- Lipsky S, Ender S. Impact of a study skills course on probationary students’ academic performance. Journal of the First-Year Experience & Students in Transition. 1990;2(1):7–16.
- Lundgren T, Luoma JB, Dahl J, Strosahl K, Melin L. The bull’s-eye values survey: a psychometric evaluation. Cognitive and Behavioral Practice. 2012;19:518–526. doi: 10.1016/j.cbpra.2012.01.004.
- Michalski DS, Cope C, Fowler GA. Summary report: student attrition. Prepared for the American Psychological Association, Education Directorate; 2016.
- Murphy JG, Duchnick JJ, Vuchinich RE, Davison JW, Karg RS, Olson AM, Smith AF, Coffey TT. Relative efficacy of a brief motivational intervention for college student drinkers. Psychology of Addictive Behaviors. 2001;15(4):373–379. doi: 10.1037/0893-164X.15.4.373.
- Myers Virtue S, Sweeney AC, Wesley KM, Fingerhut R. Self-care practices and perceived stress levels among psychology graduate students. Training and Education in Professional Psychology. 2012;6(1):55–66. doi: 10.1037/a0026534.
- Oswalt SB, Riddock CC. What to do about being overwhelmed: graduate students, stress and university services. College Student Affairs Journal. 2007;27(1):24–43.
- Perrin CJ, Miller N, Haberlin AT, Ivy JW, Meindl JN, Neef NA. Measuring and reducing college students’ procrastination. Journal of Applied Behavior Analysis. 2011;44(3):463–474. doi: 10.1901/jaba.2011.44-463.
- Plumb JC, Stewart I, Dahl J, Lundgren T. In search of meaning: values in modern clinical behavior analysis. The Behavior Analyst. 2009;32(1):85–103. doi: 10.1007/BF03392177.
- Powers MB, Vörding MBZVS, Emmelkamp PM. Acceptance and commitment therapy: a meta-analytic review. Psychotherapy and Psychosomatics. 2009;78:73–80. doi: 10.1159/000190790.
- Sandoz EK, Kellum KK, Wilson KG. Feasibility and preliminary effectiveness of acceptance and commitment training for academic success of at-risk students from low income families. Journal of Contextual Behavioral Science. 2017;6:71–79. doi: 10.1016/j.jcbs.2017.01.001.
- Saville BK, Zinn TE, Neef NA, Norman RV, Ferreri SJ. A comparison of interteaching and lecture in the college classroom. Journal of Applied Behavior Analysis. 2006;39:49–61. doi: 10.1901/jaba.2006.42-05.
- U.S. Department of Education, National Center for Education Statistics, Integrated Postsecondary Education Data System (IPEDS). (2017). Spring 2001 through Spring 2016, Fall Enrollment component; and Enrollment in Degree-Granting Institutions Projection Model, 1980 through 2026. Retrieved from: https://nces.ed.gov/programs/coe/indicator_chb.asp
- Vowles KE, McCracken LM. Acceptance and values-based action in chronic pain: a study of treatment effectiveness and process. Journal of Consulting and Clinical Psychology. 2008;76:397–407. doi: 10.1037/0022-006X.76.3.397.
- Walker BD, Rehfeldt RA, Ninness C. Using the stimulus equivalence paradigm to teach course material in an undergraduate rehabilitation course. Journal of Applied Behavior Analysis. 2010;43:615–633. doi: 10.1901/jaba.2010.43-615.
- Walton GM, Cohen GL. A brief social-belonging intervention improves academic and health outcomes of minority students. Science. 2011;331(6023):1447–1451. doi: 10.1126/science.1198364.
- Wilson KG, Groom J. The valued living questionnaire. 2002.
- Wolf MM. Social validity: the case for subjective measurement or how applied behavior analysis is finding its heart. Journal of Applied Behavior Analysis. 1978;11:203–214. doi: 10.1901/jaba.1978.11-203.
