Abstract
We examined the effectiveness of reducing response effort and an e-mail prompt for increasing the preparedness of 17 therapists for a social skills group in a human services organization. We evaluated, via a paper survey, whether participants knew the correct lesson and sport and whether they felt prepared for the session. The Performance Diagnostic Checklist–Human Services indicated deficiencies in all four domains; the most significant barriers were a lack of prompts and the effort required to access materials. Results showed that reducing the response effort to access materials, combined with an e-mail prompt, increased employee preparedness compared to the no-e-mail condition.
Keywords: Antecedent, Employee preparedness, Human service setting, Prompt, Response effort
In human service organizations, staff are expected to run a variety of programs with clients, and those programs are updated as the clients gain new skills. Sometimes, the basic structure of programming remains consistent (e.g., social skills training) while elements of the program vary (e.g., type of group activity). Once staff are trained to mastery on the general program, they can often implement that program across a variety of activities and situations (Bolton & Mayer, 2008; Ducharme & Feldman, 1992). Even untrained staff can implement programs with high fidelity if the instructions are clear. For example, Graff and Karsten (2012) found that teachers who had no prior training on preference assessments were able to conduct them with 98%–100% fidelity when they were provided diagrams and step-by-step instructions without jargon. Thus, effective initial training and access to clear instructions may allow staff to prepare for subsequent sessions and perform with high treatment integrity, even without direct training on specific elements of the program. This is an important finding because most human service organizations require efficient, low-cost, and effective interventions to improve and maintain staff performance.
Despite the value of clear instructions for implementing programs, staff sometimes do not access them to prepare for sessions. Thus, an intervention may be required to encourage staff to access instructional materials prior to sessions. Recent research on staff performance has employed the Performance Diagnostic Checklist–Human Services (PDC-HS; Carr, Wilder, Majdalany, Mathisen, & Strain, 2013) to guide the selection of interventions for employees in human service organizations. The PDC-HS evaluates performance across four domains (training; task clarification and prompting; resources, materials, and processes; and consequences, effort, and competition) to identify deficiencies and suggest interventions. In many cases, a package intervention spanning more than one domain is selected to improve performance. A review by Wilder, Cymbal, and Villacorta (2020a) found that performance consequences, response effort, and competing tasks were the most common deficiencies and that task clarification, training, feedback, and material availability were the most common interventions selected.
Although response effort can be a significant barrier to performance, as indicated by the results of the PDC-HS (Wilder, Cymbal, & Villacorta, 2020a), a review by Wilder, Ertel, and Cymbal (2020b) identified only seven studies that evaluated the manipulation of response effort to change performance in organizational settings. For example, Casella et al. (2010) found that staff were more likely to wear their protective gloves during low- and medium-effort conditions compared to high-effort conditions. Similarly, Abellon and Wilder (2014) found that making protective equipment easier to access increased compliance with wearing protective eyewear. Van Houten, Hilton, Schulman, and Reagan (2011) encouraged seat belt compliance by increasing the force required to accelerate when the seat belt was not fastened. The four remaining studies demonstrated that recycling bin proximity (in relation to students and the trash bin) influenced recycling at universities (Fritz et al., 2017; Ludwig, Gray, & Rowell, 1998; Miller, Meindl, & Caradine, 2016; O’Connor, Lerman, Fritz, & Hodde, 2010). In six of the seven studies, increasing the accessibility of materials improved performance.
Response effort may be an effective intervention because it can make the task less aversive and/or increase the likelihood of contacting reinforcement (Friman & Poling, 1995; Wilder, Ertel, & Cymbal, 2020b). In fact, response effort deficiencies can be indicated in three of the four domains of the PDC-HS. Question 3 of the task clarification and prompting domain asks if job aids are readily available, and Question 5 asks whether “the environment is well-suited for task completion” (Carr et al., 2013, p. 28). In the resources, materials, and processes domain, Questions 2–4 ask if the materials are available, well designed, and organized, and Question 5 asks if other unfinished tasks are preventing completion. Finally, Question 4 in the performance consequences, effort, and competition domain asks if the task is effortful. Thus, researchers recognize that response effort is a major barrier to performance, and this suggests that more research demonstrating that reducing response effort improves performance is needed. Reducing response effort may be a particularly important intervention in human service organizations, where efficient, low-cost, and effective strategies to improve and maintain staff performance are needed.
The purpose of this study was to use the PDC-HS to guide the selection of the intervention to improve therapists’ preparedness for a social skills program in a human service organization. Based on the findings, we evaluated an intervention that reduced response effort to improve knowledge of the lesson topics and self-reported staff preparedness before social skills group behavioral therapy.
Method
Participants and Settings
Seventeen Registered Behavior Technicians (RBTs; n = 4 male, n = 12 female), ages 18 and over, who worked in a social skills program for individuals diagnosed with autism spectrum disorders in a human service organization participated. All RBTs had been trained on the social skills program. The social skills program consisted of two groups that met once per week for 1.5 hr. Groups 1 and 2 provided therapy to clients ages 6–10 years and 11–14 years, respectively; 9 therapists worked with Group 1, and 11 therapists worked with Group 2. Only the therapists from each social skills group participated; the clients from the social skills groups did not participate.
The social skills program comprised a didactic portion, described in the lesson plan, and an opportunity to apply the week’s target skills in the context of playing sports. The social skills curriculum assigned a different sport each week. The didactic portion of the therapy occurred in a classroom within the human service organization, and the sports portion of the therapy took place in each sport’s respective venue (e.g., the basketball lesson took place on the basketball court).
Dependent Variables and Measurement
Each therapist completed a four-item questionnaire during a mandatory meeting that occurred 15 min before the start of the session while the clients were not present. We used the four-item questionnaire to collect the primary dependent variable and secondary measures. Each therapist completed the questionnaire anonymously using a pen and paper.
Correct lesson and sport
The main dependent variable was the grouped percentage of correctly answered questions about the lesson topic (Question 1: What is the lesson topic?) and the assigned sports activity for the session (Question 2: What is the sport/activity?). Grouped percentages were obtained by dividing the sum of all correct answers for Questions 1 and 2 by the total number of correct plus incorrect answers for Questions 1 and 2 and multiplying by 100.
Preparedness self-report
Secondary measures were the grouped percentage of reported preparedness to conduct the session (Question 3: Do you feel comfortable following the lesson plan accurately without using the paper as a prompt? [yes/no]; and Question 4: Did you read the lesson plan before 4:15 today? [yes/no]). We obtained grouped percentages by dividing the sum of all “yes” responses by the sum of all “yes” and “no” responses and multiplying the result by 100.
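Both grouped percentages above follow the same computation. As a minimal illustrative sketch (the function name and response counts below are hypothetical, not data from the study), the calculation can be expressed as:

```python
def grouped_percentage(numerator_count, other_count):
    """Grouped percentage: target responses divided by total responses, times 100.

    For Questions 1-2, numerator_count = correct answers and
    other_count = incorrect answers; for Questions 3-4,
    numerator_count = "yes" responses and other_count = "no" responses.
    """
    total = numerator_count + other_count
    return 100.0 * numerator_count / total

# Hypothetical example: 15 correct and 5 incorrect answers
# pooled across therapists for a session.
print(grouped_percentage(15, 5))  # → 75.0
```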
Interrater Reliability
Two independent researchers coded all questionnaires for 100% of the sessions. Interrater reliability was calculated by dividing the number of agreements by the total number of agreements plus disagreements and multiplying by 100. Interrater reliability across both groups was 100%.
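The agreement calculation described above can be sketched as follows (a minimal illustration; the function name and the example codes are hypothetical, not taken from the study's data):

```python
def interrater_reliability(coder1, coder2):
    """Percentage agreement: item-by-item agreements divided by
    agreements plus disagreements, multiplied by 100."""
    agreements = sum(a == b for a, b in zip(coder1, coder2))
    return 100.0 * agreements / len(coder1)

# Hypothetical item codes from two independent coders
# for one therapist's four-item questionnaire:
c1 = ["correct", "correct", "yes", "no"]
c2 = ["correct", "correct", "yes", "no"]
print(interrater_reliability(c1, c2))  # → 100.0
```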
Pretreatment Assessment
We used the PDC-HS (Carr et al., 2013) to guide intervention selection just prior to the implementation of the intervention. We conducted the PDC-HS interviews with the program supervisor, a high-performing participant, a low-performing participant, and a participant who was new to the program. Currently, there is no established best practice on who should complete the PDC-HS; however, completing an assessment with a subset of performers instead of all performers is typical of assessments in organizational behavior management (e.g., Carr et al., 2013). We selected these individuals to obtain a range of responses.
Results of the PDC-HS (depicted in Fig. 1) identified opportunities for improvement in all four categories: training; task clarification and prompting; resources, materials, and processes; and performance consequences, effort, and competition. Following the administration of the PDC-HS, we asked interviewees which of the four categories they believed were the most significant barriers to reading the lesson plan and identifying the sport for the day. All responders stated that they thought the most significant barrier was the high response effort to locate the information, followed by lack of prompts.
Fig. 1.
Percentage of items for which at least one interviewed employee responded that the item was deficient
Experimental Procedures
We employed a multielement design alternating between baseline and intervention conditions for the therapists in each social skills group.
Baseline
Therapists were required to locate and read the lesson plan and identify the sport assigned for the session as they usually would. The steps to locate the lesson plan were to log in to the center’s password-protected shared folder (which was compliant with the Health Insurance Portability and Accountability Act), select the social skills folder (from an array of several folders), select the file labeled with the correct week number, and open the file. To identify the sport assigned for the session, therapists had to check the semester schedule, which was distributed via e-mail at the beginning of the program. A copy of the schedule was also placed in a binder located in the clinic.
Intervention
During the intervention condition, we e-mailed the lesson plan and the assigned sport for the session 24 hr before each session, making the materials more accessible to staff. The intervention took less than 5 min to implement each day. The subject line of the e-mail was “Social Skills Lesson Plan.” The body of the e-mail contained the attachment of the lesson plan description and the following message:
Hey, Social Skills Team!
Tomorrow you will be going to [sports name].
Attached below is the lesson plan for [lesson title].
Please have the lesson plan read by 4:15 tomorrow.
Thanks for all your hard work!
[experimenter’s signature]
Results
The left panel of Fig. 2 displays the percentage of correctly answered questions for the main dependent variable (answering quiz questions correctly about the lesson plan and sport) during baseline and treatment conditions for therapists in social skills Groups 1 and 2. During baseline, the percentages of correctly answered questions were 59.05% (range 20.00%–78.57%, SD = 19.52) and 65.76% (range 50.00%–87.50%, SD = 8.38) for Groups 1 and 2, respectively. The percentage of correctly answered questions increased to 97.92% for both groups (Group 1: range 91.67%–100.00%, SD = 2.08; Group 2, range 93.75%–100.00%, SD = 2.08) during the intervention condition. When participants answered incorrectly, the most common answer was “I don’t know.”
Fig. 2.
Percentage of correct responses by therapists for sport and lesson plan (Questions 1 and 2, left panel) and percentages of “yes” responses by therapists for feeling prepared (Questions 3 and 4, right panel)
The right panel of Fig. 2 shows the percentage of questions answered “yes” for the preparedness self-report measures during baseline and treatment conditions for both groups. The percentages of self-reported preparedness during baseline conditions were 32.86% (range 20.00%–57.14%, SD = 21.04) for Group 1 and 16.09% (range 6.25%–21.43%, SD = 6.85) for Group 2. During the intervention conditions, the percentages of self-reported preparedness increased to 62.67% (range 44.44%–81.25%, SD = 18.17) and 64.56% (range 56.25%–68.75%, SD = 7.22) for Groups 1 and 2, respectively.
Discussion
The results of this study indicate that a treatment package consisting of an e-mail prompt with the attached lesson plan increased the likelihood that participants would know the correct topic and sport for the week and report that they felt prepared. This study extends previous research by demonstrating that a simple intervention to reduce response effort improved the performance targets. Although the e-mail reduced the response effort to access the instructional materials, it may have also served as a prompt. This is likely true for most response effort manipulations that make materials more accessible. For example, moving recycling bins closer to participants makes them more visible, and the bins could thus also prompt performance. In this case, the prompt alone could have encouraged staff to find the materials, but anecdotal reports suggest that it would not have been sufficient and that including the information in the e-mail was necessary.
This intervention package was guided by the results of the PDC-HS and included components from multiple domains, and this highlights that the PDC-HS elucidates multiple barriers that the treatment package can simultaneously address. Furthermore, because response effort spans multiple domains, it is possible to address multiple deficits with one intervention component. However, not all identified barriers need to be included to successfully improve performance. For instance, we did not include training in the treatment package even though it was identified as a barrier. To further guide intervention selection, iterations of the PDC-HS could include ratings or rankings of the severity of each deficit identified.
Additionally, we conducted the PDC-HS interviews with a supervisor, a high performer, a low performer, and a new performer to get multiple perspectives. However, there is no research to suggest who should be interviewed for the PDC-HS or if conducting it with multiple performers will lead to more accurate results. Future research should evaluate who should complete the PDC-HS and how many people should be interviewed to collect sufficient information.
Wilder et al. (2020a, b) identified that most interventions derived from the PDC-HS included training and consequence-based interventions. Although effective, both are time and resource intensive. This study suggests that in some instances, decreasing response effort can increase desired behaviors, even when training and consequence barriers are identified. Future research can directly compare less intrusive to more intrusive interventions to evaluate the interventions’ effectiveness, sustainability, and cost, and to establish parameters for determining when less intrusive interventions are preferred over more intrusive ones. Nonetheless, practitioners should consider response effort when requiring staff to read updated information each week in preparation for their job tasks.
One major limitation of this study was that the impact of preparedness on the delay of the start of treatment or the quality of treatment was not directly assessed. Anecdotally, staff indicated a strong preference for continuing to receive the e-mail and thought they were better able to follow the lesson plan when they received it. Future research should evaluate this intervention package using direct behavioral measures (e.g., therapists’ treatment integrity) and client outcomes. Furthermore, the topic of the lesson plan was listed in the e-mail, so participants did not have to open the lesson plan to answer the quiz question about the lesson topic correctly. However, the questionnaire also asked participants whether they had read the lesson plan, and they reported reading it more often when the intervention was in place. Future studies can include a treatment integrity component to evaluate which components of the intervention the participants completed. Despite these limitations, this study demonstrates that a simple, brief intervention may improve reported staff preparedness for shifts. Research aimed at improving staff performance should first seek to reduce response effort before including more time- and cost-intensive interventions.
Funding
No funding was used for this research.
Compliance with Ethical Standards
Conflict of Interest
The authors declare that they have no conflicts of interest.
Ethical Approval
This research was approved by the Human Subject Institutional Review Board at the Florida Institute of Technology.
Footnotes
Research Highlights
• Asking responders to rank or rate the importance of each deficit identified in the Performance Diagnostic Checklist–Human Services can lead to effective interventions in human service organizations.
• Simple, low-cost antecedent interventions can improve some staff performances.
• Clinicians should consider reducing response effort and prompting before implementing more time- and resource-intensive consequence interventions.
Publisher’s Note
Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.
References
- Abellon OE, Wilder DA. The effect of equipment proximity on safe performance in a manufacturing setting. Journal of Applied Behavior Analysis. 2014;47(3):628–632. doi: 10.1002/jaba.137. [DOI] [PubMed] [Google Scholar]
- Bolton J, Mayer DM. Promoting the generalization of paraprofessional discrete trial teaching skills. Focus on Autism and Other Developmental Disabilities. 2008;23(2):103–111. doi: 10.1177/1088357608316269. [DOI] [Google Scholar]
- Carr JE, Wilder DA, Majdalany L, Mathisen D, Strain LA. An assessment-based solution to a human-service employee performance problem. Behavior Analysis in Practice. 2013;6:16–32. doi: 10.1007/BF03391789. [DOI] [PMC free article] [PubMed] [Google Scholar]
- Casella SE, Wilder DA, Neidert P, Rey C, Compton M, Chong I. The effects of response effort on safe performance by therapists at an autism treatment facility. Journal of Applied Behavior Analysis. 2010;43:729–734. doi: 10.1901/jaba.2010.43-729. [DOI] [PMC free article] [PubMed] [Google Scholar]
- Ducharme JM, Feldman MA. Comparison of staff training strategies to promote generalized teaching skills. Journal of Applied Behavior Analysis. 1992;25(1):165–179. doi: 10.1901/jaba.1992.25-165. [DOI] [PMC free article] [PubMed] [Google Scholar]
- Friman PC, Poling A. Making life easier with effort: Basic findings and applied research on response effort. Journal of Applied Behavior Analysis. 1995;28:583–590. doi: 10.1901/jaba.1995.28-583. [DOI] [PMC free article] [PubMed] [Google Scholar]
- Fritz JN, Dupuis DL, Wu W, Neal AE, Retting LA, Lastrapes RE. Evaluating increased effort for item disposal to improve recycling at a university. Journal of Applied Behavior Analysis. 2017;50(4):825–829. doi: 10.1002/jaba.405. [DOI] [PubMed] [Google Scholar]
- Graff RB, Karsten AM. Evaluation of a self-instruction package for conducting stimulus preference assessments. Journal of Applied Behavior Analysis. 2012;45(1):69–82. doi: 10.1901/jaba.2012.45-69. [DOI] [PMC free article] [PubMed] [Google Scholar]
- Ludwig TD, Gray TW, Rowell A. Increasing recycling in academic buildings: A systematic replication. Journal of Applied Behavior Analysis. 1998;31(4):683–686. doi: 10.1901/jaba.1998.31-683. [DOI] [Google Scholar]
- Miller ND, Meindl JN, Caradine M. The effects of bin proximity and visual prompts on recycling in a university building. Behavior and Social Issues. 2016;25(1):4–10. doi: 10.5210/bsi.v25i0.6141. [DOI] [Google Scholar]
- O’Connor RT, Lerman DC, Fritz JN, Hodde HB. Effects of number and location of bins on plastic recycling at a university. Journal of Applied Behavior Analysis. 2010;43:711–715. doi: 10.1901/jaba.2010.43-711. [DOI] [PMC free article] [PubMed] [Google Scholar]
- Van Houten R, Hilton B, Schulman R, Reagan I. Using accelerator pedal force to increase seat belt use of service vehicle drivers. Journal of Applied Behavior Analysis. 2011;44(1):41–49. doi: 10.1901/jaba.2011.44-41. [DOI] [PMC free article] [PubMed] [Google Scholar]
- Wilder D, Cymbal D, Villacorta J. The Performance Diagnostic Checklist–Human Services: A brief review. Journal of Applied Behavior Analysis. 2020;53(2):1170–1176. doi: 10.1002/jaba.676. [DOI] [PubMed] [Google Scholar]
- Wilder DA, Ertel HM, Cymbal DJ. A review of recent research on the manipulation of response effort in applied behavior analysis. Behavior Modification. 2020. doi: 10.1177/0145445520908509. [DOI] [PubMed]