Abstract
This study evaluated the effects of a supervisory intervention on maintenance of appearance and organization in classrooms at a human services program for children and youths. The intervention combined performance feedback to classroom staff, public posting of performance outcomes, and eligibility for a performance-based incentive. Evaluated in a multiple-baseline design across classrooms, the intervention was immediately and consistently effective in all classrooms. These findings support organizational behavior management applications within human services programs to improve performance that is related to environmental care.
Keywords: environmental care, human services programs, organizational behavior management, performance improvement
Human services programs for persons with intellectual and developmental disabilities (IDD) require effective methods of care-provider training and performance management of competencies such as recording data, conducting preference assessments, implementing instruction, and applying behavior support plans (Lerman, LeBlanc, & Valentino, 2015; Luiselli, 2015; Parsons, Rollyson, & Reid, 2012). In addition to implementing procedures with children and adults, care providers are also responsible for environmental care. For example, having a well-organized setting allows care providers to locate materials, prepare activities, and implement procedures efficiently. Maintaining cleanliness ensures hygienic conditions and can prevent health risks. A pleasing appearance makes the work environment comfortable for service recipients and demonstrates committed care to families, visitors, and outside agencies.
Despite many applications of organizational behavior management (OBM) in human services programs (Luiselli, 2018; Reid & Parsons, 2000), very few studies have evaluated methods to improve environmental cleanliness, order, and appearance. Carr, Wilder, Majdalany, Mathisen, and Strain (2013) targeted completion of several cleanliness and organizational tasks (materials placed on countertops, chairs positioned under tables, cabinet doors/drawers closed) by graduate student employees at a university-based treatment center for children with autism. Positive intervention effects were achieved by training the employees according to a task checklist, publicly posting the checklist in multiple rooms, clarifying the locations of needed materials, and graphically displaying performance data. Similarly, Schmidt, Urban, Luiselli, White, and Harrington (2013) evaluated procedures to improve appearance, organization, and safety of classrooms at a human services program for children with IDD. Based on data derived from a 20-item rating checklist, an intervention combining task directives, daily supervision, and graphic performance feedback was effective with classroom care providers.
In light of the limited research on environmental care performance improvement in human services programs, the present study evaluated the effects of a supervisory intervention on the appearance and organization of classrooms serving children with autism spectrum disorder and other neurodevelopmental disorders. The intervention was informed through analysis of performance-limiting factors (Carr et al., 2013; Ditzian, Wilder, King, & Tanz, 2015), combined several procedures, and was designed to be implemented with already-existing resources.
Method
Participants and Setting
The participants were 17 care providers within three classrooms at a human services program for children and youths. Classroom 1 was composed of five teaching staff, Classroom 2 was composed of seven teaching staff, and Classroom 3 was composed of five teaching staff. At the time of the study, the participants had been employed in their positions from 3 months to 11 years.
In each classroom, there were six students between 4 and 12 years old with diagnoses of autism, IDD, and global developmental delay. The students were in the classrooms for 6 hr on weekdays according to daily activity schedules that were based on learning objectives from their individualized education programs. The study participants were responsible for conducting instructional sessions and behavior support plans with the students, recording data, and, as described in the following sections, maintaining the appearance and organization of the classrooms.
The classrooms were similar in size, with multiple tables, bookcases, and rolling cabinets. Classrooms 1 and 3 had one larger sized table and a break area that included beanbag chairs, folding chairs, or small couches. Classroom 2 did not have a break area but did have bookcases along the walls containing toys, visual materials, flash cards, games, and other educational supplies. Several of the bookcases in each classroom had Plexiglas doors to prevent objects from falling out. The classrooms also had moveable “cubbies” where students kept their backpacks, coats, and lunch boxes; a desk for the primary teacher; and a computer table.
Measurement
Program guidelines focused on classroom work area and materials organization, storage of objects, accessibility, and general appearance. Work area organization addressed student desks being clear of instructional materials that were not in use. Surrounding areas such as bookcases and windowsills also needed to be clear of any objects that were not required for instruction. The expectation for materials organization was that stimuli such as picture icons, schedules, and token boards should be accessible to classroom staff but stored when not being used with students. Communication devices were excluded because they had to be available to students at all times. Within break areas, objects had to be placed in marked storage locations, electronic devices contained in a locked cabinet if not being used, and food items sealed in containers. Accessibility to the classrooms required chairs being pushed under tables and a clear floor space for students, teachers, and assistants to move freely. Finally, classroom staff were expected to keep their possessions and personal belongings in designated locations and out of reach of the students.
Based on the preceding environmental care guidelines specified by program administrators and supervisors, we designed a classroom measurement checklist (CMC) that identified eight measures of appearance and organization that applied to desks, shelves, chairs, electronics, toys, floor, storage, and personal belongings. Each day of the study, an assigned observer entered the classrooms between 9:30 and 10:00 a.m. and rated each measure on the CMC as displayed correctly (+) or displayed incorrectly (−). A total classroom score was computed by dividing the number of measures displayed correctly by the total measures (displayed correctly and displayed incorrectly) and multiplying by 100.
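The daily classroom score described above is a simple percentage over the eight CMC measures. As an illustration only (the measure names and ratings below are hypothetical, not data from the study), the computation can be sketched as:

```python
# Illustrative sketch of the CMC scoring rule: score = (measures displayed
# correctly / total measures) * 100. Measure names are assumptions drawn
# from the eight areas named in the text, not the study's actual checklist.
CMC_MEASURES = [
    "desks", "shelves", "chairs", "electronics",
    "toys", "floor", "storage", "personal_belongings",
]

def classroom_score(ratings: dict) -> float:
    """Percentage of the eight CMC measures rated displayed correctly (+)."""
    correct = sum(1 for m in CMC_MEASURES if ratings[m])
    return 100.0 * correct / len(CMC_MEASURES)

# Hypothetical day: six of eight measures displayed correctly.
ratings = {m: True for m in CMC_MEASURES}
ratings["toys"] = False
ratings["floor"] = False
print(classroom_score(ratings))  # 75.0
```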
Interobserver Agreement
On 33.3% of study days, a second individual completed the CMC simultaneously with, but independently of, the primary observer. An agreement was counted when both observers recorded the same rating on a measure. Average interobserver agreement across the three classrooms (agreements divided by the total number of agreements plus disagreements, multiplied by 100) was 87.0% (range 50%–100%).
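The point-by-point agreement index used here can be sketched as follows; the example ratings are hypothetical, not records from the study:

```python
def interobserver_agreement(primary, secondary):
    """Point-by-point IOA: agreements / (agreements + disagreements) * 100.

    Each list holds one rating per CMC measure (True = displayed correctly).
    """
    if len(primary) != len(secondary):
        raise ValueError("Observers must rate the same set of measures.")
    agreements = sum(a == b for a, b in zip(primary, secondary))
    return 100.0 * agreements / len(primary)

# Hypothetical session: observers disagree on one of the eight measures.
obs1 = [True, True, False, True, True, False, True, True]
obs2 = [True, True, False, True, False, False, True, True]
print(interobserver_agreement(obs1, obs2))  # 87.5
```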
Procedures and Experimental Design
The study evaluated the effects of intervention in a multiple-baseline across-settings (classrooms) design.
Baseline
Preceding the study, the participants had been advised about guidelines for maintaining classroom appearance and organization. During the baseline phase, a supervisor assigned to the classrooms reminded the participants about these expectations but there was no formal system in place to monitor and improve performance. The supervisor was a master’s level Board Certified Behavior Analyst who had a regular presence in the classrooms and had multiple service and training responsibilities in the human services program.
Intervention
The basis of intervention, informed by baseline observations and data, was that the participants required greater clarity, motivation, and feedback to perform environmental care proficiently. In the intervention phase, the supervisor continued to complete the CMC in the three classrooms each day. Following observation, the supervisor presented the checklist to the participants and reviewed each measure. For measures that had been displayed correctly, the supervisor delivered praise and approval such as, “Very good, all of the classroom toys and electronic devices were put away.” For measures that were displayed incorrectly, the supervisor reminded the participants about how the respective area(s) of the classroom should look according to the CMC criteria (e.g., “Remember, you need to have chairs under tables.”).
After presenting verbal performance feedback to the participants, the supervisor computed the classroom score and posted the number on a large line graph that was positioned on a wall in each classroom. The participants were instructed to view the graph at the conclusion of each observation. The amount of time the supervisor spent in each classroom observing, delivering performance feedback, and posting scores was between 8 and 10 min.
Finally, one participant from each classroom was eligible to receive a fast-food gift card from a random drawing when the classroom achieved a 100% score across 5 consecutive days. The participants had selected a gift card as a performance incentive preceding the study. On the 4th day of intervention in each classroom, the criterion for receiving a gift card was changed to a classroom score of 80% averaged across 5 consecutive days. The revised criterion was introduced to evaluate performance-improvement effects of imposing a less stringent requirement for contacting reinforcement (Wine, 2017).
Results
Figure 1 shows the percentage of CMC measures displayed correctly during baseline and intervention phases. In Classroom 1, the baseline average was 41%. Performance improved during intervention, with a variable increasing trend and phase average of 67.3%. For Classroom 2, the average score in baseline was 35.4%, which improved to 88.5% during intervention. Classroom 3 had a baseline average of 34.5% and intervention average of 75%. Concerning the performance incentive criteria, only Classroom 2 achieved an average score of 80% across 5 consecutive days.
Figure 1.
Percentage of CMC measures displayed correctly in each classroom. Solid horizontal lines during the intervention phase indicate the criterion for the performance-based incentive.
Discussion
A supervisory intervention combining performance feedback, public posting of outcomes, and positive reinforcement effectively improved appearance and organization of classrooms at a human services program. The intervention was intended to be practical and efficient by (a) integrating procedures within already-existing supervision schedules, (b) establishing performance criteria that were consistent with program expectations, and (c) assessing participant preferences for an incentive contingency. In summary, these findings add to the limited literature that supports similar performance-improvement interventions within human services programs directed toward environmental care (Carr et al., 2013; Ditzian et al., 2015; Schmidt et al., 2013).
Notably, the baseline performance of participants was similar in all of the classrooms and may have reflected the uniform appearance and organization guidelines that were in place. Intervention clearly improved performance, although it is unclear whether the same magnitude of change would have occurred relative to a lower performing baseline. Also, notwithstanding the gains achieved through intervention, only one of the classrooms received the programmed incentive and only after the criterion was made less stringent. It is likely that the source of improvement from intervention was principally supervisor-delivered verbal and graphic performance feedback, which also included praise and approval. One objective of future research would be conducting component analyses of performance-improvement interventions that typically combine multiple procedures and incentive systems.
Another facet of our intervention evaluation was that none of the classrooms improved performance to a consistent 100% level, suggesting the need for further and perhaps intensified supports. For example, the frequency of observation and supervision in the classrooms could be increased to provide staff with additional performance feedback. Other incentive options and schedules of reinforcement could also be considered. Alternatively, staff could be taught self-management strategies to boost and sustain their performance in the absence of, or with reduced, supervisory monitoring.
The participants were able to maintain improved appearance and organization of the classrooms throughout intervention, but we did not measure their performance long term. Another limitation is that intervention was introduced to Classroom 2 and Classroom 3 despite improving, albeit variable, performance trends during the baseline phase. The predictable schedule of classroom observations and supervision also may have influenced participant performance. Finally, the supervisor implemented intervention according to a standardized protocol, but the study did not include assessment of procedural fidelity.
As evident from our findings and supported by previous research, performance feedback and public posting of outcomes can have relatively fast-acting effects as OBM interventions (DiGennaro Reed, Hirst, & Howard, 2013; Reid, Parsons, & Green, 2011). Within human services programs, rapid changes in staff performance are often a concern to ensure proper service delivery and adherence to operational policies. In the case of environmental care, performance-based supervision can be recommended as an approach that produces desirable and meaningful on-the-job improvement.
Implications for Practice
Environmental care is an important performance objective in human services programs.
Physical indicators of environmental care can be reliably measured.
Systematic feedback to care providers is an effective approach toward performance improvement.
Incentive systems may have variable effects on performance.
Compliance with Ethical Standards
Conflict of interest
The authors declare that they have no conflicts of interest.
Ethical approval
All procedures were in accordance with the ethical standards of the institutional and/or national research committee and with the 1964 Helsinki declaration and its later amendments or comparable ethical standards.
Informed consent
Informed consent was obtained from all of the participants included in the study.
Footnotes
Publisher’s Note
Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.
References
- Carr JE, Wilder DA, Majdalany L, Mathisen D, Strain LA. An assessment-based solution to a human-service employee performance problem: An initial evaluation of the Performance Diagnostic Checklist–Human Services. Behavior Analysis in Practice. 2013;6:16–32. doi: 10.1007/BF03391789. [DOI] [PMC free article] [PubMed] [Google Scholar]
- DiGennaro Reed FD, Hirst JM, Howard VJ. Empirically supported staff selection, training, and management strategies. In: Reed DD, DiGennaro Reed FD, Luiselli JK, editors. Handbook of crisis intervention and developmental disabilities. New York, NY: Springer; 2013. pp. 71–85. [Google Scholar]
- Ditzian K, Wilder DA, King A, Tanz J. An evaluation of the Performance Diagnostic Checklist–Human Services to assess an employee performance problem in a center-based autism treatment facility. Journal of Applied Behavior Analysis. 2015;48:199–203. doi: 10.1002/jaba.171. [DOI] [PubMed] [Google Scholar]
- Lerman DC, LeBlanc LA, Valentino AL. Evidence-based application of staff and caregiver training procedures. In: Roane H, Ringdahl JE, Falcomata T, editors. Clinical and organizational applications of applied behavior analysis. New York, NY: Elsevier; 2015. pp. 321–351. [Google Scholar]
- Luiselli JK. Performance management and staff preparation. In: DiGennaro FD, Reed DD, editors. Autism service delivery: Bridging the gap between science and practice in autism service delivery. New York, NY: Springer; 2015. pp. 465–489. [Google Scholar]
- Luiselli JK. Organizational behavior management applications in human services programs. In: Wine B, Pritchard J, editors. Organizational behavior management: The essentials. Orlando, FL: Hedgehog Publishers; 2018. [Google Scholar]
- Parsons MB, Rollyson JH, Reid DH. Evidence-based staff training: A guide for practitioners. Behavior Analysis in Practice. 2012;5:2–11. doi: 10.1007/BF03391819. [DOI] [PMC free article] [PubMed] [Google Scholar]
- Reid DH, Parsons MB. Organizational behavior management in human service settings. In: Austin J, Carr JE, editors. Handbook of applied behavior analysis. Reno, NV: Context Press; 2000. pp. 274–294. [Google Scholar]
- Reid DH, Parsons MB, Green CW. The supervisor training curriculum: Evidence-based ways to promote work quality and enjoyment among support staff. Washington, DC: American Association on Intellectual and Developmental Disabilities; 2011. [Google Scholar]
- Schmidt JD, Urban KD, Luiselli JK, White C, Harrington C. Improving appearance, organization, and safety of special education classrooms: Effects of staff training in a human services setting. Education and Treatment of Children. 2013;36:1–13. doi: 10.1353/etc.2013.0012. [DOI] [Google Scholar]
- Wine B. Incentive-based performance improvement. In: Luiselli JK, editor. Applied behavior analysis advanced guidebook. New York, NY: Elsevier; 2017. pp. 117–134. [Google Scholar]

