Behavior Analysis in Practice
2022 Feb 4;15(3):951–957. doi: 10.1007/s40617-021-00675-4

The Performance Diagnostic Checklist - Human Services: Guidance for Assessment Administration

Denys Brand 1, Tyra P Sellers 2, David A Wilder 3, James E Carr 2
PMCID: PMC9582049  PMID: 36465592

Abstract

The Performance Diagnostic Checklist - Human Services (PDC-HS) is an assessment designed to identify the environmental variables contributing to employee performance concerns in human-service settings. Recent research has demonstrated that interventions indicated by the PDC-HS result in improved employee performance across several human-service settings and that the assessment has acceptable reliability and validity. Although PDC-HS-indicated interventions have been effective at increasing employee performance, there is a need for additional guidance when using the assessment, given the limited nature of the original administration guidelines. Thus, the purpose of the current manuscript is to provide additional guidance for use of the PDC-HS across a variety of situations.

Key words: functional assessment, human services, performance analysis, performance diagnostic checklist - human services, performance management, staff evaluation


Performance analysis is the organizational equivalent of the functional assessment of problem behavior that is often conducted in clinical and educational settings (Ditzian et al., 2015; Wilder et al., 2020). The goal of performance analysis is to identify environmental variables contributing to employee performance concerns and subsequently develop assessment-based interventions to address them (Austin, 2000). The most common performance analysis instrument in the behavior-analytic literature in recent decades is the Performance Diagnostic Checklist (PDC; Austin, 2000; Wilder et al., 2018). The PDC is an informant-based method for conducting a performance analysis and was designed specifically for use in business and industry (Austin, 2000).

In 2013, the PDC was adapted for use in human-service settings (e.g., clinics, residential treatment facilities, schools), resulting in the Performance Diagnostic Checklist – Human Services (PDC-HS; Carr et al., 2013). Common employee performance concerns in human-service settings include poor attendance, inadequate data collection, and failure to implement treatment protocols with sufficient integrity, among others. Given the importance of services delivered in human-service settings, it is paramount that employee performance concerns are addressed efficiently and effectively. In addition, given the uniqueness of human-service settings, adapting the PDC to those environments was warranted. The PDC-HS includes 20 questions, 13 of which are informant-based, and 7 that require direct observation. The questions are organized into four domains: Training; Task Clarification and Prompting; Resources, Materials, and Processes; and Performance Consequences, Effort, and Competition. Interventions based on results from the PDC-HS have been referred to as indicated interventions, and those that are not have been referred to as nonindicated interventions; the former should be prioritized to maximize treatment efficiency.

Since its publication in 2013, the PDC-HS has been used in several empirical studies. A recent review by Wilder et al. (2020) showed that interventions indicated by the PDC-HS have been successfully implemented across a variety of human-service settings to improve employee performance, suggesting that the assessment is a valid and reliable tool for diagnosing the potential causes of performance concerns (see Wilder et al., 2020 for further discussion of the reliability and validity of the PDC-HS). Although PDC-HS-indicated interventions have been effective at improving employee performance, the original administration guidelines from Carr et al. (2013) are minimal. The original guidance appears sufficient when PDC-HS users operate under ideal or near-ideal conditions (Wilder et al., 2019). However, PDC-HS users do not always work under such conditions and may find themselves unsure of how to proceed when conducting the assessment. For example, in real-world consultations, PDC-HS questions are often followed with additional clarifying questions, and the answers to these follow-up questions may affect scoring and intervention selection (Cymbal et al., 2020). Based on our collective experience with the PDC-HS, it is our assertion that users may benefit from additional guidance and instructions when using the assessment. Thus, the purpose of the present manuscript is to provide updated guidance for administering the PDC-HS. We make several recommendations and suggestions for how to most effectively implement the PDC-HS based on available data, as well as our collective experience conducting related research, and training others to use the assessment. To that end, we discuss some considerations with respect to completing the PDC-HS, targeting performance concerns, and using data collected from PDC-HS assessments. Table 1 includes a summary of our main recommendations and suggestions.

Table 1.

Summary of PDC-HS Assessment Guidelines

User Guidelines
How to Use the PDC-HS

• Interview format with direct supervisor (except for direct observations) and ensure accuracy of report (e.g., others confirm, no presence of bias on part of supervisor)

• Where possible, use permanent product recording (e.g., videos, reports) to verify performance concern

• Interview target staff directly to gather information of which supervisor may not be aware or to which they may not have access

Training Others to Use the PDC-HS

Take a Behavioral Skills Training approach

• Instructions (i.e., Tailor direct instructions to the needs of individual trainees. Provide resources, such as articles)

• Modeling (i.e., Demonstrate how to administer the PDC-HS)

• Rehearsal (i.e., Give trainees the opportunity to practice using the assessment)

• Feedback (i.e., Trainer reviews trainee performance and provides feedback)

Describing the Performance Concern

• Accurately describe the performance concern in a way that is measurable and observable

• Describe performance concern as either a behavioral excess or deficit

Conducting Direct Observations

• Conduct direct observations during typical working conditions to ensure that data are representative of employee behavior

• Minimize employee reactivity

• Collect enough data to obtain reliable information

Using PDC-HS Data

• Every NO response is an opportunity for intervention

• When several NO responses are scored across more than one domain, carefully consider the performance concern, items scored as NO, and available organizational resources when selecting and designing interventions

• Retain and aggregate data collected from the PDC-HS across employees and performance concerns to help organizations detect the need for possible systems-level changes

Completing the PDC-HS

The PDC-HS (Carr et al., 2013) was designed to be implemented in an interview format, except for the items that require direct observation. That is, the consultant or manager interviews the direct supervisor of the employee exhibiting the performance concern and the direct supervisor answers PDC-HS questions about the employee’s performance concern. However, no additional guidance was provided on how to complete the assessment. Despite the authors’ intention that the assessment be completed by behavior analysts, researchers have demonstrated that supervisors without training in behavior analysis can successfully use the PDC-HS (e.g., Bowe & Sellers, 2018; Smith & Wilder, 2018). In a recent review of the PDC-HS literature, Wilder et al. (2020) proposed that it might be valuable to explore having employees complete the assessment tool themselves. Given these varied applications and recommendations, supervisors and those responsible for training supervisors may benefit from some guidance related to when and how to implement the PDC-HS.

When to Use the PDC-HS

The PDC (designed for use in business and industry) and the PDC-HS (designed for human-service settings) were developed to help supervisors identify the environmental variables related to a specific performance concern and subsequently design a matched intervention (Carr et al., 2013). Newer supervisors may have limited experience addressing staff performance concerns in a systematic and deliberate manner; therefore, they might use the PDC-HS each time they encounter an employee performance issue until they become proficient at (a) considering and identifying the relevant environmental factors that may be impacting performance and (b) matching an intervention to the identified barriers. Once proficient with this process, experienced supervisors may find that they follow it without using the PDC-HS for every performance issue. However, consistently using the PDC-HS when addressing all performance concerns may increase the likelihood of successfully assessing and addressing the need early on and reduce the chances that the issue will worsen or resurface in the future. Thus, for inexperienced supervisors in particular, it may be good practice to consistently implement the PDC-HS for persistent staff performance concerns.

If experienced supervisors cease using the PDC-HS for more common staff performance concerns, they may still consider using the PDC-HS under a few specific circumstances. First, experienced supervisors may wish to periodically complete the PDC-HS to evaluate their own behavior when addressing staff performance concerns to detect any drift from carefully assessing potential environmental barriers in a systematic manner (e.g., Godat & Brigham, 1999). Second, it may be helpful to use the PDC-HS if a supervisor did not initially use it, but the performance issue was unresponsive to typical strategies for improving performance. In such cases, the supervisor may have overlooked a critical contributing factor requiring adjustments to the intervention to directly address the function of the performance issue. Third, if relatively new supervisors rarely encounter employee performance concerns, they may consider using the PDC-HS when those issues do arise, as they are likely not fluent in taking a structured, function-matched approach to assessment and intervention.

All supervisors should consider using the PDC-HS in instances in which they may be negatively influenced by their own perspective, history, or bias (e.g., Bernardin et al., 2016; Gonsalvez & Freestone, 2007; Kaplan, 2018) with the target employee, such that they find it difficult to remain objective when assessing the performance concern. Using the PDC-HS may provide sufficient structure to guide the supervisor through an objective evaluation process that minimizes attributing performance concerns to personal characteristics (e.g., laziness, inability to comprehend a concept or learn a skill) and maximizes critical assessment of the environmental contributors.

How to Implement the PDC-HS

In most instances, the employee’s direct supervisor will complete the PDC-HS (Carr et al., 2013). If the direct supervisor is well trained in using the assessment, they may complete the steps (answer questions; review data; and identify, design, implement, and monitor the intervention) on their own. In other cases, the employee’s direct supervisor may need support from their own supervisor to complete these steps (e.g., when the direct supervisor is new or unfamiliar with the performance concern). In either case, questions should be answered based on current, reliable information about the employee. Many of the questions (e.g., question 1 in the Training and the Task Clarification and Prompting domains) can be answered based on what the direct supervisor already knows about the employee’s performance. However, it is important to critically evaluate the degree to which the direct supervisor’s recollection reflects a current, accurate, and consistent pattern of the performance concern.

Many supervisors may report that they accurately remember details regarding an employee’s performance concern. Research in organizational behavior management, and in applied behavior analysis more broadly, provides some reason for skepticism about this claim. For example, Iwata et al. (2013) found that informant methods of assessing problem behavior may be accurate in only about 64% of cases. Researchers have also assessed the extent to which supervisors can identify employees’ most preferred items, which requires them to remember details about employee behavior. Wilder et al. (2007) had supervisors in a variety of industries list some of their employees’ most preferred items, and then had those employees rank their own preferences. Supervisors were generally able to identify an employee’s single most preferred item, but correlations between the supervisor-identified and employee-identified rankings were low. Wilder et al. (2011) replicated these findings with 100 participants. Therefore, it is important to ensure that the supervisor’s account is not influenced by a single recent instance of poor performance, an overly sensitive perception of the performance concern, some other bias, or simply a faulty recollection of the performance. This is especially important when the person administering the assessment has limited direct knowledge of the employee’s performance and is relying on the employee’s direct supervisor to answer the assessment questions. To address these concerns, the administrator can ask whether the performance concern is consistent (versus one that has occurred only once) and whether others describe the same pattern of behavior as concerning.
In some cases, there may be permanent products (e.g., reports, work areas, video footage) that the supervisor can review to assist in verifying the performance issue (e.g., Ditzian et al., 2015; Merritt et al., 2019).
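The weak rank agreement reported by Wilder et al. (2007, 2011) can be illustrated with a Spearman rank correlation. The sketch below uses hypothetical ranks (not data from those studies) in which the supervisor correctly identifies the top item but orders the remaining items differently:

```python
def spearman_rho(ranks_a, ranks_b):
    """Spearman rank correlation for two rankings with no ties:
    rho = 1 - 6 * sum(d^2) / (n * (n^2 - 1))."""
    n = len(ranks_a)
    d_squared = sum((a - b) ** 2 for a, b in zip(ranks_a, ranks_b))
    return 1 - (6 * d_squared) / (n * (n ** 2 - 1))

# Hypothetical ranks for five potential work rewards (1 = most preferred).
supervisor_ranks = [1, 2, 3, 4, 5]
employee_ranks = [1, 4, 5, 2, 3]  # top item matches; the rest diverge

print(round(spearman_rho(supervisor_ranks, employee_ranks), 2))  # prints 0.2
```

Even with the most preferred item identified correctly, the overall correlation is weak, mirroring the pattern described above.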

Merritt et al. (2019) reported an alternative method of implementation in which the consultant directly interviewed the employees exhibiting the performance concern. These researchers used the PDC-HS to individually interview four school-based direct-care staff members, as well as their direct supervisors, to identify the variables contributing to tardiness among the direct-care staff members. Although the results obtained from the direct-care staff members and their supervisors were generally similar, the assessments for 3 of the 4 participants showed some differences. This is not surprising, as responses from employees exhibiting the performance concern may be biased. That is, when asked questions about their performance, employees might not be skilled at self-observation and evaluation and, thus, might downplay the severity of the concern, or might provide inaccurate information to appear more capable.

Merritt et al. (2019) deviated from PDC-HS administration guidelines because they were concerned that the direct-care staff members’ supervisors were unaware of some of the variables that may have impacted tardiness, which may be a common occurrence. One possibility for future research might be to interview both parties simultaneously or interview employees after interviewing supervisors, such that discrepancies between the two could be addressed in the interview with employees. Future research should formally evaluate PDC-HS outcomes obtained from employees exhibiting performance concerns relative to their supervisors.

Conducting Direct Observations

Three of the PDC-HS domains include items requiring direct observation of employee behavior. Supervisors should conduct direct observations during typical working conditions, as it is important to obtain representative samples of employee performance (Sharp et al., 2015). For example, conducting direct observations on a day in which the employee is feeling ill, has to complete new or unfamiliar tasks, or is working a shift outside of their typical schedule may not produce data that are truly representative of their performance. Thus, conducting direct observations during times and under conditions in which the employee is typically expected to perform the task is recommended. Supervisors should also be aware of employee reactivity; employees may behave differently when they know that they are being observed (Kazdin, 1979; Mowery et al., 2010). Finally, direct observations should continue until enough data have been collected to obtain reliable information; the minimum number of observations needed likely varies across employees and performance concerns.

Training Others to use the PDC-HS

When supervisors are training others to become supervisors, or when they are mentoring new supervisors, they should consider incorporating the PDC-HS into their training. It is recommended that supervisors follow a behavioral skills training (BST) approach in these instances (Miltenberger, 2003; Wurtele, 1986). Behavioral skills training is an evidence-based teaching procedure consisting of instructions, modeling, rehearsal, and feedback (Novak et al., 2019; Parsons et al., 2012). Researchers have demonstrated that BST can be used to train others to administer various types of assessment procedures with which they had no prior experience (e.g., Barnes et al., 2014; Shayne & Miltenberger, 2013). Thus, the evidence suggests that BST can be an effective procedure for training supervisors to administer the PDC-HS.

When taking this approach, the supervisor-trainer should first identify whether the trainee or mentee is familiar with the PDC-HS, and if so, to what degree. The supervisor-trainer can then tailor the explicit instruction to the needs of the individual. Training activities might include any combination of the following components: providing articles to read, reviewing the PDC-HS in depth, discussing prior experiences using the PDC-HS, role-playing, and arranging opportunities to collaboratively implement the PDC-HS using actual employee performance concerns or case scenarios. Training activities could include the trainee or mentee observing while the supervisor-trainer completes the PDC-HS themselves or interviews the employee’s direct supervisor. It may also involve the trainee or mentee completing the PDC-HS along with the supervisor-trainer, and then completing it on their own and reviewing the results with the supervisor-trainer, who may have also independently completed the assessment.

Considerations for Targeting Performance Concerns

Describing the Performance Concern

One of the first steps when completing the PDC-HS is to describe the performance concern. The original guidelines indicate that performance concerns should be operationalized as either a behavioral excess or deficit (Carr et al., 2013). When describing employee concerns, it is important to carefully pinpoint (i.e., accurately describing behavior in a way that is measurable and observable) the behavior of interest (Daniels & Bailey, 2014; DiGennaro Reed et al., 2018; Rodriguez et al., 2016). Using well-crafted behavioral pinpoints is important to reduce the likelihood of any misunderstandings between managers and consultants when discussing performance concerns, which could lead to unreliable answers when completing the assessment (Wilder et al., 2019). Unreliable answers are especially concerning given that interventions are selected based on the information obtained from the assessment (Carr et al., 2013). Clear descriptions of the performance concern can also help the PDC-HS user to frame questions in a way that helps them ask about the pinpointed behavior more clearly (e.g., “have you received training on how to properly close the door following the completion of your shift” versus “have you received training on the task”).

Complexity of the Performance Concern

The PDC-HS is well suited for discrete performance concerns of varying complexity that can be easily observed and measured. As noted in the review by Wilder et al. (2020), researchers have applied the PDC-HS to tasks related to managing the environment (e.g., cleaning and securing doors), implementing instructional strategies (e.g., error correction and discrete-trial instruction procedures, providing response opportunities), and other job-related duties (e.g., pricing items, arriving on time). However, it may be the case that the PDC-HS could be applied to more nuanced skills, such as issues related to interpersonal communication (e.g., being overly argumentative, failing to demonstrate good audience control by shifting verbal behavior, or demonstrating low affect), problem solving, organizational and time management skills, or self-care (e.g., identifying stressors, engaging in strategies to manage stress and maintain health). These types of skills may not have discrete beginnings and ends, are not performed in the same way each time, and often require in-the-moment adjustments to responding based on contextual information (e.g., the way a communication partner responds, the presence of risk of harm to a client, the emergence of an acute life stressor).

Completing the questions for the skills described above may necessitate flexibility on the part of the supervisor, requiring them to thoughtfully consider how the questions might support identifying related barriers or creative supports. For example, consider an employee who consistently fails to use language that is easily understood by caregivers. In such an application, the supervisor will need to be flexible when answering the questions, but in doing so might identify the relevant environmental variables that allow for a successful intervention. In our example, answering the first question of the PDC-HS (Has the employee received formal training on this task?) might reveal that the employee has never had high-quality instruction on how to develop a positive therapeutic relationship with caregivers, including avoiding the use of overly technical jargon. Perhaps the skill was explained, but not directly modeled and practiced. Or perhaps the skill was practiced, but the employee was never directly observed interacting with caregivers. Some questions, such as those in the Task Clarification and Prompting and the Resources, Materials, and Processes domains, may initially appear irrelevant to our example performance concern.

However, careful consideration of how the questions might reveal information about the relationship between the performance concern and related environmental barriers is warranted. For example, question 3 in the Task Clarification and Prompting domain of the PDC-HS asks about the presence of a job aid, which may appear irrelevant to the performance concern in the example. Yet it is plausible that performing a complex skill, such as explaining behavior-analytic interventions in everyday language, could be supported by a guide that includes examples of common translations. Although such a support is not technically a job aid, reflecting on this specific question could lead the thoughtful supervisor to create it and assess whether it helps alleviate the performance concern. Similarly, careful consideration of question 1 in the Resources, Materials, and Processes domain (Are there sufficient numbers of trained staff available in the organization to complete the task?) might reveal that the employee’s colleagues share the same performance concern, such that speaking in overly technical terms with caregivers has become commonplace. Therefore, applying the PDC-HS to more nuanced performance concerns should not result in skipping questions that at first appear irrelevant. Instead, those questions should prompt the supervisor to thoughtfully consider the possible environmental barriers relevant to the issue at hand.

Using PDC-HS Data

There are several ways the data collected from the PDC-HS can be used. The most common application is to identify and implement performance improvement plans for individual employees or a small group of employees. In this typical approach, each item scored as a No on the assessment is an opportunity for intervention. When several items (within or across domains) have been scored as No, multiple interventions can be implemented either concurrently or consecutively based on the resources that are available within an organization (Carr et al., 2013). This approach has proven to be effective and has resulted in improved employee performance across a variety of settings (Wilder et al., 2020). However, there may be instances in which a supervisor uses the PDC-HS to derive an intervention that does not produce the desired result. For example, in two recent studies (Collier-Meek et al., 2021; Merritt et al., 2019), some participants were not responsive to the PDC-HS indicated interventions, requiring the researchers to modify some of their procedures or interventions.

Selecting interventions when several items across multiple domains have been scored as No can present some difficulties. Previously, it was recommended that priority be given to domains with multiple No responses (Carr et al., 2013). However, selecting interventions may not always be that simple. Consider an example in which a supervisor administers the PDC-HS and scores question 1 in the Resources, Materials, and Processes domain as No because the materials needed to complete the task are not readily available, but does not mark any other items in that domain as problematic. The domain may then appear largely acceptable, whereas the reality may be that the employee cannot complete the task without easy access to materials. In such a case, an intervention targeting that particular item would be required to promote improved employee performance, even though the domain had fewer No responses relative to other domains. Once the intervention has been implemented (i.e., the resources needed to complete the task have been made available), the supervisor may consider reassessing employee performance to determine whether further action is required. Thus, we recommend that the performance concern itself, the items scored as No on the PDC-HS, and available organizational resources all be carefully considered when selecting interventions.
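As a concrete illustration of this selection logic, the sketch below tallies No responses by domain for a single hypothetical assessment. The item wording is abbreviated and illustrative, not the actual instrument text, and a count-based ordering is only a starting point for the judgment described above:

```python
from collections import Counter

# Hypothetical PDC-HS results: (domain, item) -> "Yes"/"No".
# Item phrasing is illustrative, not quoted from the instrument.
results = {
    ("Training", "Received formal training on the task?"): "No",
    ("Training", "Can accurately describe the task?"): "Yes",
    ("Task Clarification & Prompting", "Job aid present?"): "No",
    ("Task Clarification & Prompting", "Task prompted when needed?"): "No",
    ("Resources, Materials, & Processes", "Materials readily available?"): "No",
    ("Performance Consequences, Effort, & Competition", "Performance monitored?"): "Yes",
}

# Each "No" is an opportunity for intervention; following Carr et al. (2013),
# order domains by their number of "No" responses as a tentative priority.
no_items = [(domain, item) for (domain, item), answer in results.items() if answer == "No"]
no_counts = Counter(domain for domain, _ in no_items)
for domain, count in no_counts.most_common():
    print(f"{domain}: {count} 'No' item(s)")
    for d, item in no_items:
        if d == domain:
            print(f"  - {item}")
```

Note that the single No for materials availability may still warrant the first intervention, as in the example above; the tally orders candidates but does not decide among them.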

The data collected from the PDC-HS can also be retained and aggregated across several employees and performance concerns (Wilder et al., 2020). As use of the PDC-HS becomes more common within an organization, patterns in the data may begin to emerge. For example, if the Training domain is repeatedly identified as problematic across several employees and tasks, the organization may need to evaluate its training procedures and develop treatment integrity checks to ensure that high-quality training is maintained once provided. Thus, the data collected across all assessments conducted within an organization could indicate the need for systems-level changes (Diener et al., 2009), which may help organizations take a more proactive approach to employee performance concerns rather than reacting each time an issue arises. We suggest that supervisors retain and aggregate the data every time the assessment is conducted and update their records as new data are collected. Future studies could investigate the utility of using PDC-HS data to inform systems-level changes.
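As a sketch of this aggregation idea (with entirely hypothetical records), domain-level No counts can be tallied across completed assessments to surface recurring barriers:

```python
from collections import Counter

# Hypothetical archive of completed assessments: each entry records the
# domains that contained at least one "No" response for one employee/task.
assessments = [
    {"employee": "A", "task": "data collection", "domains_with_no": ["Training"]},
    {"employee": "B", "task": "error correction", "domains_with_no": ["Training", "Performance Consequences"]},
    {"employee": "C", "task": "session setup", "domains_with_no": ["Resources"]},
    {"employee": "D", "task": "discrete-trial instruction", "domains_with_no": ["Training"]},
]

# Aggregate across employees and tasks; a domain flagged repeatedly may
# signal the need for a systems-level change rather than individual plans.
domain_counts = Counter(d for a in assessments for d in a["domains_with_no"])
for domain, count in domain_counts.most_common():
    print(f"{domain}: flagged in {count} of {len(assessments)} assessments")
```

Here the repeated Training flags would prompt a review of the organization's training procedures, per the pattern described above.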

Concluding Remarks

As the demand for effective behavior-analytic services in human-service settings continues to increase, so does the need for effective supervision. Issues surrounding effective supervisory practices within the applied behavior analysis profession become even more pronounced given the continued rapid growth in the number of certified behavior-analytic practitioners (Behavior Analyst Certification Board, n.d.). Thus, developing and refining performance analysis methods designed to help supervisors quickly and effectively address employee performance concerns represents an important line of investigation. For the most part, the PDC-HS is relatively simple and straightforward to use. However, users of the PDC-HS do not always operate under ideal conditions and may at times find themselves unsure of how to proceed when conducting the assessment. We have discussed several issues involving the use of the PDC-HS with the intent of clarifying areas of concern. In addition, some of our recommendations, particularly those related to nontraditional implementation methods of the PDC-HS, warrant investigation by researchers. We hope that users of the PDC-HS find our guidance and recommendations useful, and that our discussion points will stimulate more research into these areas.

Funding

No funding was received to assist with the preparation of this manuscript.

Disclosures

Conflict of interest

The authors declare that they have no conflicts of interest.

Footnotes

Publisher’s Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

References

  1. Austin J. Performance analysis and performance diagnostics. In: Austin J, Carr JE, editors. Handbook of applied behavior analysis. Context Press; 2000. pp. 321–349. [Google Scholar]
  2. Barnes CS, Mellor JR, Rehfeldt RA. Implementing the verbal behavior milestones assessment and placement program (VB-MAPP): Teaching assessment techniques. Analysis of Verbal Behavior. 2014;30(1):36–47. doi: 10.1007/s40616-013-0004-5. [DOI] [PMC free article] [PubMed] [Google Scholar]
  3. Behavior Analyst Certification Board. (n.d.). BACB certificant data. https://www.bacb.com/bacb-certificant-data/
  4. Bernardin HJ, Thomason S, Buckley MR, Kane JS. Rater rating-level bias and accuracy in performance appraisals: The impact of rater personality, performance management competence, and rater accountability. Human Resource Management. 2016;55(2):321–340. doi: 10.1002/hrm.21678. [DOI] [Google Scholar]
  5. Bowe M, Sellers TP. Evaluating the performance diagnostic checklist - human services to assess incorrect error-correction procedures by preschool paraprofessionals. Journal of Applied Behavior Analysis. 2018;51(1):166–176. doi: 10.1002/jaba.428. [DOI] [PubMed] [Google Scholar]
  6. Carr JE, Wilder DA, Majdalany L, Mathison D, Strain LA. An assessment-based solution to a human-service employee problem: An initial evaluation of the performance diagnostic checklist - human services. Behavior Analysis in Practice. 2013;6(1):16–32. doi: 10.1007/BF03391789. [DOI] [PMC free article] [PubMed] [Google Scholar]
  7. Collier-Meek MA, Sanetti LMH, Gould K, Pereira BA. Using the performance diagnostic checklist to evaluate and promote paraeducators’ treatment fidelity. Journal of School Psychology. 2021;86:1–14. doi: 10.1016/j.jsp.2021.02.005. [DOI] [PubMed] [Google Scholar]
  8. Cymbal DC, Wilder DA, Thomas R, Ertel H. Further evaluation of the validity and reliability of the Performance Diagnostic Checklist-Human Services. Journal of Organizational Behavior Management. 2020;40(3-4):249–257. doi: 10.1080/01608061.2020.1792027. [DOI] [Google Scholar]
  9. Daniels AC, Bailey JS. Performance management: Changing behavior that drives organizational effectiveness. (5th ed.). Aubrey Daniels International, Inc.; 2014. [Google Scholar]
  10. Diener LH, McGee HM, Miguel CF. An integrated approach for conducting a behavioral systems analysis. Journal of Organizational Behavior Management. 2009;29(2):108–135. doi: 10.1080/01608060902874534. [DOI] [Google Scholar]
  11. DiGennaro Reed FD, Novak MD, Erath TG, Brand D, Henley AJ. Pinpointing and measuring employee behavior. In: Wine B, Pritchard JK, editors. Organizational behavior management: The essentials. Hedgehog Publishers; 2018. [Google Scholar]
  12. Ditzian K, Wilder DA, King A, Tanz J. An evaluation of the performance diagnostic checklist - human services to assess an employee performance problem in a center-based autism treatment facility. Journal of Applied Behavior Analysis. 2015;48(1):199–203. doi: 10.1002/jaba.171. [DOI] [PubMed] [Google Scholar]
  13. Godat LM, Brigham TA. The effect of a self-management training program on employees of a mid-sized organization. Journal of Organizational Behavior Management. 1999;19(1):65–83. doi: 10.1300/J075v19n01_06. [DOI] [Google Scholar]
  14. Gonsalvez CJ, Freestone J. Field supervisors’ assessments of trainee performance: Are they reliable and valid? Australian Psychologist. 2007;42(1):23–32. doi: 10.1080/00050060600827615. [DOI] [Google Scholar]
  15. Iwata BA, DeLeon IG, Roscoe EM. Reliability and validity of the functional analysis screening tool. Journal of Applied Behavior Analysis. 2013;46(1):271–284. doi: 10.1002/jaba.31. [DOI] [PubMed] [Google Scholar]
  16. Kaplan SE. Further evidence on the negativity bias in performance evaluation: When does the evaluator’s perspective matter? Journal of Management Accounting Research. 2018;30(1):169–184. doi: 10.2308/jmar-51698. [DOI] [Google Scholar]
  17. Kazdin AE. Unobtrusive measures in behavioral assessment. Journal of Applied Behavior Analysis. 1979;12(4):713–724. doi: 10.1901/jaba.1979.12-713. [DOI] [PMC free article] [PubMed] [Google Scholar]
  18. Merritt TA, DiGennaro Reed FD, Martinez CE. Using the performance diagnostic checklist - human services to identify an indicated intervention to decrease employee tardiness. Journal of Applied Behavior Analysis. 2019;52(4):1034–1048. doi: 10.1002/jaba.643. [DOI] [PubMed] [Google Scholar]
  19. Miltenberger RG. Behavior modification: Principles and procedures. 3rd ed.; 2003. [Google Scholar]
  20. Mowery JM, Miltenberger RG, Weil TM. Evaluating the effects of reactivity to supervisor presence on staff response to tactile prompts and self-monitoring in a group home setting. Behavioral Interventions. 2010;25(1):21–35. doi: 10.1002/bin.296. [DOI] [Google Scholar]
  21. Novak MD, DiGennaro Reed FD, Erath TG, Blackman AL, Ruby S, Pellegrino AJ. Evidence-based performance management: Applying behavioral science to support practitioners. Perspectives on Behavior Science. 2019;42(9):955–972. doi: 10.1007/s40614-019-00232-z. [DOI] [PMC free article] [PubMed] [Google Scholar]
  22. Parsons MB, Rollyson JH, Reid DH. Evidence-based staff training: A guide for practitioners. Behavior Analysis in Practice. 2012;5(2):2–11. doi: 10.1007/BF03391819. [DOI] [PMC free article] [PubMed] [Google Scholar]
  23. Rodriguez M, Sundberg D, Biagi S. OBM applied: A practical guide to implementing organizational behavior management. ABA Technologies, Inc.; 2016. [Google Scholar]
  24. Sharp RA, Mudford OC, Elliffe D. Representativeness of direct observations selected using a work-sampling equation. Journal of Applied Behavior Analysis. 2015;48(1):153–166. doi: 10.1002/jaba.193. [DOI] [PubMed] [Google Scholar]
  25. Shayne R, Miltenberger RG. Evaluation of behavioral skills training for teaching functional assessment and treatment selection skills to parents. Behavioral Interventions. 2013;28(1):4–21. doi: 10.1002/bin.1350. [DOI] [Google Scholar]
  26. Smith M, Wilder DA. The use of the performance diagnostic checklist - human services to assess and improve the job performance of individuals with intellectual disabilities. Behavior Analysis in Practice. 2018;11(2):148–153. doi: 10.1007/s40617-018-0213-4. [DOI] [PMC free article] [PubMed] [Google Scholar]
  27. Wilder DA, Rost K, McMahon M. The accuracy of managerial prediction of employee preference. Journal of Organizational Behavior Management. 2007;27(2):1–14. doi: 10.1300/J075v27n02_01. [DOI] [Google Scholar]
  28. Wilder DA, Harris C, Casella S, Wine B, Postma N. Further evaluation of the accuracy of managerial prediction of employee preference. Journal of Organizational Behavior Management. 2011;31(2):130–139. doi: 10.1080/01608061.2011.569202. [DOI] [Google Scholar]
  29. Wilder DA, Lipschultz J, Gehrman C, Ertel H, Hodges A. A preliminary assessment of the validity and reliability of the performance diagnostic checklist - human services. Journal of Organizational Behavior Management. 2019;39(3-4):194–212. doi: 10.1080/01608061.2019.1666772. [DOI] [Google Scholar]
  30. Wilder DA, Cymbal D, Villacorta J. The performance diagnostic checklist - human services: A brief review. Journal of Applied Behavior Analysis. 2020;53(2):1170–1176. doi: 10.1002/jaba.676. [DOI] [PubMed] [Google Scholar]
  31. Wilder D, Lipschultz J, King A, Driscoll S, Sigurdsson S. An analysis of the commonality and type of preintervention assessment procedures in the Journal of Organizational Behavior Management (2000–2015). Journal of Organizational Behavior Management. 2018;38(1):5–17. doi: 10.1080/01608061.2017.1325822. [DOI] [Google Scholar]
  32. Wurtele SK. Teaching young children personal body safety: The Behavioral Skills Training Program. Author; 1986. [Google Scholar]

Articles from Behavior Analysis in Practice are provided here courtesy of Association for Behavior Analysis International